
Evaluating a Feedback Channel based Transform Domain Wyner-Ziv Video Codec

Brites, C.; Ascenso, J.; Pedro, J.; Pereira, F.

Signal Processing: Image Communication, Vol. 23, No. 4, pp. 269-297, April 2008.

ISSN (print): 0923-5965
ISSN (online):

Scimago Journal Ranking: 0.48 (in 2008)

Digital Object Identifier: 10.1016/j.image.2008.03.002

Wyner-Ziv (WZ) video coding – a particular case of distributed video coding (DVC) – is a video coding paradigm based on two major Information Theory results: the Slepian-Wolf and Wyner-Ziv theorems. In recent years, several practical WZ video coding solutions have been proposed with promising results. One of the most popular WZ video coding architectures in the literature uses turbo-code-based Slepian-Wolf coding and a feedback channel to perform rate control at the decoder. This architecture was first proposed by researchers at Stanford University and has since been adopted and improved by many research groups around the world. However, while many papers have been published with changes and improvements to this architecture, a precise and detailed evaluation of its performance, targeting a deep understanding for future advances, has not yet been made. The available performance results are mostly partial, obtained under unclear or incompatible conditions, and often rely on vaguely defined and sometimes architecturally unrealistic codec solutions. This paper provides a detailed, clear, and complete performance evaluation of an advanced transform domain WZ video codec derived from the Stanford turbo coding and feedback channel based architecture. Although the WZ video codec proposed for this evaluation is among the best available, the main purpose and novelty of this paper is the solid and comprehensive performance evaluation itself, which provides a strong, and much needed, performance reference for researchers in the WZ video coding field, as well as a solid basis for steering future WZ video coding research.
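The decoder-driven rate control mentioned above can be illustrated with a minimal sketch: the encoder buffers turbo-coded parity bits for each coded unit, and the decoder repeatedly requests small increments of parity over the feedback channel until its decoder converges. All function names and the convergence model below are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of feedback-channel rate control in a Stanford-style
# WZ codec. The encoder buffers parity bits in small chunks; the decoder
# requests chunks one at a time until turbo decoding succeeds, so the rate
# spent adapts to the quality of the side information.

def encoder_parity_chunks(total_chunks):
    """Encoder side: parity bits of the turbo-coded unit, pre-split into
    small chunks and buffered until the decoder requests them."""
    return [f"parity_chunk_{i}" for i in range(total_chunks)]

def turbo_decode_succeeds(num_chunks_received, chunks_needed):
    """Stand-in for the real turbo decoder convergence check (in practice,
    e.g. an estimated bit-error probability falling below a threshold).
    Here: succeed once enough chunks have arrived (assumed model)."""
    return num_chunks_received >= chunks_needed

def decode_with_feedback(chunks_needed, total_chunks):
    """Decoder side: issue one feedback-channel request per chunk until
    decoding succeeds; return the rate actually consumed, in chunks."""
    buffered = encoder_parity_chunks(total_chunks)
    received = []
    for chunk in buffered:
        received.append(chunk)  # one request/response over the feedback channel
        if turbo_decode_succeeds(len(received), chunks_needed):
            return len(received)
    raise RuntimeError("side information too poor: parity buffer exhausted")

# Good side information -> few chunks requested; poor -> more chunks.
print(decode_with_feedback(chunks_needed=3, total_chunks=8))  # → 3
```

The key design point this sketch captures is that rate allocation moves to the decoder: the encoder never estimates the correlation with the side information, it simply answers requests, which is what makes the feedback channel integral to this architecture.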