
  • Research Article
  • Open Access

A Motion-Adaptive Deinterlacer via Hybrid Motion Detection and Edge-Pattern Recognition

EURASIP Journal on Image and Video Processing 2008, 2008:741290

  • Received: 31 March 2007
  • Accepted: 13 January 2008


A novel motion-adaptive deinterlacing algorithm with edge-pattern recognition and hybrid motion detection is introduced. The great variety of video content makes it difficult for a single algorithm to process assorted motion, edges, textures, and combinations of them. The edge-pattern recognition algorithm introduced in this paper flexibly processes both textures and edges, which previously had to be handled separately by line average and edge-based line average. Moreover, predicting the neighboring pixels for pattern analysis and interpolation further enhances the adaptability of the edge-pattern recognition unit when motion detection is incorporated. Our hybrid motion detection accurately detects both fast and slow motion in interlaced video, as well as motion along edges. Using only three fields for detection also yields higher temporal correlation for interpolation. Subjective and objective experimental results on CIF and PAL video sequences show that our deinterlacing algorithm outperforms state-of-the-art four-field motion detection algorithms while offering higher content adaptability and lower memory cost.
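The abstract refers to line average and edge-based line average (ELA) as the classical spatial interpolators that the proposed edge-pattern recognition unit generalizes. As background, the following is a minimal sketch of ELA for one missing scan line of an interlaced field; the three-direction search window and the function name `ela_interpolate` are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def ela_interpolate(above: np.ndarray, below: np.ndarray) -> np.ndarray:
    """Edge-based line average (ELA), sketched for one missing line.

    For each missing pixel, compare the luminance differences along three
    candidate directions (135-degree diagonal, vertical, 45-degree diagonal)
    between the scan line above and the scan line below, then average the
    pair of pixels along the direction with the smallest difference.
    Plain line average is the special case that always uses the vertical pair.
    """
    w = above.shape[0]
    # Pad one pixel on each side so diagonal neighbors exist at the borders.
    a = np.pad(above.astype(float), 1, mode="edge")
    b = np.pad(below.astype(float), 1, mode="edge")
    out = np.empty(w)
    for j in range(w):
        # Direction offsets: -1 (135 deg), 0 (vertical), +1 (45 deg).
        diffs = [abs(a[j + 1 + d] - b[j + 1 - d]) for d in (-1, 0, 1)]
        d = (-1, 0, 1)[int(np.argmin(diffs))]
        out[j] = (a[j + 1 + d] + b[j + 1 - d]) / 2.0
    return out
```

On flat regions all three differences tie and ELA reduces to plain line average; on a diagonal edge, interpolating along the edge direction avoids the staircase artifacts that a purely vertical average produces, which is the behavior the paper's edge-pattern recognition unit extends to richer texture and edge patterns.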


  • Video Sequence
  • Temporal Correlation
  • Slow Motion
  • Video Content
  • Motion Detection

Publisher note

To access the full article, please see PDF.

Authors’ Affiliations

Department of Electrical Engineering, National Cheng Kung University, 1 Ta-Hsueh Road, Tainan, 701, Taiwan
Sunplus Technology Company Ltd, 19 Chuangsin 1st Road, Hsinchu, 300, Taiwan


© Gwo Giun Lee et al. 2008

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.