- Research Article
- Open Access

# Edge Adaptive Color Demosaicking Based on the Spatial Correlation of the Bayer Color Difference

- Hyun Mook Oh^{1}, Chang Won Kim^{1}, Young Seok Han^{1}, and Moon Gi Kang^{1} (Email author)

**2010**:874364

https://doi.org/10.1155/2010/874364

© Hyun Mook Oh et al. 2010

**Received:** 10 April 2010 · **Accepted:** 24 September 2010 · **Published:** 29 September 2010

## Abstract

An edge adaptive color demosaicking algorithm is proposed that classifies region types and estimates the edge direction directly on Bayer color filter array (CFA) samples. In the proposed method, the optimal edge direction is estimated from the spatial correlation of the Bayer color difference plane, which exploits the local directional correlation of an edge region of the Bayer CFA samples. To improve image quality with consistent edge directions, we classify the regions of an image into three types: edge, edge pattern, and flat regions. Based on the region type, the proposed method estimates the edge direction adaptively for each region. As a result, the proposed method reconstructs clear edges with reduced visual distortions in the edge and edge pattern regions. Experimental results show that the proposed method outperforms conventional edge-directed methods on objective and subjective criteria.

## Keywords

- Edge Direction
- Green Channel
- Region Classification
- Linear Minimum Mean Square Error
- Color Filter Array

## 1. Introduction

Most digital cameras capture an image with a single image sensor overlaid with a color filter array (CFA), typically the Bayer pattern [1], so that only one color sample is recorded at each pixel location. Reconstructing the two missing color values at every pixel is called *color demosaicking* or *color interpolation* [2–25]. Generally, the correlation between color channels is utilized by assuming a smooth color ratio [3, 4] or a smooth color difference [5–7]. These methods produce satisfactory results in homogeneous regions, while visible artifacts (such as zipper artifacts, Moiré effects, and blurring) appear in edge regions.
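The constant color-difference assumption of [5–7] can be illustrated with a minimal sketch (this is a generic illustration, not the algorithm of any cited paper; the function and variable names are ours): a missing red sample at a green pixel is estimated by averaging the red-minus-green differences of the neighbors and adding the local green value back.

```python
# Sketch of color-difference-based interpolation: since R - G is assumed
# smooth, average the neighboring differences, then restore red from the
# center green sample. Names and the 1-D setting are illustrative.
def interp_red_at_green(r_left, r_right, g_left, g_right, g_center):
    diff = ((r_left - g_left) + (r_right - g_right)) / 2.0
    return g_center + diff

# On a smooth intensity ramp the estimate is exact.
print(interp_red_at_green(10.0, 14.0, 20.0, 24.0, 22.0))  # -> 12.0
```

On signals where the color difference really is locally constant, this estimate is exact; the artifacts discussed above arise precisely where the assumption breaks down, that is, across edges.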

In order to reduce interpolation errors in these regions, various approaches have been applied to color demosaicking. In [8–12], various edge indicators were used to prevent interpolation across edges. Gunturk et al. decomposed the color channels into frequency subbands and updated the high-frequency subbands by applying a projections onto convex sets (POCS) technique [13]. Zhang and Wu modeled color artifacts as noise factors and removed them by fusing directional linear minimum mean square-error (LMMSE) estimates [14]. Alleysson et al. proposed frequency selective filters which exploit the localization of the luminance and chrominance frequency components of a mosaicked image [15]. All of these approaches show highly improved results in edge regions. However, interpolation errors and smoothed edges in edge patterns or at edge junctions remain challenging issues for demosaicking methods.

As an approach to reconstructing sharp edges, edge directed color demosaicking algorithms were proposed which aim to find the optimal edge direction at each pixel location [16–25]. Since the interpolation is performed along the estimated edge direction, the edge direction estimation technique plays a main role in these methods. In some methods [20–22], the edge directions of missing pixels are estimated indirectly, with the aid of additional information from horizontally and vertically prereconstructed images. Wu and Zhang found the edge direction based on Fisher's linear discriminant so that the chance of misclassifying each pixel is minimized [20]. Hirakawa and Parks proposed a homogeneity map-based estimation process, which exploits the luminance and chrominance similarities between the pixels on an edge [21]. Menon et al. proposed a direction estimation scheme using the smoothness of the color differences on the edges, where the color differences were obtained from directionally filtered green images [22]. In these methods, sharp edges are effectively restored from the temporarily interpolated images. However, insufficient consideration of the competitive regions results in noticeable artifacts due to inconsistent directional edge interpolation.

Recently, methods were proposed that deal directly with CFA problems such as CFA sampling [23–25], CFA noise [26], or both [27]. These methods study the characteristics of the CFA samples and reconstruct the image without the CFA error propagation and the inefficient computation caused by a preinterpolation process. Focusing on demosaicking directly on the CFA samples, Chung and Chan studied the color difference variance of the pixels located along the horizontal or the vertical axis of the CFA samples [23]. Tsai and Song introduced the concept of spectral-spatial correlation (SSC), which represents the direct difference between Bayer CFA color samples [24]. Based on the SSC, they proposed a heterogeneity-projection technique that uses the smoothness of the derivatives of the Bayer sample differences on horizontal or vertical edges. Building on Tsai and Song's method, Chung et al. proposed a modified heterogeneity-projection method that adaptively changes the mask size of the derivative [25].

As shown in [24, 25], the difference of the Bayer samples provides a key to estimating the edge direction directly on the Bayer pattern. In the conventional SSC-based methods, the smoothness of the Bayer color difference along an edge is examined, and the derivative of the differences along the horizontal or vertical axis is adopted as the criterion for edge direction estimation. However, in complicated edge regions such as edge patterns or edge junctions, the edge direction is usually indistinguishable, since the derivatives along the horizontal and vertical directions are very close to each other. To carry out more accurate interpolation in these regions, a region adaptive interpolation scheme is required that estimates the edge direction adaptively to the region type, given the directional correlation of the Bayer color difference.

In this paper, a demosaicking method is proposed that estimates the edge direction directly on the Bayer CFA samples based on the spatial correlation of the Bayer color difference. To estimate the edge direction accurately, we investigate the consistency of the Bayer color difference within a local region. We focus on the local similarity of the Bayer color difference plane not only along the directional axis but also beside the axis within the local region. Since the edge directions of the pixels on and around an edge contribute to the estimation simultaneously, the correlation adopted in the proposed method is a stable and effective basis for estimating the edge direction in complicated edge regions. Based on the spatial correlation of the Bayer color difference plane, we propose an edge adaptive demosaicking method that classifies an image into edge, edge pattern, and flat regions and estimates the edge direction according to the region type. The proposed method then interpolates the missing pixel values along the estimated edge direction.

The rest of the paper is organized as follows. Using the difference plane of the downsampled CFA images, the spatial correlation of the Bayer color difference plane is examined in Section 2. Based on this correlation between the CFA sample differences, the proposed edge adaptive demosaicking method is described in Section 3, together with the criteria for edge direction detection and region classification, as well as the interpolation scheme along the estimated edge direction, which aims to restore the missing pixels with reduced artifacts. Section 4 presents comparisons between the proposed and conventional edge directed methods in terms of quantitative and qualitative criteria. Finally, the paper is concluded in Section 5.

## 2. Spatial Correlation on the Bayer Color Difference Plane

In the proposed method, the region type and the edge direction are determined directly on the Bayer CFA samples based on the correlation of the Bayer color difference. To obtain efficient criteria for these main parts of the proposed demosaicking method, the Bayer color difference is reexamined on the downsampled low-resolution (LR) Bayer image plane, so that the direction-oriented consistency of the Bayer color differences is emphasized within the local region of an edge.

At a red sampling position $(i,j)$, the horizontal and vertical Bayer color differences are defined as

$$\Delta^{H}(i,j)=\hat{G}^{H}(i,j)-R(i,j),\qquad \Delta^{V}(i,j)=\hat{G}^{V}(i,j)-R(i,j),$$

$$\hat{G}^{H}(i,j)=\frac{G(i,j-1)+G(i,j+1)}{2},\qquad \hat{G}^{V}(i,j)=\frac{G(i-1,j)+G(i+1,j)}{2},$$

where $R(i,j)$, $G(i,j\pm1)$, and $G(i\pm1,j)$ are the Bayer CFA samples of the red and green channels at pixel location $(i,j)$ and its neighbors, respectively, $\hat{G}(i,j)$ is the estimate of the missing green sample, and $\Delta^{H}$ and $\Delta^{V}$ are the Bayer color differences along the horizontal and vertical directional lines, respectively. The Bayer color difference is assumed piecewise constant along an edge, since it inherits the characteristics of the spectral and spatial correlations [24].
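The directional color-difference computation described above can be sketched as follows; the function name and the small test mosaic are our own illustration, assuming a red sample surrounded by four green neighbors.

```python
import numpy as np

# Horizontal/vertical Bayer color difference at a red sampling position
# (i, j): the missing green is averaged along each axis, then the red
# sample is subtracted. Names are illustrative, not the paper's notation.
def bayer_color_difference(cfa, i, j):
    g_h = (cfa[i, j - 1] + cfa[i, j + 1]) / 2.0  # horizontal green estimate
    g_v = (cfa[i - 1, j] + cfa[i + 1, j]) / 2.0  # vertical green estimate
    return g_h - cfa[i, j], g_v - cfa[i, j]

# A center red pixel with dissimilar horizontal and vertical neighbors.
cfa = np.array([[0.0, 80.0, 0.0],
                [10.0, 40.0, 10.0],
                [0.0, 80.0, 0.0]])
print(bayer_color_difference(cfa, 1, 1))  # -> (-30.0, 40.0)
```

Along the true edge direction the color difference stays nearly constant from pixel to pixel, which is the piecewise-constancy property the method exploits.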

where $D$ is the Bayer color difference plane obtained from the mutually shifted Bayer LR images. Note that the correlation between the sampling positions is considered simultaneously with the interchannel correlation in (3).

## 3. Proposed Edge Directed Color Demosaicking Algorithm Using Region Classifier

In the proposed edge adaptive demosaicking method, the edge directions are optimally estimated according to the region type. Based on the spatial correlation of the Bayer color difference, the proposed method classifies an image into three regions: edge, edge pattern, and flat regions. In each region, we classify the edge direction type as horizontal (H) or vertical (V); when the direction cannot be determined clearly, we label it nondirectional (N). Therefore, the final edge direction types are H, V, and N. In the proposed edge direction estimation, a diagonal edge is treated as the combination of horizontal and vertical directional edges. According to the determined edge direction, the missing pixels are interpolated with weighting functions.

Following the edge types and the edge directions, we present how the region is classified and how the edge direction is estimated based on the spatial correlation of the Bayer color difference plane. To utilize this correlation, we describe the interpolation process as the restoration of the missing channels of the LR images obtained in Figure 1(b). Considering the sampling rate of the green channel, the proposed method first interpolates the missing green channels; then the red and blue channels are interpolated using the fully interpolated green channel images. This improves the red and blue channel interpolation quality, since the green channel carries more edge information than the red and blue channels. Since the Bayer LR images are shifted with respect to each other, they are interpolated in the same way for each channel. Once all of the missing channels are reconstructed at each sampling position, the full-color LR images are upsampled and registered according to their original positions in the HR grid.
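The decomposition of the Bayer mosaic into four mutually shifted LR images, which the method operates on, can be sketched as below. The GRBG phase and the function name are illustrative assumptions; the paper's Figure 1 defines the actual arrangement.

```python
import numpy as np

# Split a Bayer mosaic (GRBG phase assumed here) into four half-resolution
# images, one per sampling position. Each LR plane is shifted by one HR
# pixel with respect to its neighbors.
def split_bayer_grbg(cfa):
    g1 = cfa[0::2, 0::2]  # green samples on the red rows
    r  = cfa[0::2, 1::2]  # red samples
    b  = cfa[1::2, 0::2]  # blue samples
    g2 = cfa[1::2, 1::2]  # green samples on the blue rows
    return g1, r, b, g2

cfa = np.arange(16, dtype=float).reshape(4, 4)
g1, r, b, g2 = split_bayer_grbg(cfa)
print(r.tolist())  # -> [[1.0, 3.0], [9.0, 11.0]]
```

After the missing channels of each LR plane are reconstructed, the full-color LR images are upsampled and re-registered on the HR grid, inverting this decomposition.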
The overall process of the proposed adaptive demosaicking method is depicted in Figure 6. It is composed of estimating the Bayer color difference plane, the region classification, the edge direction estimation, and the directional interpolation for the green and the red/blue channels. In the following subsections, the interpolation of the missing pixels is described for a representative green and a representative red (blue) channel LR image.

### 3.1. Green Channel Interpolation

#### 3.1.1. Region Classification: Sharp Edges

where the subscripts denote the positions of the pixels in the LR images to the north, south, east, and west of the center position. Note that this notation inherits the relative pixel positions of the Bayer CFA samples with respect to the center pixel.

where the two terms represent the local averages of the differences between the horizontally and the vertically shifted green images, respectively. The local similarity becomes small when the global shift and the local edge direction coincide.
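The idea of comparing directional variation within a local window can be sketched as follows; the 3×3 window, the threshold, and the labels are our own simplifications of the criteria above, not the paper's exact rule.

```python
import numpy as np

# Pick the direction with the smaller local variation of a difference
# plane d around (i, j); when neither clearly wins, return "N".
# Window size and threshold tau are illustrative assumptions.
def estimate_direction(d, i, j, tau=1.0):
    win = d[i - 1:i + 2, j - 1:j + 2]
    var_h = np.mean(np.abs(np.diff(win, axis=1)))  # variation across columns
    var_v = np.mean(np.abs(np.diff(win, axis=0)))  # variation across rows
    if var_h + tau < var_v:
        return "H"   # smooth horizontally -> horizontal edge
    if var_v + tau < var_h:
        return "V"   # smooth vertically -> vertical edge
    return "N"       # ambiguous or flat

d = np.tile(np.array([0.0, 10.0, 0.0]), (3, 1))  # vertical stripe pattern
print(estimate_direction(d, 1, 1))  # -> V
```

On the stripe example the plane is constant down each column, so the vertical variation is zero and the vertical direction is selected; a flat window returns "N".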

With the measured local variation and local similarity criteria, the edge type of each pixel is determined by the following rule.

Classification 1

where the two direction labels represent sharp edges along the horizontal and the vertical directions, respectively. When the direction is not determined, the region is considered a nonsharp edge region, and these regions are investigated again in the following region classification step, *Classification 2*.

#### 3.1.2. Region Classification: Edge Patterns

The regions whose edge types are not determined in (14) belong to either the flat or the edge pattern region. The edge pattern region represents a region of the HR image that contains high-frequency components above the Nyquist rate of the Bayer CFA sampling. When the image is downsampled, the high-frequency components that exceed the sampling rate are contaminated by the aliasing effect. Therefore, the edge pattern region appears locally flat in the LR image, as shown in Figure 4(b). In this section, we derive the detection rule for the edge pattern region (a pseudoflat region in the LR grid) and estimate the edge direction of the edge pattern.
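The aliasing effect described above is easy to reproduce numerically: a stripe pattern at the Nyquist rate of the 2×-subsampled grid becomes completely flat after downsampling, which is why edge patterns look "pseudoflat" in the LR images. The toy array below is our own illustration.

```python
import numpy as np

# Columns alternate 0/100 at the Nyquist rate of a 2x subsampling grid.
hr = np.tile(np.array([0.0, 100.0]), (4, 4))  # 4x8 high-resolution patch
lr = hr[::2, ::2]                             # keep every other sample

print(np.ptp(hr))  # -> 100.0 : strong edge pattern in the HR patch
print(np.ptp(lr))  # -> 0.0   : the LR crop is perfectly flat
```

This is exactly the detection problem Classification 2 addresses: the LR plane alone cannot distinguish such a region from a truly flat one, so an additional intensity-offset criterion between the shifted LR images is needed.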

where the two terms are a variation measure and the average of the local variations, respectively.

With the intensity offset and the restrictive condition, the pseudoflat region (edge pattern region) is classified from the nonsharp edge region as follows.

Classification 2

where the two labels indicate that the region is determined to be an edge pattern region or a flat region in this classification, respectively, and the two thresholds control the accuracy of the classification. If the intensity offset is larger (and the local variation is smaller) than the corresponding threshold, the pixel is considered to be in the edge pattern region, and the direction of the edge pattern is determined by the following criteria.

where the first two labels indicate that the edge pattern is horizontally or vertically directed, respectively, and the third represents a region whose edge direction is not clearly determined. Once the edge type of the edge pattern region is determined, the statistics of the neighboring edge directions (horizontal or vertical) are compared within a neighborhood. Following the majority of the directions, the consistency of the edge directions in the region is improved.
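The majority-based consistency step can be sketched as a simple vote over the per-pixel labels in a neighborhood; the window contents, tie handling, and function name below are our assumptions rather than the paper's exact procedure.

```python
from collections import Counter

# Replace a pixel's direction label by the dominant H/V label among its
# neighbors; with no clear majority, fall back to nondirectional "N".
def majority_direction(labels):
    counts = Counter(l for l in labels if l in ("H", "V"))
    if not counts or counts["H"] == counts["V"]:
        return "N"  # empty or tied vote: no reliable direction
    return counts.most_common(1)[0][0]

# A mostly-horizontal neighborhood with two stray "V" votes.
print(majority_direction(["H", "V", "H", "H", "N", "H", "V", "H", "H"]))  # -> H
```

Enforcing the locally dominant direction suppresses isolated misclassifications, which is what keeps the interpolated edge pattern free of zipper-like direction flips.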

#### 3.1.3. Edge Directed Interpolation

where the weighting function is defined for each of the edge direction types, respectively.

### 3.2. Red and Blue Channel Interpolation

Similar to the green plane interpolation, the missing red and blue channel LR images are interpolated along the edge direction obtained by the region classification and the edge direction estimation. The fully interpolated green channels, which carry much edge information, are utilized to improve the interpolation accuracy of the red and blue channels. To compensate for the insufficient LR images, the diagonally shifted LR images are estimated using linear interpolation in the color difference domain [7]. In this section, the missing red and blue channels are found with the aid of the sampled and the interpolated LR images.

The weight function is computed in the same way as in (20), but the gradient values are calculated on the green LR images.
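Once the green channel is fully known, red (or blue) interpolation in the color difference domain can be sketched as below. This is a simplified, unweighted version of the step described above, with our own function name and a checkerboard sampling mask for illustration; the paper's method additionally weights the neighbors by edge direction.

```python
import numpy as np

# Interpolate missing red samples via the color difference domain:
# interpolate R - G from the sampled 4-neighbors, then add the (fully
# reconstructed) green value back at each missing position.
def interp_red(r_samples, g_full, mask):
    diff = np.where(mask, r_samples - g_full, 0.0)  # R - G where red exists
    out = r_samples.astype(float).copy()
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if not mask[i, j]:
                neigh = [diff[p, q] for p, q in
                         ((i - 1, j), (i + 1, j), (i, j - 1), (i, j + 1))
                         if 0 <= p < h and 0 <= q < w and mask[p, q]]
                out[i, j] = g_full[i, j] + np.mean(neigh)
    return out

g = np.arange(16.0).reshape(4, 4)                    # stand-in green plane
mask = (np.indices((4, 4)).sum(axis=0) % 2 == 0)     # checkerboard red sites
r = np.where(mask, g + 5.0, 0.0)                     # constant R - G offset
print(np.allclose(interp_red(r, g, mask), g + 5.0))  # -> True
```

With a constant color difference the reconstruction is exact, which mirrors why borrowing edge structure from the green channel improves red/blue quality.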

## 4. Experimental Results

Among the 24 test images, the most challenging ones are *Kodak 5, 6, 8, 15*, and *19*. The proposed method outperforms the conventional edge directed methods on the majority of the images, including those challenging images, with improvements in both the averaged PSNR and NCD values.

The PSNR comparison of the conventional and proposed methods using the average of the three channels (dB) on the 24 test images in Figure 7(a).

(Method groups: ED — [7], [13], [14]; indirect ED — [20], [21], [22]; direct ED — [23], [25], and the proposed method.)

| Image | [7] | [13] | [14] | [20] | [21] | [22] | [23] | [25] | Proposed |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 34.036 | 37.080 | 38.781 | 33.733 | 35.333 | 35.335 | 35.379 | 36.090 | 36.421 |
| 2 | 39.142 | 39.639 | 41.237 | 39.173 | 39.525 | 40.010 | 39.446 | 40.748 | 40.746 |
| 3 | 41.190 | 41.760 | 42.956 | 40.777 | 41.974 | 42.232 | 41.771 | 42.611 | 42.757 |
| 4 | 39.950 | 40.616 | 41.289 | 38.965 | 39.860 | 39.878 | 39.837 | 40.415 | 40.530 |
| 5 | 35.512 | 37.406 | 38.263 | 35.023 | 36.338 | 36.440 | 35.890 | 36.853 | 37.431 |
| 6 | 35.206 | 38.159 | 40.458 | 35.083 | 38.001 | 38.070 | 37.661 | 38.290 | 38.589 |
| 7 | 40.704 | 41.686 | 42.277 | 41.016 | 41.267 | 41.490 | 40.935 | 42.130 | 42.708 |
| 8 | 30.974 | 34.487 | 36.385 | 32.293 | 33.969 | 33.934 | 34.059 | 34.539 | 35.596 |
| 9 | 39.785 | 41.298 | 42.813 | 40.277 | 41.371 | 41.526 | 41.314 | 41.748 | 42.292 |
| 10 | 40.265 | 41.562 | 42.277 | 39.841 | 41.038 | 41.174 | 40.717 | 41.276 | 41.738 |
| 11 | 36.596 | 39.000 | 40.236 | 36.298 | 37.947 | 37.988 | 37.648 | 38.661 | 39.087 |
| 12 | 40.300 | 42.325 | 43.653 | 40.866 | 42.238 | 42.500 | 42.032 | 42.732 | 42.899 |
| 13 | 31.545 | 34.096 | 35.062 | 29.857 | 31.951 | 31.643 | 31.791 | 32.417 | 32.781 |
| 14 | 35.940 | 36.280 | 37.198 | 35.823 | 35.954 | 36.402 | 36.209 | 37.263 | 37.270 |
| 15 | 38.811 | 39.492 | 40.133 | 37.682 | 38.871 | 39.003 | 38.842 | 39.250 | 39.662 |
| 16 | 38.327 | 41.454 | 44.026 | 38.664 | 41.982 | 42.009 | 41.486 | 41.761 | 42.358 |
| 17 | 39.367 | 40.850 | 41.611 | 38.542 | 39.920 | 39.693 | 39.512 | 40.213 | 40.663 |
| 18 | 35.364 | 36.714 | 37.210 | 33.898 | 35.225 | 34.942 | 34.860 | 35.699 | 36.112 |
| 19 | 35.512 | 38.511 | 40.809 | 37.338 | 38.677 | 38.688 | 38.667 | 39.503 | 39.958 |
| 20 | 38.954 | 40.596 | 41.442 | 38.547 | 39.543 | 39.400 | 39.299 | 40.376 | 40.702 |
| 21 | 36.039 | 38.558 | 39.502 | 35.396 | 36.923 | 36.694 | 36.723 | 37.675 | 38.035 |
| 22 | 36.941 | 37.766 | 38.507 | 36.564 | 37.119 | 37.339 | 36.970 | 37.832 | 38.169 |
| 23 | 42.118 | 42.186 | 43.297 | 42.107 | 42.322 | 42.628 | 42.407 | 42.595 | 43.217 |
| 24 | 33.905 | 34.871 | 35.765 | 32.232 | 34.168 | 33.913 | 33.630 | 34.164 | 34.467 |
| avg. | 37.353 | 39.016 | 40.216 | 37.083 | 38.397 | 38.455 | 38.212 | 38.952 | 39.341 |

The NCD comparison of the conventional and proposed methods on the 24 test images in Figure 7(a).

(Method groups: ED — [7], [13], [14]; indirect ED — [20], [21], [22]; direct ED — [23], [25], and the proposed method.)

| Image | [7] | [13] | [14] | [20] | [21] | [22] | [23] | [25] | Proposed |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 3.372 | 2.724 | 1.994 | 3.286 | 2.663 | 2.924 | 2.789 | 2.517 | 2.426 |
| 2 | 2.311 | 2.244 | 1.905 | 2.201 | 2.177 | 2.133 | 2.179 | 1.910 | 1.906 |
| 3 | 1.409 | 1.321 | 1.211 | 1.428 | 1.311 | 1.318 | 1.333 | 1.231 | 1.212 |
| 4 | 1.876 | 1.800 | 1.725 | 2.051 | 1.897 | 1.950 | 1.924 | 1.777 | 1.764 |
| 5 | 4.217 | 3.491 | 3.040 | 4.105 | 3.608 | 3.821 | 3.843 | 3.329 | 3.152 |
| 6 | 2.373 | 1.939 | 1.384 | 2.206 | 1.636 | 1.751 | 1.782 | 1.636 | 1.565 |
| 7 | 1.677 | 1.553 | 1.458 | 1.554 | 1.563 | 1.535 | 1.569 | 1.431 | 1.347 |
| 8 | 4.064 | 3.158 | 2.246 | 3.271 | 2.780 | 3.004 | 2.847 | 2.567 | 2.364 |
| 9 | 1.352 | 1.183 | 1.028 | 1.271 | 1.144 | 1.175 | 1.158 | 1.131 | 1.066 |
| 10 | 1.342 | 1.203 | 1.124 | 1.369 | 1.238 | 1.282 | 1.279 | 1.223 | 1.168 |
| 11 | 3.014 | 2.526 | 2.056 | 2.798 | 2.403 | 2.499 | 2.528 | 2.227 | 2.142 |
| 12 | 1.006 | 0.887 | 0.744 | 0.958 | 0.830 | 0.857 | 0.865 | 0.810 | 0.784 |
| 13 | 4.737 | 3.898 | 3.313 | 5.648 | 4.387 | 4.991 | 4.707 | 4.208 | 4.032 |
| 14 | 3.203 | 2.918 | 2.593 | 3.160 | 2.969 | 3.034 | 2.972 | 2.647 | 2.595 |
| 15 | 2.148 | 2.052 | 1.958 | 2.329 | 2.155 | 2.201 | 2.183 | 2.018 | 1.980 |
| 16 | 2.150 | 1.749 | 1.218 | 1.918 | 1.409 | 1.507 | 1.525 | 1.499 | 1.386 |
| 17 | 2.663 | 2.363 | 2.207 | 2.771 | 2.490 | 2.631 | 2.578 | 2.465 | 2.333 |
| 18 | 4.152 | 3.828 | 3.720 | 4.711 | 4.284 | 4.440 | 4.397 | 4.019 | 3.833 |
| 19 | 2.528 | 2.126 | 1.661 | 2.321 | 2.011 | 2.135 | 2.065 | 1.897 | 1.792 |
| 20 | 1.483 | 1.303 | 1.155 | 1.522 | 1.356 | 1.443 | 1.411 | 1.264 | 1.216 |
| 21 | 2.393 | 1.989 | 1.684 | 2.511 | 2.078 | 2.292 | 2.212 | 1.965 | 1.891 |
| 22 | 2.133 | 2.007 | 1.884 | 2.289 | 2.125 | 2.167 | 2.197 | 1.983 | 1.909 |
| 23 | 1.261 | 1.245 | 1.216 | 1.290 | 1.307 | 1.284 | 1.286 | 1.248 | 1.187 |
| 24 | 2.514 | 2.239 | 1.968 | 2.684 | 2.310 | 2.472 | 2.430 | 2.199 | 2.114 |
| avg. | 2.474 | 2.156 | 1.854 | 2.486 | 2.172 | 2.285 | 2.253 | 2.050 | 1.965 |
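For reference, the PSNR figures in the tables above follow the standard definition over 8-bit data, averaged across the three color channels; a minimal implementation of that standard formula (the test image below is our own toy example, not one of the Kodak images) is:

```python
import numpy as np

# Standard peak signal-to-noise ratio in dB for 8-bit images, with the
# mean squared error taken jointly over all pixels and channels.
def psnr(ref, test, peak=255.0):
    mse = np.mean((ref.astype(float) - test.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

ref = np.zeros((4, 4, 3))
test = ref + 4.0            # uniform error of 4 gray levels
print(psnr(ref, test))      # about 36.09 dB
```

NCD, the second metric, additionally requires a transform to a perceptually uniform color space before the per-pixel color distance is averaged, so it is not reproduced here.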

Figures 8, 9, and 10–11 show the results on *Kodak 19*, *Kodak 15*, and real images, respectively. First, the competitive regions of *Kodak 19* are shown in Figure 8. In each image crop, the vertically directed line edge pattern of the fence and the edge junctions of the window are depicted. In spite of its high PSNR performance, the POCS method shows the Moiré pattern and zipper artifacts in Figure 8(c). In Zhang's method and the edge directed methods in Figures 8(d)–8(i), the fence regions are highly improved with reduced errors. However, visible artifacts remain on the vertical edges of the high-frequency region and on the boundaries between the fence and the grass. Moreover, zippers and disconnections appear at the edge junctions in the upper image crop in Figures 8(b)–8(i). In Figure 8(j), the result of the proposed algorithm shows clearer edges and reduced visible artifacts. The results of the methods on textures with diagonal patterns or diagonal lines are shown in Figure 9. While artifacts are produced along the ribbon boundary in Figures 9(b)–9(i), the proposed method produces consistent edges thanks to accurate edge direction estimation.

Using the high-resolution 12-bit Bayer CFA raw data in Figure 7(b), we can demonstrate the performance of each algorithm in the presence of noise. Figures 10 and 11 show the resulting images in a region that contains edge junctions. In these regions, most of the algorithms show zipper artifacts caused by false estimation of the edge direction. Among the conventional methods, edge directed techniques such as the variance of color differences method and the adaptive heterogeneity-projection method in Figures 10(g) and 10(h) demonstrate good performance on the horizontal and vertical directional edges. Similar results are shown for the diagonal edges in Figures 11(g) and 11(h). However, some artifacts remain in the regions where the edge direction changes. In the results of the proposed method in Figures 10(i) and 11(i), the interpolated pixels are consistent along the edges, which shows the robustness of the method based on the spatial correlation of the Bayer color difference.

## 5. Conclusion

In this paper, we have proposed an edge adaptive color demosaicking algorithm that effectively estimates the edge direction on the Bayer CFA samples. We examined the spatial correlation of the Bayer color difference plane and proposed criteria for region classification and edge direction estimation. To estimate the edge direction in complicated edge regions, the proposed method classifies the regions of an image into three types: edge, edge pattern, and flat regions. According to the edge type, the edge direction is effectively estimated, and the directional interpolation results in clear edges. The proposed edge adaptive demosaicking method improves the overall image quality in terms of consistent edge directions around edges. The proposed method was compared with conventional edge directed and non-edge-directed methods on several images, including Bayer raw data. The simulation results indicate that the proposed method outperforms conventional edge directed algorithms with respect to both objective and subjective criteria.

## Declarations

### Acknowledgments

This research was supported by the Mid-career Researcher Program through the NRF (National Research Foundation of Korea) grant funded by the MEST (no. 2010-0000345) and by the MKE (The Ministry of Knowledge Economy), Korea, under the ITRC (Information Technology Research Center) support program supervised by the NIPA (National IT Industry Promotion Agency) (NIPA-2010-(C1090-1011-0003)).


## References

1. Bayer BE: Color imaging array. US Patent no. 3,971,065, July 1976.
2. Gunturk BK, Glotzbach J, Altunbasak Y, Schafer RW, Mersereau RM: Demosaicking: color filter array interpolation. *IEEE Signal Processing Magazine* 2005, 22(1):44–54.
3. Cok DR: Signal processing method and apparatus for producing interpolated chrominance values in a sampled color image signal. US Patent no. 4,642,678, February 1987.
4. Lukac R, Martin K, Plataniotis KN: Demosaicked image postprocessing using local color ratios. *IEEE Transactions on Circuits and Systems for Video Technology* 2004, 14(6):914–920.
5. Adams JE Jr.: Interactions between color plane interpolation and other image processing functions in electronic photography. *Cameras and Systems for Electronic Photography and Scientific Imaging*, February 1995, Proceedings of SPIE 2416:144–151.
6. Adams JE Jr.: Design of practical color filter array interpolation algorithms for digital cameras, Part 2. *Proceedings of the International Conference on Image Processing (ICIP '98)*, October 1998, 488–492.
7. Pei S-C, Tam I-K: Effective color interpolation in CCD color filter arrays using signal correlation. *IEEE Transactions on Circuits and Systems for Video Technology* 2003, 13(6):503–513.
8. Kimmel R: Demosaicing: image reconstruction from color CCD samples. *IEEE Transactions on Image Processing* 1999, 8(9):1221–1228.
9. Hur BS, Kang MG: Edge-adaptive color interpolation algorithm for progressive scan charge-coupled device image sensors. *Optical Engineering* 2001, 40(12):2698–2708.
10. Lu W, Tan Y-P: Color filter array demosaicing: new method and performance measures. *IEEE Transactions on Image Processing* 2003, 12(10):1194–1210.
11. Park SW, Kang MG: Color interpolation with variable color ratio considering cross-channel correlation. *Optical Engineering* 2004, 43(1):34–43.
12. Kim CW, Kang MG: Noise insensitive high resolution color interpolation scheme considering cross-channel correlation. *Optical Engineering* 2005, 44(12).
13. Gunturk BK, Altunbasak Y, Mersereau RM: Color plane interpolation using alternating projections. *IEEE Transactions on Image Processing* 2002, 11(9):997–1013.
14. Zhang L, Wu X: Color demosaicking via directional linear minimum mean square-error estimation. *IEEE Transactions on Image Processing* 2005, 14(12):2167–2178.
15. Alleysson D, Süsstrunk S, Hérault J: Linear demosaicing inspired by the human visual system. *IEEE Transactions on Image Processing* 2005, 14(4):439–449.
16. Laroche CA, Prescott MA: Apparatus and method for adaptively interpolating a full color image utilizing chrominance gradients. US Patent no. 5,373,322, December 1994.
17. Hibbard RH: Apparatus and method for adaptively interpolating a full color image utilizing luminance gradients. US Patent no. 5,382,976, January 1995.
18. Adams JE, Hamilton JF Jr.: Adaptive color plane interpolation in single color electronic camera. US Patent no. 5,506,619, April 1996.
19. Adams JE Jr.: Design of practical color filter array interpolation algorithms for digital cameras. *Real-Time Imaging II*, February 1997, Proceedings of SPIE 3028:117–125.
20. Wu X, Zhang N: Primary-consistent soft-decision color demosaicking for digital cameras. *IEEE Transactions on Image Processing* 2004, 13(9):1263–1274.
21. Hirakawa K, Parks TW: Adaptive homogeneity-directed demosaicing algorithm. *IEEE Transactions on Image Processing* 2005, 14(3):360–369.
22. Menon D, Andriani S, Calvagno G: Demosaicing with directional filtering and a posteriori decision. *IEEE Transactions on Image Processing* 2007, 16(1):132–141.
23. Chung K-H, Chan Y-H: Color demosaicing using variance of color differences. *IEEE Transactions on Image Processing* 2006, 15(10):2944–2955.
24. Tsai C-Y, Song K-T: Heterogeneity-projection hard-decision color interpolation using spectral-spatial correlation. *IEEE Transactions on Image Processing* 2007, 16(1):78–91.
25. Chung K-L, Yang W-J, Yan W-M, Wang C-C: Demosaicing of color filter array captured images using gradient edge detection masks and adaptive heterogeneity-projection. *IEEE Transactions on Image Processing* 2008, 17(12):2356–2367.
26. Zhang L, Lukac R, Wu X, Zhang D: PCA-based spatially adaptive denoising of CFA images for single-sensor digital cameras. *IEEE Transactions on Image Processing* 2009, 18(4):797–812.
27. Zhang L, Wu X, Zhang D: Color reproduction from noisy CFA data of single sensor digital cameras. *IEEE Transactions on Image Processing* 2007, 16(9):2184–2197.

## Copyright

This article is published under license to BioMed Central Ltd. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.