Wavelet-based color modification detection based on variance ratio
EURASIP Journal on Image and Video Processing volume 2018, Article number: 47 (2018)
Abstract
Color modification is a popular image forgery technique. It can be used to eliminate criminal evidence in various ways, such as modifying the color of a car used in a crime. If the color of a digital image is modified, the locations of the interpolated and original samples may change. Because the original and interpolated pixels have different statistical characteristics, these differences can serve as a basic clue for estimating the degree of color modification. It is assumed that the variance of the original samples is greater than that of the interpolated samples. Therefore, we present a novel algorithm for color modification estimation using the variance ratio of color difference images in the wavelet domain. The color difference model is used to emphasize the differences between the original and interpolated samples. For the color difference images, we execute a wavelet transform and use the highest frequency subband to calculate variances. We define a variance ratio measurement to quantify the level of color modification. Additionally, local regions with modified color can be efficiently detected using the proposed algorithm. Experimental results demonstrate that the proposed method generates accurate estimation results for detecting color modification. Compared to the conventional method, our method provides superior color modification detection performance.
Introduction
Using image forgery techniques requires minimal expertise because digitized images are easily replicated or manipulated. Furthermore, it is difficult to verify the authenticity of images using only the human eye. Therefore, developing reliable image forgery detection methods to determine the authenticity of images has become an important issue [1, 2]. Traces of image manipulation can be used as clues for detecting altered images. If we can uncover evidence indicating image alterations, we can conclude that an image has been forged. There have been several studies on detecting various image forgery techniques, such as copy-move [3,4,5], image splicing [6,7,8], scaling [9, 10], rotation [11], blurring [12, 13], contrast change [14], and color modification [15].
Color modification is one of the commonly used image forgery techniques. It is often exploited to obscure a person's identity by changing their face color, eliminate criminal evidence by modifying the color of a car used in a crime, or mislead customers by changing the color of a product. Forensic approaches that can detect color modification have not been extensively studied. In 2013, Choi et al. introduced a basic color modification detection method [15]. They defined a color modification attack as a change in the ratio between the red, blue, and green channels. Under this definition, brightness adjustment, which modifies the luminance of an image, is not a color modification attack, whereas hue and white balance adjustments are. They designed a color modification detection algorithm based on the fact that color filter array (CFA) patterns change if the color of a digital image is modified. They achieved good color modification detection performance using an advanced intermediate value counting (AIVC) algorithm [16]. However, their algorithm is only valid for intra-channel demosaicing methods, such as bilinear and bicubic interpolations. Because many demosaicing approaches tend to emphasize high-frequency components by using the correlations between color channels [17,18,19,20], the color change detection accuracy of Choi's algorithm may be reduced.
In this paper, we present a novel color modification detection method using the variance ratio in the wavelet domain. For a suspicious image, we first decompose the image into four sub-images. The decomposition is performed in an even-odd manner in the vertical and horizontal directions. Next, we construct four color difference images in the form of a Bayer CFA pattern: interpolated green minus original red, interpolated green minus interpolated red, interpolated green minus original blue, and interpolated green minus interpolated blue. For the color difference images, we execute a wavelet transform to extract high-frequency components and then use the highest frequency subband to calculate variances. We define a variance ratio measurement to quantify the level of color modification. Through various experiments, we demonstrate that the proposed method generates accurate estimation results for detecting color modification. Compared to the conventional method, our method achieves superior results.
The remainder of this paper is organized as follows: Section 2 describes the color modification detection method based on CFA pattern changes. The proposed color modification detection method and localization algorithm for changed color regions are presented in Section 3. Section 4 reports the experimental results obtained using the proposed approach, and Section 5 summarizes our conclusions.
Estimation of color modification
Color sample changes through color modification
Most commercial image acquisition devices, such as digital cameras, use an inexpensive CFA with a specific shape to acquire color images. The CFA is placed over the image sensor in a digital camera. Each pixel uses only one color from the available choices of red, green, and blue because the image sensor can measure only one color per pixel. Each color channel therefore has missing pixels and requires interpolation. The process of estimating the missing pixels is referred to as demosaicing. Figure 1a presents an example of demosaicing using the RGGB Bayer pattern [21]. As shown in the leftmost 2 × 2 square in Fig. 1a, the red value in the top-left position (R_{1}), the green values in the top-right and bottom-left positions (G_{2} and G_{3}), and the blue value in the bottom-right position (B_{4}) are all original values. The remaining three red, two green, and three blue values are interpolated. In the RGGB Bayer pattern, demosaicing is the interpolation process used to estimate \( \left\{{\overset{\frown }{\mathbf{R}}}_2,{\overset{\frown }{\mathbf{R}}}_3,{\overset{\frown }{\mathbf{R}}}_4,{\overset{\frown }{\mathbf{G}}}_1,{\overset{\frown }{\mathbf{G}}}_4,{\overset{\frown }{\mathbf{B}}}_1,{\overset{\frown }{\mathbf{B}}}_2,{\overset{\frown }{\mathbf{B}}}_3\right\} \) from the original {R_{1}, G_{2}, G_{3}, B_{4}} pattern.
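To make the demosaicing step concrete, the sketch below simulates an RGGB mosaic and bilinearly interpolates the missing green samples; in an RGGB layout, every red or blue site has four green 4-neighbors. This is an illustrative stand-in for a real camera pipeline, not the interpolator of any particular device, and the function names are ours.

```python
import numpy as np

def bayer_green_mask(h, w):
    # RGGB layout: green samples sit at (even, odd) and (odd, even) positions
    m = np.zeros((h, w), dtype=bool)
    m[0::2, 1::2] = True
    m[1::2, 0::2] = True
    return m

def bilinear_green(mosaic, mask):
    # Estimate each missing green value as the mean of its four neighbors,
    # which are all green samples at red/blue sites of an RGGB mosaic.
    out = mosaic.astype(float).copy()
    padded = np.pad(out, 1, mode="edge")
    neigh = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
             padded[1:-1, :-2] + padded[1:-1, 2:]) / 4.0
    out[~mask] = neigh[~mask]          # fill only the non-green sites
    return out
```

Interior interpolated values lie between the minimum and maximum of their green neighbors, which is exactly the statistical footprint the detection methods below exploit.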
If the colors of an image have been modified, we can observe that the locations of the interpolated and original pixels are changed. Figure 1b presents the color location changes resulting from modifying the hue angle by 120°. In this case, red is changed to green, green is changed to blue, and blue is changed to red. The original R_{1} in the top-left position is changed to an interpolated \( {\overset{\frown }{\mathbf{R}}}_1 \), while the interpolated \( {\overset{\frown }{\mathbf{R}}}_4 \) in the bottom-right position is changed to the original R_{4}. The other color channels follow a similar pattern. The changed characteristics of color samples are important clues for detecting altered images.
Conventional color change detection method
In general, an image generated by the demosaicing process is represented in the RGB color space. Because color modification is related to color information, rather than intensity and saturation, handling images in the hue (H), saturation (S), and intensity (I) color space (HSI color space) is more efficient than handling images in the RGB color space. The hue value H can be obtained by using R, G, and B as follows:
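The hue equation elided here is presumably the standard RGB-to-HSI conversion:

```latex
H =
\begin{cases}
\theta, & B \le G\\[2pt]
360^{\circ}-\theta, & B > G
\end{cases}
\qquad
\theta = \cos^{-1}\!\left(\frac{\tfrac{1}{2}\left[(R-G)+(R-B)\right]}
{\sqrt{(R-G)^2+(R-B)(G-B)}}\right)
```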
H represents color information as an angle in the range 0° ≤ H < 360°. Figure 2 illustrates the relationship between RGB values and hue angles. As shown in Fig. 2, RGB values change periodically according to hue changes. If the color of an image is modified, then the hue angle changes. Conversely, a hue change caused by color modification results in an RGB value change.
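For illustration, the standard HSI hue conversion can be computed directly. The function name `hsi_hue` is ours, and the formula is the common HSI definition, which the paper's elided equation presumably matches.

```python
import math

def hsi_hue(r, g, b):
    """Hue angle in degrees (0 <= H < 360) from RGB components in [0, 1]."""
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    if den == 0:
        return 0.0                      # achromatic pixel: hue undefined
    # clamp against floating-point rounding before acos
    theta = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    return 360.0 - theta if b > g else theta
```

Pure red, green, and blue map to 0°, 120°, and 240°, matching the periodic RGB behavior shown in Fig. 2.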
The conventional algorithm for estimating color modification [15] exploits the color sample changes caused by color modification. This method is based on the idea that the trace of a CFA pattern, which is closely related to color information, changes when the color of an image is modified. If a pixel is interpolated, then its value exists between maximum and minimum values within a predefined window with a very high probability. For a given green sample value G(i, j) at pixel location (i, j), the maximum value (\( {G}_{\left(i,j\right)}^{\mathrm{max}} \)) and minimum value (\( {G}_{\left(i,j\right)}^{\mathrm{min}} \)) within the predefined window surrounding G(i, j) are obtained as follows:
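The elided definitions, reconstructed from the surrounding text, are presumably:

```latex
G_{(i,j)}^{\max} = \max_{(m,n)\in W(i,j)} G(m,n), \qquad
G_{(i,j)}^{\min} = \min_{(m,n)\in W(i,j)} G(m,n)
```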
where W(i, j) is the predefined window surrounding G(i, j). If \( {G}_{\left(i,j\right)}^{\mathrm{min}}<G\left(i,j\right)<{G}_{\left(i,j\right)}^{\mathrm{max}} \), then G(i, j) can be considered as an interpolated green sample. In this case, the count value for interpolated samples is incremented by one. Otherwise, G(i, j) is very likely to be an original sample. In this case, the count value for original samples is incremented by one.
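A minimal sketch of the intermediate-value test described above; the function name and window handling are ours, and boundary treatment is simplified.

```python
def classify_sample(channel, i, j, radius=1):
    """Return 'interpolated' if channel[i][j] lies strictly between the
    min and max of its surrounding window, else 'original' (AIVC-style test)."""
    window = [channel[m][n]
              for m in range(max(0, i - radius), min(len(channel), i + radius + 1))
              for n in range(max(0, j - radius), min(len(channel[0]), j + radius + 1))
              if (m, n) != (i, j)]
    if min(window) < channel[i][j] < max(window):
        return "interpolated"
    return "original"
```

Accumulating these per-pixel decisions over the image yields the counts N_i(H) and N_o(H) used below.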
For a given hue angle H, let N_{ i }(H) be the total count of green pixels classified as interpolated, and let N_{ o }(H) be the total count of green pixels classified as original, each accumulated over the whole image.
To normalize the count values, the ratio R(H) between the two counts is defined, for a given H, as follows:
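Given that R(H) should be small when N_i(H) is small and N_o(H) is large, the elided definition is presumably:

```latex
R(H) = \frac{N_i(H)}{N_o(H)}
```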
If a green sample is an original, then N_{ i }(H) is small and N_{ o }(H) is large. By rotating the hue angle from 0° to 359°, we can find the hue angle H_{min} for which R(H) is minimized as follows:
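The elided minimization is presumably:

```latex
H_{\min} = \underset{0^{\circ}\le H < 360^{\circ}}{\arg\min}\; R(H)
```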
The shifted hue angle is estimated as follows:
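Consistent with the worked example that follows, the estimate is presumably:

```latex
H_F = 360^{\circ} - H_{\min}
```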
where H_{ F } is the altered hue angle. Let us assume that the modified hue angle is 30°. To estimate the changed hue angle, we find the angle with the minimum R(H) by rotating the angle from 0° to 359°. Assuming an accurate estimate, the angle with the minimum R(H) is H_{min} = 330°. Therefore, the shifted hue angle is estimated as H_{ F } = 360° − 330° = 30°. If an image has not been changed, then H_{ F } = 0°.
Figure 3 depicts the R(H) values of an original image (the shifted hue is 0°) and a forged image (the shifted hue is 90°) based on their hue angles. The estimated values of H_{ F } for various demosaicing methods are presented in Fig. 3. We tested four demosaicing methods: bilinear interpolation, the adaptive homogeneity-directed (AHD) method [22], the variable number of gradients (VNG) algorithm [23], and aliasing minimization and zipper elimination (AMaZE) [24]. The demosaiced images are obtained by using the RawTherapee [25] tool. As shown in Fig. 3, the estimation errors are small for bilinear interpolation, which exploits only the information of its own color channel to interpolate missing color values, because the AIVC method checks the number of pixels outside of the range between the minimum and maximum pixel values.
Intra-channel demosaicing schemes, such as bilinear or bicubic interpolation, have many drawbacks, including fringe effects, false colors, and zipper effects. Therefore, the majority of demosaicing algorithms exploit color channel correlations, because the high-frequency information in the color channels of natural color images is highly cross-correlated. The demosaicing results of these techniques contain more high-frequency components than those of bilinear interpolation. Therefore, we can expect the estimation error to increase with demosaicing algorithms that use inter-channel and color difference information. As shown in Fig. 3, the estimation errors for a 90° hue shift are 6°, 5°, 28°, and 16° for bilinear, AHD, VNG, and AMaZE, respectively. For this reason, we introduce a novel color forgery estimation algorithm based on a color difference model that remains effective for inter-channel demosaicing methods.
Proposed method
Basically, an interpolation kernel has the characteristics of a low-pass filter. Therefore, it can easily be deduced that the variance of an original block is larger than that of an interpolated block. Consequently, in edge regions, the original and interpolated pixels have different characteristics, whereas in background regions these pixels have similar characteristics. Therefore, background regions can adversely influence the estimation of color modification. The conventional methods employ an inefficient estimation process because they use the same block regardless of the characteristics of the image. To remove the adverse influence of background regions, we exploit the wavelet transform. Furthermore, to emphasize the effect of high-frequency components, we use the highest frequency subband (HH) in the wavelet domain.
Many demosaicing algorithms employ color differences and attempt to enhance edges. Accordingly, common demosaicing approaches use spectral and spatial correlation to estimate missing pixels from neighboring color pixels [26]. Therefore, the majority of color interpolation algorithms are based on the differences between an original sample and a filtered sample, or between two interpolated samples. Based on this fact, we introduce a novel estimation algorithm for color change detection using color difference images in the wavelet domain. Recently, the variance measurement of the color difference model has been successfully exploited in Bayer CFA pattern identification [27].
Let \( {\mathbf{D}}_{GR_1} \) and \( {\mathbf{D}}_{GR_4} \) be the color difference images (green minus red) at positions 1 and 4 of the RGGB Bayer CFA pattern, respectively. That is,
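From the descriptions in the surrounding text, the two difference images are presumably:

```latex
\mathbf{D}_{GR_1} = \overset{\frown}{\mathbf{G}}_1 - \mathbf{R}_1, \qquad
\mathbf{D}_{GR_4} = \overset{\frown}{\mathbf{G}}_4 - \overset{\frown}{\mathbf{R}}_4
```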
\( {\mathbf{D}}_{GR_1} \) is obtained from the difference between the interpolated green samples and original red samples. \( {\mathbf{D}}_{GR_4} \) is calculated from the interpolated green and red samples. We can expect the variance of \( {\mathbf{D}}_{GR_4} \) to be smaller than that of \( {\mathbf{D}}_{GR_1} \). Similarly, we construct two difference images using green and blue components as follows:
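Consistent with the description that follows, the green-blue difference images are presumably:

```latex
\mathbf{D}_{GB_1} = \overset{\frown}{\mathbf{G}}_1 - \overset{\frown}{\mathbf{B}}_1, \qquad
\mathbf{D}_{GB_4} = \overset{\frown}{\mathbf{G}}_4 - \mathbf{B}_4
```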
where \( {\mathbf{D}}_{GB_1} \) and \( {\mathbf{D}}_{GB_4} \) are the color difference images (green minus blue) for the RGGB Bayer CFA pattern. As shown in (10), \( {\mathbf{D}}_{GB_1} \) is calculated by using interpolated green and blue samples. \( {\mathbf{D}}_{GB_4} \) is the difference between the interpolated green samples and original blue samples. Similar to the case of the differences between the green and red samples, we can assume that the variance of \( {\mathbf{D}}_{GB_1} \) is smaller than that of \( {\mathbf{D}}_{GB_4} \).
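Assuming a demosaiced RGB image whose RGGB position 1 is the top-left (even, even) sample of each 2 × 2 cell and position 4 is the bottom-right (odd, odd) sample, the four difference images can be built by sub-sampling. This is a sketch; the function name is ours.

```python
import numpy as np

def color_difference_images(R, G, B):
    """Four colour-difference images from demosaiced channels (RGGB layout)."""
    D_GR1 = G[0::2, 0::2] - R[0::2, 0::2]   # interpolated G minus original R
    D_GR4 = G[1::2, 1::2] - R[1::2, 1::2]   # interpolated G minus interpolated R
    D_GB1 = G[0::2, 0::2] - B[0::2, 0::2]   # interpolated G minus interpolated B
    D_GB4 = G[1::2, 1::2] - B[1::2, 1::2]   # interpolated G minus original B
    return D_GR1, D_GR4, D_GB1, D_GB4
```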
Let DWT(X) be the wavelet transform for the image, X, that produces four subbands. Let D_{ CD }(H) be the color difference image with hue shift angle H, where CD ∈ {GR_{1}, GR_{4}, GB_{1}, GB_{4}}. We obtain four wavelet subband images as follows:
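The elided transform equation, reconstructed from the subband names used below, is presumably:

```latex
\left\{ \mathbf{W}_{CD}^{LL}(H),\, \mathbf{W}_{CD}^{LH}(H),\,
\mathbf{W}_{CD}^{HL}(H),\, \mathbf{W}_{CD}^{HH}(H) \right\}
= \mathrm{DWT}\!\left( \mathbf{D}_{CD}(H) \right)
```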
where \( {\mathbf{W}}_{CD}^{LL}(H) \), \( {\mathbf{W}}_{CD}^{LH}(H) \), \( {\mathbf{W}}_{CD}^{HL}(H) \), and \( {\mathbf{W}}_{CD}^{HH}(H) \) represent four subbands with a hue shift angle H in the wavelet domain. In the wavelet transform domain, the highest frequency components exist in the finest frequency subband in the diagonal direction. Therefore, the variances of the four difference images in the finest diagonal subband serve as a good measurement for detecting color modification. In this paper, we propose two variance ratio measurements using the variances in the diagonal subbands of the color difference images as follows:
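From the description of the numerators and denominators that follows, the two ratios are presumably:

```latex
V_{GR}(H) = \frac{\sigma^2\!\left(\mathbf{W}_{GR_4}^{HH}(H)\right)}
{\sigma^2\!\left(\mathbf{W}_{GR_1}^{HH}(H)\right)}, \qquad
V_{GB}(H) = \frac{\sigma^2\!\left(\mathbf{W}_{GB_1}^{HH}(H)\right)}
{\sigma^2\!\left(\mathbf{W}_{GB_4}^{HH}(H)\right)}
```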
where V_{ GR }(H) and V_{ GB }(H) are the variance ratios for a given hue angle, H. The denominators in (13) and (14) are obtained from the differences between the filtered green samples and original red or blue samples, respectively. The numerators in (13) and (14) are calculated from the differences between the two interpolated color samples. Therefore, we can expect that the variance ratios will have a smaller value for the original sample. If the image is modified through hue shifting, the minimum value will occur at the position of the shifted hue angle.
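As a sketch of the measurement, the HH subband of a one-level Haar transform can be computed directly and the variances ratioed. The paper does not specify the wavelet; Haar is our choice for illustration, and the function names are ours.

```python
import numpy as np

def haar_hh(x):
    """HH (diagonal detail) subband of a one-level 2-D Haar transform."""
    x = np.asarray(x, dtype=float)
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    return (a - b - c + d) / 2.0

def variance_ratio(num_img, den_img):
    """sigma^2(HH of numerator image) / sigma^2(HH of denominator image)."""
    return np.var(haar_hh(num_img)) / np.var(haar_hh(den_img))
```

Because the HH subband of a constant (background) region is zero, the ratio is driven by edge regions, which is exactly where original and interpolated pixels differ most.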
Finally, we define the estimation measurement for detecting color modification, V_{ E }(H) as follows:
To estimate a changed hue angle, the hue of a suspicious image is changed from 0° to 359° in increments of a predefined interval factor. For each H, V_{ E }(H) is calculated. H_{min} is the hue angle where V_{ E }(H) yields the minimum value, which is written as follows:
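The elided expression is presumably:

```latex
H_{\min} = \underset{H}{\arg\min}\; V_E(H)
```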
Figure 4 presents the graphs of V_{ E }(H) for the hue-shifted images, with the shift estimated as H_{ F } = 360° − H_{min}. The test conditions are the same as in Fig. 3. As shown in Fig. 4, the estimation errors obtained by the proposed method are smaller than those obtained by the AIVC algorithm. The average estimation error of the proposed method for the eight sample images is 0.25°, whereas that of the AIVC algorithm is 12.75°, as shown in Fig. 3. However, we notice that the minimum value occurs twice, at H_{min} and H_{min} + 180°, in our method. This is because our algorithm is based on the variance of color difference values. The absolute values of the color differences are periodic because the blue and red color values are shifted from the green value by 120° and 240°, respectively, as shown in Fig. 5. This periodicity yields two candidate angles. Because the minimum value occurs at both H_{min} and H_{min} + 180°, we need to trace only half of the total angle range, cutting the estimation time in half. However, an additional decision process is required to determine which of the two angles is the real shifted hue angle.
In this paper, we introduce the following refinement process to resolve the situation of having two minimum values. The hue value of a suspicious image is changed from 0° to 179° in increments of the predefined interval factor. The hue angle having the minimum V_{ E }(H) value is obtained by changing the hue angle from 0° to 179° as follows:
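The elided refinement equation is presumably:

```latex
H_r = \underset{0^{\circ}\le H \le 179^{\circ}}{\arg\min}\; V_E(H)
```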
where H_{ r } is the estimated hue angle. We know that H_{ r } + 180° is also a minimum angle. To select between them, we use the AIVC algorithm: of the two candidates, the angle H_{ F } with the smaller of R(H_{ r }) and R(H_{ r } + 180°) is taken as the real shifted hue angle. Thus, we determine the final shifted hue angle by applying the AIVC algorithm to only the two angles obtained by the proposed method.
The proposed algorithm is summarized in Table 1. As in existing methods, we determine the size of a square image block to estimate the level of color modification and set the hue angle to zero (H = 0). We choose the square image block located at the center of the image and construct the four color difference images \( {\mathbf{D}}_{GR_1} \), \( {\mathbf{D}}_{GR_4} \), \( {\mathbf{D}}_{GB_1} \), and \( {\mathbf{D}}_{GB_4} \). For these four difference images, we perform a discrete wavelet transform and compute the variances of the corresponding HH subbands. We then calculate the two variance ratios using (13) and (14). Next, the estimation measurement V_{ E }(H) for detecting color modification is obtained using (15). V_{ E }(H) is calculated by repeating the same process until H reaches 360°. After obtaining H_{ r } using (17), we estimate the modified hue angle H_{ F } using the AIVC algorithm.
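The overall loop can be sketched as follows. Everything here is illustrative: we assume, for the sketch, that V_E is the sum of the two variance ratios (the paper's exact combination in (15) may differ), that a caller-supplied `rotate_hue` performs the hue shift, and that the final AIVC disambiguation between H_r and H_r + 180° is only noted in a comment.

```python
import numpy as np

def haar_hh(x):
    # Diagonal (HH) detail subband of a one-level 2-D Haar transform
    a, b = x[0::2, 0::2], x[0::2, 1::2]
    c, d = x[1::2, 0::2], x[1::2, 1::2]
    return (a - b - c + d) / 2.0

def v_e(R, G, B):
    # Four RGGB colour-difference images: position 1 = (even, even),
    # position 4 = (odd, odd) samples of each 2x2 cell.
    d_gr1 = G[0::2, 0::2] - R[0::2, 0::2]   # interp. G - original R
    d_gr4 = G[1::2, 1::2] - R[1::2, 1::2]   # interp. G - interp. R
    d_gb1 = G[0::2, 0::2] - B[0::2, 0::2]   # interp. G - interp. B
    d_gb4 = G[1::2, 1::2] - B[1::2, 1::2]   # interp. G - original B
    v_gr = np.var(haar_hh(d_gr4)) / np.var(haar_hh(d_gr1))
    v_gb = np.var(haar_hh(d_gb1)) / np.var(haar_hh(d_gb4))
    return v_gr + v_gb   # assumed combination; the paper's eq. (15) may differ

def estimate_shift(image, rotate_hue, step=5):
    # Scan half the circle only, since V_E is 180-degree periodic.
    angles = range(0, 180, step)
    scores = [v_e(*np.moveaxis(rotate_hue(image, a), -1, 0)) for a in angles]
    h_r = angles[int(np.argmin(scores))]
    # An AIVC-style check would choose between h_r and h_r + 180 here.
    return (360 - h_r) % 360
```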
Simulation results
Test image sets and simulation conditions
In this paper, we used 1460 raw images provided by the Dresden Image Database [28] to evaluate the performance of the proposed method. The camera types and detailed information regarding the images used in our experiments are contained in Table 2. CFA interpolations were performed using RawTherapee [25], which is a well-known cross-platform raw image processing program. Seven CFA interpolation algorithms were used in our experiments: bilinear interpolation, AHD interpolation [22], AMaZE interpolation [24], DCB demosaicing [29], linear minimum mean square error (LMMSE) demosaicing [30], the VNG algorithm [23], and heterogeneity-projection hard-decision (HPHD) color interpolation [31].
To estimate the shifted hue angles, the center regions of the sample images were cropped into square blocks of various sizes, such as 512 × 512, 256 × 256, 128 × 128, 64 × 64, and 32 × 32. The hue of the sample images was shifted from 0° to 359° in 45° increments to generate a test image set. In total, 408,800 (1460 × 5 × 8 × 7) test images were generated in our simulations. The mean values of estimation errors, as well as standard deviations, were calculated to evaluate the estimation performance for altered hue angles.
Results of color modification detection
Table 3 shows the hue shift estimation performance for the proposed and conventional methods. The mean error values were drawn from the results of the seven demosaicing algorithms, which used a 256 × 256 centered block. As shown in Table 3, the mean values of the estimation errors obtained by the proposed algorithm are slightly larger than those of the AIVC method in the case of bilinear interpolation. However, our estimation method is superior to the AIVC algorithm for the AHD, AMaZE, DCB, LMMSE, VNG, and HPHD interpolations. The averaged mean error values achieved by the proposed method for the shifted hue angles ranged from 8.71° (for 0°) to 8.83° (for both 45° and 135°). In contrast, the averaged mean error values obtained by the AIVC method ranged from 14.87° (for 0°) to 19.56° (for 225°). These results indicate that the AIVC method, based on intermediate value counting, is only suitable for bilinear interpolation. In summary, our method achieves better estimation performance than the AIVC method across various demosaicing methods and altered hue angles. In particular, the estimation errors of our algorithm are nearly independent of the altered hue angle. The AIVC algorithm exhibits superior estimation performance only when no forgery has been applied.
Table 4 contains the color forgery detection performance results for the proposed and conventional methods. For a shifted hue angle of 45°, the mean error values are calculated based on the various cropped center block sizes. As shown in Table 4, the averaged mean error values of the proposed method for the various cropped center block sizes ranged from 7.98° (for a 512 × 512 block) to 18.59° (for a 32 × 32 block). The averaged mean error values obtained by the AIVC method ranged from 16.81° (for a 512 × 512 block) to 29.97° (for a 32 × 32 block). The size of the cropped center block can affect both estimation performance and computational cost: a larger block size can provide better estimation performance but incurs a higher computation time.
Mean error value comparisons between our method and the AIVC algorithm for various shifted hue angles are presented in Table 5. The error values were averaged according to the cropped center block sizes. As shown in Table 5, our method achieves superior estimation performance compared to the AIVC method, except for bilinear interpolation. Table 6 shows the mean error value comparisons between our method and the AIVC algorithm for various block sizes. The error values were averaged over all shifted angles. As shown in Table 6, our color change detection algorithm is superior to the conventional AIVC method, except for the three bilinear interpolation cases with center block sizes of 128 × 128 or larger.
In general, images are captured and then processed through an image pipeline that includes white balancing, gamma correction, and noise reduction. In this paper, we also tested our method on white-balanced and gamma-corrected images. For white balancing, we exploited the "white patch" algorithm [32]. We used the "lin2rgb" function in Matlab to obtain gamma-corrected images. Table 7 shows the mean error value comparisons between our method and the AIVC algorithm after white balancing and gamma correction for various block sizes. As shown in Table 7, white balancing and gamma correction have some influence on color forgery detection. Unlike in Table 6, the AIVC method shows better detection performance than the proposed algorithm for bilinear interpolation and the LMMSE and VNG demosaicing methods. The AIVC algorithm is positively affected, while the proposed method is negatively affected. We consider that white balancing and gamma correction alter the variance; because our algorithm is variance-based, its performance degrades. However, the proposed method still achieves better detection results for the other demosaicing algorithms.
Forged color region detection
For this paper, we generated four forged color images using Adobe Photoshop CS4 (64-bit), as shown in Fig. 6. The castle in "sample 1" was altered with a 180° hue shift, and the poster on the door in "sample 2" was shifted by −90°. In "sample 3", two regions (the fish tank and the pictures on the wall) were altered by −45°. Finally, two countries on the globe in "sample 4" were shifted by −45° and 90°. The test sample images were sliced into 32 × 32 blocks, and the test was performed by shifting the hue angle in 5° increments. Figure 7 presents the forged color region detection results for the seven demosaicing methods. For all demosaicing methods, our proposed method outperforms the conventional AIVC algorithm. As shown in Fig. 7, it is difficult to estimate color shift angles or detect a forged region under the AMaZE, LMMSE, VNG, and HPHD demosaicing methods. This indicates that forged color detection still requires significant research.
Discussion
The proposed algorithm achieves reasonable accuracy in estimating the level of color modification. Its results are superior to those obtained using the conventional method for all demosaicing algorithms except bilinear interpolation. In addition, the proposed approach can be applied to localize forged color regions. However, based on previous results and our own, the area of color modification detection still requires considerable research.
Conclusions
In this paper, we presented a new color modification detection algorithm using the variance ratio of color difference images in the wavelet domain. We generated a color difference model to emphasize the differences between original pixels and interpolated pixels. We also introduced a variance ratio measurement using the highest frequency subband in the wavelet domain to quantify the level of hue angle modification. Experimental results demonstrated that the proposed method generates accurate estimation results for detecting color modification. Compared to the conventional method, our method provides superior color modification detection performance. In addition, we verified that modified color regions could be efficiently detected using the proposed algorithm.
Abbreviations
AHD: Adaptive homogeneity-directed

AIVC: Advanced intermediate value counting

AMaZE: Aliasing minimization and zipper elimination

CFA: Color filter array

H: Hue

HPHD: Heterogeneity-projection hard-decision

I: Intensity

LMMSE: Linear minimum mean square error

S: Saturation

VNG: Variable number of gradients
References
 1.
H Farid, A survey of image forgery detection. IEEE Signal Process. Mag. 26(2), 16–25 (2009)
 2.
B Mahdian, S Saic, A bibliography on blind methods for identifying image forgery. Signal Process. Image Commun. 25(6), 389–399 (2010)
 3.
X Bi, CM Pun, Fast reflective offset-guided searching method for copy-move forgery detection. Inf. Sci. 418–419, 531–545 (2017)
 4.
D Cozzolino, G Poggi, L Verdoliva, Efficient densefield copy–move forgery detection. IEEE Trans. Inf. Forensics Secur. 10, 2284–2297 (2015)
 5.
R Davarzani, K Yaghmaie, S Mozaffari, M Tapak, Copy-move forgery detection using multiresolution local binary patterns. Forensic Sci. Int. 231, 61–72 (2013)
 6.
X Zhao, S Wang, S Li, J Li, Passive image-splicing detection by a 2D noncausal Markov model. IEEE Trans. Circuits Syst. Video Technol. 25(2), 185–199 (2015)
 7.
Z He, W Lu, W Sun, J Huang, Digital image splicing detection based on Markov features in DCT and DWT domain. Pattern Recogn. 45(12), 4292–4299 (2012)
 8.
JG Han, TH Park, YH Moon, IK Eom, Efficient Markov feature extraction method for image splicing detection using maximization and threshold expansion. J. Electron. Imaging. 25(2), 023031 (2016)
 9.
AC Popescu, H Farid, Exposing digital forgeries by detecting traces of resampling. IEEE Trans. Signal Process. 53(2), 758–767 (2005)
 10.
B Mahdian, S Saic, Blind authentication using periodic properties of interpolation. IEEE Trans. Inf. Forensics Secur. 3(3), 529–538 (2008)
 11.
W Wei, S Wang, Estimation of image rotation angle using interpolation-related spectral signatures with application to blind detection of image forgery. IEEE Trans. Inf. Forensics Secur. 5(3), 507–517 (2010)
 12.
G Liu, J Wang, S Lian, Y Dai, Detect image splicing with artificial blurred boundary. Math. Comput. Model. 57, 2647–2659 (2013)
 13.
G Cao, Y Zhao, R Ni, Edge-based blur metric for tamper detection. J. Inf. Hiding Multimedia Signal Process. 1(1), 20–27 (2010)
 14.
G Cao, Y Zhao, R Ni, X Li, Contrast enhancement-based forensics in digital images. IEEE Trans. Inf. Forensics Secur. 9(3), 515–525 (2014)
 15.
CH Choi, HY Lee, HK Lee, Estimation of color modification in digital images by CFA pattern changes. Forensic Sci. Int. 226, 94–105 (2013)
 16.
CH Choi, JH Choi, HK Lee, CFA pattern identification of digital cameras using intermediate value counting, Proceedings of ACM Workshop in Multimedia and Security (2011), pp. 21–26
 17.
DR Cok, Signal processing method and apparatus for producing interpolated chrominance values in a sampled color image signal. US Patent 4642678, (1987).
 18.
WT Freeman, Median filter for reconstructing missing color samples. US Patent 4724395, (1988)
 19.
CA Laroche, MA Prescott, Apparatus and method for adaptively interpolating a full color image utilizing chrominance gradients. US Patent 5373322, (1994)
 20.
JF Hamilton, JE Adams, Adaptive color plan interpolation in single sensor color electronic camera. US Patent 5629734, (1997)
 21.
BE Bayer, Color imaging array. US Patent 3971065, (1976)
 22.
K Hirakawa, TW Parks, Adaptive homogeneity-directed demosaicing algorithm. IEEE Trans. Image Process. 14(3), 360–369 (2005)
 23.
E Chang, S Cheung, DY Pan, Color filter array recovery using a threshold-based variable number of gradients, Proceedings of SPIE, Sensors, Cameras, and Applications for Digital Photography (1999), pp. 36–43
 24.
E Martinec, P Lee, AMAZE demosaicing algorithm. http://www.rawtherapee.com/. Accessed 15 June 2017
 25.
G Horvath, http://www.rawtherapee.com/. Accessed 22 Apr 2017.
 26.
HS Kim, SS Kim, IK Eom, Wavelet-domain demosaicking using linear estimation of interchannel correlation. Opt. Eng. 47(6), 067002 (2008)
 27.
HJ Shin, JJ Jeon, IK Eom, Color filter array pattern identification using variance of color difference image. J. Electron. Imaging. 26(4), 0243303015 (2017)
 28.
T Gloe, R Bohme, The ‘Dresden Image Database’ for benchmarking digital image forensics, Proceedings of the 25th Symposium on Applied Computing (2010), pp. 1585–1591
 29.
J Gozd, DCB demosaicing algorithm. http://www.linuxphoto.org/html/dcb.html. Accessed 15 June 2017
 30.
L Zhang, X Wu, Color demosaicking via directional linear minimum mean square-error estimation. IEEE Trans. Image Process. 14(12), 2167–2178 (2005)
 31.
CY Tsai, KT Song, Heterogeneity-projection hard-decision color interpolation using spectral-spatial correlation. IEEE Trans. Image Process. 16(11), 78–91 (2007)
 32.
EH Land, JJ McCann, Lightness and retinex theory. J. Opt. Soc. Am. 61(1), 1–11 (1971)
Funding
This research was supported by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2015R1D1A3A01019561), and by the BK21 PLUS Creative Human Resource Development Program for IT Convergence.
Availability of data and materials
Please contact the author for data requests.
Author information
Affiliations
Contributions
JJJ proposed the framework of this work, carried out the whole experiments, and drafted the manuscript. IKE initiated the main algorithm of this work, supervised the whole work, and wrote the final manuscript. Both authors read and approved the final manuscript.
Corresponding author
Correspondence to Il Kyu Eom.
Ethics declarations
Competing interests
Both authors declare that they have no competing interests.
Publisher’s Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.
Keywords
 Color modification
 Image forgery
 Color difference
 Demosaicing
 Variance ratio
 Wavelet transform
 Color filter array