
Wavelet-based color modification detection based on variance ratio

Abstract

Color modification is one of the popular image forgery techniques. It can be used to eliminate criminal evidence in various ways, such as modifying the color of a car used in a crime. If the color of a digital image is modified, the locations of the interpolated and original samples may be changed. Because the original and interpolated pixels have different statistical characteristics, these differences can serve as a basic clue for estimating the degree of color modification. It is assumed that the variance of original samples is greater than that of the interpolated samples. Therefore, we present a novel algorithm for color modification estimation using the variance ratio of color difference images in the wavelet domain. The color difference model is used to emphasize the differences between the original and interpolated samples. For color difference images, we execute a wavelet transform and use the highest frequency subband to calculate variances. We define a variance ratio measurement to quantify the level of color modification. Additionally, changed color local regions can be efficiently detected using the proposed algorithm. Experimental results demonstrate that the proposed method generates accurate estimation results for detecting color modification. Compared to the conventional method, our method provides superior color modification detection performance.

1 Introduction

Using image forgery techniques requires minimal expertise because digitized images are easily replicated or manipulated. Furthermore, it is difficult to verify the authenticity of images using only the human eye. Therefore, developing reliable image forgery detection methods to determine the authenticity of images has become an important issue [1, 2]. A trace of image manipulations can be used as a clue for detecting altered images. If we can uncover evidence indicating image alterations, we can conclude that an image has been forged. There have been several studies on detecting various image forgery techniques, such as copy-move [3,4,5], image splicing [6,7,8], scaling [9, 10], rotation [11], blurring [12, 13], contrast change [14], and color modification [15].

Color modification is one of the commonly used image forgery techniques. It is often exploited to obfuscate a person by changing their face color, eliminate criminal evidence by modifying the color of a car used in a crime, or mislead customers by changing the color of a product. Forensic approaches that can detect color modification have not been extensively studied. In 2013, Choi et al. introduced a basic color modification detection method [15]. They defined a color modification attack as a change in the ratio between red, blue, and green channels. According to this definition, brightness adjustment, which modifies the luminance of an image, is not a color modification attack. However, hue and white balance adjustments are included in this definition. They designed a color modification detection algorithm based on the fact that color filter array (CFA) patterns change if the color of a digital image is modified. They achieved good color modification detection performance using an advanced intermediate value counting (AIVC) algorithm [16]. However, their algorithm is only valid for intra-channel demosaicing methods, such as bilinear and bicubic interpolations. Because many demosaicing approaches tend to emphasize high-frequency components by using the correlations between color channels [17,18,19,20], the color change detection accuracy of Choi’s algorithm may be reduced.

In this paper, we present a novel color modification detection method using the variance ratio in the wavelet domain. For a suspicious image, we first decompose the image into four sub-images. The decomposition process is performed in an even-odd manner in the vertical and horizontal directions. Next, we construct four color difference images in the form of a Bayer CFA pattern: interpolated green minus original red, interpolated green minus interpolated red, interpolated green minus original blue, and interpolated green minus interpolated blue. For the color difference images, we execute a wavelet transform to extract high-frequency components and then use the highest frequency subband to calculate variances. We define a variance ratio measurement to quantify the level of color modification. Through various experiments, we demonstrate that the proposed method generates accurate estimation results for detecting color modification. Compared to the conventional method, our method achieves superior results.

The remainder of this paper is organized as follows: Section 2 describes the color modification detection method based on CFA pattern changes. The proposed color modification detection method and localization algorithm for changed color regions are presented in Section 3. Section 4 reports the experimental results obtained using the proposed approach, and Section 5 summarizes our conclusions.

2 Estimation of color modification

2.1 Color sample changes through color modification

Most commercial image acquisition devices, such as digital cameras, use an inexpensive CFA with a specific shape to acquire color images. The CFA is placed over the image sensor in a digital camera. Each pixel uses only one color from the available choices of red, green, and blue because the image sensor can only measure one color per pixel. Each color channel has missing pixels and requires interpolation. The process of estimating missing pixels is referred to as demosaicing. Figure 1a presents an example of demosaicing using the RGGB Bayer pattern [21]. As shown in the left-most 2 × 2 square in Fig. 1a, the red value in the top-left position (R1), green values in the top-right position (G2 and G3), and blue value in the bottom-right position (B4) are all original values. The remaining three red, two green, and three blue values are interpolated. In the RGGB Bayer pattern, demosaicing is the interpolation process used to estimate \( \left\{{\overset{\frown }{\mathbf{R}}}_2,{\overset{\frown }{\mathbf{R}}}_3,{\overset{\frown }{\mathbf{R}}}_4,{\overset{\frown }{\mathbf{G}}}_1,{\overset{\frown }{\mathbf{G}}}_4,{\overset{\frown }{\mathbf{B}}}_1,{\overset{\frown }{\mathbf{B}}}_2,{\overset{\frown }{\mathbf{B}}}_3\right\} \) from the original {R1, G2, G3, B4} pattern.
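The even/odd sub-image split implied by the Bayer layout above can be sketched in NumPy. The position indices 1-4 follow the 2 × 2 arrangement in Fig. 1; the dictionary keys are our own naming, not notation from the paper:

```python
import numpy as np

def bayer_subimages(channel):
    """Split a full-resolution channel into its four Bayer-position
    sub-images (positions 1-4 of the 2x2 cell: even/odd rows and columns)."""
    return {1: channel[0::2, 0::2],   # top-left (e.g., original R in RGGB)
            2: channel[0::2, 1::2],   # top-right
            3: channel[1::2, 0::2],   # bottom-left
            4: channel[1::2, 1::2]}   # bottom-right (e.g., original B in RGGB)
```

For an RGGB image, position 1 of the red channel holds original samples while positions 2-4 hold interpolated ones.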

Fig. 1 An example of color sample changes in the demosaicing process. a Original color samples. b Modified color samples through color change

If the colors of an image have been modified, we can observe that the locations of the interpolated and original pixels are changed. Figure 1b presents the color location changes resulting from modifying the hue angle by 120°. In this case, red is changed to green, green is changed to blue, and blue is changed to red. The original R1 in the top-left position is changed to an interpolated \( {\overset{\frown }{\mathbf{R}}}_1 \), while the interpolated \( {\overset{\frown }{\mathbf{R}}}_4 \) in the bottom-right position is changed to the original R4. The other color channels follow a similar pattern. These changed characteristics of color samples are important clues for detecting altered images.

2.2 Conventional color change detection method

In general, an image generated by the demosaicing process is represented in the RGB color space. Because color modification is related to color information, rather than intensity and saturation, handling images in the hue (H), saturation (S), and intensity (I) color space (HSI color space) is more efficient than handling images in the RGB color space. The hue value H can be obtained by using R, G, and B as follows:

$$ H={\tan}^{-1}\left(\frac{\sqrt{3}\left(G-B\right)}{2R-G-B}\right). $$
(1)

H represents color information as an angle in the range 0° ≤ H < 360°. Figure 2 illustrates the relationship between RGB values and hue angles. As shown in Fig. 2, RGB values change periodically according to hue changes. If the color of an image is modified, then the hue angle changes. Conversely, a hue change caused by color modification results in an RGB value change.
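Equation (1) can be evaluated directly in NumPy. A minimal sketch follows; we use atan2 rather than a plain arctangent so the result covers the full 0°-360° range (an implementation choice, not stated in the paper):

```python
import numpy as np

def hue_angle(r, g, b):
    """Hue angle in degrees per Eq. (1); atan2 resolves the quadrant so the
    result spans [0, 360). Works element-wise on arrays or scalars."""
    h = np.degrees(np.arctan2(np.sqrt(3.0) * (g - b), 2.0 * r - g - b))
    return h % 360.0
```

For pure red this gives 0°, pure green 120°, and pure blue 240°, matching the periodic pattern of Fig. 2.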

Fig. 2 The relationship between RGB values and hue angles

The conventional algorithm for estimating color modification [15] exploits the color sample changes caused by color modification. This method is based on the idea that the trace of a CFA pattern, which is closely related to color information, changes when the color of an image is modified. If a pixel is interpolated, then its value exists between maximum and minimum values within a pre-defined window with a very high probability. For a given green sample value G(i, j) at pixel location (i, j), the maximum value (\( {G}_{\left(i,j\right)}^{\mathrm{max}} \)) and minimum value (\( {G}_{\left(i,j\right)}^{\mathrm{min}} \)) within the pre-defined window surrounding G(i, j) are obtained as follows:

$$ {G}_{\left(i,j\right)}^{\mathrm{max}}=\underset{G\left(k,l\right)\in W\left(i,j\right)}{\max }G\left(k,l\right), $$
(2)
$$ {G}_{\left(i,j\right)}^{\mathrm{min}}=\underset{G\left(k,l\right)\in W\left(i,j\right)}{\min }G\left(k,l\right), $$
(3)

where W(i, j) is the pre-defined window surrounding G(i, j). If \( {G}_{\left(i,j\right)}^{\mathrm{min}}<G\left(i,j\right)<{G}_{\left(i,j\right)}^{\mathrm{max}} \), then G(i, j) can be considered as an interpolated green sample. In this case, the count value for interpolated samples is incremented by one. Otherwise, G(i, j) is very likely to be an original sample. In this case, the count value for original samples is incremented by one.

For a given hue angle H, let \( N_i(H) \) be the accumulated total count of interpolated green pixels in the image, and let \( N_o(H) \) be the accumulated total count of original green pixels. That is, for a given H,

$$ \left\{\begin{array}{cc}{N}_i(H)\leftarrow {N}_i(H)+1,& {G}_{\left(i,j\right)}^{\mathrm{min}}<G\left(i,j\right)<{G}_{\left(i,j\right)}^{\mathrm{max}}\\ {}{N}_o(H)\leftarrow {N}_o(H)+1,& \mathrm{otherwise}\end{array}\right.. $$
(4)

To normalize the total count values, the ratio R(H) between them is defined as follows: for a given H,

$$ R(H)=\frac{N_i(H)}{N_o(H)}. $$
(5)
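The counting in Eqs. (2)-(5) can be sketched with pure NumPy. The 3 × 3 window and edge padding are our assumed choices; the paper only says the window is pre-defined:

```python
import numpy as np

def intermediate_value_ratio(green):
    """Count green samples lying strictly between the local minimum and
    maximum of their 3x3 neighbourhood (likely interpolated, Eqs. (2)-(4))
    versus the rest (likely original), and return the ratio of Eq. (5).
    Window size and edge padding are assumptions."""
    h, w = green.shape
    g = np.pad(green.astype(float), 1, mode='edge')
    # stack the nine shifted views covering each pixel's 3x3 neighbourhood
    stack = np.stack([g[i:i + h, j:j + w] for i in range(3) for j in range(3)])
    gmax, gmin = stack.max(axis=0), stack.min(axis=0)
    interp = (green > gmin) & (green < gmax)   # G_min < G(i,j) < G_max
    n_i = int(interp.sum())                    # interpolated count N_i
    n_o = interp.size - n_i                    # original count N_o
    return n_i / max(n_o, 1)                   # R(H) = N_i / N_o
```

A flat image yields a ratio of 0 (no pixel is strictly between its neighbourhood extremes), while smooth gradients yield high ratios, consistent with the interpolated-sample assumption.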

If a green sample is original, then \( N_i(H) \) is small and \( N_o(H) \) is large. By rotating the hue angle from 0° to 359°, we can find the hue angle \( H_{\mathrm{min}} \) for which R(H) is minimized as follows:

$$ {H}_{\mathrm{min}}=\underset{{0}^{\circ}\le H\le {359}^{\circ }}{\arg \min }R(H). $$
(6)

The shifted hue angle is estimated as follows:

$$ {H}_F=360-{H}_{\mathrm{min}}, $$
(7)

where \( H_F \) is the altered hue angle. Let us assume that the modified hue angle is 30°. To estimate the changed hue angle, we find the angle with the minimum R(H) by rotating the angle from 0° to 359°. Assuming an accurate estimate, the angle with the minimum R(H) is \( H_{\mathrm{min}} \) = 330°. Therefore, the shifted hue angle is estimated as \( H_F \) = 360° − 330° = 30°. If an image has not been changed, then \( H_F \) = 0°.

Figure 3 depicts the R(H) values of an original image (the shifted hue is 0°) and a forged image (the shifted hue is 90°) as functions of the hue angle. The estimated values of \( H_F \) for various demosaicing methods are presented in Fig. 3. We tested four demosaicing methods: bilinear interpolation, the adaptive homogeneity-directed (AHD) method [22], the variable number of gradients (VNG) algorithm [23], and aliasing minimization and zipper elimination (AMaZE) [24]. The demosaiced images were obtained using the RawTherapee tool [25]. As shown in Fig. 3, the estimation errors are small for bilinear interpolation, which exploits only the information of its own color channel to interpolate missing color values; this matches the AIVC method's test of whether a pixel lies inside the range between the minimum and maximum neighboring values.

Fig. 3 Estimated shifted angles from various demosaicing methods when hue angles are shifted by 0° and 90°

Intra-channel demosaicing schemes, such as bilinear or bicubic interpolation, have many drawbacks, including fringe effects, false colors, and zipper effects. Therefore, the majority of demosaicing algorithms exploit color channel correlations, because the high-frequency information in the color channels of natural images is highly cross-correlated. The demosaicing results of these techniques contain more high-frequency components than those of bilinear interpolation. Therefore, we can expect the estimation error to increase for demosaicing algorithms that use inter-channel and color difference information. As shown in Fig. 3, the estimation errors for a 90° hue shift are 6°, 5°, 28°, and 16° for bilinear, AHD, VNG, and AMaZE, respectively. For this reason, we introduce a novel color forgery estimation algorithm based on a color difference model that remains effective for inter-channel demosaicing methods.

3 Proposed method

Basically, an interpolation kernel has the characteristics of a low-pass filter. Therefore, it can be easily deduced that the variance of an original block is larger than that of an interpolated block. Consequently, in edge regions, the original and interpolated pixels have different characteristics, whereas in background regions they have similar characteristics. Therefore, background regions can adversely influence the estimation of color modification. Conventional methods employ an inefficient estimation process because they use the same block regardless of the characteristics of the image. To remove the adverse influence of background regions, we exploit the wavelet transform. Furthermore, to emphasize the effect of high-frequency components, we use the highest frequency subband (HH) in the wavelet domain.
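The low-pass premise is easy to check numerically. The toy experiment below (our own illustration, with a simple two-tap averaging interpolator standing in for a demosaicing kernel) shows that interpolated samples have smaller variance than the originals:

```python
import numpy as np

# Toy check of the premise: an interpolator acts as a low-pass filter,
# so interpolated samples have smaller variance than original samples.
rng = np.random.default_rng(1)
original = rng.normal(size=10_000)                  # stand-in "original" samples
interpolated = (original[:-1] + original[1:]) / 2   # two-tap averaging interpolator
assert interpolated.var() < original.var()          # variance roughly halves
```

For independent samples, averaging two neighbors halves the variance, which is the statistical gap the proposed measurement exploits.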

Many demosaicing algorithms employ color differences and attempt to enhance edges. Accordingly, common demosaicing approaches use spectral and spatial correlation for estimating missing pixels using neighboring color pixels [26]. Therefore, the majority of color interpolation algorithms are based on the differences between the original sample and filtered sample or between two interpolated samples. Based on this fact, we introduce a novel estimation algorithm for color change detection using color difference images in the wavelet domain. Recently, the variance measurement of the color difference model has been successfully exploited in Bayer CFA pattern identification [27].

Let \( {\mathbf{D}}_{GR_1} \) and \( {\mathbf{D}}_{GR_4} \) be the color difference images (green minus red) at positions 1 and 4 of the RGGB Bayer CFA pattern, respectively. That is,

$$ {\mathbf{D}}_{GR_1}={\overset{\frown }{\mathbf{G}}}_{\mathbf{1}}-{\mathbf{R}}_{\mathbf{1}}, $$
(8)
$$ {\mathbf{D}}_{GR_4}={\overset{\frown }{\mathbf{G}}}_4-{\overset{\frown }{\mathbf{R}}}_4. $$
(9)

\( {\mathbf{D}}_{GR_1} \) is obtained from the difference between the interpolated green samples and original red samples. \( {\mathbf{D}}_{GR_4} \) is calculated from the interpolated green and red samples. We can expect the variance of \( {\mathbf{D}}_{GR_4} \) to be smaller than that of \( {\mathbf{D}}_{GR_1} \). Similarly, we construct two difference images using green and blue components as follows:

$$ {\mathbf{D}}_{GB_1}={\overset{\frown }{\mathbf{G}}}_1-{\overset{\frown }{\mathbf{B}}}_1, $$
(10)
$$ {\mathbf{D}}_{GB_4}={\overset{\frown }{\mathbf{G}}}_4-{\mathbf{B}}_4, $$
(11)

where \( {\mathbf{D}}_{GB_1} \) and \( {\mathbf{D}}_{GB_4} \) are the color difference images (green minus blue) for the RGGB Bayer CFA pattern. As shown in (10), \( {\mathbf{D}}_{GB_1} \) is calculated by using interpolated green and blue samples. \( {\mathbf{D}}_{GB_4} \) is the difference between the interpolated green samples and original blue samples. Similar to the case of the differences between the green and red samples, we can assume that the variance of \( {\mathbf{D}}_{GB_1} \) is smaller than that of \( {\mathbf{D}}_{GB_4} \).

Let DWT(X) be the wavelet transform of an image X, which produces four subbands. Let \( {\mathbf{D}}_{CD}(H) \) be a color difference image with hue shift angle H, where \( CD\in \left\{{GR}_1,{GR}_4,{GB}_1,{GB}_4\right\} \). We obtain four wavelet subband images as follows:

$$ \left[{\mathbf{W}}_{CD}^{LL}(H)\ {\mathbf{W}}_{CD}^{LH}(H)\ {\mathbf{W}}_{CD}^{HL}(H)\ {\mathbf{W}}_{CD}^{HH}(H)\right]=\mathrm{DWT}\left({\mathbf{D}}_{CD}(H)\right), $$
(12)

where \( {\mathbf{W}}_{CD}^{LL}(H) \), \( {\mathbf{W}}_{CD}^{LH}(H) \), \( {\mathbf{W}}_{CD}^{HL}(H) \), and \( {\mathbf{W}}_{CD}^{HH}(H) \) represent four subbands with a hue shift angle H in the wavelet domain. In the wavelet transform domain, the highest frequency components exist in the finest frequency subband in the diagonal direction. Therefore, the variances of the four difference images in the finest diagonal subband serve as a good measurement for detecting color modification. In this paper, we propose two variance ratio measurements using the variances in the diagonal subbands of the color difference images as follows:

$$ {V}_{GR}(H)=\frac{\sigma^2\left({\mathbf{W}}_{GR_4}^{HH}(H)\right)}{\sigma^2\left({\mathbf{W}}_{GR_1}^{HH}(H)\right)}, $$
(13)
$$ {V}_{GB}(H)=\frac{\sigma^2\left({\mathbf{W}}_{GB_1}^{HH}(H)\right)}{\sigma^2\left({\mathbf{W}}_{GB_4}^{HH}(H)\right)}, $$
(14)

where \( V_{GR}(H) \) and \( V_{GB}(H) \) are the variance ratios for a given hue angle H. The denominators in (13) and (14) are obtained from the differences between interpolated green samples and original red or blue samples, respectively. The numerators in (13) and (14) are calculated from the differences between two interpolated color samples. Therefore, we can expect the variance ratios to be small when the assumed sample locations match the true CFA alignment. If the image has been modified through hue shifting, the minimum value will occur at the position of the shifted hue angle.

Finally, we define the estimation measurement for detecting color modification, \( V_E(H) \), as follows:

$$ {V}_E(H)={V}_{GR}(H)+{V}_{GB}(H). $$
(15)
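Equations (8)-(15) can be sketched compactly. We hand-roll a one-level Haar HH subband rather than calling a wavelet library (the paper does not specify the mother wavelet, so Haar is our assumption), and the sub-image argument names are our own:

```python
import numpy as np

def haar_hh(x):
    """One-level Haar HH (diagonal detail) subband of a 2D array."""
    a = x[0::2, 0::2]; b = x[0::2, 1::2]
    c = x[1::2, 0::2]; d = x[1::2, 1::2]
    return (a - b - c + d) / 2.0

def v_e(G1, R1, G4, R4, B1, B4):
    """Eqs. (8)-(15): V_E = V_GR + V_GB from HH-subband variances of the
    four colour-difference images. Inputs are the (interpolated or original)
    sub-images at Bayer positions 1 and 4; naming is ours."""
    v_gr = np.var(haar_hh(G4 - R4)) / np.var(haar_hh(G1 - R1))   # Eq. (13)
    v_gb = np.var(haar_hh(G1 - B1)) / np.var(haar_hh(G4 - B4))   # Eq. (14)
    return v_gr + v_gb                                           # Eq. (15)
```

Each numerator difference involves only interpolated samples (low variance), each denominator includes an original sample (higher variance), so a small V_E flags the correct alignment.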

To estimate a changed hue angle, the hue of a suspicious image is rotated from 0° to 359° in increments of a pre-defined interval factor. For each H, \( V_E(H) \) is calculated. \( H_{\mathrm{min}} \) is the hue angle at which \( V_E(H) \) yields its minimum value, which is written as follows:

$$ {H}_{\mathrm{min}}=\underset{{0}^{\circ}\le H\le {359}^{\circ }}{\arg \min }{V}_E(H). $$
(16)

Figure 4 presents the graphs of \( V_E(H) \) for hue-shifted images; the shifted angle is estimated as \( H_F \) = 360° − \( H_{\mathrm{min}} \). The test conditions are the same as in Fig. 3. As shown in Fig. 4, the estimation errors obtained by the proposed method are smaller than those obtained by the AIVC algorithm. The average estimation error of the proposed method for the eight sample images is 0.25°, whereas that of the AIVC algorithm is 12.75°, as shown in Fig. 3. However, we notice that the minimum value occurs twice, at \( H_{\mathrm{min}} \) and \( H_{\mathrm{min}} \) + 180°, in our method. This is because our algorithm is based on the variance of color difference values, and the absolute values of the color differences have a 180° periodicity, since the blue and red values are shifted from the green value by 120° and 240°, respectively, as shown in Fig. 5. This ambiguity yields two candidate angles. Because the minimum value occurs twice, we need to trace only half of the total angle range to estimate the shifted hue angle. Therefore, we can cut the estimation time in half based on the periodicity of \( H_{\mathrm{min}} \). However, we need an additional decision step to determine which of the two angles is the real shifted hue angle.

Fig. 4 Estimated shift angles obtained by the proposed algorithm for various demosaicing methods when angles are shifted by 0° and 90°

Fig. 5 The 180° periodicity of the absolute values of color differences

We introduce the following refinement process to resolve the ambiguity of the two minimum values. The hue of a suspicious image is rotated from 0° to 179° in increments of the pre-defined interval factor, and the hue angle with the minimum \( V_E(H) \) value is obtained as follows:

$$ {H}_r=\underset{0^{\circ}\le H\le {179}^{\circ }}{\mathrm{argmin}}{V}_E(H), $$
(17)

where \( H_r \) is the estimated hue angle. We know that \( H_r \) + 180° is also a minimum angle. To select the actual shifted angle, we use the AIVC algorithm: the final angle \( H_F \) is whichever of \( H_r \) and \( H_r \) + 180° yields the smaller R(H). Thus, we only need to apply the AIVC algorithm to the two candidate angles obtained by the proposed method.
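The tie-break between the two candidate angles is a two-way comparison. A minimal sketch, where `r_of` is a caller-supplied function computing the AIVC ratio R(H) for a given angle (its implementation is outside this snippet):

```python
def resolve_candidate(r_of, h_r):
    """Pick whichever of H_r and H_r + 180 deg gives the smaller AIVC
    ratio R(H); `r_of` maps a hue angle to R(H)."""
    return min(h_r, (h_r + 180) % 360, key=r_of)
```

Only two evaluations of the (comparatively slow) AIVC measure are needed, instead of a full 360° scan.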

The proposed algorithm is summarized in Table 1. As in existing methods, we determine the size of a square image block used to estimate the level of color modification and set the hue angle to zero (H = 0°). We choose the square image block located at the center of the image and construct the four color difference images \( {\mathbf{D}}_{GR_1} \), \( {\mathbf{D}}_{GR_4} \), \( {\mathbf{D}}_{GB_1} \), and \( {\mathbf{D}}_{GB_4} \). For these four difference images, we perform a discrete wavelet transform and compute the variances of the corresponding HH subbands. We then calculate the two variance ratios using (13) and (14) and obtain the estimation measurement \( V_E(H) \) using (15). This process is repeated for each candidate hue angle. After obtaining \( H_r \) using (17), we estimate the modified hue angle \( H_F \) using the AIVC algorithm.
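The outer scan over candidate angles reduces to an argmin over [0°, 180°), since the V_E minimum repeats every 180°. A sketch, where `score` is a caller-supplied function that rotates the suspicious image by H and returns V_E (the hue rotation and the final AIVC tie-break described above are omitted):

```python
def estimate_hue_shift(score, step=5):
    """Scan candidate hue angles over [0, 180) in `step`-degree increments
    (Eq. 17) and map the argmin H_r to the shifted angle 360 - H_r.
    `score(H)` must return V_E for rotation angle H."""
    h_r = min(range(0, 180, step), key=score)
    return (360 - h_r) % 360
```

With a 5° step this costs 36 evaluations of V_E per image; the 180° ambiguity is then resolved by two AIVC evaluations as described above.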

Table 1 Summarization of the proposed method

4 Simulation results

4.1 Test image sets and simulation conditions

In this paper, we used 1460 raw images provided by the Dresden Image Database [28] to evaluate the performance of the proposed method. The camera types and detailed information regarding the images used in our experiments are contained in Table 2. CFA interpolations were performed using RawTherapee [25], which is a well-known cross-platform raw image processing program. Seven CFA interpolation algorithms were used in our experiments: bilinear interpolation, AHD interpolation [22], AMaZE interpolation [24], DCB demosaicing [29], linear minimum mean square error (LMMSE) demosaicing [30], VNG algorithm [23], and heterogeneity-projection hard-decision (HPHD) color interpolation [31].

Table 2 Camera models and image information for the experiments

To estimate the shifted hue angles, the center regions of the sample images were cropped into square blocks of various sizes, such as 512 × 512, 256 × 256, 128 × 128, 64 × 64, and 32 × 32. The hue of the sample images was shifted from 0° to 359° in 45° increments to generate a test image set. In total, 408,800 (1460 × 5 × 8 × 7) test images were generated in our simulations. The mean values of estimation errors, as well as standard deviations, were calculated to evaluate the estimation performance for altered hue angles.

4.2 Results of color modification detection

Table 3 shows the hue shift estimation performance of the proposed and conventional methods. The mean error values were drawn from the results of the seven demosaicing algorithms using a 256 × 256 centered block. As shown in Table 3, the mean estimation errors of the proposed algorithm are slightly larger than those of the AIVC method in the case of bilinear interpolation. However, our method is superior to the AIVC algorithm for the AHD, AMaZE, DCB, LMMSE, VNG, and HPHD interpolations. The averaged mean errors achieved by the proposed method for the shifted hue angles ranged from 8.71° (for 0°) to 8.83° (for both 45° and 135°), whereas those obtained by the AIVC method ranged from 14.87° (for 0°) to 19.56° (for 225°). These results indicate that the AIVC method, based on intermediate value counting, is only suitable for bilinear interpolation. In summary, our method achieves better estimation performance than the AIVC method across various demosaicing methods and altered hue angles. In particular, the estimation errors of our algorithm are nearly independent of the altered hue angle. The AIVC algorithm shows superior estimation performance only for unmodified images.

Table 3 Mean error value comparison between AIVC and the proposed method for various demosaicing methods and shifted hue angles in a 256 × 256 centered block

Table 4 contains the color forgery detection performance results for the proposed and conventional methods. For a shifted hue angle of 45°, the mean error values are calculated based on the various cropped center block sizes. As shown in Table 4, the averaged mean error values of the proposed method for various cropped center block sizes ranged from 7.98 (for a 512 × 512 block) to 18.59 (for a 32 × 32 block). The averaged mean error values obtained by the AIVC method ranged from 16.81 (for a 512 × 512 block) to 29.97 (for a 32 × 32 block). The size of the cropped center block can affect both estimation performance and computational cost. A larger block size can provide better estimation performance but suffers from high computation time.

Table 4 Mean error value comparison between AIVC and the proposed method for various demosaicing methods and centered blocks with a shifted angle of 45°

Mean error value comparisons between our method and the AIVC algorithm for various shifted hue angles are presented in Table 5. The error values were averaged according to the cropped center block sizes. As shown in Table 5, our method achieves superior estimation performance compared to the AIVC method, except for bilinear interpolation. Table 6 shows the mean error value comparisons between our method and the AIVC algorithm for various block sizes. The error values were averaged from all shifted angles. As shown in Table 6, our color change detection algorithm is superior to the conventional AIVC method, except for the three bilinear interpolation examples that have center block sizes larger than 128 × 128.

Table 5 Mean error value comparisons between our method and the AIVC algorithm for various hue shift angles
Table 6 Mean error value comparisons between our method and the AIVC algorithm for various block sizes

In general, images are captured and then processed through an image pipeline that includes white balancing, gamma correction, and noise reduction. We therefore also tested our method on white-balanced and gamma-corrected images. For white balancing, we used the "white patch" algorithm [32], and we used the lin2rgb function in MATLAB to obtain gamma-corrected images. Table 7 shows the mean error value comparisons between our method and the AIVC algorithm after white balancing and gamma correction for various block sizes. As shown in Table 7, white balancing and gamma correction have some influence on color forgery detection. Unlike in Table 6, the AIVC method shows better detection performance than the proposed algorithm for the bilinear, LMMSE, and VNG demosaicing methods. The AIVC algorithm is positively affected, while the proposed method is negatively affected. We attribute this to the fact that white balancing and gamma correction alter the variance, which degrades the performance of our variance-based algorithm. However, the proposed method still achieves better detection results for the other demosaicing algorithms.

Table 7 Mean error value comparisons between our method and the AIVC algorithm after white balancing and gamma correction for various block sizes

4.3 Forged color region detection

For this paper, we generated four forged color images using Adobe Photoshop CS4 (64 bit), as shown in Fig. 6. The castle in "sample 1" was altered with a 180° hue shift, and the poster on the door in "sample 2" was shifted by −90°. In "sample 3", two regions (the fish tank and the pictures on the wall) were altered by −45°. Finally, two countries on the globe in "sample 4" were shifted by −45° and 90°. The test sample images were sliced into 32 × 32 blocks, and the test was performed by shifting the hue angle in 5° increments. Figure 7 presents the forged color region detection results for the seven demosaicing methods. For all demosaicing methods, our proposed method outperforms the conventional AIVC algorithm. As shown in Fig. 7, it is difficult to estimate color shift angles or detect forged regions with the AMaZE, LMMSE, VNG, and HPHD demosaicing methods. This indicates that forged color detection still requires significant research.

Fig. 6 Sample images for detecting forged color regions

Fig. 7 Forged region detection results for various demosaicing algorithms

5 Discussion

The proposed algorithm achieves reasonable accuracy in estimating the level of color modification. Its results are superior to those of the conventional method for all demosaicing algorithms except bilinear interpolation. In addition, the proposed approach can be applied to localize forged color regions. However, the results of both previous studies and ours indicate that color modification detection still requires considerable research.

6 Conclusions

In this paper, we presented a new color modification detection algorithm using the variance ratio of color difference images in the wavelet domain. We generated a color difference model to emphasize the differences between original pixels and interpolated pixels. We also introduced a variance ratio measurement using the highest frequency subband in the wavelet domain to quantify the level of hue angle modification. Experimental results demonstrated that the proposed method generates accurate estimation results for detecting color modification. Compared to the conventional method, our method provides superior color modification detection performance. In addition, we verified that modified color regions could be efficiently detected using the proposed algorithm.

Abbreviations

AHD: Adaptive homogeneity-directed
AIVC: Advanced intermediate value counting
AMaZE: Aliasing minimization and zipper elimination
CFA: Color filter array
H: Hue
HPHD: Heterogeneity-projection hard-decision
I: Intensity
LMMSE: Linear minimum mean square error
S: Saturation
VNG: Variable number of gradients

References

  1. H Farid, A survey of image forgery detection. IEEE Signal Process. Mag. 2(26), 16–25 (2009)


  2. B Mahdian, S Saic, A bibliography on blind methods for identifying image forgery. Signal Process. Image Commun. 25(6), 389–399 (2010)


  3. B Xiuli, P Chi-Man, Fast reflective offset-guided searching method for copy-move forgery detection. Inf. Sci. 418-419, 531–545 (2017)


  4. D Cozzolino, G Poggi, L Verdoliva, Efficient dense-field copy–move forgery detection. IEEE Trans. Inf. Forensics Secur. 10, 2284–2297 (2015)


  5. R Davarzani, K Yaghmaie, S Mozaffari, M Tapak, Copy-move forgery detection using multiresolution local binary patterns. Forensic Sci. Int. 231, 61–72 (2013)


  6. X Zhao, S Wang, S Li, J Li, Passive image-splicing detection by a 2-D noncausal Markov model. IEEE Trans. Circuits Syst. Video Technol. 25(2), 185–199 (2015)


  7. Z He, W Lu, W Sun, J Huang, Digital image splicing detection based on Markov features in DCT and DWT domain. Pattern Recogn. 45(12), 4292–4299 (2012)


  8. JG Han, TH Park, YH Moon, IK Eom, Efficient Markov feature extraction method for image splicing detection using maximization and threshold expansion. J. Electron. Imaging. 25(2), 023031 (2016)


  9. AC Popescu, H Farid, Exposing digital forgeries by detecting traces of resampling. IEEE Trans. Signal Process. 53(2), 758–767 (2005)

    Article  MathSciNet  MATH  Google Scholar 

  10. B Mahdian, S Saic, Blind authentication using periodic properties of interpolation. IEEE Trans. Inf. Forensics Secur. 3(3), 529–538 (2008)

    Article  Google Scholar 

  11. W Wei, S Wang, Estimation of image rotation angle using interpolation-related spectral signatures with application to blind detection of image forgery. IEEE Trans. Inf. Forensics Secur. 5(3), 507–517 (2010)

    Article  Google Scholar 

  12. G Liu, J Wangb, S Lian, Y Dai, Detect image splicing with artificial blurred boundary. Math. Comput. Model. 57, 2647–2659 (2013)

    Article  MATH  Google Scholar 

  13. G Cao, Y Zhao, R Ni, Edge-based blur metric for tamper detection. J. Inf. Hiding Multimedia Signal Process. 1(1), 20–27 (2010)

    Google Scholar 

  14. G Cao, Y Zhao, R Ni, X Li, Contrast enhancement-based forensics in digital images. IEEE Trans. Inf. Forensics Secur. 9(3), 515–525 (2014)

    Article  Google Scholar 

  15. CH Choi, HY Lee, HK Lee, Estimation of color modification in digital images by CFA pattern changes. Forensic Sci. Int. 226, 94–105 (2013)

    Article  Google Scholar 

  16. CH Choi, JH Choi, HK Lee, CFA pattern identification of digital cameras using intermediate value counting, Proceedings of ACM Workshop in Multimedia and Security (2011), pp. 21–26

    Google Scholar 

  17. DR Cok, Signal processing method and apparatus for producing interpolated chrominance values in a sampled color image signal. US Patent 4642678, (1987).

    Google Scholar 

  18. WT Freeman, Median filter for reconstructing missing color samples. US Patent 4724395, (1988)

    Google Scholar 

  19. CA Laroche, MA Prescott, Apparatus and method for adaptively interpolating a full color image utilizing chrominance gradients. US Patent 5373322, (1994)

    Google Scholar 

  20. JF Hamilton, JE Adams, Adaptive color plan interpolation in single sensor color electronic camera. US Patent 5629734, (1997)

    Google Scholar 

  21. BE Bayer, Color imaging array. US Patent 3971065, (1976)

    Google Scholar 

  22. K Hirakawa, TW Parks, Adaptive homogeneity directed demosaicing algorithm. IEEE Trans. Image Process. 14(3), 360–369 (2005)

    Article  Google Scholar 

  23. E Chang, S Cheung, DY Pan, Color filter array recovery using a threshold-based variable number of gradients, Proceedings of SPIE, Sensors, Cameras, and Applications for Digital Photography (1999), pp. 36–43

    Google Scholar 

  24. E Martinec, P Lee, AMAZE demosaicing algorithm. http://www.rawtherapee.com/. Accessed 15 June 2017

  25. G Horvath, http://www.rawtherapee.com/. Accessed 22 Apr 2017.

  26. K HS, SS Kim, IK Eom, Wavelet-domain demosaicking using linear estimation of interchannel correlation. Opt. Eng. 47(6), 067002 (2008)

    Article  Google Scholar 

  27. HJ Shin, JJ Jeon, IK Eom, Color filter array pattern identification using variance of color difference image. J. Electron. Imaging. 26(4), 0243303015 (2017)

    Article  Google Scholar 

  28. T Gloe, R Bohme, The ‘Dresden Image Database’ for benchmarking digital image forensics, Proceedings of the 25th Symposium on Applied Computing (2010), pp. 1585–1591

    Google Scholar 

  29. J Gozd, DCB demosaicing algorithm. http://www.linuxphoto.org/html/dcb.html. Accessed 15 June 2017

  30. L Zhang, X Wu, Color demosaicking via directional linear minimum mean square-error estimation. IEEE Trans. Image Process. 14(12), 2167–2178 (2005)

    Article  Google Scholar 

  31. CY Tsai, KT Song, Heterogeneity-projection hard-decision color interpolation using spectral-spatial correlation. IEEE Trans. Image Process. 16(11), 78–91 (2007)

    Article  MathSciNet  Google Scholar 

  32. EH Land, JJ McCann, Lightness and retinex theory. J. Opt. Soc. Am. 61(1), 1–11 (1971)

    Article  Google Scholar 

Funding

This research was supported by Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education (NRF-2015R1D1A3A01019561), and BK21PLUS, Creative Human Resource Development Program for IT Convergence.

Availability of data and materials

Please contact the author for data requests.

Author information

Contributions

JJJ proposed the framework of this work, carried out the whole experiments, and drafted the manuscript. IKE initiated the main algorithm of this work, supervised the whole work, and wrote the final manuscript. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Il Kyu Eom.

Ethics declarations

Competing interests

Both authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

About this article

Cite this article

Jeon, J.J., Eom, I.K. Wavelet-based color modification detection based on variance ratio. J Image Video Proc. 2018, 47 (2018). https://doi.org/10.1186/s13640-018-0286-6
