# Analysis of colour constancy algorithms using the knowledge of variation of correlated colour temperature of daylight with solar elevation

Sivalogeswaran Ratnasingam^{1,3,4}, Steve Collins^{1} and Javier Hernández-Andrés^{2}

*EURASIP Journal on Image and Video Processing* **2013**:14

https://doi.org/10.1186/1687-5281-2013-14

© Ratnasingam et al.; licensee Springer. 2013

**Received: **5 August 2012

**Accepted: **16 February 2013

**Published: **18 March 2013

## Abstract

In this article, we present an investigation into improving the colour constant reflectance features that can be obtained from daylight illuminated scenes using pixel-level colour constancy algorithms (the model-based algorithm: S Ratnasingam, S Collins, J. Opt. Soc. Am. A **27**, 286–294 (2010), and the projection-based algorithm: GD Finlayson, MS Drew, IEEE ICCV, 2001, pp. 473–480). Based on this investigation, we describe a method to improve the performance of these colour constancy algorithms using the correlation of the correlated colour temperature of measured daylight with the solar elevation and phase of the day (morning, midday and evening). Data from 1 year are used to create a solar elevation- and phase-of-day-dependent method of interpreting the information obtained from the colour constancy algorithms. Test results show that, using the proposed method with a 40-dB signal-to-noise ratio, the performance of the projection-based and model-based algorithms can be improved on average by 33.7 and 45.4%, respectively. More importantly, a larger improvement (85.9 and 113.7%) was obtained during the middle period of each day, defined as when the solar elevation is larger than 20°.

### Keywords

Colour constancy, Illuminant invariant chromaticity

## 1. Introduction

Colour is a useful feature in several machine vision applications including object recognition and image indexing. However, the apparent colour of an object varies depending on the viewing environment, particularly in scenes illuminated by daylight. Colour-based recognition in naturally illuminated scenes is therefore a difficult problem and existing techniques are not effective in real scenes [1–6]. The problems with obtaining colour information from naturally illuminated outdoor scenes arise from the uncontrolled changes to both the intensity and the spectral power distribution of daylight. The resulting variations in the recorded colour of the same object under different daylight conditions make it difficult to use colour as a reliable source of information in machine vision applications [3–6].

In contrast to camera-based systems, the human visual system’s perception of the colour of an object is largely independent of the illuminant. Several approaches to replicating the colour constancy achieved by the human visual system have been proposed [6]. However, many of these algorithms assume that the scene is uniformly illuminated, and shadows mean that this assumption is not always valid for outdoor scenes. The assumption of uniform illumination is avoided by those colour constancy algorithms that only use pixel-level information. Marchant and Onyango [7], Finlayson and Hordley [8] and Romero et al. [9] have all proposed methods for obtaining a single illuminant invariant feature from pixel data under daylight, based on the assumptions that the spectral responses of the sensors are infinitely narrow and that the power spectral density of the illuminant can be approximated by that of a blackbody illuminant. However, a single feature potentially leads to confusion between some perceptually different colours [6]. This confusion can be avoided using the method proposed by Finlayson and Drew [10], which obtains two illuminant independent features from the responses of four sensors with different spectral responses. Based on the same assumptions of infinitely narrow band sensors and the blackbody model of the illuminant, Ratnasingam and Collins [11] proposed a simple method of obtaining two illuminant independent features from the responses of four sensors. Since neither of the assumptions upon which all these methods are based is strictly applicable, Ratnasingam and Collins [11] also proposed a method for determining the quality of the spectral information that can be obtained from the features produced by these methods. One important conclusion from their work was that pixel-level colour constancy algorithms extract useful information from the responses of sensors whose spectral response covers a wavelength range of 80 nm or less [11].
Subsequently, Ratnasingam et al. [12] investigated the possibility of optimising the spectral responses of the image sensors to improve the quality of the illuminant invariant features that can be obtained from daylight illuminated scenes using pixel-level colour constancy algorithms. The conclusion of this study was that the widely available International Commission on Illumination (CIE) standard illuminants were not representative enough of actual daylight measurements obtained under widely varying weather conditions to be used to reliably optimise the spectral responses [12].

Studying the pattern of variation of daylight spectra would be very useful because daylight dominates during the day, and some important machine vision applications, including remote sensing, controlling unmanned vehicles and simultaneous localisation and mapping, are based upon processing outdoor scenes. The variation in the spectral power distribution of daylight with solar elevation, time of year and geographic location leads to difficulties in reliably recognising objects or scenes. In this article, we investigate the advantages of using solar elevation and phase of the day as auxiliary information to enhance the performance of a colour constancy algorithm. This auxiliary information can be calculated if the time of day, the date and the geographic location of the camera are known. It is difficult to model such a relation for artificial illuminants, but knowledge of whether the scene is indoor or outdoor could be used to account for the illuminant in those scenarios. Based on this evidence, an approach for improving the performance of colour constancy algorithms applied to daylight illuminated scenes is proposed, based upon the observed variation of the correlated colour temperature (CCT) of measured daylight with solar elevation and date. It is worth noting that in this investigation we only use the approximate geographic location (e.g. country or region) and the phase of the day (i.e. morning, midday and evening) as auxiliary information to the colour constancy algorithms. The performance improvement obtained using the proposed approach is illustrated using two existing colour constancy algorithms, that proposed by Finlayson and Drew [10] and that proposed by Ratnasingam and Collins [11].
Here, we should note that the algorithms chosen to investigate the performance enhancement obtained using solar elevation and date do not rely on the content of the rest of the scene when extracting illuminant invariant features. This choice was made because an outdoor scene might consist of regions illuminated by direct sunlight, by skylight or lying in shadow, and the power spectrum of the incident light in these regions will not be the same [13]. In outdoor scenes, where the intensity and the relative power spectrum vary spatially, the performance of any algorithm that processes the entire scene to obtain an estimate of the illuminant degrades significantly, making such algorithms inappropriate for machine vision applications. For this reason, we have chosen two algorithms that extract illuminant invariant features at the pixel level rather than processing the whole scene.

The remainder of this article is organised as follows. In Section 2, brief descriptions of the algorithms for obtaining useful features related to the reflectance of a surface, proposed by Finlayson and Drew [10] and by Ratnasingam and Collins [11], are presented. In Section 3, the proposed method for improving the degree of illuminant invariance of the features determined using the two different methods is described. Simulation results assessing the effectiveness of the proposed method are presented in Section 4. Finally, Section 5 contains the conclusions of this study.

## 2. Descriptions of the algorithms and assessment method

When an illuminant *E*^{x}(*λ*) is reflected from an object’s surface at point *x* with reflectance spectrum *S*^{x}(*λ*), the image sensor response *R*^{x,E} can be given by

$$R^{x,E}={\underline{i}}^{x}\cdot {\underline{j}}^{x}\; I^{x}\int E^{x}\left(\lambda\right)S^{x}\left(\lambda\right)F\left(\lambda\right)\, d\lambda \qquad (1)$$

where *F*(*λ*) is the spectral sensitivity of the image sensor and *I*^{x} is the intensity of the incident light at scene point *x*. The unit vectors ${\underline{i}}^{x}$ and ${\underline{j}}^{x}$ denote the directions of the illuminant and the surface normal, respectively, and their dot product ${\underline{i}}^{x}\cdot {\underline{j}}^{x}$ represents the scene geometry. Equation (1) can be simplified by assuming that the image sensor samples the scene at a single wavelength, i.e. that the sensitivity of the image sensor can be approximated by a Dirac delta function centred at *λ*_{i}. Applying the sifting property of the Dirac delta function simplifies Equation (1) as follows:

$$R^{x,E}=G^{x}\, I^{x}\, E^{x}\left(\lambda_{i}\right)S^{x}\left(\lambda_{i}\right) \qquad (2)$$

where *G*^{x} (= *i*^{x} · *j*^{x}) is the geometry factor. On the right-hand side of Equation (2), the first term (*G*^{x}*I*^{x}) depends on the geometry of the scene and the illuminant intensity, the second term (*E*^{x}(*λ*_{i})) depends only on the relative power spectrum of the illuminant, and the third term (*S*^{x}(*λ*_{i})) depends only on the reflectance of the object’s surface. To obtain scene independent perceptual descriptors, the first two components must therefore be removed. The first term can be removed by taking the difference between the logarithms of two image sensor responses that have different spectral sensitivities. Once the scene geometry and intensity components have been removed, the only remaining scene dependent term is the power spectrum of the illuminant. This can be removed by assuming a parametric model for the illuminant power spectrum. On this basis, Finlayson and Drew [10] proposed a colour constancy algorithm based on the assumption that the spectral power distribution of the illuminant can be modelled by Wien’s radiation law. To remove the changes in the power spectral density of the illuminant from the log-difference features, Finlayson and Drew used eigenvector decomposition and projected the three log-difference features in the direction of the largest eigenvector. This resulted in a two-dimensional space in which a reflectance can be located approximately independently of the illuminant. To find the relevant eigenvectors, Finlayson and Drew used 18 non-grey patches of the Macbeth Colour Checker chart and blackbody illuminants with colour temperatures between 5,500 and 10,500 K. As the blackbody model is not the most appropriate model for real-world illuminants, we adapted Finlayson and Drew’s algorithm by using the CIE standard daylights [14], which represent daylight at different phases of the day, and the standard Munsell reflectance database [15] to obtain a more realistic estimate of the direction of illuminant-induced variation. In the remainder of the article, we refer to this adapted version of the algorithm as the projection algorithm.
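To make the blackbody approximation concrete, the following minimal sketch (our illustration, not code from [10] or [11]) evaluates Equation (2) for an idealised delta-function sensor under Wien's approximation to a blackbody spectrum; all numerical values in it are arbitrary examples:

```python
import math

C2 = 1.4388e7  # second radiation constant, c2, in nm*K

def wien_spectrum(lam_nm, T):
    """Relative spectral power of a blackbody at temperature T (K) under
    Wien's approximation: E(lambda) ~ lambda^-5 * exp(-c2 / (lambda * T))."""
    return lam_nm ** -5 * math.exp(-C2 / (lam_nm * T))

def delta_sensor_response(G, I, lam_nm, T, S):
    """Equation (2): response R = G * I * E(lambda_i) * S(lambda_i) of an
    idealised delta-function sensor with peak wavelength lam_nm (nm) to a
    surface of reflectance S under a Wien illuminant of temperature T (K)."""
    return G * I * wien_spectrum(lam_nm, T) * S
```

Because the geometry factor *G* and intensity *I* scale every channel equally, they cancel in the log-differences used by the algorithms discussed below.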
Once the more realistic direction of illuminant-induced variation had been estimated using eigenvector decomposition, the logarithm-difference features were projected to obtain two features that are invariant to the illuminant and to the scene geometry. The two illuminant invariant features (*F*_{1}^{Proj} and *F*_{2}^{Proj}) of the projection-based algorithm were obtained as follows:

$$F_{1}^{\mathrm{Proj}}=\left\langle \mathrm{Log}_{\mathrm{Diff}},\ \mathbf{Eig}_{1}\right\rangle, \qquad F_{2}^{\mathrm{Proj}}=\left\langle \mathrm{Log}_{\mathrm{Diff}},\ \mathbf{Eig}_{2}\right\rangle$$

where Log_{Diff} = [log (*R*_{2}) − log (*R*_{1}), log (*R*_{3}) − log (*R*_{1}), log (*R*_{4}) − log (*R*_{1})] and 〈 〉 denotes the scalar product. The quantities *R*_{1}, *R*_{2}, *R*_{3} and *R*_{4} are the sensor responses numbered from the shortest wavelength end of the spectrum, and **Eig**_{1} and **Eig**_{2} are the two smallest eigenvectors obtained from the eigenvector decomposition applied to the normalised sensor responses.
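As a concrete sketch (our illustration, not the authors' published code), the projection features can be computed from four sensor responses and two given eigenvectors; the defaults below are the eigenvector values listed in Table 2. Any common scaling of the responses, i.e. the geometry and intensity term, cancels in the log-differences:

```python
import numpy as np

# Eigenvectors from Table 2 (projection-based algorithm)
EIG1 = np.array([0.0977, 0.366, -0.925])
EIG2 = np.array([0.647, -0.729, -0.220])

def projection_features(R, eig1=EIG1, eig2=EIG2):
    """Two illuminant invariant features F1_Proj and F2_Proj from four
    sensor responses R, numbered from the shortest wavelength end."""
    logR = np.log(np.asarray(R, dtype=float))
    log_diff = np.array([logR[1] - logR[0],
                         logR[2] - logR[0],
                         logR[3] - logR[0]])
    # Scalar products with the two smallest eigenvectors
    return float(log_diff @ eig1), float(log_diff @ eig2)
```

For example, doubling all four responses (brighter illumination or a more favourable surface orientation) leaves both features unchanged.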

Given the four sensor responses *R*_{1}, *R*_{2}, *R*_{3} and *R*_{4}, the model-based algorithm extracts an illuminant invariant reflectance feature by estimating the illuminant effect on the sensor response *R*_{2} using the sensor responses *R*_{1} and *R*_{3}. Similarly, a second illuminant invariant reflectance feature is extracted by estimating the illuminant effect on the sensor response *R*_{3} using the sensor responses *R*_{2} and *R*_{4}. The two illuminant invariant reflectance features (*F*_{1}^{Model} and *F*_{2}^{Model}) are obtained as follows:

$$F_{1}^{\mathrm{Model}}=\log\left(R_{2}\right)-\left[\alpha \log\left(R_{1}\right)+\left(1-\alpha\right)\log\left(R_{3}\right)\right]$$

$$F_{2}^{\mathrm{Model}}=\log\left(R_{3}\right)-\left[\beta \log\left(R_{2}\right)+\left(1-\beta\right)\log\left(R_{4}\right)\right]$$

where *α* and *β* are the two channel coefficients. If *λ*_{1}, *λ*_{2}, *λ*_{3} and *λ*_{4} are the peak wavelengths of the four image sensors, the two features are ideally independent of the illuminant if the two channel coefficients are chosen so that the following two equations are satisfied [11]:

$$\frac{1}{\lambda_{2}}=\frac{\alpha}{\lambda_{1}}+\frac{1-\alpha}{\lambda_{3}}, \qquad \frac{1}{\lambda_{3}}=\frac{\beta}{\lambda_{2}}+\frac{1-\beta}{\lambda_{4}}$$
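As a check on these relations, the channel coefficients can be solved for directly from the peak wavelengths. The short sketch below is our illustration (not the authors' published code); with the Table 1 wavelengths it reproduces the tabulated values α = 0.4268 and β = 0.4362:

```python
import numpy as np

def channel_coefficient(l_lo, l_mid, l_hi):
    """Solve 1/l_mid = a/l_lo + (1 - a)/l_hi for the coefficient a."""
    return (1.0 / l_mid - 1.0 / l_hi) / (1.0 / l_lo - 1.0 / l_hi)

# Peak wavelengths from Table 1 (nm)
LAM = (437.5, 512.5, 587.5, 662.5)
ALPHA = channel_coefficient(LAM[0], LAM[1], LAM[2])  # ~0.4268
BETA = channel_coefficient(LAM[1], LAM[2], LAM[3])   # ~0.4362

def model_features(R, alpha=ALPHA, beta=BETA):
    """F1_Model and F2_Model from four sensor responses R."""
    lr = np.log(np.asarray(R, dtype=float))
    f1 = lr[1] - (alpha * lr[0] + (1.0 - alpha) * lr[2])
    f2 = lr[2] - (beta * lr[1] + (1.0 - beta) * lr[3])
    return float(f1), float(f2)
```

Because the coefficient weights in each feature sum to zero (1 − α − (1 − α) = 0), a common scaling of the four responses adds the same constant to every logarithm and cancels, removing the geometry and intensity term.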

**Table 1. The wavelengths corresponding to the peak sensor responses and full width at half maximum (FWHM) of the Gaussian spectral responses of the sensors used to investigate both the model-based and projection-based algorithms**

| Sensor ID | 1 | 2 | 3 | 4 |
|---|---|---|---|---|
| Peak position (nm) | 437.5 | 512.5 | 587.5 | 662.5 |
| Spectral width, FWHM (nm) | 80.0 | 80.0 | 80.0 | 80.0 |
| Channel coefficients | α = 0.4268, β = 0.4362 | | | |

**Table 2. The wavelengths corresponding to the peak sensor responses and FWHM of the Gaussian spectral responses of the sensors used to investigate the projection-based algorithm**

| Sensor ID | 1 | 2 | 3 | 4 |
|---|---|---|---|---|
| Peak position (nm) | 437.5 | 512.5 | 587.5 | 662.5 |
| Spectral width, FWHM (nm) | 80.0 | 80.0 | 80.0 | 80.0 |
| Eigenvector 1 | (0.0977, 0.366, −0.925) | | | |
| Eigenvector 2 | (0.647, −0.729, −0.220) | | | |

The results in Figure 1 show that, except for a region of metamers (metamers in the *F*_{1}^{Model} and *F*_{2}^{Model} space), different colours are projected to different parts of this feature space. Previously, the results obtained from the same Munsell reflectances when illuminated by CIE standard daylights with different CCTs showed that a residual illuminant dependency causes small variations in the projected position of each reflectance sample in the feature space [11].

As the model-based and projection-based algorithms extract features that are independent of the lightness component of a colour, the performance of these two algorithms has been investigated using three sets of 100 pairs of reflectances chosen from normalised Munsell reflectance spectra [12]. As described by Ratnasingam et al. [12], each of these three test sets has 100 pairs of reflectances with pairwise distances of 0.975 to 1.025 CIELab units, 2.99 to 3.01 CIELab units or 5.995 to 6.005 CIELab units, respectively. These three sets were chosen based on the colourimetric description of the CIELab space, in which colours separated by less than 1 unit are described as not perceptibly different, colours separated by 1–3 units as a very good match, and colours separated by 3–6 units as a good match for an average person [16]. These colourimetric descriptions have been widely used in colourimetric experiments [16–18]. Even though differences of 1 CIELab unit are described as not perceptible to an average human observer [18], we have included this data in the test set to investigate whether the proposed algorithm is better than the human visual system at recognising perceptually similar colours. In fact, our results show that the algorithm can differentiate more than 25% of the pairs of ‘colours’ that are described as not perceptibly different to an average human.
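For reference, the colour difference used to build these sets is the CIE76 formula, i.e. the Euclidean distance between two CIELab points. A minimal sketch of the pair selection (our illustration, not the authors' code) is:

```python
import numpy as np

def delta_e_ab(lab1, lab2):
    """CIE76 colour difference: Euclidean distance between CIELab points."""
    return float(np.linalg.norm(np.asarray(lab1, float) - np.asarray(lab2, float)))

def pairs_in_range(labs, lo, hi):
    """Index pairs of reflectances whose CIELab separation lies in [lo, hi],
    e.g. lo = 2.99, hi = 3.01 for the 'very good match' test set."""
    n = len(labs)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if lo <= delta_e_ab(labs[i], labs[j]) <= hi]
```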

## 3. Proposed approach for improving colour constancy

The Sun is the dominant natural light source during the day. The power spectral density of sunlight above the Earth’s atmosphere can be approximated by the spectrum of a blackbody. However, this is only an approximation: the spectral peak wavelength of sunlight, 475 nm, is the same as the peak wavelength of a blackbody with a colour temperature of 6,101 K, whereas the CCT calculated from the entire spectrum is 5,750 K [19]. To reach the Earth’s surface this light must travel through the atmosphere, where it is both scattered and absorbed. The path length of the light from the edge of the atmosphere to the Earth’s surface depends upon the angle of elevation between the horizon and the Sun’s position in the sky. As a result, the CCT of the daylight reaching the surface of the Earth is expected to depend upon the solar elevation.

The path length of the light through the atmosphere is only one of the factors that determine the amount of scattering and absorption that will occur at a particular moment. Moreover, the amount of absorption and scattering per unit length along this path will depend upon atmospheric conditions. These atmospheric conditions will vary during each day and between days. However, seasonal weather patterns will mean that conditions on days within the same season will be more similar than between days in different seasons. Effects such as typical weather patterns and the interaction between the sunlight and the atmosphere mean that although the CCT of the daylight will primarily depend upon the solar elevation, it may also depend upon the time of year.
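The solar elevation itself requires no extra instrumentation: given the date, local solar time and latitude, it follows from standard solar-geometry formulae (see, e.g., [21]). The sketch below uses Cooper's declination equation, assumes solar time rather than clock time and ignores atmospheric refraction; it is our illustration, not part of the published method:

```python
import math

def solar_declination_deg(day_of_year):
    """Cooper's equation for the solar declination (degrees)."""
    return 23.45 * math.sin(math.radians(360.0 / 365.0 * (284 + day_of_year)))

def solar_elevation_deg(latitude_deg, day_of_year, solar_hour):
    """Solar elevation (degrees) at a given latitude, day of year and
    local solar time in hours (12.0 = solar noon)."""
    dec = math.radians(solar_declination_deg(day_of_year))
    lat = math.radians(latitude_deg)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))  # 15 deg per hour
    sin_el = (math.sin(lat) * math.sin(dec)
              + math.cos(lat) * math.cos(dec) * math.cos(hour_angle))
    return math.degrees(math.asin(sin_el))
```

For instance, at solar noon on the March equinox the computed elevation at the equator is close to 90°, while around solar midnight it is far below the horizon.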

## 4. Results and discussion

### 4.1. Fitting the Mahalanobis distance boundary

The Mahalanobis distance between a cluster centre *C* and a point *P* is defined as follows:

$$D_{m}=\sqrt{\left(P-C\right)^{T}\,\Sigma^{-1}\left(P-C\right)}$$

The boundaries (where *D*_{m1} is the Mahalanobis distance from cluster centre 1 and *D*_{m2} is the Mahalanobis distance from cluster centre 2) that go through the midpoint (*P*) of the line connecting the two mean points (*C*_{1} and *C*_{2}) of the pairs of clusters were then calculated:

$$D_{m1}=\sqrt{\left(P-C_{1}\right)^{T}\,\Sigma_{1}^{-1}\left(P-C_{1}\right)},\qquad D_{m2}=\sqrt{\left(P-C_{2}\right)^{T}\,\Sigma_{2}^{-1}\left(P-C_{2}\right)}$$

where *Σ*_{1}^{−1} and *Σ*_{2}^{−1} are the inverse covariance matrices of clusters 1 and 2, respectively. The boundaries obtained using *D*_{m1} and *D*_{m2} as the distances from the relevant cluster centres lead to overlap of some of the cluster boundaries (shown in Figure 5a). To find the boundaries that just touch each other (referred to as the touching boundaries), the points on a boundary at a Mahalanobis distance slightly smaller than *D*_{m1} and *D*_{m2} from the respective cluster centres were calculated for both pairs of reflectances. Previously, Ratnasingam and Collins [11] determined the touching boundaries by gradually increasing the Mahalanobis distances between the boundaries and the centres until the boundaries touched (shown in Figure 5b). This process maximises the number of responses that are correctly classified; however, it is time consuming. To make the process of finding the Mahalanobis distance boundary simpler and to avoid any overlap, boundaries were instead drawn at a Mahalanobis distance of 90% of *D*_{m1} and *D*_{m2} (shown in Figure 5c). Typical results such as those shown in Figure 5 demonstrate that the boundaries drawn with 90% of *D*_{m1} and *D*_{m2} do not overlap, but will slightly underestimate the ability to correctly identify the source of a feature point generated using either the model-based or projection-based algorithm. Slightly underestimating the identification performance is preferable to overestimating the performance of an algorithm.
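A minimal sketch of this 90% boundary test (our illustration of the procedure, not the authors' code) is:

```python
import numpy as np

def mahalanobis(p, c, cov_inv):
    """Mahalanobis distance between a point p and a cluster centre c,
    given the inverse covariance matrix of the cluster."""
    d = np.asarray(p, float) - np.asarray(c, float)
    return float(np.sqrt(d @ cov_inv @ d))

def inside_boundary(point, centre, cov_inv, midpoint, shrink=0.9):
    """True if the point falls inside the boundary drawn at `shrink`
    (here 90%) of the Mahalanobis distance from the cluster centre to
    the midpoint of the line connecting the two cluster centres."""
    radius = shrink * mahalanobis(midpoint, centre, cov_inv)
    return mahalanobis(point, centre, cov_inv) < radius
```

With identity covariances, a centre at the origin and a midpoint at unit distance, the boundary radius is 0.9, so points closer than that are accepted and points beyond it are rejected.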

### 4.2. Selecting the boundary parameters based on the phase of a day

The possible benefits of employing the phase of the day in conjunction with date information when identifying the possible source of a set of features obtained using a pixel-level colour constancy algorithm have been assessed using measured daylight spectra [20]. In particular, the daylight spectra measured in 1996 (daylight spectra set 1) were used to determine the 90% *D*_{m1} and *D*_{m2} boundaries for 100 pairs of reflectances. The daylight spectra measured in 1997 (daylight spectra set 2) were then used to generate the simulated sensor responses from which features were obtained. The percentage of the feature points that fell within the correct boundary was then determined for each of the 100 pairs of reflectances in the test dataset.

Based on the above investigation, a solar elevation of 20° has been used to separate each day into three parts. In addition, days have been grouped into four quarters: quarter 1 consists of months 12, 1 and 2; quarter 2 of months 3, 4 and 5; quarter 3 of months 6, 7 and 8; and quarter 4 of months 9, 10 and 11. In the rest of the article, these four quarters are used to determine the boundary settings. A Mahalanobis distance boundary for each pair of test reflectances was then obtained for each of the resulting 12 sets of illuminants (four quarters × three phases of day). Subsequently, the phase of the day and the date on which a measured daylight spectrum was obtained in the second year were used to select the appropriate set of classification boundaries from the 12 available sets. The impact of this method of adapting the Mahalanobis distance boundaries was investigated with features obtained using both the model-based and projection-based algorithms (parameters listed in Tables 1 and 2).
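The boundary-set lookup implied by this grouping can be sketched as follows (our illustration; the phase names and the solar-noon test used to split morning from evening are assumptions consistent with the description above):

```python
def quarter_of_year(month):
    """Quarter 1: Dec-Feb, 2: Mar-May, 3: Jun-Aug, 4: Sep-Nov."""
    return {12: 1, 1: 1, 2: 1, 3: 2, 4: 2, 5: 2,
            6: 3, 7: 3, 8: 3, 9: 4, 10: 4, 11: 4}[month]

def phase_of_day(elevation_deg, before_solar_noon):
    """'midday' when the solar elevation exceeds 20 degrees; otherwise
    'morning' or 'evening' depending on the time relative to solar noon."""
    if elevation_deg > 20.0:
        return "midday"
    return "morning" if before_solar_noon else "evening"

def boundary_set(month, elevation_deg, before_solar_noon):
    """Select one of the 12 (quarter, phase) boundary-parameter sets."""
    return quarter_of_year(month), phase_of_day(elevation_deg, before_solar_noon)
```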

### 4.3. Results for 30-dB SNR

**Performance of the model-based and projection-based algorithms when applying the sensor responses generated by 80-nm FWHM evenly spread sensors**

| Quarter | Phase of day | Solar elevation (°) | Projection, without | Projection, with | Improvement (%) | Model, without | Model, with | Improvement (%) |
|---|---|---|---|---|---|---|---|---|
| 1 | Morning | −10 to 20 | 13.9 | 12.4 | −10.7 | 13.9 | 11.8 | −15.1 |
| 1 | Midday | 20 to 90 | 17.4 | 20.0 | +14.9 | 17.1 | 20.1 | +17.5 |
| 1 | Evening | −10 to 20 | 12.0 | 11.9 | −0.8 | 12.2 | 11.9 | −2.4 |
| 2 | Morning | −10 to 20 | 15.3 | 15.5 | +1.3 | 15.5 | 15.8 | +1.9 |
| 2 | Midday | 20 to 90 | 16.2 | 20.1 | +24.1 | 16.0 | 20.4 | +27.5 |
| 2 | Evening | −10 to 20 | 14.2 | 14.0 | −1.4 | 13.9 | 14.0 | +0.7 |
| 3 | Morning | −10 to 20 | 13.8 | 14.1 | +2.2 | 13.5 | 14.0 | +3.7 |
| 3 | Midday | 20 to 90 | 15.8 | 20.1 | +27.2 | 15.7 | 20.4 | +29.9 |
| 3 | Evening | −10 to 20 | 14.3 | 15.0 | +4.9 | 14.0 | 15.1 | +7.8 |
| 4 | Morning | −10 to 20 | 14.6 | 14.9 | +2.1 | 14.4 | 14.7 | +2.1 |
| 4 | Midday | 20 to 90 | 15.0 | 19.4 | +29.3 | 14.8 | 19.5 | +31.8 |
| 4 | Evening | −10 to 20 | 13.4 | 15.1 | +12.7 | 13.3 | 15.3 | +15.0 |

‘Without’ and ‘with’ denote performance without and with the solar elevation and phase-of-day information.

**Test results of the model-based and projection-based algorithms when applying the sensor responses generated by 80-nm FWHM evenly spread sensors**

| Quarter | Phase of day | Solar elevation (°) | Projection, without | Projection, with | Improvement (%) | Model, without | Model, with | Improvement (%) |
|---|---|---|---|---|---|---|---|---|
| 1 | Morning | −10 to 20 | 35.2 | 23.6 | −32.9 | 34.6 | 19.6 | −43.3 |
| 1 | Midday | 20 to 90 | 41.4 | 57.2 | +38.2 | 37.8 | 57.6 | +52.4 |
| 1 | Evening | −10 to 20 | 27.5 | 25.4 | −7.6 | 26.7 | 24.5 | −8.2 |
| 2 | Morning | −10 to 20 | 39.1 | 38.8 | −0.8 | 38.5 | 39.5 | +2.6 |
| 2 | Midday | 20 to 90 | 36.7 | 57.7 | +57.2 | 34.1 | 58.5 | +71.5 |
| 2 | Evening | −10 to 20 | 28.6 | 35.8 | +25.2 | 26.3 | 35.5 | +34.9 |
| 3 | Morning | −10 to 20 | 29.0 | 34.2 | +17.9 | 27.7 | 33.0 | +19.1 |
| 3 | Midday | 20 to 90 | 35.6 | 56.6 | +59.0 | 32.5 | 57.7 | +77.5 |
| 3 | Evening | −10 to 20 | 32.0 | 44.6 | +39.4 | 28.8 | 43.6 | +51.4 |
| 4 | Morning | −10 to 20 | 32.5 | 33.0 | +1.5 | 31.1 | 32.2 | +3.5 |
| 4 | Midday | 20 to 90 | 30.2 | 51.2 | +69.5 | 27.4 | 51.4 | +87.6 |
| 4 | Evening | −10 to 20 | 27.5 | 33.8 | +22.9 | 24.8 | 32.6 | +31.4 |

‘Without’ and ‘with’ denote performance without and with the solar elevation and phase-of-day information.

**Performance of the original Finlayson and Drew [10] algorithm when applying the sensor responses generated by 80-nm FWHM evenly spread sensors**

| Quarter | Phase of day | Solar elevation (°) | Without | With | Improvement (%) |
|---|---|---|---|---|---|
| 1 | Morning | −10 to 20 | 2.9 | 11.9 | 310 |
| 1 | Midday | 20 to 90 | 3.4 | 19.0 | 458 |
| 1 | Evening | −10 to 20 | 2.5 | 11.2 | 348 |
| 2 | Morning | −10 to 20 | 3.0 | 14.5 | 383 |
| 2 | Midday | 20 to 90 | 3.2 | 18.9 | 490 |
| 2 | Evening | −10 to 20 | 3.0 | 13.2 | 340 |
| 3 | Morning | −10 to 20 | 3.0 | 13.2 | 340 |
| 3 | Midday | 20 to 90 | 3.0 | 19.1 | 536 |
| 3 | Evening | −10 to 20 | 2.9 | 14.1 | 386 |
| 4 | Morning | −10 to 20 | 3.1 | 14.0 | 351 |
| 4 | Midday | 20 to 90 | 2.9 | 18.4 | 534 |
| 4 | Evening | −10 to 20 | 2.7 | 14.4 | 433 |

‘Without’ and ‘with’ denote performance without and with the solar elevation and phase-of-day information.

### 4.4. Results for 40-dB SNR

| Quarter | Phase of day | Solar elevation (°) | Projection, without | Projection, with | Improvement (%) | Model, without | Model, with | Improvement (%) |
|---|---|---|---|---|---|---|---|---|
| 1 | Morning | −10 to 20 | 13.5 | 10.8 | −20.0 | 12.5 | 8.2 | −34.4 |
| 1 | Midday | 20 to 90 | 11.6 | 26.4 | +128 | 9.3 | 25.8 | +177 |
| 1 | Evening | −10 to 20 | 9.0 | 8.2 | −8.9 | 8.2 | 7.3 | −10.9 |
| 2 | Morning | −10 to 20 | 13.8 | 13.3 | −3.6 | 12.7 | 12.9 | +1.6 |
| 2 | Midday | 20 to 90 | 9.0 | 26.5 | +194 | 7.5 | 26.6 | +255 |
| 2 | Evening | −10 to 20 | 7.6 | 13.4 | +76.3 | 6.1 | 13.0 | +113 |
| 3 | Morning | −10 to 20 | 8.1 | 8.3 | +2.5 | 7.2 | 7.5 | +4.2 |
| 3 | Midday | 20 to 90 | 8.9 | 26.1 | +193 | 7.2 | 26.1 | +263 |
| 3 | Evening | −10 to 20 | 9.8 | 13.3 | +35.7 | 7.6 | 11.5 | +51.3 |
| 4 | Morning | −10 to 20 | 10.4 | 10.8 | +3.8 | 9.1 | 9.8 | +7.7 |
| 4 | Midday | 20 to 90 | 7.2 | 23.4 | +225 | 5.8 | 23.2 | +300 |
| 4 | Evening | −10 to 20 | 7.1 | 9.0 | +26.8 | 5.6 | 7.9 | +41.0 |

‘Without’ and ‘with’ denote performance without and with the solar elevation and phase-of-day information.

| Quarter | Phase of day | Solar elevation (°) | Projection, without | Projection, with | Improvement (%) | Model, without | Model, with | Improvement (%) |
|---|---|---|---|---|---|---|---|---|
| 1 | Morning | −10 to 20 | 57.6 | 48.6 | −15.6 | 56.8 | 44.3 | −22.0 |
| 1 | Midday | 20 to 90 | 75.8 | 86.0 | +13.4 | 74.3 | 86.9 | +16.9 |
| 1 | Evening | −10 to 20 | 50.8 | 51.0 | +0.4 | 49.2 | 49.8 | +1.2 |
| 2 | Morning | −10 to 20 | 66.0 | 67.6 | +2.4 | 65.2 | 67.8 | +3.9 |
| 2 | Midday | 20 to 90 | 74.0 | 87.1 | +17.7 | 73.0 | 88.6 | +21.3 |
| 2 | Evening | −10 to 20 | 63.9 | 61.7 | −3.4 | 61.9 | 60.2 | −2.7 |
| 3 | Morning | −10 to 20 | 64.7 | 68.1 | +5.2 | 63.0 | 67.0 | +6.3 |
| 3 | Midday | 20 to 90 | 73.5 | 86.9 | +18.2 | 72.3 | 88.6 | +22.5 |
| 3 | Evening | −10 to 20 | 64.1 | 74.3 | +15.9 | 62.0 | 73.4 | +18.3 |
| 4 | Morning | −10 to 20 | 62.1 | 62.6 | +0.8 | 60.2 | 61.0 | +1.3 |
| 4 | Midday | 20 to 90 | 69.7 | 83.3 | +19.5 | 68.3 | 83.9 | +22.8 |
| 4 | Evening | −10 to 20 | 61.2 | 62.1 | +1.5 | 59.1 | 60.6 | +2.5 |

‘Without’ and ‘with’ denote performance without and with the solar elevation and phase-of-day information.

## 5. Conclusions

If the spectrum of the scene illuminant can be approximated by the spectrum of a blackbody, then pixel-based colour constancy algorithms can be used to extract features that are almost independent of the illuminant. The blackbody approximation used to derive these algorithms is most appropriate when the illuminant is daylight. Results obtained using measured daylight spectra have been presented which show that the residual illuminant dependence of the features is caused by a combination of noise, the finite spectral width of the sensors and the non-blackbody spectrum of daylight. The latter effect means that the features obtained from a reflective surface vary with the spectrum of the daylight, usually characterised by its CCT. An investigation of the CCTs of daylight spectra measured in Granada, Spain, over 2 years shows that the CCT of daylight varies with the solar elevation and, to a lesser extent, with the date. We investigated the number of periods in a year and the solar elevation threshold that result in the best performance. Based upon these observations, the measured daylight spectra for 1 year have been used to create 12 sets of illuminants. These illuminants have then been used to create boundaries in the feature space that can be used to distinguish between pairs of surfaces whose colours are very good matches to each other. Using data from a second year, distinct from the training set, it has been shown that for good quality images the phase-of-day and date information improves the ability to successfully distinguish these surfaces on average by 33.7 and 45.4% for the projection-based and model-based algorithms, respectively. In particular, a larger improvement (on average 85.9 and 113.7%) was obtained during the middle period of each day, defined as when the solar elevation is larger than 20°. The proposed method would therefore seem to be particularly useful for scenes illuminated by daylight in the middle of the day.
One possible application that suits these restrictions is precision farming, in which machine vision techniques can be used to target the treatment of small areas of a field or individual plants [22, 23]. In the future, the possibility of optimising the sensor characteristic and using a third illuminant invariant feature will be investigated.

## Declarations

### Acknowledgement

This study was done while Sivalogeswaran Ratnasingam was at the University of Oxford, Oxford, UK. Javier Hernández-Andrés's work was supported by the Ministry of Economy and Competitiveness, Spain, under research grant DPI2011-23202.


## References

1. Buluswar SD, Draper BA: Color machine vision for autonomous vehicles. *Eng. Appl. Artif. Intell.* 1998, 11:245–256. doi:10.1016/S0952-1976(97)00079-1
2. Hordley S: Scene illuminant estimation: past, present, and future. *Color Res. Appl.* 2006, 31(4):303–314. doi:10.1002/col.20226
3. Foster DH: Color constancy. *Vis. Res.* 2011, 51(7):674–700. doi:10.1016/j.visres.2010.09.006
4. Funt B, Barnard K, Martin L: Is colour constancy good enough? In *Proceedings of the 5th European Conference on Computer Vision (ECCV '98), Volume I*. Springer-Verlag, London, UK; 1998:445–459. ISBN 3-540-64569-1
5. Romero J, Hernández-Andrés J, Nieves JL, García JA: Color coordinates of objects with daylight changes. *Color Res. Appl.* 2003, 28:25–35. doi:10.1002/col.10111
6. Ebner M: *Color Constancy*. Wiley-IS&T Series in Imaging Science and Technology, New York; 2007.
7. Marchant JA, Onyango CM: Shadow-invariant classification for scenes illuminated by daylight. *J. Opt. Soc. Am. A* 2000, 17:1952–1961. doi:10.1364/JOSAA.17.001952
8. Finlayson GD, Hordley SD: Color constancy at a pixel. *J. Opt. Soc. Am. A* 2001, 18:253–264. doi:10.1364/JOSAA.18.000253
9. Romero J, Hernández-Andrés J, Nieves JL, Valero EM: Spectral sensitivity of sensors for a color-image descriptor invariant to changes in daylight conditions. *Color Res. Appl.* 2006, 31(5):391–398. doi:10.1002/col.20243
10. Finlayson GD, Drew MS: 4-sensor camera calibration for image representation invariant to shading, shadows, lighting, and specularities. In *ICCV 2001: International Conference on Computer Vision, Volume 2*. IEEE, Los Alamitos; 2001:473–480.
11. Ratnasingam S, Collins S: Study of the photodetector characteristics of a camera for color constancy in natural scenes. *J. Opt. Soc. Am. A* 2010, 27:286–294.
12. Ratnasingam S, Collins S, Hernández-Andrés J: Optimum sensors for color constancy in scenes illuminated by daylight. *J. Opt. Soc. Am. A* 2010, 27:2198–2207. doi:10.1364/JOSAA.27.002198
13. Forsyth D, Ponce J: *Computer Vision: A Modern Approach*. Prentice-Hall, Upper Saddle River, NJ; 2003.
14. The Munsell Color Science Laboratory. http://www.cis.rit.edu/mcsl/online/cie.php
15. Database “Munsell Colours Matt”. ftp://ftp.cs.joensuu.fi/pub/color/spectra/mspec/
16. Abrardo A, Cappellini V, Cappellini M, Mecocci A: Art-works colour calibration using the VASARI scanner. In *Fourth Color Imaging Conference: Color Science, Systems, and Applications*. Scottsdale, Arizona; 1996:94–96. ISBN 0-89208-196-1
17. Hardeberg JY: *Acquisition and Reproduction of Color Images: Colorimetric and Multispectral Approaches*. Ph.D. dissertation, Ecole Nationale Supérieure des Télécommunications; 1999.
18. Mahy M, van Eycken L, Oosterlinck A: Evaluation of uniform color spaces developed after the adoption of CIELAB and CIELUV. *Color Res. Appl.* 1994, 19(2):105–121.
19. Lee HC: *Introduction to Color Imaging Science*. Cambridge University Press, Cambridge; 2005:46–47, 138–141 and 450–459.
20. Hernández-Andrés J, Romero J, Nieves JL, Lee RL Jr: Color and spectral analysis of daylight in southern Europe. *J. Opt. Soc. Am. A* 2001, 18:1325–1335. doi:10.1364/JOSAA.18.001325
21. Kalogirou S: *Solar Energy Engineering: Processes and Systems*. Academic Press, New York; 2009.
22. Herwitz SR, Johnson LF, Dunagan SE, Higgins RG, Sullivan DV, Zheng J, Lobitz BM, Brass JA: Imaging from an unmanned aerial vehicle: agricultural surveillance and decision support. *Comput. Electron. Agric.* 2004, 44(1):49–61. doi:10.1016/j.compag.2004.02.006
23. Moran MS, Inoue Y, Barnes EM: Opportunities and limitations for image-based remote sensing in precision crop management. *Remote Sens. Environ.* 1997, 61(3):319–346. doi:10.1016/S0034-4257(97)00045-X

## Copyright

This article is published under license to BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.