
No-reference image quality assessment based on quality patches in real time

Abstract

Image quality assessment methods quantify image quality in a way that correlates highly with human perception. In this paper, image quality is represented by image patches that are selected using the spatial correlation of pixel pairs. The quality patches are selected from the LIVE2 database, and quality feature detectors are learned by running the FastICA algorithm on those patches. The independent component coefficients of each quality patch then serve as the quality features. A Hash lookup table is built by binarizing the independent component coefficients of the quality patches, and the table is indexed by the binary code of the coefficients of a tested image. The results of the proposed approach were compared with those of other image quality assessment methods and with subjective quality scores. The experiments show that the proposed no-reference metric accurately measures the degree of image distortion: the Pearson linear correlation coefficient (PCC) reaches 0.949 for prediction accuracy, the Spearman order correlation (SRC) reaches 0.996 for prediction monotonicity, and the root mean square error (RMSE) is 5.917 for subjective consistency. When additionally applied to aerial images, the proposed method found that foggy weather leads to the worst image quality.

1 Introduction

With the rapid development of digital vision technology, higher image quality is required when transmitting and acquiring images [1, 2]. It is therefore necessary to embed an image quality monitoring module in display terminals. Such a module can adjust the quality of displayed images and videos in real time to satisfy the demands of human viewers.

No-reference image quality assessment was proposed to monitor the quality of image processing systems and to overcome the absence of reference images [3]. The technique has been studied extensively [4,5,6,7,8]. Quality features are the key factors extracted to represent the properties of image quality in no-reference assessment [9]. Models built on such features can yield evaluation results consistent with human visual responses; models based on structure [10] and on natural scene statistics (NSS) [11,12,13] have achieved good assessment results. Beyond modelling the visual nerve responses directly, other effective methods have been proposed to represent the characteristics of visual perception. Wang et al. [14] proposed a structure-based method that evaluates quality using structural information as quality features, exploiting the structural sensitivity of the human visual system. He et al. [15] proposed a sparsity-based method with satisfactory assessment results, considering the sparsity of the human visual system in processing information. In particular, NSS has provided a foundation for natural image quality assessment [16,17,18,19], and NSS-based methods have generally achieved results highly consistent with subjective perception. Meanwhile, machine learning has been introduced to increase the accuracy of no-reference assessment, and several learning-based methods have achieved results consistent with human perception.
Although image quality assessment has achieved results consistent with subjective perception, studies on evaluating image quality in real time are still lacking. It is imperative that no-reference methods monitor image quality instantaneously, while remaining effective and keeping their results consistent with human subjective perception.

In this paper, a model of pixel-pair statistics [20] is applied to extract quality patches that represent the image quality, which speeds up the algorithm and improves the real-time capability of the evaluation. The spatial correlation of pixel pairs is used to locate the quality patches, and FastICA is run on the selected patches to obtain their independent components. The assessment result is then obtained efficiently and accurately by matching the Hamming distance between two binary codes: one computed from a tested quality patch, the other the address code of a Hash lookup table. The image quality is expressed as the weighted DMOS (differential mean opinion score) value stored in the indexed unit of the Hash table, where the stored DMOS values come from the published LIVE2 database. The main operating procedure of the proposed method is shown in Fig. 1.

Fig. 1

The main operating procedure of the real-time no-reference image quality assessment based on quality patches

The rest of the paper is organized as follows. Section 2 elucidates the method of generating image quality features. Section 3 describes the proposed PIH-IQA algorithm based on image quality patches. Section 4 presents the experiments and discussions in charts and tables. Finally, Section 5 concludes the paper.

2 Generating image quality features

The operating scheme is shown in Fig. 1; this section describes how image quality patches are selected. Two neighboring pixels sampled from a natural image are strongly correlated, and the range over which the dots of a pixel-pair scatter plot spread represents how dispersed the pixel values of the image are. The difference in value is relatively large between pixel pairs lying near the edges of the scatter plot, and the more edges an image contains, the wider the dots of its scatter plot are distributed. Since the human visual system is more sensitive to structures and edges, a wider dot distribution indicates more of the structures and edges to which perception responds. Accordingly, the distributed width of the dots in the scatter plot is used to locate the image quality patches, which also speeds up the algorithm.

2.1 Extracting image quality patches

Given a pair of neighboring pixels at positions (m, n) and (m, n + 1), the spatial dependency of pixel pairs is defined as the frequency of each grey-scale pair [G1, G2], where G1 is the grey-scale value of the pixel at (m, n) and G2 is the grey-scale value of the pixel at (m, n + 1). A scatter plot representing this spatial dependency is shown in Fig. 2. The distributed width of the dots is obtained by calculating the maximum vertical distance from the non-zero dots to the diagonal line of the plot. According to the preceding study [20], the patches with the largest standard deviation have the largest distribution width and the largest grey-scale deviation. Hence, the more edges and structures an image contains, the larger the computed distributed width.
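As a minimal sketch (not the authors' code), the distributed width can be computed directly: for horizontally adjacent pixel pairs, the vertical distance from a dot (G1, G2) to the diagonal G2 = G1 of the scatter plot is |G2 − G1|, so the width is the largest such difference that actually occurs in the image. The function name and the toy images are illustrative.

```python
import numpy as np

def distributed_width(img):
    """Maximum vertical distance from the occupied (G1, G2) dots of the
    pixel-pair scatter plot to its diagonal G2 = G1, i.e. the largest
    absolute difference between horizontally adjacent pixels."""
    img = np.asarray(img, dtype=np.int32)
    g1 = img[:, :-1]   # pixel at (m, n)
    g2 = img[:, 1:]    # neighbour at (m, n + 1)
    return int(np.abs(g2 - g1).max())

# A flat patch has width 0; a patch with a sharp edge has a large width.
flat = np.full((8, 8), 128)
edge = flat.copy()
edge[:, 4:] = 10
print(distributed_width(flat), distributed_width(edge))  # 0 118
```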

Fig. 2

The original image and its scatter plot of the spatial dependency of pixel-pairs

To speed up the assessment, the smooth parts of an image can be neglected: they carry little information, and their distortions have little effect on subjective perception. Consistent with the human visual system, the salient regions of an image should be extracted and considered first when evaluating quality. Since the goal of this paper is a real-time method, image quality is evaluated from several image patches that represent it; the selected patches are called "image quality patches" in this paper. The patches are selected by calculating the distributed width of the dots in the scatter plot that models the spatial dependency of pixel pairs. The quality patches selected from Lena are shown in Fig. 3.

Fig. 3

The selected image quality patches in Lena

Image quality patches are selected by the following steps:

(1) Generate a scatter plot of the image being evaluated.

(2) Calculate the distributed width Iw of the dots in the scatter plot of the input image.

(3) Split the input image into patches of size a × b (32 × 32 is used in this paper) and calculate the distributed width Ipw of each image patch.

(4) Calculate the ratio of Ipw to Iw and record it as RWID. Here, RWID is defined as the ratio of the dot distributed width of the scatter plot of an image patch to that of the entire image.

(5) Output image quality patches according to the following rules:

1) If \( RWID=\frac{I_{\mathrm{pw}}}{I_{\mathrm{w}}}=1 \), output the patch as a quality patch of the first class and record the value RWID1(i), where i is the order number.

2) If \( 0.95< RWID=\frac{I_{pw}}{I_w}<1 \), output the patch as the second class and record the value RWID2(i).

3) If \( 0.9< RWID=\frac{I_{pw}}{I_w}<0.95 \), output the patch as the third class and record the value RWID3(i).

4) If \( 0.85< RWID=\frac{I_{pw}}{I_w}<0.9 \), output the patch as the fourth class and record the value RWID4(i).

5) If \( 0.8< RWID=\frac{I_{pw}}{I_w}<0.85 \), output the patch as the fifth class and record the value RWID5(i).

In this paper, an analytical solution for the classification thresholds is nearly impossible to obtain due to the nonlinearity of RWID. Instead, we construct a validation set and apply a brute-force search to obtain suboptimal values for classifying the image quality patches.
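The five classification rules above can be sketched as a small helper (hypothetical name `rwid_class`; the strict inequalities follow the rules as printed, so boundary values such as RWID = 0.95 fall through to rejection):

```python
def rwid_class(ipw, iw):
    """Map the ratio RWID = Ipw / Iw of a patch's dot distributed width
    to the image's to one of the five classes (0 = rejected)."""
    rwid = ipw / iw
    if rwid == 1.0:
        return 1
    if 0.95 < rwid < 1.0:
        return 2
    if 0.90 < rwid < 0.95:
        return 3
    if 0.85 < rwid < 0.90:
        return 4
    if 0.80 < rwid < 0.85:
        return 5
    return 0  # too smooth; the patch is not used as a quality patch

print(rwid_class(92, 100))  # 3
```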

2.2 Generating image quality features

Independent component analysis (ICA) is applied to generate the features of image quality patches in this paper. ICA is a non-orthogonal linear transformation method whose goal is to make the transformed variables independent.

A pixel grey-scale value in an image quality patch is denoted by I(x,y), and the patch is generated as a linear superposition of features Ai in ICA.

$$ I\left(x,y\right)=\sum \limits_{i=1}^m{A}_i\left(x,y\right){S}_i $$
(1)

where (x,y) is the coordinate of a pixel in an image quality patch, i is the index of the features Ai, the Ai are the same for all patches, and the Si are coefficients that differ from patch to patch.

Three assumptions must be satisfied in the ICA calculation: (1) the Si are statistically independent, (2) the distributions of the Si are non-Gaussian, and (3) the linear system defined by the Ai is invertible. Under these assumptions, an inverse matrix Wi = Ai−1 exists, and Si can be calculated using the following equation:

$$ {S}_i=\sum \limits_{x,y}{W}_i\left(x,y\right)I\left(x,y\right) $$
(2)

Two preprocessing steps, centering and whitening, must be carried out to produce uncorrelated components for ICA.

Centering means subtracting the mean value of an image quality patch, yielding a zero-mean matrix. The whitening matrix is obtained by eigen-decomposition of the covariance matrix of the image quality patches; after whitening, the components are uncorrelated and normalized.
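A sketch of the two preprocessing steps, assuming patches are vectorized into the columns of a matrix and centering is done per component across patches (the standard ICA convention):

```python
import numpy as np

def center_and_whiten(X):
    """Center each component (zero mean across patches) and whiten via
    eigen-decomposition of the covariance matrix, so the output
    components are uncorrelated with unit variance.
    X: (n_features, n_patches) matrix, one patch per column."""
    X = X - X.mean(axis=1, keepdims=True)        # centering
    cov = X @ X.T / X.shape[1]
    d, E = np.linalg.eigh(cov)                   # eigen-decomposition
    d = np.clip(d, 1e-12, None)                  # guard tiny eigenvalues
    W_white = E @ np.diag(d ** -0.5) @ E.T       # whitening matrix
    return W_white @ X, W_white

rng = np.random.default_rng(0)
Z, W = center_and_whiten(rng.normal(size=(4, 1000)))
# After whitening the empirical covariance is the identity.
print(np.allclose(Z @ Z.T / Z.shape[1], np.eye(4), atol=1e-8))  # True
```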

There are many algorithms for computing ICA. The FastICA algorithm was proposed and developed by the Finnish scholar Hyvärinen [21]. It is based on the principle of non-Gaussianity maximization, takes maximized negentropy as the objective function, and separates out one independent component at a time.

The basic form of the FastICA algorithm is as follows:

(1) Center and whiten the image quality patches I(x,y) to obtain the data X.

(2) Choose an initial (e.g., random) weight vector Wi.

(3) Let \( {W}_i=E\left\{ Xg\left({W}_i^TX\right)\right\}-E\left\{{g}^{\prime}\left({W}_i^TX\right)\right\}{W}_i \), where \( g(x)=\tanh \left({\alpha}_1x\right) \) and \( {g}^{\prime }(x)={\alpha}_1\left[1-{\tanh}^2\left({\alpha}_1x\right)\right] \).

(4) Normalize Wi by \( {W}_i={W}_i/\left\Vert {W}_i\right\Vert \).

(5) If Wi has not converged, return to step (3).
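Steps (1)–(5) can be sketched as a one-unit FastICA iteration with the tanh contrast; `fastica_unit` is a hypothetical name, and extracting several components would additionally require deflation (orthogonalization against previously found vectors), which is omitted here:

```python
import numpy as np

def fastica_unit(X, alpha=1.0, tol=1e-6, max_iter=200, seed=0):
    """One-unit FastICA with the tanh contrast, following steps (2)-(5).
    X must already be centered and whitened, shape (n_features, n_samples)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[0])
    w /= np.linalg.norm(w)                              # step (2): random unit vector
    for _ in range(max_iter):
        wx = w @ X
        g = np.tanh(alpha * wx)                         # g(x) = tanh(a1 x)
        g_prime = alpha * (1 - np.tanh(alpha * wx) ** 2)  # g'(x)
        w_new = (X * g).mean(axis=1) - g_prime.mean() * w  # step (3)
        w_new /= np.linalg.norm(w_new)                  # step (4): normalize
        if abs(abs(w_new @ w) - 1) < tol:               # step (5): convergence
            return w_new
        w = w_new
    return w
```

In practice the returned unit vector is one feature detector Wi; repeating the procedure with deflation yields the full set of detectors.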

In this paper, 600 images randomly selected from the LIVE2 [22], CSIQ [23], and TID2013 [24] databases are used as training samples. Image quality patches were extracted from the sample images by the steps above, and each patch was arranged as a column of the sample matrix. A group of statistically independent Wi, called feature detectors, was then obtained by the FastICA algorithm; the obtained Wi are shown in Fig. 4.

Fig. 4

The 36 Wi obtained by the FastICA algorithm

3 Method—no-reference image quality assessment

A Hash indexing method [25, 26] is applied to build the matching relation between the independent component coefficients Si and the address codes of the Hash table. DMOS and RWID values are stored in the Hash table so that the evaluation results can be obtained quickly, and the DMOS values keep the assessment results consistent with subjective perception.

3.1 Building the Hash lookup table

The independent components Si are obtained by multiplying the detectors Wi with an image patch I(x,y). The absolute values of the 36 coefficients of Si can be represented as values between 0 and 255. The binarization of Si is illustrated in Fig. 5.

Fig. 5

The binarization process of the independent component coefficients Si

First, calculate the absolute differences between successive Si (i = 1,…,36) by Eq. (3). The absolute difference SDi is defined as:

$$ {SD}_i=\left\{\begin{array}{cc}\left|{S}_{i+1}-{S}_i\right|& if\kern1em i<36\\ {}\left|{S}_i-{S}_{36}\right|& otherwise\end{array}\right. $$
(3)

Then, binarize SDi by Eq. (4), where the threshold M is trained and taken as 17 in Fig. 5. M is designated as the minimum among the mean values of each column, and it can be adjusted according to the binarization results.

$$ {b}_i=\left\{\begin{array}{cc}1,& if\left({SD}_i\le M\right)\\ {}0,& otherwise\end{array}\right. $$
(4)
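Eqs. (3) and (4) can be sketched as follows. The wrap-around for the last coefficient is read here as |S1 − S36| (implemented with a circular shift); the printed otherwise-branch of Eq. (3) is ambiguous on this point:

```python
import numpy as np

def binarize(S, M):
    """Eqs. (3)-(4): absolute differences of successive coefficients
    (wrapping around at the end), thresholded at M to give one bit each."""
    S = np.asarray(S, dtype=float)
    SD = np.abs(np.roll(S, -1) - S)   # SD_i = |S_{i+1} - S_i|, circular
    return (SD <= M).astype(int)      # b_i = 1 iff SD_i <= M

b = binarize([10, 30, 12, 10], M=5)   # toy 4-coefficient example
print(b)  # [0 0 1 1]
```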

A Hash lookup table is applied to index similar independent components quickly, and a Hash function points to the position in the table. Here, a 36-bit binary code is transformed by the Hash function. Because the Hash lookup table is built from the 600 sampled images, a large number of image quality patches are collected, and every quality patch is assigned a unit in the table. Considering the inevitable collisions, the Hash function must generate a binary address of at least 12 bits.

Let \( {N}_i={b}_{3i+1}\times {2}^0+{b}_{3i+2}\times {2}^1+{b}_{3i+3}\times {2}^2 \) (i = 0, ⋯, 11) and N12 = N0, then

$$ {h}_i=\left\{\begin{array}{cc}1,& if\;{N}_i\le {N}_{i+1}\\ {}0,& \mathrm{otherwise}\end{array}.\right. $$
(5)

The Hash function is defined as:

$$ H\left({h}_0,\cdots, {h}_{11}\right)={h}_0\times {2}^0+\cdots +{h}_{11}\times {2}^{11}. $$
(6)
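Eqs. (5) and (6) fold the 36-bit code into a 12-bit address. A sketch (hypothetical function name; the 1-based bit indices of Eq. (5) are converted to 0-based indexing):

```python
def hash_address(b):
    """Eqs. (5)-(6): pack the 36-bit code b into twelve 3-bit numbers N_i,
    compare neighbours to get h_i, and combine h into a 12-bit address."""
    assert len(b) == 36
    # N_i = b_{3i+1}*2^0 + b_{3i+2}*2^1 + b_{3i+3}*2^2 (1-based indices)
    N = [b[3 * i] + 2 * b[3 * i + 1] + 4 * b[3 * i + 2] for i in range(12)]
    N.append(N[0])                                  # N_12 = N_0
    h = [1 if N[i] <= N[i + 1] else 0 for i in range(12)]
    return sum(h[i] << i for i in range(12))        # H = sum_i h_i * 2^i

print(hash_address([0] * 36))  # 4095  (all N_i equal, so every h_i = 1)
```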

The binary code of the independent components, together with its DMOS and RWID values, is placed in the unit pointed to by the address from the Hash function.

3.2 Indexing the quality features

The independent component coefficients Sip of the tested image are obtained by computing the independent components of its quality patches. The binarized values of Sip are then transformed by the Hash function, and the DMOS and RWID values can be indexed from the unit at the obtained address.

The coefficients Sip of each image quality patch are converted to Hash address codes according to Eqs. (3)–(6). The units pointed to by the address codes in the Hash lookup table are then found, and the Hamming distances between the independent components stored in those units and Sip are calculated. If a distance is smaller than a threshold Th, the number of similar patches and the corresponding DMOS and RWID values are recorded.
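The matching step can be sketched as follows, assuming the Hash table is a mapping from a 12-bit address to a list of (stored code, DMOS, RWID) records; the threshold `th` and the table layout are illustrative assumptions, not the authors' data structures:

```python
def hamming(code_a, code_b):
    """Number of differing bits between two equal-length binary codes."""
    return sum(x != y for x, y in zip(code_a, code_b))

def lookup(table, address, code, th):
    """Return the (DMOS, RWID) pairs stored in the unit at `address`
    whose 36-bit code lies within Hamming distance `th` of `code`."""
    return [(dmos, rwid) for stored, dmos, rwid in table.get(address, [])
            if hamming(stored, code) < th]

# Toy table: one unit holding two records.
table = {5: [([0] * 36, 20.0, 0.9), ([1] * 36, 60.0, 0.7)]}
print(lookup(table, 5, [0] * 35 + [1], th=3))  # [(20.0, 0.9)]
```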

3.3 Evaluating image quality

The final no-reference image quality assessment score PIH-IQA (simply "the method" in this paper) is calculated by Eq. (7), where DMOSi and RWIDi are obtained by indexing the Hash lookup table and RWIDn(i) are the ratios of the distributed widths computed from the scatter plots of the spatial dependency of pixel pairs.

$$ PIH\hbox{-} IQA=\frac{1}{N}\sum \limits_{i=1}^N{DMOS}_i\times \frac{RWID_n(i)}{RWID_i} $$
(7)
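Eq. (7) reduces to a weighted average over the N indexed matches; a minimal sketch:

```python
def pih_iqa(matches):
    """Eq. (7): mean of the indexed DMOS values, each weighted by the
    ratio of the tested patch's RWID_n(i) to the stored RWID_i.
    matches: list of (DMOS_i, RWID_n_i, RWID_i) triples."""
    if not matches:
        return None  # no similar patch was indexed
    return sum(d * rn / r for d, rn, r in matches) / len(matches)

print(pih_iqa([(30.0, 0.9, 0.9), (50.0, 0.8, 1.0)]))  # (30*1.0 + 50*0.8)/2 = 35.0
```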

4 Experimental results and discussions

Three groups of experiments are used to analyze the performance of PIH-IQA in evaluating image quality without references: a comparison with subjective perception, a comparison with other methods, and a comparison across different distortions.

The consistency between the values evaluated by the PIH-IQA method and the DMOS values in the databases is shown by a scatter plot in the subjective consistency experiments. Three metrics are used to compare the assessment methods and distortion types: the Pearson linear correlation coefficient for prediction accuracy, the Spearman order correlation for the monotonicity of the results, and the root mean square error for subjective consistency.

4.1 Comparing with subjective perception

The main criterion for whether an image quality assessment method is effective is its subjective consistency, in other words, whether the evaluated scores are consistent with human subjective perception. The scatter of the subjective scores against the evaluated results on the LIVE2 database is shown in Fig. 6, where the horizontal axis is the score calculated by the PIH-IQA method and the vertical axis is the subjective DMOS value. Apart from a few outliers, the dots cluster around the diagonal line, showing that the PIH-IQA results are well consistent with subjective perception.

Fig. 6

The scatter plot of the subjective consistency of the PIH-IQA method on the LIVE2 database

4.2 Comparing with other evaluated methods

The results of the proposed PIH-IQA method are compared with widely used image quality assessment methods: full-reference methods such as PSNR, IFC [27], MS-SSIM [14], SSIM [10], and VIF [28], and no-reference methods such as CBIQ [29], SRNSS [15], QAC [30], NSS-GS [13], BLINDS [31], BLINDS-II, GMLOG [32], CORNIA [33], and PATCH-IQ [9]. The comparison is made with the three metrics PCC, SRC, and RMSE, and the results are listed in Table 1, where the top three scores per metric are highlighted. On the LIVE2 database, PIH-IQA ranks among the top three methods on all three metrics, giving better no-reference assessment results.

Table 1 Performance of the proposed PIH-IQA and the other competing models

Table 1 shows that the proposed PIH-IQA has the largest number of scores highlighted in italics. With top-three scores throughout, its performance is superior to that of the other IQA models in terms of SRC and RMSE; the largest number of italicized values supports the conclusion that the proposed PIH-IQA is superior.

4.3 Comparing different distortions

An image quality assessment method should be suitable for evaluating images with different distortions. Images with the five distortion types in the LIVE2 database are evaluated, and the assessed results are compared for each type using the three metrics PCC, SRC, and RMSE; the results are listed in Table 2. They show that the PIH-IQA method can be used to assess images with different distortion types.

Table 2 Performance comparison of the IQA models on each individual distortion type in the database LIVE 2

The proposed PIH-IQA is the third best in terms of the number of scores highlighted in italics. As in Table 1, the results in Table 2 show a similar tendency in terms of SRC and RMSE, although the proposed method still has room to improve its PCC. The conclusion that the proposed PIH-IQA performs well can again be drawn.

Table 3 shows the running time of the 11 IQA models on an image of size 512 × 512. All algorithms were run on a ThinkPad X220 notebook with an Intel Core i5-2450M CPU at 2.5 GHz and 4 GB RAM; the software platform was MATLAB R2012b (8.0.0.783). The MATLAB source code of the other IQA methods was obtained from the original authors. (It should be noted that whether the code is optimized may affect the running time of an algorithm.) PSNR is clearly the fastest, followed by SSIM and MS-SSIM, but among the no-reference metrics, especially the trained IQA models, the proposed PIH-IQA runs much faster than NSRCIQ.

Table 3 Running time of the IQA models

4.4 Evaluating aerial images in different weathers

The quality of aerial images is mainly affected by weather, and bad weather can cause subsequent processing to fail. Images were collected from the Internet: 25 in sunny weather, 11 in cloudy weather, 17 in foggy weather, 13 in rainy weather, and 18 in snowy weather, each labeled with its weather before the experiments. These images were evaluated with the proposed method, and the results are shown in Fig. 7, where the horizontal axis denotes the weather (1 sunny, 2 cloudy, 3 foggy, 4 rainy, 5 snowy). The PIH-IQA scores scatter over different ranges for different weathers; the scores for images in foggy, rainy, and snowy weather almost all exceed 10, apart from a few points. Although foggy, rainy, and snowy weather all degrade aerial image quality, foggy weather has the most serious impact.

Fig. 7

The scatter plot of the PIH-IQA scores of aerial images in different weathers

5 Conclusion

The proposed real-time no-reference image quality assessment method performs well in the comparative experiments. The first strength is the selection of image quality patches based on the spatial correlation of pixel pairs: although the spatial correlation is not a good quality index by itself, it effectively extracts the information-rich patches that serve as quality patches and provide the key information for the subsequent evaluation. The second strength is the 36 features used to extract the independent components of the quality patches: although they are few, they are sufficient for evaluating image quality according to the assessment results. The last point, which speeds up the evaluation, is the Hash lookup table, which greatly increases the speed of the proposed method. In summary, both good features for representing image quality and a fast evaluation method are important and indispensable.

Abbreviations

DMOS:

Differential mean opinion score

ICA:

Independent component analysis

NSS:

Natural scene statistics

PCC:

Pearson linear correlation coefficient

RMSE:

Root mean square error

SRC:

Spearman order correlation

References

1. R.A. Manap, L. Shao, Non-distortion-specific no-reference image quality assessment: a survey. Information Sciences 301, 141–160 (2015)

2. C. Yan, H. Xie, S. Liu, et al., Effective Uyghur language text detection in complex background images for traffic prompt identification. IEEE Transactions on Intelligent Transportation Systems 99, 1–10 (2017)

3. F. Zhou, Z. Lu, C. Wang, W. Sun, S. Xia, Q. Liao, Image quality assessment based on inter-patch and intra-patch similarity. PLoS One 10, e0116312 (2015)

4. C. Li, Y. Ju, A.C. Bovik, et al., No-training, no-reference image quality index using perceptual features. Optical Engineering 52, 057003 (2013)

5. P. Peng, Z.N. Li, General-purpose image quality assessment based on distortion-aware decision fusion. Neurocomputing 134, 117–121 (2014)

6. N. Yun, Z. Feng, J. Yang, et al., The objective quality assessment of stereo image. Neurocomputing 120, 121–129 (2013)

7. A. Mittal, G.S. Muralidhar, J. Ghosh, et al., Blind image quality assessment without human training using latent quality factors. IEEE Signal Processing Letters 19, 75–78 (2012)

8. J. Xu, P. Ye, Q. Li, et al., Blind image quality assessment based on high order statistics aggregation. IEEE Transactions on Image Processing 25, 4444–4457 (2016)

9. R.A. Manap, L. Shao, A.F. Frangi, PATCH-IQ: a patch based learning framework for blind image quality assessment. Information Sciences 420, 349–344 (2017)

10. Z. Wang, A.C. Bovik, H.R. Sheikh, et al., Image quality assessment: from error visibility to structural similarity. IEEE Transactions on Image Processing 13, 600–612 (2004)

11. W. Zhang, C. Qu, L. Ma, et al., Learning structure of stereoscopic image for no-reference quality assessment with convolutional neural network. Pattern Recognition 59, 176–187 (2016)

12. A.K. Moorthy, A.C. Bovik, Blind image quality assessment: from natural scene statistics to perceptual quality. IEEE Transactions on Image Processing 20, 3350–3364 (2011)

13. X. Gao, F. Gao, D. Tao, et al., Universal blind image quality assessment metrics via natural scene statistics and multiple kernel learning. IEEE Transactions on Neural Networks and Learning Systems 24, 2013–2016 (2013)

14. Z. Wang, E.P. Simoncelli, A.C. Bovik, Multiscale structural similarity for image quality assessment. Conference Record of the Thirty-Seventh Asilomar Conference on Signals, Systems and Computers 2, 1398–1402 (2004)

15. L. He, D. Tao, X. Li, et al., Sparse representation for blind image quality assessment. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 1146–1153 (2012)

16. B.C. Hansen, R.F. Hess, Structural sparseness and spatial phase alignment in natural scenes. Journal of the Optical Society of America A 24, 1873–1885 (2007)

17. L.D. Griffin, The second order local-image-structure solid. IEEE Transactions on Pattern Analysis and Machine Intelligence 29, 1355–1366 (2007)

18. A. Hyvärinen, An alternative approach to infomax and independent component analysis. Neurocomputing 44, 1089–1097 (2002)

19. J. Malo, J. Gutiérrez, V1 non-linear properties emerge from local-to-global non-linear ICA. Network: Computation in Neural Systems 17, 85–102 (2006)

20. C. Zhang, L. Bai, Y. Zhang, et al., Fusion image quality appraisal method of dual-spectrum night vision technology. IEEE First International Conference on Innovative Computing, Information and Control, 656–659 (2006)

21. A. Hyvärinen, Complexity pursuit: separating interesting components from time series. Neural Computation 13, 883–898 (2001)

22. H.R. Sheikh, Z. Wang, L. Cormack, A.C. Bovik, LIVE Image Quality Assessment Database Release 2. http://live.ece.utexas.edu/research/quality. Accessed 7 June 2014

23. E.C. Larson, D.M. Chandler, Most apparent distortion: full-reference image quality assessment and the role of strategy. Journal of Electronic Imaging 19, 011006 (2010)

24. N. Ponomarenko, O. Ieremeiev, V. Lukin, et al., Color image database TID2013: peculiarities and preliminary results. European Workshop on Visual Information Processing, 106–111 (2013)

25. C.C. Chen, S.L. Hsieh, Using binarization and hashing for efficient SIFT matching. Journal of Visual Communication and Image Representation 30, 86–93 (2015)

26. C. Yan, H. Xie, D. Yang, et al., Supervised hash coding with deep neural network for environment perception of intelligent vehicles. IEEE Transactions on Intelligent Transportation Systems 19, 284–295 (2017)

27. H.R. Sheikh, A.C. Bovik, Image information and visual quality. IEEE Transactions on Image Processing 15, 430–444 (2006)

28. H.R. Sheikh, A.C. Bovik, G. de Veciana, An information fidelity criterion for image quality assessment using natural scene statistics. IEEE Transactions on Image Processing 14, 2117–2128 (2005)

29. H. Tang, N. Joshi, A. Kapoor, Learning a blind measure of perceptual image quality. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 305–312 (2011)

30. W. Xue, L. Zhang, X. Mou, Learning without human scores for blind image quality assessment. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 995–1002 (2013)

31. M.A. Saad, A.C. Bovik, C. Charrier, A DCT statistics-based blind image quality index. IEEE Signal Processing Letters 17, 583–586 (2010)

32. W. Xue, X. Mou, L. Zhang, A.C. Bovik, X. Feng, Blind image quality assessment using joint statistics of gradient magnitude and Laplacian features. IEEE Transactions on Image Processing 23, 4850–4862 (2014)

33. P. Ye, J. Kumar, L. Kang, D. Doermann, Unsupervised feature learning framework for no-reference image quality assessment. IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 1098–1105 (2012)

34. A. Mittal, A.K. Moorthy, A.C. Bovik, No-reference image quality assessment in the spatial domain. IEEE Transactions on Image Processing 21, 4695–4708 (2012)

35. A.K. Moorthy, A.C. Bovik, A two-step framework for constructing blind image quality indices. IEEE Signal Processing Letters 17, 513–516 (2010)

Acknowledgements

The authors thank the editor and anonymous reviewers for their helpful comments and valuable suggestions.

About the authors

Chuang Zhang graduated and received her PhD degree from Nanjing University of Science and Technology in 2008. Now, she is an assistant professor at the Jiangsu Key Lab of Meteorological Observation and Information Processing, Nanjing University of Information Science & Technology. Her recent research interests include optical signal processing and optical image processing.

Xianzhao Yang is a student at School of Electronic & Information Engineering, Nanjing University of Information Science & Technology. Her recent research interests include computer vision and image processing.

Xiaoyu Huang is an undergraduate student at Nanjing University of Information Science & Technology. His research interests include embedded systems and machine learning algorithms.

Guiyue Yu is an undergraduate student at Nanjing University of Information Science & Technology. His research interests include optical image processing and electronic and communication engineering.

Suting Chen received her Ph.D. degree from Institute of Optics and Electronics, the Chinese Academy of Sciences in 2007. She is currently a professor at Nanjing University of Information Science & Technology. Her research interests include digital image processing and computer vision.

Funding

This research was funded partly by the China Postdoctoral Science Foundation (No. 2015M571781), partly by the Jiangsu Key Laboratory of Meteorological Observation and Information Processing Open Project (No. KDXS1805), and partly by the Jiangsu Overseas Research & Training Program for University Prominent Young & Middle-ages Teachers and Presidents.

Availability of data and materials

The data used in this study can be provided by the authors on request.

Author information

Authors and Affiliations

Authors

Contributions

CZ and SC conceived the idea; CZ, XY, and XH performed the experiments, analyzed the data, and wrote the paper; GY developed the proofs. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Chuang Zhang.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Reprints and permissions

About this article


Cite this article

Zhang, C., Yang, X., Huang, X. et al. No-reference image quality assessment based on quality patches in real time. J Image Video Proc. 2018, 111 (2018). https://doi.org/10.1186/s13640-018-0361-z
