
Table 3 Extraction of twenty-two GLCM features

From: A probabilistic segmentation and entropy-rank correlation-based feature selection approach for the recognition of fruit diseases

Feature name

Equation

Autocorrelation

\(\widetilde {\phi ^{R}}=\sum _{k}\sum _{l}(k\times l)P(k,l)\)

Contrast

\(\widetilde {\phi ^{C}}=\sum _{k=1}^{\phi ^{\Gamma }}\sum _{l=1}^{\phi ^{\Gamma }}|k-l|^{2}P(k,l) \)

Correlation 1

\(\widetilde {\phi ^{R_{1}}}=\sum _{k=0}^{\phi ^{\Gamma }-1}\sum _{l=0}^{\phi ^{\Gamma }-1}(k\times l)P(k,l)-\phi ^{\mu _{x}}\phi ^{\mu _{y}} \)

Correlation 2

\(\widetilde {\phi ^{R_{2}}}=\frac {\sum _{k=0}^{\phi ^{\Gamma }-1}\sum _{l=0}^{\phi ^{\Gamma }-1}(k-\phi ^{\mu _{x}})(l-\phi ^{\mu _{y}})P(k,l)}{\sigma _{x}\sigma _{y}}\)

Cluster prominence

\(\widetilde {\phi ^{\mathbb {P}}}=\sum _{k=0}^{\phi ^{\Gamma }-1}\sum _{l=0}^{\phi ^{\Gamma }-1}\{k+l-\phi ^{\mu _{x}}-\phi ^{\mu _{y}}\}^{4}P(k,l)\)

Cluster shade

\(\widetilde {\phi ^{S}}=\sum _{k=0}^{\phi ^{\Gamma }-1}\sum _{l=0}^{\phi ^{\Gamma }-1}\{k+l-\phi ^{\mu _{x}}-\phi ^{\mu _{y}}\}^{3}P(k,l)\)

Dissimilarity

\(\widetilde {\phi ^{D}}=\sum _{k}\sum _{l}P(k,l)|k-l|\)

Energy

\(\widetilde {\phi ^{\mathbb {E}}}=\sum _{k}\sum _{l} P(k,l)^{2}\)

Entropy

\(\widetilde {\phi ^{H}}=-\sum _{k}\sum _{l} P(k,l)\log P(k,l)\)

Homogeneity 1

\(\widetilde {\phi ^{\alpha _{1}}}=\sum _{k=0}^{\phi ^{\Gamma }-1}\sum _{l=0}^{\phi ^{\Gamma }-1}\frac {P(k,l)}{1+|k-l|}\)

Homogeneity 2

\(\widetilde {\phi ^{\alpha _{2}}}=\sum _{k=0}^{\phi ^{\Gamma }-1}\sum _{l=0}^{\phi ^{\Gamma }-1}\frac {P(k,l)}{1+(k-l)^{2}}\)

Maximum probability

\(\widetilde {\phi ^{P}}=\max _{k,l}\ P(k,l)\)

Sum of squares (variance)

\(\widetilde {\phi ^{\sum {\hat \sigma ^{2}}}}=\sum _{k=0}^{\phi ^{\Gamma }-1}\sum _{l=0}^{\phi ^{\Gamma }-1} (k-\phi ^{\mu })^{2} P(k,l)\)

Sum average

\(\widetilde {\phi ^{\sum \mathbb {A}}}=\sum _{k=2}^{2\phi ^{\Gamma }-2}k P_{x+y}(k)\)

Sum entropy

\( \widetilde {\phi ^{\sum {H}}}=-\sum _{k=2}^{2\phi ^{\Gamma }-2}P_{x+y}(k)\log (P_{x+y}(k))\)

Sum variance

\(\widetilde {\phi ^{\sum \sigma ^{2}}}= \sum _{k=2}^{2\phi ^{\Gamma }-2}(k-\widetilde {\phi ^{\sum {H}}})^{2}P_{x+y}(k) \)

Difference variance

\(\widetilde {\phi ^{\bar {\sigma ^{2}}}}=\sigma ^{2}(P_{x-y})\)

Difference entropy

\(\widetilde {\phi ^{\breve {H}}} = -\sum _{k=0}^{\phi ^{\Gamma }-1}P_{x-y}(k)\log \{P_{x-y}(k)\}\)

Information measure of correlation 1

\(\widetilde {\phi ^{MR_{1}}}=\frac {\widetilde {\phi ^{H}}-H_{xy1}}{\max (H_{x},H_{y})} \)

Information measure of correlation 2

\(\widetilde {\phi ^{MR_{2}}}=\sqrt {1-\exp \left [-2.0\left (H_{xy2}-\widetilde {\phi ^{H}}\right )\right ]}\)

Inverse difference normalized

\(\widetilde {\phi ^{D^{-1}}}=\sum _{k} \sum _{l}\frac {P(k,l)}{1+\frac {|k-l|}{\phi ^{\Gamma }}}\)

Inverse difference moment normalized

\(\widetilde {\phi ^{DM^{-1}}}=\sum _{k} \sum _{l}\frac {P(k,l)}{1+\frac {(k-l)^{2}}{(\phi ^{\Gamma })^{2}}}\)
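
For readers implementing the table, the following is a minimal sketch of how a subset of these features can be computed, assuming \(P(k,l)\) is the normalized gray-level co-occurrence matrix, \(\phi ^{\Gamma }\) the number of gray levels, \(\phi ^{\mu _{x}}\), \(\phi ^{\mu _{y}}\) the marginal means, and \(P_{x+y}\) the sum distribution. The function name `glcm_features` and the toy matrix are illustrative, not taken from the paper, and indices are zero-based, so the sum-distribution index runs over 0 to \(2(\phi ^{\Gamma }-1)\).

```python
import numpy as np

def glcm_features(P):
    """Compute a subset of the GLCM features in Table 3 from a
    co-occurrence matrix P (rows and columns index gray levels)."""
    P = P / P.sum()                      # normalize so entries form a probability distribution
    n = P.shape[0]                       # number of gray levels (phi^Gamma)
    k, l = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")

    mu_x = np.sum(k * P)                 # marginal mean over rows (phi^mu_x)
    mu_y = np.sum(l * P)                 # marginal mean over columns (phi^mu_y)
    sigma_x = np.sqrt(np.sum((k - mu_x) ** 2 * P))
    sigma_y = np.sqrt(np.sum((l - mu_y) ** 2 * P))
    eps = np.finfo(float).eps            # guard against log(0)

    feats = {
        "autocorrelation": np.sum(k * l * P),
        "contrast":        np.sum((k - l) ** 2 * P),
        "correlation":     np.sum((k - mu_x) * (l - mu_y) * P) / (sigma_x * sigma_y),
        "dissimilarity":   np.sum(np.abs(k - l) * P),
        "energy":          np.sum(P ** 2),
        "entropy":         -np.sum(P * np.log(P + eps)),
        "homogeneity_1":   np.sum(P / (1.0 + np.abs(k - l))),
        "max_probability": P.max(),
    }

    # Sum distribution P_{x+y}(s): total probability of pairs with k + l = s.
    s = k + l
    P_sum = np.array([P[s == v].sum() for v in range(2 * n - 1)])
    feats["sum_average"] = np.sum(np.arange(2 * n - 1) * P_sum)
    feats["sum_entropy"] = -np.sum(P_sum * np.log(P_sum + eps))
    return feats

if __name__ == "__main__":
    # Toy 4-level co-occurrence counts; normalization happens inside the function.
    glcm = np.array([[4, 2, 1, 0],
                     [2, 3, 2, 1],
                     [1, 2, 4, 2],
                     [0, 1, 2, 3]], dtype=float)
    for name, value in glcm_features(glcm).items():
        print(f"{name:>16s}: {value:.4f}")
```

The remaining sum- and difference-based features (sum variance, difference variance, difference entropy) follow the same pattern once the \(P_{x+y}\) and \(P_{x-y}\) distributions have been accumulated.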