
An improved localization method for lesion area in gynecological ultrasound image

Abstract

Current methods for localizing lesion areas in gynecological ultrasound images suffer from high false positive and false negative rates because their effectiveness is degraded by random noise. This paper therefore proposes a novel localization method for lesion areas in gynecological ultrasound images based on the Bhattacharyya coefficient-based scale-invariant feature transform (B-SIFT). First, Rayleigh mean filtering is used to suppress noise in the ultrasound image, exploiting the Rayleigh distribution characteristics of the noise. Then, a segmentation method for the lesion region is designed using the scale-invariant feature transform (SIFT). Furthermore, the B-SIFT feature extraction function, built on the Bhattacharyya coefficient, is proposed to locate the lesion region. Finally, two lesion characteristics of the Bhattacharyya coefficient are defined, and B-SIFT-based feature region descriptors are obtained by constructing eigenvectors normalized by the Bhattacharyya coefficients. Experimental results show that the proposed method achieves high positioning accuracy, strong recall, and low energy and time consumption, making it more effective and feasible than traditional methods for lesion localization.

Introduction

Ultrasound image diagnosis is a medical diagnosis that uses reflected or scattered echo information of ultrasound to detect lesion areas in the human body, based on the difference in acoustic impedance between human tissues [18]. Compared with traditional diagnosis, it greatly improves objectivity and accuracy. Meanwhile, diagnosis based on ultrasonic transmission currently causes less damage to the human body than other medical imaging methods such as computed tomography, magnetic resonance imaging, and radionuclide imaging. In particular, ultrasound diagnosis is widely used in the domain of gynecological diagnosis because of its many good characteristics: it is real-time, non-invasive, inexpensive, portable, and painless [7, 8].

At present, there are several effective traditional segmentation methods for multi-resolution images generated by three-dimensional (3D) computed tomography (CT). One kind of method is based on source-point mapping scanning. First, the globally optimal segmentation of multi-resolution 3D CT images is calculated based on the partial differential equations of the C-V method. Then, the signed distance function is generated using the source-point mapping scanning method. In order to improve computational stability and segmentation quality, the method is combined with the fast marching method to generate the symbol table. However, the segmentation time is very long because multiple iterations are required to obtain a good segmentation result [9].

Another kind is the multi-resolution 3D CT image segmentation method based on Hough transform theory. First, the parameter space of the image is subdivided step by step according to the inverse Hough transform. Then, regions without straight-line portions are gradually excluded using the calculated endpoints and lengths of the CT image contour curves. This kind of method is computationally intensive, and the required storage capacity is large [4].

Moreover, another kind of segmentation method for multi-resolution 3D CT images is based on geometric deformation [6]. First, the inner and outer homogeneous regions of the multi-resolution 3D CT image are located. Then, the surface evolves along the second derivative of the image gradient direction, driven by the edge energy of the target. This method adapts well to background noise; however, its time complexity is very high.

In summary, current methods for recognizing and localizing lesions in ultrasound images have certain limitations. Therefore, in order to find a segmentation method with short running time, low complexity, and small storage, this paper proposes a novel method for the detection and positioning of gynecological lesions. Section 2 introduces related work. Section 3 provides the materials and methods. Section 4 presents and discusses the results. Section 5 concludes the paper.

Related work

Recently, the efficacy of positron emission tomography (PET)/magnetic resonance imaging (MRI) fusion imaging in the diagnosis of pelvic recurrence and metastasis after gynecologic tumor surgery was explored [17]. That study analyzed postoperative PET/computed tomography (CT) and pelvic MRI images of 38 patients with gynecologic malignancies. PET and MRI images were fused, and the diagnostic value of PET/CT, pelvic MRI, and PET/MRI fusion for local pelvic recurrence and metastasis was compared to achieve lesion detection and localization. Meanwhile, a multi-view coordinate system based on physiological features of the breast was established for multi-view detection and analysis of X-ray breast lesions [22, 23]. First, the mammogram multi-view analysis coordinate system was established by extracting typical physiological features of the nipple and chest muscle and by ellipse fitting of the breast edge in the axial and mid-oblique view images. Then, lesion recognition was achieved by nonlinearly mapping the bilateral four views of the mammogram into the same coordinate frame. An evidence-based method was also explored for finding the cervical sites most prone to lesions and for guiding naked-eye random examination of the cervix [15]. It was based on a retrospective analysis of biopsy positions in patients with high-grade squamous intraepithelial lesion (HSIL) diagnosed by colposcopy-guided biopsy. The locations and numbers of biopsies of patients with HSIL under colposcopy-guided cervical biopsy from January 1, 2015, to December 31, 2015, in one hospital were retrospectively analyzed. The results showed that colposcopy-guided biopsy diagnosed 1096 cases of cervical HSIL at a total of 3563 points, an average of 3.25 points per case. In addition, colposcopy served as an adjunct to early screening for cervical cancer and precancerous lesions [2].
The significance of various abnormal colposcopy images in cervical screening was highlighted in that article. The risk ranking of these abnormal images was as follows: acetowhite epithelium, punctate blood vessels, cuff-like gland openings, mosaics, iodine-negative areas, internal borders, ridge-like signs, and atypical blood vessels. Each needed to be managed according to its risk level.

There were certain deficiencies in the above research results. For example, images were not preprocessed in the detection and localization methods, which resulted in higher false positive and false negative rates of recognition and localization. In addition, positioning by the naked eye also had high false positive and false negative rates to some extent. Therefore, an improved method is proposed to address these shortcomings. The innovations of the method proposed in this paper were as follows.

  1. The Rayleigh mean filtering method was used to suppress the noise in gynecological ultrasound images, which laid a foundation for improving positioning accuracy and positioning efficiency.

  2. B-SIFT was introduced to detect and locate the lesion area using the Bhattacharyya coefficient on top of SIFT feature detection.

  3. The rationality and reliability of the proposed method were verified by the false positive and false negative rates in the experiment.

Materials and methods

Ultrasound image processing

In order to decrease the high false positive and false negative rates, Rayleigh mean filtering is used in this paper to remove noise that interferes with localization. That is, assuming the region of interest has been determined, the Rayleigh mean filtering method is used to suppress noise in the ultrasound image based on the Rayleigh distribution characteristics of the noise.

The effect of noise on the signal is generally modeled as multiplicative noise in the filtering process, as in Eq. 1.

$$ v=u\bullet \eta $$
(1)

where v represents the noise observation signal, u represents no noise signal, and η represents the multiplicative noise component.

Since the influence of the additive noise component is small compared to the multiplicative component, it is generally neglected. The Rayleigh distribution model of the noise in multiplicative form can be expressed as Eq. 2.

$$ \uppsi (x)=\frac{x}{\upsigma_n^2}\exp \left(-\frac{x^2}{2{\upsigma}_n^2}\right) $$
(2)

where σn represents the shape parameter and x denotes the amplitude of the multiplicative noise.

To estimate u(i, j) in Eq. 1 from the ultrasound image, u(i, j) is assumed to be a constant value β over the estimated observation window Ω, which means that u(i, j) ≈ β for (i, j) ∈ Ω. Therefore, for every v(i, j) with (i, j) ∈ Ω, σv = σηβ, where σv is the parameter of the Rayleigh distribution of v(i, j). The maximum likelihood estimate of σv can then be defined as Eq. 3.

$$ \hat{\upsigma_v}={\left(\frac{1}{2\left|\Omega \right|}\sum \limits_{\left(i,j\right)\in \Omega}{v}^2\left(i,j\right)\right)}^{\frac{1}{2}} $$
(3)

where |Ω| represents the cardinality of Ω. The maximum likelihood estimate of u(i, j) can be expressed as Eq. 4.

$$ \hat{\mathrm{u}}\left({i}_c,{j}_c\right)=\hat{\upbeta}=\hat{\upsigma_v}{\left({\upsigma}_{\upeta}\right)}^{-1}=\frac{1}{\upsigma_{\upeta}}{\left(\frac{1}{2\left|\Omega \right|}\sum \limits_{\left(i,j\right)\in \Omega}{v}^2\left(i,j\right)\right)}^{\frac{1}{2}} $$
(4)

where (ic, jc) represents the position of the estimated pixel.
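The noise model and estimator of Eqs. 1–4 can be sketched in a few lines of numpy; the window size, σ_η, and β values below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def rayleigh_ml_estimate(window, sigma_eta):
    """ML estimate of the constant noise-free value beta in a window.

    sigma_v_hat = sqrt( sum(v^2) / (2*|Omega|) )   (Eq. 3)
    u_hat       = sigma_v_hat / sigma_eta          (Eq. 4)
    """
    sigma_v_hat = np.sqrt(np.sum(window ** 2) / (2.0 * window.size))
    return sigma_v_hat / sigma_eta

# Sanity check with synthetic multiplicative Rayleigh noise (Eqs. 1-2):
rng = np.random.default_rng(0)
sigma_eta = 1.0
beta = 50.0                                        # hypothetical true tissue intensity
eta = rng.rayleigh(scale=sigma_eta, size=(64, 64)) # multiplicative Rayleigh component
v = beta * eta                                     # v = u * eta with u == beta in the window
print(rayleigh_ml_estimate(v, sigma_eta))          # close to 50
```

Since E[η²] = 2σ_η² for a Rayleigh variable, the estimator recovers β up to sampling error of the window.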

The relative value of the noiseless signal can be defined as Eq. 5 according to the above calculation and analysis.

$$ {u}_r\left({i}_c,{j}_c\right)={\left(\frac{1}{2\left|\Omega \right|}\sum \limits_{\left(i,j\right)\in \Omega}{v}^2\left(i,j\right)\right)}^{\frac{1}{2}} $$
(5)

In Eq. 5, ση is a constant for gynecological ultrasound images.

Then, the relative standard deviation of the local noise-free pixel values in the observation window Ω with center (ic, jc) is calculated by Eq. 6.

$$ \mathrm{RSD}\left({i}_c,{j}_c\right)={\left(\frac{1}{\left|\Omega \right|}\sum \limits_{\left({m}_c,{n}_c\right)\in \Omega}{\left[{u}_r\left({m}_c,{n}_c\right)-\frac{1}{\left|\Omega \right|}\sum \limits_{\left({p}_c,{q}_c\right)\in \Omega}{u}_r\left({p}_c,{q}_c\right)\right]}^2\right)}^{\frac{1}{2}} $$
(6)

Equation 6 shows that, when RSD(ic, jc) is relatively small, the noise-free image has similar intensity across the observation window and the region is relatively smooth in the absence of noise. Meanwhile, a relatively large RSD(ic, jc) value indicates that there are structures in the area that need to be preserved.

Then, we introduce α-mean filtering to achieve adaptive removal of ultrasound image noise based on the application of RSD.

The output fα of α-mean filtering for an ascending ordered set {g(1), g(2),…, g(n)} can be defined as Eq. 7

$$ {f}_{\alpha }=\frac{1}{n-2\left[\alpha n\right]}\sum \limits_{j=1+\left[\alpha n\right]}^{n-\left[\alpha n\right]}g(j) $$
(7)

where [ ] represents the rounding (floor) operation, and α ∈ (0, 0.5] represents the balance degree of the filter. When α tends to 0, the α-mean filter approaches a mean filter, which smooths the image; when α approaches 0.5, the α-mean filter approximates a median filter, thereby preserving structure in the image [11, 12].
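Equation 7 is the classical alpha-trimmed mean; a minimal sketch (the sample values below are hypothetical):

```python
import numpy as np

def alpha_trimmed_mean(values, alpha):
    """Alpha-trimmed mean of Eq. 7: sort, drop [alpha*n] values
    at each end, and average the remaining n - 2*[alpha*n] values."""
    g = np.sort(np.asarray(values, dtype=float))
    n = g.size
    k = int(alpha * n)          # [alpha*n], the floor operation
    return g[k:n - k].mean()

vals = [1, 2, 3, 4, 100]                # one outlier
print(alpha_trimmed_mean(vals, 0.0))    # plain mean -> 22.0
print(alpha_trimmed_mean(vals, 0.4))    # trims 2 from each end, median-like -> 3.0
```

This makes the two limiting behaviors in the text concrete: α → 0 averages everything, while α near 0.5 keeps only the central order statistics.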

A logarithmic mapping, given in Eq. 8, is proposed to establish the relationship between RSD and α and to bias the α-mean filtering toward near-zero values of α

$$ \alpha ={\log}_a\left(\mathrm{RSD}+b\right)+c $$
(8)

where a, b, and c represent the filter influence parameters.

The input ultrasound image follows the steps below to achieve adaptive noise suppression. First, the value of RSD is calculated based on Eq. 6. Then, the value of α is obtained based on Eq. 8. Finally, the corresponding α-means filtering is finished to suppress noise in the ultrasound image adaptively.

Segmentation of lesions

A multi-resolution ultrasound image is segmented using graph cut theory on a network map constructed from the gray-level and spatial information of the image [21]. Graph cut theory constructs an energy function and then uses combinatorial optimization techniques to minimize it. In this theory, an image segmentation problem is a typical combinatorial optimization problem of binary (foreground/background) labeling of pixels [10]. It transforms the labeling problem into a maximum-flow/minimum-cut problem by constructing the network and minimizing the energy function via network flow theory [3, 13]. The graph cut theory calculation framework is shown in Fig. 1.

Fig. 1

Computational framework of graph cut theory

Then, the segmentation of lesions is studied based on the above graph cutting theory. The specific segmentation process is as follows.

First, a network map G(v, e) is established based on the filtering results of the multi-resolution ultrasound image in Section 3.1, where v represents all nodes in the network map and e represents the collection of edges connecting all neighboring nodes. Next, network terminal nodes s and t are added, where s represents the source point and t represents the sink point. The capacity of a network edge is defined by the gray-level and spatial information between the multi-resolution ultrasound image regions. The gray-level and spatial information between regions is obtained by averaging all node pixels within the ultrasound image regions [5, 16].

Secondly, the multi-resolution ultrasound image is segmented by solving the minimum cut of the network map. A cut is a partition of the network's vertices into two disjoint sets S and T′, obtained by deleting the edges between them, where the source point s ∈ S and the sink point t ∈ T′. The capacity of a cut is the sum of the capacities of all edges removed in dividing the network map into the two disjoint sets. The minimum cut is given in Eq. 9

$$ \min \mathrm{cut}\left(S,{T}^{\prime}\right)=\sum \limits_{\mu \in S,v\in T^{\prime }}\upomega \left(\mu, v\right) $$
(9)

where cut(S, T′) represents the capacity of the cut and ω(μ, v) represents the capacity of the edges between nodes μ and v. Since the multi-resolution ultrasound image has been divided into a plurality of regions by region growing, the capacity of an edge in the network map G encodes the gray-level and spatial information of the ultrasound image regions. The capacity of an edge is calculated by Eq. 10

$$ \upomega \left(\mu, v\right)=\exp \left(\frac{-{\left|I\left(\mu \right)-I(v)\right|}^2}{\alpha}\right)\bullet \exp \left(\frac{-{\left|D\left(\mu, v\right)\right|}^2}{\beta}\right) $$
(10)

where I(μ) and I(v) represent the average gray levels of the multi-resolution ultrasound image regions μ and v, respectively; D(μ, v) represents the distance between regions μ and v; and α and β are adjustment parameters. The minimum cut of Eq. 11 is obtained by substituting Eq. 10 into Eq. 9.

$$ \min \mathrm{cut}\left(S,{T}^{\prime}\right)=\sum \limits_{\mu \in S,v\in T^{\prime }}\exp \left(\frac{-{\left|I\left(\mu \right)-I(v)\right|}^2}{\alpha}\right)\bullet \exp \left(\frac{-{\left|D\left(\mu, v\right)\right|}^2}{\beta}\right) $$
(11)

According to the maximum-flow/minimum-cut theorem, the value of the maximum flow equals the capacity of the minimum cut in any network. The minimum cut can therefore be solved, and the multi-resolution ultrasound image segmented, based on Eq. 11 [20, 24].
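A sketch of the edge capacity of Eq. 10; the α and β adjustment parameter values here are assumptions. The resulting capacities would feed any standard max-flow/min-cut solver (e.g., the Boykov–Kolmogorov algorithm) to obtain the cut of Eq. 11:

```python
import numpy as np

def edge_capacity(I_mu, I_nu, D_mu_nu, alpha=500.0, beta=50.0):
    """Capacity of Eq. 10 between two regions: regions with similar mean gray
    levels that lie close together get large capacities, so the minimum cut
    avoids separating them. alpha and beta are illustrative values."""
    gray = np.exp(-((I_mu - I_nu) ** 2) / alpha)   # gray-level similarity term
    dist = np.exp(-(D_mu_nu ** 2) / beta)          # spatial proximity term
    return gray * dist

# Adjacent regions with similar mean gray -> strong edge (capacity near 1);
# distant regions with very different gray -> weak edge (capacity near 0).
print(edge_capacity(120.0, 125.0, 1.0))
print(edge_capacity(120.0, 200.0, 10.0))
```

Because both factors lie in (0, 1], the minimum cut of Eq. 11 preferentially severs the weak edges between dissimilar, distant regions, which is exactly the desired region boundary.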

Pathological localization based on B-SIFT

SIFT is a local feature detection method widely used in many fields of computer vision. It is scale- and rotation-invariant, has good robustness to noise and image brightness changes, and carries rich local feature information. On this basis, the B-SIFT feature detection method is introduced to locate the lesion region after the ultrasound image processing described above. B-SIFT is a feature detection method that optimizes SIFT, and its reliability is better than that of SIFT.

The extraction of SIFT feature vectors follows the steps below.

(1) A 16 × 16 pixel frame is created around each feature point in the initial shape vector x0′, and the gradient of each pixel is calculated; (2) a gradient histogram with 8 orientation bins is accumulated over each 4 × 4 pixel block, giving the accumulated value of each gradient direction.

In this way, a 4 × 4 × 8 = 128-dimensional SIFT feature vector (also called a descriptor) is obtained. On this basis, the poor recognition and localization of lesions in current research results can be further improved [19, 25].
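The two extraction steps above can be sketched as a simplified SIFT-like descriptor; full SIFT additionally applies Gaussian weighting and trilinear interpolation, which are omitted in this illustrative version:

```python
import numpy as np

def sift_like_descriptor(patch):
    """128-D descriptor from a 16x16 patch: 4x4 cells, each with an 8-bin
    gradient-orientation histogram weighted by gradient magnitude
    (a simplified sketch of the two steps in the text)."""
    assert patch.shape == (16, 16)
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % (2 * np.pi)           # orientation in [0, 2*pi)
    bins = (ang / (2 * np.pi) * 8).astype(int) % 8   # 8 orientation bins
    desc = np.zeros(128)
    for ci in range(4):
        for cj in range(4):
            sl = (slice(4 * ci, 4 * ci + 4), slice(4 * cj, 4 * cj + 4))
            hist = np.bincount(bins[sl].ravel(), weights=mag[sl].ravel(), minlength=8)
            desc[(ci * 4 + cj) * 8:(ci * 4 + cj) * 8 + 8] = hist
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc         # L2 normalization

rng = np.random.default_rng(1)
d = sift_like_descriptor(rng.random((16, 16)) * 255)
print(d.shape, round(float(np.linalg.norm(d)), 6))   # -> (128,) 1.0
```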

In addition, the obtained SIFT feature descriptors need to be normalized to further reduce the influence of differences in the gradient histograms [1, 14]. That is, for one SIFT feature \( {h}_s\in {\mathbb{R}}^{128\times 1} \), we have \( {\left\Vert {h}_s\right\Vert}_2^2=1 \). In this paper, the B-SIFT feature based on the Bhattacharyya coefficient is used as the lesion feature region, and its construction is as follows.

First, the Bhattacharyya coefficient is defined as Eq. 12 for the two lesion feature vectors x′ and y′

$$ {\mathrm{Coe}}_{\mathrm{B}}\left({\mathrm{x}}^{\prime },{\mathrm{y}}^{\prime}\right)=\upalpha \sum \limits_{\mathrm{i}=1}\sqrt{{\mathrm{x}}_{\mathrm{i}}^{\prime }{\mathrm{y}}_{\mathrm{i}}^{\prime }} $$
(12)

where \( \sum \limits_{\mathrm{i}=1}{\mathrm{x}}_{\mathrm{i}}^{\prime }=1 \) (xi′ ≥ 0) and \( \sum \limits_{\mathrm{i}=1}{\mathrm{y}}_{\mathrm{i}}^{\prime }=1 \) (yi′ ≥ 0). Therefore, \( {\mathrm{Coe}}_{\mathrm{B}}\left({\mathrm{x}}^{\prime },{\mathrm{x}}^{\prime}\right)=\sum \limits_{\mathrm{i}=1}{\mathrm{x}}_{\mathrm{i}}^{\prime }=1 \). The formula for the squared Euclidean distance between \( \sqrt{{\mathrm{x}}_{\mathrm{i}}^{\prime }} \) and \( \sqrt{{\mathrm{y}}_{\mathrm{i}}^{\prime }} \) is expressed as Eq. 13 to analyze the algebraic relationship between the Bhattacharyya coefficient and the Euclidean distance.

$$ \mathrm{Dist}\left(\sqrt{{\mathrm{x}}_{\mathrm{i}}^{\prime }},\sqrt{{\mathrm{y}}_{\mathrm{i}}^{\prime }}\right)={\left\Vert \sqrt{{\mathrm{x}}_{\mathrm{i}}^{\prime }}-\sqrt{{\mathrm{y}}_{\mathrm{i}}^{\prime }}\right\Vert}^2=2-2{\mathrm{Coe}}_{\mathrm{B}}\left({\mathrm{x}}^{\prime },{\mathrm{y}}^{\prime}\right) $$
(13)

The original Euclidean distance optimization problem can thus be transformed into one over the Bhattacharyya coefficient via Eq. 13. Once the eigenvector normalized by the Bhattacharyya coefficient is constructed according to Eq. 12, the final B-SIFT-based feature region descriptor hB is obtained by Eq. 14.

$$ {\mathrm{h}}_{\mathrm{B}}=\sqrt{\frac{\upvarphi_{\mathrm{k}}^{\left(\mathrm{i}\right)}}{\sum \limits_{i=1}{h}_i}}\kern1.25em \mathrm{where}\ {\mathrm{h}}_{\mathrm{i}}\in {\upvarphi}_{\mathrm{k}}^{\left(\mathrm{i}\right)} $$
(14)

where \( {\upvarphi}_{\mathrm{k}}^{\left(\mathrm{i}\right)} \) represents the SIFT feature descriptor.

The mapping relationship between the B-SIFT feature and the SIFT feature is obtained from Eq. 14. In this process, the influence of larger values on feature vector similarity is reduced and the sensitivity to smaller values is enhanced, which improves the positioning strength of the proposed method. Finally, \( {\upvarphi}_{\mathrm{k}}^{\left(\mathrm{i}\right)} \) is replaced by hB as the new feature point descriptor, which yields the localization of lesions in gynecological ultrasound images.
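A numerical sketch of Eqs. 12–14 (taking the scale factor α = 1 in Eq. 12): the B-SIFT descriptor is the elementwise square root of the L1-normalized histogram, so the squared Euclidean distance between descriptors reproduces the Bhattacharyya relation of Eq. 13. The random 128-bin histograms are stand-ins for real SIFT descriptors:

```python
import numpy as np

def bhattacharyya(x, y):
    """Bhattacharyya coefficient of Eq. 12 (with the scale factor alpha = 1)."""
    return float(np.sum(np.sqrt(x * y)))

def b_sift(h):
    """B-SIFT descriptor of Eq. 14: L1-normalize the SIFT histogram,
    then take elementwise square roots."""
    h = np.asarray(h, dtype=float)
    return np.sqrt(h / h.sum())

rng = np.random.default_rng(2)
hx, hy = rng.random(128), rng.random(128)    # stand-in SIFT histograms
bx, by = b_sift(hx), b_sift(hy)

x, y = hx / hx.sum(), hy / hy.sum()          # the normalized vectors x', y' of Eq. 12
print(round(bhattacharyya(x, x), 6))                                       # -> 1.0
print(round(abs(np.sum((bx - by) ** 2) - (2 - 2 * bhattacharyya(x, y))), 6))  # -> 0.0
```

The second print verifies Eq. 13: minimizing Euclidean distance between B-SIFT descriptors is equivalent to maximizing the Bhattacharyya similarity, which is why the descriptor can be matched with ordinary nearest-neighbor search.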

Results and discussion

In order to verify the effectiveness and feasibility of the improved method for locating lesions in gynecological ultrasound images, the following experiments are presented.

The experimental platform is Matlab 10.0b, and the image library used in the experiment contains 150 breast ultrasound images, including 90 benign and 60 malignant images. The ultrasound images used in the experiment were obtained from the gynecological ultrasound image library of the Second Affiliated Hospital of one provincial medical university. These images were acquired in real time during clinical diagnosis using the VIVID ultrasound imaging system. The probes are all 38 mm linear probes operating at 5.6–14 MHz. The experimental parameter settings are shown in Table 1.

Table 1 Software and hardware parameter settings

The experiment compares the performance of the proposed method with some traditional methods. The experimental indicators are false positive rate and false negative rate. Figure 2 is an example of experimental data.

Fig. 2

Experimental data example. a Example 1. b Example 2

Figure 3 is obtained based on the above experimental environment and data.

Fig. 3

Comparison of false positive rates in different methods. a False positive rate of [17]. b False positive rate of Wininger et al. [23]. c False positive rate of lesion localization in gynecological ultrasound images based on B-SIFT (proposed)

Figure 3 shows that the false positive rates of the methods from the literature are high. Compared with these traditional methods, the proposed method yields a lower false positive rate, which means it has higher positioning accuracy. The method uses Rayleigh mean filtering to adaptively denoise the ultrasound image before the lesion region is located, which improves positioning accuracy to some extent. Lesion localization is then carried out with B-SIFT, which is built on SIFT and thus inherits its scale and rotation invariance and its robustness to noise and image brightness changes. Therefore, the proposed method improves positioning accuracy.

Figure 4 shows that in Pillay's method the false negative rate of positioning varies gradually with the number of image frames; the maximum false negative rate reaches 3%, and the average is about 2.4%. In Arnaud's method, the false negative rate changes steadily as the number of image frames increases, with an average of about 1.5% and a maximum of 3%. The proposed method remains stable as the number of image frames increases, with a maximum false negative rate of 0.8% and an average of 0.35%.

Fig. 4

Comparison of false negative rates in different methods. a False negative rate of [15]. b False negative rate of Arnaud et al. [2]. c False negative rate of lesion localization in gynecological ultrasound images based on B-SIFT (proposed)

From the comparison, we find that the false negative rate of the proposed method is lower than that of the traditional methods, which means the proposed method is more comprehensive. The proposed method performs lesion localization based on B-SIFT: the SIFT feature vector is extracted by building a 16 × 16 pixel frame around each feature point in the initial shape vector, the gradient of each pixel is accumulated into 8-direction histograms over 4 × 4 pixel blocks, and the accumulated value of each gradient direction is obtained. This greatly enhances the positioning ability of the method and reduces the false negative rate.

The comparison between the proposed method and the traditional results is analyzed to further verify the effectiveness of the proposed method. Here, the energy consumption and positioning time of the lesion area are selected as indicators. The results are shown in Fig. 5 and Table 2.

Fig. 5

Comparison of energy consumption in different methods. a Positioning energy consumption of [15]. b Positioning energy consumption of Arnaud et al. [2]. c Lesion localization energy consumption of gynecological ultrasound images based on B-SIFT (proposed)

Table 2 Positioning time comparison

Figure 5 shows that the energy consumption of B-SIFT-based gynecological ultrasound image lesion localization is as low as 30 J. In addition, the method has better positioning stability, with small fluctuation over time and over the amount of experimental data. The energy consumption of the two traditional methods is consistently high, with an average of about 50 J, significantly higher than that of the proposed method. This verifies that the B-SIFT-based gynecological ultrasound image lesion localization method performs better.

The time-consuming comparison of gynecological ultrasound image lesion localization is shown in Table 2.

Table 2 shows that the positioning time of all three methods increases continuously with the amount of experimental data. However, the overall positioning time of the proposed method is the smallest (no more than 6 min), while the other two methods take up to 19 min and 26 min, respectively. This verifies that the proposed method can quickly complete the localization of lesions in gynecological ultrasound images, greatly saving positioning time. The reason is that the proposed method obtains the B-SIFT-based feature region descriptor by constructing the eigenvector normalized by the Bhattacharyya coefficient, which is used to realize lesion region localization. This step greatly simplifies the traditional approach and reduces positioning time.

Conclusion

Medical diagnosis using ultrasound images is an important method in clinical diagnosis. Its non-destructive, real-time, non-invasive nature and other advantages have made it widely used in gynecological diagnosis. In recent years, early gynecological diagnosis based on ultrasound images has been a research hotspot. Since ultrasound imaging is limited by a low signal-to-noise ratio, accurate diagnosis is difficult. In view of the current problems in the recognition and localization of gynecological lesions, a B-SIFT-based localization method for gynecological ultrasound image lesions is proposed. The image is enhanced by Rayleigh mean filtering, and the lesion region is localized by B-SIFT on the enhanced image. The experimental results show that the proposed method is robust and can be used in real applications.

Today, from the perspective of gynecological ultrasound image analysis and processing, the problems that need to be solved in early gynecological diagnosis differ greatly, and different image analysis and processing technologies address different problems. Therefore, how to effectively represent the causal links in the doctor's cognitive process on this basis remains a very important and urgent problem.

Availability of data and materials

Data sharing not applicable to this article as no datasets were generated or analyzed during the current study.

Abbreviations

SIFT:

Scale-invariant feature transform

B-SIFT:

Bhattacharyya coefficient-based scale-invariant feature transform

3D:

Three dimensional

CT:

Computed tomography

HSIL:

High-grade squamous intraepithelial lesion

PET:

Positron emission tomography

MRI:

Magnetic resonance imaging

References

  1. S. Alamdaran, D. Farokh, A. Haddad, et al., Assessment of ultrasound/radio-guided occult lesion localization in non-palpable breast lesions. Asia Oceania J. Nucl. Med. Biol. 6(1), 10–14 (2018)

  2. A. Arnaud, F. Forbes, N. Coquery, et al., Fully automatic lesion localization and characterization: Application to brain tumors using multiparametric quantitative MRI data. IEEE Trans. Med. Imaging 37(7), 1678–1689 (2018)

  3. M.S. Brown, K. Ghj, G.H. Chu, et al., Quantitative bone scan lesion area as an early surrogate outcome measure indicative of overall survival in metastatic prostate cancer. J. Med. Imaging 5(1), 011017 (2018)

  4. L. Cao, H. Chen, J. Li, et al., Physiological features based coordinate system for multi-view analysis in mammograms. J. Electron. Inf. Technol. 39(1), 176–182 (2017)

  5. E. Cheang, R. Ha, C. Thornton, et al., Innovations in image guided preoperative breast lesion localization. Br. J. Radiol. 91(1085), 20170740 (2018)

  6. Q. Cong, N.H. Jiang, L. Sui, A study on punch biopsy locations of cervical lesions in the absence of colposcopy. Progress Obstet. Gynecol. 26(9), 666–669 (2017)

  7. M. Dinomais, L. Hertz-Pannier, S. Groeschel, et al., Long term motor function after neonatal stroke: Lesion localization above all. Hum. Brain Mapp. 36(12), 4793–4807 (2015)

  8. C. Duarte, F. Bastidas, A.D.L. Reyes, et al., Randomized controlled clinical trial comparing radioguided occult lesion localization with wire-guided lesion localization to evaluate their efficacy and accuracy in the localization of nonpalpable breast lesions. Surgery 159(4), 1140–1145 (2016)

  9. A. Dwivedi, G. Srivastava, S. Dhar, et al., A decentralized privacy-preserving healthcare blockchain for IoT. Sensors 19(2), 326 (2019)

  10. G. Han, X. Liu, G. Zheng, Automated detection of lesion regions in lung computed tomography images: A review. Acta Automat. Sin. 43(12), 2071–2090 (2017)

  11. S. Liu, C. Guo, F. Al-Turjman, et al., Reliability of response region: A novel mechanism in visual tracking by edge computing for IIoT environments. Mech. Syst. Signal Process. 138, 106537 (2020)

  12. S. Liu, W. Bai, G. Liu, et al., Parallel fractal compression method for big video data. Complexity 2018, 2016976 (2018). https://doi.org/10.1155/2018/2016976

  13. L. Shuai, W. Shuai, L. Xinyu, et al., Fuzzy detection aided real-time and robust visual tracking under complex environments. IEEE Trans. Fuzzy Syst., accepted (2020). https://doi.org/10.1109/TFUZZ.2020.3006520

  14. L. Mengye, L. Shuai, Nucleosome positioning based on generalized relative entropy. Soft Comput. 23, 9175–9188 (2020)

  15. S.B. Pillay, J.R. Binder, C. Humphries, et al., Lesion localization of speech comprehension deficits in chronic aphasia. Neurology 88(10), 970–975 (2017)

  16. F. Pozzati, F. Mascilini, G. Magoga, et al., P30.08: Imaging in gynecological disease: Clinical and ultrasound characteristics of ovarian metastasis from low-grade appendiceal mucinous neoplasms (LAMNs). Ultrasound Obstet. Gynecol. 50(S1), 255 (2017)

  17. D.Y. Qian, Screening of abnormal images of colposcopy for cervical lesions. Chin. J. Pract. Gynecol. Obstet. 32(5), 399–402 (2016)

  18. G.B. Raina, M.G. Cersosimo, S.S. Folgar, et al., Holmes tremor: Clinical description, lesion localization, and treatment in a series of 29 cases. Neurology 86(10), 931–938 (2016)

  19. K.I. Seo, W. Moon, C.W. Lee, et al., Minimal resection of jejunal Dieulafoy's lesion using an intraoperative fluoroscopic localization of the metallic coils used in angiography. Korean J. Gastroenterol. 69(2), 135–138 (2017)

  20. L. Shuai, L. Gaocheng, Z. Huiyu, A robust parallel object tracking method for illumination variations. Mobile Networks Appl. 24(1), 5–17 (2019)

  21. D. Taylor, A. Bourke, M. Hobbs, et al., Breast localization techniques and margin definitions used by Australian and New Zealand surgeons. ANZ J. Surg. 86(10), 744–745 (2016)

  22. H. Wang, J. Li, L. Guo, et al., Fractal complexity-based feature extraction algorithm of communication signals. Fractals 25(5), 1740008 (2017)

  23. Y.D. Wininger, N.A. Buckalew, R.A. Kaufmann, et al., Ultrasound combined with electrodiagnosis improves lesion localization and outcome in posterior interosseous neuropathy. Muscle Nerve 52(6), 1117–1121 (2015)

  24. J. Xiong, X.Z. Zhou, H. Guo, et al., An improved active bat indoor ultrasonic positioning method. J. Detect. Control 39(1), 101–105 (2017)

  25. P. Zheng, L. Shuai, K. Arun, et al., Visual attention feature (VAF): A novel strategy for visual tracking based on cloud platform in intelligent surveillance systems. J. Parallel Distrib. Comput. 120, 182–194 (2018)


Acknowledgements

No other acknowledgements.

Funding

This research is supported by no funds.

Author information

Affiliations

Authors

Contributions

Dr. Yan Sujuan designs and codes the total algorithm. Dr. Jin Hong provides images and analyzes the results. The author(s) read and approved the final manuscript.

Corresponding author

Correspondence to Hong Jin.

Ethics declarations

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.


About this article


Cite this article

Yan, S., Jin, H. An improved localization method for lesion area in gynecological ultrasound image. J Image Video Proc. 2020, 43 (2020). https://doi.org/10.1186/s13640-020-00530-6


Keywords

  • Gynecology
  • Ultrasound image
  • Lesion area
  • Localization
  • Rayleigh distribution
  • Noise suppression
  • Bhattacharyya coefficient