<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">CSSE</journal-id>
<journal-id journal-id-type="nlm-ta">CSSE</journal-id>
<journal-id journal-id-type="publisher-id">CSSE</journal-id>
<journal-title-group>
<journal-title>Computer Systems Science &#x0026; Engineering</journal-title>
</journal-title-group>
<issn pub-type="ppub">0267-6192</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">32935</article-id>
<article-id pub-id-type="doi">10.32604/csse.2023.032935</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>An Efficient Hybrid Optimization for Skin Cancer Detection Using PNN Classifier</article-title><alt-title alt-title-type="left-running-head">An Efficient Hybrid Optimization for Skin Cancer Detection Using PNN Classifier</alt-title><alt-title alt-title-type="right-running-head">An Efficient Hybrid Optimization for Skin Cancer Detection Using PNN Classifier</alt-title>
</title-group>
<contrib-group content-type="authors">
<contrib id="author-1" contrib-type="author" corresp="yes">
<name name-style="western"><surname>Jaculin Femil</surname><given-names>J.</given-names></name>
<xref ref-type="aff" rid="aff-1">1</xref><email>j.jacqueln19@gmail.com</email>
</contrib>
<contrib id="author-2" contrib-type="author">
<name name-style="western"><surname>Jaya</surname><given-names>T.</given-names></name>
<xref ref-type="aff" rid="aff-2">2</xref>
</contrib>
<aff id="aff-1"><label>1</label><institution>Ponjesly College of Engineering</institution>, <addr-line>Nagercoil, Tamilnadu</addr-line>, <country>India</country></aff>
<aff id="aff-2"><label>2</label><institution>CSI Institute of Technology Thovalai</institution>, <country>India</country></aff>
</contrib-group><author-notes><corresp id="cor1"><label>&#x002A;</label>Corresponding Author: J. Jaculin Femil. Email: <email>j.jacqueln19@gmail.com</email></corresp></author-notes>
<pub-date publication-format="print" date-type="pub" iso-8601-date="2022-12-15"><day>15</day><month>12</month>
<year>2022</year></pub-date>
<volume>45</volume>
<issue>3</issue>
<fpage>2919</fpage>
<lpage>2934</lpage>
<history>
<date date-type="received"><day>02</day><month>6</month><year>2022</year></date>
<date date-type="accepted"><day>12</day><month>7</month><year>2022</year></date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2023 Jaculin Femil and Jaya</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>Jaculin Femil and Jaya</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_CSSE_32935.pdf"></self-uri>
<abstract>
<p>The necessity of timely cancer detection is extremely high in recent times, as cancer poses a threat to human life. Skin cancer is considered one of the most dangerous types of cancer, since it causes severe health impacts on human beings; hence, it is highly important to detect skin cancer at an early stage in order to provide adequate treatment. Therefore, an effective image processing approach is employed in the present study for the accurate detection of skin cancer. Initially, dermoscopy images of skin lesions are retrieved and processed by eliminating noise with the assistance of a Gabor filter. Then, the preprocessed dermoscopy image is segmented into multiple regions by implementing a cascaded Fuzzy C-Means (FCM) algorithm, which improves the reliability of cancer detection. A Gabor Response Co-occurrence Matrix (GRCM) is used to extract melanoma parameters in an efficient manner. A hybrid Particle Swarm Optimization (PSO)-Whale Optimization algorithm is then utilized to efficiently optimize the extracted features. Finally, the features are classified with the assistance of a Probabilistic Neural Network (PNN) classifier, which classifies the stages of the skin lesion in an optimal manner. The whole work is simulated in MATLAB, and the attained outcomes prove that the introduced approach delivers optimal results with a maximal accuracy of 97.83&#x0025;.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>Gabor filter</kwd>
<kwd>GRCM</kwd>
<kwd>hybrid PSO-whale optimization algorithm</kwd>
<kwd>PNN classifier</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<label>1</label>
<title>Introduction</title>
<p>The World Health Organization has stated that cancer is one of the major causes of death. Among the different cancers, skin cancer is regarded as one of the most common and hazardous types. There exist different skin disorders like Basal Cell Carcinoma (BCC), Squamous Cell Carcinoma (SCC) and Malignant Melanoma (MM). MM is the most hazardous type of all, as it spreads to other organs, which in turn leads to death [<xref ref-type="bibr" rid="ref-1">1</xref>]. Melanoma is a harmful tumor produced by pigment-containing cells called melanocytes. Although melanoma can be found anywhere on the body, it mostly grows on the back and the lower limbs. The American Cancer Society estimated that around 7,230 individuals would die from melanoma and around 96,480 new melanoma cases would be diagnosed in the US in 2019. If cancer is detected and treated early, it is possible to minimize its mortality rate. This has inspired researchers to look for new strategies to achieve the early identification of skin malignancy [<xref ref-type="bibr" rid="ref-2">2</xref>&#x2013;<xref ref-type="bibr" rid="ref-4">4</xref>].</p>
<p>Specially trained dermatologists use an apparatus to examine pigmented skin lesions based on a complex set of visible patterns. A novel automated method [<xref ref-type="bibr" rid="ref-5">5</xref>] further analyzes the direction or longitudinal positioning of the streak lines and categorizes the lesions. Irregular streaks are one of the most basic symptoms that show a high correlation with melanoma. A Computer Aided Diagnosis (CAD) system [<xref ref-type="bibr" rid="ref-6">6</xref>] for the detection of melanoma is built around the recognition of streaks. Non-invasive skin imaging strategies are widely preferred because of their beneficial characteristics, such as a lower risk of complications, easy recovery and minimal cost. Hence, dermoscopy [<xref ref-type="bibr" rid="ref-7">7</xref>] has been widely used in the identification of melanoma. However, manual dermoscopic assessment is subjective, so doctors reach differing diagnosis results with poor reproducibility. Modern melanoma detection is performed by classifiers based on Convolutional Neural Networks (CNN) [<xref ref-type="bibr" rid="ref-8">8</xref>]. Studies have shown that the CNN effectively classifies skin cancer and provides life-saving diagnoses. The challenge is that the differences between classes in medical images are usually much smaller than in natural images. Color [<xref ref-type="bibr" rid="ref-9">9</xref>] is a key feature of melanoma when scanning the area of the lesion to capture color and variegation. Usually, the color descriptors are Red, Green and Blue (RGB). Due to subtle color alterations, location-related color information, poor color divergence and large differences between images of the same category, it is very hard to perform clinical color evaluations on dermoscopy images. The Asymmetry, Border, Color, Diameter, Evolving (ABCDE) rule is a basic framework through which doctors, beginner dermatologists and non-physicians can understand the characteristics of curable early melanoma, thereby enhancing the initial detection of melanoma [<xref ref-type="bibr" rid="ref-10">10</xref>,<xref ref-type="bibr" rid="ref-11">11</xref>]. However, the ABCDE method has low specificity and sensitivity [<xref ref-type="bibr" rid="ref-12">12</xref>], so various algorithms and techniques are then applied to measure and determine whether a lesion is melanoma [<xref ref-type="bibr" rid="ref-13">13</xref>]. Uveal melanoma [<xref ref-type="bibr" rid="ref-14">14</xref>] is an uncommon disease, but it is the most well-known primary ocular malignancy. Uveal melanoma patients undergo Radiation Therapy (RT); however, the specialist only monitors the patient&#x2019;s eye movement by watching the video in the control room.</p>
<p>Several predictive models have been described, owing to the significance of nuclear structure in cancer diagnosis. In [<xref ref-type="bibr" rid="ref-15">15</xref>], a Support Vector Machine (SVM) decision boundary is learned as a maximum soft-margin problem, which discriminates the patients&#x2019; nuclear patterns between classes. Total Body Skin Examination (TBSE) is a main strategy for melanoma detection [<xref ref-type="bibr" rid="ref-16">16</xref>]: each pigmented skin lesion is checked separately to identify melanoma signs, and artificially induced changes in the lesions are accurately identified. Millimeter-wave imaging is configured accurately by characterizing the penetration depth of millimeter waves [<xref ref-type="bibr" rid="ref-17">17</xref>] in skin, which accurately models the dielectric properties of human skin tissue under normal and malignant conditions. Highly sensitive detection of skin cancer using semiconductor membranes in new water-based Terahertz (THz) Meta-Materials (MM) is discussed in [<xref ref-type="bibr" rid="ref-18">18</xref>&#x2013;<xref ref-type="bibr" rid="ref-20">20</xref>].</p>
<p>In [<xref ref-type="bibr" rid="ref-21">21</xref>], skin lesions are examined through a set of High-Level Intuitive Features (HLIFs) in standard camera images. A noninvasive micro-sensor system is used to analyze the evolution of melanoma [<xref ref-type="bibr" rid="ref-22">22</xref>]. The recent trend is to select local patches in the image and describe each patch by a set of local features; by combining global methods and local features, melanoma is detected in [<xref ref-type="bibr" rid="ref-23">23</xref>]. The incidence of Melanoma in Situ (MIS) [<xref ref-type="bibr" rid="ref-24">24</xref>] has increased significantly. The highest cure rate for melanoma is attained by testing at the MIS stage, but dermoscopy alone is not sufficient to reliably detect MIS. The use of the microwave reflection method as a diagnostic tool for detecting skin cancer is discussed in [<xref ref-type="bibr" rid="ref-25">25</xref>]. In [<xref ref-type="bibr" rid="ref-26">26</xref>], skin cancer and other skin pathologies are evaluated using electrical bio-impedance; the aim is to distinguish skin cancer from benign moles using multi-frequency impedance spectroscopy. In [<xref ref-type="bibr" rid="ref-27">27</xref>], an entire mobile imaging system is employed to detect melanoma in the early period, and a Human-Computer Interface (HCI) is designed to understand usability and acceptability issues in the mobile imaging system.</p>
<p>The designs used in existing research require considerable space and cost. The purpose of the proposed method is to detect skin cancer by using image processing to produce optimal results. The features considered include texture, color, border, height and thickness. This research forms a basis for identifying new developments in skin lesions. Here, the input image goes through several processes, from preprocessing to classification, to diagnose the cancer efficiently with high accuracy. The classifier used in the proposed system helps to perform operations at high speed.</p>
</sec>
<sec id="s2">
<label>2</label>
<title>Proposed Methodology</title>
<p>The objective of the proposed strategy is to distinguish melanoma more accurately, as the treatment depends upon its location and size. The method goes through five steps to obtain high precision. It starts by retrieving images of skin lesions. The collected images are preprocessed to obtain high-quality images; a Gabor filter is used to eliminate unnecessary noise in an optimal manner. Segmentation is performed after removing the excess noise: cascaded FCM is used in the segmentation process, where the preprocessed image is divided into many segments. Feature extraction is done by GRCM. The optimal features are then selected by utilizing hybrid PSO-Whale optimization, an advanced approach extensively used in image processing applications that has the ability to deliver optimal outcomes with maximal reliability. At last, classification is done using PNN, which performs better than other classifiers. Through the use of these algorithms and methods, the obtained images go through the various steps and provide better information for developing treatment plans. The accuracy obtained through this process is 97.83&#x0025;. The block diagram of the introduced methodology is portrayed in <xref ref-type="fig" rid="fig-1">Fig. 1</xref>.</p>
<fig id="fig-1">
<label>Figure 1</label>
<caption>
<title>Proposed block diagram</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-1.png"/>
</fig>
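<p>Of the pipeline stages above, the hybrid PSO-Whale feature-selection step can be viewed as a population-based optimizer in which each agent alternates between a PSO velocity update and a WOA-style spiral move toward the best agent. The minimal Python sketch below minimizes a generic fitness function; the random switching rule and the coefficient values are illustrative assumptions, not the exact scheme of this paper.</p>

```python
import numpy as np

def hybrid_pso_whale(fitness, dim, n_agents=20, iters=100, seed=0):
    """Illustrative hybrid PSO-Whale optimizer (minimization)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5.0, 5.0, (n_agents, dim))
    vel = np.zeros((n_agents, dim))
    pbest = pos.copy()                                  # personal best positions
    pbest_f = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_f.argmin()].copy()              # global best position
    for _ in range(iters):
        for i in range(n_agents):
            if rng.random() > 0.5:                      # PSO velocity update
                r1, r2 = rng.random(dim), rng.random(dim)
                vel[i] = (0.7 * vel[i]
                          + 1.5 * r1 * (pbest[i] - pos[i])
                          + 1.5 * r2 * (gbest - pos[i]))
                pos[i] = pos[i] + vel[i]
            else:                                       # WOA spiral move toward gbest
                l = rng.uniform(-1.0, 1.0)
                dist = np.abs(gbest - pos[i])
                pos[i] = dist * np.exp(l) * np.cos(2.0 * np.pi * l) + gbest
            f = fitness(pos[i])
            if pbest_f[i] > f:
                pbest[i], pbest_f[i] = pos[i].copy(), f
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, float(pbest_f.min())

# Sanity check on a sphere function: the optimizer should approach the origin.
best_x, best_f = hybrid_pso_whale(lambda x: float((x ** 2).sum()), dim=3)
```

<p>In the feature-selection setting, the fitness function would score a candidate feature weighting by the downstream classifier&#x2019;s validation error.</p>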
<sec id="s2_1">
<label>2.1</label>
<title>Preprocessing</title>
<p>The input images of skin lesions are initially obtained from the Kaggle dataset. The obtained input image contains distortions such as noise and redundancy, so preprocessing is performed to obtain high-quality images. To get a better image, the obtained RGB image is converted to a grayscale image. The formula used to accomplish this step is expressed as,<disp-formula id="eqn-1"><label>(1)</label>
<mml:math id="mml-eqn-1" display="block"><mml:mrow><mml:mi mathvariant="normal">G</mml:mi><mml:mi mathvariant="normal">r</mml:mi><mml:mi mathvariant="normal">a</mml:mi><mml:mi mathvariant="normal">y</mml:mi><mml:mi mathvariant="normal">s</mml:mi><mml:mi mathvariant="normal">c</mml:mi><mml:mi mathvariant="normal">a</mml:mi><mml:mi mathvariant="normal">l</mml:mi><mml:mi mathvariant="normal">e</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:mn>0.29</mml:mn><mml:mo>&#x2217;</mml:mo><mml:mi>R</mml:mi><mml:mo>+</mml:mo><mml:mn>0.58</mml:mn><mml:mo>&#x2217;</mml:mo><mml:mi>G</mml:mi><mml:mo>+</mml:mo><mml:mn>0.11</mml:mn><mml:mspace width="thickmathspace" /><mml:mi>B</mml:mi></mml:math>
</disp-formula></p>
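<p>Eq. (1) can be realized directly; note that the paper&#x2019;s rounded weights (0.29, 0.58, 0.11) sum to 0.98, whereas the standard ITU-R BT.601 luminance weights are 0.299, 0.587 and 0.114. A minimal Python sketch, assuming a floating-point image in [0, 1]:</p>

```python
import numpy as np

def rgb_to_gray(rgb, weights=(0.29, 0.58, 0.11)):
    """Weighted RGB-to-grayscale conversion following Eq. (1).

    rgb is an (H, W, 3) array; the default weights are the paper's rounded
    values (ITU-R BT.601 uses 0.299, 0.587, 0.114).
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    return weights[0] * r + weights[1] * g + weights[2] * b

# A pure-white patch maps to 0.29 + 0.58 + 0.11 = 0.98 with these weights.
gray = rgb_to_gray(np.ones((2, 2, 3)))
```
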
<p>Image enhancement improves image quality by eliminating noise and minimizing blur. The Gabor filter is widely used in image processing applications. In fact, a well-known class of functions that achieves both spatial and spatial-frequency localization is the Gabor function; it is not genuinely a wavelet, but it can be implemented in such a way as to imitate the properties of wavelets. The Gabor function is characterized by the product of a complex exponential function and a Gaussian function; accordingly, the two-dimensional Gabor filter matches the receptive field model of mammalian retinal nerve cells. In a two-dimensional coordinate system, a Gabor filter containing real and imaginary parts is expressed as,</p>
<p><disp-formula id="eqn-2"><label>(2)</label>
<mml:math id="mml-eqn-2" display="block"><mml:mi>g</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>a</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>b</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>&#x03B4;</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>&#x03B8;</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>&#x03C8;</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>&#x03C3;</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>&#x03B3;</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mi>exp</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mi>a</mml:mi><mml:mrow><mml:msup><mml:mi></mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:msup><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mi>&#x03B3;</mml:mi><mml:mn>2</mml:mn></mml:msup><mml:msup><mml:mi>b</mml:mi><mml:mrow><mml:msup><mml:mi></mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:msup><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mi>&#x03C3;</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:mfrac></mml:mrow></mml:mstyle></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x00D7;</mml:mo><mml:mi>e</mml:mi><mml:mi>x</mml:mi><mml:mi>p</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>j</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mn>2</mml:mn><mml:mi>&#x03C0;</mml:mi><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:msup><mml:mrow><mml:mi>a</mml:mi></mml:mrow><mml:mrow><mml:mi 
mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:msup></mml:mrow><mml:mi>&#x03B4;</mml:mi></mml:mfrac></mml:mrow><mml:mo>+</mml:mo><mml:mi>&#x03C8;</mml:mi></mml:mstyle></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p><disp-formula id="eqn-3"><label>(3)</label>
<mml:math id="mml-eqn-3" display="block"><mml:msup><mml:mrow><mml:mi>a</mml:mi></mml:mrow><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:mi>a</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>s</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>&#x03B8;</mml:mi><mml:mo>+</mml:mo><mml:mi>b</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>s</mml:mi><mml:mi>i</mml:mi><mml:mi>n</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>&#x03B8;</mml:mi><mml:mspace width="thickmathspace" /></mml:math>
</disp-formula></p>
<p><disp-formula id="eqn-4"><label>(4)</label>
<mml:math id="mml-eqn-4" display="block"><mml:msup><mml:mrow><mml:mi>b</mml:mi></mml:mrow><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:mo>&#x2212;</mml:mo><mml:mi>a</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>s</mml:mi><mml:mi>i</mml:mi><mml:mi>n</mml:mi><mml:mi>&#x03B8;</mml:mi><mml:mo>+</mml:mo><mml:mi>b</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>s</mml:mi><mml:mi>&#x03B8;</mml:mi></mml:math>
</disp-formula></p>
<p>Among them, <inline-formula id="ieqn-2">
<mml:math id="mml-ieqn-2"><mml:mi>&#x03B4;</mml:mi></mml:math>
</inline-formula> is the sinusoidal factor&#x2019;s wavelength, <inline-formula id="ieqn-3">
<mml:math id="mml-ieqn-3"><mml:mi>&#x03B8;</mml:mi></mml:math>
</inline-formula> the orientation, <inline-formula id="ieqn-4">
<mml:math id="mml-ieqn-4"><mml:mi>&#x03C8;</mml:mi></mml:math>
</inline-formula> the phase offset, <inline-formula id="ieqn-5">
<mml:math id="mml-ieqn-5"><mml:mi>&#x03C3;</mml:mi></mml:math>
</inline-formula> the scale, and <inline-formula id="ieqn-6">
<mml:math id="mml-ieqn-6"><mml:mi>&#x03B3;</mml:mi></mml:math>
</inline-formula> the spatial aspect ratio of the Gabor function. The real and imaginary parts are specified by <inline-formula id="ieqn-7">
<mml:math id="mml-ieqn-7"><mml:mi>&#x03C8;</mml:mi><mml:mo>=</mml:mo><mml:mn>0</mml:mn></mml:math>
</inline-formula> and <inline-formula id="ieqn-8">
<mml:math id="mml-ieqn-8"><mml:mi>&#x03C8;</mml:mi><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mi>&#x03C0;</mml:mi><mml:mn>2</mml:mn></mml:mfrac></mml:mrow></mml:mstyle></mml:math>
</inline-formula> respectively. As shown in <xref ref-type="fig" rid="fig-2">Fig. 2</xref>, the required output is only the real part. Parameter <inline-formula id="ieqn-9">
<mml:math id="mml-ieqn-9"><mml:mi>&#x03C3;</mml:mi></mml:math>
</inline-formula> is assessed by <inline-formula id="ieqn-10">
<mml:math id="mml-ieqn-10"><mml:mi>&#x03B4;</mml:mi></mml:math>
</inline-formula> and the spatial-frequency bandwidth as,<disp-formula id="eqn-5"><label>(5)</label>
<mml:math id="mml-eqn-5" display="block"><mml:mi>&#x03C3;</mml:mi><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mi>&#x03B4;</mml:mi><mml:mi>&#x03C0;</mml:mi></mml:mfrac></mml:mrow><mml:msqrt><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>ln</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mn>2</mml:mn></mml:mrow><mml:mn>2</mml:mn></mml:mfrac></mml:mrow></mml:mstyle></mml:msqrt><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:msup><mml:mn>2</mml:mn><mml:mrow><mml:mi>b</mml:mi><mml:mi>w</mml:mi></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:msup><mml:mn>2</mml:mn><mml:mrow><mml:mi>b</mml:mi><mml:mi>w</mml:mi></mml:mrow></mml:msup><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:mfrac></mml:mrow><mml:mspace width="thickmathspace" /></mml:mstyle></mml:mstyle></mml:math>
</disp-formula></p>
<fig id="fig-2">
<label>Figure 2</label>
<caption>
<title>The Gabor function (right) results from the product of a Gaussian envelope and a complex exponential carrier. Only real parts are shown</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-2.png"/>
</fig>
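<p>Eqs. (2)&#x2013;(5) translate directly into code. The sketch below builds the real part of one Gabor kernel (the case &#x03C8; = 0); the kernel size and parameter values are illustrative, and filtering the grayscale image is then a 2-D convolution with this kernel.</p>

```python
import numpy as np

def gabor_kernel(size, delta, theta, psi, gamma, bw=1.0):
    """Real part of the 2-D Gabor filter of Eqs. (2)-(5).

    delta: sinusoid wavelength, theta: orientation, psi: phase offset,
    gamma: spatial aspect ratio. sigma follows Eq. (5) from delta and the
    spatial-frequency bandwidth bw (in octaves).
    """
    sigma = (delta / np.pi) * np.sqrt(np.log(2) / 2) * (2.0 ** bw + 1) / (2.0 ** bw - 1)
    half = size // 2
    a, b = np.meshgrid(np.arange(-half, half + 1), np.arange(-half, half + 1))
    a_rot = a * np.cos(theta) + b * np.sin(theta)        # Eq. (3)
    b_rot = -a * np.sin(theta) + b * np.cos(theta)       # Eq. (4)
    envelope = np.exp(-(a_rot ** 2 + gamma ** 2 * b_rot ** 2) / (2 * sigma ** 2))
    carrier = np.cos(2 * np.pi * a_rot / delta + psi)    # real part of Eq. (2)
    return envelope * carrier

kernel = gabor_kernel(size=31, delta=8.0, theta=0.0, psi=0.0, gamma=0.5)
```
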
</sec>
<sec id="s2_2">
<label>2.2</label>
<title>Segmentation Using Cascaded Fuzzy C-Mean</title>
<p>The main objective of segmentation is to divide an image into multiple regions, each of which shows different data in the image. Segmentation methods depend on two attributes: identifying discontinuities and identifying similarities. Cascaded FCM is used to process the image of skin lesions in this segmentation step. FCM ordinarily places the cluster prototype in a region with many input vectors. In order to give an exact clustering, a fuzzy rule base is effectively used, which in turn maximizes the reliability of disease detection in an optimal manner. The accuracy and convergence speed of cascaded FCM are significantly higher than those of conventional FCM. In the next stage, only the intensity voxels that are adjacent to the tumor are included. The ultimate aim of FCM is to identify the cluster centers. The objective function to be minimized is defined as,</p>
<p><disp-formula id="eqn-6"><label>(6)</label>
<mml:math id="mml-eqn-6" display="block"><mml:msub><mml:mi>J</mml:mi><mml:mrow><mml:mi>F</mml:mi><mml:mi>C</mml:mi><mml:mi>M</mml:mi><mml:mo>=</mml:mo><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>c</mml:mi></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>n</mml:mi></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:msubsup><mml:mi>u</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow><mml:mi>m</mml:mi></mml:msubsup><mml:mo fence="false" stretchy="false">&#x2016;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mspace width="thickmathspace" /></mml:mrow></mml:msub><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mspace width="thickmathspace" /><mml:msub><mml:mi>V</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:msup><mml:mo fence="false" stretchy="false">&#x2016;</mml:mo><mml:mn>2</mml:mn></mml:msup><mml:mi>A</mml:mi><mml:mo>=</mml:mo><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>c</mml:mi></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>n</mml:mi></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:msubsup><mml:mi>u</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow><mml:mi>m</mml:mi></mml:msubsup><mml:msubsup><mml:mi>d</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow><mml:mn>2</mml:mn></mml:msubsup></mml:mrow></mml:msub></mml:math>
</disp-formula></p>
<p>where <inline-formula id="ieqn-11">
<mml:math id="mml-ieqn-11"><mml:msub><mml:mi>X</mml:mi><mml:mi>k</mml:mi></mml:msub></mml:math>
</inline-formula> specifies the input data (k&#x2009;&#x003D;&#x2009;1, &#x2026;, n), the fuzzy membership function <inline-formula id="ieqn-12">
<mml:math id="mml-ieqn-12"><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mspace width="thickmathspace" /><mml:mo>&#x2208;</mml:mo><mml:mspace width="thickmathspace" /><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mn>0</mml:mn><mml:mspace width="thickmathspace" /><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mspace width="thickmathspace" /><mml:mn>1</mml:mn></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:math>
</inline-formula> indicates the degree to which the vector <inline-formula id="ieqn-13">
<mml:math id="mml-ieqn-13"><mml:msub><mml:mi>X</mml:mi><mml:mi>k</mml:mi></mml:msub></mml:math>
</inline-formula> belongs to cluster i, the fuzzification parameter is m&#x2009;&#x003E;&#x2009;1, and <inline-formula id="ieqn-14">
<mml:math id="mml-ieqn-14"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> specifies the distance between the vector <inline-formula id="ieqn-15">
<mml:math id="mml-ieqn-15"><mml:msub><mml:mi>X</mml:mi><mml:mi>k</mml:mi></mml:msub></mml:math>
</inline-formula> and the cluster prototype <inline-formula id="ieqn-16">
<mml:math id="mml-ieqn-16"><mml:msub><mml:mi>V</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:math>
</inline-formula>. FCM uses a probabilistic partition, which implies that the fuzzy memberships of any input vector <inline-formula id="ieqn-17">
<mml:math id="mml-ieqn-17"><mml:msub><mml:mi>X</mml:mi><mml:mi>k</mml:mi></mml:msub></mml:math>
</inline-formula> over the classes fulfill the <inline-formula id="ieqn-18">
<mml:math id="mml-ieqn-18"><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mspace width="thickmathspace" /><mml:msubsup><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>c</mml:mi></mml:msubsup><mml:mo>&#x2061;</mml:mo><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</inline-formula> probability constraint. During each iteration, the optimal values are derived from the zero-gradient condition with a Lagrange multiplier and are given as,</p>
<p><disp-formula id="eqn-7"><label>(7)</label>
<mml:math id="mml-eqn-7" display="block"><mml:msubsup><mml:mi>u</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow><mml:mo>&#x2217;</mml:mo></mml:msubsup><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:msubsup><mml:mi>d</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mfrac><mml:mrow><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mn>2</mml:mn></mml:mrow><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>m</mml:mi><mml:mspace width="thickmathspace" /><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mspace width="thickmathspace" /><mml:mn>1</mml:mn></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:mrow></mml:mrow></mml:msubsup></mml:mrow><mml:mrow><mml:msubsup><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>c</mml:mi></mml:msubsup><mml:mo>&#x2061;</mml:mo><mml:msubsup><mml:mi>d</mml:mi><mml:mrow><mml:mi>j</mml:mi><mml:mi>k</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mfrac><mml:mrow><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mn>2</mml:mn></mml:mrow><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>m</mml:mi><mml:mspace width="thickmathspace" /><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mspace width="thickmathspace" /><mml:mn>1</mml:mn></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:mrow></mml:mrow></mml:msubsup></mml:mrow></mml:mfrac></mml:mrow><mml:mspace width="thickmathspace" /><mml:mtable rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd><mml:mrow><mml:mi mathvariant="normal">&#x2200;</mml:mi><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x2026;</mml:mo><mml:mi>c</mml:mi></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mi 
mathvariant="normal">&#x2200;</mml:mi><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x2026;</mml:mo><mml:mi>n</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mspace width="thickmathspace" /><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mspace width="thickmathspace" /></mml:mstyle></mml:math>
</disp-formula></p>
<p><disp-formula id="eqn-8"><label>(8)</label>
<mml:math id="mml-eqn-8" display="block"><mml:msubsup><mml:mi>V</mml:mi><mml:mi>i</mml:mi><mml:mo>&#x2217;</mml:mo></mml:msubsup><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:msubsup><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>n</mml:mi></mml:msubsup><mml:mspace width="thickmathspace" /><mml:msubsup><mml:mi>u</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow><mml:mi>m</mml:mi></mml:msubsup><mml:msub><mml:mi>X</mml:mi><mml:mi>k</mml:mi></mml:msub></mml:mrow><mml:mrow><mml:msubsup><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>n</mml:mi></mml:msubsup><mml:mo>&#x2061;</mml:mo><mml:msubsup><mml:mi>u</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>k</mml:mi></mml:mrow><mml:mi>m</mml:mi></mml:msubsup></mml:mrow></mml:mfrac></mml:mrow><mml:mspace width="thickmathspace" /><mml:mi mathvariant="normal">&#x2200;</mml:mi><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x2026;</mml:mo><mml:mi>c</mml:mi><mml:mo>.</mml:mo></mml:mstyle></mml:math>
</disp-formula></p>
<p>As per the alternating optimization scheme of the FCM algorithm, <xref ref-type="disp-formula" rid="eqn-7">Eqs. (7)</xref> and <xref ref-type="disp-formula" rid="eqn-8">(8)</xref> are applied alternately until the cluster prototypes stabilize. The accuracy of segmentation determines the success rate of the subsequent analysis process.</p>
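<p>The alternating updates of Eqs. (7) and (8) can be sketched as follows. This is a plain (non-cascaded) FCM iteration on generic feature vectors; the quantile-based initialization and the fixed iteration count are illustrative choices.</p>

```python
import numpy as np

def fcm(X, c, m=2.0, iters=50):
    """Alternating FCM updates of Eqs. (7) and (8).

    X: (n, d) data vectors, c: number of clusters, m: fuzzifier (m greater
    than 1). Returns memberships U of shape (c, n) and prototypes V (c, d).
    """
    # Spread the initial prototypes across the data range (illustrative).
    V = np.quantile(X, np.linspace(0.1, 0.9, c), axis=0)
    U = np.zeros((c, len(X)))
    for _ in range(iters):
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(-1) + 1e-12  # d_ik^2
        U = d2 ** (-1.0 / (m - 1.0))              # Eq. (7), before normalization
        U = U / U.sum(axis=0, keepdims=True)      # memberships sum to 1 per vector
        V = (U ** m @ X) / (U ** m).sum(axis=1, keepdims=True)       # Eq. (8)
    return U, V

# Two well-separated 1-D blobs: prototypes should settle near 0.1 and 5.1.
X = np.array([[0.0], [0.1], [0.2], [5.0], [5.1], [5.2]])
U, V = fcm(X, c=2)
```
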
</sec>
<sec id="s2_3">
<label>2.3</label>
<title>Gabor Based Region Covariance Matrix</title>
<p>The significance of GRCM in the feature extraction process is remarkably high, as it assists in the optimal detection of lesions with maximal accuracy. In GRCM, discriminant information is initially extracted through convolution between the skin lesion image and a fixed set of Gabor kernels at various scales and orientations. The two-dimensional Gabor wavelet, a circular Gaussian envelope modulated by a complex wave, is given as,<disp-formula id="eqn-9"><label>(9)</label>
<mml:math id="mml-eqn-9" display="block"><mml:mrow><mml:msub><mml:mi>&#x03C6;</mml:mi><mml:mrow><mml:mi>u</mml:mi><mml:mo>,</mml:mo><mml:mtext>&#x2009;</mml:mtext><mml:mi>v</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>z</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mo>&#x2016;</mml:mo><mml:msub><mml:mi>K</mml:mi><mml:mrow><mml:mi>u</mml:mi><mml:mo>,</mml:mo><mml:mi>v</mml:mi></mml:mrow></mml:msub><mml:msup><mml:mo>&#x2016;</mml:mo><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mrow><mml:msup><mml:mi>&#x03C3;</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:mfrac><mml:mtext>&#x2009;</mml:mtext><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:mo>&#x2212;</mml:mo><mml:mfrac><mml:mrow><mml:mo>&#x2016;</mml:mo><mml:msub><mml:mi>K</mml:mi><mml:mrow><mml:mi>u</mml:mi><mml:mo>,</mml:mo><mml:mi>v</mml:mi></mml:mrow></mml:msub><mml:msup><mml:mo>&#x2016;</mml:mo><mml:mn>2</mml:mn></mml:msup><mml:msup><mml:mrow><mml:mrow><mml:mo>[</mml:mo> <mml:mi>Z</mml:mi> <mml:mo>]</mml:mo></mml:mrow></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mi>&#x03C3;</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:mfrac><mml:mrow><mml:mo>[</mml:mo> <mml:mrow><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>K</mml:mi></mml:mrow></mml:msup><mml:mi>u</mml:mi><mml:mo>,</mml:mo><mml:msup><mml:mi>v</mml:mi><mml:mn>2</mml:mn></mml:msup><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:mfrac><mml:mrow><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mi>&#x03C3;</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mn>2</mml:mn></mml:mfrac></mml:mrow></mml:msup></mml:mrow> <mml:mo>]</mml:mo></mml:mrow></mml:mrow></mml:msup></mml:mrow></mml:math>
</disp-formula></p>
<p>Here, u denotes the orientation of the Gabor kernel and v its scale. GRCM captures both geometric and statistical properties of the input image. The covariance matrix is formed from pixel locations and Gabor coefficients, and several feature derivatives are extracted from the image by applying GRCM. Recent advances in Gabor phase information are additionally exploited to recover the features effectively.<disp-formula id="eqn-10"><label>(10)</label>
<mml:math id="mml-eqn-10" display="block"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>&#x03BC;</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>v</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>y</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mi>a</mml:mi><mml:mi>x</mml:mi><mml:mi>tan</mml:mi><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>I</mml:mi><mml:mi>m</mml:mi><mml:mspace width="thickmathspace" /><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mi>&#x03BC;</mml:mi><mml:mo>,</mml:mo><mml:mi>v</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>y</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>R</mml:mi><mml:mi>e</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mi>&#x03BC;</mml:mi><mml:mo>,</mml:mo><mml:mi>v</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>y</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:mrow></mml:mstyle></mml:math>
</disp-formula></p>
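<p>As a hedged illustration of the kernel family in Eq. (9) and the phase map of Eq. (10), the following NumPy sketch builds a small Gabor bank over a few scales and orientations. The kernel size, spatial frequency, the 'valid' direct convolution, and the use of the full-quadrant arctan2 are illustrative assumptions, not details specified by the paper.</p>

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, theta=0.0, freq=0.25):
    """Complex 2-D Gabor kernel: a sinusoid of spatial frequency `freq`
    oriented at angle `theta`, under a circular Gaussian envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)      # rotated coordinate
    env = np.exp(-(x**2 + y**2) / (2 * sigma**2))   # Gaussian envelope
    return env * np.exp(1j * 2 * np.pi * freq * xr)

def gabor_phase(image, kernel):
    """Phase of the filter response (Eq. 10): the angle Im(O)/Re(O),
    computed here with a plain 'valid' direct convolution."""
    kh, kw = kernel.shape
    H, W = image.shape
    out = np.zeros((H - kh + 1, W - kw + 1), dtype=complex)
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return np.arctan2(out.imag, out.real)

# Bank over a few orientations (u) and scales (v), as in the text.
bank = [gabor_kernel(theta=np.pi * u / 4, sigma=s)
        for u in range(4) for s in (1.5, 2.5)]
img = np.tile(np.sin(2 * np.pi * 0.25 * np.arange(32)), (32, 1))  # stripes
phase = gabor_phase(img, bank[0])
```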
<p>The Gabor-based RCM mapping is computed by the equation,<disp-formula id="eqn-11"><label>(11)</label>
<mml:math id="mml-eqn-11" display="block"><mml:msub><mml:mi>W</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>y</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>I</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>y</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>o</mml:mi><mml:mi>o</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>y</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2026;</mml:mo><mml:mspace width="thickmathspace" /><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mn>74</mml:mn></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>y</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>The Gabor-based covariance matrix for a region R is,<disp-formula id="eqn-12"><label>(12)</label>
<mml:math id="mml-eqn-12" display="block"><mml:msub><mml:mi>W</mml:mi><mml:mi>R</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mn>1</mml:mn><mml:mi>n</mml:mi></mml:mfrac></mml:mrow><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>n</mml:mi></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>W</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>V</mml:mi><mml:mi>R</mml:mi></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>W</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>V</mml:mi><mml:mi>R</mml:mi></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mi>T</mml:mi></mml:mstyle></mml:math>
</disp-formula></p>
<p>Calculation of <inline-formula id="ieqn-19">
<mml:math id="mml-ieqn-19"><mml:msub><mml:mi>V</mml:mi><mml:mi>R</mml:mi></mml:msub></mml:math>
</inline-formula> is represented as,<disp-formula id="eqn-13"><label>(13)</label>
<mml:math id="mml-eqn-13" display="block"><mml:msub><mml:mi>V</mml:mi><mml:mi>R</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mn>1</mml:mn><mml:mi>n</mml:mi></mml:mfrac></mml:mrow><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>n</mml:mi></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:msub><mml:mi>w</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:mstyle></mml:math>
</disp-formula></p>
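<p>A minimal sketch of Eqs. (12) and (13): stacking per-pixel feature vectors of the form given in Eq. (11) into rows and taking the covariance about their mean. The feature layout and random region below are illustrative assumptions.</p>

```python
import numpy as np

def region_covariance(features):
    """Region covariance (Eqs. 12-13): `features` is an (n, d) array whose
    rows are per-pixel vectors W_i = [x, y, I(x, y), Gabor responses...].
    Returns the d x d matrix W_R about the mean vector V_R."""
    V_R = features.mean(axis=0)                     # Eq. (13): mean vector
    centered = features - V_R
    return centered.T @ centered / len(features)    # Eq. (12): 1/n sum

# Illustrative region: coordinates, intensity, one Gabor phase channel.
rng = np.random.default_rng(1)
n = 100
feats = np.column_stack([
    rng.uniform(0, 16, n),         # x
    rng.uniform(0, 16, n),         # y
    rng.random(n),                 # I(x, y)
    rng.uniform(-np.pi, np.pi, n)  # Gabor phase value
])
W_R = region_covariance(feats)
```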
<p>Color, texture and shape are utilized for the description of a region. Texture is an important cue for identifying and classifying objects; contrast, homogeneity, dissimilarity, energy and entropy are utilized to describe it. Changes in intensity across the texture image are measured using GRCM.</p>
<p><bold>Contrast</bold>: It gauges the local intensity variation by comparing each pixel with its neighboring value, computed by the following equation,<disp-formula id="eqn-14"><label>(14)</label>
<mml:math id="mml-eqn-14" display="block"><mml:mrow><mml:mi mathvariant="normal">C</mml:mi><mml:mi mathvariant="normal">o</mml:mi><mml:mi mathvariant="normal">n</mml:mi><mml:mi mathvariant="normal">t</mml:mi><mml:mi mathvariant="normal">r</mml:mi><mml:mi mathvariant="normal">a</mml:mi><mml:mi mathvariant="normal">s</mml:mi><mml:mi mathvariant="normal">t</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:munder><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:munder><mml:mo>&#x2061;</mml:mo><mml:msup><mml:mrow><mml:mo>|</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mspace width="thickmathspace" /><mml:mo>&#x2212;</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mo>|</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup><mml:mi>p</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p><bold>Energy</bold>: For a constant image, the energy value equals 1. It returns the sum of squared elements and is determined by utilizing,<disp-formula id="eqn-15"><label>(15)</label>
<mml:math id="mml-eqn-15" display="block"><mml:mrow><mml:mi mathvariant="normal">E</mml:mi><mml:mi mathvariant="normal">n</mml:mi><mml:mi mathvariant="normal">e</mml:mi><mml:mi mathvariant="normal">r</mml:mi><mml:mi mathvariant="normal">g</mml:mi><mml:mi mathvariant="normal">y</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:munder><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow></mml:munder><mml:mo>&#x2061;</mml:mo><mml:mi>p</mml:mi><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:msup></mml:math>
</disp-formula></p>
<p><bold>Homogeneity</bold>: It gauges the tightness of the element distribution; its value ranges from 0 to 1.<disp-formula id="eqn-16"><label>(16)</label>
<mml:math id="mml-eqn-16" display="block"><mml:msub><mml:mrow><mml:mrow><mml:mi mathvariant="normal">H</mml:mi><mml:mi mathvariant="normal">o</mml:mi><mml:mi mathvariant="normal">m</mml:mi><mml:mi mathvariant="normal">o</mml:mi><mml:mi mathvariant="normal">g</mml:mi><mml:mi mathvariant="normal">e</mml:mi><mml:mi mathvariant="normal">n</mml:mi><mml:mi mathvariant="normal">e</mml:mi><mml:mi mathvariant="normal">i</mml:mi><mml:mi mathvariant="normal">t</mml:mi><mml:mi mathvariant="normal">y</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:mo>&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>p</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mrow><mml:mo>|</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mo>|</mml:mo></mml:mrow></mml:mrow></mml:mfrac></mml:mrow></mml:mstyle></mml:math>
</disp-formula></p>
<p><bold>Correlation</bold>: It quantifies how a pixel is correlated with its neighboring values, ranging from &#x2212;1 (perfect negative correlation) to 1 (perfect positive correlation).<disp-formula id="eqn-17"><label>(17)</label>
<mml:math id="mml-eqn-17" display="block"><mml:mrow><mml:mi mathvariant="normal">C</mml:mi><mml:mi mathvariant="normal">o</mml:mi><mml:mi mathvariant="normal">r</mml:mi><mml:mi mathvariant="normal">r</mml:mi><mml:mi mathvariant="normal">e</mml:mi><mml:mi mathvariant="normal">l</mml:mi><mml:mi mathvariant="normal">a</mml:mi><mml:mi mathvariant="normal">t</mml:mi><mml:mi mathvariant="normal">i</mml:mi><mml:mi mathvariant="normal">o</mml:mi><mml:mi mathvariant="normal">n</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:msub><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03BC;</mml:mi><mml:mi>i</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>j</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03BC;</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mi>p</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:msub><mml:mi>&#x03C3;</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:msub><mml:mi>&#x03C3;</mml:mi><mml:mi>j</mml:mi></mml:msub></mml:mrow></mml:mfrac></mml:mrow></mml:mstyle></mml:math>
</disp-formula></p>
<p><bold>Entropy</bold>: It is utilized to characterize the texture of the lesion image; its value is high when all elements of the matrix are nearly equal. The calculation is as follows,<disp-formula id="eqn-18"><label>(18)</label>
<mml:math id="mml-eqn-18" display="block"><mml:mrow><mml:mi mathvariant="normal">E</mml:mi><mml:mi mathvariant="normal">n</mml:mi><mml:mi mathvariant="normal">t</mml:mi><mml:mi mathvariant="normal">r</mml:mi><mml:mi mathvariant="normal">o</mml:mi><mml:mi mathvariant="normal">p</mml:mi><mml:mi mathvariant="normal">y</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>0</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:mn>0</mml:mn></mml:mrow><mml:mrow><mml:mi>N</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:mi>p</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mo>&#x2212;</mml:mo><mml:mi>ln</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>p</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
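<p>The five descriptors of Eqs. (14)-(18) can be computed together from a normalized matrix p(i, j). The sketch below is a minimal NumPy illustration (the diagonal test matrix is a hypothetical example, not data from the paper); natural log is assumed for the entropy.</p>

```python
import numpy as np

def texture_stats(p):
    """Texture descriptors of Eqs. (14)-(18) from a normalized
    matrix p whose entries sum to 1."""
    i, j = np.indices(p.shape)
    mu_i, mu_j = np.sum(i * p), np.sum(j * p)
    var_i = np.sum((i - mu_i) ** 2 * p)
    var_j = np.sum((j - mu_j) ** 2 * p)
    nz = p > 0                                   # avoid log(0) in entropy
    return {
        "contrast":    np.sum(np.abs(i - j) ** 2 * p),            # Eq. 14
        "energy":      np.sum(p ** 2),                            # Eq. 15
        "homogeneity": np.sum(p / (1 + np.abs(i - j))),           # Eq. 16
        "correlation": np.sum((i - mu_i) * (j - mu_j) * p)
                       / np.sqrt(var_i * var_j),                  # Eq. 17
        "entropy":     -np.sum(p[nz] * np.log(p[nz])),            # Eq. 18
    }

# A perfectly diagonal matrix: zero contrast, full homogeneity.
p = np.eye(4) / 4.0
stats = texture_stats(p)
```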
<p>These statistics provide information about the texture of an image.</p>
</sec>
<sec id="s2_4">
<label>2.4</label>
<title>Hybrid PSO-Whale Optimization</title>
<p>The hybrid PSO-whale optimization delivers better results than other optimization methods, since it combines several advantageous properties: fast convergence, easy computation, a simple structure and high efficiency. It is a well-developed algorithm for feature selection: an iterative strategy that improves the problem by repeatedly attempting to refine candidate solutions against the obtained quality metrics. To alleviate the problems caused by conventional PSO, the hybrid technique uses a small number of secondary populations and an updated strategy to generate new velocities and a deeper feature search. Retrieving significant features that represent the depth and location of the affected melanoma region is a difficult task.</p>
<p>The PSO algorithm is based on the social and cognitive behaviors of members of a swarm. It is very popular because of its simple data sharing and computation. In PSO, particles are dispersed in a multi-dimensional search space, where each individual represents a candidate solution. The evaluation of each solution depends on the fitness function of the problem. Particle motion is influenced by two key factors: information exchanged between particles and information carried from iteration to iteration.</p>
<p><italic>G Best and P Best</italic></p>
<p>In the G best model, each particle in the swarm knows its current position and velocity in the solution space, along with the best position it has found so far (P best) and the best position found by the entire swarm (G best). Each particle moves toward a globally optimal solution using its current velocity, P best and G best.</p>
<p>The G best model is expressed as,</p>
<p><disp-formula id="eqn-19"><label>(19)</label>
<mml:math id="mml-eqn-19" display="block"><mml:msubsup><mml:mi>&#x03BD;</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mi>w</mml:mi><mml:msubsup><mml:mi>&#x03BD;</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mi>k</mml:mi></mml:msubsup><mml:mo>+</mml:mo><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:msub><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mrow><mml:mi mathvariant="normal">P</mml:mi></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mi>k</mml:mi></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mi>k</mml:mi></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:msub><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mi>g</mml:mi><mml:mi>j</mml:mi><mml:mi>k</mml:mi></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mi>k</mml:mi></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p><disp-formula id="eqn-20"><label>(20)</label>
<mml:math id="mml-eqn-20" display="block"><mml:msubsup><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:msubsup><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mi>k</mml:mi></mml:msubsup><mml:mo>+</mml:mo><mml:msubsup><mml:mi>&#x03BD;</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup></mml:math>
</disp-formula></p>
<p>where v and x denote velocity and position, and i and j the particle index and dimension; the random numbers <inline-formula id="ieqn-20">
<mml:math id="mml-ieqn-20"><mml:msub><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:math>
</inline-formula> and <inline-formula id="ieqn-21">
<mml:math id="mml-ieqn-21"><mml:msub><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:math>
</inline-formula> are drawn from the range [0, 1]; w is the inertia weight; and <inline-formula id="ieqn-22">
<mml:math id="mml-ieqn-22"><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:math>
</inline-formula> and <inline-formula id="ieqn-23">
<mml:math id="mml-ieqn-23"><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:math>
</inline-formula> are the acceleration coefficients.</p>
<p>The P best solution is stored in particle memory, carrying iteration-to-iteration information among the particles. In the multi-dimensional search space, the position and velocity of the i-th particle are given by the m-dimensional vectors <inline-formula id="ieqn-24">
<mml:math id="mml-ieqn-24"><mml:msub><mml:mi>Y</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mn>1</mml:mn><mml:mo>,</mml:mo></mml:mrow></mml:msub><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>,</mml:mo></mml:mrow></mml:msub><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>m</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> and <inline-formula id="ieqn-25">
<mml:math id="mml-ieqn-25"><mml:msub><mml:mi>V</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>v</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mn>1</mml:mn><mml:mo>,</mml:mo></mml:mrow></mml:msub><mml:msub><mml:mi>v</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>.</mml:mo><mml:mo>.</mml:mo></mml:mrow></mml:msub><mml:msub><mml:mi>v</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>m</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mi>T</mml:mi></mml:msup></mml:math>
</inline-formula>.</p>
<p>The velocity of the i-th particle is given as,<disp-formula id="eqn-21"><label>(21)</label>
<mml:math id="mml-eqn-21" display="block"><mml:msub><mml:mi>v</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>d</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>v</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>d</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:msub><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>d</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>d</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:msub><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>g</mml:mi><mml:mi>d</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>d</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula>where the cognitive and social scaling parameters are expressed as <inline-formula id="ieqn-26">
<mml:math id="mml-ieqn-26"><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:math>
</inline-formula> and <inline-formula id="ieqn-27">
<mml:math id="mml-ieqn-27"><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-28">
<mml:math id="mml-ieqn-28"><mml:msub><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:math>
</inline-formula> and <inline-formula id="ieqn-29">
<mml:math id="mml-ieqn-29"><mml:msub><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:math>
</inline-formula> as random numbers, d as the dimension, and i and s as the particle index and swarm size. The process continues until the optimal solution to the problem is found.</p>
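<p>The G best update of Eqs. (19)-(21) can be sketched as a minimal NumPy loop. This is an illustrative sketch on a toy sphere fitness, not the paper's feature-selection objective; the parameter values (w, c1, c2, swarm size) are assumptions.</p>

```python
import numpy as np

def pso(f, dim=2, n_particles=20, n_iter=100, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimal G-best PSO implementing Eqs. (19)-(20):
    v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x);  x <- x + v."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.apply_along_axis(f, 1, x)
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(n_iter):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)  # Eq. (19)
        x = x + v                                                  # Eq. (20)
        val = np.apply_along_axis(f, 1, x)
        better = val < pbest_val
        pbest[better], pbest_val[better] = x[better], val[better]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest

sphere = lambda z: float(np.sum(z ** 2))   # toy fitness: minimum at origin
best = pso(sphere)
```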
<p>The objective of the WOA is to find the position in the search space that optimizes the objective function of the optimization problem. If there are N whales, agent i at iteration t is denoted as,<disp-formula id="eqn-22"><label>(22)</label>
<mml:math id="mml-eqn-22" display="block"><mml:msubsup><mml:mi>X</mml:mi><mml:mi>i</mml:mi><mml:mi>t</mml:mi></mml:msubsup><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mrow><mml:msubsup><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>t</mml:mi></mml:msubsup><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:msubsup><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mn>2</mml:mn></mml:mrow><mml:mi>t</mml:mi></mml:msubsup><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:msubsup><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mi>t</mml:mi></mml:msubsup><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>.</mml:mo><mml:msubsup><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>d</mml:mi></mml:mrow><mml:mi>t</mml:mi></mml:msubsup></mml:mrow><mml:mo>}</mml:mo></mml:mrow><mml:mspace width="thickmathspace" /><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mspace width="thickmathspace" /><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mspace width="thickmathspace" /><mml:mn>3</mml:mn><mml:mo>&#x2026;</mml:mo><mml:mo>.</mml:mo><mml:mo>,</mml:mo><mml:mspace width="thickmathspace" /><mml:mi>N</mml:mi></mml:math>
</disp-formula>where d is the dimension and <inline-formula id="ieqn-30">
<mml:math id="mml-ieqn-30"><mml:msubsup><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mi>t</mml:mi></mml:msubsup></mml:math>
</inline-formula> is the position of agent i in dimension j at iteration t. Hybrid PSO-whale optimization is used to perform an in-depth search and to find the best solution. The solutions obtained by the two algorithms are combined to produce a new leader, and if this leader has the best fitness, it replaces the previous one.</p>
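<p>The text describes only the leader-replacement rule, not the WOA internals, so the sketch below illustrates just that step: solutions from the two sub-optimizers are pooled and the pooled solution with the best (here, lowest) fitness becomes the new leader. All names and data are hypothetical.</p>

```python
import numpy as np

def hybrid_leader(candidates, fitness):
    """Pool the solutions produced by PSO and WOA, then promote the
    pooled solution with the best fitness as the new leader."""
    pool = np.vstack(candidates)
    scores = np.apply_along_axis(fitness, 1, pool)
    return pool[scores.argmin()].copy()

# Hypothetical results from the two sub-optimizers on a toy fitness.
fitness = lambda z: float(np.sum(z ** 2))
pso_solutions = np.array([[0.5, 0.5], [0.1, -0.2]])
woa_solutions = np.array([[0.05, 0.05], [1.0, 1.0]])
leader = hybrid_leader([pso_solutions, woa_solutions], fitness)
```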
</sec>
<sec id="s2_5">
<label>2.5</label>
<title>PNN Classifier</title>
<p>The classification process is divided into two phases: training and testing. In the training phase, known data are given and data sets are used to train the proposed system. In the testing phase, unknown data are provided and, after training, the classifier is used to perform the classification. At this stage, the features from the previous stage are converted into feature vectors. The obtained feature vectors are utilized to distinguish among microcalcifications and external masses, which are further classified as benign, malignant or normal.</p>
<p>The PNN classifier delivers accurate predictions with fast convergence, and the parallel structure of PNN significantly aids in producing optimal outcomes. The computational load of the training stage is shifted to the evaluation stage, which is the principal distinguishing feature of PNN. A PNN has input, pattern, summation and output layers, and supports multiple classes. When an input is presented to the network, the first layer calculates the distance between the input vector and each learning vector. The second layer sums these contributions for each class to produce at its output a vector of probabilities. Finally, a transfer function at the output of the second layer takes the maximum of these probabilities and produces 1 for that class and 0 for the other classes. The probabilistic neural network thus assigns the input vector to the most probable class, as an extension of the radial basis transfer function. The PNN structure is given in <xref ref-type="fig" rid="fig-3">Fig. 3</xref>.</p>
<fig id="fig-3">
<label>Figure 3</label>
<caption>
<title>Structure of PNN</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-3.png"/>
</fig>
<p>PNN utilizes a radially symmetric, circular Gaussian function centered on each learning vector. The likelihood that a vector belongs to a specific class can be expressed as,<disp-formula id="eqn-23"><label>(23)</label>
<mml:math id="mml-eqn-23" display="block"><mml:msub><mml:mi>f</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>x</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mn>1</mml:mn><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mi>&#x03C0;</mml:mi><mml:mrow><mml:mrow><mml:mfrac><mml:mi>P</mml:mi><mml:mn>2</mml:mn></mml:mfrac></mml:mrow></mml:mrow></mml:msup><mml:msup><mml:mi>&#x03C3;</mml:mi><mml:mi>P</mml:mi></mml:msup><mml:msub><mml:mi>M</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:mrow></mml:mfrac></mml:mrow><mml:munderover><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>M</mml:mi></mml:munderover><mml:mo>&#x2061;</mml:mo><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:mrow><mml:mfrac><mml:mrow><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mi>T</mml:mi></mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>x</mml:mi><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:msup><mml:mi>&#x03C3;</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow></mml:mfrac></mml:mrow></mml:mrow></mml:msup></mml:mstyle></mml:math>
</disp-formula></p>
<p>where i is the class index, j the pattern index, <inline-formula id="ieqn-31">
<mml:math id="mml-ieqn-31"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> the jth training vector of class i, x the test vector, <inline-formula id="ieqn-32">
<mml:math id="mml-ieqn-32"><mml:msub><mml:mi>M</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:math>
</inline-formula> the number of training vectors of class i, p the dimension of vector x, and <inline-formula id="ieqn-33">
<mml:math id="mml-ieqn-33"><mml:mi>&#x03C3;</mml:mi></mml:math>
</inline-formula> the smoothing factor; the probability density function of class i is accumulated over its training vectors <inline-formula id="ieqn-34">
<mml:math id="mml-ieqn-34"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>. The classification decision is given by,<disp-formula id="eqn-24"><label>(24)</label>
<mml:math id="mml-eqn-24" display="block"><mml:mi>d</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>x</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mi>C</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mspace width="thickmathspace" /></mml:mrow></mml:msub><mml:mspace width="thinmathspace" /><mml:mi>i</mml:mi><mml:mrow><mml:mi mathvariant="normal">f</mml:mi><mml:mo>&#x003A;</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:msub><mml:mi>f</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>x</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x003E;</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mi>k</mml:mi></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>x</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:mrow><mml:mi mathvariant="normal">f</mml:mi><mml:mi mathvariant="normal">o</mml:mi><mml:mi mathvariant="normal">r</mml:mi></mml:mrow><mml:mspace width="thinmathspace" /><mml:mi>k</mml:mi><mml:mo>&#x2260;</mml:mo><mml:mi>i</mml:mi></mml:math>
</disp-formula></p>
<p>where <inline-formula id="ieqn-35">
<mml:math id="mml-ieqn-35"><mml:msub><mml:mi>c</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:math>
</inline-formula> denotes the class with index i.</p>
<p>Each extracted feature value of the test image is matched against the feature values of the training examples; when a derived feature value equals a training value, the probability of that particular training example is recorded.</p>
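<p>Eqs. (23) and (24) amount to a Parzen-window density per class followed by an argmax. The sketch below is a minimal Python illustration with hypothetical 2-D training data; the normalizing constant is written as the standard (2&#x03C0;)<sup>p/2</sup>&#x03C3;<sup>p</sup>M<sub>i</sub> factor assumed from Eq. (23).</p>

```python
import numpy as np

def pnn_density(x, train_i, sigma=0.5):
    """Class-conditional density of Eq. (23): an isotropic Gaussian
    (Parzen) kernel centered on each training vector of the class."""
    M, p = train_i.shape
    sq = np.sum((train_i - x) ** 2, axis=1)          # (x - x_ij)^T (x - x_ij)
    norm = (2 * np.pi) ** (p / 2) * sigma ** p * M   # assumed normalizer
    return np.sum(np.exp(-sq / (2 * sigma ** 2))) / norm

def pnn_classify(x, classes, sigma=0.5):
    """Decision rule of Eq. (24): pick the class with the largest density."""
    dens = [pnn_density(x, tr, sigma) for tr in classes]
    return int(np.argmax(dens))

# Two hypothetical classes of 2-D training vectors.
class0 = np.array([[0.0, 0.0], [0.2, 0.1], [-0.1, 0.1]])
class1 = np.array([[3.0, 3.0], [3.1, 2.9], [2.9, 3.2]])
label = pnn_classify(np.array([0.1, 0.0]), [class0, class1])
```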
</sec>
</sec>
<sec id="s3">
<label>3</label>
<title>Result and Discussion</title>
<p>The proposed classification model is implemented in the MATLAB environment. The input dermoscopy images are gathered from the Kaggle dataset. All images, of size 256 &#x002A; 256, are paired with expert manual annotations to effectively track the boundaries of the skin lesions. Samples are retrieved and classified according to the severity of the disease, passing through several processing stages to achieve a high-quality classification.</p>
<p>Preprocessing is performed to enhance the quality of the input image. First, the input image of the skin lesion is converted from RGB to grayscale. A denoised and resized image is then obtained using a Gabor filter. The Gabor filter works well in removing distorted frequency bands from the skin lesion image while allowing other frequencies to pass with minimal loss. This process helps to achieve a higher recognition rate and enhances the quality of the dataset. The input image of the skin lesion is illustrated in <xref ref-type="fig" rid="fig-4">Fig. 4</xref> and the filtered image is given in <xref ref-type="fig" rid="fig-5">Fig. 5</xref>.</p>
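<p>The preprocessing steps described above can be sketched as follows. This is a minimal NumPy/SciPy illustration, not the MATLAB implementation used in the study; the Gabor parameters (kernel size 9, sigma 3.0, wavelength 8.0, aspect ratio 0.5) and the nearest-neighbour resize are assumptions for demonstration.</p>

```python
import numpy as np
from scipy.ndimage import convolve

def gabor_kernel(ksize=9, sigma=3.0, theta=0.0, lambd=8.0, gamma=0.5):
    """Real-valued Gabor kernel: a Gaussian envelope modulating a cosine carrier.
    Parameter values here are illustrative, not the paper's settings."""
    half = ksize // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    g = np.exp(-(xr ** 2 + (gamma * yr) ** 2) / (2 * sigma ** 2)) \
        * np.cos(2 * np.pi * xr / lambd)
    return g / np.abs(g).sum()

def preprocess(rgb, size=256):
    """RGB -> grayscale (luminance weights), resize to size x size,
    then Gabor filtering to suppress distorted frequency bands."""
    gray = rgb @ np.array([0.299, 0.587, 0.114])
    ry = np.linspace(0, gray.shape[0] - 1, size).round().astype(int)
    rx = np.linspace(0, gray.shape[1] - 1, size).round().astype(int)
    gray = gray[np.ix_(ry, rx)]               # nearest-neighbour resize
    return convolve(gray, gabor_kernel(), mode="reflect")

image = np.random.default_rng(0).uniform(0, 255, (300, 300, 3))
filtered = preprocess(image)
print(filtered.shape)  # (256, 256)
```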
<fig id="fig-4">
<label>Figure 4</label>
<caption>
<title>Input image of skin lesion</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-4.png"/>
</fig><fig id="fig-5">
<label>Figure 5</label>
<caption>
<title>Filtered image</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-5.png"/>
</fig>
<fig id="fig-6">
<label>Figure 6</label>
<caption>
<title>Enhanced image</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-6.png"/>
</fig>
<p>The histogram representation of the image is given in <xref ref-type="fig" rid="fig-7">Fig. 7</xref>.</p>
<fig id="fig-7">
<label>Figure 7</label>
<caption>
<title>Histogram of original image</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-7.png"/>
</fig>
<p>The output of the clustered image is portrayed in <xref ref-type="fig" rid="fig-8">Fig. 8</xref>, whereas the melanoma in the segmented image is highlighted in <xref ref-type="fig" rid="fig-9">Fig. 9</xref>.</p>
<fig id="fig-8">
<label>Figure 8</label>
<caption>
<title>Clustered image</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-8.png"/>
</fig><fig id="fig-9">
<label>Figure 9</label>
<caption>
<title>Analysis of melanoma in segmented area</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-9.png"/>
</fig>
<p>The histogram feature is also called the amplitude feature. In grayscale images it provides a clear and valuable representation of intensity levels. Segmentation deals with the attributes that pixels in the segmented area must satisfy. The histogram yields the image's average gray value, which represents the average intensity of the image. The morphological features of skin lesions help to characterize the type of tumor. Although the histogram of the binary image is quite useful for classifying skin melanoma, this feature alone is not sufficient because it carries too little information for deterministic melanoma classification.</p>
<p>Segmentation is the process of dividing objects into clusters. It is performed to extract objects and other corresponding information. It has to be noted that the segmentation results of the training and test data are not compromised. Segmentation helps radiologists to detect important and suspicious structures.</p>
<p>To avoid invasive biopsy methods, important morphological features of the dermoscopy images need to be extracted. Since the difference between normal and melanoma skin lesions is small, it is very difficult to separate the variations using visual perception alone. Therefore, several statistical characteristics are recovered: 12 features are calculated to estimate the diseased area, which helps to classify and better identify skin lesion images. Kurtosis and skewness measure how well the data fit a distribution, and their values depend strongly on the sample size. If the kurtosis value is close to 0, the distribution is approximately normal; if it exceeds &#x002B;1, the distribution is peaked; and if it is less than &#x2212;1, the distribution is considered too flat. If the distribution is exaggerated by these criteria, the corresponding samples are treated as anomalies.</p>
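<p>The kurtosis rule above (near 0 is approximately normal, above +1 is peaked, below &#x2212;1 is too flat) can be encoded directly. This is a small sketch using SciPy's Fisher excess kurtosis on synthetic samples; the sample data are assumptions for illustration.</p>

```python
import numpy as np
from scipy.stats import kurtosis

def shape_label(values):
    """Apply the kurtosis rule from the text: near 0 -> approximately normal,
    above +1 -> peaked, below -1 -> too flat (Fisher's excess kurtosis)."""
    k = kurtosis(values)
    if k > 1:
        return "peaked"
    if k < -1:
        return "flat"
    return "approximately normal"

rng = np.random.default_rng(0)
normal_like = rng.normal(size=20000)
flat_like = rng.uniform(-1, 1, size=20000)     # uniform excess kurtosis is -1.2
print(shape_label(normal_like))  # approximately normal
print(shape_label(flat_like))    # flat
```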
<p>Hybrid PSO-whale optimization is used to further optimize the characteristics of skin lesions, since the PSO-whale hybrid yields the optimal feature model. The melanoma classifier is used to analyze the skin lesions: the segmented region obtained through feature extraction, together with the retrieved parameters, supports image classification. The proposed PNN classifier helps to distinguish cancer types and find abnormal stages. The initial training rate is 0.01, which is then decreased by a factor of 0.1. In classification, if a derived feature value is found to be similar to any training data, the classifier associates the individually retrieved feature with the corresponding training record. The lesions are classified by this system as normal, benign or melanoma.</p>
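<p>A PSO-whale hybrid of the kind used for feature selection can be sketched as follows. This toy example minimizes a sphere function rather than a feature-subset fitness, and the coefficients (inertia 0.7, acceleration 1.5, spiral probability 0.5) are illustrative assumptions, not the paper's settings.</p>

```python
import numpy as np

def sphere(x):
    """Toy fitness: minimize the sum of squares (optimum 0 at the origin)."""
    return float(np.sum(x ** 2))

def hybrid_pso_woa(fitness, dim=2, n_particles=20, iters=100, seed=0):
    """Illustrative hybrid: every particle takes a PSO velocity step, then with
    probability 0.5 is moved along a WOA-style logarithmic spiral around the
    global best."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(-5, 5, (n_particles, dim))
    V = np.zeros((n_particles, dim))
    P = X.copy()                             # personal bests
    pf = np.array([fitness(x) for x in X])   # personal-best fitness
    g = P[pf.argmin()].copy()                # global best
    for _ in range(iters):
        r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
        V = 0.7 * V + 1.5 * r1 * (P - X) + 1.5 * r2 * (g - X)  # PSO update
        X = X + V
        spiral = rng.random(n_particles) < 0.5                 # WOA spiral move
        if spiral.any():
            l = rng.uniform(-1, 1, (spiral.sum(), dim))
            D = np.abs(g - X[spiral])
            X[spiral] = D * np.exp(l) * np.cos(2 * np.pi * l) + g
        f = np.array([fitness(x) for x in X])
        better = f < pf
        P[better], pf[better] = X[better], f[better]
        g = P[pf.argmin()].copy()
    return g, float(pf.min())

best, val = hybrid_pso_woa(sphere)
print(val < 0.5)  # the swarm settles near the optimum
```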
<p>In image segmentation, the threshold plays a significant role: it helps to improve the clarity of the image. By choosing a threshold value, the grayscale image can be converted into a binary image with two important modes, object pixels and background pixels. Thresholding is performed so as to isolate the object from the background. If the pixel value is higher than the threshold, it is displayed as a bright spot; if it is less than the threshold, it appears as a dark dot. The enhanced image is shown in <xref ref-type="fig" rid="fig-6">Fig. 6</xref>, and the binary image obtained by thresholding is shown in <xref ref-type="fig" rid="fig-7">Fig. 7</xref>. The advantage of retrieving binary images is that it minimizes complexity and simplifies recognition and classification. The shape, size, position and orientation of the image are analyzed in a two-dimensional space to effectively identify and classify tumors in skin lesions.</p>
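<p>The bright-spot/dark-dot rule above amounts to a one-line comparison against the threshold. A minimal sketch, with an assumed threshold of 128 on an 8-bit grayscale image:</p>

```python
import numpy as np

def to_binary(gray, threshold=128):
    """Pixels above the threshold become bright object pixels (255);
    the rest become dark background pixels (0). Threshold is illustrative."""
    return np.where(gray > threshold, 255, 0).astype(np.uint8)

gray = np.array([[10, 200], [130, 90]], dtype=np.uint8)
print(to_binary(gray).tolist())  # [[0, 255], [255, 0]]
```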
<p><bold><italic>Performance Metrics</italic></bold></p>
<p>The performance of the PNN classifier is compared with the existing SVM and Deep Neural Network (DNN) classifiers. Compared with these classifiers, the proposed classifier shows strong tolerance to input noise, is easy to execute and minimizes the calculation time. Accuracy, sensitivity and specificity are evaluated in an optimal manner.</p>
<p>The PNN classifier shows a high accuracy of 97.83&#x0025;. Sensitivity and specificity play a significant role in determining the accuracy.</p>
<p>The comparative analysis of the performance metrics of the introduced classifier is presented in <xref ref-type="fig" rid="fig-10">Fig. 10</xref> to <xref ref-type="fig" rid="fig-12">Fig. 12</xref> and <xref ref-type="table" rid="table-1">Tab. 1</xref> to <xref ref-type="table" rid="table-3">Tab. 3</xref>. The features extracted through GRCM are listed in <xref ref-type="table" rid="table-4">Tab. 4</xref>.</p>
<fig id="fig-10">
<label>Figure 10</label>
<caption>
<title>Accuracy comparison</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-10.png"/>
</fig><fig id="fig-11">
<label>Figure 11</label>
<caption>
<title>Sensitivity comparison</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-11.png"/>
</fig><fig id="fig-12">
<label>Figure 12</label>
<caption>
<title>Specificity comparison</title></caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CSSE_32935-fig-12.png"/>
</fig><table-wrap id="table-1"><label>Table 1</label>
<caption>
<title>Accuracy comparison between SVM, DNN and PNN classifier</title></caption>
<table><colgroup><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left" rowspan="2">Dataset</th>
<th align="left" rowspan="2">Number of images used</th>
<th align="left" colspan="3">Accuracy</th>
</tr>
<tr>
<th align="left">SVM</th>
<th align="left">DNN</th>
<th align="left">PNN</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">PH2</td>
<td align="left">250</td>
<td align="left">83.14</td>
<td align="left">87.95</td>
<td align="left">90.12</td>
</tr>
<tr>
<td align="left">Dermis</td>
<td align="left">1280</td>
<td align="left">85.39</td>
<td align="left">88.15</td>
<td align="left">92.17</td>
</tr>
<tr>
<td align="left">Dermquest</td>
<td align="left">11860</td>
<td align="left">85.99</td>
<td align="left">89.90</td>
<td align="left">93.33</td>
</tr>
<tr>
<td align="left">Kaggle</td>
<td align="left">10150</td>
<td align="left">86.32</td>
<td align="left">93.86</td>
<td align="left">97.83</td>
</tr>
</tbody>
</table>
</table-wrap><table-wrap id="table-2"><label>Table 2</label>
<caption>
<title>Sensitivity comparison between SVM, DNN and PNN</title></caption>
<table><colgroup><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left" rowspan="2">Dataset</th>
<th align="left" rowspan="2">Number of images used</th>
<th align="left" colspan="3">Sensitivity</th>
</tr>
<tr>
<th align="left">SVM</th>
<th align="left">DNN</th>
<th align="left">PNN</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">PH2</td>
<td align="left">250</td>
<td align="left">82.63</td>
<td align="left">84.95</td>
<td align="left">90.71</td>
</tr>
<tr>
<td align="left">Dermis</td>
<td align="left">1280</td>
<td align="left">82.90</td>
<td align="left">85.98</td>
<td align="left">91.81</td>
</tr>
<tr>
<td align="left">Dermquest</td>
<td align="left">11860</td>
<td align="left">84.36</td>
<td align="left">89.83</td>
<td align="left">93.56</td>
</tr>
<tr>
<td align="left">Kaggle</td>
<td align="left">10150</td>
<td align="left">84.24</td>
<td align="left">87.39</td>
<td align="left">95.11</td>
</tr>
</tbody>
</table>
</table-wrap><table-wrap id="table-3"><label>Table 3</label>
<caption>
<title>Specificity comparison between SVM, DNN and PNN</title></caption>
<table><colgroup><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left" rowspan="2">Dataset</th>
<th align="left" rowspan="2">Number of images used</th>
<th align="left" colspan="3">Specificity</th>
</tr>
<tr>
<th align="left">SVM</th>
<th align="left">DNN</th>
<th align="left">PNN</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">PH2</td>
<td align="left">250</td>
<td align="left">84.33</td>
<td align="left">85.99</td>
<td align="left">91.66</td>
</tr>
<tr>
<td align="left">Dermis</td>
<td align="left">1280</td>
<td align="left">85.32</td>
<td align="left">86.98</td>
<td align="left">92.22</td>
</tr>
<tr>
<td align="left">Dermquest</td>
<td align="left">11860</td>
<td align="left">81.68</td>
<td align="left">87.93</td>
<td align="left">93.67</td>
</tr>
<tr>
<td align="left">Kaggle</td>
<td align="left">10150</td>
<td align="left">87.23</td>
<td align="left">89.98</td>
<td align="left">95.25</td>
</tr>
</tbody>
</table>
</table-wrap><table-wrap id="table-4"><label>Table 4</label>
<caption>
<title>Features of skin lesion</title></caption>
<table><colgroup><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Images</th>
<th align="left">Mean</th>
<th align="left">Standard deviation</th>
<th align="left">Entropy</th>
<th align="left">Area</th>
<th align="left">Perimeter</th>
<th align="left">Depth</th>
<th align="left">Result</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Image 1</td>
<td align="left">2.5129</td>
<td align="left">6.6985</td>
<td align="left">0.8012</td>
<td align="left">54.98</td>
<td align="left">19.86</td>
<td align="left">41.9</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 2</td>
<td align="left">2.8102</td>
<td align="left">6.6819</td>
<td align="left">0.7902</td>
<td align="left">55.16</td>
<td align="left">19.24</td>
<td align="left">24.1</td>
<td align="left">Non-melanoma (BCC)</td>
</tr>
<tr>
<td align="left">Image 3</td>
<td align="left">2.7914</td>
<td align="left">6.6741</td>
<td align="left">0.7214</td>
<td align="left">56.62</td>
<td align="left">19.84</td>
<td align="left">26.2</td>
<td align="left">Non-melanoma (BCC)</td>
</tr>
<tr>
<td align="left">Image 4</td>
<td align="left">2.6128</td>
<td align="left">6.8102</td>
<td align="left">0.7192</td>
<td align="left">51.79</td>
<td align="left">19.78</td>
<td align="left">47.9</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 5</td>
<td align="left">2.4021</td>
<td align="left">6.6993</td>
<td align="left">0.7613</td>
<td align="left">51.61</td>
<td align="left">19.20</td>
<td align="left">47.7</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 6</td>
<td align="left">2.7986</td>
<td align="left">6.7201</td>
<td align="left">0.6992</td>
<td align="left">53.12</td>
<td align="left">20.13</td>
<td align="left">17.1</td>
<td align="left">Non-melanoma (SCC)</td>
</tr>
<tr>
<td align="left">Image 7</td>
<td align="left">2.6084</td>
<td align="left">6.7854</td>
<td align="left">0.6911</td>
<td align="left">55.52</td>
<td align="left">20.19</td>
<td align="left">47.2</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 8</td>
<td align="left">2.7123</td>
<td align="left">6.7124</td>
<td align="left">0.7236</td>
<td align="left">52.41</td>
<td align="left">20.17</td>
<td align="left">18.8</td>
<td align="left">Non-melanoma (SCC)</td>
</tr>
<tr>
<td align="left">Image 9</td>
<td align="left">2.9485</td>
<td align="left">6.8021</td>
<td align="left">0.7271</td>
<td align="left">51.27</td>
<td align="left">20.12</td>
<td align="left">18.9</td>
<td align="left">Non-melanoma (SCC)</td>
</tr>
<tr>
<td align="left">Image 10</td>
<td align="left">2.4986</td>
<td align="left">6.7178</td>
<td align="left">0.8012</td>
<td align="left">54.39</td>
<td align="left">20.15</td>
<td align="left">45.01</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 11</td>
<td align="left">2.5143</td>
<td align="left">6.7654</td>
<td align="left">0.7321</td>
<td align="left">50.64</td>
<td align="left">20.15</td>
<td align="left">44.01</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 12</td>
<td align="left">2.6851</td>
<td align="left">6.6983</td>
<td align="left">0.6954</td>
<td align="left">54.32</td>
<td align="left">19.97</td>
<td align="left">17.9</td>
<td align="left">Non-melanoma (SCC)</td>
</tr>
<tr>
<td align="left">Image 13</td>
<td align="left">2.4995</td>
<td align="left">6.7291</td>
<td align="left">0.7319</td>
<td align="left">54.31</td>
<td align="left">19.56</td>
<td align="left">45.01</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 14</td>
<td align="left">2.4922</td>
<td align="left">6.7362</td>
<td align="left">0.72818</td>
<td align="left">52.02</td>
<td align="left">19.73</td>
<td align="left">44.58</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 15</td>
<td align="left">2.2841</td>
<td align="left">6.6904</td>
<td align="left">0.7601</td>
<td align="left">56.92</td>
<td align="left">20.12</td>
<td align="left">43.59</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 16</td>
<td align="left">2.5931</td>
<td align="left">6.7610</td>
<td align="left">0.7243</td>
<td align="left">54.31</td>
<td align="left">19.88</td>
<td align="left">53.9</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 17</td>
<td align="left">2.9021</td>
<td align="left">6.8001</td>
<td align="left">0.6913</td>
<td align="left">54.38</td>
<td align="left">20.13</td>
<td align="left">24.8</td>
<td align="left">Non-melanoma (BCC)</td>
</tr>
<tr>
<td align="left">Image 18</td>
<td align="left">2.4829</td>
<td align="left">6.6928</td>
<td align="left">0.7274</td>
<td align="left">52.10</td>
<td align="left">19.97</td>
<td align="left">43.7</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 19</td>
<td align="left">2.7395</td>
<td align="left">6.7391</td>
<td align="left">0.7309</td>
<td align="left">54.98</td>
<td align="left">19.96</td>
<td align="left">43.7</td>
<td align="left">Melanoma</td>
</tr>
<tr>
<td align="left">Image 20</td>
<td align="left">2.6983</td>
<td align="left">6.6932</td>
<td align="left">0.7661</td>
<td align="left">54.86</td>
<td align="left">20.14</td>
<td align="left">43.7</td>
<td align="left">Melanoma</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Sensitivity distinguishes the individuals who are affected by the illness: it is the percentage of TP among all affected people, and a highly sensitive test that returns negative helps to rule out the disease. Specificity shows the percentage of TN among all healthy people. TN stands for true negative and TP stands for true positive.</p>
<p><disp-formula id="eqn-25"><label>(25)</label>
<mml:math id="mml-eqn-25" display="block"><mml:mrow><mml:mi mathvariant="normal">S</mml:mi><mml:mi mathvariant="normal">e</mml:mi><mml:mi mathvariant="normal">n</mml:mi><mml:mi mathvariant="normal">s</mml:mi><mml:mi mathvariant="normal">i</mml:mi><mml:mi mathvariant="normal">t</mml:mi><mml:mi mathvariant="normal">i</mml:mi><mml:mi mathvariant="normal">v</mml:mi><mml:mi mathvariant="normal">i</mml:mi><mml:mi mathvariant="normal">t</mml:mi><mml:mi mathvariant="normal">y</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi><mml:mo>+</mml:mo><mml:mi>F</mml:mi><mml:mi>N</mml:mi></mml:mrow></mml:mfrac></mml:mrow><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>A</mml:mi><mml:mi>l</mml:mi><mml:mi>l</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>d</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>d</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:mi>o</mml:mi><mml:mi>p</mml:mi><mml:mi>l</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:mfrac></mml:mrow></mml:mstyle></mml:mstyle></mml:math>
</disp-formula></p>
<p><disp-formula id="eqn-26"><label>(26)</label>
<mml:math id="mml-eqn-26" display="block"><mml:mrow><mml:mi mathvariant="normal">S</mml:mi><mml:mi mathvariant="normal">p</mml:mi><mml:mi mathvariant="normal">e</mml:mi><mml:mi mathvariant="normal">c</mml:mi><mml:mi mathvariant="normal">i</mml:mi><mml:mi mathvariant="normal">f</mml:mi><mml:mi mathvariant="normal">i</mml:mi><mml:mi mathvariant="normal">c</mml:mi><mml:mi mathvariant="normal">i</mml:mi><mml:mi mathvariant="normal">t</mml:mi><mml:mi mathvariant="normal">y</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:mi>N</mml:mi></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:mi>N</mml:mi><mml:mo>+</mml:mo><mml:mi>F</mml:mi><mml:mi>P</mml:mi></mml:mrow></mml:mfrac></mml:mrow><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:mi>N</mml:mi></mml:mrow><mml:mrow><mml:mi>A</mml:mi><mml:mi>l</mml:mi><mml:mi>l</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>h</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>t</mml:mi><mml:mi>h</mml:mi><mml:mi>y</mml:mi><mml:mspace width="thickmathspace" /><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:mi>o</mml:mi><mml:mi>p</mml:mi><mml:mi>l</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:mfrac></mml:mrow></mml:mstyle></mml:mstyle></mml:math>
</disp-formula></p>
<p><disp-formula id="eqn-27"><label>(27)</label>
<mml:math id="mml-eqn-27" display="block"><mml:mrow><mml:mi mathvariant="normal">A</mml:mi><mml:mi mathvariant="normal">c</mml:mi><mml:mi mathvariant="normal">c</mml:mi><mml:mi mathvariant="normal">u</mml:mi><mml:mi mathvariant="normal">r</mml:mi><mml:mi mathvariant="normal">a</mml:mi><mml:mi mathvariant="normal">c</mml:mi><mml:mi mathvariant="normal">y</mml:mi><mml:mo>=</mml:mo></mml:mrow><mml:mspace width="thinmathspace" /><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi><mml:mo>+</mml:mo><mml:mi>T</mml:mi><mml:mi>N</mml:mi></mml:mrow><mml:mrow><mml:mi>T</mml:mi><mml:mi>P</mml:mi><mml:mo>+</mml:mo><mml:mi>T</mml:mi><mml:mi>N</mml:mi><mml:mo>+</mml:mo><mml:mi>F</mml:mi><mml:mi>P</mml:mi><mml:mo>+</mml:mo><mml:mi>F</mml:mi><mml:mi>N</mml:mi></mml:mrow></mml:mfrac></mml:mrow></mml:mstyle></mml:math>
</disp-formula></p>
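<p>Eqs. (25)&#x2013;(27) can be checked with a small helper. The confusion-matrix counts in the example are hypothetical, chosen only to illustrate the computation:</p>

```python
def metrics(tp, tn, fp, fn):
    """Sensitivity = TP/(TP+FN), specificity = TN/(TN+FP),
    accuracy = (TP+TN)/(TP+TN+FP+FN), as in Eqs. (25)-(27)."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, accuracy

# hypothetical counts for illustration
sens, spec, acc = metrics(tp=95, tn=93, fp=7, fn=5)
print(sens, spec, acc)  # 0.95 0.93 0.94
```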
<p>The sensitivity accomplished by the PNN classifier was shown to be 95.11&#x0025;.</p>
<p>The specificity accomplished by the PNN classifier was shown to be 95.25&#x0025;. The input dermoscopy images are tabulated together with their properties, covering several attributes and their interrelations. Samples of 20 melanoma and benign skin lesion images are taken. Alongside the values of area and perimeter, lesions with a depth greater than 30&#x2005;mm are regarded as melanoma, those with a depth between 20 and 30&#x2005;mm are considered BCC, and those with a depth less than 20&#x2005;mm are taken as SCC.</p>
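<p>The depth rule stated above reproduces the Result column of Tab. 4 and can be encoded directly; the test values are taken from Images 1, 2 and 6 of that table:</p>

```python
def lesion_class(depth_mm):
    """Depth-based rule from the text: > 30 mm -> melanoma,
    20-30 mm -> BCC, < 20 mm -> SCC."""
    if depth_mm > 30:
        return "Melanoma"
    if depth_mm >= 20:
        return "Non-melanoma (BCC)"
    return "Non-melanoma (SCC)"

# rows from Tab. 4: Image 1 (depth 41.9), Image 2 (24.1), Image 6 (17.1)
print(lesion_class(41.9))  # Melanoma
print(lesion_class(24.1))  # Non-melanoma (BCC)
print(lesion_class(17.1))  # Non-melanoma (SCC)
```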
<p>The classification result of the dermoscopy image depends on several attributes such as mean, standard deviation, entropy, area, perimeter and depth. The texture features are measured on a mm scale.</p>
<p>In the results, some images are non-melanoma because their area value is large while their depth, although greater than 20&#x2005;mm, does not exceed 30&#x2005;mm. If two images have the same area value, the one with the comparatively higher depth value is classified as melanoma. Inconsistency in the image attributes occurs when images share the same values for both area and depth; in such situations the classifier considers other attributes such as perimeter and entropy. It is therefore necessary to consider every attribute until all the images are classified. The magnitude of the area value depends on the perimeter, and the depth value depends on the entropy. When more than two attributes are identical, the classifier segregates the images based on the other characteristics that form part of the feature subset.</p>
</sec>
<sec id="s4">
<label>4</label>
<title>Conclusion</title>
<p>An effective on-time detection of skin cancer using an image processing approach is introduced in this study, which significantly assists in identifying the type and stage of melanoma in an optimal manner. In this system, the noise in the input image is eliminated with the aid of a Gabor filter, and the noise-free image is then divided into multiple sections using the Fuzzy C-Means approach. The features of the segmented images are extracted through the GRCM approach, and the optimal features among them are selected through hybrid optimization using the PSO-whale optimization algorithm, which enhances the reliability of detecting the cancer at an early stage. The selected features are then efficiently classified using the PNN classifier, which significantly minimizes the complexity of the training data over a wide range. A comparative analysis of the proposed work is carried out against SVM and DNN. The attained outcomes validate that the introduced approach provides a high accuracy of 97.83&#x0025;, sensitivity of 95.11&#x0025; and specificity of 95.25&#x0025;, which are comparatively better than those of the other existing approaches.</p>
</sec>
</body>
<back><fn-group>
<fn fn-type="other">
<p><bold>Funding Statement:</bold> The authors received no specific funding for this study.</p>
</fn>
<fn fn-type="conflict">
<p><bold>Conflicts of Interest:</bold> The authors declare that they have no conflicts of interest to report regarding the present study.</p>
</fn>
</fn-group>
<ref-list content-type="authoryear">
<title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>G.</given-names> <surname>Mansutti</surname></string-name>, <string-name><given-names>A. T.</given-names> <surname>Mobashsher</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Bialkowski</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Mohammed</surname></string-name> and <string-name><given-names>A.</given-names> <surname>Abbosh</surname></string-name></person-group>, &#x201C;<article-title>Millimeter-wave substrate integrated waveguide probe for skin cancer detection</article-title>,&#x201D; <source>IEEE Transactions on Biomedical Engineering</source>, vol. <volume>67</volume>, no. <issue>9</issue>, pp. <fpage>2462</fpage>&#x2013;<lpage>2472</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Naeem</surname></string-name>, <string-name><given-names>M. S.</given-names> <surname>Farooq</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Khelifi</surname></string-name> and <string-name><given-names>A.</given-names> <surname>Abid</surname></string-name></person-group>, &#x201C;<article-title>Malignant melanoma classification using deep learning: Datasets, performance measurements, challenges and opportunities</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>110575</fpage>&#x2013;<lpage>110597</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S. S.</given-names> <surname>Mahmouei</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Aldeen</surname></string-name>, <string-name><given-names>W. V.</given-names> <surname>Stoecker</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Garnavi</surname></string-name></person-group>, &#x201C;<article-title>Biologically inspired QuadTree color detection in dermoscopy images of melanoma</article-title>,&#x201D; <source>IEEE Journal of Biomedical and Health Informatics</source>, vol. <volume>23</volume>, no. <issue>2</issue>, pp. <fpage>570</fpage>&#x2013;<lpage>577</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>H.</given-names> <surname>Arab</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Chioukh</surname></string-name>, <string-name><given-names>M. D.</given-names> <surname>Ardakani</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Dufour</surname></string-name> and <string-name><given-names>S. O.</given-names> <surname>Tatu</surname></string-name></person-group>, &#x201C;<article-title>Early-stage detection of melanoma skin cancer using contactless Millimeter-wave sensors</article-title>,&#x201D; <source>IEEE Sensors Journal</source>, vol. <volume>20</volume>, no. <issue>13</issue>, pp. <fpage>7310</fpage>&#x2013;<lpage>7317</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Sadeghi</surname></string-name>, <string-name><given-names>T. K.</given-names> <surname>Lee</surname></string-name>, <string-name><given-names>D.</given-names> <surname>McLean</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Lui</surname></string-name> and <string-name><given-names>M. S.</given-names> <surname>Atkins</surname></string-name></person-group>, &#x201C;<article-title>Detection and analysis of irregular streaks in dermoscopic images of skin lesions</article-title>,&#x201D; <source>IEEE Transactions on Medical Imaging</source>, vol. <volume>32</volume>, no. <issue>5</issue>, pp. <fpage>849</fpage>&#x2013;<lpage>861</lpage>, <year>2013</year>.</mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>R.</given-names> <surname>Ashraf</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Afzal</surname></string-name>, <string-name><given-names>A. U.</given-names> <surname>Rehman</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Gul</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Baber</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Region-of-interest based transfer learning assisted framework for skin cancer detection</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>147858</fpage>&#x2013;<lpage>147871</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>L.</given-names> <surname>Wei</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Ding</surname></string-name> and <string-name><given-names>H.</given-names> <surname>Hu</surname></string-name></person-group>, &#x201C;<article-title>Automatic skin cancer detection in dermoscopy images based on ensemble lightweight deep learning network</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>99633</fpage>&#x2013;<lpage>99647</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>L.</given-names> <surname>Yu</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Chen</surname></string-name>, <string-name><given-names>Q.</given-names> <surname>Dou</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Qin</surname></string-name> and <string-name><given-names>P. A.</given-names> <surname>Heng</surname></string-name></person-group>, &#x201C;<article-title>Automated melanoma recognition in dermoscopy images via very deep residual networks</article-title>,&#x201D; <source>IEEE Transactions on Medical Imaging</source>, vol. <volume>36</volume>, no. <issue>4</issue>, pp. <fpage>994</fpage>&#x2013;<lpage>1004</lpage>, <year>2016</year>.</mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Pathan</surname></string-name>, <string-name><given-names>V.</given-names> <surname>Aggarwal</surname></string-name>, <string-name><given-names>K. G.</given-names> <surname>Prabhu</surname></string-name> and <string-name><given-names>P. C.</given-names> <surname>Siddalingaswamy</surname></string-name></person-group>, &#x201C;<article-title>Melanoma detection in dermoscopic images using color features</article-title>,&#x201D; <source>Biomedical and Pharmacology Journal</source>, vol. <volume>12</volume>, no. <issue>1</issue>, pp. <fpage>107</fpage>&#x2013;<lpage>115</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A. R. H.</given-names> <surname>Ali</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Li</surname></string-name> and <string-name><given-names>G.</given-names> <surname>Yang</surname></string-name></person-group>, &#x201C;<article-title>Automating the ABCD rule for melanoma detection: A survey</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>83333</fpage>&#x2013;<lpage>83346</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Goyal</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Oakley</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Bansal</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Dancey</surname></string-name> and <string-name><given-names>M. H.</given-names> <surname>Yap</surname></string-name></person-group>, &#x201C;<article-title>Skin lesion segmentation in dermoscopic images with ensemble deep learning methods</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>4171</fpage>&#x2013;<lpage>4181</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>D&#x00ED;az</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Krohmer</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Moreira</surname></string-name>, <string-name><given-names>S. E.</given-names> <surname>Godoy</surname></string-name> and <string-name><given-names>M.</given-names> <surname>Figueroa</surname></string-name></person-group>, &#x201C;<article-title>An instrument for accurate and non-invasive screening of skin cancer based on multimodal imaging</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>7</volume>, pp. <fpage>176646</fpage>&#x2013;<lpage>176657</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>H.</given-names> <surname>de Souza Ganzeli</surname></string-name>, <string-name><given-names>J. G.</given-names> <surname>Bottesini</surname></string-name>, <string-name><given-names>L.</given-names> <surname>de Oliveira Paz</surname></string-name> and <string-name><given-names>M. F. S.</given-names> <surname>Ribeiro</surname></string-name></person-group>, &#x201C;<article-title>Skan: Skin scanner-system for skin cancer detection using adaptive techniques</article-title>,&#x201D; <source>IEEE Latin America Transactions</source>, vol. <volume>9</volume>, no. <issue>2</issue>, pp. <fpage>206</fpage>&#x2013;<lpage>212</lpage>, <year>2011</year>.</mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y. C.</given-names> <surname>Lin</surname></string-name>, <string-name><given-names>Y. J.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>J. C. H.</given-names> <surname>Cheng</surname></string-name> and <string-name><given-names>Y. H.</given-names> <surname>Lin</surname></string-name></person-group>, &#x201C;<article-title>Contactless monitoring of pulse rate and eye movement for uveal melanoma patients undergoing radiation therapy</article-title>,&#x201D; <source>IEEE Transactions on Instrumentation and Measurement</source>, vol. <volume>68</volume>, no. <issue>2</issue>, pp. <fpage>474</fpage>&#x2013;<lpage>482</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>C.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Huang</surname></string-name>, <string-name><given-names>J. A.</given-names> <surname>Ozolek</surname></string-name>, <string-name><given-names>M. G.</given-names> <surname>Hanna</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Singh</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>SetSVM: An approach to set classification in nuclei-based cancer detection</article-title>,&#x201D; <source>IEEE Journal of Biomedical and Health Informatics</source>, vol. <volume>23</volume>, no. <issue>1</issue>, pp. <fpage>351</fpage>&#x2013;<lpage>361</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K.</given-names> <surname>Korotkov</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Quintana</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Puig</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Malvehy</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Garcia</surname></string-name></person-group>, &#x201C;<article-title>A new total body scanning system for automatic change detection in multiple pigmented skin lesions</article-title>,&#x201D; <source>IEEE Transactions on Medical Imaging</source>, vol. <volume>34</volume>, no. <issue>1</issue>, pp. <fpage>317</fpage>&#x2013;<lpage>338</lpage>, <year>2014</year>.</mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Mirbeik-Sabzevari</surname></string-name> and <string-name><given-names>N.</given-names> <surname>Tavassolian</surname></string-name></person-group>, &#x201C;<article-title>Ultrawideband, stable normal and cancer skin tissue phantoms for millimeter-wave skin cancer imaging</article-title>,&#x201D; <source>IEEE Transactions on Biomedical Engineering</source>, vol. <volume>66</volume>, no. <issue>1</issue>, pp. <fpage>176</fpage>&#x2013;<lpage>186</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Keshavarz</surname></string-name> and <string-name><given-names>Z.</given-names> <surname>Vafapour</surname></string-name></person-group>, &#x201C;<article-title>Water-based terahertz metamaterial for skin cancer detection application</article-title>,&#x201D; <source>IEEE Sensors Journal</source>, vol. <volume>19</volume>, no. <issue>4</issue>, pp. <fpage>1519</fpage>&#x2013;<lpage>1524</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>H.</given-names> <surname>Sun</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Grishman</surname></string-name></person-group>, &#x201C;<article-title>Lexicalized dependency paths based supervised learning for relation extraction</article-title>,&#x201D; <source>Computer Systems Science and Engineering</source>, vol. <volume>43</volume>, no. <issue>3</issue>, pp. <fpage>861</fpage>&#x2013;<lpage>870</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-20"><label>[20]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>R.</given-names> <surname>Amelard</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Glaister</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Wong</surname></string-name> and <string-name><given-names>D. A.</given-names> <surname>Clausi</surname></string-name></person-group>, &#x201C;<article-title>High-level intuitive features (HLIFs) for intuitive skin lesion description</article-title>,&#x201D; <source>IEEE Transactions on Biomedical Engineering</source>, vol. <volume>62</volume>, no. <issue>3</issue>, pp. <fpage>820</fpage>&#x2013;<lpage>831</lpage>, <year>2014</year>.</mixed-citation></ref>
<ref id="ref-21"><label>[21]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>H.</given-names> <surname>Sun</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Grishman</surname></string-name></person-group>, &#x201C;<article-title>Employing lexicalized dependency paths for active learning of relation extraction</article-title>,&#x201D; <source>Intelligent Automation &#x0026; Soft Computing</source>, vol. <volume>34</volume>, no. <issue>3</issue>, pp. <fpage>1415</fpage>&#x2013;<lpage>1423</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-22"><label>[22]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D.</given-names> <surname>Caratelli</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Massaro</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Cingolani</surname></string-name> and <string-name><given-names>A. G.</given-names> <surname>Yarovoy</surname></string-name></person-group>, &#x201C;<article-title>Accurate time-domain modeling of reconfigurable antenna sensors for non-invasive melanoma skin cancer detection</article-title>,&#x201D; <source>IEEE Sensors Journal</source>, vol. <volume>12</volume>, no. <issue>3</issue>, pp. <fpage>635</fpage>&#x2013;<lpage>643</lpage>, <year>2011</year>.</mixed-citation></ref>
<ref id="ref-23"><label>[23]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>C.</given-names> <surname>Barata</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Ruela</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Francisco</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Mendon&#x00E7;a</surname></string-name> and <string-name><given-names>J. S.</given-names> <surname>Marques</surname></string-name></person-group>, &#x201C;<article-title>Two systems for the detection of melanomas in dermoscopy images using texture and color features</article-title>,&#x201D; <source>IEEE Systems Journal</source>, vol. <volume>8</volume>, no. <issue>3</issue>, pp. <fpage>965</fpage>&#x2013;<lpage>979</lpage>, <year>2013</year>.</mixed-citation></ref>
<ref id="ref-24"><label>[24]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>G.</given-names> <surname>Sforza</surname></string-name>, <string-name><given-names>G.</given-names> <surname>Castellano</surname></string-name>, <string-name><given-names>S. K.</given-names> <surname>Arika</surname></string-name>, <string-name><given-names>R. W.</given-names> <surname>LeAnder</surname></string-name>, <string-name><given-names>R. J.</given-names> <surname>Stanley</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Using adaptive thresholding and skewness correction to detect gray areas in melanoma in situ images</article-title>,&#x201D; <source>IEEE Transactions on Instrumentation and Measurement</source>, vol. <volume>61</volume>, no. <issue>7</issue>, pp. <fpage>1839</fpage>&#x2013;<lpage>1847</lpage>, <year>2012</year>.</mixed-citation></ref>
<ref id="ref-25"><label>[25]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>P.</given-names> <surname>Mehta</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Chand</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Narayanswamy</surname></string-name>, <string-name><given-names>D. G.</given-names> <surname>Beetner</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Zoughi</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Microwave reflectometry as a novel diagnostic tool for detection of skin cancers</article-title>,&#x201D; <source>IEEE Transactions on Instrumentation and Measurement</source>, vol. <volume>55</volume>, no. <issue>4</issue>, pp. <fpage>1309</fpage>&#x2013;<lpage>1316</lpage>, <year>2006</year>.</mixed-citation></ref>
<ref id="ref-26"><label>[26]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>P.</given-names> <surname>Aberg</surname></string-name>, <string-name><given-names>I.</given-names> <surname>Nicander</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Hansson</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Geladi</surname></string-name>, <string-name><given-names>U.</given-names> <surname>Holmgren</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Skin cancer identification using multifrequency electrical impedance &#x2013; a potential screening tool</article-title>,&#x201D; <source>IEEE Transactions on Biomedical Engineering</source>, vol. <volume>51</volume>, no. <issue>12</issue>, pp. <fpage>2097</fpage>&#x2013;<lpage>2102</lpage>, <year>2004</year>.</mixed-citation></ref>
<ref id="ref-27"><label>[27]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T. T.</given-names> <surname>Do</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Hoang</surname></string-name>, <string-name><given-names>V.</given-names> <surname>Pomponiu</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Zhou</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Chen</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Accessible melanoma detection using smartphones and mobile image analysis</article-title>,&#x201D; <source>IEEE Transactions on Multimedia</source>, vol. <volume>20</volume>, no. <issue>10</issue>, pp. <fpage>2849</fpage>&#x2013;<lpage>2864</lpage>, <year>2018</year>.</mixed-citation></ref>
</ref-list>
</back>
</article>