<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xml:lang="en" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">IASC</journal-id>
<journal-id journal-id-type="nlm-ta">IASC</journal-id>
<journal-id journal-id-type="publisher-id">IASC</journal-id>
<journal-title-group>
<journal-title>Intelligent Automation &#x0026; Soft Computing</journal-title>
</journal-title-group>
<issn pub-type="epub">2326-005X</issn>
<issn pub-type="ppub">1079-8587</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">34719</article-id>
<article-id pub-id-type="doi">10.32604/iasc.2023.034719</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Hyperparameter Tuned Deep Hybrid Denoising Autoencoder Breast Cancer Classification on Digital Mammograms</article-title><alt-title alt-title-type="left-running-head">Hyperparameter Tuned Deep Hybrid Denoising Autoencoder Breast Cancer Classification on Digital Mammograms</alt-title><alt-title alt-title-type="right-running-head">Hyperparameter Tuned Deep Hybrid Denoising Autoencoder Breast Cancer Classification on Digital Mammograms</alt-title>
</title-group>
<contrib-group>
<contrib id="author-1" contrib-type="author" corresp="yes">
<name name-style="western"><surname>Hamza</surname><given-names>Manar Ahmed</given-names></name><email>ma.hamza@psau.edu.sa</email>
</contrib>
<aff id="aff-1"><institution>Department of Computer and Self Development, Preparatory Year Deanship, Prince Sattam bin Abdulaziz University</institution>, <addr-line>AlKharj</addr-line>, <country>Saudi Arabia</country></aff>
</contrib-group><author-notes><corresp id="cor1"><label>&#x002A;</label>Corresponding Author: Manar Ahmed Hamza. Email: <email>ma.hamza@psau.edu.sa</email></corresp></author-notes>
<pub-date date-type="collection" publication-format="electronic"><year>2023</year></pub-date>
<pub-date date-type="pub" publication-format="electronic"><day>9</day><month>3</month><year>2023</year></pub-date>
<volume>36</volume>
<issue>3</issue>
<fpage>2879</fpage>
<lpage>2895</lpage>
<history>
<date date-type="received"><day>25</day><month>7</month><year>2022</year></date>
<date date-type="accepted"><day>14</day><month>11</month><year>2022</year></date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2023 Hamza</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>Hamza</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_IASC_34719.pdf"></self-uri>
<abstract><p>Breast Cancer (BC) is the most commonly diagnosed cancer in women worldwide, affecting one in eight women over a lifetime. Mammography screening is a standard method for identifying the malignancy of suspicious masses at an early stage. However, early identification of masses in mammograms remains challenging for dense and extremely dense breast categories and calls for effective, automatic mechanisms to assist radiologists in diagnosis. Deep learning (DL) techniques are broadly utilized in medical imaging applications, particularly breast mass classification. Advancements in the DL field have paved the way for highly intelligent and self-reliant computer-aided diagnosis (CAD) systems, since the learning capability of Machine Learning (ML) techniques is constantly improving. This paper presents a new Hyperparameter Tuned Deep Hybrid Denoising Autoencoder Breast Cancer Classification (HTDHDAE-BCC) model for digital mammograms. The presented HTDHDAE-BCC model examines mammogram images for the identification of BC. In the HTDHDAE-BCC model, the initial image preprocessing stage is carried out using an average median filter. In addition, a deep convolutional neural network-based Inception v4 model is employed to generate feature vectors. The parameter tuning process uses the binary spider monkey optimization (BSMO) algorithm. The HTDHDAE-BCC model exploits chameleon swarm optimization (CSO) with the DHDAE model for BC classification. The experimental analysis of the HTDHDAE-BCC model is performed using the MIAS database. The experimental outcomes demonstrate the improvements of the HTDHDAE-BCC model over other recent approaches.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>Digital mammograms</kwd>
<kwd>breast cancer classification</kwd>
<kwd>computer-aided diagnosis</kwd>
<kwd>deep learning</kwd>
<kwd>metaheuristics</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<label>1</label>
<title>Introduction</title>
<p>Breast cancer (BC) has become one of the most common cancers among women; as the name indicates, it starts in the breast and gradually spreads to other body parts [<xref ref-type="bibr" rid="ref-1">1</xref>]. This cancer ranks second among the most common cancers globally, next to lung cancer, and mainly affects the breast glands. BC cells form masses that can be seen in X-ray images [<xref ref-type="bibr" rid="ref-2">2</xref>]. In 2020, almost 1.8 million cancer cases were identified, with BC accounting for 30&#x0025; of them. There exist two kinds of BC: benign and malignant [<xref ref-type="bibr" rid="ref-3">3</xref>]. Cells are categorized based on several features. Identifying BC at an initial stage is critical for reducing the death rate, and medical image analysis is considered an effectual methodology for identifying BC [<xref ref-type="bibr" rid="ref-4">4</xref>]. Different imaging modalities are utilized for diagnosis, such as magnetic resonance imaging (MRI), digital mammography, infrared thermography, and ultrasound (US); among these, mammography is highly recommended. Mammography generates high-quality images for visualizing the breast&#x2019;s internal anatomy. There are various indicators of BC in mammograms [<xref ref-type="bibr" rid="ref-5">5</xref>], including architectural distortions, masses, and microcalcifications (MCs). Masses and MCs are the strongest indicators of cancer at the initial stage, whereas architectural distortions are less significant. Radiologists cannot easily provide a precise manual assessment because of the rising number of mammograms produced in widespread screening [<xref ref-type="bibr" rid="ref-6">6</xref>]. Thus, computer-aided diagnosis (CAD) systems were developed for identifying BC&#x2019;s indicators and enhancing diagnostic accuracy. Such a system simplifies the diagnosis procedure and serves as a second opinion from the radiologist&#x2019;s point of view.</p>
<p>Over the past few years, several authors have recommended numerous solutions for automated cell classification in BC diagnosis [<xref ref-type="bibr" rid="ref-7">7</xref>]. Because of the complicated nature of classical ML pipelines, involving feature extraction, pre-processing, and segmentation, such systems lose accuracy and efficiency. These conventional ML difficulties are addressed by the deep learning (DL) approach, which has emerged recently [<xref ref-type="bibr" rid="ref-8">8</xref>]. This technique can achieve outstanding feature representations for solving object-localization and image-classification tasks. Training a convolutional neural network (CNN) demands a large amount of data, which is lacking in the medical field, particularly for BC [<xref ref-type="bibr" rid="ref-9">9</xref>]. The transfer learning (TL) method from natural-image datasets such as ImageNet, combined with fine-tuning, offers a solution to this issue. The TL idea is used to enhance the performance of different CNN architectures by merging their knowledge. The major benefits of TL are improved classifier accuracy and faster training [<xref ref-type="bibr" rid="ref-10">10</xref>]. A suitable TL approach is model transfer: the network parameters are first pre-trained on the source data, then transferred to the target domain, and finally fine-tuned for superior performance.</p>
<p>In [<xref ref-type="bibr" rid="ref-11">11</xref>], a CNN structure was devised using simplified feature learning and a fine-tuned classification method for separating cancerous and normal mammogram cases. BC is a predominant and deadly illness that results from the mutation of normal tissue into cancerous pathology, and mammograms have become effective and standard tools for BC diagnosis. The presented DL-related method mainly focused on evaluating the pertinence of several feature-learning techniques and improving the learning capability of DL techniques for effective BC identification using a CNN. Hassan et al. [<xref ref-type="bibr" rid="ref-12">12</xref>] introduced an innovative classification algorithm for BC masses based on deep CNNs (DCNNs). The authors examined the use of TL from the pre-trained GoogleNet and AlexNet models, which are suitable for this task, and experimentally determined the optimal DCNN technique for precise categorization by comparing approaches that vary in hyperparameters and design.</p>
<p>Cabrera et al. [<xref ref-type="bibr" rid="ref-13">13</xref>] relied on this network type for categorizing three classes: normal, benign, and malignant. For this reason, the miniMIAS database, which contains relatively few images, was employed, and the TL algorithm was implemented with the pre-trained Inception v3 network. Kavitha et al. [<xref ref-type="bibr" rid="ref-14">14</xref>] provided a novel Optimal Multi-Level Thresholding-based Segmentation with DL-enabled Capsule Network (OMLTS-DLCN) BC diagnosis technique leveraging digital mammograms. The OMLTS-DLCN technique adds an Adaptive Fuzzy-based median filtering (AFF) method as a preprocessing stage for eliminating the noise present in the mammogram images. Further, the presented method includes a CapsNet-based Back Propagation Neural Network (BPNN) classifier, with the feature extraction technique used to identify the existence of BC.</p>
<p>Saffari et al. [<xref ref-type="bibr" rid="ref-15">15</xref>] developed a fully automatic digital breast tissue classification and segmentation approach utilizing advanced DL methods. A conditional generative adversarial network (cGAN) was implemented to segment the dense tissues in mammograms. To obtain a complete mechanism for breast density classification, the authors modelled a CNN for classifying mammograms according to the standard Breast Imaging-Reporting and Data System (BI-RADS). Zahoor et al. [<xref ref-type="bibr" rid="ref-16">16</xref>] aimed to examine a solution for preventing the disease and offered innovative classification techniques to reduce the risk of BC among women. The Modified Entropy Whale Optimization Algorithm (MEWOA) was proposed, based on fusion, for in-depth feature extraction and classification. In the presented technique, the efficient NasNet Mobile and MobileNetV2 models were implemented for simulation.</p>
<p>This paper presents a new Hyperparameter Tuned Deep Hybrid Denoising Autoencoder Breast Cancer Classification (HTDHDAE-BCC) model for digital mammograms. The presented HTDHDAE-BCC model examines mammogram images for the identification of BC. In the HTDHDAE-BCC model, the initial image preprocessing stage is carried out using an average median filter. In addition, a deep convolutional neural network-based Inception v4 model is employed to generate feature vectors. The parameter tuning process uses the binary spider monkey optimization (BSMO) algorithm. The HTDHDAE-BCC model exploits chameleon swarm optimization (CSO) with the DHDAE model for BC classification. The experimental analysis of the HTDHDAE-BCC model is performed using the MIAS database.</p>
</sec>
<sec id="s2">
<label>2</label>
<title>The Proposed BC Classification Model</title>
<p>This paper devised a new HTDHDAE-BCC technique for classifying BC on digital mammograms. The HTDHDAE-BCC model encompasses preprocessing, Inception v4 feature extraction, BSMO-based hyperparameter tuning, DHDAE classification, and CSO hyperparameter optimization. <xref ref-type="fig" rid="fig-1">Fig. 1</xref> displays the overall process of the HTDHDAE-BCC approach.</p>
<fig id="fig-1">
<label>Figure 1</label>
<caption>
<title>Overall process of HTDHDAE-BCC approach</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-1.tif"/>
</fig>
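<p>The preprocessing stage above uses an average median filter. As a rough sketch of that idea (the paper does not spell out the exact recipe here, so blending the median-filtered image with the original is an assumption on our part), a sliding-window median removes impulse noise while the averaging step limits blurring:</p>

```python
import numpy as np

def median_filter(img, size=3):
    """Sliding-window median filter; edges handled by replicate padding."""
    pad = size // 2
    padded = np.pad(img, pad, mode='edge')
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + size, j:j + size])
    return out

def average_median_filter(img, size=3):
    """One plausible reading of the 'average median filter': blend the
    median-filtered image with the original, suppressing impulse noise
    while limiting blur (an assumption, not the authors' exact recipe)."""
    return 0.5 * (img + median_filter(img, size))

noisy = np.array([[1., 1., 1.],
                  [1., 9., 1.],
                  [1., 1., 1.]])
print(median_filter(noisy)[1, 1])  # the impulse value 9 is replaced by 1.0
```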
<sec id="s2_1">
<label>2.1</label>
<title>Feature Extraction Using Inception v4</title>
<p>In this work, the Inception v4 model is employed to generate feature vectors. CNNs remove the need for manual feature extraction when identifying the features used for classifying images, since a CNN extracts features directly from the images. Inception-v4 is a CNN architecture built on earlier iterations of the Inception family, simplifying the structure and utilizing more inception modules than Inception-v3 [<xref ref-type="bibr" rid="ref-17">17</xref>]. The inception module was initially presented in GoogLeNet (Inception-v1): the input passes through 1&#x2009;&#x00D7;&#x2009;1, 3&#x2009;&#x00D7;&#x2009;3, and 5&#x2009;&#x00D7;&#x2009;5 convolutions along with max pooling concurrently, and the branch outputs are concatenated. Inception-v4 employs batch normalization (BN), with ReLU utilized as the activation function to address the saturation issue and the resulting vanishing gradients. Moreover, each 5&#x2009;&#x00D7;&#x2009;5 convolution is replaced by two 3&#x2009;&#x00D7;&#x2009;3 convolutions to minimize parameters while keeping the receptive field size. Additionally, a factorization idea is applied in the convolution layers to diminish the dimensionality and lessen the overfitting complexity.</p>
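<p>The parallel-branch structure described above can be sketched in toy form: single-channel 1x1, 3x3, and 5x5 convolution branches plus 3x3 max pooling, concatenated along a channel axis. This is a minimal illustration of the module's data flow, not the actual Inception v4 implementation:</p>

```python
import numpy as np

def conv2d(x, k):
    """'Same'-padded 2-D convolution of a single-channel image x with kernel k."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def inception_block(x):
    """Toy inception-style block: parallel 1x1, 3x3, 5x5 conv branches
    plus 3x3 max pooling, concatenated along the channel axis."""
    b1 = conv2d(x, np.ones((1, 1)))          # 1x1 branch (identity kernel)
    b3 = conv2d(x, np.ones((3, 3)) / 9.0)    # 3x3 branch (mean kernel)
    b5 = conv2d(x, np.ones((5, 5)) / 25.0)   # 5x5 branch (mean kernel)
    # 3x3 max pooling, stride 1, 'same' padding
    xp = np.pad(x, 1, constant_values=-np.inf)
    pool = np.maximum.reduce([xp[i:i + x.shape[0], j:j + x.shape[1]]
                              for i in range(3) for j in range(3)])
    return np.stack([b1, b3, b5, pool])      # channels-first feature map

img = np.random.rand(8, 8)
feats = inception_block(img)
print(feats.shape)  # (4, 8, 8)
```

In Inception v4 proper, the 5x5 branch is realized as two stacked 3x3 convolutions, which covers the same receptive field with fewer parameters, as noted above.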
</sec>
<sec id="s2_2">
<label>2.2</label>
<title>Hyperparameter Tuning Using BSMO Algorithm</title>
<p>To optimally tune the hyperparameters of the Inception v4 model, the BSMO algorithm is utilized. SMO both discovers novel solutions and exploits existing ones to reach optimal outcomes [<xref ref-type="bibr" rid="ref-18">18</xref>]. The different stages of the traditional SMO process are summarized below for quick reference. The logical operators AND (<inline-formula id="ieqn-1">
<mml:math id="mml-ieqn-1"><mml:mo>&#x2297;</mml:mo><mml:mo stretchy="false">)</mml:mo><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext></mml:math>
</inline-formula> OR <inline-formula id="ieqn-2">
<mml:math id="mml-ieqn-2"><mml:mo stretchy="false">(</mml:mo><mml:mo>+</mml:mo></mml:math>
</inline-formula>), and XOR <inline-formula id="ieqn-3">
<mml:math id="mml-ieqn-3"><mml:mrow><mml:mo>(</mml:mo><mml:mo>&#x2295;</mml:mo><mml:mo>)</mml:mo></mml:mrow></mml:math>
</inline-formula> are applied in the SMO equation to form binary SMO as follows.</p>
<p><bold>Initialization phase</bold></p>
<p>Initialization of the arbitrary binary solution is as follows:<disp-formula id="eqn-1"><label>(1)</label>
<mml:math id="mml-eqn-1" display="block"><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mrow><mml:mtable rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd><mml:mrow><mml:mn>0</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext></mml:mrow><mml:mspace width="1em" /><mml:mrow><mml:mi>i</mml:mi><mml:mi>f</mml:mi><mml:mtext>&#x00A0;</mml:mtext><mml:mi>x</mml:mi><mml:mo>&#x2264;</mml:mo><mml:mi>p</mml:mi></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext></mml:mrow><mml:mspace width="1em" /><mml:mrow><mml:mi>o</mml:mi><mml:mi>t</mml:mi><mml:mi>h</mml:mi><mml:mi>e</mml:mi><mml:mi>r</mml:mi><mml:mi>w</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-1">Eq. (1)</xref>, <inline-formula id="ieqn-4">
<mml:math id="mml-ieqn-4"><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> denotes the <inline-formula id="ieqn-5">
<mml:math id="mml-ieqn-5"><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> monkey&#x2019;s <inline-formula id="ieqn-6">
<mml:math id="mml-ieqn-6"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> parameter. <italic>p</italic> indicates the likelihood taken as 0.5, and <italic>x</italic> indicates an arbitrary number within <inline-formula id="ieqn-7">
<mml:math id="mml-ieqn-7"><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mn>0</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:mn>1</mml:mn></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:math>
</inline-formula>. The dimension becomes <inline-formula id="ieqn-8">
<mml:math id="mml-ieqn-8"><mml:mn>0</mml:mn></mml:math>
</inline-formula> when the random value is less than 0.5. After initialization, the logical operators update the position of the SM.</p>
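<p>Eq. (1) amounts to a Bernoulli initialization of the binary population; a minimal sketch (names such as <monospace>init_population</monospace> are illustrative):</p>

```python
import random

def init_population(num_monkeys, dim, p=0.5, seed=0):
    """Eq. (1): bit SM[i][j] is 0 when a uniform random draw x <= p,
    and 1 otherwise."""
    rng = random.Random(seed)
    return [[0 if rng.random() <= p else 1 for _ in range(dim)]
            for _ in range(num_monkeys)]

pop = init_population(num_monkeys=5, dim=8)
print(pop[0])  # one 8-bit candidate solution
```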
<p><bold>Local leader (LL) stage (LLS)</bold><disp-formula id="eqn-2"><label>(2)</label>
<mml:math id="mml-eqn-2" display="block"><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:mi>w</mml:mi><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mrow><mml:mtable rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd columnalign="left"><mml:mrow><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2295;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>b</mml:mi><mml:mo>&#x2297;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>L</mml:mi><mml:mi>L</mml:mi><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>k</mml:mi><mml:mi>j</mml:mi><mml:mo>&#x2295;</mml:mo><mml:mi>S</mml:mi><mml:mi>M</mml:mi><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>d</mml:mi><mml:mo>&#x2297;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>S</mml:mi><mml:mi>M</mml:mi><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>r</mml:mi><mml:mi>j</mml:mi><mml:mo>&#x2295;</mml:mo><mml:mi>S</mml:mi><mml:mi>M</mml:mi><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo></mml:mrow></mml:mtd><mml:mtd><mml:mspace width="1em" /><mml:mrow><mml:mi>i</mml:mi><mml:mi>f</mml:mi><mml:mtext>&#x00A0;</mml:mtext><mml:mi>b</mml:mi><mml:mo>&#x2265;</mml:mo><mml:mi>p</mml:mi><mml:mi>r</mml:mi></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd columnalign="left"><mml:mrow><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi><mml:mo>,</mml:mo></mml:mrow></mml:msub></mml:mrow></mml:mtd><mml:mtd><mml:mspace 
width="1em" /><mml:mrow><mml:mi>o</mml:mi><mml:mi>t</mml:mi><mml:mi>h</mml:mi><mml:mi>e</mml:mi><mml:mi>r</mml:mi><mml:mi>w</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mo>.</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-2">Eq. (2)</xref>, <inline-formula id="ieqn-9">
<mml:math id="mml-ieqn-9"><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:mi>w</mml:mi><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> represents the upgraded location of <inline-formula id="ieqn-10">
<mml:math id="mml-ieqn-10"><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup><mml:mi>S</mml:mi><mml:mi>M</mml:mi></mml:math>
</inline-formula>,&#x00A0;<inline-formula id="ieqn-11">
<mml:math id="mml-ieqn-11"><mml:mi>L</mml:mi><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> denotes the <inline-formula id="ieqn-12">
<mml:math id="mml-ieqn-12"><mml:msup><mml:mi>k</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> local group leader in the <inline-formula id="ieqn-13">
<mml:math id="mml-ieqn-13"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> parameter, <inline-formula id="ieqn-14">
<mml:math id="mml-ieqn-14"><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>r</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> indicates <inline-formula id="ieqn-15">
<mml:math id="mml-ieqn-15"><mml:msup><mml:mi>r</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> SM in <inline-formula id="ieqn-16">
<mml:math id="mml-ieqn-16"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> variable so that <inline-formula id="ieqn-17">
<mml:math id="mml-ieqn-17"><mml:mi>r</mml:mi><mml:mo>&#x2260;</mml:mo><mml:mi>i</mml:mi><mml:mo>;</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:mspace width="thinmathspace" /><mml:mi>b</mml:mi></mml:math>
</inline-formula> and <italic>d</italic> denote binary random numbers in <inline-formula id="ieqn-18">
<mml:math id="mml-ieqn-18"><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mn>0</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:mn>1</mml:mn></mml:mrow><mml:mo>]</mml:mo></mml:mrow><mml:mo>.</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:mi>p</mml:mi><mml:mi>r</mml:mi></mml:math>
</inline-formula> denotes the perturbation rate.</p>
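<p>The local-leader update in Eq. (2) can be sketched bitwise, mapping the paper's AND, OR, and XOR operators onto Python's <monospace>&amp;</monospace>, <monospace>|</monospace>, and <monospace>^</monospace> (function and variable names are illustrative):</p>

```python
import random

def ll_update(sm_i, ll_k, sm_r, pr=0.1, seed=1):
    """Eq. (2): bitwise local-leader update. b and d are fresh random
    bits per dimension; pr is the perturbation rate. A dimension is
    perturbed only when b >= pr, otherwise the old bit is kept."""
    rng = random.Random(seed)
    new = []
    for j in range(len(sm_i)):
        b, d = rng.randint(0, 1), rng.randint(0, 1)
        if b >= pr:
            new.append(sm_i[j] ^ ((b & (ll_k[j] ^ sm_i[j]))
                                  | (d & (sm_r[j] ^ sm_i[j]))))
        else:
            new.append(sm_i[j])
    return new

sm      = [1, 0, 1, 1, 0, 0]   # current monkey
leader  = [1, 1, 0, 1, 0, 1]   # local leader
partner = [0, 0, 1, 0, 1, 1]   # random peer SM_r, r != i
print(ll_update(sm, leader, partner))
```

Note that when b = 1 and d = 0, the update reduces to copying the local leader's bit, which is the attraction behaviour the stage is designed to produce.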
<p><bold>Global leader stage (GLS)</bold><disp-formula id="eqn-3"><label>(3)</label>
<mml:math id="mml-eqn-3" display="block"><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:mi>w</mml:mi><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2295;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>b</mml:mi><mml:mo>&#x2297;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>G</mml:mi><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2295;</mml:mo><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>d</mml:mi><mml:mo>&#x2297;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>r</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2295;</mml:mo><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:mi>i</mml:mi><mml:mi>f</mml:mi><mml:mspace width="thinmathspace" /><mml:mi>b</mml:mi><mml:mo>&#x003C;</mml:mo><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>o</mml:mi><mml:msub><mml:mi>b</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:math>
</disp-formula></p>
<p>Now, <inline-formula id="ieqn-19">
<mml:math id="mml-ieqn-19"><mml:mi>G</mml:mi><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> indicates the <inline-formula id="ieqn-20">
<mml:math id="mml-ieqn-20"><mml:msup><mml:mi>k</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> GL in <inline-formula id="ieqn-21">
<mml:math id="mml-ieqn-21"><mml:mi>t</mml:mi><mml:mi>h</mml:mi><mml:mi>e</mml:mi><mml:mtext>&#x00A0;</mml:mtext><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> parameter. The updated location depends on the following formula:<disp-formula id="eqn-4"><label>(4)</label>
<mml:math id="mml-eqn-4" display="block"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>o</mml:mi><mml:msub><mml:mi>b</mml:mi><mml:mi>i</mml:mi></mml:msub><mml:mo>=</mml:mo><mml:mn>0.9</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mrow><mml:mfrac><mml:mrow><mml:mi>f</mml:mi><mml:mi>i</mml:mi><mml:msub><mml:mi>t</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:mrow><mml:mrow><mml:mrow><mml:mi mathvariant="normal">m</mml:mi><mml:mi mathvariant="normal">a</mml:mi></mml:mrow><mml:msub><mml:mrow><mml:mi mathvariant="normal">x</mml:mi></mml:mrow><mml:mrow><mml:msup><mml:mi>f</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:msup></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:mrow><mml:mo>+</mml:mo><mml:mn>0.1</mml:mn></mml:mstyle></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-4">Eq. (4)</xref>, <inline-formula id="ieqn-22">
<mml:math id="mml-ieqn-22"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>o</mml:mi><mml:msub><mml:mi>b</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:math>
</inline-formula> denotes the probability, <inline-formula id="ieqn-23">
<mml:math id="mml-ieqn-23"><mml:mi>f</mml:mi><mml:mi>i</mml:mi><mml:msub><mml:mi>t</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:math>
</inline-formula> indicates the fitness of <inline-formula id="ieqn-24">
<mml:math id="mml-ieqn-24"><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> SM, and <inline-formula id="ieqn-25">
<mml:math id="mml-ieqn-25"><mml:mrow><mml:mi mathvariant="normal">m</mml:mi><mml:mi mathvariant="normal">a</mml:mi></mml:mrow><mml:msub><mml:mrow><mml:mi mathvariant="normal">x</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mi>l</mml:mi></mml:msub><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> shows the maximal fitness of the group.</p>
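<p>Eq. (4) is a direct scaling of fitness into a selection probability; a one-line sketch:</p>

```python
def selection_prob(fit_i, max_fit):
    """Eq. (4): prob_i = 0.9 * fit_i / max_fit + 0.1; the +0.1 floor
    keeps even the weakest monkey selectable in the global leader stage."""
    return 0.9 * fit_i / max_fit + 0.1

probs = [selection_prob(f, 10.0) for f in (2.0, 5.0, 10.0)]
print(probs)  # increasing with fitness, bounded in [0.1, 1.0]
```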
<p><bold>Local leader decision stage (LLDS)</bold><disp-formula id="eqn-5"><label>(5)</label>
<mml:math id="mml-eqn-5" display="block"><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:mi>w</mml:mi><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mrow><mml:mtable rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd columnalign="left"><mml:mrow><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2295;</mml:mo><mml:mstyle scriptlevel="0"><mml:mrow><mml:mo maxsize="1.2em" minsize="1.2em">(</mml:mo></mml:mrow></mml:mstyle><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>b</mml:mi><mml:mo>&#x2297;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>L</mml:mi><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2295;</mml:mo><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>b</mml:mi><mml:mo>&#x2297;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>G</mml:mi><mml:msub><mml:mi>L</mml:mi><mml:mi>j</mml:mi></mml:msub><mml:mo>&#x2295;</mml:mo><mml:mi>S</mml:mi><mml:msub><mml:mi>M</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo stretchy="false">)</mml:mo><mml:mo stretchy="false">)</mml:mo><mml:mstyle scriptlevel="0"><mml:mrow><mml:mo maxsize="1.2em" minsize="1.2em">)</mml:mo></mml:mrow></mml:mstyle><mml:mspace width="1em" /><mml:mrow><mml:mi>i</mml:mi><mml:mi>f</mml:mi><mml:mtext>&#x00A0;</mml:mtext><mml:mi>b</mml:mi><mml:mo>&#x2265;</mml:mo><mml:mi>p</mml:mi><mml:mi>r</mml:mi></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd 
columnalign="left"><mml:mrow><mml:mi>u</mml:mi><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mtext>&#x00A0;</mml:mtext><mml:mi>E</mml:mi><mml:mi>q</mml:mi><mml:mi>u</mml:mi><mml:mi>a</mml:mi><mml:mi>t</mml:mi><mml:mi>i</mml:mi><mml:mi>o</mml:mi><mml:mi>n</mml:mi><mml:mtext>&#x00A0;</mml:mtext><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x00A0;&#x00A0;</mml:mtext><mml:mi>o</mml:mi><mml:mi>t</mml:mi><mml:mi>h</mml:mi><mml:mi>e</mml:mi><mml:mi>r</mml:mi><mml:mi>w</mml:mi><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mo>.</mml:mo></mml:mrow><mml:mspace width="1em" /><mml:mtext>&#x00A0;</mml:mtext></mml:mtd></mml:mtr></mml:mtable></mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-5">Eq. (5)</xref>, <inline-formula id="ieqn-26">
<mml:math id="mml-ieqn-26"><mml:mi>L</mml:mi><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> indicates the <inline-formula id="ieqn-27">
<mml:math id="mml-ieqn-27"><mml:msup><mml:mi>k</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> LL in <inline-formula id="ieqn-28">
<mml:math id="mml-ieqn-28"><mml:mi>t</mml:mi><mml:mi>h</mml:mi><mml:mi>e</mml:mi><mml:mtext>&#x00A0;</mml:mtext><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> parameter, <inline-formula id="ieqn-29">
<mml:math id="mml-ieqn-29"><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:mi>d</mml:mi><mml:mtext>&#x00A0;</mml:mtext><mml:mi>G</mml:mi><mml:msub><mml:mi>L</mml:mi><mml:mi>j</mml:mi></mml:msub></mml:math>
</inline-formula> denotes GL in the <inline-formula id="ieqn-30">
<mml:math id="mml-ieqn-30"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> variable.</p>
<p><bold>Global leader decision stage (GLDS)</bold></p>
<p>If the GL&#x2019;s position has not changed within the global leader limit <inline-formula id="ieqn-31">
<mml:math id="mml-ieqn-31"><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>G</mml:mi><mml:mi>L</mml:mi><mml:mi>L</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</inline-formula> time, GL divides the entire group into subcategories. Once the class number is equivalent to the maximal group <inline-formula id="ieqn-32">
<mml:math id="mml-ieqn-32"><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>M</mml:mi><mml:mi>G</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</inline-formula> value, then it is regarded that the maximal potential subgroup has been generated. Next, GL integrates each subgroup to generate a single group. The position of LL is eventually upgraded.</p>
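<p>As a concrete illustration, the fission-fusion behaviour of the GLDS described above can be sketched in Python. The function name and the round-robin redistribution of members are illustrative assumptions, not part of the paper:</p>

```python
import random

def global_leader_decision(groups, gl_count, GLL, MG):
    """Sketch of the global leader decision stage (GLDS).

    groups   : list of subgroups, each a list of candidate solutions
    gl_count : iterations for which the GL fitness has not improved
    GLL      : global leader limit
    MG       : maximum number of subgroups
    """
    if gl_count < GLL:
        return groups                       # GL still improving: no change
    members = [m for g in groups for m in g]
    if len(groups) < MG:
        # Fission: redistribute the population into one more subgroup
        k = len(groups) + 1
        random.shuffle(members)
        return [members[i::k] for i in range(k)]
    # Fusion: the maximal number of subgroups was reached, merge into one
    return [members]

groups = global_leader_decision([[1, 2, 3, 4], [5, 6]], gl_count=5, GLL=3, MG=4)
```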
</sec>
<sec id="s2_3">
<label>2.3</label>
<title>BC Classification Using Optimal DHDAE Model</title>
<p>For BC classification, the DHDAE model is exploited in this study. The DHDAE, an autoencoder (AE) variant of the deep hybrid Boltzmann machine (DHBM), follows the same path as the preceding subsection but starts from the HSDA base structural design [<xref ref-type="bibr" rid="ref-19">19</xref>]. It also borrows the structure of the DHBM, containing the bi-directional connections required for integrating bottom-up and top-down influences. However, rather than learning through a Boltzmann-based method, a stochastic encoder and decoder procedure is learned jointly over several layers. The DHDAE can be viewed as stacking a strongly incorporated hybrid denoising autoencoder (HdA) with a coupled predictor. An HdA is a single hidden-layer MLP that shares its input-to-hidden weights with an encoder-decoder module whose weights are tied (viz., the decoding weights equal the transpose of the encoding weights).</p>
<p>A 3-layer form of the joint models (generalized to <italic>L</italic> layer and determined as the similar variable set as DHBM) is quantified as the encoder <inline-formula id="ieqn-33">
<mml:math id="mml-ieqn-33"><mml:mrow><mml:mo>(</mml:mo><mml:mspace width="thinmathspace" /><mml:msubsup><mml:mi>f</mml:mi><mml:mi>&#x03B8;</mml:mi><mml:mn>1</mml:mn></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>y</mml:mi><mml:mo>,</mml:mo><mml:mtext>&#x00A0;&#x00A0;</mml:mtext><mml:mrow><mml:mover><mml:mi>X</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:mrow><mml:mover><mml:mrow><mml:msup><mml:mi>h</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mtext>&#x00A0;&#x00A0;</mml:mtext><mml:msubsup><mml:mi>f</mml:mi><mml:mi>&#x03B8;</mml:mi><mml:mn>2</mml:mn></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mover><mml:mrow><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup></mml:mrow><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mstyle scriptlevel="0"><mml:mrow><mml:mo maxsize="1.623em" minsize="1.623em">}</mml:mo></mml:mrow></mml:mstyle><mml:mo>)</mml:mo></mml:mrow></mml:math>
</inline-formula> and decoder <inline-formula id="ieqn-34">
<mml:math id="mml-ieqn-34"><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>g</mml:mi><mml:mi>&#x03B8;</mml:mi></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mover><mml:mrow><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup></mml:mrow><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</inline-formula> functions:</p>
<p><disp-formula id="eqn-6"><label>(6)</label>
<mml:math id="mml-eqn-6" display="block"><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup><mml:mo>=</mml:mo><mml:msubsup><mml:mi>f</mml:mi><mml:mi>&#x03B8;</mml:mi><mml:mn>1</mml:mn></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mover><mml:mi>X</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mo>,</mml:mo><mml:mtext>&#x00A0;&#x00A0;</mml:mtext><mml:mrow><mml:mover><mml:mrow><mml:msup><mml:mi>h</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mi>&#x03D5;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:msup><mml:mi>W</mml:mi><mml:mn>1</mml:mn></mml:msup><mml:mi>X</mml:mi><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msup><mml:mi>W</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:msup><mml:mi></mml:mi><mml:mi>T</mml:mi></mml:msup><mml:mrow><mml:mover><mml:mi>h</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p><disp-formula id="eqn-7"><label>(7)</label>
<mml:math id="mml-eqn-7" display="block"><mml:msup><mml:mi>h</mml:mi><mml:mn>2</mml:mn></mml:msup><mml:mo>=</mml:mo><mml:msubsup><mml:mi>f</mml:mi><mml:mi>&#x03B8;</mml:mi><mml:mn>2</mml:mn></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mover><mml:mrow><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup></mml:mrow><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mi>&#x03D5;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msup><mml:mi>W</mml:mi><mml:mn>2</mml:mn></mml:msup><mml:msup><mml:mrow><mml:mrow><mml:mover><mml:mi>h</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mn>1</mml:mn></mml:msup></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p><disp-formula id="eqn-8"><label>(8)</label>
<mml:math id="mml-eqn-8" display="block"><mml:mrow><mml:mover><mml:mi>x</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow><mml:mo>=</mml:mo><mml:msub><mml:mi>g</mml:mi><mml:mi>&#x03B8;</mml:mi></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mover><mml:mrow><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup></mml:mrow><mml:mo>&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mi>&#x03D5;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msup><mml:mi>W</mml:mi><mml:mn>1</mml:mn></mml:msup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:msup><mml:mi></mml:mi><mml:mi>T</mml:mi></mml:msup><mml:msup><mml:mrow><mml:mrow><mml:mover><mml:mi>h</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow></mml:mrow><mml:mn>1</mml:mn></mml:msup></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p><inline-formula id="ieqn-35">
<mml:math id="mml-ieqn-35"><mml:mrow><mml:mo>{</mml:mo><mml:mrow><mml:msup><mml:mi>W</mml:mi><mml:mn>1</mml:mn></mml:msup><mml:mo>,</mml:mo><mml:mtext>&#x00A0;&#x00A0;</mml:mtext><mml:msup><mml:mi>W</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mo>}</mml:mo></mml:mrow></mml:math>
</inline-formula> denotes the weight matrix interconnecting input <italic>X</italic> to <inline-formula id="ieqn-36">
<mml:math id="mml-ieqn-36"><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup></mml:math>
</inline-formula> and <inline-formula id="ieqn-37">
<mml:math id="mml-ieqn-37"><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup></mml:math>
</inline-formula> to <inline-formula id="ieqn-38">
<mml:math id="mml-ieqn-38"><mml:msup><mml:mi>h</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:math>
</inline-formula> correspondingly (the superscript <italic>T</italic> signifies a matrixes transpose function). The output operation <inline-formula id="ieqn-39">
<mml:math id="mml-ieqn-39"><mml:msub><mml:mi>o</mml:mi><mml:mi>&#x03B8;</mml:mi></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:mi>y</mml:mi><mml:mo fence="false" stretchy="false">|</mml:mo><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup><mml:mo>,</mml:mo><mml:mtext>&#x00A0;&#x00A0;</mml:mtext><mml:msup><mml:mi>h</mml:mi><mml:mn>2</mml:mn></mml:msup><mml:mo stretchy="false">)</mml:mo></mml:math>
</inline-formula> required to produce prediction is evaluated through <xref ref-type="disp-formula" rid="eqn-6">Eq. (6)</xref>, similar to the DHBM method. Similar to HSDA, the DHDAE exploits a stochastic mapping function <inline-formula id="ieqn-40">
<mml:math id="mml-ieqn-40"><mml:msubsup><mml:mrow><mml:mover><mml:mi>h</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mi>t</mml:mi><mml:mi>l</mml:mi></mml:msubsup><mml:mo>&#x223C;</mml:mo><mml:msub><mml:mi>q</mml:mi><mml:mi>D</mml:mi></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:msubsup><mml:mrow><mml:mover><mml:mi>h</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mi>t</mml:mi><mml:mi>l</mml:mi></mml:msubsup><mml:mo fence="false" stretchy="false">|</mml:mo><mml:mi>h</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math>
</inline-formula> for corrupting the input vector (randomly masking entries with a given probability). Note that the DHDAE requires fewer matrix operations than the DHBM because it embodies the &#x201C;weak multiple-level semi-supervised hypothesis&#x201D;, giving it a speed advantage over the DHBM.</p>
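<p>A minimal NumPy sketch of Eqs. (6)&#x2013;(8) with tied weights and the masking corruption is given below. The layer sizes, the sigmoid non-linearity, the masking probability, and the zero initial top-down guess are illustrative assumptions, not values from the paper:</p>

```python
import numpy as np

rng = np.random.default_rng(0)
phi = lambda z: 1.0 / (1.0 + np.exp(-z))    # element-wise non-linearity

def corrupt(v, p=0.3):
    """Stochastic mapping q_D: randomly mask entries with probability p."""
    return v * (rng.random(v.shape) >= p)

# Illustrative sizes: input d=8, hidden layers of 6 and 4 units
d, n1, n2 = 8, 6, 4
W1 = rng.normal(scale=0.1, size=(n1, d))    # connects input X to h^1
W2 = rng.normal(scale=0.1, size=(n2, n1))   # connects h^1 to h^2

x = rng.random(d)
x_hat = corrupt(x)                  # corrupted input
h2_hat = np.zeros(n2)               # initial top-down guess (e.g. from Q_rec)

h1 = phi(W1 @ x_hat + W2.T @ h2_hat)    # Eq. (6): bottom-up + top-down
h2 = phi(W2 @ h1)                       # Eq. (7)
x_bar = phi(W1.T @ h1)                  # Eq. (8): tied-weight decoder
```

<p>The decoder reuses the transpose of the encoding matrix, which is the weight-tying property stated above.</p>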
<p>The information flow used to collect layer-by-layer statistics is specified by arrows, each of which characterizes an operation (such as a matrix multiplication followed by an element-wise non-linearity) applied to the vector at the arrow&#x2019;s origin with the appropriate layer parameter matrix; the arrows are grouped by the consecutive computation steps taken to evaluate them. Arrows (or operations) in the same computation step are alike and point to the resulting activation value. <italic>v</italic> corresponds to the detection network <inline-formula id="ieqn-41">
<mml:math id="mml-ieqn-41"><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>Q</mml:mi><mml:mrow><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</inline-formula> primary guess for the mean-field, whereas <inline-formula id="ieqn-42">
<mml:math id="mml-ieqn-42"><mml:mi>&#x03BC;</mml:mi></mml:math>
</inline-formula> characterizes the real mean-field statistics. <xref ref-type="fig" rid="fig-2">Fig. 2</xref> showcases the infrastructure of AE.</p>
<fig id="fig-2">
<label>Figure 2</label>
<caption>
<title>Structure of autoencoder</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-2.tif"/>
</fig>
<p>To evaluate layer-by-layer activation value for the DHDAE, one uses the detection model to attain primary guess for <inline-formula id="ieqn-43">
<mml:math id="mml-ieqn-43"><mml:mrow><mml:mo>{</mml:mo><mml:mrow><mml:msup><mml:mi>h</mml:mi><mml:mn>1</mml:mn></mml:msup><mml:mo>,</mml:mo><mml:mtext>&#x00A0;&#x00A0;</mml:mtext><mml:msup><mml:mi>h</mml:mi><mml:mn>2</mml:mn></mml:msup></mml:mrow><mml:mo>}</mml:mo></mml:mrow></mml:math>
</inline-formula>.</p>
<p>For optimal hyperparameter tuning of the DHDAE, the CSO algorithm is used. The CSO algorithm simulates the hunting and food-searching behaviour of chameleons [<xref ref-type="bibr" rid="ref-20">20</xref>]. Chameleons comprise many distinct species that can change colour to blend with their surroundings. They survive in semi-desert areas, lowlands, mountains, and deserts, and usually eat insects. Their food-hunting procedure includes the following phases: tracking, pursuing, and attacking the prey using their sight. The mathematical steps and models are described in the succeeding paragraphs.</p>
<p>The CSO algorithm is a population-based meta-heuristic that randomly produces an initial population to begin the optimization procedure. A chameleon population of size <italic>n</italic> is generated in a <inline-formula id="ieqn-44">
<mml:math id="mml-ieqn-44"><mml:mi>d</mml:mi></mml:math>
</inline-formula>-dimensional search region, in which every individual of the population is a feasible solution to the optimization problem. The position of a chameleon at iteration <italic>t</italic> in the search region can be written as follows:<disp-formula id="eqn-9"><label>(9)</label>
<mml:math id="mml-eqn-9" display="block"><mml:msubsup><mml:mi>y</mml:mi><mml:mi>t</mml:mi><mml:mi>i</mml:mi></mml:msubsup><mml:mo>=</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:msubsup><mml:mi>y</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mo>,</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mi>i</mml:mi></mml:msubsup><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:msubsup><mml:mi>y</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mo>,</mml:mo><mml:mn>2</mml:mn></mml:mrow><mml:mi>i</mml:mi></mml:msubsup><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:msubsup><mml:mi>y</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mo>,</mml:mo><mml:mi>d</mml:mi></mml:mrow><mml:mi>i</mml:mi></mml:msubsup></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:math>
</disp-formula>where <inline-formula id="ieqn-45">
<mml:math id="mml-ieqn-45"><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext><mml:mn>2</mml:mn><mml:mo>&#x2026;</mml:mo><mml:mi>n</mml:mi></mml:math>
</inline-formula> indexes the chameleons, <italic>t</italic> characterizes the iteration count, and <inline-formula id="ieqn-46">
<mml:math id="mml-ieqn-46"><mml:msubsup><mml:mi>y</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mo>,</mml:mo><mml:mi>d</mml:mi></mml:mrow><mml:mi>i</mml:mi></mml:msubsup></mml:math>
</inline-formula> characterizes the chameleon location.</p>
<p>The initialized population is produced according to the problem dimension, and the chameleon count in the searching region is shown below:<disp-formula id="eqn-10"><label>(10)</label>
<mml:math id="mml-eqn-10" display="block"><mml:msup><mml:mi>y</mml:mi><mml:mi>i</mml:mi></mml:msup><mml:mo>=</mml:mo><mml:msub><mml:mi>l</mml:mi><mml:mi>j</mml:mi></mml:msub><mml:mo>+</mml:mo><mml:mi>r</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msub><mml:mi>u</mml:mi><mml:mi>j</mml:mi></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>l</mml:mi><mml:mi>j</mml:mi></mml:msub></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-10">Eq. (10)</xref>, <inline-formula id="ieqn-47">
<mml:math id="mml-ieqn-47"><mml:msup><mml:mi>y</mml:mi><mml:mi>i</mml:mi></mml:msup></mml:math>
</inline-formula> represents the primary vector of <inline-formula id="ieqn-48">
<mml:math id="mml-ieqn-48"><mml:mi>i</mml:mi></mml:math>
</inline-formula>-<inline-formula id="ieqn-49">
<mml:math id="mml-ieqn-49"><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:math>
</inline-formula> chameleons, <inline-formula id="ieqn-50">
<mml:math id="mml-ieqn-50"><mml:msub><mml:mi>u</mml:mi><mml:mi>j</mml:mi></mml:msub></mml:math>
</inline-formula> and <inline-formula id="ieqn-51">
<mml:math id="mml-ieqn-51"><mml:msub><mml:mi>l</mml:mi><mml:mi>j</mml:mi></mml:msub></mml:math>
</inline-formula> denote the upper and lower bounds of the search region, respectively, and <italic>r</italic> is a uniformly distributed value in [0, 1]. The solution quality at every step is evaluated for every new location by assessing the objective function.</p>
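<p>Eq. (10) can be sketched directly in NumPy; the population size, bounds, and random seed below are illustrative:</p>

```python
import numpy as np

def init_population(n, d, lower, upper, seed=0):
    """Eq. (10): y_i = l_j + r (u_j - l_j), with r ~ U[0, 1] per dimension."""
    rng = np.random.default_rng(seed)
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    r = rng.random((n, d))                 # one uniform draw per entry
    return lower + r * (upper - lower)

# e.g. 20 chameleons in a 3-dimensional search region
pop = init_population(20, 3, lower=[0, -1, 10], upper=[1, 1, 20])
```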
<p>The chameleon movement behaviour in searching is represented according to the updated approach of location as follows:<disp-formula id="eqn-11"><label>(11)</label>
<mml:math id="mml-eqn-11" display="block"><mml:msubsup><mml:mi>y</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mrow><mml:mtable rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd columnalign="left"><mml:mrow><mml:msubsup><mml:mi>y</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mi>P</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mi>G</mml:mi><mml:mi>t</mml:mi><mml:mi>j</mml:mi></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:msub><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mi>G</mml:mi><mml:mi>t</mml:mi><mml:mi>j</mml:mi></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mi>y</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:msub><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>,</mml:mo><mml:mspace width="1em" /><mml:msub><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>&#x2265;</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mi>p</mml:mi></mml:msub></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd columnalign="left"><mml:mrow><mml:msubsup><mml:mi>y</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:mi>&#x03BC;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msup><mml:mi>u</mml:mi><mml:mi>j</mml:mi></mml:msup><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mi>l</mml:mi><mml:mi>j</mml:mi></mml:msup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:msub><mml:mi>r</mml:mi><mml:mn>3</mml:mn></mml:msub><mml:mo>+</mml:mo><mml:msubsup><mml:mi>l</mml:mi><mml:mi>b</mml:mi><mml:mi>j</mml:mi></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mtext>sgn</mml:mtext><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>r</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:mi>d</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>0.5</mml:mn></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mspace width="1em" /><mml:msub><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mo>&#x003C;</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mi>p</mml:mi></mml:msub></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-11">Eq. (11)</xref>, <italic>t</italic> and <inline-formula id="ieqn-52">
<mml:math id="mml-ieqn-52"><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math>
</inline-formula> designate the <inline-formula id="ieqn-53">
<mml:math id="mml-ieqn-53"><mml:msup><mml:mi>t</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> and <inline-formula id="ieqn-54">
<mml:math id="mml-ieqn-54"><mml:mo stretchy="false">(</mml:mo><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn><mml:msup><mml:mo stretchy="false">)</mml:mo><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> iterative phase correspondingly. <italic>i</italic> and <italic>j</italic> characterize the <inline-formula id="ieqn-55">
<mml:math id="mml-ieqn-55"><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> chameleons in the <inline-formula id="ieqn-56">
<mml:math id="mml-ieqn-56"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> parameter. <inline-formula id="ieqn-57">
<mml:math id="mml-ieqn-57"><mml:msubsup><mml:mi>y</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup></mml:math>
</inline-formula> and <inline-formula id="ieqn-58">
<mml:math id="mml-ieqn-58"><mml:msubsup><mml:mi>y</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:msub><mml:mi>i</mml:mi><mml:mo>+</mml:mo></mml:msub><mml:mi>j</mml:mi></mml:mrow></mml:msubsup></mml:math>
</inline-formula> denote the current and updated positions, respectively. <inline-formula id="ieqn-59">
<mml:math id="mml-ieqn-59"><mml:msubsup><mml:mi>P</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup></mml:math>
</inline-formula> and <inline-formula id="ieqn-60">
<mml:math id="mml-ieqn-60"><mml:msubsup><mml:mi>G</mml:mi><mml:mi>t</mml:mi><mml:mi>j</mml:mi></mml:msubsup></mml:math>
</inline-formula> denote the personal best and global best positions, respectively.</p>
<p>Here, <inline-formula id="ieqn-61">
<mml:math id="mml-ieqn-61"><mml:msub><mml:mi>P</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:math>
</inline-formula> and <inline-formula id="ieqn-62">
<mml:math id="mml-ieqn-62"><mml:msub><mml:mi>P</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:math>
</inline-formula> are positive numbers that control the exploration capability. <inline-formula id="ieqn-63">
<mml:math id="mml-ieqn-63"><mml:msub><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-64">
<mml:math id="mml-ieqn-64"><mml:msub><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mo>,</mml:mo><mml:mtext>&#x00A0;</mml:mtext></mml:math>
</inline-formula> and <inline-formula id="ieqn-65">
<mml:math id="mml-ieqn-65"><mml:msub><mml:mi>r</mml:mi><mml:mn>3</mml:mn></mml:msub></mml:math>
</inline-formula> indicate uniformly distributed random values in [0, 1]. <inline-formula id="ieqn-66">
<mml:math id="mml-ieqn-66"><mml:msub><mml:mi>r</mml:mi><mml:mi>i</mml:mi></mml:msub></mml:math>
</inline-formula> denotes a uniformly distributed value generated at index <italic>i</italic> in [0, 1]. <inline-formula id="ieqn-67">
<mml:math id="mml-ieqn-67"><mml:msub><mml:mi>P</mml:mi><mml:mi>p</mml:mi></mml:msub></mml:math>
</inline-formula> shows the probability of the chameleon perceiving prey. sgn(rand &#x2212; 0.5) determines the direction of exploration and exploitation, and it takes a value in [<inline-formula id="ieqn-68">
<mml:math id="mml-ieqn-68"><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:math>
</inline-formula>, 1]. <inline-formula id="ieqn-69">
<mml:math id="mml-ieqn-69"><mml:mi>&#x03BC;</mml:mi></mml:math>
</inline-formula> indicates a function of the iteration variable, which decreases with iteration count.</p>
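<p>Under these definitions, the per-chameleon position update of Eq. (11) can be sketched as follows; the values chosen for <italic>P</italic><sub>1</sub>, <italic>P</italic><sub>2</sub>, <italic>P</italic><sub>p</sub>, and &#x03BC; are illustrative assumptions, not values from the paper:</p>

```python
import numpy as np

rng = np.random.default_rng(1)

def update_position(y, P, G, lb, ub, mu, p1=1.0, p2=1.5, Pp=0.1):
    """Eq. (11): per-dimension position update for one chameleon.

    y  : current position (d,);  P : its personal best;  G : global best
    lb, ub : search-region bounds;  mu : decreasing function of the iteration
    p1, p2, Pp : control parameters (illustrative values)
    """
    d = y.shape[0]
    r1, r2, r3 = rng.random(d), rng.random(d), rng.random(d)
    # Prey-tracking branch, applied where r1 >= Pp
    track = y + p1 * (P - G) * r2 + p2 * (G - y) * r1
    # Random-exploration branch, applied where r1 < Pp
    explore = y + mu * ((ub - lb) * r3 + lb) * np.sign(rng.random(d) - 0.5)
    return np.where(r1 >= Pp, track, explore)

# e.g. one chameleon in a 4-dimensional region
y_new = update_position(np.zeros(4), np.ones(4), np.full(4, 2.0),
                        lb=np.full(4, -5.0), ub=np.full(4, 5.0), mu=0.5)
```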
<p>Chameleon&#x2019;s Eyes Rotation able to recognize prey location by rotating their eyes, and that feature assists in spotting the target via 360 degrees steps are given below:<list list-type="bullet"><list-item>
<p>The initial location was the focal point of gravity;</p></list-item><list-item>
<p>The rotation matrix was found out that identifies the prey position;</p></list-item><list-item>
<p>The situation is refreshed through a rotation matrix at the focal point of gravity;</p></list-item><list-item>
<p>At last, they were resumed to the initial location position</p></list-item></list></p>
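<p>These steps can be sketched in 2-D as a rotation about the population&#x2019;s centre of gravity; the restriction to two dimensions and the function name are illustrative (the algorithm applies rotation matrices in <italic>d</italic> dimensions):</p>

```python
import numpy as np

def rotate_about_centroid(y, theta):
    """Rotate positions about their centre of gravity by angle theta (2-D)."""
    c = y.mean(axis=0)                        # focal point of gravity
    R = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]])
    return (y - c) @ R.T + c                  # rotate, then translate back

pts = np.array([[0.0, 0.0], [2.0, 0.0]])
rotated = rotate_about_centroid(pts, np.pi)   # half-turn about the centroid
```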
<p>The chameleon attacks its target once it comes sufficiently close. The chameleon closest to the target is the optimum chameleon and is regarded as the optimum outcome. This chameleon attacks the target with its tongue, and its position is enhanced since it can extend the tongue to twice its length. This helps the chameleon exploit the search space and enables it to catch the target. The speed of the tongue as it is projected toward the target is given mathematically as follows:<disp-formula id="eqn-12"><label>(12)</label>
<mml:math id="mml-eqn-12" display="block"><mml:msubsup><mml:mi>v</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mi>w</mml:mi><mml:msubsup><mml:mi>v</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:msub><mml:mi>c</mml:mi><mml:mn>1</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mi>G</mml:mi><mml:mi>t</mml:mi><mml:mi>j</mml:mi></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mi>y</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>c</mml:mi><mml:mn>2</mml:mn></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mi>P</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mi>y</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:msub><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:msub></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-12">Eq. (12)</xref>, <inline-formula id="ieqn-70">
<mml:math id="mml-ieqn-70"><mml:msubsup><mml:mi>v</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup></mml:math>
</inline-formula> denotes the new velocity of the <inline-formula id="ieqn-71">
<mml:math id="mml-ieqn-71"><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> chameleons in the <inline-formula id="ieqn-72">
<mml:math id="mml-ieqn-72"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> parameter of <inline-formula id="ieqn-73">
<mml:math id="mml-ieqn-73"><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:math>
</inline-formula> iteration, and <inline-formula id="ieqn-74">
<mml:math id="mml-ieqn-74"><mml:msubsup><mml:mi>v</mml:mi><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow></mml:msubsup></mml:math>
</inline-formula> denotes the current velocity of the <inline-formula id="ieqn-75">
<mml:math id="mml-ieqn-75"><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> chameleon in the <inline-formula id="ieqn-76">
<mml:math id="mml-ieqn-76"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math>
</inline-formula> parameter.</p>
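<p>Following the printed form of Eq. (12), the tongue velocity update can be sketched as follows; the inertia weight <italic>w</italic> and constants <italic>c</italic><sub>1</sub>, <italic>c</italic><sub>2</sub> are illustrative assumptions, and the uniform factor <italic>r</italic><sub>2</sub> multiplies only the last term, as in the equation above:</p>

```python
import numpy as np

rng = np.random.default_rng(2)

def tongue_velocity(v, y, P, G, w=0.9, c1=1.75, c2=1.75):
    """Eq. (12): new tongue velocity for one chameleon.

    v : current velocity;  y : current position
    P : personal best position;  G : global best position
    w, c1, c2 : inertia and acceleration constants (illustrative values)
    """
    r2 = rng.random(v.shape[0])    # uniform random factor on the last term
    return w * v + c1 * (G - y) + c2 * (P - y) * r2

v_new = tongue_velocity(np.zeros(3), np.zeros(3), np.ones(3), np.full(3, 2.0))
```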
</sec>
</sec>
<sec id="s3">
<label>3</label>
<title>Experimental Validation</title>
<p>The experimental validation of the HTDHDAE-BCC method is carried out using the MIAS dataset, which contains 322 images under three class labels, as depicted in <xref ref-type="table" rid="table-1">Table 1</xref>. A few sample images are shown in <xref ref-type="fig" rid="fig-3">Fig. 3</xref>.</p>
<table-wrap id="table-1"><label>Table 1</label>
<caption>
<title>Dataset details</title></caption>
<table><colgroup><col align="left"/><col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Class</th>
<th align="left">No. of samples</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Benign</td>
<td align="left">62</td>
</tr>
<tr>
<td align="left">Malignant</td>
<td align="left">51</td>
</tr>
<tr>
<td align="left">Normal</td>
<td align="left">209</td>
</tr>
<tr>
<td align="left"><bold>Total No. of samples</bold></td>
<td align="left"><bold>322</bold></td>
</tr>
</tbody>
</table>
</table-wrap><fig id="fig-3">
<label>Figure 3</label>
<caption>
<title>Sample images (a) Normal, (b) Benign, and (c) Malignant</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-3.tif"/>
</fig>
<p><xref ref-type="fig" rid="fig-4">Fig. 4</xref> exhibits the confusion matrices generated by the HTDHDAE-BCC model over five runs. On run-1, the HTDHDAE-BCC model categorized 62 samples as benign, 49 as malignant, and 207 as normal. Likewise, on run-2, it categorized 62 samples as benign, 48 as malignant, and 207 as normal. Meanwhile, on run-3, it categorized 62 samples as benign, 48 as malignant, and 209 as normal. On run-4, it categorized 62 samples as benign, 49 as malignant, and 206 as normal.</p>
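<p>The per-class measures reported in Table 2 can be recomputed directly from a confusion matrix. In the sketch below, the off-diagonal entries of the run-1 matrix are inferred from the reported specificities (only the diagonal counts are stated above), so the matrix itself is a reconstruction:</p>

```python
import numpy as np

def per_class_metrics(cm):
    """Accuracy, sensitivity, specificity, F-score, and G-mean (in %)
    per class of a confusion matrix (rows = true, cols = predicted)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    out = []
    for k in range(cm.shape[0]):
        tp = cm[k, k]
        fn = cm[k].sum() - tp          # true class k, predicted elsewhere
        fp = cm[:, k].sum() - tp       # other classes predicted as k
        tn = total - tp - fn - fp
        sens = tp / (tp + fn)
        spec = tn / (tn + fp)
        prec = tp / (tp + fp)
        out.append({
            "accuracy": 100 * (tp + tn) / total,
            "sensitivity": 100 * sens,
            "specificity": 100 * spec,
            "f_score": 100 * 2 * prec * sens / (prec + sens),
            "g_mean": 100 * (sens * spec) ** 0.5,
        })
    return out

# Confusion matrix consistent with run-1 (rows: benign, malignant, normal)
cm_run1 = [[62, 0, 0],
           [0, 49, 2],
           [0, 2, 207]]
metrics = per_class_metrics(cm_run1)
```

<p>Rounded to two decimals, the computed values match the run-1 rows of Table 2 (e.g., malignant sensitivity 96.08 and G-mean 97.66).</p>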
<fig id="fig-4">
<label>Figure 4</label>
<caption>
<title>Confusion matrices of HTDHDAE-BCC approach (a) Run1, (b) Run2, (c) Run3, (d) Run4, and (e) Run5</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-4.tif"/>
</fig>
<p><xref ref-type="table" rid="table-2">Table 2</xref> provides detailed BC classification outcomes of the HTDHDAE-BCC model under five distinct runs.</p>
<table-wrap id="table-2"><label>Table 2</label>
<caption>
<title>Result analysis of HTDHDAE-BCC approach with distinct measures and runs</title></caption>
<table><colgroup><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/><col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Labels</th>
<th align="left">Accuracy</th>
<th align="left">Sensitivity</th>
<th align="left">Specificity</th>
<th align="left">F-score</th>
<th align="left">G-mean</th>
</tr>
</thead>
<tbody><tr>
<td align="left" colspan="6">Run-1</td>
</tr>
<tr>
<td align="left">Benign</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
</tr>
<tr>
<td align="left">Malignant</td>
<td align="left">98.76</td>
<td align="left">96.08</td>
<td align="left">99.26</td>
<td align="left">96.08</td>
<td align="left">97.66</td>
</tr>
<tr>
<td align="left">Normal</td>
<td align="left">98.76</td>
<td align="left">99.04</td>
<td align="left">98.23</td>
<td align="left">99.04</td>
<td align="left">98.64</td>
</tr><tr>
<td align="left"><bold>Average</bold></td>
<td align="left"><bold>99.17</bold></td>
<td align="left"><bold>98.37</bold></td>
<td align="left"><bold>99.16</bold></td>
<td align="left"><bold>98.37</bold></td>
<td align="left"><bold>98.76</bold></td>
</tr><tr>
<td align="left" colspan="6">Run-2</td>
</tr>
<tr>
<td align="left">Benign</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
</tr>
<tr>
<td align="left">Malignant</td>
<td align="left">98.45</td>
<td align="left">94.12</td>
<td align="left">99.26</td>
<td align="left">95.05</td>
<td align="left">96.66</td>
</tr>
<tr>
<td align="left">Normal</td>
<td align="left">98.45</td>
<td align="left">99.04</td>
<td align="left">97.35</td>
<td align="left">98.81</td>
<td align="left">98.19</td>
</tr>
<tr>
<td align="left"><bold>Average</bold></td>
<td align="left"><bold>98.96</bold></td>
<td align="left"><bold>97.72</bold></td>
<td align="left"><bold>98.87</bold></td>
<td align="left"><bold>97.95</bold></td>
<td align="left"><bold>98.28</bold></td>
</tr><tr>
<td align="left" colspan="6">Run-3</td>
</tr>
<tr>
<td align="left">Benign</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
</tr>
<tr>
<td align="left">Malignant</td>
<td align="left">99.07</td>
<td align="left">94.12</td>
<td align="left">100.00</td>
<td align="left">96.97</td>
<td align="left">97.01</td>
</tr>
<tr>
<td align="left">Normal</td>
<td align="left">99.07</td>
<td align="left">100.00</td>
<td align="left">97.35</td>
<td align="left">99.29</td>
<td align="left">98.66</td>
</tr><tr>
<td align="left"><bold>Average</bold></td>
<td align="left"><bold>99.38</bold></td>
<td align="left"><bold>98.04</bold></td>
<td align="left"><bold>99.12</bold></td>
<td align="left"><bold>98.75</bold></td>
<td align="left"><bold>98.56</bold></td>
</tr><tr>
<td align="left" colspan="6">Run-4</td>
</tr>
<tr>
<td align="left">Benign</td>
<td align="left">99.69</td>
<td align="left">100.00</td>
<td align="left">99.62</td>
<td align="left">99.20</td>
<td align="left">99.81</td>
</tr>
<tr>
<td align="left">Malignant</td>
<td align="left">98.76</td>
<td align="left">96.08</td>
<td align="left">99.26</td>
<td align="left">96.08</td>
<td align="left">97.66</td>
</tr>
<tr>
<td align="left">Normal</td>
<td align="left">98.45</td>
<td align="left">98.56</td>
<td align="left">98.23</td>
<td align="left">98.80</td>
<td align="left">98.40</td>
</tr><tr>
<td align="left"><bold>Average</bold></td>
<td align="left"><bold>98.96</bold></td>
<td align="left"><bold>98.21</bold></td>
<td align="left"><bold>99.04</bold></td>
<td align="left"><bold>98.03</bold></td>
<td align="left"><bold>98.62</bold></td>
</tr><tr>
<td align="left" colspan="6">Run-5</td>
</tr>
<tr>
<td align="left">Benign</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
<td align="left">100.00</td>
</tr>
<tr>
<td align="left">Malignant</td>
<td align="left">98.45</td>
<td align="left">98.04</td>
<td align="left">98.52</td>
<td align="left">95.24</td>
<td align="left">98.28</td>
</tr>
<tr>
<td align="left">Normal</td>
<td align="left">98.45</td>
<td align="left">98.09</td>
<td align="left">99.12</td>
<td align="left">98.80</td>
<td align="left">98.60</td>
</tr>
<tr>
<td align="left"><bold>Average</bold></td>
<td align="left"><bold>98.96</bold></td>
<td align="left"><bold>98.71</bold></td>
<td align="left"><bold>99.21</bold></td>
<td align="left"><bold>98.01</bold></td>
<td align="left"><bold>98.96</bold></td>
</tr>
</tbody>
</table>
</table-wrap>
<p><xref ref-type="fig" rid="fig-5">Fig. 5</xref> provides the overall classifier results of the HTDHDAE-BCC model on run-1. The figure shows that the HTDHDAE-BCC model has enhanced results under all classes. For instance, the HTDHDAE-BCC model has categorized benign images with <inline-formula id="ieqn-77">
<mml:math id="mml-ieqn-77"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-78">
<mml:math id="mml-ieqn-78"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-79">
<mml:math id="mml-ieqn-79"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-80">
<mml:math id="mml-ieqn-80"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-81">
<mml:math id="mml-ieqn-81"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 100&#x0025;, 100&#x0025;, 100&#x0025;, 100&#x0025;, and 100&#x0025;, respectively. Further, the HTDHDAE-BCC technique has categorized malignant images with <inline-formula id="ieqn-82">
<mml:math id="mml-ieqn-82"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-83">
<mml:math id="mml-ieqn-83"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-84">
<mml:math id="mml-ieqn-84"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-85">
<mml:math id="mml-ieqn-85"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-86">
<mml:math id="mml-ieqn-86"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.76&#x0025;, 96.08&#x0025;, 99.26&#x0025;, 96.08&#x0025;, and 97.66&#x0025;, respectively. In the meantime, the HTDHDAE-BCC technique has categorized normal images with <inline-formula id="ieqn-87">
<mml:math id="mml-ieqn-87"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-88">
<mml:math id="mml-ieqn-88"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-89">
<mml:math id="mml-ieqn-89"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-90">
<mml:math id="mml-ieqn-90"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-91">
<mml:math id="mml-ieqn-91"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.76&#x0025;, 99.04&#x0025;, 98.23&#x0025;, 99.04&#x0025;, and 98.64&#x0025; correspondingly.</p>
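<p>The per-class measures reported above follow the standard one-vs-rest definitions computed from a confusion matrix. A minimal sketch, using an illustrative 3 &#x00D7; 3 confusion matrix rather than the study&#x2019;s actual counts:</p>

```python
# Hedged sketch: per-class accuracy, sensitivity, specificity,
# F-score, and G-mean, computed one-vs-rest from a confusion matrix
# (rows = true class, columns = predicted class). The matrix below
# is illustrative, not the paper's actual counts.

def per_class_metrics(cm, k):
    """Return (accuracy, sensitivity, specificity, F-score, G-mean)
    in percent for class index k of confusion matrix cm."""
    n = sum(sum(row) for row in cm)
    tp = cm[k][k]
    fn = sum(cm[k]) - tp                                  # missed positives
    fp = sum(cm[i][k] for i in range(len(cm)) if i != k)  # false alarms
    tn = n - tp - fn - fp
    accuracy = (tp + tn) / n
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    precision = tp / (tp + fp)
    f_score = 2 * precision * sensitivity / (precision + sensitivity)
    g_mean = (sensitivity * specificity) ** 0.5
    return tuple(round(100 * v, 2) for v in
                 (accuracy, sensitivity, specificity, f_score, g_mean))

# Illustrative matrix for classes (benign, malignant, normal)
cm = [[120, 0, 0],
      [0, 49, 2],
      [0, 1, 104]]
print(per_class_metrics(cm, 0))  # benign: (100.0, 100.0, 100.0, 100.0, 100.0)
```

<p>With these definitions, a class that is never confused with the others (benign here) scores 100&#x0025; on every measure, matching the pattern seen in the tables.</p>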
<fig id="fig-5">
<label>Figure 5</label>
<caption>
<title>Result analysis of HTDHDAE-BCC approach under run-1</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-5.tif"/>
</fig>
<p><xref ref-type="fig" rid="fig-6">Fig. 6</xref> presents the complete classifier results of the HTDHDAE-BCC method on run-2. The figure shows that the HTDHDAE-BCC approach offers enhanced results under all classes. For example, the HTDHDAE-BCC methodology has categorized benign images with <inline-formula id="ieqn-92">
<mml:math id="mml-ieqn-92"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-93">
<mml:math id="mml-ieqn-93"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-94">
<mml:math id="mml-ieqn-94"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-95">
<mml:math id="mml-ieqn-95"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-96">
<mml:math id="mml-ieqn-96"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 100&#x0025;, 100&#x0025;, 100&#x0025;, 100&#x0025;, and 100&#x0025;, respectively. Additionally, the HTDHDAE-BCC technique has categorized malignant images with <inline-formula id="ieqn-97">
<mml:math id="mml-ieqn-97"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-98">
<mml:math id="mml-ieqn-98"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-99">
<mml:math id="mml-ieqn-99"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-100">
<mml:math id="mml-ieqn-100"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-101">
<mml:math id="mml-ieqn-101"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.45&#x0025;, 94.12&#x0025;, 99.26&#x0025;, 95.05&#x0025;, and 96.66&#x0025;, respectively. Meanwhile, the HTDHDAE-BCC algorithm has categorized normal images with <inline-formula id="ieqn-102">
<mml:math id="mml-ieqn-102"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-103">
<mml:math id="mml-ieqn-103"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-104">
<mml:math id="mml-ieqn-104"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-105">
<mml:math id="mml-ieqn-105"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-106">
<mml:math id="mml-ieqn-106"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.45&#x0025;, 99.04&#x0025;, 97.35&#x0025;, 98.81&#x0025;, and 98.19&#x0025; correspondingly.</p>
<fig id="fig-6">
<label>Figure 6</label>
<caption>
<title>Result analysis of HTDHDAE-BCC approach under run-2</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-6.tif"/>
</fig>
<p><xref ref-type="fig" rid="fig-7">Fig. 7</xref> illustrates the inclusive classifier results of the HTDHDAE-BCC technique on run-3. The figure shows that the HTDHDAE-BCC approach presents enhanced results under all classes. For example, the HTDHDAE-BCC method has categorized benign images with <inline-formula id="ieqn-107">
<mml:math id="mml-ieqn-107"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-108">
<mml:math id="mml-ieqn-108"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-109">
<mml:math id="mml-ieqn-109"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-110">
<mml:math id="mml-ieqn-110"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-111">
<mml:math id="mml-ieqn-111"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 100&#x0025;, 100&#x0025;, 100&#x0025;, 100&#x0025;, and 100&#x0025;, correspondingly. Moreover, the HTDHDAE-BCC methodology has categorized malignant images with <inline-formula id="ieqn-112">
<mml:math id="mml-ieqn-112"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-113">
<mml:math id="mml-ieqn-113"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-114">
<mml:math id="mml-ieqn-114"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-115">
<mml:math id="mml-ieqn-115"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-116">
<mml:math id="mml-ieqn-116"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 99.07&#x0025;, 94.12&#x0025;, 100&#x0025;, 96.97&#x0025;, and 97.01&#x0025;, respectively. In parallel, the HTDHDAE-BCC approach has categorized normal images with <inline-formula id="ieqn-117">
<mml:math id="mml-ieqn-117"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-118">
<mml:math id="mml-ieqn-118"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-119">
<mml:math id="mml-ieqn-119"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-120">
<mml:math id="mml-ieqn-120"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-121">
<mml:math id="mml-ieqn-121"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 99.07&#x0025;, 100&#x0025;, 97.35&#x0025;, 99.29&#x0025;, and 98.66&#x0025; correspondingly.</p>
<fig id="fig-7">
<label>Figure 7</label>
<caption>
<title>Result analysis of HTDHDAE-BCC approach under run-3</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-7.tif"/>
</fig>
<p><xref ref-type="fig" rid="fig-8">Fig. 8</xref> portrays the general classifier results of the HTDHDAE-BCC methodology on run-4. The figure shows that the HTDHDAE-BCC technique provides enhanced results under all classes. For example, the HTDHDAE-BCC approach has categorized benign images with <inline-formula id="ieqn-122">
<mml:math id="mml-ieqn-122"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-123">
<mml:math id="mml-ieqn-123"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-124">
<mml:math id="mml-ieqn-124"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-125">
<mml:math id="mml-ieqn-125"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-126">
<mml:math id="mml-ieqn-126"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 99.69&#x0025;, 100&#x0025;, 99.62&#x0025;, 99.20&#x0025;, and 99.81&#x0025; correspondingly. Further, the HTDHDAE-BCC algorithm has categorized malignant images with <inline-formula id="ieqn-127">
<mml:math id="mml-ieqn-127"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-128">
<mml:math id="mml-ieqn-128"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-129">
<mml:math id="mml-ieqn-129"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-130">
<mml:math id="mml-ieqn-130"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-131">
<mml:math id="mml-ieqn-131"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.76&#x0025;, 96.08&#x0025;, 99.26&#x0025;, 96.08&#x0025;, and 97.66&#x0025;, respectively. In the meantime, the HTDHDAE-BCC technique has categorized normal images with <inline-formula id="ieqn-132">
<mml:math id="mml-ieqn-132"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-133">
<mml:math id="mml-ieqn-133"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-134">
<mml:math id="mml-ieqn-134"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-135">
<mml:math id="mml-ieqn-135"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-136">
<mml:math id="mml-ieqn-136"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.45&#x0025;, 98.56&#x0025;, 98.23&#x0025;, 98.80&#x0025;, and 98.40&#x0025; correspondingly.</p>
<fig id="fig-8">
<label>Figure 8</label>
<caption>
<title>Result analysis of HTDHDAE-BCC approach under run-4</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-8.tif"/>
</fig>
<p><xref ref-type="fig" rid="fig-9">Fig. 9</xref> delivers the overall classifier results of the HTDHDAE-BCC approach on run-5. The figure shows that the HTDHDAE-BCC methodology renders enhanced results under all classes. For example, the HTDHDAE-BCC algorithm has categorized benign images with <inline-formula id="ieqn-137">
<mml:math id="mml-ieqn-137"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-138">
<mml:math id="mml-ieqn-138"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-139">
<mml:math id="mml-ieqn-139"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-140">
<mml:math id="mml-ieqn-140"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-141">
<mml:math id="mml-ieqn-141"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 100&#x0025;, 100&#x0025;, 100&#x0025;, 100&#x0025;, and 100&#x0025; correspondingly. Additionally, the HTDHDAE-BCC method has categorized malignant images with <inline-formula id="ieqn-142">
<mml:math id="mml-ieqn-142"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-143">
<mml:math id="mml-ieqn-143"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-144">
<mml:math id="mml-ieqn-144"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-145">
<mml:math id="mml-ieqn-145"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-146">
<mml:math id="mml-ieqn-146"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.45&#x0025;, 98.04&#x0025;, 98.52&#x0025;, 95.24&#x0025;, and 98.28&#x0025;, respectively. In the meantime, the HTDHDAE-BCC approach has categorized normal images with <inline-formula id="ieqn-147">
<mml:math id="mml-ieqn-147"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-143a">
<mml:math id="mml-ieqn-143a"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-148">
<mml:math id="mml-ieqn-148"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-149">
<mml:math id="mml-ieqn-149"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-150">
<mml:math id="mml-ieqn-150"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.45&#x0025;, 98.09&#x0025;, 99.12&#x0025;, 98.80&#x0025;, and 98.60&#x0025; correspondingly.</p>
<fig id="fig-9">
<label>Figure 9</label>
<caption>
<title>Result analysis of HTDHDAE-BCC approach under run-5</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-9.tif"/>
</fig>
<p><xref ref-type="fig" rid="fig-10">Fig. 10</xref> demonstrates the average classification outcomes of the HTDHDAE-BCC model. On run-1, the HTDHDAE-BCC model has attained average <inline-formula id="ieqn-151">
<mml:math id="mml-ieqn-151"><mml:mtext>&#x00A0;</mml:mtext><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-152">
<mml:math id="mml-ieqn-152"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-153">
<mml:math id="mml-ieqn-153"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-154">
<mml:math id="mml-ieqn-154"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-155">
<mml:math id="mml-ieqn-155"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 99.17&#x0025;, 98.37&#x0025;, 99.16&#x0025;, 98.37&#x0025;, and 98.76&#x0025;, respectively. Likewise, on run-2, the HTDHDAE-BCC approach gained average <inline-formula id="ieqn-156">
<mml:math id="mml-ieqn-156"><mml:mtext>&#x00A0;</mml:mtext><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-157">
<mml:math id="mml-ieqn-157"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-158">
<mml:math id="mml-ieqn-158"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-159">
<mml:math id="mml-ieqn-159"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-160">
<mml:math id="mml-ieqn-160"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.96&#x0025;, 97.72&#x0025;, 98.87&#x0025;, 97.95&#x0025;, and 98.28&#x0025;, respectively. Similarly, on run-3, the HTDHDAE-BCC model attained average values of 99.38&#x0025;, 98.04&#x0025;, 99.12&#x0025;, 98.75&#x0025;, and 98.56&#x0025; for the same measures. Additionally, on run-4, the HTDHDAE-BCC model obtained average <inline-formula id="ieqn-161">
<mml:math id="mml-ieqn-161"><mml:mtext>&#x00A0;</mml:mtext><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-162">
<mml:math id="mml-ieqn-162"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-163">
<mml:math id="mml-ieqn-163"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-164">
<mml:math id="mml-ieqn-164"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-165">
<mml:math id="mml-ieqn-165"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.96&#x0025;, 98.21&#x0025;, 99.04&#x0025;, and 98.03&#x0025;, and 98.62&#x0025;, respectively. Finally, on run-5, the HTDHDAE-BCC approach reached average <inline-formula id="ieqn-166">
<mml:math id="mml-ieqn-166"><mml:mtext>&#x00A0;</mml:mtext><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-167">
<mml:math id="mml-ieqn-167"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-168">
<mml:math id="mml-ieqn-168"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, <inline-formula id="ieqn-169">
<mml:math id="mml-ieqn-169"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>s</mml:mi><mml:mi>c</mml:mi><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula>, and <inline-formula id="ieqn-170">
<mml:math id="mml-ieqn-170"><mml:msub><mml:mi>G</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>e</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math>
</inline-formula> of 98.96&#x0025;, 98.71&#x0025;, 99.21&#x0025;, 98.01&#x0025;, and 98.96&#x0025; correspondingly.</p>
<fig id="fig-10">
<label>Figure 10</label>
<caption>
<title>Average analysis of HTDHDAE-BCC approach (a) Run1, (b) Run2, (c) Run3, (d) Run4, and (e) Run5</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-10.tif"/>
</fig>
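<p>The per-run &#x201C;Average&#x201D; rows above are macro averages, i.e., the unweighted mean of each measure across the three classes. A minimal sketch, using the run-1 per-class values from the table (last-digit differences from the printed averages can arise when the averaging was done over unrounded values):</p>

```python
# Hedged sketch: the per-run "Average" rows are macro averages --
# the unweighted mean of each metric over the three classes.
def macro_average(rows):
    """rows: per-class (accu, sens, spec, F-score, G-mean) tuples;
    returns the mean of each position, rounded to two decimals."""
    return tuple(round(sum(r[i] for r in rows) / len(rows), 2)
                 for i in range(len(rows[0])))

# Run-1 per-class values (benign, malignant, normal) from the table
run1 = [(100.00, 100.00, 100.00, 100.00, 100.00),
        (98.76, 96.08, 99.26, 96.08, 97.66),
        (98.76, 99.04, 98.23, 99.04, 98.64)]
print(macro_average(run1))  # first entry: 99.17 (average accuracy)
```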
<p>The training accuracy (TRA) and validation accuracy (VLA) obtained by the HTDHDAE-BCC method on the test dataset are illustrated in <xref ref-type="fig" rid="fig-11">Fig. 11</xref>. The experimental outcome infers that the HTDHDAE-BCC technique achieved maximal values of TRA and VLA. Notably, the VLA appears higher than the TRA.</p>
<fig id="fig-11">
<label>Figure 11</label>
<caption>
<title>TRA and VLA analysis of HTDHDAE-BCC approach</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-11.tif"/>
</fig>
<p>The training loss (TRL) and validation loss (VLL) achieved by the HTDHDAE-BCC approach on the test dataset are shown in <xref ref-type="fig" rid="fig-12">Fig. 12</xref>. The experimental outcome implies that the HTDHDAE-BCC algorithm accomplished the lowest values of TRL and VLL. In particular, the VLL is lower than the TRL.</p>
<fig id="fig-12">
<label>Figure 12</label>
<caption>
<title>TRL and VLL analysis of HTDHDAE-BCC approach</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-12.tif"/>
</fig>
<p>A brief ROC analysis of the HTDHDAE-BCC method on the test dataset is depicted in <xref ref-type="fig" rid="fig-13">Fig. 13</xref>. The results denote that the HTDHDAE-BCC approach can effectively categorize distinct classes on the test dataset.</p>
<fig id="fig-13">
<label>Figure 13</label>
<caption>
<title>ROC analysis of HTDHDAE-BCC approach</title></caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="IASC_34719-fig-13.tif"/>
</fig>
<p>To validate the enhanced results of the HTDHDAE-BCC model, a brief comparative examination is offered in <xref ref-type="table" rid="table-3">Table 3</xref> [<xref ref-type="bibr" rid="ref-21">21</xref>]. The results confirm the improved performance of the HTDHDAE-BCC model over existing techniques. Based on <inline-formula id="ieqn-171">
<mml:math id="mml-ieqn-171"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, the HTDHDAE-BCC model has offered a higher <inline-formula id="ieqn-172">
<mml:math id="mml-ieqn-172"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula> of 99.38&#x0025;, whereas the deep computer-aided diagnosis (deep-CAD), CNN, improved CNN, Opt. DL and DL-TLT models have obtained lower <inline-formula id="ieqn-173">
<mml:math id="mml-ieqn-173"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula> of 92.47&#x0025;, 85.87&#x0025;, 86.25&#x0025;, 91.93&#x0025;, and 98.90&#x0025; respectively.</p>
<table-wrap id="table-3"><label>Table 3</label>
<caption>
<title>Comparative analysis of HTDHDAE-BCC approach with existing algorithms</title></caption>
<table><colgroup><col align="left"/><col align="left"/><col align="left"/><col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Methods</th>
<th align="left">Accuracy</th>
<th align="left">Sensitivity</th>
<th align="left">Specificity</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">HTDHDAE-BCC</td>
<td align="left">99.38</td>
<td align="left">98.04</td>
<td align="left">99.12</td>
</tr>
<tr>
<td align="left">Deep-CAD</td>
<td align="left">92.47</td>
<td align="left">97.70</td>
<td align="left">88.99</td>
</tr>
<tr>
<td align="left">CNN</td>
<td align="left">85.87</td>
<td align="left">88.52</td>
<td align="left">91.46</td>
</tr>
<tr>
<td align="left">Improved CNN</td>
<td align="left">86.25</td>
<td align="left">94.54</td>
<td align="left">94.22</td>
</tr>
<tr>
<td align="left">Opt. DL model</td>
<td align="left">91.93</td>
<td align="left">91.52</td>
<td align="left">95.63</td>
</tr>
<tr>
<td align="left">DL-TLT</td>
<td align="left">98.90</td>
<td align="left">97.63</td>
<td align="left">99.04</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Simultaneously, in terms of <inline-formula id="ieqn-174">
<mml:math id="mml-ieqn-174"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, the HTDHDAE-BCC model offered a higher <inline-formula id="ieqn-175">
<mml:math id="mml-ieqn-175"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula> of 98.04&#x0025;, whereas the deep-CAD, CNN, improved CNN, Opt. DL, and DL-TLT models obtained lower <inline-formula id="ieqn-176">
<mml:math id="mml-ieqn-176"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula> values of 97.70&#x0025;, 88.52&#x0025;, 94.54&#x0025;, 91.52&#x0025;, and 97.63&#x0025;, respectively. Concurrently, in terms of <inline-formula id="ieqn-177">
<mml:math id="mml-ieqn-177"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula>, the HTDHDAE-BCC model offered a higher <inline-formula id="ieqn-178">
<mml:math id="mml-ieqn-178"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula> of 99.12&#x0025;, whereas the deep-CAD, CNN, improved CNN, Opt. DL, and DL-TLT models obtained lower <inline-formula id="ieqn-179">
<mml:math id="mml-ieqn-179"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mi>y</mml:mi></mml:msub></mml:math>
</inline-formula> values of 88.99&#x0025;, 91.46&#x0025;, 94.22&#x0025;, 95.63&#x0025;, and 99.04&#x0025;, respectively. These values confirm that the HTDHDAE-BCC model achieves effectual BC classification outcomes over existing DL models.</p>
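For clarity, the three metrics compared above follow the standard confusion-matrix definitions for binary classification. The sketch below is illustrative only; the counts passed in are hypothetical and are not the study's actual confusion matrix:

```python
def classification_metrics(tp, tn, fp, fn):
    """Return (accuracy, sensitivity, specificity) as percentages
    from binary confusion-matrix counts."""
    accuracy = 100.0 * (tp + tn) / (tp + tn + fp + fn)
    sensitivity = 100.0 * tp / (tp + fn)  # true-positive rate (malignant recall)
    specificity = 100.0 * tn / (tn + fp)  # true-negative rate (benign/normal recall)
    return accuracy, sensitivity, specificity

# Hypothetical counts, chosen only to demonstrate the formulas.
acc, sen, spe = classification_metrics(tp=50, tn=63, fp=1, fn=1)
print(f"accuracy={acc:.2f}%  sensitivity={sen:.2f}%  specificity={spe:.2f}%")
```

Note that sensitivity depends only on the positive-class row of the confusion matrix and specificity only on the negative-class row, which is why a model can score high accuracy while one of the two remains low on an imbalanced mammogram dataset.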
</sec>
<sec id="s4">
<label>4</label>
<title>Conclusion</title>
<p>In this paper, a novel HTDHDAE-BCC algorithm was presented for the classification of BC on digital mammograms. In the HTDHDAE-BCC model, the initial stage of image pre-processing is carried out using an average median filter. In addition, the deep convolutional neural network-based Inception v4 model is employed to generate feature vectors, and its parameter tuning is performed using the binary spider monkey optimization (BSMO) algorithm. The HTDHDAE-BCC model then exploits the CSO algorithm with the DHDAE model for BC classification. The experimental analysis of the HTDHDAE-BCC model is performed using the MIAS database, and the outcomes demonstrate the superior performance of the HTDHDAE-BCC model over other recent approaches. In future work, the performance of the HTDHDAE-BCC technique can be boosted by incorporating a deep instance segmentation model.</p>
</sec>
</body>
<back>
<sec><title>Funding Statement</title>
<p>This project was supported by the <funding-source>Deanship of Scientific Research at Prince Sattam Bin Abdulaziz University</funding-source> under research project number (<award-id>PSAU-2022/01/20287</award-id>).</p>
</sec>
<sec sec-type="COI-statement">
<title>Conflicts of Interest</title>
<p>The author declares that they have no conflicts of interest to report regarding the present study.</p>
</sec>
<ref-list content-type="authoryear">
<title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>L.</given-names> <surname>Tsochatzidis</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Costaridou</surname></string-name> and <string-name><given-names>I.</given-names> <surname>Pratikakis</surname></string-name></person-group>, &#x201C;<article-title>Deep learning for breast cancer diagnosis from mammograms&#x2014;A comparative study</article-title>,&#x201D; <source>Journal of Imaging</source>, vol. <volume>5</volume>, no. <issue>3</issue>, pp. <fpage>37</fpage>, <year>2019</year>; <pub-id pub-id-type="pmid">34460465</pub-id></mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y. J.</given-names> <surname>Suh</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Jung</surname></string-name> and <string-name><given-names>B. J.</given-names> <surname>Cho</surname></string-name></person-group>, &#x201C;<article-title>Automated breast cancer detection in digital mammograms of various densities via deep learning</article-title>,&#x201D; <source>Journal of Personalized Medicine</source>, vol. <volume>10</volume>, no. <issue>4</issue>, pp. <fpage>211</fpage>, <year>2020</year>; <pub-id pub-id-type="pmid">33172076</pub-id></mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Akselrod-Ballin</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Chorev</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Shoshan</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Spiro</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Hazan</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Predicting breast cancer by applying deep learning to linked health records and mammograms</article-title>,&#x201D; <source>Radiology</source>, vol. <volume>292</volume>, no. <issue>2</issue>, pp. <fpage>331</fpage>&#x2013;<lpage>342</lpage>, <year>2019</year>; <pub-id pub-id-type="pmid">31210611</pub-id></mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>L.</given-names> <surname>Shen</surname></string-name>, <string-name><given-names>L. R.</given-names> <surname>Margolies</surname></string-name>, <string-name><given-names>J. H.</given-names> <surname>Rothstein</surname></string-name>, <string-name><given-names>E.</given-names> <surname>Fluder</surname></string-name>, <string-name><given-names>R.</given-names> <surname>McBride</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Deep learning to improve breast cancer detection on screening mammography</article-title>,&#x201D; <source>Scientific Reports</source>, vol. <volume>9</volume>, no. <issue>1</issue>, pp. <fpage>1</fpage>&#x2013;<lpage>12</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Kumar</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Mukherjee</surname></string-name> and <string-name><given-names>A. K.</given-names> <surname>Luhach</surname></string-name></person-group>, &#x201C;<article-title>Deep learning with perspective modeling for early detection of malignancy in mammograms</article-title>,&#x201D; <source>Journal of Discrete Mathematical Sciences and Cryptography</source>, vol. <volume>22</volume>, no. <issue>4</issue>, pp. <fpage>627</fpage>&#x2013;<lpage>643</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>C. D.</given-names> <surname>Lehman</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Yala</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Schuster</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Dontchos</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Bahl</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Mammographic breast density assessment using deep learning: Clinical implementation</article-title>,&#x201D; <source>Radiology</source>, vol. <volume>290</volume>, no. <issue>1</issue>, pp. <fpage>52</fpage>&#x2013;<lpage>58</lpage>, <year>2019</year>; <pub-id pub-id-type="pmid">30325282</pub-id></mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>X.</given-names> <surname>Zhu</surname></string-name>, <string-name><given-names>T. K.</given-names> <surname>Wolfgruber</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Leong</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Jensen</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Scott</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Deep learning predicts interval and screening-detected cancer from screening mammograms: A case-case-control study in 6369 women</article-title>,&#x201D; <source>Radiology</source>, vol. <volume>301</volume>, no. <issue>3</issue>, pp. <fpage>550</fpage>&#x2013;<lpage>558</lpage>, <year>2021</year>; <pub-id pub-id-type="pmid">34491131</pub-id></mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D.</given-names> <surname>Arefan</surname></string-name>, <string-name><given-names>A. A.</given-names> <surname>Mohamed</surname></string-name>, <string-name><given-names>W. A.</given-names> <surname>Berg</surname></string-name>, <string-name><given-names>M. L.</given-names> <surname>Zuley</surname></string-name>, <string-name><given-names>J. H.</given-names> <surname>Sumkin</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Deep learning modeling using normal mammograms for predicting breast cancer risk</article-title>,&#x201D; <source>Medical Physics</source>, vol. <volume>47</volume>, no. <issue>1</issue>, pp. <fpage>110</fpage>&#x2013;<lpage>118</lpage>, <year>2020</year>; <pub-id pub-id-type="pmid">31667873</pub-id></mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Lotter</surname></string-name>, <string-name><given-names>A. R.</given-names> <surname>Diab</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Haslam</surname></string-name>, <string-name><given-names>J. G.</given-names> <surname>Kim</surname></string-name>, <string-name><given-names>G.</given-names> <surname>Grisot</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Robust breast cancer detection in mammography and digital breast tomosynthesis using an annotation-efficient deep learning approach</article-title>,&#x201D; <source>Nature Medicine</source>, vol. <volume>27</volume>, no. <issue>2</issue>, pp. <fpage>244</fpage>&#x2013;<lpage>249</lpage>, <year>2021</year>; <pub-id pub-id-type="pmid">33432172</pub-id></mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S. S.</given-names> <surname>Chakravarthy</surname></string-name> and <string-name><given-names>H.</given-names> <surname>Rajaguru</surname></string-name></person-group>, &#x201C;<article-title>Automatic detection and classification of mammograms using improved extreme learning machine with deep learning</article-title>,&#x201D; <source>IRBM</source>, vol. <volume>43</volume>, no. <issue>1</issue>, pp. <fpage>49</fpage>&#x2013;<lpage>61</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>G.</given-names> <surname>Altan</surname></string-name></person-group>, &#x201C;<article-title>Deep learning-based mammogram classification for breast cancer</article-title>,&#x201D; <source>International Journal of Intelligent Systems and Applications in Engineering</source>, vol. <volume>8</volume>, no. <issue>4</issue>, pp. <fpage>171</fpage>&#x2013;<lpage>176</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S. A. A.</given-names> <surname>Hassan</surname></string-name>, <string-name><given-names>M. S.</given-names> <surname>Sayed</surname></string-name>, <string-name><given-names>M. I.</given-names> <surname>Abdalla</surname></string-name> and <string-name><given-names>M. A.</given-names> <surname>Rashwan</surname></string-name></person-group>, &#x201C;<article-title>Breast cancer masses classification using deep convolutional neural networks and transfer learning</article-title>,&#x201D; <source>Multimedia Tools and Applications</source>, vol. <volume>79</volume>, no. <issue>41</issue>, pp. <fpage>30735</fpage>&#x2013;<lpage>30768</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>J. D. L.</given-names> <surname>Cabrera</surname></string-name>, <string-name><given-names>L. A. L.</given-names> <surname>Rodr&#x00ED;guez</surname></string-name> and <string-name><given-names>M. P.</given-names> <surname>D&#x00ED;az</surname></string-name></person-group>, &#x201C;<article-title>Classification of breast cancer from digital mammography using deep learning</article-title>,&#x201D; <source>Inteligencia Artificial</source>, vol. <volume>23</volume>, no. <issue>65</issue>, pp. <fpage>56</fpage>&#x2013;<lpage>66</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T.</given-names> <surname>Kavitha</surname></string-name>, <string-name><given-names>P. P.</given-names> <surname>Mathai</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Karthikeyan</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Ashok</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Kohar</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Deep learning based capsule neural network model for breast cancer diagnosis using mammogram images</article-title>,&#x201D; <source>Interdisciplinary Sciences: Computational Life Sciences</source>, vol. <volume>14</volume>, no. <issue>1</issue>, pp. <fpage>113</fpage>&#x2013;<lpage>129</lpage>, <year>2022</year>; <pub-id pub-id-type="pmid">34338956</pub-id></mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>N.</given-names> <surname>Saffari</surname></string-name>, <string-name><given-names>H. A.</given-names> <surname>Rashwan</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Nasser</surname></string-name>, <string-name><given-names>V. K.</given-names> <surname>Singh</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Arenas</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Fully automated breast density segmentation and classification using deep learning</article-title>,&#x201D; <source>Diagnostics</source>, vol. <volume>10</volume>, no. <issue>11</issue>, pp. <fpage>988</fpage>, <year>2020</year>; <pub-id pub-id-type="pmid">33238512</pub-id></mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Zahoor</surname></string-name>, <string-name><given-names>U.</given-names> <surname>Shoaib</surname></string-name> and <string-name><given-names>I. U.</given-names> <surname>Lali</surname></string-name></person-group>, &#x201C;<article-title>Breast cancer mammograms classification using deep neural network and entropy-controlled whale optimization algorithm</article-title>,&#x201D; <source>Diagnostics</source>, vol. <volume>12</volume>, no. <issue>2</issue>, pp. <fpage>557</fpage>, <year>2022</year>; <pub-id pub-id-type="pmid">35204646</pub-id></mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. A. S.</given-names> <surname>Al Husaini</surname></string-name>, <string-name><given-names>M. H.</given-names> <surname>Habaebi</surname></string-name>, <string-name><given-names>T. S.</given-names> <surname>Gunawan</surname></string-name>, <string-name><given-names>M. R.</given-names> <surname>Islam</surname></string-name>, <string-name><given-names>E. A.</given-names> <surname>Elsheikh</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Thermal-based early breast cancer detection using inception V3, inception V4 and modified inception MV4</article-title>,&#x201D; <source>Neural Computing and Applications</source>, vol. <volume>34</volume>, no. <issue>1</issue>, pp. <fpage>333</fpage>&#x2013;<lpage>348</lpage>, <year>2022</year>; <pub-id pub-id-type="pmid">34393379</pub-id></mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>N.</given-names> <surname>Khare</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Devan</surname></string-name>, <string-name><given-names>C. L.</given-names> <surname>Chowdhary</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Bhattacharya</surname></string-name>, <string-name><given-names>G.</given-names> <surname>Singh</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Smo-dnn: Spider monkey optimization and deep neural network hybrid classifier model for intrusion detection</article-title>,&#x201D; <source>Electronics</source>, vol. <volume>9</volume>, no. <issue>4</issue>, pp. <fpage>692</fpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="other"><person-group person-group-type="author"><string-name><given-names>A. G.</given-names> <surname>Ororbia</surname> <suffix>II</suffix></string-name>, <string-name><given-names>C. L.</given-names> <surname>Giles</surname></string-name> and <string-name><given-names>D.</given-names> <surname>Reitter</surname></string-name></person-group>, &#x201C;<article-title>Online semi-supervised learning with deep hybrid boltzmann machines and denoising autoencoders</article-title>,&#x201D; <comment>arXiv preprint arXiv:1511.06964</comment>, <year>2015</year>.</mixed-citation></ref>
<ref id="ref-20"><label>[20]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Said</surname></string-name>, <string-name><given-names>A. M.</given-names> <surname>El-Rifaie</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Tolba</surname></string-name>, <string-name><given-names>E. H.</given-names> <surname>Houssein</surname></string-name> and <string-name><given-names>S.</given-names> <surname>Deb</surname></string-name></person-group>, &#x201C;<article-title>An efficient chameleon swarm algorithm for economic load dispatch problem</article-title>,&#x201D; <source>Mathematics</source>, vol. <volume>9</volume>, no. <issue>21</issue>, pp. <fpage>2770</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-21"><label>[21]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Saber</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Sakr</surname></string-name>, <string-name><given-names>O. M.</given-names> <surname>Abo-Seida</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Keshk</surname></string-name> and <string-name><given-names>H.</given-names> <surname>Chen</surname></string-name></person-group>, &#x201C;<article-title>A novel deep-learning model for automatic detection and classification of breast cancer using the transfer-learning technique</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>9</volume>, pp. <fpage>71194</fpage>&#x2013;<lpage>71209</lpage>, <year>2021</year>.</mixed-citation></ref>
</ref-list>
</back>
</article>