<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">CMC</journal-id>
<journal-id journal-id-type="nlm-ta">CMC</journal-id>
<journal-id journal-id-type="publisher-id">CMC</journal-id>
<journal-title-group>
<journal-title>Computers, Materials &#x0026; Continua</journal-title>
</journal-title-group>
<issn pub-type="epub">1546-2226</issn>
<issn pub-type="ppub">1546-2218</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">26729</article-id>
<article-id pub-id-type="doi">10.32604/cmc.2022.026729</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Evolutionary Intelligence and Deep Learning Enabled Diabetic Retinopathy Classification Model</article-title>
<alt-title alt-title-type="left-running-head">Evolutionary Intelligence and Deep Learning Enabled Diabetic Retinopathy Classification Model</alt-title>
<alt-title alt-title-type="right-running-head">Evolutionary Intelligence and Deep Learning Enabled Diabetic Retinopathy Classification Model</alt-title>
</title-group>
<contrib-group content-type="authors">
<contrib id="author-1" contrib-type="author" corresp="yes">
<name name-style="western"><surname>Alqaralleh</surname><given-names>Bassam A. Y.</given-names>
</name><xref ref-type="aff" rid="aff-1">1</xref><email>b.alqaralleh@ubt.edu.sa</email>
</contrib>
<contrib id="author-2" contrib-type="author">
<name name-style="western"><surname>Aldhaban</surname><given-names>Fahad</given-names>
</name><xref ref-type="aff" rid="aff-1">1</xref></contrib>
<contrib id="author-3" contrib-type="author">
<name name-style="western"><surname>Abukaraki</surname><given-names>Anas</given-names>
</name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-4" contrib-type="author">
<name name-style="western"><surname>AlQaralleh</surname><given-names>Esam A.</given-names>
</name><xref ref-type="aff" rid="aff-3">3</xref></contrib>
<aff id="aff-1"><label>1</label><institution>MIS Department, College of Business Administration, University of Business and Technology</institution>, <addr-line>Jeddah, 21448</addr-line>, <country>Saudi Arabia</country></aff>
<aff id="aff-2"><label>2</label><institution>Department of Computer Science, Faculty of Information Technology, Al-Hussein Bin Talal University</institution>, <addr-line>Ma&#x2019;an, 71111</addr-line>, <country>Jordan</country></aff>
<aff id="aff-3"><label>3</label><institution>School of Engineering, Princess Sumaya University for Technology</institution>, <addr-line>Amman, 11941</addr-line>, <country>Jordan</country>.</aff>
</contrib-group>
<author-notes>
<corresp id="cor1"><label>&#x002A;</label>Corresponding Author: Bassam A. Y. Alqaralleh. Email: <email>b.alqaralleh@ubt.edu.sa</email></corresp>
</author-notes>
<pub-date pub-type="epub" date-type="pub" iso-8601-date="2022-05-16"><day>16</day>
<month>05</month>
<year>2022</year></pub-date>
<volume>73</volume>
<issue>1</issue>
<fpage>87</fpage>
<lpage>101</lpage>
<history>
<date date-type="received">
<day>03</day>
<month>1</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>18</day>
<month>3</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2022 Alqaralleh et al.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Alqaralleh et al.</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_CMC_26729.pdf"></self-uri>
<abstract>
<p>Diabetic Retinopathy (DR) has become a widespread illness among diabetics across the globe. Physicians generally use retinal fundus images to detect and classify the stages of DR. Since manual examination of DR images is a time-consuming process with a risk of biased results, automated tools that use Artificial Intelligence (AI) to diagnose the disease have become essential. In this view, the current study develops an Optimal Deep Learning-enabled Fusion-based Diabetic Retinopathy Detection and Classification (ODL-FDRDC) technique. The intention of the proposed ODL-FDRDC technique is to identify DR and categorize its different grades using retinal fundus images. In addition, the ODL-FDRDC technique involves a region-growing segmentation technique to determine the infected regions. Moreover, the fusion of two DL models, namely CapsNet and MobileNet, is used for feature extraction. Further, the hyperparameters of these models are tuned via the Coyote Optimization Algorithm (COA). Finally, a Gated Recurrent Unit (GRU) is utilized to identify DR. The experimental analysis of the ODL-FDRDC technique on a benchmark DR dataset established its superiority over existing methodologies under different measures.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>Optimization algorithms</kwd>
<kwd>medical images</kwd>
<kwd>diabetic retinopathy</kwd>
<kwd>deep learning</kwd>
<kwd>fusion model</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<label>1</label>
<title>Introduction</title>
<p>Diabetes Mellitus is a life-threatening disease that affects 463 million people across the globe, and its prevalence is expected to increase to 700 million by 2045 [<xref ref-type="bibr" rid="ref-1">1</xref>]. One third of diabetics suffer from Diabetic Retinopathy (DR), an eye disease related to diabetes that is increasingly prevalent. DR is characterized by progressive vascular disruption in the retina resulting from chronic hyperglycemia, and it advances in diabetics regardless of the severity of their diabetes. Globally, it is the major cause of blindness among working-age adults and is diagnosed in 93 million people [<xref ref-type="bibr" rid="ref-2">2</xref>]. Further, DR is predicted to increase even more, owing to the high prevalence of diabetes in developing Asian countries such as China and India [<xref ref-type="bibr" rid="ref-3">3</xref>].</p>
<p>DR is largely asymptomatic in its early stages, when neural retinal damage and medically invisible microvascular changes occur. Therefore, diabetic patients must undergo periodic eye screening, followed by appropriate diagnosis and subsequent management of the condition, to protect themselves from vision loss [<xref ref-type="bibr" rid="ref-4">4</xref>]. With only protective measures in hand, such as the control of hypertension, hyperglycemia, and hyperlipidemia, early diagnosis of DR is indispensable. Among the currently available intervention methods, laser photocoagulation reduces the possibility of blindness in diabetic maculopathy and proliferative retinopathy by up to 98%. This high rate of recovery is possible only when the disease is diagnosed at an early stage and treated immediately [<xref ref-type="bibr" rid="ref-5">5</xref>]. Appropriate treatment and early diagnosis are the only proactive measures that can prevent or delay blindness from DR.</p>
<p>Regular screening of DR patients and their explosive growth rate in India advocate the need for an automated screening method for early diagnosis of DR [<xref ref-type="bibr" rid="ref-6">6</xref>]. Timely treatment, early detection, and frequent screening are essential components, in addition to automated diagnosis, for preventing blindness. In this background, retinal pathologies are challenging to diagnose since they are not readily apparent in retinal images, particularly during the early stages. Nonetheless, present computer-aided image processing methods have proved their capacity to accurately detect the abnormal patterns connected to the disease [<xref ref-type="bibr" rid="ref-7">7</xref>]. Blood vessel segmentation is generally regarded as an early stage in building CAD tools, so several methodologies have been introduced in the last few decades to extract blood vessels from retinographic images through classical image processing and automated learning models [<xref ref-type="bibr" rid="ref-8">8</xref>]. Current Deep Learning (DL) methods, including the Convolutional Neural Network (CNN), appear to be an optimum choice for the automated diagnosis of ailments in digital healthcare images [<xref ref-type="bibr" rid="ref-9">9</xref>,<xref ref-type="bibr" rid="ref-10">10</xref>]. The adoption of CNNs has increased in recent years with the emergence of supportive techniques such as Batch Normalization (BN), the Rectified Linear Unit (ReLU) activation function, Dropout regularization, and so on.</p>
<p>The current study develops an Optimal Deep Learning enabled Fusion based Diabetic Retinopathy Detection and Classification (ODL-FDRDC) technique. The proposed ODL-FDRDC technique involves region growing segmentation to determine the infected regions. In addition, two DL models, namely CapsNet and MobileNet, are fused for the feature extraction process. The hyperparameter tuning of these models is performed via the Coyote Optimization Algorithm (COA). Finally, a Gated Recurrent Unit (GRU) is utilized for the identification of DR. The experimental results of the ODL-FDRDC technique on a benchmark DR dataset established the model&#x2019;s superiority under distinct aspects.</p>
<p>The rest of the paper is arranged as follows. Section 2 offers information about related works, Section 3 discusses the proposed model, Section 4 details the experimental results, and Section 5 concludes the study.</p>
</sec>
<sec id="s2">
<label>2</label>
<title>Literature Review</title>
<p>Qummar et al. [<xref ref-type="bibr" rid="ref-11">11</xref>] made use of a widely accessible Kaggle dataset of retinal images to train an ensemble of five DCNN systems (Dense169, Resnet50, Inceptionv3, Xception and Dense121) to encode rich features. The study was aimed at enhancing the classification accuracy across the different stages of DR. The simulation results show that the presented method identified each stage of DR better than existing methodologies and outperformed advanced techniques on the same Kaggle dataset. Beede et al. [<xref ref-type="bibr" rid="ref-12">12</xref>] described a human-centric study of a DL method deployed in healthcare centers for the diagnosis of DR. Based on observations and interviews conducted across 11 healthcare centers in Thailand, the study covered present eye-screening systems, user expectations for an AI-enabled screening process, and post-deployment experience. The results showed that many socio-environmental factors affect patient experience, nursing workflows, and the performance of the method.</p>
<p>In literature [<xref ref-type="bibr" rid="ref-13">13</xref>], the researchers used a DR dataset gathered from the UCI-ML repository. Initially, the dataset was normalized with the standard scaler method, after which PCA was utilized to extract essential attributes. Furthermore, the firefly algorithm was executed to reduce the number of dimensions. This reduced dataset was then fed into a DNN for classification. Li et al. [<xref ref-type="bibr" rid="ref-14">14</xref>] presented and validated a deep ensemble model for diagnosing Diabetic Macular Oedema (DMO) and DR using retinal fundus images. The researchers collected 8,739 retinal fundus images from a retrospective cohort of 3,285 persons. To detect DMO and DR, several enhanced Inception-v4 ensemble models were proposed. The study evaluated the efficacy of the algorithm and compared it against human expertise on the initial dataset; its generalization was also measured on the widely accessible Messidor-2 dataset. Murcia et al. [<xref ref-type="bibr" rid="ref-15">15</xref>] introduced CAD tools that leverage the efficiency of DL architectures in image analysis. The presented model depends on a deep residual CNN to extract discriminative features without any prior complex image transformation to highlight specific structures or enhance image quality. Additionally, the study employed transfer learning to reuse layers from a DNN trained earlier on the ImageNet dataset, under the hypothesis that the initial layers capture abstract features that can be reutilized for diverse challenges.</p>
</sec>
<sec id="s3">
<label>3</label>
<title>The Proposed Model</title>
<p>In the current study, a novel ODL-FDRDC technique has been developed to identify and categorize different grades of DR using retinal fundus images. The proposed ODL-FDRDC technique encompasses preprocessing, region-growing segmentation, fusion-based feature extraction, COA-based hyperparameter optimization, and GRU-based classification processes. <xref ref-type="fig" rid="fig-1">Fig. 1</xref> depicts the overall working process of the proposed ODL-FDRDC technique.</p>
<sec id="s3_1">
<label>3.1</label>
<title>Region Growing Segmentation</title>
<p>In the initial stage of the DR grading process, the purpose is to find the affected regions in fundus images using a region growing segmentation approach. Region growing is a pixel-based segmentation method in which similarity constraints, including texture and intensity, are considered to group pixels into regions. First, a group of pixels is combined iteratively: seed pixels are selected in the region, and the group is grown by merging adjacent pixels that are similar, so the region size increases. The growth of a region is terminated when the adjacent pixels no longer fulfill the homogeneity conditions, and other seed pixels are then selected. This procedure is repeated until each pixel in the image belongs to some region. In the presented method, both the threshold and the seed point selection determine the homogeneity constraint, which plays a significant role in improving segmentation accuracy. As fundus images suffer from severe intensity variations, a constant threshold selection alone does not warrant precise segmentation. Therefore, the study focuses on improving the automated DA method so as to generate an optimum threshold and seed point. The step-by-step process of the region growing method is given herewith (a minimal code sketch of the growth of one region follows the list).
<list list-type="roman-lower">
<list-item>
<p>Input the abnormal images</p></list-item>
<list-item>
<p>Here, t represents the enhanced thresholds created by DA</p></list-item>
<list-item>
<p>Place t as seed point for region growing method</p></list-item>
<list-item>
<p>Add four neighboring pixels</p></list-item>
<list-item>
<p>Evaluate the distance (d) between the mean region intensity and each neighboring pixel.</p></list-item>
<list-item>
<p>Grow the region when d &#x2264; t for the four neighboring pixels; include every pixel that was not previously part of the region and store the coordinates of the new pixels.</p></list-item>
<list-item>
<p>Store the mean of the new region, return to step (ii), and repeat the region growing process until all pixels are grouped.</p></list-item>
</list></p>
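<p>The following minimal Python sketch illustrates the above procedure under simplifying assumptions that the paper does not spell out: a grayscale image held in a NumPy array, 4-connectivity, and a single seed point and threshold t supplied by the optimization step. Repeating the procedure with fresh seeds until every pixel is assigned yields the full segmentation.</p>
<preformat># Minimal region-growing sketch (assumptions: grayscale NumPy image,
# 4-connectivity, seed and threshold t supplied by the optimizer).
from collections import deque

import numpy as np

def region_grow(image, seed, t):
    h, w = image.shape
    mask = np.zeros((h, w), dtype=bool)   # pixels already in the region
    mask[seed] = True
    region_sum, region_n = float(image[seed]), 1
    queue = deque([seed])
    while queue:
        y, x = queue.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):  # 4 neighbors
            ny, nx = y + dy, x + dx
            if 0 &lt;= ny &lt; h and 0 &lt;= nx &lt; w and not mask[ny, nx]:
                d = abs(float(image[ny, nx]) - region_sum / region_n)
                if d &lt;= t:                  # homogeneity test, step (vi)
                    mask[ny, nx] = True      # store the new pixel
                    region_sum += float(image[ny, nx])
                    region_n += 1            # running region mean, step (vii)
                    queue.append((ny, nx))
    return mask</preformat>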
<fig id="fig-1">
<label>Figure 1</label>
<caption>
<title>Working process of ODL-FDRDC technique</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-1.png"/>
</fig>
</sec>
<sec id="s3_2">
<label>3.2</label>
<title>Fusion Based Feature Extraction</title>
<p>In this stage, the segmented images are fed into the DL models to derive feature vectors. The feature fusion process integrates the two feature vectors from the MobileNet and CapsNet models using entropy. It is defined as follows.</p>
<p><disp-formula id="eqn-1"><label>(1)</label><mml:math id="mml-eqn-1" display="block"><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">M</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">b</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">N</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi></mml:mrow><mml:mo>&#x00D7;</mml:mo><mml:mi>m</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mrow><mml:mi mathvariant="italic">M</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">b</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">N</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi></mml:mrow><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>v</mml:mi><mml:msub><mml:mn>2</mml:mn><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mrow><mml:mi mathvariant="italic">M</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">b</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">N</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi></mml:mrow><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>v</mml:mi><mml:msub><mml:mn>2</mml:mn><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mrow><mml:mi mathvariant="italic">M</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">b</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">N</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi></mml:mrow><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>v</mml:mi><mml:msub><mml:mn>2</mml:mn><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>,</mml:mo><mml:mrow><mml:mi mathvariant="italic">M</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">b</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">N</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi></mml:mrow><mml:mi mathvariant="normal">&#x005F;</mml:mi><mml:mi>v</mml:mi><mml:msub><mml:mn>3</mml:mn><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>}</mml:mo></mml:mrow></mml:math></disp-formula></p>
<p><disp-formula id="eqn-2"><label>(2)</label><mml:math id="mml-eqn-2" display="block"><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">C</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">p</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">N</mml:mi></mml:mrow><mml:mo>&#x00D7;</mml:mo><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:msub><mml:mrow><mml:mi mathvariant="italic">C</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">p</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">N</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mrow><mml:mi mathvariant="italic">C</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">p</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">N</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mrow><mml:mi mathvariant="italic">C</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">p</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">N</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mn>3</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>,</mml:mo><mml:msub><mml:mrow><mml:mi mathvariant="italic">C</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">p</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">N</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>}</mml:mo></mml:mrow></mml:math></disp-formula></p>
<p>Then, they are fused into a single vector which is represented herewith.</p>
<p><disp-formula id="eqn-3"><label>(3)</label><mml:math id="mml-eqn-3" display="block"><mml:mrow><mml:mi mathvariant="italic">F</mml:mi><mml:mi mathvariant="italic">u</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">d</mml:mi></mml:mrow><mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mi mathvariant="italic">f</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">u</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mrow><mml:mi mathvariant="italic">v</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mi>q</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:munderover><mml:mrow><mml:mo>{</mml:mo><mml:mi>f</mml:mi><mml:mrow><mml:mi mathvariant="italic">M</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">b</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">N</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">V</mml:mi></mml:mrow><mml:msub><mml:mn>2</mml:mn><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mi>m</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mspace width="thinmathspace" /><mml:mi>f</mml:mi><mml:msub><mml:mrow><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">C</mml:mi><mml:mi mathvariant="italic">p</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">N</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>}</mml:mo></mml:mrow></mml:math></disp-formula>whereas <inline-formula id="ieqn-1"><mml:math id="mml-ieqn-1"><mml:mi>f</mml:mi></mml:math></inline-formula> specifies a fused vector. The entropy carried out for feature vector involves the chosen features which can be defined herewith.</p>
<p><disp-formula id="eqn-4"><label>(4)</label><mml:math id="mml-eqn-4" display="block"><mml:msub><mml:mi>B</mml:mi><mml:mrow><mml:mi>H</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mo>&#x2212;</mml:mo><mml:mi>N</mml:mi><mml:mi>H</mml:mi><mml:msub><mml:mi>e</mml:mi><mml:mrow><mml:mi>b</mml:mi></mml:mrow></mml:msub><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:munderover><mml:mi>p</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow></mml:math></disp-formula></p>
<p><disp-formula id="eqn-5"><label>(5)</label><mml:math id="mml-eqn-5" display="block"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">t</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>B</mml:mi><mml:mrow><mml:mi>H</mml:mi><mml:mi>e</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mo movablelimits="true" form="prefix">max</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mn>1186</mml:mn><mml:mo>)</mml:mo></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math></disp-formula>where <inline-formula id="ieqn-2"><mml:math id="mml-ieqn-2"><mml:mi>p</mml:mi></mml:math></inline-formula> signifies the feature probability and <inline-formula id="ieqn-3"><mml:math id="mml-ieqn-3"><mml:mi>H</mml:mi><mml:mi>e</mml:mi></mml:math></inline-formula> defines the entropy. Finally, the chosen features are passed onto the classifier to determine DR.</p>
<sec id="s3_2_1">
<label>3.2.1</label>
<title>CapsNet Model</title>
<p>The CNN is a DL model that is commonly utilized in various image-processing based disease diagnosis systems. It comprises numerous connected layers with distinct weight values and activation functions; the fundamental model includes convolution layers, pooling layers, and fully connected layers, and distinct activation functions are utilized for weight adjustment. The CapsNet model is presented to overcome the limitations of the CNN. Being a deeper network, this model mainly comprises capsules [<xref ref-type="bibr" rid="ref-16">16</xref>], each of which is a collection of neurons. The activation of these neurons defines the features of every component in the object. Every individual capsule plays an important part in determining an individual element of the object, and the capsules iteratively compute the total structure of the object, saving both object-part and spatial data. In comparison with CNNs, the CapsNet model involves multiple layers and performs an effective feature extraction process.</p>
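<p>Although the text does not detail the capsule computations, the standard squash nonlinearity of CapsNet conveys the central idea: a capsule&#x2019;s output vector is rescaled so that its length lies in (0, 1) and can act as an existence probability, while its orientation encodes the properties of the detected part. A minimal NumPy version is sketched below.</p>
<preformat># Standard CapsNet squash nonlinearity: shrinks short vectors toward zero
# and long vectors toward unit length, preserving orientation.
import numpy as np

def squash(s, eps=1e-9):
    norm2 = np.sum(s * s, axis=-1, keepdims=True)      # squared length
    return (norm2 / (1.0 + norm2)) * s / np.sqrt(norm2 + eps)</preformat>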
</sec>
<sec id="s3_2_2">
<label>3.2.2</label>
<title>MobileNet Model</title>
<p>Here, MobileNetV2 is utilized to detect and classify DR. It has a compact structure with low computational complexity and high precision. Built on depthwise separable convolution, MobileNet utilizes a pair of hyperparameters to maintain a tradeoff between performance and efficiency [<xref ref-type="bibr" rid="ref-17">17</xref>]. The basic concept of the MobileNet model is the decomposition of the convolution kernel: a typical convolution kernel is decomposed into a depthwise convolution and a pointwise convolution. The former applies a single filter to each input channel, while the latter, a 1 &#x00D7; 1 convolution, integrates the outcomes of the depthwise convolutional layer. Therefore, N typical convolution kernels get substituted with M depthwise convolution kernels and N pointwise 1 &#x00D7; 1 convolution kernels. MobileNetV2 additionally offers an extraction component with inverted residual structures.</p>
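<p>As a concrete check of this decomposition, the short calculation below compares the multiplication cost of a standard convolution with that of its depthwise-plus-pointwise factorization; the layer sizes are illustrative and not taken from the paper.</p>
<preformat># Multiplication cost of a standard k x k convolution vs. its
# depthwise-separable factorization (illustrative sizes only).
def conv_cost(h, w, k, m, n):
    standard = h * w * k * k * m * n    # n standard k x k kernels over m maps
    depthwise = h * w * k * k * m       # m depthwise k x k kernels
    pointwise = h * w * m * n           # n pointwise 1 x 1 kernels
    return standard, depthwise + pointwise

std, sep = conv_cost(h=56, w=56, k=3, m=128, n=128)
print(std / sep)   # ~8.4x fewer multiplications, i.e. savings of 1/n + 1/k^2</preformat>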
</sec>
</sec>
<sec id="s3_3">
<label>3.3</label>
<title>COA Based Hyperparameter Optimization Process</title>
<p>In order to optimally tune the hyperparameters involved in the fusion models, COA is utilized [<xref ref-type="bibr" rid="ref-18">18</xref>]. COA is a group optimization technique presented in 2018 by Pierezan et al. and is inspired by the behavior of coyotes in North America. The technique models a coyote population and its evolution, covering heuristic random pack formation, cultural development, birth and death, expulsion from a pack, and acceptance into a new pack. In COA, each decision variable is represented by a coyote social-state factor in every dimension of the solution vector, so each coyote signifies a candidate solution to the problem. The initial coyote population is generated according to a uniform random distribution. Therefore, after setting the number of packs <inline-formula id="ieqn-4"><mml:math id="mml-ieqn-4"><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2208;</mml:mo><mml:msup><mml:mi>N</mml:mi><mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:mrow></mml:msup></mml:math></inline-formula> and the number of coyotes in a single pack <inline-formula id="ieqn-5"><mml:math id="mml-ieqn-5"><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2208;</mml:mo><mml:msup><mml:mi>N</mml:mi><mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:mrow></mml:msup></mml:math></inline-formula>, the population contains <inline-formula id="ieqn-6"><mml:math id="mml-ieqn-6"><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>&#x00D7;</mml:mo><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> individual coyotes. The primary social state of each individual coyote is arbitrarily set. <xref ref-type="disp-formula" rid="eqn-6">Eq. (6)</xref> expresses the allocation of the <inline-formula id="ieqn-7"><mml:math id="mml-ieqn-7"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> dimension of coyote <inline-formula id="ieqn-8"><mml:math id="mml-ieqn-8"><mml:mi>c</mml:mi></mml:math></inline-formula> from pack <inline-formula id="ieqn-9"><mml:math id="mml-ieqn-9"><mml:mi>p</mml:mi></mml:math></inline-formula>.</p>
<p><disp-formula id="eqn-6"><label>(6)</label><mml:math id="mml-eqn-6" display="block"><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mi>l</mml:mi><mml:msub><mml:mi>b</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>r</mml:mi><mml:mo>&#x22C5;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>u</mml:mi><mml:msub><mml:mi>b</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:mi>l</mml:mi><mml:msub><mml:mi>b</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo><mml:mo>,</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-10"><mml:math id="mml-ieqn-10"><mml:mi>u</mml:mi><mml:msub><mml:mi>b</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-11"><mml:math id="mml-ieqn-11"><mml:mi>l</mml:mi><mml:msub><mml:mi>b</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> imply the upper and lower bounds of <inline-formula id="ieqn-12"><mml:math id="mml-ieqn-12"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> dimension of the decision variable correspondingly, and <inline-formula id="ieqn-13"><mml:math id="mml-ieqn-13"><mml:mi>r</mml:mi></mml:math></inline-formula> is uniformly distributed from 0 and 1. For this reason, the social adaptability of coyotes are estimated based on <xref ref-type="disp-formula" rid="eqn-7">Eq. (7)</xref>:</p>
<p><disp-formula id="eqn-7"><label>(7)</label><mml:math id="mml-eqn-7" display="block"><mml:mi>f</mml:mi><mml:mi>i</mml:mi><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mi>f</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow></mml:math></disp-formula></p>
<p>The development of coyotes from the group is nothing but the outcome of cultural interaction. It can be influenced by the alpha wolf while the cultural trends <inline-formula id="ieqn-14"><mml:math id="mml-ieqn-14"><mml:mo stretchy="false">(</mml:mo><mml:mi>c</mml:mi><mml:mi>u</mml:mi><mml:mi>l</mml:mi><mml:msup><mml:mi>t</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula> of this group, and two distinct coyotes <inline-formula id="ieqn-15"><mml:math id="mml-ieqn-15"><mml:mo stretchy="false">(</mml:mo><mml:mi>c</mml:mi><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-16"><mml:math id="mml-ieqn-16"><mml:mi>c</mml:mi><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula> are arbitrarily chosen from the group. The cultural variance between alpha and the arbitrary wolf <inline-formula id="ieqn-17"><mml:math id="mml-ieqn-17"><mml:mi>c</mml:mi><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> streamlines the influence factor <inline-formula id="ieqn-18"><mml:math id="mml-ieqn-18"><mml:msub><mml:mi>&#x03B4;</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, and the cultural variance between <inline-formula id="ieqn-19"><mml:math id="mml-ieqn-19"><mml:mi>c</mml:mi><mml:mi>u</mml:mi><mml:mi>l</mml:mi><mml:msup><mml:mi>t</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> and arbitrary wolf <inline-formula id="ieqn-20"><mml:math id="mml-ieqn-20"><mml:mi>c</mml:mi><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> streamlines the impression factor <inline-formula id="ieqn-21"><mml:math id="mml-ieqn-21"><mml:msub><mml:mi>&#x03B4;</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, i.e.,:</p>
<p><disp-formula id="eqn-8"><label>(8)</label><mml:math id="mml-eqn-8" display="block"><mml:msub><mml:mi>&#x03B4;</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>p</mml:mi><mml:mi>h</mml:mi><mml:msup><mml:mi>a</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:mo>&#x2212;</mml:mo><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>r</mml:mi><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup></mml:math></disp-formula></p>
<p><disp-formula id="eqn-9"><label>(9)</label><mml:math id="mml-eqn-9" display="block"><mml:msub><mml:mi>&#x03B4;</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>c</mml:mi><mml:mi>u</mml:mi><mml:mi>l</mml:mi><mml:msup><mml:mi>t</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:mo>&#x2212;</mml:mo><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>r</mml:mi><mml:mn>2</mml:mn></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup></mml:math></disp-formula></p>
<p>The alpha is the coyote with the optimum environmental adaptation in the group. When a minimization problem is being solved, it is determined as follows.</p>
<p><disp-formula id="eqn-10"><label>(10)</label><mml:math id="mml-eqn-10" display="block"><mml:mi>a</mml:mi><mml:mi>l</mml:mi><mml:mi>p</mml:mi><mml:mi>h</mml:mi><mml:msup><mml:mi>a</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mrow><mml:mo stretchy="false">|</mml:mo></mml:mrow><mml:msub><mml:mi>arg</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mo>&#x2026;</mml:mo><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow></mml:msub><mml:mo>}</mml:mo></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x2061;</mml:mo><mml:mo movablelimits="true" form="prefix">min</mml:mo><mml:mi>f</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow><mml:mo>}</mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<p>The cultural trend represents the information shared within a group: it is obtained by collecting the median values of the social states of all coyotes in the group, which can be seen as the swarm-intelligence (SI) aspect of the algorithm. The particular computation equation is given herewith.</p>
<p><disp-formula id="eqn-11"><label>(11)</label><mml:math id="mml-eqn-11" display="block"><mml:mi>c</mml:mi><mml:mi>u</mml:mi><mml:mi>l</mml:mi><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing=".2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:msubsup><mml:mi>O</mml:mi><mml:mrow><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>C</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mn>1</mml:mn><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:mfrac></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>,</mml:mo></mml:mtd><mml:mtd><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow></mml:msub><mml:mspace width="thinmathspace" /><mml:mi>i</mml:mi><mml:mi>s</mml:mi><mml:mspace width="thinmathspace" /><mml:mi>o</mml:mi><mml:mi>d</mml:mi><mml:mi>d</mml:mi></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mfrac><mml:mrow><mml:msubsup><mml:mi>O</mml:mi><mml:mrow><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>C</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:mfrac><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:msubsup><mml:mi>O</mml:mi><mml:mrow><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>C</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mn>1</mml:mn><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:mfrac><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup></mml:mrow><mml:mn>2</mml:mn></mml:mfrac><mml:mo>,</mml:mo></mml:mstyle></mml:mtd><mml:mtd><mml:mrow><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">h</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">w</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow><mml:mrow><mml:mtext>&#xA0;'</mml:mtext></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math></disp-formula>where <inline-formula id="ieqn-22"><mml:math id="mml-ieqn-22"><mml:msup><mml:mi>O</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> stands for social state in which <inline-formula id="ieqn-23"><mml:math id="mml-ieqn-23"><mml:mo stretchy="false">[</mml:mo></mml:math></inline-formula>1, <inline-formula id="ieqn-24"><mml:math id="mml-ieqn-24"><mml:mi>D</mml:mi><mml:mo stretchy="false">]</mml:mo></mml:math></inline-formula> is from <inline-formula id="ieqn-25"><mml:math id="mml-ieqn-25"><mml:msup><mml:mi>p</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> package during <inline-formula id="ieqn-26"><mml:math 
id="mml-ieqn-26"><mml:msup><mml:mi>t</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> iteration and was sorted by the dimension.</p>
<p>Thus, the social state of a coyote after cultural development is represented in <xref ref-type="disp-formula" rid="eqn-12">Eq. (12)</xref>:</p>
<p><disp-formula id="eqn-12"><label>(12)</label><mml:math id="mml-eqn-12" display="block"><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mo>&#x2212;</mml:mo></mml:mrow></mml:msub><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x22C5;</mml:mo><mml:msub><mml:mi>&#x03B4;</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>&#x22C5;</mml:mo><mml:msub><mml:mi>&#x03B4;</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub></mml:math></disp-formula>where <inline-formula id="ieqn-27"><mml:math id="mml-ieqn-27"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-28"><mml:math id="mml-ieqn-28"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> imply the weights of alpha wolf and cultural trend stimuli from the group correspondingly; it can be any arbitrary number between zero and one and is uniformly distributed. COA still utilizes greedy technique to determine whether the development of coyotes is permitted. <xref ref-type="disp-formula" rid="eqn-13">Eq. (13)</xref> estimates the development state of coyotes. In <xref ref-type="disp-formula" rid="eqn-14">Eq. (14)</xref>, coyotes with optimum environmental adaptabilities are recollected to participate in the succeeding procedures such as development, birth, and death and elimination in the novel group, and acceptance to the novel group.</p>
<p><disp-formula id="eqn-13"><label>(13)</label><mml:math id="mml-eqn-13" display="block"><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mo>&#x2212;</mml:mo></mml:mrow></mml:msub><mml:mi>f</mml:mi><mml:mi>i</mml:mi><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mi>f</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mo>&#x2212;</mml:mo></mml:mrow></mml:msub><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow></mml:math></disp-formula></p>
<p><disp-formula id="eqn-14"><label>(14)</label><mml:math id="mml-eqn-14" display="block"><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing=".2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mo>&#x2212;</mml:mo></mml:mrow></mml:msub><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mspace width="thinmathspace" /><mml:mi>n</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>w</mml:mi><mml:mrow><mml:mo>&#x2212;</mml:mo></mml:mrow></mml:msub><mml:mi>f</mml:mi><mml:mi>i</mml:mi><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>&#x003C;</mml:mo><mml:mi>f</mml:mi><mml:mi>i</mml:mi><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>,</mml:mo><mml:mrow><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">h</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">w</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math></disp-formula></p>
<p>Following the laws of nature, the coyotes of a group give birth to cubs, and once an offspring develops, it also faces the threat of death. The birth of a cub is modelled as follows.</p>
<p><disp-formula id="eqn-15"><label>(15)</label><mml:math id="mml-eqn-15" display="block"><mml:mi>p</mml:mi><mml:mi>u</mml:mi><mml:msubsup><mml:mi>p</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing=".2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>,</mml:mo><mml:mi>r</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x003C;</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>s</mml:mi></mml:mrow></mml:msub><mml:mspace width="thinmathspace" /><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mspace width="thinmathspace" /><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:msub><mml:mi>j</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mi>s</mml:mi><mml:mi>o</mml:mi><mml:msubsup><mml:mi>c</mml:mi><mml:mrow><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>p</mml:mi><mml:mo>,</mml:mo><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>,</mml:mo><mml:mi>r</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2265;</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>s</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub><mml:mspace width="thinmathspace" /><mml:mi>o</mml:mi><mml:mi>r</mml:mi><mml:mspace width="thinmathspace" /><mml:mi>j</mml:mi><mml:mo>=</mml:mo><mml:mi>i</mml:mi><mml:mn>2</mml:mn><mml:mo>,</mml:mo></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:mrow><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">h</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">w</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math></disp-formula>where <inline-formula id="ieqn-29"><mml:math id="mml-ieqn-29"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-30"><mml:math id="mml-ieqn-30"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> imply two arbitrary coyotes from the present group; <inline-formula id="ieqn-31"><mml:math id="mml-ieqn-31"><mml:msub><mml:mi>j</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-32"><mml:math id="mml-ieqn-32"><mml:mi>i</mml:mi><mml:mn>2</mml:mn></mml:math></inline-formula> signify the two arbitrary dimensions; <inline-formula id="ieqn-33"><mml:math 
id="mml-ieqn-33"><mml:mi>r</mml:mi><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> represents the arbitrary number which is uniformly distributed between zero and one. <inline-formula id="ieqn-34"><mml:math id="mml-ieqn-34"><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> stands for the arbitrary number from the bounds of <inline-formula id="ieqn-35"><mml:math id="mml-ieqn-35"><mml:msup><mml:mi>j</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> dimension decision variable. This arbitrary number demonstrates the influence of reproductive environment on the cubs; and finally, <inline-formula id="ieqn-36"><mml:math id="mml-ieqn-36"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>s</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-37"><mml:math id="mml-ieqn-37"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> denote scattering as well as correlation probabilities correspondingly. These values define the degree of cultural diversities of the coyote in a group.</p>
<p><disp-formula id="eqn-16"><label>(16)</label><mml:math id="mml-eqn-16" display="block"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>s</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msup><mml:mfrac><mml:mn>1</mml:mn><mml:mi>D</mml:mi></mml:mfrac><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mrow><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mn>1</mml:mn><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>s</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mn>2</mml:mn></mml:mfrac></mml:math></disp-formula></p>
<p>Occasionally the population becomes unstable: an individual coyote may be driven away from its group and later accepted into a new group, while the remaining coyotes in the group are affected accordingly. The probability <inline-formula id="ieqn-38"><mml:math id="mml-ieqn-38"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>e</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of a coyote leaving its group is shown herewith.</p>
<p><disp-formula id="eqn-17"><label>(17)</label><mml:math id="mml-eqn-17" display="block"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>e</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mn>0.005</mml:mn><mml:mo>&#x22C5;</mml:mo><mml:msubsup><mml:mi>N</mml:mi><mml:mrow><mml:mi>c</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup></mml:math></disp-formula></p>
<p>By exchanging cultural data across the groups in this way, the process promotes the global cultural interchange of the coyote population. To ensure that the probability <italic>P</italic><sub>e</sub> remains between zero and one, the number of coyotes in each group is limited to 14, since 0.005 &#x00D7; 14<sup>2</sup> = 0.98.</p>
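<p>The sketch below condenses one COA generation for a single pack, following Eqs. (8)&#x2013;(16). Pack exchange and the age-based death rule are simplified (here the pup simply replaces the worst coyote when it is fitter), and the fitness function and bounds are placeholders.</p>
<preformat># One simplified COA generation for a single pack, Eqs. (8)-(16).
import numpy as np

rng = np.random.default_rng(0)

def coa_generation(soc, fit, fitness, lb, ub):
    Nc, D = soc.shape
    Ps, Pa = 1.0 / D, (1.0 - 1.0 / D) / 2.0         # Eq. (16)
    alpha = soc[np.argmin(fit)]                     # Eq. (10): best coyote
    cult = np.median(soc, axis=0)                   # Eq. (11): cultural trend
    for c in range(Nc):
        cr1, cr2 = rng.choice(Nc, size=2, replace=False)
        d1 = alpha - soc[cr1]                       # Eq. (8)
        d2 = cult - soc[cr2]                        # Eq. (9)
        new = np.clip(soc[c] + rng.random() * d1 + rng.random() * d2,
                      lb, ub)                       # Eq. (12)
        f_new = fitness(new)                        # Eq. (13)
        if f_new &lt; fit[c]:                          # Eq. (14): greedy accept
            soc[c], fit[c] = new, f_new
    # Birth of one pup from two random parents, Eq. (15).
    p1, p2 = rng.choice(Nc, size=2, replace=False)
    r = rng.random(D)
    pup = np.where(r &lt; Ps, soc[p1],
                   np.where(r &gt;= Ps + Pa, soc[p2],
                            lb + rng.random(D) * (ub - lb)))
    j1, j2 = rng.choice(D, size=2, replace=False)   # guaranteed inheritance
    pup[j1], pup[j2] = soc[p1, j1], soc[p2, j2]
    worst = np.argmax(fit)
    if fitness(pup) &lt; fit[worst]:                   # pup replaces worst coyote
        soc[worst], fit[worst] = pup, fitness(pup)
    return soc, fit</preformat>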
</sec>
<sec id="s3_4">
<label>3.4</label>
<title>GRU Based Classification</title>
<p>In this final stage, the GRU model receives the feature vectors and performs classification. GRU is a variant of the LSTM network that provides the benefits of the RNN method: it acquires features automatically, effectively captures long-term dependencies in data, and has been applied to tasks such as short-term traffic estimation [<xref ref-type="bibr" rid="ref-19">19</xref>]. In GRU networks, the cell structure has a hidden state comparable to that of the LSTM. Intuitively, the input and forget gates of the LSTM are combined into a single update gate in the GRU; the update gate determines how much of the information from the preceding time steps is stored at the present time, while the reset gate determines how the new input is combined with the memory of prior time steps. The GRU therefore has one gate fewer than the LSTM. Besides, the cell and hidden states of the LSTM are combined into a single hidden state in the GRU. As a result, GRU networks have fewer parameters, train more quickly, and need less information to generalize efficiently. <xref ref-type="fig" rid="fig-2">Fig. 2</xref> illustrates the framework of the GRU. The computation equations of the GRU are as follows.</p>
<p><disp-formula id="eqn-18"><label>(18)</label><mml:math id="mml-eqn-18" display="block"><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>&#x03C3;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>W</mml:mi><mml:mrow><mml:mi>z</mml:mi></mml:mrow></mml:msub><mml:mo>&#x22C5;</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>]</mml:mo></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:math></disp-formula></p>
<p><disp-formula id="eqn-19"><label>(19)</label><mml:math id="mml-eqn-19" display="block"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>&#x03C3;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>W</mml:mi><mml:mrow><mml:mi>r</mml:mi></mml:mrow></mml:msub><mml:mo>&#x22C5;</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>]</mml:mo></mml:mrow><mml:mo stretchy="false">)</mml:mo><mml:mo>,</mml:mo></mml:math></disp-formula></p>
<p><disp-formula id="eqn-20"><label>(20)</label><mml:math id="mml-eqn-20" display="block"><mml:msub><mml:mover><mml:mi>h</mml:mi><mml:mo accent="false">&#x00AF;</mml:mo></mml:mover><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>tanh</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mi>W</mml:mi><mml:mo>&#x22C5;</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mi>r</mml:mi><mml:mo>&#x2217;</mml:mo><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>]</mml:mo></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math></disp-formula></p>
<p><disp-formula id="eqn-21"><label>(21)</label><mml:math id="mml-eqn-21" display="block"><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo><mml:mo>&#x2217;</mml:mo><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2217;</mml:mo><mml:mover><mml:mi>h</mml:mi><mml:mo accent="false">&#x00AF;</mml:mo></mml:mover><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<p><xref ref-type="disp-formula" rid="eqn-4">Eqs. (4)</xref> and <xref ref-type="disp-formula" rid="eqn-5">(5)</xref> demonstrate that updating gate <inline-formula id="ieqn-39"><mml:math id="mml-ieqn-39"><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> and reset gate <inline-formula id="ieqn-40"><mml:math id="mml-ieqn-40"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> are computed from GRU neurons. <inline-formula id="ieqn-41"><mml:math id="mml-ieqn-41"><mml:msub><mml:mi>W</mml:mi><mml:mrow><mml:mi>z</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> signifies the weight of <inline-formula id="ieqn-42"><mml:math id="mml-ieqn-42"><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo></mml:math></inline-formula> <inline-formula id="ieqn-43"><mml:math id="mml-ieqn-43"><mml:msub><mml:mi>W</mml:mi><mml:mrow><mml:mi>r</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> implies the weight of <inline-formula id="ieqn-44"><mml:math id="mml-ieqn-44"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-45"><mml:math id="mml-ieqn-45"><mml:mn>0</mml:mn></mml:math></inline-formula> stands for the sigmoid function. The innermost time <inline-formula id="ieqn-46"><mml:math id="mml-ieqn-46"><mml:mo stretchy="false">[</mml:mo><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">]</mml:mo></mml:math></inline-formula> defines the sum of vectors <inline-formula id="ieqn-47"><mml:math id="mml-ieqn-47"><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-48"><mml:math id="mml-ieqn-48"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>. A superior value of <inline-formula id="ieqn-49"><mml:math id="mml-ieqn-49"><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> refers to the fact that further data is continued by current cell, but lesser to the preceding cells. <inline-formula id="ieqn-50"><mml:math id="mml-ieqn-50"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> refers to the fact that once the value of formula is equivalent to <inline-formula id="ieqn-51"><mml:math id="mml-ieqn-51"><mml:mn>0</mml:mn></mml:math></inline-formula>, the data in the preceding cells is discarded. <xref ref-type="disp-formula" rid="eqn-6">Eqs. (6)</xref> and <xref ref-type="disp-formula" rid="eqn-7">(7)</xref> illustrate the computation of pending resultant value, <inline-formula id="ieqn-52"><mml:math id="mml-ieqn-52"><mml:mover><mml:mi>h</mml:mi><mml:mo accent="false">&#x00AF;</mml:mo></mml:mover></mml:math></inline-formula> and last resultant value <inline-formula id="ieqn-53"><mml:math id="mml-ieqn-53"><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of GRU-NN. 
<inline-formula id="ieqn-54"><mml:math id="mml-ieqn-54"><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> stands for resultant in preceding cells, <inline-formula id="ieqn-55"><mml:math id="mml-ieqn-55"><mml:mi>W</mml:mi></mml:math></inline-formula> implies the weight of <inline-formula id="ieqn-56"><mml:math id="mml-ieqn-56"><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-57"><mml:math id="mml-ieqn-57"><mml:mi>tanh</mml:mi></mml:math></inline-formula> implies the hyperbolic tangent function. <inline-formula id="ieqn-58"><mml:math id="mml-ieqn-58"><mml:msub><mml:mover><mml:mi>h</mml:mi><mml:mo accent="false">&#x00AF;</mml:mo></mml:mover><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> is attained by multiplying <inline-formula id="ieqn-59"><mml:math id="mml-ieqn-59"><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> of preceding cells by <inline-formula id="ieqn-60"><mml:math id="mml-ieqn-60"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, plus <inline-formula id="ieqn-61"><mml:math id="mml-ieqn-61"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, multiplying by <inline-formula id="ieqn-62"><mml:math id="mml-ieqn-62"><mml:mi>W</mml:mi></mml:math></inline-formula>, and utilizing the hyperbolic tangent functions. <inline-formula id="ieqn-63"><mml:math id="mml-ieqn-63"><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> stands for the sum of two vectors in which one is attained by multiplying <inline-formula id="ieqn-64"><mml:math id="mml-ieqn-64"><mml:mn>1</mml:mn><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> by <inline-formula id="ieqn-65"><mml:math id="mml-ieqn-65"><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>n</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> and the another one is attained by multiplying <inline-formula id="ieqn-66"><mml:math id="mml-ieqn-66"><mml:msub><mml:mi>z</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> by <inline-formula id="ieqn-67"><mml:math id="mml-ieqn-67"><mml:msub><mml:mover><mml:mi>h</mml:mi><mml:mo accent="false">&#x00AF;</mml:mo></mml:mover><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub><mml:mo>.</mml:mo></mml:math></inline-formula></p>
<fig id="fig-2">
<label>Figure 2</label>
<caption>
<title>GRU structure</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-2.png"/>
</fig>
</sec>
</sec>
<sec id="s4">
<label>4</label>
<title>Experimental Validation</title>
<p>The proposed ODL-FDRDC technique was experimentally validated using the MESSIDOR dataset, which contains a total of 1200 retinal fundus images captured at three ophthalmologic departments. The results of the proposed ODL-FDRDC technique were inspected under distinct numbers of Hidden Layers (HLs). A few sample images are shown in <xref ref-type="fig" rid="fig-3">Fig. 3</xref>.</p>
<p><xref ref-type="table" rid="table-1">Tab. 1</xref> provides the results for overall DR classification analysis, accomplished by ODL-FDRDC technique under distinct HLs. With an HL of 10, the proposed ODL-FDRDC technique classified the class 0 with a sensitivity (<inline-formula id="ieqn-68"><mml:math id="mml-ieqn-68"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>) of 0.9945, <inline-formula id="ieqn-69"><mml:math id="mml-ieqn-69"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9939, <inline-formula id="ieqn-70"><mml:math id="mml-ieqn-70"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9942, <inline-formula id="ieqn-71"><mml:math id="mml-ieqn-71"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9927, and an <inline-formula id="ieqn-72"><mml:math id="mml-ieqn-72"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9936. Next, the presented ODL-FDRDC technique identified class 1 with a sensitivity (<inline-formula id="ieqn-73"><mml:math id="mml-ieqn-73"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>) of 0.9935, <inline-formula id="ieqn-74"><mml:math id="mml-ieqn-74"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9971, <inline-formula id="ieqn-75"><mml:math id="mml-ieqn-75"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9967, <inline-formula id="ieqn-76"><mml:math id="mml-ieqn-76"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9806, and an <inline-formula id="ieqn-77"><mml:math id="mml-ieqn-77"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula> of 0.987. 
In line with this, ODL-FDRDC technique recognized class 2 with a sensitivity (<inline-formula id="ieqn-78"><mml:math id="mml-ieqn-78"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>) of 0.9878, <inline-formula id="ieqn-79"><mml:math id="mml-ieqn-79"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9937, <inline-formula id="ieqn-80"><mml:math id="mml-ieqn-80"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9925, <inline-formula id="ieqn-81"><mml:math id="mml-ieqn-81"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9758, and an <inline-formula id="ieqn-82"><mml:math id="mml-ieqn-82"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9817.</p>
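<p>For reference, per-class figures of this kind are conventionally computed one-vs-rest from the multi-class confusion matrix; the generic sketch below (not the authors&#x2019; evaluation code) shows how each measure is derived.</p>
<preformat>
import numpy as np

def per_class_metrics(cm):
    """One-vs-rest metrics for each class of a confusion matrix `cm`,
    where cm[i, j] counts samples of true class i predicted as class j."""
    n = cm.sum()
    results = {}
    for k in range(cm.shape[0]):
        tp = cm[k, k]
        fn = cm[k, :].sum() - tp
        fp = cm[:, k].sum() - tp
        tn = n - tp - fn - fp
        sens = tp / (tp + fn)                 # sensitivity (recall)
        spec = tn / (tn + fp)                 # specificity
        acc = (tp + tn) / n                   # one-vs-rest accuracy
        prec = tp / (tp + fp)                 # precision
        f1 = 2 * prec * sens / (prec + sens)  # F-score
        results[k] = (sens, spec, acc, prec, f1)
    return results
</preformat>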
<p><xref ref-type="fig" rid="fig-4">Fig. 4</xref> depicts the results of average DR detection analysis, accomplished by ODL-FDRDC technique. The results showcase the effective outcomes of the proposed method under distinct HLs. For instance, with 10 HLs, ODL-FDRDC technique obtained an average <inline-formula id="ieqn-83"><mml:math id="mml-ieqn-83"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9870, <inline-formula id="ieqn-84"><mml:math id="mml-ieqn-84"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9959, <inline-formula id="ieqn-85"><mml:math id="mml-ieqn-85"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9942, <inline-formula id="ieqn-86"><mml:math id="mml-ieqn-86"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9863, and an <inline-formula id="ieqn-87"><mml:math id="mml-ieqn-87"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9866. Meanwhile, with 20 HLs, the proposed ODL-FDRDC technique attained an average <inline-formula id="ieqn-88"><mml:math id="mml-ieqn-88"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9851, <inline-formula id="ieqn-89"><mml:math id="mml-ieqn-89"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9955, <inline-formula id="ieqn-90"><mml:math id="mml-ieqn-90"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9942, <inline-formula id="ieqn-91"><mml:math id="mml-ieqn-91"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9891, and an <inline-formula id="ieqn-92"><mml:math id="mml-ieqn-92"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9871. 
Eventually, with 30 HLs, the proposed ODL-FDRDC technique offered an average <inline-formula id="ieqn-93"><mml:math id="mml-ieqn-93"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9872, <inline-formula id="ieqn-94"><mml:math id="mml-ieqn-94"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9963, <inline-formula id="ieqn-95"><mml:math id="mml-ieqn-95"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9950, <inline-formula id="ieqn-96"><mml:math id="mml-ieqn-96"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9900, and an <inline-formula id="ieqn-97"><mml:math id="mml-ieqn-97"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9886. Lastly, with 40 HLs, the presented ODL-FDRDC technique gained an average <inline-formula id="ieqn-98"><mml:math id="mml-ieqn-98"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9888, <inline-formula id="ieqn-99"><mml:math id="mml-ieqn-99"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9966, <inline-formula id="ieqn-100"><mml:math id="mml-ieqn-100"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>u</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9950, <inline-formula id="ieqn-101"><mml:math id="mml-ieqn-101"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9871, and an <inline-formula id="ieqn-102"><mml:math id="mml-ieqn-102"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula> of 0.9880.</p>
<fig id="fig-3">
<label>Figure 3</label>
<caption>
<title>Sample images</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-3.png"/>
</fig>
<table-wrap id="table-1">
<label>Table 1</label>
<caption>
<title>Results of the analysis of ODL-FDRDC technique under different HLs</title>
</caption>
<table frame="hsides" >
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th>No. of hidden layers</th>
<th>Methods</th>
<th>Sensitivity</th>
<th>Specificity</th>
<th>Accuracy</th>
<th>Precision</th>
<th>F-Score</th>
</tr>
</thead>
<tbody>
<tr>
<td rowspan="5">HL&#x2013;10</td>
<td>0</td>
<td>0.9945</td>
<td>0.9939</td>
<td>0.9942</td>
<td>0.9927</td>
<td>0.9936</td>
</tr>
<tr>
<td>1</td>
<td>0.9935</td>
<td>0.9971</td>
<td>0.9967</td>
<td>0.9806</td>
<td>0.987</td>
</tr>
<tr>
<td>2</td>
<td>0.9878</td>
<td>0.9937</td>
<td>0.9925</td>
<td>0.9758</td>
<td>0.9817</td>
</tr>
<tr>
<td>3</td>
<td>0.9723</td>
<td>0.9989</td>
<td>0.9933</td>
<td>0.996</td>
<td>0.984</td>
</tr>
<tr>
<td>Average</td>
<td>0.987</td>
<td>0.9959</td>
<td>0.9942</td>
<td>0.9863</td>
<td>0.9866</td>
</tr>
<tr>
<td rowspan="5">HL&#x2013;20</td>
<td>0</td>
<td>0.9945</td>
<td>0.9892</td>
<td>0.9917</td>
<td>0.9873</td>
<td>0.9909</td>
</tr>
<tr>
<td>1</td>
<td>0.9739</td>
<td>0.999</td>
<td>0.9958</td>
<td>0.9933</td>
<td>0.9835</td>
</tr>
<tr>
<td>2</td>
<td>0.9959</td>
<td>0.9948</td>
<td>0.995</td>
<td>0.9799</td>
<td>0.9879</td>
</tr>
<tr>
<td>3</td>
<td>0.9763</td>
<td>0.9989</td>
<td>0.9942</td>
<td>0.996</td>
<td>0.986</td>
</tr>
<tr>
<td>Average</td>
<td>0.9851</td>
<td>0.9955</td>
<td>0.9942</td>
<td>0.9891</td>
<td>0.9871</td>
</tr>
<tr>
<td rowspan="5">HL&#x2013;30</td>
<td>0</td>
<td>0.9964</td>
<td>0.9923</td>
<td>0.9942</td>
<td>0.9909</td>
<td>0.9936</td>
</tr>
<tr>
<td>1</td>
<td>0.9804</td>
<td>0.999</td>
<td>0.9967</td>
<td>0.9934</td>
<td>0.9868</td>
</tr>
<tr>
<td>2</td>
<td>0.9878</td>
<td>0.9948</td>
<td>0.9933</td>
<td>0.9798</td>
<td>0.9837</td>
</tr>
<tr>
<td>3</td>
<td>0.9842</td>
<td>0.9989</td>
<td>0.9958</td>
<td>0.996</td>
<td>0.9901</td>
</tr>
<tr>
<td>Average</td>
<td>0.9872</td>
<td>0.9963</td>
<td>0.995</td>
<td>0.99</td>
<td>0.9886</td>
</tr>
<tr>
<td rowspan="5">HL&#x2013;40</td>
<td>0</td>
<td>0.9909</td>
<td>0.9954</td>
<td>0.9933</td>
<td>0.9945</td>
<td>0.9927</td>
</tr>
<tr>
<td>1</td>
<td>0.9804</td>
<td>0.9962</td>
<td>0.9942</td>
<td>0.974</td>
<td>0.9772</td>
</tr>
<tr>
<td>2</td>
<td>0.9918</td>
<td>0.9969</td>
<td>0.9958</td>
<td>0.9878</td>
<td>0.9898</td>
</tr>
<tr>
<td>3</td>
<td>0.9921</td>
<td>0.9979</td>
<td>0.9967</td>
<td>0.9921</td>
<td>0.9921</td>
</tr>
<tr>
<td>Average</td>
<td>0.9888</td>
<td>0.9966</td>
<td>0.995</td>
<td>0.9871</td>
<td>0.988</td>
</tr>
<tr>
<td rowspan="5">HL&#x2013;50</td>
<td>0</td>
<td>0.9909</td>
<td>0.9954</td>
<td>0.9933</td>
<td>0.9945</td>
<td>0.9927</td>
</tr>
<tr>
<td>1</td>
<td>0.9739</td>
<td>0.9943</td>
<td>0.9917</td>
<td>0.9613</td>
<td>0.9675</td>
</tr>
<tr>
<td>2</td>
<td>0.9918</td>
<td>0.9948</td>
<td>0.9942</td>
<td>0.9798</td>
<td>0.9858</td>
</tr>
<tr>
<td>3</td>
<td>0.9802</td>
<td>0.9979</td>
<td>0.9942</td>
<td>0.992</td>
<td>0.9861</td>
</tr>
<tr>
<td>Average</td>
<td>0.9842</td>
<td>0.9956</td>
<td>0.9933</td>
<td>0.9819</td>
<td>0.983</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>ROC analysis was conducted for the ODL-FDRDC technique on the test DR dataset and the results are shown in <xref ref-type="fig" rid="fig-5">Fig. 5</xref>. The results indicate the enhanced classification performance of the proposed ODL-FDRDC technique, with a high ROC value of 99.9164%.</p>
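<p>A multi-class ROC value of this kind can be computed from the classifier&#x2019;s per-class probability estimates, for example with scikit-learn. The snippet below is an illustrative sketch: the variables <monospace>y_true</monospace> and <monospace>y_prob</monospace> are assumed placeholders, and the authors&#x2019; exact evaluation pipeline is not specified.</p>
<preformat>
from sklearn.metrics import roc_auc_score

# y_true: integer DR grade labels of the test images.
# y_prob: per-class probabilities from the GRU classifier,
#         with shape (n_samples, n_classes).
auc = roc_auc_score(y_true, y_prob, multi_class="ovr", average="macro")
print(f"ROC-AUC: {auc:.6f}")
</preformat>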
<fig id="fig-4">
<label>Figure 4</label>
<caption>
<title>Average analysis results of ODL-FDRDC technique under different measures</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-4a.png"/>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-4b.png"/>
</fig>
<fig id="fig-5">
<label>Figure 5</label>
<caption>
<title>ROC analysis results of ODL-FDRDC technique</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-5.png"/>
</fig>
<p><xref ref-type="table" rid="table-2">Tab. 2</xref> illustrates the results of comparative analysis, accomplished by ODL-FDRDC technique against existing methods under various measures. <xref ref-type="fig" rid="fig-6">Fig. 6</xref> demonstrates the <inline-formula id="ieqn-103"><mml:math id="mml-ieqn-103"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> analysis outcomes of the proposed ODL-FDRDC technique against recent methods. According to the experimental results, AlexNet model attained the least performance with a <inline-formula id="ieqn-104"><mml:math id="mml-ieqn-104"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 58.70%. Followed by, SqueezeNet, VGG-16, and VGG-19 models achieved low <inline-formula id="ieqn-105"><mml:math id="mml-ieqn-105"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> values such as 73.70%, 73.30%, and 73.50% respectively. In line with this, ResNet-18 and ResNet-50 models reached considerable <inline-formula id="ieqn-106"><mml:math id="mml-ieqn-106"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> values such as 97.50% and 98.30% respectively. However, the proposed ODL-FDRDC technique produced the highest <inline-formula id="ieqn-107"><mml:math id="mml-ieqn-107"><mml:mi>s</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 98.42%.</p>
<table-wrap id="table-2">
<label>Table 2</label>
<caption>
<title>Comparative analysis results of ODL-FDRDC technique against existing methods</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th>Methods</th>
<th>Sensitivity</th>
<th>Specificity</th>
<th>Accuracy</th>
</tr>
</thead>
<tbody>
<tr>
<td>AlexNet</td>
<td>58.70</td>
<td>83.30</td>
<td>70.00</td>
</tr>
<tr>
<td>SqueezeNet</td>
<td>73.70</td>
<td>83.80</td>
<td>81.80</td>
</tr>
<tr>
<td>VGG-16</td>
<td>73.30</td>
<td>85.70</td>
<td>84.50</td>
</tr>
<tr>
<td>VGG-19</td>
<td>73.50</td>
<td>86.10</td>
<td>79.80</td>
</tr>
<tr>
<td>ResNet-18</td>
<td>97.50</td>
<td>91.20</td>
<td>90.40</td>
</tr>
<tr>
<td>ResNet-50</td>
<td>98.30</td>
<td>94.50</td>
<td>92.40</td>
</tr>
<tr>
<td>ODL-FDRDC</td>
<td>98.42</td>
<td>99.56</td>
<td>99.33</td>
</tr>
</tbody>
</table>
</table-wrap>
<fig id="fig-6">
<label>Figure 6</label>
<caption>
<title><inline-formula id="ieqn-118"><mml:math id="mml-ieqn-118"><mml:mi>S</mml:mi><mml:mi>e</mml:mi><mml:mi>n</mml:mi><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> analysis results of ODL-FDRDC technique against existing methods</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-6.png"/>
</fig>
<p><xref ref-type="fig" rid="fig-7">Fig. 7</xref> illustrates the <inline-formula id="ieqn-108"><mml:math id="mml-ieqn-108"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> analysis results of the presented ODL-FDRDC approach against recent algorithms. The experimental outcomes reveal that AlexNet method obtained the least performance with a <inline-formula id="ieqn-109"><mml:math id="mml-ieqn-109"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 83.30%. Then, SqueezeNet, VGG-16, and VGG-19 techniques achieved low <inline-formula id="ieqn-110"><mml:math id="mml-ieqn-110"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> values such as 83.80%, 85.70%, and 86.10% correspondingly. Also, ResNet-18 and ResNet-50 methodologies obtained considerable <inline-formula id="ieqn-111"><mml:math id="mml-ieqn-111"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> values like 91.20% and 94.50% correspondingly. At last, the proposed ODL-FDRDC method produced a superior <inline-formula id="ieqn-112"><mml:math id="mml-ieqn-112"><mml:mi>s</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 99.56%.</p>
<p><xref ref-type="fig" rid="fig-8">Fig. 8</xref> depicts the <inline-formula id="ieqn-113"><mml:math id="mml-ieqn-113"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> analysis results, accomplished by ODL-FDRDC technique as well as other recent methods. The experimental outcomes reveal that AlexNet method attained a minimal performance with an <inline-formula id="ieqn-114"><mml:math id="mml-ieqn-114"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 70%. Likewise, SqueezeNet, VGG-16, and VGG-19 methods reached lower <inline-formula id="ieqn-115"><mml:math id="mml-ieqn-115"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> values such as 81.80%, 84.50%, and 79.80% respectively. In addition, ResNet-18 and ResNet-50 techniques attained considerable <inline-formula id="ieqn-116"><mml:math id="mml-ieqn-116"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> values such as 90.40% and 92.40% respectively. Eventually, the proposed ODL-FDRDC system accomplished the highest <inline-formula id="ieqn-117"><mml:math id="mml-ieqn-117"><mml:mi>a</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> of 99.33%.</p>
<fig id="fig-7">
<label>Figure 7</label>
<caption>
<title><inline-formula id="ieqn-119"><mml:math id="mml-ieqn-119"><mml:mi>S</mml:mi><mml:mi>p</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> analysis results of ODL-FDRDC technique against existing methods</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-7.png"/>
</fig>
<fig id="fig-8">
<label>Figure 8</label>
<caption>
<title><inline-formula id="ieqn-120"><mml:math id="mml-ieqn-120"><mml:mi>A</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>y</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> analysis results of ODL-FDRDC technique against existing methods</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26729-fig-8.png"/>
</fig>
<p>Based on the results and discussion above, it is evident that the proposed ODL-FDRDC technique is a superior performer, as it produced the maximum DR classification performance among the compared techniques.</p>
</sec>
<sec id="s5">
<label>5</label>
<title>Conclusion</title>
<p>In this study, a novel ODL-FDRDC technique has been developed to identify and categorize different grades of DR using retinal fundus images. The proposed ODL-FDRDC technique encompasses preprocessing, region-growing segmentation, fusion-based feature extraction, COA-based hyperparameter optimization, and GRU-based classification. The hyperparameters of the fusion models are tuned via COA. The proposed ODL-FDRDC technique was experimentally validated on the benchmark MESSIDOR dataset and the results were examined under different measures. The outcomes indicate that the proposed ODL-FDRDC technique is a superior performer compared to existing methodologies. Therefore, the ODL-FDRDC technique can be used as an effective tool for diagnosis in real-time scenarios. In the future, DL-based instance segmentation techniques can be designed to improve DR classification outcomes.</p>
</sec>
</body>
<back>
<fn-group>
<fn fn-type="other"><p><bold>Funding Statement:</bold> This Research was funded by the Deanship of Scientific Research at University of Business and Technology, Saudi Arabia.</p></fn>
<fn fn-type="conflict"><p><bold>Conflicts of Interest:</bold> The authors declare that they have no conflicts of interest to report regarding the present study.</p></fn>
</fn-group>
<ref-list content-type="authoryear">
<title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>C.</given-names> <surname>Sabanayagam</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Banu</surname></string-name>, <string-name><given-names>M. L.</given-names> <surname>Chee</surname></string-name>, <string-name><given-names>R. L.</given-names> <surname>Chee</surname></string-name>, <string-name><given-names>Y. X.</given-names> <surname>Wang</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Incidence and progression of diabetic retinopathy: A systematic review</article-title>,&#x201D; <source>The Lancet Diabetes &#x0026; Endocrinology</source>, vol. <volume>7</volume>, no. <issue>2</issue>, pp. <fpage>140</fpage>&#x2013;<lpage>149</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>L. P.</given-names> <surname>Cunha</surname></string-name>, <string-name><given-names>E. A.</given-names> <surname>Figueiredo</surname></string-name>, <string-name><given-names>H. P.</given-names> <surname>Ara&#x00FA;jo</surname></string-name>, <string-name><given-names>L. V. F. C.</given-names> <surname>Cunha</surname></string-name>, <string-name><given-names>C. F.</given-names> <surname>Costa</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Non-mydriatic fundus retinography in screening for diabetic retinopathy: Agreement between family physicians, general ophthalmologists, and a retinal specialist</article-title>,&#x201D; <source>Frontiers in Endocrinology</source>, vol. <volume>9</volume>, pp. <fpage>251</fpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>N.</given-names> <surname>Salamat</surname></string-name>, <string-name><given-names>M. M. S.</given-names> <surname>Missen</surname></string-name> and <string-name><given-names>A.</given-names> <surname>Rashid</surname></string-name></person-group>, &#x201C;<article-title>Diabetic retinopathy techniques in retinal images: A review</article-title>,&#x201D; <source>Artificial Intelligence in Medicine</source>, vol. <volume>97</volume>, no. <issue>Supplement C</issue>, pp. <fpage>168</fpage>&#x2013;<lpage>188</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D. S. W.</given-names> <surname>Ting</surname></string-name>, <string-name><given-names>G. C. M.</given-names> <surname>Cheung</surname></string-name> and <string-name><given-names>T. Y.</given-names> <surname>Wong</surname></string-name></person-group>, &#x201C;<article-title>Diabetic retinopathy: Global prevalence, major risk factors, screening practices and public health challenges: A review: Global burden of diabetic eye diseases</article-title>,&#x201D; <source>Clinical &#x0026; Experimental Ophthalmology</source>, vol. <volume>44</volume>, no. <issue>4</issue>, pp. <fpage>260</fpage>&#x2013;<lpage>277</lpage>, <year>2016</year>.</mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Soares</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Neves</surname></string-name>, <string-name><given-names>I. P.</given-names> <surname>Marques</surname></string-name>, <string-name><given-names>I.</given-names> <surname>Pires</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Schwartz</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Comparison of diabetic retinopathy classification using fluorescein angiography and optical coherence tomography angiography</article-title>,&#x201D; <source>British Journal of Ophthalmology</source>, vol. <volume>101</volume>, no. <issue>1</issue>, pp. <fpage>62</fpage>&#x2013;<lpage>68</lpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Yin</surname></string-name>, <string-name><given-names>G.</given-names> <surname>Cao</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Wei</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Zheng</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Hierarchical retinal blood vessel segmentation based on feature and ensemble learning</article-title>,&#x201D; <source>Neurocomputing</source>, vol. <volume>149</volume>, pp. <fpage>708</fpage>&#x2013;<lpage>717</lpage>, <year>2015</year>.</mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>J.</given-names> <surname>Sahlsten</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Jaskari</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Kivinen</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Turunen</surname></string-name>, <string-name><given-names>E.</given-names> <surname>Jaanio</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Deep learning fundus image analysis for diabetic retinopathy and macular edema grading</article-title>,&#x201D; <source>Scientific Reports</source>, vol. <volume>9</volume>, no. <issue>1</issue>, pp. <fpage>10750</fpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Lahmiri</surname></string-name> and <string-name><given-names>A.</given-names> <surname>Shmuel</surname></string-name></person-group>, &#x201C;<article-title>Variational mode decomposition based approach for accurate classification of color fundus images with hemorrhages</article-title>,&#x201D; <source>Optics &#x0026; Laser Technology</source>, vol. <volume>96</volume>, pp. <fpage>243</fpage>&#x2013;<lpage>248</lpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>C.</given-names> <surname>Lam</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Yu</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Huang</surname></string-name> and <string-name><given-names>D.</given-names> <surname>Rubin</surname></string-name></person-group>, &#x201C;<article-title>Retinal lesion detection with deep learning using image patches</article-title>,&#x201D; <source>Investigative Ophthalmology &#x0026; Visual Science</source>, vol. <volume>59</volume>, no. <issue>1</issue>, pp. <fpage>590</fpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Ortiz</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Munilla</surname></string-name>, <string-name><given-names>J. M.</given-names> <surname>G&#x00F3;rriz</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Ram&#x00ED;rez</surname></string-name></person-group>, &#x201C;<article-title>Ensembles of deep learning architectures for the early diagnosis of the alzheimer&#x2019;s disease</article-title>,&#x201D; <source>International Journal of Neural Systems</source>, vol. <volume>26</volume>, no. <issue>7</issue>, pp. <fpage>1650025</fpage>, <year>2016</year>.</mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Qummar</surname></string-name>, <string-name><given-names>F. G.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Shah</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Shamshirband</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>A deep learning ensemble approach for diabetic retinopathy detection</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>7</volume>, pp. <fpage>150530</fpage>&#x2013;<lpage>150539</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>E.</given-names> <surname>Beede</surname></string-name>, <string-name><given-names>E.</given-names> <surname>Baylor</surname></string-name>, <string-name><given-names>F.</given-names> <surname>Hersch</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Iurchenko</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Wilcox</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>A human-centered evaluation of a deep learning system deployed in clinics for the detection of diabetic retinopathy</article-title>,&#x201D; in <conf-name>Proc. of the 2020 CHI Conf. on Human Factors in Computing Systems</conf-name>, <conf-loc>Honolulu, HI, USA</conf-loc>, pp. <fpage>1</fpage>&#x2013;<lpage>12</lpage>, <year>2020</year>. </mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T. R.</given-names> <surname>Gadekallu</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Khare</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Bhattacharya</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Singh</surname></string-name>, <string-name><given-names>P. K. R.</given-names> <surname>Maddikunta</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Early detection of diabetic retinopathy using pca-firefly based deep learning model</article-title>,&#x201D; <source>Electronics</source>, vol. <volume>9</volume>, no. <issue>2</issue>, pp. <fpage>274</fpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>F.</given-names> <surname>Li</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Xu</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Dong</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Yan</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Deep learning-based automated detection for diabetic retinopathy and diabetic macular oedema in retinal fundus photographs</article-title>,&#x201D; <source>Eye</source>, vol. <volume>39</volume>, pp. <fpage>1483</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>F. J. M.</given-names> <surname>Murcia</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Ortiz</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Ram&#x00ED;rez</surname></string-name>, <string-name><given-names>J. M.</given-names> <surname>G&#x00F3;rriz</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Cruz</surname></string-name></person-group>, &#x201C;<article-title>Deep residual transfer learning for automatic diagnosis and grading of diabetic retinopathy</article-title>,&#x201D; <source>Neurocomputing</source>, vol. <volume>452</volume>, no. <issue>2</issue>, pp. <fpage>424</fpage>&#x2013;<lpage>434</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>X.</given-names> <surname>Jiang</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Li</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Liu</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>FCN: Comparative performance evaluation for image classification</article-title>,&#x201D; <source>International Journal of Machine Learning and Computing</source>, vol. <volume>9</volume>, no. <issue>6</issue>, pp. <fpage>840</fpage>&#x2013;<lpage>848</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Sandler</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Howard</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Zhu</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Zhmoginov</surname></string-name> and <string-name><given-names>L. C.</given-names> <surname>Chen</surname></string-name></person-group>, &#x201C;<article-title>MobileNetV2: Inverted residuals and linear bottlenecks</article-title>,&#x201D; in <conf-name>2018 IEEE/CVF Conf. on Computer Vision and Pattern Recognition</conf-name>, <publisher-loc>Salt Lake City, UT</publisher-loc>, pp. <fpage>4510</fpage>&#x2013;<lpage>4520</lpage>, <year>2018</year>. </mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>J.</given-names> <surname>Pierezan</surname></string-name> and <string-name><given-names>L.</given-names> <surname>Dos Santos Coelho</surname></string-name></person-group>, &#x201C;<article-title>Coyote optimization algorithm: A new metaheuristic for global optimization problems</article-title>,&#x201D; in <conf-name>2018 IEEE Congress on Evolutionary Computation (CEC)</conf-name>, Rio de Janeiro, Brazil, pp. <fpage>1</fpage>&#x2013;<lpage>8</lpage>, <year>2018</year>. </mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>R.</given-names> <surname>Dey</surname></string-name> and <string-name><given-names>F. M.</given-names> <surname>Salem</surname></string-name></person-group>, &#x201C;<article-title>Gate-variants of Gated Recurrent Unit (GRU) neural networks</article-title>,&#x201D; in <conf-name>2017 IEEE 60th Int. Midwest Symp. on Circuits and Systems (MWSCAS)</conf-name>, <publisher-loc>Boston, MA</publisher-loc>, pp. <fpage>1597</fpage>&#x2013;<lpage>1600</lpage>, <year>2017</year>. </mixed-citation></ref>
</ref-list>
</back>
</article>
