<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">CMC</journal-id>
<journal-id journal-id-type="nlm-ta">CMC</journal-id>
<journal-id journal-id-type="publisher-id">CMC</journal-id>
<journal-title-group>
<journal-title>Computers, Materials &#x0026; Continua</journal-title>
</journal-title-group>
<issn pub-type="epub">1546-2226</issn>
<issn pub-type="ppub">1546-2218</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">26780</article-id>
<article-id pub-id-type="doi">10.32604/cmc.2022.026780</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Hybrid Metaheuristics Based License Plate Character Recognition in Smart City</article-title>
<alt-title alt-title-type="left-running-head">Hybrid Metaheuristics Based License Plate Character Recognition in Smart City</alt-title>
<alt-title alt-title-type="right-running-head">Hybrid Metaheuristics Based License Plate Character Recognition in Smart City</alt-title>
</title-group>
<contrib-group content-type="authors">
<contrib id="author-1" contrib-type="author">
<name name-style="western"><surname>AlQaralleh</surname><given-names>Esam A.</given-names>
</name><xref ref-type="aff" rid="aff-1">1</xref></contrib>
<contrib id="author-2" contrib-type="author">
<name name-style="western"><surname>Aldhaban</surname><given-names>Fahad</given-names>
</name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-3" contrib-type="author">
<name name-style="western"><surname>Nasseif</surname><given-names>Halah</given-names>
</name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-4" contrib-type="author" corresp="yes">
<name name-style="western"><surname>Alqaralleh</surname><given-names>Bassam A.Y.</given-names>
</name><xref ref-type="aff" rid="aff-2">2</xref><email>b.alqaralleh@ubt.edu.sa</email>
</contrib>
<contrib id="author-5" contrib-type="author">
<name name-style="western"><surname>AbuKhalil</surname><given-names>Tamer</given-names>
</name><xref ref-type="aff" rid="aff-3">3</xref></contrib>
<aff id="aff-1"><label>1</label><institution>School of Engineering, Princess Sumaya University for Technology</institution>, <addr-line>Amman, 11941</addr-line>, <country>Jordan</country></aff>
<aff id="aff-2"><label>2</label><institution>MIS Department, College of Business Administration, University of Business and Technology</institution>, <addr-line>Jeddah, 21448</addr-line>, <country>Saudi Arabia</country></aff>
<aff id="aff-3"><label>3</label><institution>Department of Computer Science, Faculty of Information Technology, Al-Hussein Bin Talal University</institution>, <addr-line>Ma&#x0027;an, 71111</addr-line>, <country>Jordan</country></aff>
</contrib-group>
<author-notes>
<corresp id="cor1"><label>&#x002A;</label>Corresponding Author: Bassam A.Y. Alqaralleh. Email: <email>b.alqaralleh@ubt.edu.sa</email></corresp>
</author-notes>
<pub-date pub-type="epub" date-type="pub" iso-8601-date="2022-04-20"><day>20</day>
<month>04</month>
<year>2022</year></pub-date>
<volume>72</volume>
<issue>3</issue>
<fpage>5727</fpage>
<lpage>5740</lpage>
<history>
<date date-type="received">
<day>04</day>
<month>1</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>14</day>
<month>2</month>
<year>2022</year>
</date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2022 AlQaralleh et al.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>AlQaralleh et al.</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_CMC_26780.pdf"></self-uri>
<abstract>
<p>Recent technological advancements have been used to improve the quality of living in smart cities. At the same time, automated detection of vehicles can be utilized to reduce the crime rate and improve public security. On the other hand, automatic identification of vehicle license plate (LP) characters becomes an essential process for recognizing vehicles in real-time scenarios, which can be achieved by exploiting optimal deep learning (DL) approaches. In this article, a novel hybrid metaheuristic optimization based deep learning model for automated license plate character recognition (HMODL-ALPCR) technique is presented for smart city environments. The major intention of the HMODL-ALPCR technique is to detect LPs and recognize the characters that exist in them. For an effective LP detection process, the mask regional convolutional neural network (Mask-RCNN) model is applied, with Inception with Residual Network (ResNet)-v2 as the baseline network. In addition, hybrid sunflower optimization with butterfly optimization algorithm (HSFO-BOA) is utilized for the hyperparameter tuning of the Inception-ResNetv2 model. Finally, a Tesseract based character recognition model is applied to effectively recognize the characters present in the LPs. The experimental result analysis of the HMODL-ALPCR technique is carried out against a benchmark dataset, and the experimental outcomes point out the improved efficacy of the HMODL-ALPCR technique over recent methods.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>Smart city</kwd>
<kwd>license plate recognition</kwd>
<kwd>optimal deep learning</kwd>
<kwd>metaheuristic algorithms</kwd>
<kwd>parameter tuning</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<label>1</label>
<title>Introduction</title>
<p>Continual urbanization poses difficult problems for the living quality and sustainable development of urban residents in smart cities [<xref ref-type="bibr" rid="ref-1">1</xref>]. The idea of smart cities is to make highly effective use of scarce resources and enhance the quality of public services and citizens' lives [<xref ref-type="bibr" rid="ref-2">2</xref>]. With the growth of embedded Internet of Things (IoT) devices, for example, mobile phone sensors, Radio Frequency Identification (RFID) tags, and actuators, such devices are built into the entire fabric of the urban environment and coupled together [<xref ref-type="bibr" rid="ref-3">3</xref>]. Several smart city applications have been developed and deployed, e.g., smart healthcare, intelligent transportation, public safety, and environment monitoring. A license plate recognition (LPR) system is often a great advantage for parking, traffic, cruise control, and toll management applications [<xref ref-type="bibr" rid="ref-4">4</xref>]. Regarding security management and monitoring of any region or place, the LPR system is utilized as a tracing aid, serving as extra eyes for safety teams. In terms of law and safety enforcement, the LPR system plays an important part in safeguarding borders and monitoring physical intrusion [<xref ref-type="bibr" rid="ref-5">5</xref>]. Different types of LPR systems have been introduced utilizing many smart computation models to attain efficiency and accuracy.</p>
<p>Various recognition approaches have been described that implement many intermediate processing phases during Region of Interest (ROI) extraction. Nonetheless, in fraud situations such as plate replacement and alteration, the LPR system must rely on intelligent methods to remain effective [<xref ref-type="bibr" rid="ref-6">6</xref>]. The first phase of an LPR system is plate localization, which locates the license plate (LP) in the input image. Algorithms such as thresholding or edge detection [<xref ref-type="bibr" rid="ref-7">7</xref>] are applied to the video sequence. The Gabor filter is considered a promising method for plate recognition in RGB images [<xref ref-type="bibr" rid="ref-8">8</xref>], whereas the earlier approaches employ grey-scale conversion to binary images. In addition, histograms generated by vertical and horizontal projection of the input image can identify the ROI and distinguish the plate among multiple objects. The Hough transform is also employed to find the edges bounding the number plate [<xref ref-type="bibr" rid="ref-9">9</xref>].</p>
<p>This paper develops an intelligent hybrid metaheuristic optimization based deep learning model for automated license plate character recognition (HMODL-ALPCR) technique for smart city environments. The HMODL-ALPCR technique applies the mask regional convolutional neural network (Mask-RCNN) model for LP detection, with Inception with Residual Network (ResNet)-v2 as the baseline network. In addition, hybrid sunflower optimization with butterfly optimization algorithm (HSFO-BOA) is utilized for the hyperparameter tuning of the Inception-ResNetv2 model. Finally, a Tesseract based character recognition model is applied to effectively recognize the characters present in the LPs. The experimental result analysis of the HMODL-ALPCR technique is carried out against a benchmark dataset.</p>
</sec>
<sec id="s2">
<label>2</label>
<title>Literature Review</title>
<p>Deep learning (DL), a comparatively young learning model in the CI family, has its roots in Artificial Neural Networks (ANN). It enables computational models composed of multiple processing layers to learn representations of data with multiple levels of abstraction, and it is capable of discovering complex structures in natural data in their raw form without requiring complex feature tuning and engineering [<xref ref-type="bibr" rid="ref-10">10</xref>]. In comparison with conventional ML models, a DL method can develop exceptionally complex functions through layers of nonlinear transformations trainable end to end. In [<xref ref-type="bibr" rid="ref-11">11</xref>], a cascaded DL method was proposed for constructing an effective automatic license plate (ALP) detection and recognition system for vehicles in northern Iraq. The LP in northern Iraq contains a country region, plate number, and city region. Initially, the presented technique uses various pre-processing methods, such as adaptive image contrast enhancement and Gaussian filtering, to make the input image better suited for further processing. Next, a deep semantic segmentation network is utilized to determine the three LP regions in the input images. Segmentation is then performed using a deep encoder-decoder network framework.</p>
<p>Chen [<xref ref-type="bibr" rid="ref-12">12</xref>] resolves the issues of car LP recognition through a YOLO darknet DL architecture. The work employs YOLO with seven convolutional layers to identify a single class. The recognition model is a sliding-window method, and the objective is to identify Taiwanese car LPs. Izidio et al. [<xref ref-type="bibr" rid="ref-13">13</xref>] introduced a method for engineering systems that detect and recognize Brazilian LPs with convolutional neural networks (CNNs) suitable for embedded systems. The resulting system detects the LP in the captured image through the Tiny YOLOv3 framework and recognizes its characters with a second convolutional network trained on synthetic images and fine-tuned with actual LP images. Pustokhina et al. [<xref ref-type="bibr" rid="ref-14">14</xref>] proposed an efficient DL-based VLPR method with optimum K-means (OKM) cluster-based segmentation and CNN based detection. The presented method works in three major phases: LP detection and localization, segmentation with the OKM cluster method, and LP number recognition with the CNN method. In the initial phase, LP detection and localization take place.</p>
</sec>
<sec id="s3">
<label>3</label>
<title>The Proposed Model</title>
<p>In this article, an automated HMODL-ALPCR technique has been presented to detect LPs and recognize the characters that exist in them for smart city environments. The HMODL-ALPCR technique involves Mask-RCNN for the detection of LPs, with Inception with ResNet-v2 as the baseline network. Moreover, the HSFO-BOA is utilized for the hyperparameter tuning of the Inception-ResNetv2 model. Lastly, a Tesseract based character recognition model is applied to effectively recognize the characters present in the LPs.</p>
<sec id="s3_1">
<label>3.1</label>
<title>Phase I: Mask RCNN Based LP Detection Process</title>
<p>The Mask <inline-formula id="ieqn-1"><mml:math id="mml-ieqn-1"><mml:mi>R</mml:mi></mml:math></inline-formula>-CNN technique is an improvement upon the Faster <inline-formula id="ieqn-2"><mml:math id="mml-ieqn-2"><mml:mi>R</mml:mi></mml:math></inline-formula>-CNN detection technique that adds a fully convolutional network (FCN) for generating masks. During the real-time target detection procedure, the pixels of the target are categorized accurately, after which the contour of the target is determined. An image is first input to the backbone network, consisting of Inception with ResNet v2 and a feature pyramid network (FPN) [<xref ref-type="bibr" rid="ref-15">15</xref>]. The structure of Mask RCNN is shown in <xref ref-type="fig" rid="fig-1">Fig. 1</xref>. The backbone network extracts a shared feature map (FM) that integrates the coordinate data of the detection target's location with its shape and texture data. Afterward, the region proposal network (RPN) uses a sliding window to traverse this FM and generate many anchor frames with a group of fixed scales and aspect ratios.</p>
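<p>The sliding-window anchor generation described above can be sketched as follows. This is a generic illustration rather than the authors' implementation; the stride, scales, and aspect ratios are assumed example values:</p>

```python
import numpy as np

def generate_anchors(fm_h, fm_w, stride=16,
                     scales=(128, 256, 512), ratios=(0.5, 1.0, 2.0)):
    """Slide over an fm_h x fm_w feature map and emit (x1, y1, x2, y2)
    anchor boxes in image coordinates, one per scale/ratio pair per cell."""
    anchors = []
    for y in range(fm_h):
        for x in range(fm_w):
            # centre of this feature-map cell in image coordinates
            cx = x * stride + stride / 2.0
            cy = y * stride + stride / 2.0
            for s in scales:
                for r in ratios:
                    # equal-area boxes: w * h == s**2 for every ratio r
                    w, h = s * np.sqrt(r), s / np.sqrt(r)
                    anchors.append((cx - w / 2, cy - h / 2,
                                    cx + w / 2, cy + h / 2))
    return np.array(anchors)
```

<p>With the defaults, a 2 &#x00D7; 2 feature map yields 2 &#x00D7; 2 &#x00D7; 9 = 36 anchors, nine per sliding-window position.</p>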
<fig id="fig-1">
<label>Figure 1</label>
<caption>
<title>Mask RCNN structure</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-1.png"/>
</fig>
<p>Afterward, the non-maximum suppression (NMS) technique is utilized to select the anchor boxes with the highest scores [<xref ref-type="bibr" rid="ref-16">16</xref>]. In the RoIAlign layer of the Mask <inline-formula id="ieqn-3"><mml:math id="mml-ieqn-3"><mml:mi>R</mml:mi></mml:math></inline-formula>-CNN technique, the quantization operation in the feature aggregation procedure is replaced by a bilinear interpolation technique, which avoids the problem of mismatching and enhances the accuracy of detection and segmentation. In the training procedure, the Mask <inline-formula id="ieqn-4"><mml:math id="mml-ieqn-4"><mml:mi>R</mml:mi></mml:math></inline-formula>-CNN technique determines the multitask loss function for every sampled RoI as</p>
<p><disp-formula id="eqn-1"><label>(1)</label><mml:math id="mml-eqn-1" display="block"><mml:mi>L</mml:mi><mml:mo>=</mml:mo><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>l</mml:mi><mml:mi>s</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>b</mml:mi><mml:mi>o</mml:mi><mml:mi>x</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>a</mml:mi><mml:mi>s</mml:mi><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<p><inline-formula id="ieqn-5"><mml:math id="mml-ieqn-5"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>l</mml:mi><mml:mi>s</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> denotes the classification error, <inline-formula id="ieqn-6"><mml:math id="mml-ieqn-6"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>b</mml:mi><mml:mi>o</mml:mi><mml:mi>x</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> refers to the bounding-box recognition error, and <inline-formula id="ieqn-7"><mml:math id="mml-ieqn-7"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>a</mml:mi><mml:mi>s</mml:mi><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> stands for the segmentation error. <inline-formula id="ieqn-8"><mml:math id="mml-ieqn-8"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>l</mml:mi><mml:mi>s</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-9"><mml:math id="mml-ieqn-9"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>b</mml:mi><mml:mi>o</mml:mi><mml:mi>x</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> in Mask <inline-formula id="ieqn-10"><mml:math id="mml-ieqn-10"><mml:mi>R</mml:mi></mml:math></inline-formula>-CNN are determined as:</p>
<p><disp-formula id="eqn-2"><label>(2)</label><mml:math id="mml-eqn-2" display="block"><mml:mi>L</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mn>1</mml:mn><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>l</mml:mi><mml:mi>s</mml:mi></mml:mrow></mml:msub></mml:mfrac><mml:munder><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:munder><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>l</mml:mi><mml:mi>s</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msubsup><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mi>&#x03BB;</mml:mi><mml:mfrac><mml:mn>1</mml:mn><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>g</mml:mi></mml:mrow></mml:msub></mml:mfrac><mml:munder><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:munder><mml:msubsup><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>g</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-11"><mml:math id="mml-ieqn-11"><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> signifies the predicted probability that the <inline-formula id="ieqn-12"><mml:math id="mml-ieqn-12"><mml:msup><mml:mi>i</mml:mi><mml:mrow><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula> anchor contains a target, and <inline-formula id="ieqn-13"><mml:math id="mml-ieqn-13"><mml:msubsup><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup></mml:math></inline-formula> denotes the label of the anchor sample: if the anchor sample is positive, <inline-formula id="ieqn-14"><mml:math id="mml-ieqn-14"><mml:msubsup><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup></mml:math></inline-formula> is 1; otherwise, it is <inline-formula id="ieqn-15"><mml:math id="mml-ieqn-15"><mml:mn>0</mml:mn></mml:math></inline-formula>. <inline-formula id="ieqn-16"><mml:math id="mml-ieqn-16"><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-17"><mml:math id="mml-ieqn-17"><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup></mml:math></inline-formula> are vectors consisting of four translation and scaling parameters, respectively. The normalization terms <inline-formula id="ieqn-18"><mml:math id="mml-ieqn-18"><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>l</mml:mi><mml:mi>s</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo></mml:math></inline-formula> <inline-formula id="ieqn-19"><mml:math id="mml-ieqn-19"><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>g</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, and the weight <inline-formula id="ieqn-20"><mml:math id="mml-ieqn-20"><mml:mi>&#x03BB;</mml:mi></mml:math></inline-formula> keep the two losses in balance. The classification and regression losses are determined as:</p>
<p><disp-formula id="eqn-3"><label>(3)</label><mml:math id="mml-eqn-3" display="block"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>c</mml:mi><mml:mi>l</mml:mi><mml:mi>s</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msubsup><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mo>&#x2212;</mml:mo><mml:mi>log</mml:mi><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:msubsup><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>]</mml:mo></mml:mrow></mml:math></disp-formula></p>
<p><disp-formula id="eqn-4"><label>(4)</label><mml:math id="mml-eqn-4" display="block"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>g</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">m</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">t</mml:mi></mml:mrow><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>L</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>x</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mi>t</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msubsup><mml:mo>)</mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:math></disp-formula>where smooth(x) denotes the robust loss function, applied to the translation <inline-formula id="ieqn-21"><mml:math id="mml-ieqn-21"><mml:mi>&#x03C7;</mml:mi></mml:math></inline-formula> of the adjusted frame along the horizontal axis at the anchor point. It is defined as:</p>
<p><disp-formula id="eqn-5"><label>(5)</label><mml:math id="mml-eqn-5" display="block"><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">m</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">t</mml:mi></mml:mrow><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>L</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>x</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing=".2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:mn>0.5</mml:mn><mml:msup><mml:mi>x</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:mtd><mml:mtd><mml:mrow><mml:mo>|</mml:mo><mml:mi>x</mml:mi><mml:mo>|</mml:mo></mml:mrow><mml:mo>&#x003C;</mml:mo><mml:mi>l</mml:mi><mml:mo>,</mml:mo></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mo>|</mml:mo><mml:mi>x</mml:mi><mml:mo>|</mml:mo></mml:mrow><mml:mo>&#x2212;</mml:mo><mml:mn>0.5</mml:mn></mml:mtd><mml:mtd><mml:mrow><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">h</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">w</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:math></disp-formula></p>
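<p>Eqs. (3) and (5) can be sketched directly in code. This is a minimal scalar illustration of the two loss terms (the threshold of 1 in the smooth loss is an assumed value), not the authors' training code:</p>

```python
import math

def l_cls(p, p_star):
    """Eq. (3): -log[p * p_star + (1 - p_star)(1 - p)] for a binary label p_star."""
    return -math.log(p * p_star + (1 - p_star) * (1 - p))

def smooth_l1(x):
    """Eq. (5): 0.5 * x^2 when |x| is below the threshold, |x| - 0.5 otherwise."""
    return 0.5 * x * x if abs(x) < 1.0 else abs(x) - 0.5
```

<p>For a positive anchor (p_star = 1) the classification loss reduces to -log p, and the smooth loss is quadratic near zero but linear for large errors, which keeps the regression robust to outliers.</p>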
<p><inline-formula id="ieqn-22"><mml:math id="mml-ieqn-22"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>a</mml:mi><mml:mi>s</mml:mi><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> in Mask <inline-formula id="ieqn-23"><mml:math id="mml-ieqn-23"><mml:mi>R</mml:mi></mml:math></inline-formula>-CNN is the average binary cross-entropy function, which defines the loss of the semantic segmentation branch. In the mask branch, an input FM is output in <inline-formula id="ieqn-24"><mml:math id="mml-ieqn-24"><mml:mi>k</mml:mi><mml:mo>&#x00D7;</mml:mo><mml:mi>m</mml:mi><mml:mo>&#x00D7;</mml:mo><mml:mi>m</mml:mi></mml:math></inline-formula> format after processing, where <inline-formula id="ieqn-25"><mml:math id="mml-ieqn-25"><mml:mi>k</mml:mi></mml:math></inline-formula> and <inline-formula id="ieqn-26"><mml:math id="mml-ieqn-26"><mml:mi>m</mml:mi><mml:mo>,</mml:mo></mml:math></inline-formula> correspondingly, control the number and scale of the FMs. The relative entropy is obtained by pixel-by-pixel sigmoid computation of the resulting FM, and the average entropy error is <inline-formula id="ieqn-28"><mml:math id="mml-ieqn-28"><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>m</mml:mi><mml:mi>a</mml:mi><mml:mi>s</mml:mi><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>.</mml:mo></mml:math></inline-formula></p>
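<p>The greedy NMS selection mentioned earlier in this section can be sketched as a loop over score-sorted boxes. This is a generic illustration (the 0.5 overlap threshold is an assumed value), not the exact Mask R-CNN implementation:</p>

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union > 0 else 0.0

def nms(boxes, scores, thresh=0.5):
    """Greedy NMS: keep the highest-scoring box, then drop every remaining
    box whose overlap with it exceeds thresh; repeat until none remain."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) <= thresh]
    return keep
```

<p>For example, of two heavily overlapping candidate boxes only the higher-scoring one survives, while a distant box is kept regardless of its score.</p>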
<p>In the Mask RCNN model, Inception with ResNetv2 is utilized as the baseline network. DL aims at an effectiveness comparable to the human mind: once a child has been taught to distinguish animals, an arbitrary image is matched against the mental picture of a dog or a cat, and in the future the child identifies the animal. DL works on a similar principle. Transfer learning (TL) is the next stage beyond DL. Training a NN technique requires considerable time and many runs to capture accurate weights for the model's conditions; this is tedious work, and it is not simple for a newcomer to the field. TL addresses this by transferring models trained by field experts to the public, which skips the need to determine compatible weights and allows training to continue on the new input data. Inception-ResNetV2 was introduced [<xref ref-type="bibr" rid="ref-17">17</xref>] by combining the two most famous DCNNs, Inception and ResNet, and applying batch normalization (BN) to the conventional layers before summation. The residual components are specifically employed to enable a greater number of Inception blocks and, consequently, a deeper model. As already mentioned, the most noticeable difficulty with very deep networks is the training phase, which can be managed using residual connections. However, when an enormous number of filters is utilized in the network, the residuals must be scaled down to cope with the training difficulty: if the number of filters exceeds 1000, the residual variants encounter instability and the network cannot be trained. As a result, the residual branches are scaled down to stabilize network training.</p>
<p>The sigmoid function maps any real value to the range between zero and one and is shaped like the letter &#x201C;S.&#x201D; The logistic function is another name for the sigmoid function. The sigmoid function is written as:</p>
<p><disp-formula id="eqn-6"><label>(6)</label><mml:math id="mml-eqn-6" display="block"><mml:mi>X</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mn>1</mml:mn><mml:mrow><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:msup><mml:mi>e</mml:mi><mml:mrow><mml:mo>&#x2212;</mml:mo><mml:mi>Y</mml:mi></mml:mrow></mml:msup></mml:mrow></mml:mfrac><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<p>An important benefit of the sigmoid function is that its output lies between the two points 0 and 1. Thus, it is most effective in techniques where a probability must be predicted as the outcome, since the probability of an event occurring lies only between zero and one.</p>
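<p>Eq. (6) can be sketched directly; this minimal snippet is illustrative only:</p>

```python
import math

def sigmoid(y):
    """Eq. (6): X = 1 / (1 + e^(-Y)), mapping any real value into (0, 1)."""
    return 1.0 / (1.0 + math.exp(-y))
```

<p>sigmoid(0) is exactly 0.5, and large positive or negative inputs saturate toward 1 and 0, which is why the output can be read as a probability.</p>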
</sec>
<sec id="s3_2">
<label>3.2</label>
<title>Phase II: Design of HSFO-BOA Based Hyperparameter Tuning</title>
<p>In order to optimally adjust the hyperparameters involved in the Inception with ResNetv2 model, the HSFO-BOA is derived. A sunflower's life cycle is regular: every day, sunflowers rise and accompany the sun like the hands of a clock. Here, the inverse-square law of radiation is another key element of this nature-based optimization. The heat quantity <inline-formula id="ieqn-29"><mml:math id="mml-ieqn-29"><mml:mi>Q</mml:mi></mml:math></inline-formula> received by a plant is given as follows [<xref ref-type="bibr" rid="ref-18">18</xref>]:</p>
<p><disp-formula id="eqn-7"><label>(7)</label><mml:math id="mml-eqn-7" display="block"><mml:msub><mml:mi>Q</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mi>P</mml:mi><mml:mrow><mml:mn>4</mml:mn><mml:mi>&#x03C0;</mml:mi><mml:msubsup><mml:mi>r</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow></mml:mfrac></mml:math></disp-formula></p>
<p>where <inline-formula id="ieqn-30"><mml:math id="mml-ieqn-30"><mml:mi>P</mml:mi></mml:math></inline-formula> indicates the source power and <inline-formula id="ieqn-31"><mml:math id="mml-ieqn-31"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> represents the distance between the current best solution and plant <inline-formula id="ieqn-32"><mml:math id="mml-ieqn-32"><mml:mi>i</mml:mi><mml:mo>.</mml:mo></mml:math></inline-formula> The direction of each sunflower toward the current best solution is given as:</p>
<p><disp-formula id="eqn-8"><label>(8)</label><mml:math id="mml-eqn-8" display="block"><mml:mrow><mml:mover><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">&#x2192;</mml:mo></mml:mover></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msup><mml:mi>X</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msup><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mo fence="false" stretchy="false">&#x2016;</mml:mo><mml:msup><mml:mi>X</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msup><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo fence="false" stretchy="false">&#x2016;</mml:mo></mml:mrow></mml:mfrac><mml:mo>,</mml:mo><mml:mi>i</mml:mi><mml:mo>=</mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mn>2</mml:mn><mml:mo>,</mml:mo><mml:mo>&#x22EF;</mml:mo><mml:mo>,</mml:mo><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mi>p</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo></mml:math></disp-formula></p>
<p>The sunflower stride in the direction <inline-formula id="ieqn-33"><mml:math id="mml-ieqn-33"><mml:mi>s</mml:mi></mml:math></inline-formula> can be evaluated as follows:</p>
<p><disp-formula id="eqn-9"><label>(9)</label><mml:math id="mml-eqn-9" display="block"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>&#x03BB;</mml:mi><mml:mo>&#x00D7;</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mo symmetric="true">&#x2016;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo symmetric="true">&#x2016;</mml:mo></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x00D7;</mml:mo><mml:mo fence="false" stretchy="false">&#x2016;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo fence="false" stretchy="false">&#x2016;</mml:mo></mml:math></disp-formula></p>
<p>Here, <inline-formula id="ieqn-34"><mml:math id="mml-ieqn-34"><mml:mi>&#x03BB;</mml:mi></mml:math></inline-formula> denotes a constant value that determines an &#x201C;inertial&#x201D; displacement of the plant, and <inline-formula id="ieqn-35"><mml:math id="mml-ieqn-35"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:mo symmetric="true">&#x2016;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo symmetric="true">&#x2016;</mml:mo></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> indicates the probability of pollination. The maximal stride is bounded as follows:</p>
<p><disp-formula id="eqn-10"><label>(10)</label><mml:math id="mml-eqn-10" display="block"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mo movablelimits="true" form="prefix">max</mml:mo></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mo fence="false" stretchy="false">&#x2016;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mo movablelimits="true" form="prefix">max</mml:mo></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mo movablelimits="true" form="prefix">min</mml:mo></mml:mrow></mml:msub><mml:mo fence="false" stretchy="false">&#x2016;</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mi>o</mml:mi><mml:mi>p</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:math></disp-formula></p>
<p>In the equation, <inline-formula id="ieqn-36"><mml:math id="mml-ieqn-36"><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mo movablelimits="true" form="prefix">max</mml:mo></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-37"><mml:math id="mml-ieqn-37"><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mo movablelimits="true" form="prefix">min</mml:mo></mml:mrow></mml:msub></mml:math></inline-formula> denote the upper and lower limits, and <inline-formula id="ieqn-38"><mml:math id="mml-ieqn-38"><mml:msub><mml:mi>N</mml:mi><mml:mrow><mml:mi>p</mml:mi><mml:mi>o</mml:mi><mml:mi>p</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> indicates the total number of plants:</p>
<p><disp-formula id="eqn-11"><label>(11)</label><mml:math id="mml-eqn-11" display="block"><mml:msub><mml:mrow><mml:mover><mml:mi>X</mml:mi><mml:mo stretchy="false">&#x2192;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>i</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mover><mml:mi>X</mml:mi><mml:mo stretchy="false">&#x2192;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>&#x00D7;</mml:mo><mml:msub><mml:mrow><mml:mover><mml:mi>s</mml:mi><mml:mo stretchy="false">&#x2192;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></disp-formula></p>
<p>The process starts with population generation, which may be random or uniform. The resulting individual ratings determine which plant is moved towards the sun. Next, each individual orients itself towards the sun and moves in a random manner. Although the capacity to operate with several suns has been proposed for a future version, the present study is restricted to a single sun. The best plants pollinate around the sun.</p>
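<p>The sunflower update described above can be condensed into a few lines. The following is a minimal illustrative sketch, not the authors' implementation: the pollination probability is assumed to be drawn uniformly, the bounds and inertial constant are placeholder values, and the loop starts at the second plant because Eq. (9) references the predecessor position.</p>

```python
import numpy as np

def sfo_step(population, fitness, lam=0.1, x_min=-5.0, x_max=5.0, rng=None):
    """One sunflower optimization step following Eqs. (8)-(11).

    population -- (n_pop, dim) array of plant positions X_i
    fitness    -- (n_pop,) objective values (lower is better)
    lam        -- the inertial constant lambda of Eq. (9) (assumed value)
    """
    if rng is None:
        rng = np.random.default_rng(0)
    n_pop, dim = population.shape
    best = population[np.argmin(fitness)]  # X*, the plant acting as the sun
    # Eq. (10): maximum stride d_max = ||X_max - X_min|| / (2 * N_pop)
    d_max = np.linalg.norm(np.full(dim, x_max) - np.full(dim, x_min)) / (2.0 * n_pop)
    new_pop = population.copy()
    for i in range(1, n_pop):  # start at 1: Eq. (9) needs the predecessor X_{i-1}
        diff = best - population[i]
        norm = np.linalg.norm(diff)
        if norm == 0.0:
            continue
        s = diff / norm  # Eq. (8): unit vector pointing at the sun
        gap = np.linalg.norm(population[i] - population[i - 1])
        p_i = rng.random()  # pollination probability P_i (assumed uniform draw)
        d_i = min(lam * p_i * gap, d_max)  # Eq. (9), clipped by Eq. (10)
        new_pop[i] = np.clip(population[i] + d_i * s, x_min, x_max)  # Eq. (11)
    return new_pop
```

<p>With a sphere objective standing in for the real fitness, each call moves the plants a bounded step towards the current best position.</p>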
<p>To improve the efficacy of the SFO algorithm, the HSFO-BOA is derived by integrating the BOA into it. The BOA imitates the natural behavior of butterflies in finding food sources and mating. The approach uses two distinct navigation patterns to search the domain [<xref ref-type="bibr" rid="ref-19">19</xref>]. In the exploration stage <inline-formula id="ieqn-39"><mml:math id="mml-ieqn-39"><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x2264;</mml:mo><mml:mi>p</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>, butterflies move towards the optimal butterfly of the colony, whereas in the exploitation stage <inline-formula id="ieqn-40"><mml:math id="mml-ieqn-40"><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x003E;</mml:mo><mml:mi>p</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>, a butterfly performs an arbitrary search within the search space by moving towards a random butterfly in the colony. The mathematical expressions of both patterns are given in the following:</p>
<p>When <inline-formula id="ieqn-41"><mml:math id="mml-ieqn-41"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x2264;</mml:mo><mml:mi>p</mml:mi></mml:math></inline-formula>, the global search process becomes</p>
<p><disp-formula id="eqn-12"><label>(12)</label><mml:math id="mml-eqn-12" display="block"><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:msubsup><mml:mi>r</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x00D7;</mml:mo><mml:msup><mml:mi>g</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msup><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x00D7;</mml:mo><mml:msub><mml:mi>&#x03C6;</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></disp-formula></p>
<p>When <inline-formula id="ieqn-42"><mml:math id="mml-ieqn-42"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x003E;</mml:mo><mml:mi>p</mml:mi></mml:math></inline-formula>, the local search process becomes</p>
<p><disp-formula id="eqn-13"><label>(13)</label><mml:math id="mml-eqn-13" display="block"><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:msubsup><mml:mi>r</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x00D7;</mml:mo><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x00D7;</mml:mo><mml:msub><mml:mi>&#x03C6;</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></disp-formula></p>
<p>Here, <inline-formula id="ieqn-43"><mml:math id="mml-ieqn-43"><mml:mi>t</mml:mi></mml:math></inline-formula> and <inline-formula id="ieqn-44"><mml:math id="mml-ieqn-44"><mml:mi>t</mml:mi><mml:mo>+</mml:mo><mml:mn>1</mml:mn></mml:math></inline-formula> indicate the present and updated states. The position of the optimal butterfly in the colony is denoted by <inline-formula id="ieqn-45"><mml:math id="mml-ieqn-45"><mml:msup><mml:mi>g</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow></mml:msup></mml:math></inline-formula>, and <inline-formula id="ieqn-46"><mml:math id="mml-ieqn-46"><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-47"><mml:math id="mml-ieqn-47"><mml:msup><mml:mrow></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msup><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> are the positions of two arbitrarily selected butterflies; <inline-formula id="ieqn-48"><mml:math id="mml-ieqn-48"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> and <inline-formula id="ieqn-49"><mml:math id="mml-ieqn-49"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> indicate three random scalars chosen uniformly within [0,1]. <inline-formula id="ieqn-51"><mml:math id="mml-ieqn-51"><mml:msub><mml:mi>&#x03C6;</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> represents the fragrance factor, which is determined by the following equation:</p>
<p><disp-formula id="eqn-14"><label>(14)</label><mml:math id="mml-eqn-14" display="block"><mml:msub><mml:mi>&#x03C6;</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mi>c</mml:mi><mml:msup><mml:mi>I</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msup></mml:math></disp-formula></p>
<p>Here, <inline-formula id="ieqn-52"><mml:math id="mml-ieqn-52"><mml:msub><mml:mi>&#x03C6;</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> indicates the fragrance magnitude of the <inline-formula id="ieqn-53"><mml:math id="mml-ieqn-53"><mml:mi>i</mml:mi><mml:mi>t</mml:mi><mml:mi>h</mml:mi></mml:math></inline-formula> butterfly; <inline-formula id="ieqn-54"><mml:math id="mml-ieqn-54"><mml:mi>c</mml:mi></mml:math></inline-formula> denotes a coefficient; <inline-formula id="ieqn-55"><mml:math id="mml-ieqn-55"><mml:mi>I</mml:mi></mml:math></inline-formula> and <inline-formula id="ieqn-56"><mml:math id="mml-ieqn-56"><mml:mi>a</mml:mi></mml:math></inline-formula> denote the stimulus intensity and the varying absorption degree, respectively. <inline-formula id="ieqn-57"><mml:math id="mml-ieqn-57"><mml:mi>I</mml:mi></mml:math></inline-formula> is related to the objective function and, for the ith butterfly, is taken as <inline-formula id="ieqn-58"><mml:math id="mml-ieqn-58"><mml:mi>f</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>i</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo></mml:math></inline-formula> where <inline-formula id="ieqn-59"><mml:math id="mml-ieqn-59"><mml:mi>f</mml:mi></mml:math></inline-formula> returns the objective function of the problem. The <inline-formula id="ieqn-60"><mml:math id="mml-ieqn-60"><mml:mi>a</mml:mi></mml:math></inline-formula> and <inline-formula id="ieqn-61"><mml:math id="mml-ieqn-61"><mml:mi>c</mml:mi></mml:math></inline-formula> coefficients are designated within [0,1], and <inline-formula id="ieqn-63"><mml:math id="mml-ieqn-63"><mml:mi>p</mml:mi></mml:math></inline-formula> indicates the switch probability that governs the search behavior.</p>
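<p>Eqs. (12)&#x2013;(14) amount to one position update per butterfly per iteration. The sketch below is a hedged illustration under assumed parameter values (p = 0.8, c = 0.01, a = 0.1, and the objective value used directly as the stimulus intensity); it is not the exact tuning used in the HSFO-BOA.</p>

```python
import numpy as np

def boa_step(population, fitness, p=0.8, c=0.01, a=0.1, rng=None):
    """One BOA iteration following Eqs. (12)-(14).

    fitness -- objective values f(X_i), used here as the stimulus intensity I
    p       -- switch probability between global and local search (assumed)
    c, a    -- fragrance coefficients of Eq. (14) (assumed values)
    """
    if rng is None:
        rng = np.random.default_rng(1)
    n_pop, _ = population.shape
    g_star = population[np.argmin(fitness)]  # best butterfly g*
    phi = c * np.abs(fitness) ** a           # Eq. (14): phi_i = c * I^a
    new_pop = population.copy()
    for i in range(n_pop):
        r1 = rng.random()
        if r1 <= p:
            # Eq. (12): global search towards the optimal butterfly
            r2 = rng.random()
            new_pop[i] = population[i] + (r2 ** 2 * g_star - population[i]) * phi[i]
        else:
            # Eq. (13): local search using two randomly selected butterflies
            r3 = rng.random()
            j, k = rng.choice(n_pop, size=2, replace=False)
            new_pop[i] = population[i] + (r3 ** 2 * population[j] - population[k]) * phi[i]
    return new_pop
```

<p>A small fragrance coefficient keeps the steps short, which is why c is typically chosen close to zero in [0,1].</p>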
</sec>
<sec id="s3_3">
<label>3.3</label>
<title>Phase III: Tesseract Based Character Recognition</title>
<p>Primarily, adaptive thresholding is applied to convert the image into a binary version using Otsu&#x2019;s technique [<xref ref-type="bibr" rid="ref-20">20</xref>]. Page layout analysis is the next stage, implemented by extracting the text blocks in the region. Afterward, the baselines of all lines are identified and the text is separated into words using definite as well as fuzzy spaces. In the next phase, the character outlines are extracted from the words. Text recognition proceeds as a two-pass technique. In the first pass, word recognition is performed using a static classifier; every word recognized satisfactorily is passed to an adaptive classifier as training data. The second pass then runs over the page with the adaptive classifier, re-examining the words that were not recognized well enough in the first pass.</p>
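<p>The Otsu binarisation step described above can be reproduced directly from the grey-level histogram. The following is an illustrative NumPy sketch (the names otsu_threshold and binarize are ours, not Tesseract's API): it selects the threshold that maximises the between-class variance of the two resulting pixel classes.</p>

```python
import numpy as np

def otsu_threshold(gray):
    """Return Otsu's global threshold for a 2-D uint8 intensity image.

    Chooses t maximising the between-class variance
    sigma_b^2(t) = (mu_T * omega(t) - mu(t))^2 / (omega(t) * (1 - omega(t))).
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                       # class-0 probability up to t
    mu = np.cumsum(prob * np.arange(256))         # first moment up to t
    mu_total = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_total * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)              # guard the degenerate ends
    return int(np.argmax(sigma_b))

def binarize(gray):
    """Binarise an image with Otsu's threshold (white text plate on black)."""
    return (gray > otsu_threshold(gray)).astype(np.uint8) * 255
```

<p>On a bimodal plate-like image the threshold lands between the two intensity modes, separating characters from background before layout analysis.</p>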
</sec>
</sec>
<sec id="s4">
<label>4</label>
<title>Experimental Validation</title>
<p>The performance validation of the HMODL-ALPCR technique takes place using three benchmark datasets, namely the FZU Cars, Stanford Cars, and HumAIn 2019 datasets. A few sample images are depicted in <xref ref-type="fig" rid="fig-2">Fig. 2</xref>.</p>
<fig id="fig-2">
<label>Figure 2</label>
<caption>
<title>Sample images</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-2.png"/>
</fig>
<p><xref ref-type="fig" rid="fig-3">Fig. 3</xref> illustrates the sample results obtained by the HMODL-ALPCR technique. From the figure, it is clear that the HMODL-ALPCR technique has proficiently detected the LP and recognized the characters.</p>
<fig id="fig-3">
<label>Figure 3</label>
<caption>
<title>Sample visualization results of HMODL-ALPCR technique</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-3.png"/>
</fig>
<p><xref ref-type="table" rid="table-1">Tab. 1</xref> offers the LP detection outcome analysis of the HMODL-ALPCR technique under distinct epochs. <xref ref-type="fig" rid="fig-4">Fig. 4</xref> examines the LP detection result analysis of the HMODL-ALPCR technique under distinct epochs on FZU Cars dataset. With 100 epochs, the HMODL-ALPCR technique has offered <inline-formula id="ieqn-64"><mml:math id="mml-ieqn-64"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-65"><mml:math id="mml-ieqn-65"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-66"><mml:math id="mml-ieqn-66"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-67"><mml:math id="mml-ieqn-67"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 99.05%, 99.07%, 98.91%, and 98.56% respectively. 
Also, with 200 epochs, the HMODL-ALPCR technique has attained <inline-formula id="ieqn-68"><mml:math id="mml-ieqn-68"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-69"><mml:math id="mml-ieqn-69"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-70"><mml:math id="mml-ieqn-70"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-71"><mml:math id="mml-ieqn-71"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 99.05%, 99.54%, 98.71%, and 98.42% respectively. 
Similarly, with 300 epochs, the HMODL-ALPCR technique has provided <inline-formula id="ieqn-72"><mml:math id="mml-ieqn-72"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-73"><mml:math id="mml-ieqn-73"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-74"><mml:math id="mml-ieqn-74"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-75"><mml:math id="mml-ieqn-75"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 99.05%, 99.42%, 98.73%, and 97.86% respectively. 
Likewise, with 400 epochs, the HMODL-ALPCR technique has exhibited <inline-formula id="ieqn-76"><mml:math id="mml-ieqn-76"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-77"><mml:math id="mml-ieqn-77"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-78"><mml:math id="mml-ieqn-78"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-79"><mml:math id="mml-ieqn-79"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 99.00%, 99.40%, 98.76%, and 98.37% respectively.</p>
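<p>For reference, the precision, recall, and F1-score reported below are derived from true-positive, false-positive, and false-negative counts in the usual way; the counts in this sketch are hypothetical, chosen only to illustrate the formulas, and are not the paper's data.</p>

```python
def detection_metrics(tp, fp, fn):
    """Compute precision, recall, and F1-score from detection counts.

    tp, fp, fn -- true-positive, false-positive, false-negative counts
    (hypothetical values; mAP additionally requires per-threshold curves).
    """
    precision = tp / (tp + fp)                       # fraction of detections that are correct
    recall = tp / (tp + fn)                          # fraction of ground-truth plates found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of the two
    return precision, recall, f1
```

<p>Note that the F1-score is always bounded by the smaller of precision and recall, which is consistent with the relative magnitudes in the tabulated results.</p>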
<table-wrap id="table-1">
<label>Table 1</label>
<caption>
<title>LP detection results of HMODL-ALPCR technique</title>
</caption>
<table>
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th>No. of Epochs</th>
<th>Precision</th>
<th>Recall</th>
<th>F1-score</th>
<th>mAP</th>
</tr>
</thead>
<tbody>
<tr>
<td colspan="5">FZU Cars Dataset</td>
</tr>
<tr>
<td>Epoch-100</td>
<td>99.05</td>
<td>99.07</td>
<td>98.91</td>
<td>98.56</td>
</tr>
<tr>
<td>Epoch-200</td>
<td>99.05</td>
<td>99.54</td>
<td>98.71</td>
<td>98.42</td>
</tr>
<tr>
<td>Epoch-300</td>
<td>99.05</td>
<td>99.42</td>
<td>98.73</td>
<td>97.86</td>
</tr>
<tr>
<td>Epoch-400</td>
<td>99.00</td>
<td>99.40</td>
<td>98.76</td>
<td>98.37</td>
</tr>
<tr>
<td>Epoch-500</td>
<td>99.06</td>
<td>99.36</td>
<td>98.88</td>
<td>97.71</td>
</tr>
<tr>
<td>Average</td>
<td>99.04</td>
<td>99.36</td>
<td>98.80</td>
<td>98.18</td>
</tr>
<tr>
<td colspan="5">Stanford Cars Dataset</td>
</tr>
<tr>
<td>Epoch-100</td>
<td>98.99</td>
<td>99.00</td>
<td>97.86</td>
<td>96.95</td>
</tr>
<tr>
<td>Epoch-200</td>
<td>98.12</td>
<td>99.27</td>
<td>98.46</td>
<td>97.81</td>
</tr>
<tr>
<td>Epoch-300</td>
<td>97.74</td>
<td>98.96</td>
<td>98.54</td>
<td>96.32</td>
</tr>
<tr>
<td>Epoch-400</td>
<td>98.49</td>
<td>99.11</td>
<td>98.48</td>
<td>97.08</td>
</tr>
<tr>
<td>Epoch-500</td>
<td>98.94</td>
<td>99.11</td>
<td>97.71</td>
<td>97.34</td>
</tr>
<tr>
<td>Average</td>
<td>98.46</td>
<td>99.09</td>
<td>98.21</td>
<td>97.10</td>
</tr>
<tr>
<td colspan="5">HumAIn 2019 Dataset</td>
</tr>
<tr>
<td>Epoch-100</td>
<td>98.57</td>
<td>99.14</td>
<td>99.31</td>
<td>97.74</td>
</tr>
<tr>
<td>Epoch-200</td>
<td>98.47</td>
<td>98.97</td>
<td>98.72</td>
<td>98.32</td>
</tr>
<tr>
<td>Epoch-300</td>
<td>99.34</td>
<td>99.13</td>
<td>98.63</td>
<td>98.11</td>
</tr>
<tr>
<td>Epoch-400</td>
<td>98.94</td>
<td>99.25</td>
<td>98.82</td>
<td>98.40</td>
</tr>
<tr>
<td>Epoch-500</td>
<td>98.72</td>
<td>98.96</td>
<td>98.97</td>
<td>98.11</td>
</tr>
<tr>
<td>Average</td>
<td>98.81</td>
<td>99.09</td>
<td>98.89</td>
<td>98.14</td>
</tr>
</tbody>
</table>
</table-wrap>
<fig id="fig-4">
<label>Figure 4</label>
<caption>
<title>Classification results of HMODL-ALPCR technique on FZU Cars dataset</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-4.png"/>
</fig>
<p><xref ref-type="fig" rid="fig-5">Fig. 5</xref> inspects the LP detection result analysis of the HMODL-ALPCR system under different epochs on Stanford Cars dataset. With 100 epochs, the HMODL-ALPCR approach has offered <inline-formula id="ieqn-80"><mml:math id="mml-ieqn-80"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-81"><mml:math id="mml-ieqn-81"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-82"><mml:math id="mml-ieqn-82"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-83"><mml:math id="mml-ieqn-83"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 98.99%, 99.00%, 97.86%, and 96.95% correspondingly. 
Besides, with 200 epochs, the HMODL-ALPCR methodology has reached <inline-formula id="ieqn-84"><mml:math id="mml-ieqn-84"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-85"><mml:math id="mml-ieqn-85"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-86"><mml:math id="mml-ieqn-86"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-87"><mml:math id="mml-ieqn-87"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 98.12%, 99.27%, 98.46%, and 97.81% respectively. 
In addition, with 300 epochs, the HMODL-ALPCR system has offered <inline-formula id="ieqn-88"><mml:math id="mml-ieqn-88"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-89"><mml:math id="mml-ieqn-89"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-90"><mml:math id="mml-ieqn-90"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-91"><mml:math id="mml-ieqn-91"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 97.74%, 98.96%, 98.54%, and 96.32% correspondingly. 
Moreover, with 400 epochs, the HMODL-ALPCR methodology has demonstrated <inline-formula id="ieqn-92"><mml:math id="mml-ieqn-92"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-93"><mml:math id="mml-ieqn-93"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-94"><mml:math id="mml-ieqn-94"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-95"><mml:math id="mml-ieqn-95"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 98.49%, 99.11%, 98.48%, and 97.08% correspondingly.</p>
<fig id="fig-5">
<label>Figure 5</label>
<caption>
<title>Classification results of HMODL-ALPCR technique on Stanford Cars dataset</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-5.png"/>
</fig>
<p><xref ref-type="fig" rid="fig-6">Fig. 6</xref> demonstrates the LP detection result analysis of the HMODL-ALPCR system under distinct epochs on the HumAIn 2019 dataset. With 100 epochs, the HMODL-ALPCR technique has offered <inline-formula id="ieqn-96"><mml:math id="mml-ieqn-96"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-97"><mml:math id="mml-ieqn-97"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-98"><mml:math id="mml-ieqn-98"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-99"><mml:math id="mml-ieqn-99"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 98.57%, 99.14%, 99.31%, and 97.74% respectively. 
Along with that, with 200 epochs, the HMODL-ALPCR approach has reached <inline-formula id="ieqn-100"><mml:math id="mml-ieqn-100"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-101"><mml:math id="mml-ieqn-101"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-102"><mml:math id="mml-ieqn-102"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-103"><mml:math id="mml-ieqn-103"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 98.47%, 98.97%, 98.72%, and 98.32% respectively. 
Similarly, with 300 epochs, the HMODL-ALPCR technique has achieved <inline-formula id="ieqn-104"><mml:math id="mml-ieqn-104"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-105"><mml:math id="mml-ieqn-105"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-106"><mml:math id="mml-ieqn-106"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-107"><mml:math id="mml-ieqn-107"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 99.34%, 99.13%, 98.63%, and 98.11% correspondingly. 
At last, with 400 epochs, the HMODL-ALPCR system has exhibited <inline-formula id="ieqn-108"><mml:math id="mml-ieqn-108"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-109"><mml:math id="mml-ieqn-109"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-110"><mml:math id="mml-ieqn-110"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-111"><mml:math id="mml-ieqn-111"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 98.94%, 99.25%, 98.82%, and 98.40% respectively.</p>
<fig id="fig-6">
<label>Figure 6</label>
<caption>
<title>Classification results of HMODL-ALPCR technique on HumAIn 2019 dataset</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-6.png"/>
</fig>
<p><xref ref-type="table" rid="table-2">Tab. 2</xref> and <xref ref-type="fig" rid="fig-7">Fig. 7</xref> present a comparative study of the HMODL-ALPCR technique against recent methods on the FZU Cars test dataset. The results demonstrate that the CNN-VGG16 and DL-ResNet50 techniques obtained the weakest performance, with the lowest values of <inline-formula id="ieqn-112"><mml:math id="mml-ieqn-112"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-113"><mml:math id="mml-ieqn-113"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-114"><mml:math id="mml-ieqn-114"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-115"><mml:math id="mml-ieqn-115"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
Likewise, the DL-ResNet101 and HT-SSA-CNN techniques attained slightly higher values of <inline-formula id="ieqn-116"><mml:math id="mml-ieqn-116"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-117"><mml:math id="mml-ieqn-117"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-118"><mml:math id="mml-ieqn-118"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-119"><mml:math id="mml-ieqn-119"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
In turn, the DL-VLPNR and OKM-CNN techniques reached reasonably high values of <inline-formula id="ieqn-120"><mml:math id="mml-ieqn-120"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-121"><mml:math id="mml-ieqn-121"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-122"><mml:math id="mml-ieqn-122"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-123"><mml:math id="mml-ieqn-123"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
However, the HMODL-ALPCR technique has outperformed the other methods with the maximum <inline-formula id="ieqn-124"><mml:math id="mml-ieqn-124"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-125"><mml:math id="mml-ieqn-125"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-126"><mml:math id="mml-ieqn-126"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-127"><mml:math id="mml-ieqn-127"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 99.04%, 99.36%, 98.80%, and 98.18% respectively.</p>
<table-wrap id="table-2">
<label>Table 2</label>
<caption>
<title>Comparative LP detection results of HMODL-ALPCR technique on FZU Cars dataset</title>
</caption>
<table>
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th>Model</th>
<th>Precision</th>
<th>Recall</th>
<th>F1-score</th>
<th>mAP</th>
</tr>
</thead>
<tbody>
<tr>
<td>CNN-VGG16</td>
<td>93.97</td>
<td>96.56</td>
<td>94.43</td>
<td>92.43</td>
</tr>
<tr>
<td>DL-ResNet50</td>
<td>93.26</td>
<td>93.27</td>
<td>94.80</td>
<td>90.66</td>
</tr>
<tr>
<td>DL-ResNet101</td>
<td>96.15</td>
<td>96.88</td>
<td>97.03</td>
<td>93.70</td>
</tr>
<tr>
<td>DL-VLPNR</td>
<td>98.88</td>
<td>96.83</td>
<td>96.76</td>
<td>94.96</td>
</tr>
<tr>
<td>HT-SSA-CNN</td>
<td>96.81</td>
<td>99.08</td>
<td>96.83</td>
<td>97.38</td>
</tr>
<tr>
<td>OKM-CNN</td>
<td>97.49</td>
<td>98.95</td>
<td>96.89</td>
<td>96.68</td>
</tr>
<tr>
<td>HMODL-ALPCR</td>
<td>99.04</td>
<td>99.36</td>
<td>98.80</td>
<td>98.18</td>
</tr>
</tbody>
</table>
</table-wrap>
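For reference, the F1-score column in the tables is the harmonic mean of precision and recall. A minimal sketch with values in percent (note the tabulated F1 of 98.80 is slightly below the pooled figure computed here, which presumably reflects per-class or per-image averaging in the original evaluation):

```python
def f1_score(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall (both given in percent)."""
    return 2.0 * precision * recall / (precision + recall)

# HMODL-ALPCR values from Tab. 2: 99.04% precision, 99.36% recall.
print(round(f1_score(99.04, 99.36), 2))  # → 99.2
```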
<fig id="fig-7">
<label>Figure 7</label>
<caption>
<title>LP detection result analysis of HMODL-ALPCR with recent techniques on FZU Cars dataset</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-7.png"/>
</fig>
<p><xref ref-type="table" rid="table-3">Tab. 3</xref> and <xref ref-type="fig" rid="fig-8">Fig. 8</xref> present a comparative study of the HMODL-ALPCR approach on the Stanford Cars test dataset. The outcomes show that the CNN-VGG16 and DL-ResNet50 algorithms reached the weakest performance, with the lowest values of <inline-formula id="ieqn-128"><mml:math id="mml-ieqn-128"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-129"><mml:math id="mml-ieqn-129"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-130"><mml:math id="mml-ieqn-130"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-131"><mml:math id="mml-ieqn-131"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
Likewise, the DL-ResNet101 and HT-SSA-CNN techniques attained slightly higher values of <inline-formula id="ieqn-132"><mml:math id="mml-ieqn-132"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-133"><mml:math id="mml-ieqn-133"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-134"><mml:math id="mml-ieqn-134"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-135"><mml:math id="mml-ieqn-135"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
In turn, the DL-VLPNR and OKM-CNN techniques reached reasonably high values of <inline-formula id="ieqn-136"><mml:math id="mml-ieqn-136"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-137"><mml:math id="mml-ieqn-137"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-138"><mml:math id="mml-ieqn-138"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-139"><mml:math id="mml-ieqn-139"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
Finally, the HMODL-ALPCR system outperformed the other methods with the maximum <inline-formula id="ieqn-140"><mml:math id="mml-ieqn-140"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-141"><mml:math id="mml-ieqn-141"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-142"><mml:math id="mml-ieqn-142"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-143"><mml:math id="mml-ieqn-143"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 98.46%, 99.09%, 98.21%, and 97.10%, respectively.</p>
<table-wrap id="table-3">
<label>Table 3</label>
<caption>
<title>Comparative LP detection results of HMODL-ALPCR technique on Stanford Cars dataset</title>
</caption>
<table>
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th>Model</th>
<th>Precision</th>
<th>Recall</th>
<th>F1-score</th>
<th>mAP</th>
</tr>
</thead>
<tbody>
<tr>
<td>CNN-VGG16</td>
<td>92.65</td>
<td>96.04</td>
<td>93.85</td>
<td>90.44</td>
</tr>
<tr>
<td>DL-ResNet50</td>
<td>92.64</td>
<td>94.96</td>
<td>92.72</td>
<td>93.62</td>
</tr>
<tr>
<td>DL-ResNet101</td>
<td>94.47</td>
<td>93.46</td>
<td>93.07</td>
<td>90.74</td>
</tr>
<tr>
<td>DL-VLPNR</td>
<td>96.51</td>
<td>98.23</td>
<td>97.37</td>
<td>95.21</td>
</tr>
<tr>
<td>HT-SSA-CNN</td>
<td>97.06</td>
<td>98.58</td>
<td>95.02</td>
<td>96.32</td>
</tr>
<tr>
<td>OKM-CNN</td>
<td>97.09</td>
<td>97.67</td>
<td>96.83</td>
<td>95.57</td>
</tr>
<tr>
<td>HMODL-ALPCR</td>
<td>98.46</td>
<td>99.09</td>
<td>98.21</td>
<td>97.10</td>
</tr>
</tbody>
</table>
</table-wrap>
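The mAP column is the mean, over classes, of the detector's average precision. A minimal sketch of the standard uninterpolated AP computation, using a hypothetical ranked detection list rather than the paper's data:

```python
def average_precision(hits, n_ground_truth):
    """Uninterpolated AP: mean of the precision at each true-positive rank.

    hits: detections sorted by descending confidence, True if correct.
    n_ground_truth: number of ground-truth objects for this class.
    """
    tp = 0
    precisions = []
    for rank, hit in enumerate(hits, start=1):
        if hit:
            tp += 1
            precisions.append(tp / rank)
    return sum(precisions) / n_ground_truth

def mean_average_precision(per_class):
    """mAP: mean AP over classes; each entry is (hits, n_ground_truth)."""
    return sum(average_precision(h, n) for h, n in per_class) / len(per_class)

# Hypothetical example: two ground-truth plates, the detector ranks a miss second.
print(average_precision([True, False, True], 2))  # (1/1 + 2/3) / 2 ≈ 0.833
```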
<fig id="fig-8">
<label>Figure 8</label>
<caption>
<title>LP detection result analysis of HMODL-ALPCR with recent techniques on Stanford Cars dataset</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-8.png"/>
</fig>
<p><xref ref-type="table" rid="table-4">Tab. 4</xref> and <xref ref-type="fig" rid="fig-9">Fig. 9</xref> present a comparative study of the HMODL-ALPCR approach on the HumAIn 2019 dataset. The outcomes show that the CNN-VGG16 and DL-ResNet50 systems obtained the weakest performance, with the lowest values of <inline-formula id="ieqn-144"><mml:math id="mml-ieqn-144"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-145"><mml:math id="mml-ieqn-145"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-146"><mml:math id="mml-ieqn-146"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-147"><mml:math id="mml-ieqn-147"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
Likewise, the DL-ResNet101 and HT-SSA-CNN techniques attained somewhat higher values of <inline-formula id="ieqn-148"><mml:math id="mml-ieqn-148"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-149"><mml:math id="mml-ieqn-149"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-150"><mml:math id="mml-ieqn-150"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-151"><mml:math id="mml-ieqn-151"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
In turn, the DL-VLPNR and OKM-CNN algorithms obtained reasonably high values of <inline-formula id="ieqn-152"><mml:math id="mml-ieqn-152"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-153"><mml:math id="mml-ieqn-153"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-154"><mml:math id="mml-ieqn-154"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-155"><mml:math id="mml-ieqn-155"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula>. 
Finally, the HMODL-ALPCR system outperformed the other algorithms with the maximum <inline-formula id="ieqn-156"><mml:math id="mml-ieqn-156"><mml:mi>p</mml:mi><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:msub><mml:mi>c</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-157"><mml:math id="mml-ieqn-157"><mml:mi>r</mml:mi><mml:mi>e</mml:mi><mml:mi>c</mml:mi><mml:msub><mml:mi>a</mml:mi><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-158"><mml:math id="mml-ieqn-158"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-159"><mml:math id="mml-ieqn-159"><mml:mi>m</mml:mi><mml:mi>A</mml:mi><mml:mi>P</mml:mi></mml:math></inline-formula> of 98.81%, 99.09%, 98.89%, and 98.14%, respectively.</p>
<table-wrap id="table-4">
<label>Table 4</label>
<caption>
<title>Comparative LP detection results of HMODL-ALPCR technique on HumAIn 2019 dataset</title>
</caption>
<table>
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th>Model</th>
<th>Precision</th>
<th>Recall</th>
<th>F1-score</th>
<th>mAP</th>
</tr>
</thead>
<tbody>
<tr>
<td>CNN-VGG16</td>
<td>88.80</td>
<td>87.24</td>
<td>87.79</td>
<td>87.95</td>
</tr>
<tr>
<td>DL-ResNet50</td>
<td>85.16</td>
<td>90.96</td>
<td>89.16</td>
<td>87.51</td>
</tr>
<tr>
<td>DL-ResNet101</td>
<td>93.20</td>
<td>92.84</td>
<td>89.34</td>
<td>91.19</td>
</tr>
<tr>
<td>DL-VLPNR</td>
<td>98.77</td>
<td>98.64</td>
<td>98.81</td>
<td>96.28</td>
</tr>
<tr>
<td>HT-SSA-CNN</td>
<td>97.43</td>
<td>95.33</td>
<td>95.27</td>
<td>94.71</td>
</tr>
<tr>
<td>OKM-CNN</td>
<td>93.91</td>
<td>95.69</td>
<td>95.54</td>
<td>97.76</td>
</tr>
<tr>
<td>HMODL-ALPCR</td>
<td>98.81</td>
<td>99.09</td>
<td>98.89</td>
<td>98.14</td>
</tr>
</tbody>
</table>
</table-wrap>
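The claim that HMODL-ALPCR leads on every metric can be checked mechanically. A small sketch over the values transcribed from Tab. 4 (the `best_by` helper is an illustrative utility, not part of the paper's method):

```python
# Values transcribed from Tab. 4 (HumAIn 2019 dataset), in percent.
results = {
    "CNN-VGG16":    {"precision": 88.80, "recall": 87.24, "f1": 87.79, "mAP": 87.95},
    "DL-ResNet50":  {"precision": 85.16, "recall": 90.96, "f1": 89.16, "mAP": 87.51},
    "DL-ResNet101": {"precision": 93.20, "recall": 92.84, "f1": 89.34, "mAP": 91.19},
    "DL-VLPNR":     {"precision": 98.77, "recall": 98.64, "f1": 98.81, "mAP": 96.28},
    "HT-SSA-CNN":   {"precision": 97.43, "recall": 95.33, "f1": 95.27, "mAP": 94.71},
    "OKM-CNN":      {"precision": 93.91, "recall": 95.69, "f1": 95.54, "mAP": 97.76},
    "HMODL-ALPCR":  {"precision": 98.81, "recall": 99.09, "f1": 98.89, "mAP": 98.14},
}

def best_by(metric):
    """Return the model with the highest value of the given metric."""
    return max(results, key=lambda model: results[model][metric])

for metric in ("precision", "recall", "f1", "mAP"):
    print(metric, "->", best_by(metric))  # HMODL-ALPCR leads on all four
```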
<fig id="fig-9">
<label>Figure 9</label>
<caption>
<title>LP detection result analysis of HMODL-ALPCR with recent techniques on HumAIn 2019 dataset</title>
</caption><graphic mimetype="image" mime-subtype="png" xlink:href="CMC_26780-fig-9.png"/>
</fig>
<p>Taken together, the above tables and figures show that the HMODL-ALPCR technique outperformed the compared techniques on all three datasets.</p>
</sec>
<sec id="s5">
<label>5</label>
<title>Conclusion</title>
<p>In this article, an automated HMODL-ALPCR technique has been presented to detect LPs and recognize the characters in them for smart city environments. The HMODL-ALPCR technique employs Mask-RCNN for LP detection, with Inception-ResNet-v2 as the baseline network. Moreover, the HSFO-BOA is utilized for hyperparameter tuning of the Inception-ResNet-v2 model. Lastly, a Tesseract-based character recognition model is applied to recognize the characters present in the LPs. The experimental analysis of the HMODL-ALPCR technique was carried out on benchmark datasets, and the outcomes demonstrated its improved efficacy over existing techniques. In the future, detection performance can be improved through the design of hybrid DL models for smart city environments.</p>
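As a rough illustration of the kind of metaheuristic search that HSFO-BOA-style hyperparameter tuning performs, the following is a minimal stand-alone sketch of the basic butterfly optimization step (fragrance-guided global and local moves, after Arora and Singh) on a one-dimensional toy objective. The constants, bounds, and toy loss are illustrative assumptions, not the authors' hybrid algorithm:

```python
import random

def butterfly_optimize(objective, lo, hi, n_pop=12, iters=60, seed=7):
    """Minimal 1-D Butterfly Optimization Algorithm (BOA) sketch.

    Illustrative only: each butterfly emits a fragrance f = c * I**a and
    mixes global moves toward the best solution with local moves between
    two random butterflies. Assumes a non-negative loss.
    """
    rng = random.Random(seed)
    c, a, p = 0.05, 0.1, 0.8                    # sensory modality, exponent, switch prob.
    pop = [rng.uniform(lo, hi) for _ in range(n_pop)]
    best = min(pop, key=objective)
    history = [objective(best)]                 # best-so-far fitness, one entry per iteration
    for _ in range(iters):
        for i, x in enumerate(pop):
            stimulus = 1.0 / (1.0 + objective(x))   # higher for fitter butterflies
            f = c * stimulus ** a                   # fragrance
            r = rng.random()
            if rng.random() > p:                    # local search (probability 1 - p)
                j, k = rng.sample(range(n_pop), 2)
                step = (r * r * pop[j] - pop[k]) * f
            else:                                   # global move toward the best solution
                step = (r * r * best - x) * f
            pop[i] = min(max(x + step, lo), hi)     # clamp to the search bounds
            if objective(pop[i]) < objective(best):
                best = pop[i]
        history.append(objective(best))
    return best, history

# Toy "validation loss" over a single hyperparameter (e.g. a learning-rate-like knob).
loss = lambda x: (x - 0.3) ** 2
best, hist = butterfly_optimize(loss, 0.0, 1.0)
```

By construction the best-so-far fitness in `hist` is non-increasing, mirroring how a metaheuristic tuner only ever accepts improved hyperparameter candidates.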
</sec>
</body>
<back>
<fn-group>
<fn fn-type="other"><p><bold>Funding Statement:</bold> This Research was funded by the Deanship of Scientific Research at University of Business and Technology, Saudi Arabia.</p>
</fn>
<fn fn-type="conflict"><p><bold>Conflicts of Interest:</bold> The authors declare that they have no conflicts of interest to report regarding the present study.</p>
</fn>
</fn-group>
<ref-list content-type="authoryear">
<title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>S.</given-names> <surname>De</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Zhou</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Huang</surname></string-name> and <string-name><given-names>K.</given-names> <surname>Moessner</surname></string-name></person-group>, &#x201C;<article-title>Distributed sensor data computing in smart city applications</article-title>,&#x201D; in <conf-name>2017 IEEE 18th Int. Symp. on A World of Wireless, Mobile and Multimedia Networks (WoWMoM)</conf-name>, <conf-loc>Macau, China</conf-loc>, pp. <fpage>1</fpage>&#x2013;<lpage>5</lpage>, <year>2017</year>. </mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Tian</surname></string-name> and <string-name><given-names>L.</given-names> <surname>Pan</surname></string-name></person-group>, &#x201C;<article-title>Predicting short-term traffic flow by long short-term memory recurrent neural network</article-title>,&#x201D; in <conf-name>2015 IEEE Int. Conf. on Smart City/SocialCom/SustainCom (SmartCity)</conf-name>, <conf-loc>Chengdu, China</conf-loc>, pp. <fpage>153</fpage>&#x2013;<lpage>158</lpage>, <year>2015</year>. </mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>V.</given-names> <surname>Gnanaprakash</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Kanthimathi</surname></string-name> and <string-name><given-names>N.</given-names> <surname>Saranya</surname></string-name></person-group>, &#x201C;<article-title>Automatic number plate recognition using deep learning</article-title>,&#x201D; <source>IOP Conference Series: Materials Science and Engineering</source>, vol. <volume>1084</volume>, no. <issue>1</issue>, pp. <fpage>012027</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Zanella</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Bui</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Castellani</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Vangelista</surname></string-name> and <string-name><given-names>M.</given-names> <surname>Zorzi</surname></string-name></person-group>, &#x201C;<article-title>Internet of things for smart cities</article-title>,&#x201D; <source>IEEE Internet of Things Journal</source>, vol. <volume>1</volume>, no. <issue>1</issue>, pp. <fpage>22</fpage>&#x2013;<lpage>32</lpage>, <year>2014</year>.</mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D.</given-names> <surname>Zang</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Chai</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Zhang</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Zhang</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Cheng</surname></string-name></person-group>, &#x201C;<article-title>Vehicle license plate recognition using visual attention model and deep learning</article-title>,&#x201D; <source>Journal of Electronic Imaging</source>, vol. <volume>24</volume>, no. <issue>3</issue>, pp. <fpage>033001</fpage>, <year>2015</year>.</mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Weihong</surname></string-name> and <string-name><given-names>T.</given-names> <surname>Jiaoyang</surname></string-name></person-group>, &#x201C;<article-title>Research on license plate recognition algorithms based on deep learning in complex environment</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>91661</fpage>&#x2013;<lpage>91675</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Hendry</surname></string-name> and <string-name><given-names>R. C.</given-names> <surname>Chen</surname></string-name></person-group>, &#x201C;<article-title>Automatic license plate recognition via sliding-window darknet-yolo deep learning</article-title>,&#x201D; <source>Image and Vision Computing</source>, vol. <volume>87</volume>, no. <issue>2</issue>, pp. <fpage>47</fpage>&#x2013;<lpage>56</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>D. M. F.</given-names> <surname>Izidio</surname></string-name>, <string-name><given-names>A. P. A.</given-names> <surname>Ferreira</surname></string-name> and <string-name><given-names>E. N. S.</given-names> <surname>Barros</surname></string-name></person-group>, &#x201C;<article-title>An embedded automatic license plate recognition system using deep learning</article-title>,&#x201D; in <conf-name>2018 VIII Brazilian Symp. on Computing Systems Engineering (SBESC)</conf-name>, <conf-loc>Salvador, Brazil</conf-loc>, pp. <fpage>38</fpage>&#x2013;<lpage>45</lpage>, <year>2018</year>. </mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Lee</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Son</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Kim</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Park</surname></string-name></person-group>, &#x201C;<article-title>Car plate recognition based on CNN using embedded system with GPU</article-title>,&#x201D; in <conf-name>2017 10th Int. Conf. on Human System Interactions (HSI)</conf-name>, <conf-loc>Ulsan, South Korea</conf-loc>, pp. <fpage>239</fpage>&#x2013;<lpage>241</lpage>, <year>2017</year>. </mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Alghyaline</surname></string-name></person-group>, &#x201C;<article-title>Real-time Jordanian license plate recognition using deep learning</article-title>,&#x201D; <source>Journal of King Saud University-Computer and Information Sciences</source>, pp. <fpage>S1319157820305152</fpage>, <year>2020</year>. DOI <pub-id pub-id-type="doi">10.1016/j.jksuci.2020.09.018</pub-id>.</mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>N.</given-names> <surname>Omar</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Sengur</surname></string-name> and <string-name><given-names>S. G. S.</given-names> <surname>Al-Ali</surname></string-name></person-group>, &#x201C;<article-title>Cascaded deep learning-based efficient approach for license plate detection and recognition</article-title>,&#x201D; <source>Expert Systems with Applications</source>, vol. <volume>149</volume>, pp. <fpage>113280</fpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><surname>Hendry</surname></string-name> and <string-name><given-names>R. C.</given-names> <surname>Chen</surname></string-name></person-group>, &#x201C;<article-title>Automatic license plate recognition via sliding-window darknet-yolo deep learning</article-title>,&#x201D; <source>Image and Vision Computing</source>, vol. <volume>87</volume>, pp. <fpage>47</fpage>&#x2013;<lpage>56</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>D. M. F.</given-names> <surname>Izidio</surname></string-name>, <string-name><given-names>A. P. A.</given-names> <surname>Ferreira</surname></string-name> and <string-name><given-names>E. N. S.</given-names> <surname>Barros</surname></string-name></person-group>, &#x201C;<article-title>An embedded automatic license plate recognition system using deep learning</article-title>,&#x201D; in <conf-name>2018 VIII Brazilian Symp. on Computing Systems Engineering (SBESC)</conf-name>, <conf-loc>Salvador, Brazil</conf-loc>, pp. <fpage>38</fpage>&#x2013;<lpage>45</lpage>, <year>2018</year>. </mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>I. V.</given-names> <surname>Pustokhina</surname></string-name>, <string-name><given-names>D. A.</given-names> <surname>Pustokhin</surname></string-name>, <string-name><given-names>J. J. P. C.</given-names> <surname>Rodrigues</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Gupta</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Khanna</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Automatic vehicle license plate recognition using optimal k-means with convolutional neural network for intelligent transportation systems</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>92907</fpage>&#x2013;<lpage>92917</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D. H.</given-names> <surname>Chen</surname></string-name>, <string-name><given-names>Y. D.</given-names> <surname>Cao</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Yan</surname></string-name></person-group>, &#x201C;<article-title>Towards pedestrian target detection with optimized mask R-CNN</article-title>,&#x201D; <source>Complexity</source>, vol. <volume>2020</volume>, pp. <fpage>1</fpage>&#x2013;<lpage>8</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. M.</given-names> <surname>Karim</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Doell</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Lingard</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Yin</surname></string-name>, <string-name><given-names>M. C.</given-names> <surname>Leu</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>A region-based deep learning algorithm for detecting and tracking objects in manufacturing plants</article-title>,&#x201D; <source>Procedia Manufacturing</source>, vol. <volume>39</volume>, no. <issue>4</issue>, pp. <fpage>168</fpage>&#x2013;<lpage>177</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>C. A.</given-names> <surname>Ferreira</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Melo</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Sousa</surname></string-name>, <string-name><given-names>M. I.</given-names> <surname>Meyer</surname></string-name>, <string-name><given-names>E.</given-names> <surname>Shakibapour</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Classification of breast cancer histology images through transfer learning using a pre-trained inception resnet v2</article-title>,&#x201D; in <conf-name>Int. Conf. Image Analysis and Recognition</conf-name>, <conf-loc>Portugal</conf-loc>, pp. <fpage>763</fpage>&#x2013;<lpage>770</lpage>, <year>2018</year>. </mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>G. F.</given-names> <surname>Gomes</surname></string-name>, <string-name><given-names>S. S.</given-names> <surname>da Cunha</surname></string-name> and <string-name><given-names>A. C.</given-names> <surname>Ancelotti</surname></string-name></person-group>, &#x201C;<article-title>A sunflower optimization (SFO) algorithm applied to damage identification on laminated composite plates</article-title>,&#x201D; <source>Engineering with Computers</source>, vol. <volume>35</volume>, no. <issue>2</issue>, pp. <fpage>619</fpage>&#x2013;<lpage>626</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Arora</surname></string-name> and <string-name><given-names>S.</given-names> <surname>Singh</surname></string-name></person-group>, &#x201C;<article-title>Butterfly optimization algorithm: A novel approach for global optimization</article-title>,&#x201D; <source>Soft Computing</source>, vol. <volume>23</volume>, no. <issue>3</issue>, pp. <fpage>715</fpage>&#x2013;<lpage>734</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-20"><label>[20]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T.</given-names> <surname>Vetriselvi</surname></string-name>, <string-name><given-names>E. L.</given-names> <surname>Lydia</surname></string-name>, <string-name><given-names>S. N.</given-names> <surname>Mohanty</surname></string-name>, <string-name><given-names>E.</given-names> <surname>Alabdulkreem</surname></string-name>, <string-name><given-names>S. A.</given-names> <surname>Otaibi</surname></string-name> <etal>et al.</etal></person-group>, &#x201C;<article-title>Deep learning based license plate number recognition for smart cities</article-title>,&#x201D; <source>Computers, Materials &#x0026; Continua</source>, vol. <volume>70</volume>, no. <issue>1</issue>, pp. <fpage>2049</fpage>&#x2013;<lpage>2064</lpage>, <year>2022</year>.</mixed-citation></ref>
</ref-list>
</back>
</article>
