<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xml:lang="en" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">CMC</journal-id>
<journal-id journal-id-type="nlm-ta">CMC</journal-id>
<journal-id journal-id-type="publisher-id">CMC</journal-id>
<journal-title-group>
<journal-title>Computers, Materials &#x0026; Continua</journal-title>
</journal-title-group>
<issn pub-type="epub">1546-2226</issn>
<issn pub-type="ppub">1546-2218</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">45975</article-id>
<article-id pub-id-type="doi">10.32604/cmc.2024.045975</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>AutoRhythmAI: A Hybrid Machine and Deep Learning Approach for Automated Diagnosis of Arrhythmias</article-title>
<alt-title alt-title-type="left-running-head">AutoRhythmAI: A Hybrid Machine and Deep Learning Approach for Automated Diagnosis of Arrhythmias</alt-title>
<alt-title alt-title-type="right-running-head">AutoRhythmAI: A Hybrid Machine and Deep Learning Approach for Automated Diagnosis of Arrhythmias</alt-title>
</title-group>
<contrib-group>
<contrib id="author-1" contrib-type="author" corresp="yes">
<name name-style="western"><surname>Jayanthi</surname><given-names>S.</given-names></name><email>js8683@srmist.edu.in</email></contrib>
<contrib id="author-2" contrib-type="author">
<name name-style="western"><surname>Prasanna Devi</surname><given-names>S.</given-names></name></contrib>
<aff><addr-line><institution>Department of Computer Science and Engineering, SRM Institute of Science and Technology&#x2014;Vadapalani Campus</institution>, Chennai, Tamil Nadu</addr-line>, <country>India</country></aff>
</contrib-group>
<author-notes>
<corresp id="cor1"><label>&#x002A;</label>Corresponding Author: S. Jayanthi. Email: <email>js8683@srmist.edu.in</email></corresp>
</author-notes>
<pub-date date-type="collection" publication-format="electronic"><year>2024</year></pub-date>
<pub-date date-type="pub" publication-format="electronic"><day>27</day><month>2</month><year>2024</year></pub-date>
<volume>78</volume>
<issue>2</issue>
<fpage>2137</fpage>
<lpage>2158</lpage>
<history>
<date date-type="received"><day>13</day><month>9</month><year>2023</year>
</date>
<date date-type="accepted"><day>27</day><month>10</month><year>2023</year>
</date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2024 Jayanthi and Devi</copyright-statement>
<copyright-year>2024</copyright-year>
<copyright-holder>Jayanthi and Devi</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_CMC_45975.pdf"></self-uri>
<abstract>
<p>In healthcare, the persistent challenge of arrhythmias, a leading cause of global mortality, has sparked extensive research into the automation of detection using machine learning (ML) algorithms. However, traditional ML and AutoML approaches have revealed their limitations, notably regarding feature generalization and automation efficiency. This glaring research gap has motivated the development of AutoRhythmAI, an innovative solution that integrates both machine and deep learning to revolutionize the diagnosis of arrhythmias. Our approach encompasses two distinct pipelines tailored for binary-class and multi-class arrhythmia detection, effectively bridging the gap between data preprocessing and model selection. To validate our system, we have rigorously tested AutoRhythmAI using a multimodal dataset, surpassing the accuracy achieved using a single dataset and underscoring the robustness of our methodology. In the first pipeline, we employ signal filtering and ML algorithms for preprocessing, followed by data balancing and splitting for training. The second pipeline is dedicated to feature extraction and classification, utilizing deep learning models. Notably, we introduce the &#x2018;RRI-convoluted transformer model&#x2019; as a novel addition for binary-class arrhythmias. An ensemble-based approach then amalgamates all models, considering their respective weights, resulting in an optimal model pipeline. In our study, the VGGRes Model achieved impressive results in multi-class arrhythmia detection, with an accuracy of 97.39% and firm performance in precision (82.13%), recall (31.91%), and F1-score (82.61%). In the binary-class task, the proposed model achieved an outstanding accuracy of 96.60%. These results highlight the effectiveness of our approach in improving arrhythmia detection, with notably high accuracy and well-balanced performance metrics.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>Automated machine learning</kwd>
<kwd>neural networks</kwd>
<kwd>deep learning</kwd>
<kwd>arrhythmias</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<label>1</label>
<title>Introduction</title>
<p>Arrhythmias, abnormal heart rhythms ranging from benign to life-threatening, can be triggered by factors like undiagnosed cardiac disease, adverse drug reactions, and metabolic abnormalities [<xref ref-type="bibr" rid="ref-1">1</xref>]. Analyzing the ECG data for heart electrical activity anomalies is crucial for arrhythmia detection. Traditional methods involve continuous manual monitoring of ECG signals, but researchers firmly believe that leveraging AI-guided ECGs to diagnose conditions like atrial fibrillation has several benefits. According to the World Health Organization (WHO) [<xref ref-type="bibr" rid="ref-2">2</xref>], cardiovascular diseases, including arrhythmias, account for approximately 17.9 million deaths annually, representing 31% of all global deaths as shown in <xref ref-type="fig" rid="fig-1">Fig. 1</xref>. These alarming statistics underscore the urgency of developing efficient and accurate arrhythmia diagnosis methods. To determine the applicability of AI-ECG procedures in various clinical contexts and patient demographics, researchers are planning multicenter hybrid experiments that incorporate AI algorithms, such as machine learning and deep learning models [<xref ref-type="bibr" rid="ref-3">3</xref>,<xref ref-type="bibr" rid="ref-4">4</xref>]. Swift and accurate arrhythmia diagnosis requires powerful automatic algorithms, and AI promises to fulfill this critical need.</p>
<fig id="fig-1">
<label>Figure 1</label>
<caption>
<title>Statistics report by WHO</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-1.tif"/>
</fig>
<p>AutoML is becoming increasingly popular because it can drastically save the time and expense needed to create machine-learning models while increasing their efficacy and accuracy [<xref ref-type="bibr" rid="ref-5">5</xref>]. Electrocardiogram (ECG) signal analysis using AutoML has attracted a lot of interest because it can potentially increase the precision and effectiveness of this type of analysis [<xref ref-type="bibr" rid="ref-6">6</xref>]. Research strongly suggests Auto-ML techniques for automating and accelerating disease diagnosis [<xref ref-type="bibr" rid="ref-7">7</xref>]. However, these techniques have the following issues:</p>
<p>1. Low availability of ECG datasets: Strict regulations govern the access and distribution of the data due to its sensitive nature. Because of this, creating and testing auto-ML models with large, varied ECG datasets may be challenging [<xref ref-type="bibr" rid="ref-8">8</xref>].</p>
<p>2. Managing noisy ECG signals: Movement artifacts, electrode impedance, and electromagnetic interference are some conditions that might cause noisy ECG signals [<xref ref-type="bibr" rid="ref-9">9</xref>]. As a result, utilizing auto-ML models to recognize and categorize ECG characteristics may be challenging. Research is required to create reliable autoML models that handle noisy ECG readings.</p>
<p>3. Powerful automation algorithms: Several ML and deep learning algorithms are available. AutoML supports several automation tools, such as Auto-Sklearn [<xref ref-type="bibr" rid="ref-10">10</xref>], TPOT [<xref ref-type="bibr" rid="ref-11">11</xref>], and Auto-Keras [<xref ref-type="bibr" rid="ref-12">12</xref>]. From an ECG research point of view, these tools must be robust and more accurate. An AutoML framework begins with pre-processing, wherein user input facilitates the conversion of categorical data into integers (e.g., via a label encoder) [<xref ref-type="bibr" rid="ref-13">13</xref>]. Nonetheless, an AutoML pipeline for ECG necessitates the incorporation of multiple pre-processing tasks, and frameworks such as Auto-Keras do not support generalized feature extraction. A dedicated ECG auto-ML tool is therefore needed to provide powerful automation algorithms for processing and classification. Our study aims to collect generic ECG features using extraction and pre-processing techniques to eliminate noisy signals.</p>
<p>This work aims at automatic heart disease detection through both binary and multi-class classification of arrhythmias on ECG signals using the proposed AutoRhythmAI techniques. The main contributions are as follows:</p>
<p>1. Focus on multimodal ECG datasets, enhancing AutoML algorithms&#x2019; robustness to noise in ECG Signals by incorporating various data types, improving diagnostic accuracy.</p>
<p>2. Introduction of a custom AutoRhythmAI model pipeline, featuring advanced deep learning models for both binary and multi-class arrhythmia detection.</p>
<p>3. Introduction of a novel ensemble method combining multiple deep learning models, such as ResIncept, InceptVGG, VGGRes, and LenetAlexLSTM, to achieve accurate multi-class arrhythmia diagnosis, and a novel hybrid RRI peak detection &#x002B; CNN &#x002B; Transformer model for binary arrhythmia detection.</p>
<p>The remainder of the paper is organized as follows: <xref ref-type="sec" rid="s2">Section 2</xref> presents the pertinent literature, while <xref ref-type="sec" rid="s3">Section 3</xref> provides a comprehensive exploration of the methodology, including the multi-model databases, comparisons, analyses of study groups, key attributes, and research challenges. <xref ref-type="sec" rid="s4">Section 4</xref> outlines the results and their synthesis, while <xref ref-type="sec" rid="s5">Section 5</xref> concludes with a discussion and future prospects.</p>
</sec>
<sec id="s2">
<label>2</label>
<title>Related Work</title>
<p><bold>Signal Processing Techniques:</bold> ECG data generally contains noise that requires elimination. Various signal processing techniques form the foundation for reducing noise, extending beyond elimination to serve as feature extraction methods. Filtering, the first and most prevalent technique [<xref ref-type="bibr" rid="ref-14">14</xref>], removes noise and artifacts from ECG signals to enhance signal quality. Standard filters encompass high-pass, low-pass, and notch filters. Feature extraction involves deriving pertinent attributes from ECG signals for arrhythmia detection [<xref ref-type="bibr" rid="ref-15">15</xref>]. Standard features include RR intervals, QRS complex duration, and ST segment deviation. Wavelet analysis decomposes ECG signals into wavelet coefficients at different scales [<xref ref-type="bibr" rid="ref-16">16</xref>] for feature extraction, denoising, and compression. Time-frequency analysis scrutinizes ECG signals across both time and frequency domains, utilizing methods like the short-time Fourier transform and the continuous wavelet transform, and assists in feature extraction and denoising of ECG signals. ML algorithms [<xref ref-type="bibr" rid="ref-17">17</xref>] can independently learn to classify ECG signals into various arrhythmia classes; standard algorithms encompass support vector machines, decision trees, and neural networks. Hence, filtering and feature extraction techniques enhance the diagnostic accuracy of ECG arrhythmia detection, making the resulting signal more straightforward to analyze and extract relevant features from. These techniques also support continuous, real-time ECG monitoring of patients with arrhythmias, aiding timely detection and intervention.</p>
<p><bold>Deep Learning Techniques:</bold> Deep learning models are a subset of machine learning models used mainly to automatically detect features [<xref ref-type="bibr" rid="ref-18">18</xref>,<xref ref-type="bibr" rid="ref-19">19</xref>] and classify them [<xref ref-type="bibr" rid="ref-20">20</xref>] accordingly. Deep learning techniques have shown great potential in ECG signal classification. Several algorithms support ECG classification, including CNN, RNN, autoencoders, and GAN [<xref ref-type="bibr" rid="ref-21">21</xref>]. Hybrid deep learning algorithms offer several advantages over single deep learning techniques, such as improved accuracy, reduced overfitting, transfer learning, and scalability. These algorithms include the Convolutional Neural Network&#x2014;Recurrent Neural Network (CNN&#x2014;RNN), the Convolutional Neural Network&#x2014;Long Short-Term Memory (CNN&#x2014;LSTM), the Residual Neural Network&#x2014;Long Short-Term Memory (ResNet&#x2014;LSTM) [<xref ref-type="bibr" rid="ref-22">22</xref>], and so on. The choice of a particular hybrid algorithm depends on the specific requirements of the ECG signal classification task at hand. Furthermore, there is a lack of time series-based deep learning algorithms for continuous monitoring of ECG signals. Hybrid models typically have many parameters, which can lead to overfitting and reduced generalization performance, especially when dealing with small datasets.</p>
<p><bold>AutoML Techniques:</bold> AutoML for ECG classification aims to automate the machine learning pipeline, including feature engineering, algorithm selection, hyperparameter tuning, and model ensembling. In reference [<xref ref-type="bibr" rid="ref-23">23</xref>], Auto-Keras, TPOT, Auto-Sklearn, and H2O commonly use predefined auto-ML tools. They automate feature engineering, model selection, and hyperparameter tuning to enhance accuracy in arrhythmia detection from ECG data. Researchers also employ AutoML for signal processing, ensemble building, transfer learning, hyperparameter optimization, and addressing class imbalance. These techniques streamline arrhythmia diagnosis and contribute to cardiac health monitoring.</p>
<p><xref ref-type="table" rid="table-1">Table 1</xref> summarizes the techniques used so far for ECG arrhythmia classification and their research gaps.</p>
<table-wrap id="table-1">
<label>Table 1</label>
<caption>
<title>Techniques used in ECG classification and its limitations</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th>Techniques</th>
<th>Ref. No.</th>
<th>Methods</th>
<th>Advantages</th>
<th>Limitations</th>
<th>Research gaps</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>Signal processing techniques</td>
<td>[<xref ref-type="bibr" rid="ref-24">24</xref>]</td>
<td>ML algorithm: PCA for R peak detection</td>
<td>PCA provides more accurate and robust results than traditional peak detection algorithms.</td>
<td>One limitation of the paper is that it only evaluates the proposed PCA-based R-peak detection method on a single dataset.</td>
<td>Studies on a broader range of datasets may be needed to fully evaluate the effectiveness of the proposed method.</td>
</tr>
<tr>
<td/>
<td>[<xref ref-type="bibr" rid="ref-25">25</xref>]</td>
<td>Filter technique: Matched filter</td>
<td>The proposed approach uses a matched filter to pre-process ECG signals and extract informative features related to arrhythmia patterns.</td>
<td>The paper does not provide a detailed analysis of the features learned by the CNN or how they relate to arrhythmia patterns.</td>
<td>Enhanced algorithms could help improve the interpretability of the proposed approach.</td>
</tr>
<tr>
<td/>
<td/>
<td>Feature Extraction: CNN</td>
<td/>
<td/>
<td/>
</tr>
<tr>
<td>Deep learning techniques</td>
<td>[<xref ref-type="bibr" rid="ref-26">26</xref>]</td>
<td>CNN</td>
<td>CNN, RNN, and FCNN are trained separately and then combined for final classification.</td>
<td>Do not provide a detailed analysis of the features learned by the CNN or how they relate to arrhythmia.</td>
<td>Detailed patterns that need to be analyzed could help improve the interpretability of the proposed approach.</td>
</tr>
<tr>
<td/>
<td/>
<td>RNN</td>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td/>
<td>FCNN</td>
<td/>
<td/>
<td/>
</tr>
<tr>
<td/>
<td>[<xref ref-type="bibr" rid="ref-27">27</xref>]</td>
<td>ResNet-50 model</td>
<td>The proposed approach is evaluated on a large dataset of 12-lead ECGs and reports high accuracy in ECG diagnosis, including classification of arrhythmias and detection of ischemic heart disease.</td>
<td>ResNet-50 model parameters are missing for ECG patterns analysis.</td>
<td>ResNet-50 may increase the computational complexity for real-time applications.</td>
</tr>
<tr>
<td>Auto-ML techniques</td>
<td>[<xref ref-type="bibr" rid="ref-28">28</xref>]</td>
<td>Auto-ML tools</td>
<td>States the use of auto-ML for large medical datasets and its risk of overfitting.</td>
<td>Potential risks and ethical considerations associated with the use of machine learning in healthcare and laboratory medicine.</td>
<td>For multi-modal datasets, auto-ML supports the generalization of the proposed model.</td>
</tr>
<tr>
<td></td>
<td>[<xref ref-type="bibr" rid="ref-29">29</xref>]</td>
<td>Auto-keras, TPOT, and H2O.ai</td>
<td>Used a Butterworth filter for pre-processing and then classified different ECG leads.</td>
<td>The need for high-quality labeled data.</td>
<td>Does not ensure interpretability of the resulting models.</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>Researchers have given limited focus to auto-ML techniques for ECG classification. Since auto-ML is a hot research topic and arrhythmia detection plays a vital role in human health, combining the two provides a platform for the development of AI techniques.</p>
</sec>
<sec id="s3">
<label>3</label>
<title>Materials and Methods</title>
<sec id="s3_1">
<label>3.1</label>
<title>Multi Modal ECG Dataset</title>
<p>A multimodal ECG dataset collects electrocardiogram (ECG) data that includes additional modalities or signals, such as respiratory signals or blood pressure. These different signals can provide complementary information that can improve the accuracy of ECG analysis, such as detecting arrhythmias, identifying cardiac abnormalities, or monitoring vital signs during medical procedures. Multimodal ECG datasets are often publicly available, which allows researchers and developers to benchmark their algorithms against existing methods and collaborate with other experts in the field. In this research, we utilized datasets such as the China Physiological Signal Challenge 2018 (CPSC2018), the St. Petersburg INCART 12-lead Arrhythmia Database, the Georgia 12-lead ECG Challenge Database (CinC2020), the PhysioNet Computing in Cardiology Challenge 2017 (CinC2017), and the Contextual Arrhythmia Database (CACHET-CADB) [<xref ref-type="bibr" rid="ref-30">30</xref>] as shown in <xref ref-type="fig" rid="fig-2">Fig. 2</xref>. The above databases have two data formats: header files (.hea) and mat files (.mat). The mat file consists of patient ECG signals. The header file consists of 12 lead signal values and their diagnostic values. <xref ref-type="table" rid="table-2">Table 2</xref> shows the summarization of the multimodal ECG database used.</p>
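<p>As an illustrative sketch (not the paper's exact loader), a record in the .mat/.hea format described above can be read with SciPy. The <monospace>'val'</monospace> key follows the PhysioNet/CinC challenge convention for the signal matrix; the lead count and sample length here are placeholders.</p>

```python
# Hedged sketch: write and read a synthetic challenge-style .mat record.
# The 'val' key and the (leads, samples) layout follow the PhysioNet/CinC
# challenge convention; the shapes are illustrative only.
import numpy as np
from io import BytesIO
from scipy.io import savemat, loadmat

sig = np.random.randn(12, 5000)          # 12 leads, 5000 samples (toy)
buf = BytesIO()
savemat(buf, {"val": sig})               # signal stored under 'val'
buf.seek(0)

record = loadmat(buf)["val"]             # shape: (leads, samples)
print(record.shape)                      # (12, 5000)

# The matching header (.hea) is plain text: one line per lead plus
# comment lines carrying the diagnostic labels.
```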
<fig id="fig-2">
<label>Figure 2</label>
<caption>
<title>Different types of arrhythmias signals used (X-axis &#x003D; time Y-axis &#x003D; amplitude)</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-2.tif"/>
</fig><table-wrap id="table-2">
<label>Table 2</label>
<caption>
<title>Summarization of multimodal ECG database</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th>Dataset name</th>
<th>Number of subjects</th>
<th>Number of female (F) and male (M) subjects</th>
<th>Length of ECG signals recorded (in seconds (s))</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>CPSC2018</td>
<td>6877</td>
<td>M: 3699 F: 3178</td>
<td>6&#x2013;60 s</td>
</tr>
<tr>
<td>St. Petersburg INCART</td>
<td>32</td>
<td>M: 17 F: 15</td>
<td>1800 s</td>
</tr>
<tr>
<td>Georgia</td>
<td>10,344</td>
<td>M: 5551 F: 4793</td>
<td>10 s</td>
</tr>
<tr>
<td>CACHET-CADB</td>
<td>24</td>
<td>M:15 F:9</td>
<td>10 s (up to 24 h)</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="s3_2">
<label>3.2</label>
<title>Multimodal Dataset Collaboration and Data Balancing</title>
<p>This section focuses on merging multiple datasets while retaining all data attributes. We initialize empty lists for data, labels, filenames, genders, and ages. For each dataset, we load it separately and append its data and associated information to their respective lists. We then concatenate these lists into single NumPy arrays, ensuring the correct order. The resulting merged dataset retains all the original data and labels. Regarding data balancing, we extracted approximately 3,252 abnormal and 2,568 normal signals from each dataset, rendering additional data-balancing techniques unnecessary as the dataset already maintains a balanced class distribution.</p>
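<p>The merging step above can be sketched as follows; the per-dataset dictionary keys and array shapes are illustrative, not the paper's actual code.</p>

```python
# Sketch of the dataset-merging step: per-dataset arrays are appended to
# lists, then concatenated into single NumPy arrays (names are illustrative).
import numpy as np

datasets = [
    {"data": np.random.randn(3, 500), "labels": np.array([0, 1, 0])},
    {"data": np.random.randn(2, 500), "labels": np.array([1, 1])},
]

all_data, all_labels = [], []
for ds in datasets:
    all_data.append(ds["data"])          # preserve per-dataset order
    all_labels.append(ds["labels"])

data = np.concatenate(all_data, axis=0)      # merged signals, (5, 500)
labels = np.concatenate(all_labels, axis=0)  # merged labels, (5,)
```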
</sec>
<sec id="s3_3">
<label>3.3</label>
<title>Custom AutoRhythmAI Model for ECG Arrhythmias</title>
<p>AutoML capabilities are poised to take the first step towards AI standards by delivering ML with the push of a button or, at the very least, by concealing algorithm execution, data pipelines, and code from view [<xref ref-type="bibr" rid="ref-31">31</xref>]. This motivates the methodology proposed in this research. Because AutoML is a framework still under development for automating tasks, existing frameworks cannot build a hybrid model for automation based on our own ECG customization. The proposed Custom AutoRhythmAI steps, which suit ECG arrhythmia detection for multi-model databases and hybrid model automation, are discussed below. <xref ref-type="fig" rid="fig-3">Fig. 3</xref> illustrates the working pipeline of the designed AutoML.</p>
<fig id="fig-3">
<label>Figure 3</label>
<caption>
<title>Proposed model architecture</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-3.tif"/>
</fig>
<p><bold>Steps:</bold> AutoRhythmAI model for ECG arrhythmias</p>
<p>Step 1: Import Python packages such as pandas, NumPy, and so on for computation and processing of the dataset.</p>
<p>Step 2: Set the parameters required to understand the problem and computations.</p>
<p>Step 3: Design the preprocessed pipelines for processing the ECG signals. In this model, there are two preprocessed pipelines based on filtering techniques.</p>
<p>Step 4: Construct the class for feature extraction and classification&#x2014;one class for multi-class data models and another type for binary class data models.</p>
<p>Level 1: Multi-Class Model: Create a multi-class class on one pipeline. We have used algorithms such as ANN, ResNet-50, Lenet-5, AlexNet, VGG-16, Inception, LSTM, ResIncept, InceptVGG, VGGRes, and LenetAlexLSTM model on that pipeline. This class processes all the test data, yielding the output of the best model for detection.</p>
<p>Level 2: Binary-class Model: Create a binary class on another pipeline. The novel algorithm called &#x201C;RRI-Convoluted Transformer Model&#x201D; runs on that pipeline, taking the RRI features along with the highlighted CNN features for classification. Since ECG signals are time series data, a transformer model is used.</p>
<p>Step 5: Train the custom model on the training set using the defined loss function and optimization algorithm.</p>
<p>Step 6: Evaluate the performance of the hybrid model on the validation set.</p>
<p>Step 7: Calculate the test set&#x2019;s performance metrics, such as accuracy, sensitivity, and specificity.</p>
<p>Thus, even non-experts in data science can obtain good accuracy at the end of these procedural steps with minimal computing time.</p>
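<p>Steps 5&#x2013;7 can be sketched with a stand-in classifier; the paper's actual hybrid model is not reproduced here, and the data, model, and split sizes are illustrative. The metrics follow the standard confusion-matrix definitions of accuracy, sensitivity, and specificity.</p>

```python
# Hedged sketch of Steps 5-7: train a stand-in classifier, then compute
# accuracy, sensitivity, and specificity from the confusion matrix.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))                       # toy feature matrix
y = (X[:, 0] + 0.1 * rng.normal(size=200) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=500).fit(X_tr, y_tr)  # stand-in model

tn, fp, fn, tp = confusion_matrix(y_te, clf.predict(X_te)).ravel()
accuracy = (tp + tn) / (tp + tn + fp + fn)
sensitivity = tp / (tp + fn)    # recall on the positive (arrhythmia) class
specificity = tn / (tn + fp)    # true-negative rate
```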
</sec>
<sec id="s3_4">
<label>3.4</label>
<title>Pre-Processing Levels</title>
<p>In the existing research, auto-ML pre-processing [<xref ref-type="bibr" rid="ref-32">32</xref>] includes label encoders, one-hot encoding, target encoders, compression techniques [<xref ref-type="bibr" rid="ref-33">33</xref>], etc. Some of these techniques are unsuitable for ECG processing, so traditional AutoML will not work for our application. Thus, this research develops filtering-technique pipelines to eliminate signal noise. In this context, we categorize the pre-processing stages into binary-class and multi-class channels. For binary-class pre-processing, we apply bandpass filtering to eliminate signal noise and artifacts. The primary purpose of the bandpass filter is to consolidate the low-pass and high-pass filter values, as described in Pseudocode 1; this acts as a basis for combining signals with different sample sizes.</p>
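<p>The band-pass step can be sketched with a Butterworth filter from SciPy; the sampling rate, filter order, and 0.5&#x2013;40 Hz cutoffs below are illustrative assumptions, not the paper's exact values.</p>

```python
# Sketch of the band-pass pre-processing: a Butterworth band-pass
# (sampling rate and cutoffs are illustrative, not the paper's values).
import numpy as np
from scipy.signal import butter, filtfilt

fs = 500.0                       # sampling rate in Hz (assumed)
low, high = 0.5, 40.0            # pass band in Hz (assumed)
b, a = butter(N=4, Wn=[low / (fs / 2), high / (fs / 2)], btype="band")

t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.5 * np.random.randn(t.size)  # toy signal
filtered = filtfilt(b, a, ecg)   # zero-phase filtering keeps wave timing intact
```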
<p>On the other hand, the multi-class pre-processing technique has 27 distribution classes. So, we ensured a consistent signal length and performed labeling of the data. After that, we applied one-hot encoding with the MultiLabelBinarizer function to process the data, as described in Pseudocode 2.</p>
<fig id="fig-9">
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-9.tif"/>
</fig>
<fig id="fig-10">
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-10.tif"/>
</fig>
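<p>The multi-label encoding step from Pseudocode 2 can be sketched with scikit-learn's <monospace>MultiLabelBinarizer</monospace>; the diagnosis codes below are illustrative placeholders for the 27 classes.</p>

```python
# Sketch of the multi-label one-hot step: MultiLabelBinarizer turns each
# record's set of diagnosis codes into a fixed-length binary vector
# (the label codes here are illustrative).
from sklearn.preprocessing import MultiLabelBinarizer

record_labels = [
    {"AF"},                # atrial fibrillation only
    {"AF", "RBBB"},        # two diagnoses on one record
    {"NSR"},               # normal sinus rhythm
]

mlb = MultiLabelBinarizer()
Y = mlb.fit_transform(record_labels)
print(mlb.classes_)        # alphabetical class order: ['AF' 'NSR' 'RBBB']
print(Y)                   # one row per record, one column per class
```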
</sec>
<sec id="s3_5">
<label>3.5</label>
<title>Validation Schema</title>
<p>In health analysis data and AutoML, the traditional IID assumption may not hold due to rapidly changing ecosystem activities. This requires alternative validation methods like K-fold cross-validation or custom splits. In this case, we employed a 10-fold stratified cross-validation on a dataset of 52,369 ECGs. Custom splits can also be used based on the order of data and signal sequences for training and validation. These methods are crucial for accurate model performance estimation and hyperparameter tuning in AutoML.</p>
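<p>The 10-fold stratified scheme described above can be sketched as follows; the label counts are toy values chosen so each validation fold visibly preserves the class proportions.</p>

```python
# Sketch of 10-fold stratified cross-validation: every validation fold
# preserves the class proportions of the full label set (toy 60/40 split).
import numpy as np
from sklearn.model_selection import StratifiedKFold

y = np.array([0] * 60 + [1] * 40)       # toy binary labels
X = np.random.randn(y.size, 8)

skf = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
for train_idx, val_idx in skf.split(X, y):
    # each 10-sample validation fold mirrors the 60/40 overall ratio
    assert np.bincount(y[val_idx]).tolist() == [6, 4]
```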
</sec>
<sec id="s3_6">
<label>3.6</label>
<title>Feature Selection and Classification</title>
<p>Feature selection or feature extraction [<xref ref-type="bibr" rid="ref-34">34</xref>] is an essential step in any machine learning model, including deep learning models. In deep learning, feature selection chooses the most relevant features from a dataset to improve the model&#x2019;s performance and efficiency. Deep learning models can easily overfit the training data if trained on too many irrelevant or redundant features. Feature selection helps to reduce overfitting by keeping only the most informative features and removing unrelated data, which improves the model&#x2019;s ability to generalize to new data. In this work, we have two levels of feature processing.</p>
<p><bold>Level 1 Feature selection for Multi-class ECG data:</bold> In our comprehensive pipeline, we have harnessed a diverse array of algorithms, ranging from traditional machine learning approaches such as the Artificial Neural Network (ANN) to deep learning architectures like ResNet-50, Lenet-5, AlexNet, VGG-16, Inception, and LSTM. Moreover, we have combined these models into hybrid configurations, such as the ResIncept Model (ResNet-50 &#x002B; Inception), InceptVGG Model (Inception &#x002B; VGG-16), VGGRes Model (VGG-16 &#x002B; ResNet-50), and the LenetAlexLSTM Model (Lenet-5 &#x002B; AlexNet &#x002B; LSTM), as described in algorithm [<xref ref-type="bibr" rid="ref-1">1</xref>]. This multi-faceted approach enables us to harness the strengths of diverse algorithms simultaneously, resulting in enhanced performance, adaptability, and resilience on complex tasks. We simultaneously pass each pre-processed input through these algorithms, merge the outcomes, and evaluate the best-sorted features.</p>
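<p>The weight-based merging of model outcomes can be sketched as a weighted average of class-probability matrices; the three stand-in models, their probabilities, and the weights below are illustrative, not the trained networks or learned weights of this study.</p>

```python
# Hedged sketch of weight-based ensembling: each model's class-probability
# matrix is combined with a weight (all numbers are illustrative).
import numpy as np

# Probabilities from three stand-in models for 4 samples x 3 classes.
preds = np.stack([
    np.array([[0.7, 0.2, 0.1], [0.1, 0.8, 0.1], [0.3, 0.3, 0.4], [0.6, 0.3, 0.1]]),
    np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1], [0.2, 0.5, 0.3], [0.5, 0.4, 0.1]]),
    np.array([[0.8, 0.1, 0.1], [0.1, 0.6, 0.3], [0.1, 0.2, 0.7], [0.7, 0.2, 0.1]]),
])
weights = np.array([0.5, 0.3, 0.2])      # e.g. validation-accuracy based

ensemble = np.tensordot(weights, preds, axes=1)  # weighted average, (4, 3)
final_class = ensemble.argmax(axis=1)            # ensemble decision per sample
```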
<p><bold>Level 2 Feature selection for Binary-class ECG data:</bold> For Binary class ECG data classification, we introduced the novel hybrid combination approach named &#x201C;Novel RRI Convoluted Transformer model&#x201D; for time series ECG signal classification. <xref ref-type="fig" rid="fig-4">Fig. 4</xref> shows the proposed methodology, which includes the main stages of detection of RR features, feature extraction, and feature classification using the newly proposed deep learning techniques.</p>
<fig id="fig-4">
<label>Figure 4</label>
<caption>
<title>Working model of simulated deep convolutional transformer model</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-4.tif"/>
</fig>
<fig id="fig-11">
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-11.tif"/>
</fig>
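<p>The RRI detection stage can be sketched with SciPy's peak finder: R peaks are located and their successive differences give the R-R intervals. The toy impulse-train signal and thresholds below are illustrative, not the study's detector settings.</p>

```python
# Sketch of the RRI stage: detect R peaks with find_peaks and take
# successive differences as R-R intervals (thresholds are illustrative).
import numpy as np
from scipy.signal import find_peaks

fs = 250.0                                  # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / fs)
ecg = np.zeros(t.size)
ecg[::int(fs)] = 1.0                        # toy ECG: one spike per second

r_peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
rri = np.diff(r_peaks) / fs                 # R-R intervals in seconds
```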
<p>We have shown the variation between detecting the RRI peak values and using raw signals in this model. The purpose of the proposed model is to capture the local and global temporal features of ECG signals. The model receives a 1D ECG signal as input and processes it through convolutional layers, which employ filters of different sizes to capture features at various time scales. Following the convolutional layers, the model passes the output through a sequence of transformer blocks. These transformer blocks consist of self-attention and feed-forward layers, allowing the model to grasp dependencies between signal parts and acquire higher-level features. Overall, the convolutional transformer model is a robust architecture for ECG signal classification, as it can effectively capture both local and global temporal features and has shown promising results in various studies. The generalized features extracted are described in <xref ref-type="table" rid="table-3">Table 3</xref>. The layers and their corresponding equations are described in <xref ref-type="table" rid="table-4">Table 4</xref>.</p>
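<p>The self-attention operation inside each transformer block can be sketched in NumPy; the sequence length, embedding dimension, and random projection matrices are toy stand-ins for the learned parameters.</p>

```python
# Minimal single-head self-attention over an embedded ECG sequence,
# illustrating the Q/K/V computation inside each transformer block
# (dimensions and weights are toy stand-ins for learned parameters).
import numpy as np

rng = np.random.default_rng(0)
T, d = 16, 8                      # sequence length and embedding dim (toy)
H = rng.normal(size=(T, d))       # stand-in output of the convolutional layers

W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = H @ W_q, H @ W_k, H @ W_v

scores = Q @ K.T / np.sqrt(d)                        # scaled dot-product
attn = np.exp(scores - scores.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)             # softmax over keys
out = attn @ V                                       # (T, d) attended features
```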
<table-wrap id="table-3">
<label>Table 3</label>
<caption>
<title>Feature extracted components</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th>Component</th>
<th>Description</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>Input</td>
<td>RRI signals (R-R interval signals)</td>
</tr>
<tr>
<td>Feature extraction</td>
<td>Convolutional neural network (CNN) layers for local pattern extraction and transformer architecture for capturing long-range dependencies</td>
</tr>
<tr>
<td>Features extracted</td>
<td>Heart rate variability metrics, temporal patterns, and statistical properties of RRI intervals</td>
</tr>
<tr>
<td>Model layers</td>
<td>Multiple layers combining CNN and transformers layers</td>
</tr>
<tr>
<td>Output</td>
<td>Predictions or classifications related to arrhythmia detection</td>
</tr>
</tbody>
</table>
</table-wrap><table-wrap id="table-4">
<label>Table 4</label>
<caption>
<title>RRI Convoluted transformer model layers and its descriptions</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th>Layers and components</th>
<th>Equations</th>
<th>Description</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>Embedding</td>
<td>E<sub>i</sub> &#x003D; Embedding(X<sub>i</sub>)</td>
<td>E &#x2208; &#x211D;<sup>(T &#x00D7; d)</sup></td>
</tr>
<tr>
<td/>
<td>X<sub>i</sub> &#x003D; Input ECG sequences</td>
<td> where d is the embedding dimension</td>
</tr>
<tr>
<td>Positional encoding</td>
<td>PE<sub>(pos, i)</sub> &#x003D; sin(pos/10000^(2 &#x002A; i/d)) for even</td>
<td>PE<sub>(pos, i)</sub> represents the (pos, i)-th element of the </td>
</tr>
<tr>
<td/>
<td>PE<sub>(pos, i)</sub> &#x003D; cos(pos/10000^(2 &#x002A; (i&#x2212;1)/d)) for odd</td>
<td>positional encoding matrix</td>
</tr>
<tr>
<td>Convolutional layer</td>
<td>H &#x003D; Convolution (E<sub>i</sub> &#x002B; PE<sub>(pos, i)</sub>)</td>
<td>The convolutional layers, typically used for feature extraction, process the ECG sequence to capture relevant patterns</td>
</tr>
<tr>
<td rowspan="4">Multihead attention layer</td>
<td>Q &#x003D; HW_Q,</td>
<td rowspan="3">Q, K, and V are the query, key, and value matrices, respectively</td>
</tr>
<tr>
<td>K &#x003D; HW_K,</td>
</tr>
<tr>
<td rowspan="2">V &#x003D; HW_V,</td>
</tr>
<tr>
<td>W_Q, W_K, and W_V are weight matrices for projecting the input into query, key, and value spaces</td>
</tr>
<tr>
<td>Attention score</td>
<td>A<sub>h</sub> &#x003D; softmax((Q<sub>h</sub> &#x002A; K<sub>h</sub>^T)/sqrt(d))</td>
<td>The A<sub>h</sub> is calculated by taking the softmax of the scaled dot product between the query and key vectors, divided by the square root of the dimensionality (d)</td>
</tr>
<tr>
<td>Weighted sum</td>
<td>V<sub>h</sub> &#x003D; A<sub>h</sub> &#x002A; V<sub>h</sub></td>
<td>The V<sub>h</sub> is computed by element-wise multiplying the attention scores A<sub>h</sub> with the value vectors V<sub>h</sub></td>
</tr>
<tr>
<td>MultiHeadOutput</td>
<td>MultiHeadOutput &#x003D; Concatenate(V<sub>1</sub>, V<sub>2</sub>,..., V<sub>H</sub>) &#x002A; W<sub>O</sub></td>
<td>Concatenating the output value vectors from multiple attention heads (V1, V2, ..., VH)</td>
</tr>
<tr>
<td>Position-wise feedforward networks</td>
<td>FFN_output &#x003D; ReLU(W_1 &#x002A; MultiHeadOutput &#x002B; b_1) &#x002A; W_2 &#x002B; b_2</td>
<td>The feedforward network typically consists of two linear layers with a ReLU activation</td>
</tr>
<tr>
<td>Residual connections and layer normalization</td>
<td>Output1 &#x003D; LayerNorm(MultiHeadOutput &#x002B; Sublayer(MultiHeadOutput))</td>
<td>Residual connections are used to add the original input to the output of each sub-layer before applying layer normalization</td>
</tr>
<tr>
<td>Layer stacking</td>
<td>Output &#x003D; self_att_pool(Output1)</td>
<td>Multiple layers of these components are stacked sequentially to form the complete TransformerEncoder</td>
</tr>
</tbody>
</table>
</table-wrap>
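<p>As a concrete instance of the positional-encoding rows of Table 4, the sketch below builds the sinusoidal matrix PE in NumPy; the sequence length is arbitrary, and the dimension matches the embedding size of 64 described in Section 3.7.</p>

```python
import numpy as np

def positional_encoding(T, d):
    """Sinusoidal positional encoding as in Table 4:
    PE[pos, i] = sin(pos / 10000^(i/d)) for even i,
    PE[pos, i] = cos(pos / 10000^((i-1)/d)) for odd i."""
    pe = np.zeros((T, d))
    pos = np.arange(T)[:, None]
    i = np.arange(0, d, 2)[None, :]           # even indices 0, 2, ..., d-2
    angle = pos / np.power(10000.0, i / d)
    pe[:, 0::2] = np.sin(angle)               # even columns
    pe[:, 1::2] = np.cos(angle)               # odd columns share the angle
    return pe

PE = positional_encoding(T=50, d=64)          # added to the embeddings E_i
```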
</sec>
<sec id="s3_7">
<label>3.7</label>
<title>Hyper-Parameter Used</title>
<p>Hyperparameters directly impact the performance of a model [<xref ref-type="bibr" rid="ref-35">35</xref>]. The hyperparameters for the ECG signal processing model were chosen based on a combination of domain knowledge and experimentation for both classification tasks. For the binary-class system, a window size of 10 s was selected to capture sufficient ECG data. A batch size of 64 balances computational efficiency and convergence. A learning rate of 0.0001 ensures gradual convergence during training. Two convolutional layers capture essential features, and an embedding dimension of 64 provides sufficient representation capacity. Four encoder and four decoder layers handle sequence dependencies, with four attention heads capturing different aspects of the input. A feed-forward dimension of 256 balances model complexity, and a dropout rate of 0.25 regularizes the model. A log_interval of 1 indicates logging after each training step. These values aim to optimize model performance for ECG analysis.</p>
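<p>Collected into a single configuration object, the settings above might look as follows; the key names are illustrative, not taken from the study's released code.</p>

```python
# Hyperparameters reported in Section 3.7 for the binary-class pipeline.
config = {
    "window_size_s": 10,     # seconds of ECG per training window
    "batch_size": 64,
    "learning_rate": 1e-4,
    "num_conv_layers": 2,
    "embed_dim": 64,
    "num_encoder_layers": 4,
    "num_decoder_layers": 4,
    "num_heads": 4,
    "ffn_dim": 256,
    "dropout": 0.25,
    "log_interval": 1,       # log after every training step
}
# Multi-head attention requires the embedding to split evenly across heads
assert config["embed_dim"] % config["num_heads"] == 0
```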
</sec>
<sec id="s3_8">
<label>3.8</label>
<title>Ensembling</title>
<p>Ensemble methods for ECG signals combine the predictions of all the deep learning models used [<xref ref-type="bibr" rid="ref-36">36</xref>]. This work employs blending, which takes a weighted average of the best-performing outputs of the proposed algorithms to create a blender model. The resulting blended model is integrated directly into the proposed framework.</p>
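<p>A minimal sketch of the blending step, assuming hypothetical per-model probabilities and illustrative weights (the study does not report its exact blend weights):</p>

```python
import numpy as np

def blend(probs, weights):
    """Weighted average of per-model class probabilities (blending)."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()   # normalize so the blend stays a probability
    return np.tensordot(weights, np.asarray(probs), axes=1)

# Hypothetical arrhythmia probabilities from three base models for 4 beats
p_cnn  = np.array([0.9, 0.2, 0.6, 0.1])
p_tran = np.array([0.8, 0.3, 0.7, 0.2])
p_lstm = np.array([0.7, 0.1, 0.5, 0.3])

blended = blend([p_cnn, p_tran, p_lstm], weights=[0.5, 0.3, 0.2])
labels = (blended >= 0.5).astype(int)   # final binary decision per beat
```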
</sec>
<sec id="s3_9">
<label>3.9</label>
<title>Experimental Setup</title>
<p>The suggested model is implemented in Python using the NumPy, Matplotlib, and Seaborn libraries. The model was trained in a Kaggle cloud computing environment on a Tesla P100 GPU, since the dataset exceeds 1.5 GB. The raw ECG data were scaled into the 0&#x2013;1 range before training. The test data were analyzed using the WandB cloud service, a centralized interface for tracking and sharing metrics, predictions, and hyperparameters from our models in real time. Test results such as sensitivity, specificity, epochs, and accuracy are available through the WandB service.</p>
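<p>The 0&#x2013;1 scaling applied to the raw ECG can be sketched as a standard min-max transform; the sample values below are illustrative.</p>

```python
import numpy as np

def minmax_scale(sig):
    """Scale a raw ECG trace into the 0-1 range, as done before training."""
    sig = np.asarray(sig, dtype=float)
    lo, hi = sig.min(), sig.max()
    return (sig - lo) / (hi - lo)

# Illustrative raw amplitudes (mV); min maps to 0.0, max maps to 1.0
scaled = minmax_scale([-1.5, 0.0, 0.5, 2.5])
```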
</sec>
<sec id="s3_10">
<label>3.10</label>
<title>Evaluation</title>
<p>1. The Adam optimization technique [<xref ref-type="bibr" rid="ref-37">37</xref>] selects the optimum parameters during model search and training and evaluates the best dropout strategy and model performance. The AdamW optimizer improves on approaches such as momentum and root-mean-square propagation by combining their strengths for more efficient gradient descent. For the proposed model, AdamW is configured with a weight decay of 1e-6 and betas of (0.9, 0.98). The optimizer is paired with a StepLR scheduler with a decay factor of 0.95.</p>
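<p>To make the stated settings concrete, the sketch below writes out a single AdamW update and one StepLR decay in NumPy; it illustrates the mechanics (decoupled weight decay, bias-corrected moments) rather than reproducing the framework call used in the study.</p>

```python
import numpy as np

# Settings from the text: betas = (0.9, 0.98), weight decay = 1e-6,
# learning rate = 1e-4, StepLR gamma = 0.95
beta1, beta2, eps = 0.9, 0.98, 1e-8
weight_decay, lr, gamma = 1e-6, 1e-4, 0.95

def adamw_step(w, g, m, v, t, lr):
    m = beta1 * m + (1 - beta1) * g          # first moment (momentum)
    v = beta2 * v + (1 - beta2) * g * g      # second moment (RMS propagation)
    m_hat = m / (1 - beta1 ** t)             # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Decoupled weight decay: applied directly to w, not through the gradient
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v

w, m, v = np.array([0.5]), np.zeros(1), np.zeros(1)
w, m, v = adamw_step(w, g=np.array([0.1]), m=m, v=v, t=1, lr=lr)
lr *= gamma   # one StepLR decay step
```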
<p>2. The AutoRhythmAI model framework will execute a few operations during the experiment. The optimization strategy is based on hyperparameters, dropout, learning rate, and performance metrics.</p>
<p>3. The performance of this model is evaluated using the AUC, accuracy, recall, and precision metrics.</p>
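<p>The four metrics can be computed from scratch for a toy set of labels and scores; the values below are illustrative, and the study itself uses standard library implementations.</p>

```python
import numpy as np

def accuracy(y, p):
    return np.mean(y == (p >= 0.5))

def precision(y, p):
    pred = p >= 0.5
    return np.sum(y[pred] == 1) / max(pred.sum(), 1)

def recall(y, p):
    pred = p >= 0.5
    return np.sum(pred[y == 1]) / max((y == 1).sum(), 1)

def auc(y, p):
    """AUC as the probability a random positive scores above a random negative."""
    pos, neg = p[y == 1], p[y == 0]
    pairs = pos[:, None] - neg[None, :]
    return np.mean(pairs > 0) + 0.5 * np.mean(pairs == 0)

# Toy ground-truth labels and predicted arrhythmia probabilities
y = np.array([1, 0, 1, 0, 1])
p = np.array([0.9, 0.4, 0.8, 0.6, 0.3])
```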
</sec>
</sec>
<sec id="s4">
<label>4</label>
<title>Empirical Results and Discussion</title>
<p>Having described the experiment in full, we pose the following empirical questions:</p>
<p><bold>Question 1:</bold> What is the impact of separate preprocessing over arrhythmia detection in ECG signals?</p>
<p><bold>Question 2:</bold> Which deep learning method outperforms the benchmark compared with the baseline approach, enabling selection of the best base model for further testing?</p>
<p><bold>Question 3:</bold> Which category of features, raw-signal convoluted features or RRI convoluted features, has the best impact in the binary class?</p>
<p>We address Questions 1 through 3 in this section by presenting and analyzing our experimental results.</p>
<sec id="s4_1">
<label>4.1</label>
<title>Results</title>
<p><bold>Question 1:</bold> What is the impact of separate preprocessing over arrhythmia detection in ECG signals?</p>
<p><xref ref-type="table" rid="table-5">Table 5</xref> presents performance metrics such as accuracy, precision, recall, AUC score, and loss [<xref ref-type="bibr" rid="ref-38">38</xref>] for the different groups of preprocessing, evaluated after training the models. It demonstrates that the AutoRhythmAI framework automates model evaluation using Adam optimization. Notably, the VGGRes Model is the top performer, with the highest accuracy of 97.39% and strong precision, recall, and AUC scores, making it a robust choice for accurate diagnostics. The Inception model also achieves remarkable results, with an accuracy of 96.39% and high precision and recall. These findings underscore the effectiveness of deep learning architectures, especially the VGGRes Model, in ECG signal classification tasks. We plotted a boxplot chart to compare the models&#x2019; overall performance in accuracy, precision, and AUC for the binary-class distribution. The RRI-preprocessed signals work well on the proposed model, reaching an accuracy of 96.8%, as can be inferred from <xref ref-type="fig" rid="fig-5">Fig. 5</xref>; we conclude that the preprocessing techniques used for the different classification tasks in the framework achieved higher performance in detecting arrhythmias.</p>
<table-wrap id="table-5">
<label>Table 5</label>
<caption>
<title>Multi-class arrhythmias detection model performance metrics</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th>Models</th>
<th>Accuracy (%)</th>
<th>Precision (%)</th>
<th>Recall (%)</th>
<th>AUC score (%)</th>
<th>Loss (%)</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>ANN</td>
<td>95.50</td>
<td>81.27</td>
<td>17.44</td>
<td>73.60</td>
<td>14.20</td>
</tr>
<tr>
<td>Lenet-5</td>
<td>95.48</td>
<td>75.12</td>
<td>20.61</td>
<td>76.85</td>
<td>13.70</td>
</tr>
<tr>
<td>AlexNet</td>
<td>95.73</td>
<td>79.11</td>
<td>24.15</td>
<td>78.67</td>
<td>12.66</td>
</tr>
<tr>
<td>VGG-16</td>
<td>96.07</td>
<td>83.57</td>
<td>30.90</td>
<td>81.61</td>
<td>11.731</td>
</tr>
<tr>
<td>ResNet-50</td>
<td>95.96</td>
<td>78.66</td>
<td>28.04</td>
<td>81.07</td>
<td>11.97</td>
</tr>
<tr>
<td>Inception</td>
<td>96.39</td>
<td>88.18</td>
<td>35.54</td>
<td>86.54</td>
<td>10.48</td>
</tr>
<tr>
<td>LSTM</td>
<td>95.85</td>
<td>84.46</td>
<td>24.90</td>
<td>77.44</td>
<td>13.09</td>
</tr>
<tr>
<td>ResIncept model</td>
<td>96.40</td>
<td>78.61</td>
<td>28.04</td>
<td>81.07</td>
<td>11.97</td>
</tr>
<tr>
<td>VGGRes model</td>
<td>97.39</td>
<td>82.13</td>
<td>31.91</td>
<td>82.61</td>
<td>10.12</td>
</tr>
<tr>
<td>InceptVGG</td>
<td>96.86</td>
<td>89.18</td>
<td>38.45</td>
<td>88.12</td>
<td>10.88</td>
</tr>
<tr>
<td>LenetAlexLSTM</td>
<td>95.89</td>
<td>80.12</td>
<td>21.61</td>
<td>79.85</td>
<td>12.70</td>
</tr>
</tbody>
</table>
</table-wrap><fig id="fig-5">
<label>Figure 5</label>
<caption>
<title>Performance metrics of proposed novelty models for binary class distribution</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-5.tif"/>
</fig>
<p><bold>Question 2:</bold> Which deep learning method outperforms the benchmark compared with the baseline approach, allowing the best base model to be chosen for additional testing?</p>
<p>For multi-class:</p>
<p><xref ref-type="table" rid="table-6">Table 6</xref> shows that among the various machine learning approaches and ensemble methods evaluated, the proposed hybrid models, specifically the InceptVGG and VGGRes models, consistently exhibit superior AUC scores on both the training and validation datasets. Notably, the VGGRes model is the standout performer, achieving the highest AUC score of 0.89 on the testing dataset. This result underscores the effectiveness of the VGGRes model for ECG signal classification, highlighting its potential for accurate and reliable diagnostics. Importantly, VGGRes models adapt to various types of lead ECG signals, as visualized in <xref ref-type="fig" rid="fig-6">Figs. 6</xref>&#x2013;<xref ref-type="fig" rid="fig-8">8</xref>, which present the performance metrics graph, confusion matrix, and visualization of ECG plots generated by VGGRes for the 27 class labels in the data.</p>
<table-wrap id="table-6">
<label>Table 6</label>
<caption>
<title>Comparison of proposed ensemble model with previous models using arrhythmia dataset</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th>Method</th>
<th>Training AUC</th>
<th>Validation AUC</th>
<th>Testing AUC</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>Different DL approaches [<xref ref-type="bibr" rid="ref-39">39</xref>]</td>
<td>0.76</td>
<td>0.68</td>
<td>0.72</td>
</tr>
<tr>
<td>ATI-CNN [<xref ref-type="bibr" rid="ref-40">40</xref>]</td>
<td>0.67</td>
<td>0.70</td>
<td>0.72</td>
</tr>
<tr>
<td>MBSF-Net [<xref ref-type="bibr" rid="ref-41">41</xref>]</td>
<td>0.79</td>
<td>0.71</td>
<td>0.69</td>
</tr>
<tr>
<td>Proposed ResIncept model</td>
<td>0.81</td>
<td>0.74</td>
<td>0.87</td>
</tr>
<tr>
<td>Proposed VGGRes model</td>
<td>0.81</td>
<td>0.82</td>
<td>0.89</td>
</tr>
<tr>
<td>Proposed InceptVGG model</td>
<td>0.80</td>
<td>0.80</td>
<td>0.86</td>
</tr>
<tr>
<td>Proposed LenetAlexLSTM model</td>
<td>0.77</td>
<td>0.78</td>
<td>0.79</td>
</tr>
</tbody>
</table>
</table-wrap><fig id="fig-6">
<label>Figure 6</label>
<caption>
<title>Confusion matrix of VGGRes models</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-6.tif"/>
</fig><fig id="fig-7">
<label>Figure 7</label>
<caption>
<title>Training, testing and loss graph of VGGRes models</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-7.tif"/>
</fig><fig id="fig-8">
<label>Figure 8</label>
<caption>
<title>Plot of ECG signals predicted by VGGRes models</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_45975-fig-8.tif"/>
</fig>
<p>For the binary class (novelty algorithm):</p>
<p>The ability of transformer models to attend to different sequence elements with varying degrees of priority allows them to handle variable-length ECG sequences. The transformer-based hybrid combination allowed this framework to achieve its higher performance. <xref ref-type="table" rid="table-7">Table 7</xref> shows that the fold values of the convoluted transformer model achieve excellent sensitivity scores compared to traditional methods. This underscores the model&#x2019;s robustness, primarily due to innovative algorithmic features and carefully designed pre-processing stages.</p>
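<p>The k-fold evaluation behind Table 7 can be sketched as a plain index split; this is illustrative only, and the study's exact fold protocol (shuffling, stratification) is not reproduced here.</p>

```python
import numpy as np

def kfold_indices(n, k, seed=0):
    """Yield (train_idx, val_idx) pairs for k-fold cross-validation."""
    idx = np.random.default_rng(seed).permutation(n)   # shuffle once
    folds = np.array_split(idx, k)
    for i in range(k):
        val = folds[i]
        train = np.concatenate([f for j, f in enumerate(folds) if j != i])
        yield train, val

# 10-fold split over a toy set of 100 ECG segments
splits = list(kfold_indices(n=100, k=10))
```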
<table-wrap id="table-7">
<label>Table 7</label>
<caption>
<title>K-fold values of proposed binary class model with traditional methods</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th>Folds</th>
<th>CNN</th>
<th>ELM-RNN</th>
<th>LSTM-Autoencoders</th>
<th>Proposed model</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>Fold 2</td>
<td>81.22</td>
<td>82.11</td>
<td>80.44</td>
<td>86.286</td>
</tr>
<tr>
<td>Fold 4</td>
<td>83.59</td>
<td>85.03</td>
<td>84.12</td>
<td>90.286</td>
</tr>
<tr>
<td>Fold 6</td>
<td>85.78</td>
<td>86.89</td>
<td>85.32</td>
<td>90.721</td>
</tr>
<tr>
<td>Fold 8</td>
<td>87.91</td>
<td>88.67</td>
<td>87.29</td>
<td>93.276</td>
</tr>
<tr>
<td>Fold 10</td>
<td>89.45</td>
<td>90.12</td>
<td>89.93</td>
<td>96.60</td>
</tr>
</tbody>
</table>
</table-wrap>
<p><bold>Question 3:</bold> Which category of features, raw-signal convoluted features or RRI convoluted features, has the best impact in the binary class over single and multi-modal datasets?</p>
<p>The proposed novelty algorithm performs well on RRI-featured signals, as shown in <xref ref-type="table" rid="table-8">Table 8</xref>. To demonstrate the impact of this working model, we tested it on various datasets with both raw and RRI-detected signals.</p>
<table-wrap id="table-8">
<label>Table 8</label>
<caption>
<title>Comparison of &#x201C;Convoluted Transformer model&#x201D; with raw and RRI signals</title>
</caption>
<table frame="hsides">
<colgroup>
<col/>
<col/>
<col/>
<col/>
<col/>
<col/>
<col/>
<col/>
<col/>
<col/>
</colgroup>
<thead valign="top">
<tr>
<th rowspan="2">Dataset</th>
<th rowspan="2">Epochs</th>
<th align="center" colspan="2">Test accuracy</th>
<th align="center" colspan="2">Test sensitivity</th>
<th align="center" colspan="2">Test specificity</th>
<th align="center" colspan="2">Test loss</th>
</tr>
<tr>
<th>Raw signal</th>
<th>RRI<break/>signal</th>
<th>Raw signals</th>
<th>RRI signal</th>
<th>Raw signal</th>
<th>RRI signals</th>
<th>Raw signal</th>
<th>RRI signal</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>China physiological signal challenge 2018</td>
<td>100</td>
<td>84.57</td>
<td>88.5</td>
<td>87.8</td>
<td>88.9</td>
<td>71.9</td>
<td>71.9</td>
<td>0.151</td>
<td>0.189</td>
</tr>
<tr>
<td>PTB-XL electrocardiography database</td>
<td>100</td>
<td>82.3</td>
<td>83.2</td>
<td>85.5</td>
<td>86.0</td>
<td>77.1</td>
<td>80.5</td>
<td>0.154</td>
<td>0.164</td>
</tr>
<tr>
<td>Multi-model database</td>
<td>100</td>
<td>85</td>
<td>96.60</td>
<td>89.1</td>
<td>97</td>
<td>91.0</td>
<td>95.94</td>
<td>0.143</td>
<td>0.155</td>
</tr>
</tbody>
</table>
</table-wrap>
<p><xref ref-type="table" rid="table-8">Table 8</xref> illustrates the performance of the hybrid &#x201C;CNN &#x002B; Transformer model&#x201D; combination on the single datasets and on the multi-modal ECG signal dataset. All the datasets performed well on this working model. The best result is obtained on the multi-model dataset with the pipeline setup of &#x201C;filtering technique &#x002B; RRI-detected signals &#x002B; CNN features &#x002B; transformer time-series model,&#x201D; which reached an accuracy of 96.8% under the Auto-ML &#x0026; DL framework. Compared with raw signals, this is a difference of 11.8 percentage points, indicating that peak detection has the strongest impact on binary-class arrhythmia detection. The training loss of the model is 20%, and with the dropout rate set at 0.25 it performed efficiently compared with the other models.</p>
</sec>
<sec id="s4_2">
<label>4.2</label>
<title>Implications of Findings</title>
<p>Our findings offer crucial implications for the field of arrhythmia detection in ECG signals:</p>
<p>Enhanced Diagnosis Precision: Our meticulous approach to preprocessing, model selection, and feature engineering improves diagnostic accuracy, benefiting patient care through quicker, tailored treatment plans.</p>
<p>Streamlined Healthcare Delivery: Our guidance on selecting deep learning models streamlines the development of automated arrhythmia detection systems, reducing the workload on healthcare professionals, especially in resource-constrained settings.</p>
<p>Feature Engineering Insight: Emphasizing the importance of features, especially RRI convoluted features, enhances our understanding of critical elements in arrhythmia detection applicable to this field and other ECG-based diagnostic tasks.</p>
</sec>
<sec id="s4_3">
<label>4.3</label>
<title>Discussion</title>
<p>Our study has unveiled valuable insights into arrhythmia detection in ECG signals, shedding light on the advantages and considerations associated with the model and its findings. These insights could revolutionize the accuracy and efficiency of arrhythmia diagnosis in healthcare.</p>
<p>Enhanced Diagnostic Accuracy: One of the most significant advantages of our model is the substantial improvement in diagnostic accuracy achieved through meticulous preprocessing, model selection, and feature engineering. Notably, the VGGRes Model demonstrated a remarkable precision of 97.39% in specific configurations.</p>
<p>Model Selection Guidance: Our study provides invaluable guidance for selecting appropriate deep-learning models for ECG signal classification. The VGGRes Model, Inception model, and RRI transformer-based hybrid models emerged as top performers. This guidance simplifies the decision-making process for researchers and healthcare professionals, enabling them to choose models tailored to their specific diagnostic needs.</p>
<p>Despite its numerous advantages, our study does present certain disadvantages and considerations:
<list list-type="order">
<list-item>
<p>The potential variability in ECG signal data across different patient populations and healthcare settings could affect the generalizability of our findings.</p></list-item>
<list-item>
<p>Some of the deep learning models utilized, particularly those with complex architectures, may demand significant computational resources, which could be limiting in resource-constrained healthcare environments.</p></list-item>
<list-item>
<p>The ethical and regulatory aspects surrounding data privacy and security, especially when handling sensitive medical data, must be diligently addressed.</p></list-item>
</list></p>
</sec>
</sec>
<sec id="s5">
<label>5</label>
<title>Conclusion and Future Work</title>
<p>The AutoRhythmAI model framework conducted a hyperparameter optimization test to validate its novel approach. The framework demonstrated superior performance, showcasing its capability to automate ECG signal classification by combining binary and multi-class arrhythmia detection. Initially, we amalgamated diverse databases from various geographical locations, ensuring data balance through machine learning balancing algorithms. Subsequently, we subjected the data to two levels of preprocessing within the custom-designed auto-ML and deep learning techniques. We considered 27 groups for binary-class classification: 26 related to heart problems (including those merged with aberrant signals) and one associated with a healthy condition. All 27 groups underwent multi-label binarization. In feature extraction, we introduced the &#x201C;RRI-Convoluted Transformer model&#x201D; for binary ECG signal classification, demonstrating significantly higher accuracy. The ensemble model named the VGGRes Model for multi-class type also yielded the best accuracy. The obtained results validate the effectiveness and applicability of the proposed model, with comparisons and validation against recent literature. Despite its high computational complexity and limited interpretability, the model represents a pioneering step towards improving utility in the healthcare industry. Future studies will focus on refining this method. The research introduces the AutoRhythmAI framework, marking a groundbreaking advancement in arrhythmia detection. Its fusion of traditional and deep learning techniques, automated feature engineering, flexible pipeline, innovative algorithms, ensemble-based approach, real-time potential, and data integration highlights its potential impact on accurate and efficient diagnostics.</p>
<p>Designing the AutoRhythmAI model requires significant computational resources and can be computationally expensive; small businesses or organizations with limited computational resources may find this challenging. The model is also hard to interpret because of the complexity of the optimization process [<xref ref-type="bibr" rid="ref-42">42</xref>]. Including RRI features improves model performance significantly, but at the cost of a more than nine-fold increase in training and inference time. A main limitation of the model is the latency of the Pan-Tompkins approach used for R-peak identification, which is the reason for this time increase. In addition, classification in the suggested system depends on two pipelines, which slows the detection rate. In the future, the hybrid combination of different deep learning models and the various preprocessing steps could be constructed as a single pipeline to achieve faster computation and higher detection accuracy.</p>
</sec>
</body>
<back>
<ack>
<p>The authors thank all anonymous reviewers for their insightful comments and constructive suggestions, which helped improve the quality of this paper.</p>
</ack>
<sec><title>Funding Statement</title>
<p>The authors received no specific funding for this study.</p>
</sec>
<sec><title>Author Contributions</title>
<p>The authors confirm contribution to the paper as follows: study conception and design: S. Jayanthi, S. Prasanna Devi; data collection: S. Jayanthi; analysis and interpretation of results: S. Jayanthi; draft manuscript preparation: S. Jayanthi, S. Prasanna Devi. All authors reviewed the results and approved the final version of the manuscript.</p>
</sec>
<sec sec-type="data-availability"><title>Availability of Data and Materials</title>
<p>Not applicable.</p>
</sec>
<sec sec-type="COI-statement"><title>Conflicts of Interest</title>
<p>The authors declare that they have no conflicts of interest to report regarding the present study.</p>
</sec>
<ref-list content-type="authoryear">
<title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>C.</given-names> <surname>Leclercq</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Wearables, telemedicine, and artificial intelligence in arrhythmias and heart failure</article-title>,&#x201D; <source>Proc. Eur. Soc. Cardiol. Cardiovasc. Round Table, Europace</source>, vol. <volume>24</volume>, no. <issue>9</issue>, pp. <fpage>1372</fpage>&#x2013;<lpage>1383</lpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1093/europace/euac052</pub-id>; <pub-id pub-id-type="pmid">35640917</pub-id></mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>R.</given-names> <surname>Lalo</surname></string-name>, <string-name><given-names>I.</given-names> <surname>Zekja</surname></string-name>, and <string-name><given-names>F.</given-names> <surname>Kamberi</surname></string-name></person-group>, &#x201C;<article-title>Association of cardiovascular disease risk and health-related behaviors in stroke patients</article-title>,&#x201D; <source>Int. J. Environ. Res. Public Health</source>, vol. <volume>20</volume>, no. <issue>4</issue>, pp. <fpage>3693</fpage>, <year>2023</year>. doi: <pub-id pub-id-type="doi">10.1016/j.est.2021.103009</pub-id>.</mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S. K.</given-names> <surname>Saini</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Gupta</surname></string-name></person-group>, &#x201C;<article-title>Artificial intelligence methods for analysis of electrocardiogram signals for cardiac abnormalities: State-of-the-art and future challenges</article-title>,&#x201D; <source>Artif. Intell. Rev.</source>, vol. <volume>55</volume>, no. <issue>2</issue>, pp. <fpage>1519</fpage>&#x2013;<lpage>1565</lpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1016/j.est.2021.103009</pub-id>.</mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y. L.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>C. S.</given-names> <surname>Lin</surname></string-name>, <string-name><given-names>C. C.</given-names> <surname>Cheng</surname></string-name>, and <string-name><given-names>C.</given-names> <surname>Lin</surname></string-name></person-group>, &#x201C;<article-title>A deep learning algorithm for detecting acute pericarditis by electrocardiogram</article-title>,&#x201D; <source>J. Pers. Med.</source>, vol. <volume>12</volume>, no. <issue>7</issue>, pp. <fpage>1150</fpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.3390/jpm12071150</pub-id>; <pub-id pub-id-type="pmid">35887647</pub-id></mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Jayanthi</surname></string-name> and <string-name><given-names>S. P.</given-names> <surname>Devi</surname></string-name></person-group>, &#x201C;<article-title>Automated machine learning on high dimensional big data for prediction tasks</article-title>,&#x201D; in <conf-name>Proc. ICCES</conf-name>, <publisher-loc>Coimbatore, India</publisher-loc>, <year>2022</year>, pp. <fpage>995</fpage>&#x2013;<lpage>1001</lpage>.</mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K. M.</given-names> <surname>Aamir</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Automatic heart disease detection by classification of ventricular arrhythmias on ECG using machine learning</article-title>,&#x201D; <source>Comput. Mater. Continua</source>, vol. <volume>71</volume>, no. <issue>1</issue>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.32604/cmc.2022.018613</pub-id></mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>J. P.</given-names> <surname>Consuegra-Ayala</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Guti&#x00E9;rrez</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Almeida-Cruz</surname></string-name>, and <string-name><given-names>M.</given-names> <surname>Palomar</surname></string-name></person-group>, &#x201C;<article-title>Intelligent ensembling of auto-ML system outputs for solving classification problems</article-title>,&#x201D; <source>Inform. Sci.</source>, vol. <volume>609</volume>, pp. <fpage>766</fpage>&#x2013;<lpage>780</lpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1016/j.ins.2022.07.061</pub-id>.</mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T.</given-names> <surname>Siriborvornratanakul</surname></string-name></person-group>, &#x201C;<article-title>Human behavior in image-based road health inspection systems despite the emerging AutoML</article-title>,&#x201D; <source>J. Big Data</source>, vol. <volume>9</volume>, no. <issue>1</issue>, pp. <fpage>96</fpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1186/s40537-022-00646-8</pub-id>; <pub-id pub-id-type="pmid">35879937</pub-id></mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>X.</given-names> <surname>Wang</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>An ECG signal denoising method using conditional generative adversarial net</article-title>,&#x201D; <source>IEEE J. Biomed. Health Inform.</source>, vol. <volume>26</volume>, no. <issue>7</issue>, pp. <fpage>2929</fpage>&#x2013;<lpage>2940</lpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1109/JBHI.2022.3169325</pub-id>; <pub-id pub-id-type="pmid">35446775</pub-id></mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Feurer</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Eggensperger</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Falkner</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Lindauer</surname></string-name>, and <string-name><given-names>F.</given-names> <surname>Hutter</surname></string-name></person-group>, &#x201C;<article-title>Auto-sklearn 2.0: Hands-free AutoML via meta-learning</article-title>,&#x201D; <source>J. Mach. Learn. Res.</source>, vol. <volume>23</volume>, no. <issue>1</issue>, pp. <fpage>11936</fpage>&#x2013;<lpage>11996</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>H.</given-names> <surname>Jin</surname></string-name>, <string-name><given-names>Q.</given-names> <surname>Song</surname></string-name>, and <string-name><given-names>X.</given-names> <surname>Hu</surname></string-name></person-group>, &#x201C;<article-title>Auto-keras: An efficient neural architecture search system</article-title>,&#x201D; in <conf-name>Proc. 25th ACM SIGKDD Int. Conf. Knowl. Discov. Data Min.</conf-name>, <publisher-loc>New York, NY, USA</publisher-loc>, <year>2019</year>, pp. <fpage>1946</fpage>&#x2013;<lpage>1956</lpage>.</mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>J. D.</given-names> <surname>Romano</surname></string-name>, <string-name><given-names>T. T.</given-names> <surname>Le</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Fu</surname></string-name>, and <string-name><given-names>J. H.</given-names> <surname>Moore</surname></string-name></person-group>, &#x201C;<article-title>TPOT-NN: Augmenting tree-based automated machine learning with neural network estimators</article-title>,&#x201D; <source>Genet. Program. Evolvable Mach.</source>, vol. <volume>22</volume>, pp. <fpage>207</fpage>&#x2013;<lpage>227</lpage>, <year>2021</year>. doi: <pub-id pub-id-type="doi">10.1007/s10710-021-09401-z</pub-id>.</mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Naseem</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Javed</surname></string-name>, <string-name><given-names>M. J.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Rubab</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Khan</surname></string-name> and <string-name><given-names>Y.</given-names> <surname>Nam</surname></string-name></person-group>, &#x201C;<article-title>Integrated CWT-CNN for epilepsy detection using multiclass EEG dataset</article-title>,&#x201D; <source>Comput. Mater. Continua</source>, vol. <volume>69</volume>, no. <issue>1</issue>, pp. <fpage>471</fpage>&#x2013;<lpage>486</lpage>, <year>2021</year>. doi: <pub-id pub-id-type="doi">10.32604/cmc.2021.018239</pub-id>.</mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Chatterjee</surname></string-name>, <string-name><given-names>R. S.</given-names> <surname>Thakur</surname></string-name>, <string-name><given-names>R. N.</given-names> <surname>Yadav</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Gupta</surname></string-name>, and <string-name><given-names>D. K.</given-names> <surname>Raghuvanshi</surname></string-name></person-group>, &#x201C;<article-title>Review of noise removal techniques in ECG signals</article-title>,&#x201D; <source>IET Signal Process.</source>, vol. <volume>14</volume>, no. <issue>9</issue>, pp. <fpage>569</fpage>&#x2013;<lpage>590</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. U.</given-names> <surname>Ali</surname></string-name> <etal>et al.</etal></person-group>, &#x201C;<article-title>Correlation-filter-based channel and feature selection framework for hybrid EEG-fNIRS BCI applications</article-title>,&#x201D; <source>IEEE J. Biomed. Health</source>, <year>2023</year>. doi: <pub-id pub-id-type="doi">10.1109/JBHI.2023.3294586</pub-id>; <pub-id pub-id-type="pmid">37436864</pub-id></mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Loria-Romero</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Peregrina-Barreto</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Rangel-Magdaleno</surname></string-name>, and <string-name><given-names>H. D.</given-names> <surname>Rico-Aniles</surname></string-name></person-group>, &#x201C;<article-title>Ventricular fibrillation characterization for sudden cardiac death risk prediction based on wavelet analysis</article-title>,&#x201D; in <conf-name>2022 IEEE Int. Symp. Med. Meas. Appl. (MeMeA)</conf-name>, <publisher-loc>Messina, Italy</publisher-loc>, <publisher-name>IEEE</publisher-name>, <year>2022</year>, pp. <fpage>1</fpage>&#x2013;<lpage>6</lpage>. doi: <pub-id pub-id-type="doi">10.1109/MeMeA54994.2022.9856497</pub-id>.</mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>N.</given-names> <surname>Salari</surname></string-name> <etal>et al.</etal></person-group>, &#x201C;<article-title>Detection of sleep apnea using machine learning algorithms based on ECG signals: A comprehensive systematic review</article-title>,&#x201D; <source>Expert Syst. Appl.</source>, vol. <volume>187</volume>, pp. <fpage>115950</fpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1016/j.eswa.2021.115950</pub-id>.</mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Bibi</surname></string-name> <etal>et al.</etal></person-group>, &#x201C;<article-title>MSRNet: Multiclass skin lesion recognition using additional residual block based fine-tuned deep models information fusion and best feature selection</article-title>,&#x201D; <source>Diagnostics</source>, vol. <volume>13</volume>, no. <issue>19</issue>, pp. <fpage>3063</fpage>, <year>2023</year>; <pub-id pub-id-type="pmid">37835807</pub-id></mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Hussain</surname></string-name> <etal>et al.</etal></person-group>, &#x201C;<article-title>SkinNet-INIO: Multiclass skin lesion localization and classification using fusion-assisted deep neural networks and improved nature-inspired optimization algorithm</article-title>,&#x201D; <source>Diagnostics</source>, vol. <volume>13</volume>, no. <issue>18</issue>, pp. <fpage>2869</fpage>, <year>2023</year>; <pub-id pub-id-type="pmid">37761236</pub-id></mixed-citation></ref>
<ref id="ref-20"><label>[20]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>V.</given-names> <surname>Dillshad</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Nazir</surname></string-name>, <string-name><given-names>O.</given-names> <surname>Saidani</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Alturki</surname></string-name> and <string-name><given-names>S.</given-names> <surname>Kadry</surname></string-name></person-group>, &#x201C;<article-title>D2LFS2Net: Multi-class skin lesion diagnosis using deep learning and variance-controlled marine predator optimisation: An application for precision medicine</article-title>,&#x201D; <source>CAAI Trans. Intell. Technol.</source>, <year>2023</year>. doi: <pub-id pub-id-type="doi">10.1049/cit2.12267</pub-id>.</mixed-citation></ref>
<ref id="ref-21"><label>[21]</label><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Dey</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Pal</surname></string-name>, and <string-name><given-names>S.</given-names> <surname>Biswas</surname></string-name></person-group>, &#x201C;<chapter-title>Deep learning algorithms for efficient analysis of ECG signals to detect heart disorders</chapter-title>,&#x201D; in <source>Biomedical Engineering</source>, <publisher-loc>London, UK</publisher-loc>: <publisher-name>IntechOpen</publisher-name>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.5772/intechopen.103075</pub-id>.</mixed-citation></ref>
<ref id="ref-22"><label>[22]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>O. M. A.</given-names> <surname>Ali</surname></string-name>, <string-name><given-names>S. W.</given-names> <surname>Kareem</surname></string-name>, and <string-name><given-names>A. S.</given-names> <surname>Mohammed</surname></string-name></person-group>, &#x201C;<article-title>Evaluation of electrocardiogram signals classification using CNN, SVM, and LSTM algorithm: A review</article-title>,&#x201D; in <conf-name>2022 8th Int. Eng. Conf. Sustain. Technol. Development (IEC)</conf-name>, <publisher-loc>Erbil, Iraq</publisher-loc>, <year>2022</year>, pp. <fpage>185</fpage>&#x2013;<lpage>191</lpage>.</mixed-citation></ref>
<ref id="ref-23"><label>[23]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Alqahtani</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Alsubai</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Sha</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Alhaisoni</surname></string-name> and <string-name><given-names>S. R.</given-names> <surname>Naqvi</surname></string-name></person-group>, &#x201C;<article-title>Automated white blood cell disease recognition using lightweight deep learning</article-title>,&#x201D; <source>Comput. Syst. Sci. Eng.</source>, vol. <volume>46</volume>, no. <issue>1</issue>, pp. <fpage>107</fpage>&#x2013;<lpage>123</lpage>, <year>2023</year>.</mixed-citation></ref>
<ref id="ref-24"><label>[24]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>V.</given-names> <surname>Gupta</surname></string-name>, <string-name><given-names>N. K.</given-names> <surname>Saxena</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Kanungo</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Kumar</surname></string-name>, and <string-name><given-names>S.</given-names> <surname>Diwania</surname></string-name></person-group>, &#x201C;<article-title>PCA as an effective tool for the detection of R-peaks in an ECG signal processing</article-title>,&#x201D; <source>Int. J. Syst. Assur. Eng. Manage.</source>, vol. <volume>13</volume>, no. <issue>5</issue>, pp. <fpage>2391</fpage>&#x2013;<lpage>2403</lpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1007/s13198-022-01650-0</pub-id>.</mixed-citation></ref>
<ref id="ref-25"><label>[25]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. M.</given-names> <surname>Farag</surname></string-name></person-group>, &#x201C;<article-title>A matched filter-based convolutional neural network (CNN) for inter-patient ECG classification and arrhythmia detection at the edge</article-title>,&#x201D; <source>SSRN Electron. J.</source>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.2139/ssrn.4070665</pub-id>.</mixed-citation></ref>
<ref id="ref-26"><label>[26]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Irfan</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Anjum</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Althobaiti</surname></string-name>, <string-name><given-names>A. A.</given-names> <surname>Alotaibi</surname></string-name>, <string-name><given-names>A. B.</given-names> <surname>Siddiqui</surname></string-name> and <string-name><given-names>N.</given-names> <surname>Ramzan</surname></string-name></person-group>, &#x201C;<article-title>Heartbeat classification and arrhythmia detection using a multi-model deep-learning technique</article-title>,&#x201D; <source>Sens.</source>, vol. <volume>22</volume>, no. <issue>15</issue>, pp. <fpage>5606</fpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.3390/s22155606</pub-id>; <pub-id pub-id-type="pmid">35957162</pub-id></mixed-citation></ref>
<ref id="ref-27"><label>[27]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. S.</given-names> <surname>Islam</surname></string-name>, <string-name><given-names>M. N.</given-names> <surname>Islam</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Hashim</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Rashid</surname></string-name>, <string-name><given-names>B. S.</given-names> <surname>Bari</surname></string-name> <etal>et al.</etal></person-group>, &#x201C;<article-title>New hybrid deep learning approach using BiGRU-BiLSTM and multilayered dilated CNN to detect arrhythmia</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>10</volume>, pp. <fpage>58081</fpage>&#x2013;<lpage>58096</lpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1109/ACCESS.2022.3178710</pub-id>.</mixed-citation></ref>
<ref id="ref-28"><label>[28]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>H. H.</given-names> <surname>Rashidi</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Tran</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Albahra</surname></string-name>, and <string-name><given-names>L. T.</given-names> <surname>Dang</surname></string-name></person-group>, &#x201C;<article-title>Machine learning in health care and laboratory medicine: General overview of supervised learning and auto-ML</article-title>,&#x201D; <source>Int. J. Lab. Hematol.</source>, vol. <volume>43</volume>, pp. <fpage>15</fpage>&#x2013;<lpage>22</lpage>, <year>2021</year>. doi: <pub-id pub-id-type="doi">10.1111/ijlh.13537</pub-id>; <pub-id pub-id-type="pmid">34288435</pub-id></mixed-citation></ref>
<ref id="ref-29"><label>[29]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Bodini</surname></string-name>, <string-name><given-names>M. W.</given-names> <surname>Rivolta</surname></string-name>, and <string-name><given-names>R.</given-names> <surname>Sassi</surname></string-name></person-group>, &#x201C;<article-title>Classification of ECG signals with different lead systems using AutoML</article-title>,&#x201D; in <conf-name>2021 Computing in Cardiology (CinC)</conf-name>, <publisher-loc>Brno, Czech Republic</publisher-loc>, <year>2021</year>, pp. <fpage>1</fpage>&#x2013;<lpage>4</lpage>.</mixed-citation></ref>
<ref id="ref-30"><label>[30]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D.</given-names> <surname>Kumar</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Puthusserypady</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Dominguez</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Sharma</surname></string-name>, and <string-name><given-names>J. E.</given-names> <surname>Bardram</surname></string-name></person-group>, &#x201C;<article-title>CACHET-CADB: A contextualized ambulatory electrocardiography arrhythmia dataset</article-title>,&#x201D; <source>Front. Cardiovasc. Med.</source>, vol. <volume>9</volume>, pp. <fpage>893090</fpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.3389/fcvm.2022.893090</pub-id>; <pub-id pub-id-type="pmid">35845039</pub-id></mixed-citation></ref>
<ref id="ref-31"><label>[31]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Xia</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>An automatic cardiac arrhythmia classification system with wearable electrocardiogram</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>6</volume>, pp. <fpage>16529</fpage>&#x2013;<lpage>16538</lpage>, <year>2018</year>. doi: <pub-id pub-id-type="doi">10.1109/ACCESS.2018.2807700</pub-id>.</mixed-citation></ref>
<ref id="ref-32"><label>[32]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D. K.</given-names> <surname>Atal</surname></string-name> and <string-name><given-names>M.</given-names> <surname>Singh</surname></string-name></person-group>, &#x201C;<article-title>Arrhythmia classification with ECG signals based on the optimization-enabled deep convolutional neural network</article-title>,&#x201D; <source>Comput. Meth. Prog. Bio.</source>, vol. <volume>196</volume>, pp. <fpage>105607</fpage>, <year>2020</year>. doi: <pub-id pub-id-type="doi">10.1016/j.cmpb.2020.105607</pub-id>.</mixed-citation></ref>
<ref id="ref-33"><label>[33]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>J.</given-names> <surname>Hua</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Chu</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Zou</surname></string-name>, and <string-name><given-names>J.</given-names> <surname>Jia</surname></string-name></person-group>, &#x201C;<article-title>ECG signal classification in wearable devices based on compressed domain</article-title>,&#x201D; <source>PLoS One</source>, vol. <volume>18</volume>, no. <issue>4</issue>, pp. <fpage>e0284008</fpage>, <year>2023</year>. doi: <pub-id pub-id-type="doi">10.1371/journal.pone.0284008</pub-id>; <pub-id pub-id-type="pmid">37014879</pub-id></mixed-citation></ref>
<ref id="ref-34"><label>[34]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. R.</given-names> <surname>Rajeshwari</surname></string-name> and <string-name><given-names>K. S.</given-names> <surname>Kavitha</surname></string-name></person-group>, &#x201C;<article-title>Arrhythmia ventricular fibrillation classification on ECG signal using ensemble feature selection and deep neural network</article-title>,&#x201D; <source>Cluster Comput.</source>, vol. <volume>25</volume>, no. <issue>5</issue>, pp. <fpage>3085</fpage>&#x2013;<lpage>3102</lpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1007/s10586-022-03547-w</pub-id>.</mixed-citation></ref>
<ref id="ref-35"><label>[35]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Hammad</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Meshoul</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Dziwi&#x0144;ski</surname></string-name>, <string-name><given-names>P.</given-names> <surname>P&#x0142;awiak</surname></string-name>, and <string-name><given-names>I. A.</given-names> <surname>Elgendy</surname></string-name></person-group>, &#x201C;<article-title>Efficient lightweight multimodel deep fusion based on ECG for arrhythmia classification</article-title>,&#x201D; <source>Sens.</source>, vol. <volume>22</volume>, no. <issue>23</issue>, pp. <fpage>9347</fpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.3390/s22239347</pub-id>; <pub-id pub-id-type="pmid">36502049</pub-id></mixed-citation></ref>
<ref id="ref-36"><label>[36]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Jayanthi</surname></string-name> and <string-name><given-names>S. P.</given-names> <surname>Devi</surname></string-name></person-group>, &#x201C;<article-title>Ensemble of deep learning models for classification of heart beats arrhythmias detection</article-title>,&#x201D; <source>J. Theor. Appl. Inform. Technol.</source>, vol. <volume>101</volume>, no. <issue>8</issue>, <year>2023</year>.</mixed-citation></ref>
<ref id="ref-37"><label>[37]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>I. H.</given-names> <surname>Tsai</surname></string-name> and <string-name><given-names>B. I.</given-names> <surname>Morshed</surname></string-name></person-group>, &#x201C;<article-title>Scalable and upgradable AI for detected beat-by-beat ECG signals in smart health</article-title>,&#x201D; in <conf-name>Proc. AIIoT</conf-name>, <publisher-loc>Seattle, WA, USA</publisher-loc>, <year>2023</year>, pp. <fpage>0409</fpage>&#x2013;<lpage>0414</lpage>. doi: <pub-id pub-id-type="doi">10.1109/AIIoT58121.2023.10174482</pub-id>.</mixed-citation></ref>
<ref id="ref-38"><label>[38]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>N.</given-names> <surname>Salankar</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Koundal</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Chakraborty</surname></string-name>, and <string-name><given-names>L.</given-names> <surname>Garg</surname></string-name></person-group>, &#x201C;<article-title>Automated attention deficit classification system from multimodal physiological signals</article-title>,&#x201D; <source>Multimed. Tools Appl.</source>, vol. <volume>82</volume>, no. <issue>4</issue>, pp. <fpage>4897</fpage>&#x2013;<lpage>4912</lpage>, <year>2023</year>. doi: <pub-id pub-id-type="doi">10.1007/s11042-022-12170-1</pub-id>.</mixed-citation></ref>
<ref id="ref-39"><label>[39]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A. K.</given-names> <surname>Singh</surname></string-name> and <string-name><given-names>S.</given-names> <surname>Krishnan</surname></string-name></person-group>, &#x201C;<article-title>ECG signal feature extraction trends in methods and applications</article-title>,&#x201D; <source>Biomed. Eng. Online</source>, vol. <volume>22</volume>, no. <issue>1</issue>, pp. <fpage>1</fpage>&#x2013;<lpage>36</lpage>, <year>2023</year>. doi: <pub-id pub-id-type="doi">10.1186/s12938-023-01075-1</pub-id>; <pub-id pub-id-type="pmid">36890566</pub-id></mixed-citation></ref>
<ref id="ref-40"><label>[40]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Dong</surname></string-name> <etal>et al.</etal></person-group>, &#x201C;<article-title>Detection of arrhythmia in 12-lead varied-length ECG using multi-branch signal fusion network</article-title>,&#x201D; <source>Physiol. Meas.</source>, vol. <volume>43</volume>, no. <issue>10</issue>, pp. <fpage>105009</fpage>, <year>2022</year>. doi: <pub-id pub-id-type="doi">10.1088/1361-6579/ac7938</pub-id>; <pub-id pub-id-type="pmid">35705072</pub-id></mixed-citation></ref>
<ref id="ref-41"><label>[41]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Q.</given-names> <surname>Yao</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Fan</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Liu</surname></string-name>, and <string-name><given-names>Y.</given-names> <surname>Li</surname></string-name></person-group>, &#x201C;<article-title>Multi-class arrhythmia detection from 12-lead varied-length ECG using attention-based time-incremental convolutional neural network</article-title>,&#x201D; <source>Inform. Fusion</source>, vol. <volume>53</volume>, pp. <fpage>174</fpage>&#x2013;<lpage>182</lpage>, <year>2020</year>. doi: <pub-id pub-id-type="doi">10.1016/j.inffus.2019.06.024</pub-id>.</mixed-citation></ref>
<ref id="ref-42"><label>[42]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Jayanthi</surname></string-name> and <string-name><given-names>S. P.</given-names> <surname>Devi</surname></string-name></person-group>, &#x201C;<article-title>Automated ECG arrhythmia classification using ResNet and AutoML learning model</article-title>,&#x201D; in <conf-name>Proc. ICSTCEE</conf-name>, <publisher-loc>Bengaluru, India</publisher-loc>, <year>2022</year>, pp. <fpage>1</fpage>&#x2013;<lpage>6</lpage>.</mixed-citation></ref>
</ref-list>
</back></article>