<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">CMC</journal-id>
<journal-id journal-id-type="nlm-ta">CMC</journal-id>
<journal-id journal-id-type="publisher-id">CMC</journal-id>
<journal-title-group>
<journal-title>Computers, Materials &#x0026; Continua</journal-title>
</journal-title-group>
<issn pub-type="epub">1546-2226</issn>
<issn pub-type="ppub">1546-2218</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">19706</article-id>
<article-id pub-id-type="doi">10.32604/cmc.2022.019706</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>IoMT-Enabled Fusion-Based Model to Predict Posture for Smart Healthcare Systems</article-title>
<alt-title alt-title-type="left-running-head">IoMT-Enabled Fusion-Based Model to Predict Posture for Smart Healthcare Systems</alt-title>
<alt-title alt-title-type="right-running-head">IoMT-Enabled Fusion-Based Model to Predict Posture for Smart Healthcare Systems</alt-title>
</title-group>
<contrib-group content-type="authors">
<contrib id="author-1" contrib-type="author" corresp="yes">
<name name-style="western">
<surname>Ghazal</surname>
<given-names>Taher M.</given-names>
</name>
<xref ref-type="aff" rid="aff-1">1</xref>
<xref ref-type="aff" rid="aff-2">2</xref><email>ghazal1000@gmail.com</email>
</contrib>
<contrib id="author-2" contrib-type="author">
<name name-style="western">
<surname>Hasan</surname>
<given-names>Mohammad Kamrul</given-names>
</name>
<xref ref-type="aff" rid="aff-1">1</xref>
</contrib>
<contrib id="author-3" contrib-type="author">
<name name-style="western">
<surname>Abdullah</surname>
<given-names>Siti Norul Huda</given-names>
</name>
<xref ref-type="aff" rid="aff-1">1</xref>
</contrib>
<contrib id="author-4" contrib-type="author">
<name name-style="western">
<surname>Abubakkar</surname>
<given-names>Khairul Azmi</given-names>
</name>
<xref ref-type="aff" rid="aff-1">1</xref>
</contrib>
<contrib id="author-5" contrib-type="author">
<name name-style="western">
<surname>Afifi</surname>
<given-names>Mohammed A. M.</given-names>
</name>
<xref ref-type="aff" rid="aff-2">2</xref>
</contrib>
<aff id="aff-1"><label>1</label><institution>Faculty of Information Science and Technology, Universiti Kebangsaan Malaysia, UKM</institution>, <addr-line>43600, Selangor</addr-line>, <country>Malaysia</country></aff>
<aff id="aff-2"><label>2</label><institution>Skyline University College</institution>, <addr-line>Sharjah</addr-line>, <country>United Arab Emirates</country></aff>
</contrib-group>
<author-notes>
<corresp id="cor1">&#x002A;Corresponding Author: Taher M. Ghazal. Email: <email>ghazal1000@gmail.com</email></corresp>
</author-notes>
<pub-date pub-type="epub" date-type="pub" iso-8601-date="2021-11-29">
<day>29</day>
<month>11</month>
<year>2021</year>
</pub-date>
<volume>71</volume>
<issue>2</issue>
<fpage>2579</fpage>
<lpage>2597</lpage>
<history>
<date date-type="received">
<day>22</day>
<month>4</month>
<year>2021</year>
</date>
<date date-type="accepted">
<day>26</day>
<month>8</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2022 Ghazal et al.</copyright-statement>
<copyright-year>2022</copyright-year>
<copyright-holder>Ghazal et al.</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_CMC_19706.pdf"></self-uri>
<abstract>
<p>Smart healthcare applications depend on data from wearable sensors (WSs) mounted on a patient&#x2019;s body for frequent monitoring. Healthcare systems rely on multi-level data for detecting illnesses and, consequently, for delivering the correct diagnostic measures. Collecting WS data and integrating that data for diagnostic purposes is a difficult task. This paper proposes an Errorless Data Fusion (EDF) approach to increase posture recognition accuracy; the research is based on a case study in a health organization. With the rise of smart healthcare systems, WS data fusion requires careful attention if the analysis of a recognized illness is to be reliable. The fusion therefore depends on the WS inputs and analyzes them together, at a common rate, to improve diagnostic efficiency. Errors arise from sensor breakdowns, fixed timing constraints, aggregation, and the analysis itself, and they lead to rejected or incorrect suggestions. This paper resolves the problem with EDF, which supports patient situation discovery through healthcare surveillance systems. Features of the WS data are examined extensively using active and iterative learning to identify errors in specific postures. The technique improves posture detection accuracy, analysis duration, and error rate, regardless of user movements. Wearable devices play a critical role in the management and treatment of patients and can ensure that patients receive treatment tailored to their medical needs. This paper discusses the EDF technique for optimizing posture identification accuracy through multi-feature analysis: first, the patients&#x2019; walking patterns are tracked at various time intervals, and the extracted characteristics are then evaluated against the stored data using a random forest classifier.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>Data fusion (DF)</kwd>
<kwd>posture recognition</kwd>
<kwd>healthcare systems (HCS)</kwd>
<kwd>wearable sensor (WS)</kwd>
<kwd>medical data</kwd>
<kwd>errorless data fusion (EDF)</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<label>1</label>
<title>Introduction</title>
<p>A Wearable Sensor (WS) is utilized in the healthcare system to provide technical help as well as remote patient monitoring. The sensor is attached to the user&#x2019;s body and detects their motions at various time intervals. The collected information is forwarded to a healthcare facility, which suggests treatments for the remote user [<xref ref-type="bibr" rid="ref-1">1</xref>]. On the heterogeneous platform, the data is shared and compared against a predefined dataset, which consists of a predefined set of medical data and patient information [<xref ref-type="bibr" rid="ref-2">2</xref>]. This record indicates the patient&#x2019;s history of communication with healthcare providers and of therapy. Medical data is therefore sensitive and private and must be secured against unauthorized access [<xref ref-type="bibr" rid="ref-3">3</xref>], which could otherwise pose a serious threat to the patient&#x2019;s health. The WS monitors bodily function-related concerns at predetermined intervals and stores the data [<xref ref-type="bibr" rid="ref-4">4</xref>]. Errors and latency may occur when storing the data, and these problems must be addressed at the outset. In healthcare, WS data is an important element of safe data transfer [<xref ref-type="bibr" rid="ref-5">5</xref>].</p>
<p>The Microelectromechanical System (MEMS) is used in data analysis to improve the quality of medical data and to help caregivers acquire the relevant data for providing appropriate treatment to the patient [<xref ref-type="bibr" rid="ref-6">6</xref>]. Data prediction is used to assess serial medical data, which is then compared against the current data [<xref ref-type="bibr" rid="ref-7">7</xref>]; producing a forecast improves accuracy and enhances the security of the medical data. Data analysis (DA) is accomplished through the development of sensor-enhanced health information systems for decision-making [<xref ref-type="bibr" rid="ref-8">8</xref>]. The data is evaluated to determine whether it is relevant, and this evaluation is carried out on time: data collection and comparison must be performed within a set period, and a delay produces an error [<xref ref-type="bibr" rid="ref-5">5</xref>]. The data is received from the WS so that the task is completed within the specified period, which provides safe DA and transmits the outcomes to the HCS for evaluation [<xref ref-type="bibr" rid="ref-9">9</xref>].</p>
<p>Data fusion (DF) is performed on medical data collected through extraction and classification techniques. The retrieved data is categorized in order to decrease mistakes and improve the medical system&#x2019;s accuracy [<xref ref-type="bibr" rid="ref-10">10</xref>]. Data from numerous sensors are combined to achieve WS fusion; sensor fusion is the process of merging the integrated data into a less ambiguous whole. DF can take three forms: low level, feature level, and decision level [<xref ref-type="bibr" rid="ref-11">11</xref>]. Low-level fusion merges the information of two sensors, while feature-level fusion extracts features from the medical data [<xref ref-type="bibr" rid="ref-12">12</xref>]. Finally, decision-level DF uses the current medical data to reach an appropriate judgment. Three fusion models are utilized for DA: reactive, proactive, and interactive [<xref ref-type="bibr" rid="ref-13">13</xref>].</p>
<p>In Section 2, the relevant research conducted to date is discussed in order to provide an overview of the current scenario. Section 3 illustrates how the Errorless Data Fusion (EDF) approach is achieved: the collected data is sent for feature extraction, the features are classified, and finally the random forest algorithm is used to obtain optimal data fusion. In Section 4, a comparative study of the suggested EDF approach&#x2019;s performance is given, covering the metrics of identification accuracy, fusion error, and detection time. The objective of this work is to use Random Forest (RF) Machine Learning (ML) to enhance the precision of medical data by 20%. In this work, DF is accomplished by categorizing features within an assigned time period, and forecasting is conducted through the use of an ML method.</p>
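<p>To make the classification step concrete, the following minimal sketch stands in for the RF classifier with an ensemble of bootstrap-trained decision stumps combined by majority vote; the posture windows, the two features (mean acceleration and step rate), and the labels are invented for illustration and are not the paper&#x2019;s dataset or model.</p>

```python
import random

# Minimal stand-in for the random-forest step: decision stumps trained on
# bootstrap resamples and combined by majority vote. All data below are
# invented for illustration.

def train_stump(data):
    """Pick the (feature, threshold, sign) split that best fits the labels."""
    best = None  # (n_correct, feature, threshold, sign)
    n_features = len(data[0][0])
    for f in range(n_features):
        for t in sorted(set(x[f] for x, _ in data)):
            agree = sum((x[f] > t) == bool(y) for x, y in data)
            for n_correct, sign in ((agree, True), (len(data) - agree, False)):
                if best is None or n_correct > best[0]:
                    best = (n_correct, f, t, sign)
    _, f, t, sign = best
    return lambda x: int((x[f] > t) == sign)

def train_forest(data, n_trees=25, seed=1):
    """Bagging: each stump sees a bootstrap resample of the training data."""
    rng = random.Random(seed)
    return [train_stump([rng.choice(data) for _ in data])
            for _ in range(n_trees)]

def predict(forest, x):
    """Majority vote across the ensemble."""
    return int(sum(tree(x) for tree in forest) * 2 >= len(forest))

# Hypothetical windows of (mean acceleration, step rate) with posture
# labels 0 = sitting and 1 = walking.
data = [((0.1, 0.2), 0), ((0.2, 0.1), 0), ((0.3, 0.3), 0),
        ((0.9, 1.1), 1), ((1.1, 0.9), 1), ((1.0, 1.2), 1)]
forest = train_forest(data)
```

<p>A full RF would grow deeper trees and subsample features at each split, but the bagging-and-voting structure is the same.</p>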
</sec>
<sec id="s2">
<label>2</label>
<title>Literature Review</title>
<p>Past research studies provide insight into the application of technology in healthcare institutions. These studies have produced findings that indicate the effectiveness of using the IoT to connect various health devices and systems [<xref ref-type="bibr" rid="ref-14">14</xref>&#x2013;<xref ref-type="bibr" rid="ref-19">19</xref>]. Wang et al. [<xref ref-type="bibr" rid="ref-20">20</xref>] proposed a hybrid sensory system that monitors patients&#x2019; routine activities and identifies walking patterns; human activity recognition is applied to the DF data collected from the WS. The features are categorized using a data-based Feature Selection (FS) approach and a Support Vector Machine (SVM). A Long Short-Term Memory and Convolutional Neural Network (LSTM-CNN) [<xref ref-type="bibr" rid="ref-15">15</xref>] fusion system is presented by Gao et al. [<xref ref-type="bibr" rid="ref-21">21</xref>] to identify unusual posture, utilizing a wearable Inertial Measurement Unit (IMU); the study uses leg Euler-angle data to calculate classification precision.</p>
<p>The rapid development of technology has led to significant changes in the healthcare system. Currently, healthcare organizations depend on advanced technology to deliver healthcare products to patients, and the IoT has played a critical role in this change. The emergence of wearable sensors has led to significant improvements in patient care and treatment, enabling medical practitioners to monitor patients in the hospital or remotely. Various wearable devices are designed to meet the specific and unique needs of patients. The IoT connects these wearable devices to the patients and thus helps gather important data and information that can be used to make proper healthcare decisions and facilitate efficient treatment.</p>
<p>The concept of smart healthcare was introduced to support the efficient delivery of care. Smart healthcare is a smart infrastructure that typically uses WSs to detect and perceive information from the patient. The gathered information is transmitted through the IoT and processed using cloud computing and supercomputers. Additionally, smart healthcare can coordinate the integrated social system so as to support the dynamic management of human services. It normally uses technology that includes wearable devices, the IoT, and the mobile internet to obtain data and to connect the individuals, institutions, and materials associated with healthcare. The information gathered is actively managed and responds to the needs of the medical ecosystem in an intelligent manner.</p>
<p>Smart healthcare is made up of patients, physicians, research institutes, and hospitals. It is viewed as an organic whole covering aspects such as illness control and monitoring, diagnosis, treatment, health decision-making, hospital administration, and medical research. The IoT, cloud computing, the mobile internet, 5G, big data, Artificial Intelligence (AI), and contemporary biotechnology are all examples of current Information Technology (IT) and are key components of smart healthcare [<xref ref-type="bibr" rid="ref-14">14</xref>]. These technologies can be used to monitor a patient&#x2019;s health <italic>via</italic> wearable devices. Wearable devices can also be used by the patient to obtain medical assistance and services from one&#x0027;s own home. They also allow clinicians to handle medical data and information using an integrated information infrastructure that consists of a Laboratory Information Management System (LIMS), Electronic Medical Records (EMRs), an image archiving system, and other technologies. The adoption of Surgical Robots (SRs) and mixed reality techniques allows for more precise surgery, and the usage of medical platforms can give patients a better experience. Big data may also be used to examine a given issue. The use of the IoT and new technologies may decrease the risks and expenses of medical procedures and processes.</p>
<p>The application of technology such as AI, SRs, and mixed reality has made the treatment and diagnosis of illnesses more efficient and intelligent. In most cases, the accuracy of AI diagnosis findings surpasses that of human doctors, and ML-based systems are regarded as more accurate than expert medical practitioners. In recent years, wearable IoT devices have been increasingly adopted in various application fields. These devices are embedded in or worn on the human body, and the architecture of the wearable IoT network enables it to store important health information that facilitates the treatment and management of patients. The integration of several types of sensors into wearable IoT devices has significantly improved their functionality. Smartwatches, for example, may be used not just for localization, entertainment, social networking, and payment, but also for health and routine task tracking.</p>
<p>The software and hardware devices used in the healthcare system to support the IoT are exposed to certain challenges that can compromise their effectiveness, and access by unauthorized individuals is one of the most significant. Smart healthcare services, like any other internet-connected devices, are vulnerable to hackers [<xref ref-type="bibr" rid="ref-16">16</xref>], and many wearable devices in the healthcare industry are subject to security threats and vulnerabilities. Intruders may gain access to the IoT network that connects numerous medical devices, where they can read and change the data and information recorded on the devices. This may endanger the patients&#x2019; ability to get effective treatment. Furthermore, patient data may be compromised as a result of illegal access to the health organization&#x2019;s computer network, giving an attacker access to the patient&#x2019;s private and sensitive data. As a result, it is critical that cybersecurity measures are implemented to safeguard wearable devices from third-party attacks.</p>
<p>Zahra et al. [<xref ref-type="bibr" rid="ref-22">22</xref>] proposed multimodal sensor fusion (SF) to detect actions in assembly production; information from a wearable IMU and electromyography (EMG) sensors is used to optimize the training of a CNN, and efficiency depends on the mixing as well as the forecasting of the SF data. Al-Amin et al. [<xref ref-type="bibr" rid="ref-23">23</xref>] developed a foot-mounted inertial-sensor fusion method for detecting the posture of older people; the raw data is processed using a hidden Markov model and a Neural Network (NN), which detect six categories of positions, and the classifier is trained on postures obtained from rule-based recognition with optical motion capture. Wang et al. [<xref ref-type="bibr" rid="ref-24">24</xref>] introduced deterministic learning based on DF that addresses various walking perspectives to detect human posture; the spatio-temporal characteristics of the posture motion are extracted using a Radial Basis Function (RBF) network, and a deep Convolutional and Recurrent Neural Network (CRNN) is created for the identification of human posture.</p>
<p>Posture examination can provide insight into how to improve patient care and ensure enhanced quality of care. Fendri et al. [<xref ref-type="bibr" rid="ref-25">25</xref>] describe the categorization of posture characteristics associated with the knee, based on posture analysis using deterministic learning; the study considers patients whose knees have osteoarthritis (OA), which is also common in asymptomatic (AS) subjects. The RBFNN analyzes postural patterns and increases accuracy by separating them. Nweke et al. [<xref ref-type="bibr" rid="ref-26">26</xref>] used a two-branch CNN (TCNN) for posture feature extraction, classification, and identification; Multi-Frequency Posture Energy Images (MF-GEIs) are utilized to train the posture inputs, and the posture energy image is used to identify the posture.</p>
<p>Abbas et al. [<xref ref-type="bibr" rid="ref-27">27</xref>] offer a method for posture recognition that utilizes a covariate factor for appropriate behavioral biometric characteristics. Semantic information is utilized to improve posture-based identification accuracy; in this procedure, a dynamic selection is performed for each person, and the human components are selected to obtain the semantic information.</p>
<p>To evaluate human behavior and decrease the rate of misrecognition, Tran et al. [<xref ref-type="bibr" rid="ref-28">28</xref>] suggested human action recognition based on multi-sensor fusion. This approach to multi-sensor DF introduces a Multi-View Ensemble Algorithm (MVEA); the feature vector is generated using Logistic Regression (LR) and the K-Nearest Neighbor (K-NN) technique, and class imbalance is reduced by using Synthetic Oversampling Minority Techniques (SOMT). Hanif et al. [<xref ref-type="bibr" rid="ref-29">29</xref>] presented a data augmentation method for posture identification that uses a deep neural network on inertial-sensor data; two methods, Arbitrary Time Deformation (ATD) and Stochastic Magnitude Perturbation (SMP), are applied for efficient training, and general postures are recognized by a CNN.</p>
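<p>The oversampling step mentioned above can be sketched as a generic minority-interpolation scheme in the spirit of SOMT: each synthetic sample is placed between a minority point and one of its nearest minority neighbours. The minority points, the neighbour count k, and the seed below are chosen purely for illustration.</p>

```python
import random

# Hedged sketch of synthetic minority oversampling: interpolate new
# minority points between existing ones. Data and parameters are
# illustrative assumptions, not the cited method's settings.

def oversample(minority, n_new, k=2, seed=0):
    """Create n_new synthetic points between minority samples and
    their k nearest minority neighbours."""
    rng = random.Random(seed)
    synthetic = []
    for _ in range(n_new):
        a = rng.choice(minority)
        # k nearest minority neighbours of a (excluding a itself)
        neighbours = sorted((p for p in minority if p is not a),
                            key=lambda p: sum((u - v) ** 2
                                              for u, v in zip(a, p)))[:k]
        b = rng.choice(neighbours)
        gap = rng.random()  # interpolation factor in [0, 1)
        synthetic.append(tuple(u + gap * (v - u) for u, v in zip(a, b)))
    return synthetic

minority = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
new_points = oversample(minority, n_new=4)
```

<p>Because each new point lies on a segment between two minority samples, the synthetic data stays inside the minority region rather than duplicating existing rows.</p>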
<p>Dawar et al. [<xref ref-type="bibr" rid="ref-30">30</xref>] use deep-learning-based fusion to incorporate depth and inertial sensing for action identification; a camera recognizes the movement or gesture and captures depth pictures of postures from various angles, and the posture is identified <italic>via</italic> decision-level fusion. Zou et al. [<xref ref-type="bibr" rid="ref-31">31</xref>] describe a system that integrates inertial and RGB-D sensors for robust posture identification; the posture data include the color and depth eigenpostures together with accelerometer readings in eigenspace, and the supervised classifier uses a 3D dense trajectory to obtain greater identification accuracy.</p>
<p>Fan et al. [<xref ref-type="bibr" rid="ref-32">32</xref>] use the IoT to study posture detection and data analysis. By applying the Fast Fourier Transform (FFT), this work aims to reduce the error in the frequency domain by 10%; human motion is evaluated by identifying posture in order to improve activity recognition.</p>
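<p>A minimal sketch of this frequency-domain step follows, assuming a one-second window, a 32 Hz sampling rate, and a synthetic 2 Hz walking signal; none of these values are specified in the cited work.</p>

```python
import cmath
import math

# Naive discrete Fourier transform (O(n^2), adequate for a sketch) used to
# pick out the dominant motion frequency of a synthetic accelerometer trace.

def dft(signal):
    n = len(signal)
    return [sum(signal[t] * cmath.exp(-2j * math.pi * f * t / n)
                for t in range(n))
            for f in range(n)]

fs = 32       # assumed sampling rate in Hz
walk_hz = 2   # assumed dominant step frequency
signal = [math.sin(2 * math.pi * walk_hz * i / fs) for i in range(fs)]

spectrum = dft(signal)
# Strongest positive-frequency bin; the bin index equals the frequency in Hz
# here because the window is exactly one second long.
dominant_hz = max(range(1, fs // 2), key=lambda f: abs(spectrum[f]))
```

<p>In practice an FFT library routine would replace the quadratic loop, but the dominant-bin logic is unchanged.</p>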
<p>Islam et al. [<xref ref-type="bibr" rid="ref-33">33</xref>] present data analytics for detecting bodily posture fatigue with the use of a WS. Fatigue is recognized <italic>via</italic> ML, the essential traits are selected based on the learned model, and class dependencies are utilized to enhance accuracy when detecting fatigue.</p>
<p>The reviewed literature contributes substantially to the healthcare delivery system: patients can be monitored from remote places, which facilitates treatment, and medical devices can track patients and record their health status at all times during and after treatment. Healthcare professionals must enact appropriate intervention measures that keep pace with these changes in technology. The proposed research study also adds the most recent information and data on the errorless data fusion technique, and therefore contributes new material to the literature on the topic.</p>
</sec>
<sec id="s3">
<label>3</label>
<title>Proposed Errorless Data Fusion Technique</title>
<p>The patients&#x2019; bodies are fitted with a WS that detects their posture and transmits their walking pattern to the smart healthcare system. The goal of this work is to improve precision and eliminate errors in the medical data by integrating Feature Extraction (FE) and Feature Classification (FC). For sequential analysis of the walking patterns, DF is used to determine the minimal set of features for each time interval. The process flow of the suggested technique is depicted in <xref ref-type="fig" rid="fig-1">Fig. 1</xref>.</p>
<fig id="fig-1">
<label>Figure 1</label>
<caption>
<title>Flow diagram. Multiple sensors collect data from a patient at fixed intervals and forward data to perform feature extraction. Features are classified and passed on to apply data fusion algorithms</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_19706-fig-1.png"/>
</fig>
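<p>The flow in Fig. 1 can be sketched schematically as follows; the sensor streams, the mean/deviation features, the threshold rule, and the majority-vote fusion are all illustrative assumptions rather than the EDF algorithm itself.</p>

```python
import statistics

# Schematic sketch of the Fig. 1 flow with invented readings: each sensor
# stream covering one fixed interval is reduced to features, classified
# per sensor, and the per-sensor decisions are fused by majority vote.

def extract_features(window):
    """Per-interval features: mean level and population deviation."""
    return (statistics.mean(window), statistics.pstdev(window))

def classify(features, threshold=0.5):
    """Toy per-sensor decision: 1 = 'walking' when mean activity is high."""
    return int(features[0] > threshold)

def fuse(decisions):
    """Decision-level fusion: majority vote across sensors."""
    return int(sum(decisions) * 2 >= len(decisions))

# Three hypothetical wearable-sensor streams sampled over one interval.
streams = {
    "wrist": [0.9, 1.1, 0.8, 1.0],
    "ankle": [0.7, 0.9, 1.2, 0.8],
    "hip":   [0.1, 0.2, 0.1, 0.3],  # a noisy or mis-mounted sensor
}
decisions = [classify(extract_features(s)) for s in streams.values()]
posture = fuse(decisions)
```

<p>Fusing at the decision level lets the two agreeing sensors outvote the mis-mounted one, which is the motivation for combining several WSs rather than trusting any single stream.</p>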
<p>The FE is derived from the sensor using <xref ref-type="disp-formula" rid="eqn-1">Eq. (1)</xref>, and it incorporates data integrity, chaining, and data patterns.</p>
<p><disp-formula id="eqn-1">
<label>(1)</label>
<mml:math id="mml-eqn-1" display="block"><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>e</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>q</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mrow><mml:msubsup><mml:mrow><mml:munder><mml:mtext>n</mml:mtext><mml:mi>,</mml:mi></mml:munder></mml:mrow><mml:mi>o</mml:mi><mml:mo>&#x2212;</mml:mo></mml:msubsup></mml:mrow><mml:mo>+</mml:mo><mml:mfrac><mml:mn>1</mml:mn><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:munder><mml:msup><mml:mi>&#x03C9;</mml:mi><mml:mo>&#x2032;</mml:mo></mml:msup><mml:mo>,</mml:mo></mml:munder></mml:mrow></mml:munderover><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:munderover><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo 
stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:munder><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munder><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>i</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>c</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:munder><mml:msup><mml:mi>&#x03C9;</mml:mi><mml:mo>&#x2032;</mml:mo></mml:msup><mml:mo>,</mml:mo></mml:munder></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>The FE is performed on the collected sensor data. In that data, <inline-formula id="ieqn-1"><mml:math id="mml-ieqn-1"><mml:msub><mml:mi>i</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> represents integrity, which covers both the input and output data; chaining <inline-formula id="ieqn-2"><mml:math id="mml-ieqn-2"><mml:mi>c</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:math></inline-formula> is the sensor&#x2019;s continuous data stream; and the data pattern <inline-formula id="ieqn-3"><mml:math id="mml-ieqn-3"><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> refers to the patients&#x2019; walking patterns. All three are contained in the sensor-data features obtained through sensor acquisition <inline-formula id="ieqn-4"><mml:math id="mml-ieqn-4"><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>e</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:mi>q</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. 
<inline-formula id="ieqn-5"><mml:math id="mml-ieqn-5"><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> represents the data sensing and <inline-formula id="ieqn-6"><mml:math id="mml-ieqn-6"><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> denotes posture recognition; the data are <inline-formula id="ieqn-7"><mml:math id="mml-ieqn-7"><mml:mo fence="false" stretchy="false">{</mml:mo><mml:mi>d</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mi>d</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mn>2</mml:mn><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mo>&#x2026;</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo fence="false" stretchy="false">}</mml:mo></mml:math></inline-formula>, where <inline-formula id="ieqn-8"><mml:math id="mml-ieqn-8"><mml:mi>d</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:math></inline-formula> refers to a single data item and <inline-formula id="ieqn-9"><mml:math id="mml-ieqn-9"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> denotes the number of data items.</p>
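<p>Since the symbols above are defined only schematically, the following loose illustration computes the three components as simple statistics of one sensor window: integrity as the fraction of non-missing readings, chaining as the longest unbroken run of readings, and the pattern as the mean of the present readings. These interpretations are assumptions for illustration, not the paper&#x2019;s definitions.</p>

```python
# Loose illustration of the three feature components, under the assumed
# interpretations named above; None marks a dropped sensor reading.

def window_features(window):
    present = [x for x in window if x is not None]
    # integrity: fraction of readings that actually arrived
    integrity = len(present) / len(window)
    # chaining: longest unbroken (non-None) run, as a continuity measure
    run = best = 0
    for x in window:
        run = run + 1 if x is not None else 0
        best = max(best, run)
    chaining = best / len(window)
    # pattern: mean level of the readings that are present
    pattern = sum(present) / len(present) if present else 0.0
    return integrity, chaining, pattern

features = window_features([0.4, 0.5, None, 0.6, 0.5])
```

<p>For the sample window, one of five readings is missing and the longest unbroken run covers two of the five slots, so the triple separates completeness, continuity, and signal level.</p>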
<p>The walking patterns are denoted by <inline-formula id="ieqn-10"><mml:math id="mml-ieqn-10"><mml:mi>&#x03C9;</mml:mi></mml:math></inline-formula> and are detected over a specific time duration; they are tracked by the WS at each period as <inline-formula id="ieqn-11"><mml:math id="mml-ieqn-11"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mo>&#x2026;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. The features are denoted as <inline-formula id="ieqn-12"><mml:math id="mml-ieqn-12"><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, which contains <inline-formula id="ieqn-13"><mml:math id="mml-ieqn-13"><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>i</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>c</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mi>&#x03C9;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:math></inline-formula> This completes the extraction. <xref ref-type="disp-formula" rid="eqn-2">Eq. (2)</xref> is then utilized to obtain the data features.</p>
<p><disp-formula id="eqn-2">
<label>(2)</label>
<mml:math id="mml-eqn-2" display="block"><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing="1em 1em 0.2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:msub><mml:mi>i</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>q</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x2212;</mml:mo><mml:mfrac><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>]</mml:mo></mml:mrow><mml:mi>w</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:mtd></mml:mtr>
<mml:mtr><mml:mtd><mml:mrow><mml:mo>&#xE7;</mml:mo></mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>q</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mfrac><mml:mo>&#x2217;</mml:mo><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munderover><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo><mml:mo>+</mml:mo><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mtd></mml:mtr>
<mml:mtr><mml:mtd><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>w</mml:mi></mml:mrow></mml:munderover><mml:mrow><mml:mo>[</mml:mo><mml:mi>q</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mrow><mml:mover><mml:mi>X</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>w</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>]</mml:mo></mml:mrow><mml:mo>&#x2212;</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B1;</mml:mi><mml:mo>]</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>The integrity of the medical data, <inline-formula id="ieqn-14"><mml:math id="mml-ieqn-14"><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>&#x2212;</mml:mo><mml:mfrac><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>]</mml:mo></mml:mrow><mml:mi>&#x03C9;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>, represents the incoming and outgoing <inline-formula id="ieqn-15"><mml:math id="mml-ieqn-15"><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:mi>d</mml:mi><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mi>u</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:math></inline-formula>, which describe the patient&#x0027;s WB. A series of sensor data is represented by the chaining <inline-formula id="ieqn-16"><mml:math id="mml-ieqn-16"><mml:mfrac><mml:mrow><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>q</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mfrac></mml:math></inline-formula>, which is followed for the series of data inputs to the devices. <inline-formula id="ieqn-17"><mml:math id="mml-ieqn-17"><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:math></inline-formula> denotes a series of WS data, and <inline-formula id="ieqn-18"><mml:math id="mml-ieqn-18"><mml:mi>&#x03B1;</mml:mi></mml:math></inline-formula> represents a single step made by the patient during movement. These quantities are employed in the analysis of the data patterns, and all three are applied in <xref ref-type="disp-formula" rid="eqn-3">Eq. (3)</xref>.</p>
<p><disp-formula id="eqn-3">
<label>(3)</label>
<mml:math id="mml-eqn-3" display="block"><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:mi>q</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:munderover><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo stretchy="false">)</mml:mo><mml:mo>&#x2217;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mfrac><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>q</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mfrac><mml:mrow><mml:mi>q</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mrow><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext></mml:mrow><mml:mrow><mml:mrow><mml:mi 
mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mrow><mml:munder><mml:msup><mml:mi>&#x03C9;</mml:mi><mml:mo>&#x2032;</mml:mo></mml:msup><mml:mo>,</mml:mo></mml:munder></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mo>]</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mi>&#x03B1;</mml:mi></mml:math>
</disp-formula></p>
<p>To better extract the three features for the classification of the data, <xref ref-type="disp-formula" rid="eqn-3">Eq. (3)</xref> above is used to observe the fixed-time DA. The calculation of <inline-formula id="ieqn-19"><mml:math id="mml-ieqn-19"><mml:mfrac><mml:mrow><mml:mi>q</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mfrac></mml:math></inline-formula> denotes that the WB is tracked over a specific time interval, while <inline-formula id="ieqn-20"><mml:math id="mml-ieqn-20"><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mi>q</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:math></inline-formula> and <inline-formula id="ieqn-21"><mml:math id="mml-ieqn-21"><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula> are monitored to assess the posture outcomes at the sensor, with <inline-formula id="ieqn-22"><mml:math id="mml-ieqn-22"><mml:mi>&#x03B2;</mml:mi></mml:math></inline-formula> representing the data analysis. 
After <inline-formula id="ieqn-23"><mml:math id="mml-ieqn-23"><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>i</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>c</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>, the FC is completed. The least feature (LF), which describes the remaining error data, is utilized to minimize the error data during categorization. FC is thus achieved for posture identification, and the error is observed using <xref ref-type="disp-formula" rid="eqn-4">Eq. (4)</xref>.</p>
<p><disp-formula id="eqn-4">
<label>(4)</label>
<mml:math id="mml-eqn-4" display="block"><mml:mo>=</mml:mo><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:mi>q</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munderover><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo stretchy="false">)</mml:mo><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mrow><mml:mover><mml:mi>X</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:mi>d</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munderover><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B1;</mml:mi><mml:mo>]</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mrow><mml:munder><mml:msup><mml:mi>&#x03C9;</mml:mi><mml:mo>&#x2032;</mml:mo></mml:msup><mml:mo>,</mml:mo></mml:munder></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mfrac></mml:math>
</disp-formula></p>
<p>FE of the data is performed by detecting the WB of the patients, from which the categorization is obtained by assessing <xref ref-type="disp-formula" rid="eqn-4">Eq. (4)</xref>. <inline-formula id="ieqn-24"><mml:math id="mml-ieqn-24"><mml:mi mathvariant="normal">&#x2202;</mml:mi></mml:math></inline-formula> refers to the error data in identifying the WB, which contains network traffic (NT) and delay, represented as <inline-formula id="ieqn-25"><mml:math id="mml-ieqn-25"><mml:mi>&#x2205;</mml:mi><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:mi>d</mml:mi></mml:math></inline-formula>. These two concerns are resolved by utilizing the sensed WB pattern of the patients, <inline-formula id="ieqn-26"><mml:math id="mml-ieqn-26"><mml:mi>&#x03C9;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. The calculation of <inline-formula id="ieqn-27"><mml:math id="mml-ieqn-27"><mml:mrow><mml:mo>[</mml:mo><mml:mfrac><mml:mrow><mml:mi>&#x03C9;</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mrow></mml:mfrac><mml:mo>]</mml:mo></mml:mrow></mml:math></inline-formula> shows the walking detection affected by delay and NT. <?A3B2 "fig2",5,"anchor"?><xref ref-type="fig" rid="fig-2">Fig. 2a</xref> depicts the pattern classification procedure.</p>
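<p>The windowed sensing and feature-extraction step described above can be sketched in Python. This is a minimal illustration rather than the paper&#x0027;s implementation: the window length, the feature set (mean, standard deviation, and a rough peak count as a step proxy), and all function names are assumptions introduced here.</p>

```python
import numpy as np

def extract_window_features(samples, window=50):
    """Slide a fixed-length window over walking-behaviour (WB) sensor
    samples and compute simple per-window features, loosely following
    the windowed feature-extraction step described in the text.

    `samples` is a 1-D array of sensor readings; `window` is the number
    of readings per fixed time interval. Both names are illustrative.
    """
    features = []
    for start in range(0, len(samples) - window + 1, window):
        seg = samples[start:start + window]
        features.append({
            "mean": float(np.mean(seg)),   # average signal level in the window
            "std": float(np.std(seg)),     # variability within the window
            # Count local maxima as a crude proxy for steps taken:
            "peaks": int(np.sum(np.diff(np.sign(np.diff(seg))) < 0)),
        })
    return features

# Synthetic walking-like signal: a sinusoid plus noise (illustrative only).
readings = np.sin(np.linspace(0, 20 * np.pi, 500)) \
    + 0.1 * np.random.default_rng(0).normal(size=500)
feats = extract_window_features(readings, window=100)
print(len(feats))  # 5 windows from 500 samples with window=100
```

<p>Each resulting dictionary plays the role of one extracted feature vector per time interval; a downstream classifier would then separate error-affected windows from clean ones.</p>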
<p><inline-formula id="ieqn-28"><mml:math id="mml-ieqn-28"><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> is a series of data that prevents NT at a certain time interval <inline-formula id="ieqn-29"><mml:math id="mml-ieqn-29"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. The errors are discovered and then eliminated to improve precision. The classification is separated into two categories, error minimization and Least Feature Extraction (LFE), with the latter representing the remaining error. The LFE for classification is calculated using <xref ref-type="disp-formula" rid="eqn-5">Eq. (5)</xref>.</p>
<p><disp-formula id="eqn-5">
<label>(5)</label>
<mml:math id="mml-eqn-5" display="block"><mml:mi>&#x03B4;</mml:mi><mml:mo>=</mml:mo><mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo><mml:mtable columnalign="left" rowspacing="1.2em 1.2em 0.4em" columnspacing="1em"><mml:mtr><mml:mtd><mml:mfrac><mml:mrow><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x12F;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mfrac><mml:mo>&#x2217;</mml:mo><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:mi>&#x03B2;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>d</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munderover><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>o</mml:mi></mml:mrow></mml:msub></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mi mathvariant="italic">w</mml:mi><mml:mi mathvariant="italic">h</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow><mml:mo>,</mml:mo><mml:mtext>&#xA0;</mml:mtext><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo 
stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x2208;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:mi>w</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mi>l</mml:mi><mml:mo>,</mml:mo><mml:mi mathvariant="normal">&#x2200;</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi 
mathvariant="italic">r</mml:mi></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">d</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">n</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo>}</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>The LF error is identified by assessing <xref ref-type="disp-formula" rid="eqn-2">Eq. (2)</xref>; within <inline-formula id="ieqn-30"><mml:math id="mml-ieqn-30"><mml:mfrac><mml:mrow><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:msub><mml:mi>i</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mrow></mml:mfrac></mml:math></inline-formula>, the incoming and outgoing data are analyzed as <inline-formula id="ieqn-31"><mml:math id="mml-ieqn-31"><mml:mi>&#x03B2;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>, where <inline-formula id="ieqn-32"><mml:math id="mml-ieqn-32"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>&#x2208;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. 
The next stage is represented by the error identified in the second phase, which is written as <inline-formula id="ieqn-33"><mml:math id="mml-ieqn-33"><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>&#x03C9;</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>. In this situation, the data are retrieved from the least error (LE) and denoted as <inline-formula id="ieqn-34"><mml:math id="mml-ieqn-34"><mml:mi>&#x03B4;</mml:mi></mml:math></inline-formula>, which represents <inline-formula id="ieqn-35"><mml:math id="mml-ieqn-35"><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. <xref ref-type="disp-formula" rid="eqn-6">Eq. (6)</xref> can be used to illustrate the two kinds of classification.</p>
<p><disp-formula id="eqn-6">
<label>(6)</label>
<mml:math id="mml-eqn-6" display="block"><mml:mi>&#x03BE;</mml:mi><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing="1em 0.2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mfrac><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B1;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x2205;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mfrac><mml:mo>]</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo stretchy="false">)</mml:mo><mml:mo>,</mml:mo><mml:mi mathvariant="normal">&#x2200;</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">d</mml:mi><mml:mi 
mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">n</mml:mi></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x12F;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mo>,</mml:mo><mml:mi mathvariant="normal">&#x2200;</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi 
mathvariant="italic">t</mml:mi></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math>
</disp-formula></p>
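<p>The two-branch idea just formalized, separating windows flagged for error detection from those kept as least error, can be illustrated with a small sketch. The deviation-from-mean criterion, the threshold, and the function name are illustrative assumptions introduced here, not the model&#x0027;s exact rule.</p>

```python
def split_by_error(feature_windows, threshold):
    """Partition per-window feature dicts into an 'error detection' group
    and a 'least error' group: windows whose mean deviates strongly from
    the overall mean are treated as error data, the rest as least-error
    data. The criterion is an illustrative stand-in for the two-case
    classification described in the text.
    """
    means = [w["mean"] for w in feature_windows]
    centre = sum(means) / len(means)  # overall mean across all windows
    error, least_error = [], []
    for w in feature_windows:
        # Route each window to one of the two branches.
        (error if abs(w["mean"] - centre) > threshold else least_error).append(w)
    return error, least_error

# Hypothetical per-window features: the second window deviates strongly.
windows = [{"mean": 0.1}, {"mean": 5.0}, {"mean": 0.2}]
bad, ok = split_by_error(windows, threshold=2.0)
print(len(bad), len(ok))  # 1 2
```

<p>Minimizing the size of the error branch over successive intervals corresponds to the goal, stated next, of decreasing the error to enhance data precision.</p>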
<p>The classification <inline-formula id="ieqn-36"><mml:math id="mml-ieqn-36"><mml:mi>&#x03B3;</mml:mi></mml:math></inline-formula> is accomplished by calculating both the error and the LE of the patients&#x2019; posture identification, with the goal of decreasing the error. Both are derived by calculating <xref ref-type="disp-formula" rid="eqn-7">Eq. (7)</xref>, in which the error is first assessed and then decreased to enhance data precision. As a result, the formula <inline-formula id="ieqn-37"><mml:math id="mml-ieqn-37"><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mfrac><mml:mi>&#x03B1;</mml:mi><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> reflects the WB observed at each time interval, and the WB is recurrently evaluated in this way. <xref ref-type="disp-formula" rid="eqn-7">Eq. (7)</xref> is evaluated by combining <xref ref-type="disp-formula" rid="eqn-5">Eqs. (5)</xref> and <xref ref-type="disp-formula" rid="eqn-6">(6)</xref>, as follows:</p>
<p><disp-formula id="eqn-7">
<label>(7)</label>
<mml:math id="mml-eqn-7" display="block"><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03BE;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:mrow><mml:mover><mml:mrow><mml:munder><mml:msup><mml:mi>&#x03C9;</mml:mi><mml:mo>&#x2032;</mml:mo></mml:msup><mml:mo>,</mml:mo></mml:munder></mml:mrow><mml:mo>&#x00B4;</mml:mo></mml:mover></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo stretchy="false">)</mml:mo><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing="1em 1em 0.2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x12F;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo 
stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>d</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo><mml:mtext>&#xA0;</mml:mtext><mml:mi>s</mml:mi><mml:mi>u</mml:mi><mml:mi>c</mml:mi><mml:mi>h</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mi>t</mml:mi><mml:mi>h</mml:mi><mml:mi>a</mml:mi><mml:mi>t</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mi>d</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>&#x2208;</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mo>[</mml:mo><mml:mfrac><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B1;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x2205;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mfrac><mml:mo>]</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo 
stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mi mathvariant="italic">w</mml:mi><mml:mi mathvariant="italic">h</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow><mml:mo>,</mml:mo><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">d</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">l</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">y</mml:mi></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:mi>d</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">n</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">w</mml:mi><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">k</mml:mi></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">a</mml:mi><mml:mi mathvariant="italic">f</mml:mi><mml:mi mathvariant="italic">f</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">c</mml:mi></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mrow><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">d</mml:mi><mml:mi mathvariant="italic">u</mml:mi><mml:mi mathvariant="italic">c</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi 
mathvariant="italic">d</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-7">Eq. (7)</xref>, the term <inline-formula id="ieqn-38"><mml:math id="mml-ieqn-38"><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> is evaluated to obtain the series of incoming sensor data, and classification is performed on the resulting error-free data. Then, using <inline-formula id="ieqn-39"><mml:math id="mml-ieqn-39"><mml:mfrac><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula>, the error-bearing data are categorized, and the LE is obtained by evaluating <xref ref-type="disp-formula" rid="eqn-8">Eq. (8)</xref> as follows, which classifies the two types of posture data.</p>
<p><disp-formula id="eqn-8">
<label>(8)</label>
<mml:math id="mml-eqn-8" display="block"><mml:mi>&#x03BE;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>w</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mn>1</mml:mn><mml:mn>2</mml:mn></mml:mfrac><mml:mo>&#x2217;</mml:mo><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:mi>&#x03B2;</mml:mi></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:munderover><mml:msqrt><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mfrac></mml:msqrt><mml:mo>+</mml:mo><mml:msubsup><mml:mo>&#x222B;</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x12F;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>c</mml:mi><mml:mi 
mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mrow><mml:mo>&#xFE;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mo>&#xE5;</mml:mo></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mi>&#x03B1;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x2205;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>The WB characteristic is used to determine integrity, and the steps in which leg motions are computed are tracked accordingly. The WB classification is assessed using <xref ref-type="disp-formula" rid="eqn-8">Eq. (8)</xref>, and the sequence of incoming data is studied as <inline-formula id="ieqn-40"><mml:math id="mml-ieqn-40"><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. Here, <inline-formula id="ieqn-41"><mml:math id="mml-ieqn-41"><mml:mi>d</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>&#x2208;</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, and the features are represented by <inline-formula id="ieqn-42"><mml:math id="mml-ieqn-42"><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:mi>u</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mrow><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mrow></mml:mfrac></mml:math></inline-formula>, which yields <inline-formula id="ieqn-43"><mml:math id="mml-ieqn-43"><mml:mi>&#x03B1;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. DF is carried out in this way, and the classification module is derived by solving <xref ref-type="disp-formula" rid="eqn-9">Eq. (9)</xref>.</p>
<p><disp-formula id="eqn-9">
<label>(9)</label>
<mml:math id="mml-eqn-9" display="block"><mml:mtable columnalign="right left right left right left right left right left right left" rowspacing="3pt" columnspacing="0em 2em 0em 2em 0em 2em 0em 2em 0em 2em 0em" displaystyle="true"><mml:mtr><mml:mtd><mml:mi>&#x03C1;</mml:mi><mml:mo>=</mml:mo></mml:mtd><mml:mtd><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x12F;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mrow><mml:mo>&#xE7;</mml:mo></mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mrow><mml:mo>&#xFE;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mo>&#xE5;</mml:mo></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>&#x03B1;</mml:mi><mml:mo>+</mml:mo><mml:mi>l</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mrow></mml:munderover><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mo>]</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msqrt><mml:mfrac><mml:mrow><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo 
stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>&#x03B1;</mml:mi></mml:mrow></mml:munderover><mml:mi>&#x03B2;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:mi>l</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mrow><mml:mi>l</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:msqrt><mml:mo>]</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd /><mml:mtd><mml:mi></mml:mi><mml:mo>&#x2217;</mml:mo><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>&#x03BE;</mml:mi></mml:mrow><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:munderover><mml:mi>&#x2205;</mml:mi><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing="1em 0.2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo><mml:mtable rowspacing="1.2em 0.4em" columnspacing="1em"><mml:mtr><mml:mtd><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi 
mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo><mml:mo>&#x2217;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:munder><mml:mo movablelimits="true" form="prefix">min</mml:mo><mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:munder><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B1;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x2205;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo>}</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo><mml:mtable rowspacing="1.2em 0.4em" columnspacing="1em"><mml:mtr><mml:mtd><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi 
mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:msqrt><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mfrac></mml:msqrt><mml:mo>&#x2217;</mml:mo><mml:mi>l</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03BE;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo>}</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math>
</disp-formula></p>
<p>DF is identified by computing error-free data, and the LE is produced using <xref ref-type="disp-formula" rid="eqn-9">Eq. (9)</xref>. The recognition is denoted <inline-formula id="ieqn-44"><mml:math id="mml-ieqn-44"><mml:mi>&#x03C1;</mml:mi></mml:math></inline-formula>. The WB error is obtained consequentially, which accurately depicts the data. The time interval is reduced by <inline-formula id="ieqn-45"><mml:math id="mml-ieqn-45"><mml:munder><mml:mo movablelimits="true" form="prefix">min</mml:mo><mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:munder><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B1;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>. In <inline-formula id="ieqn-46"><mml:math id="mml-ieqn-46"><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, the patients&#x2019; walking patterns are determined. Because <inline-formula id="ieqn-47"><mml:math id="mml-ieqn-47"><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> is linked with DF over a given time interval, it employs <inline-formula id="ieqn-48"><mml:math id="mml-ieqn-48"><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03B3;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. <xref ref-type="fig" rid="fig-2">Fig. 2b</xref> depicts the feature-based posture categorization.</p>
<fig id="fig-2">
<label>Figure 2</label>
<caption>
<title>(a) Design of the classification procedure, with detection through pattern analysis and feature extraction. (b) Posture classification procedure, with detection through pattern analysis and feature extraction</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_19706-fig-2.png"/>
</fig>
<p>Features are examined in <inline-formula id="ieqn-49"><mml:math id="mml-ieqn-49"><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mfrac></mml:math></inline-formula>, which indicates the consequential data series and its analysis. The improved DF is observed by evaluating <xref ref-type="disp-formula" rid="eqn-8">Eqs. (8)</xref> and <xref ref-type="disp-formula" rid="eqn-9">(9)</xref>. The term <inline-formula id="ieqn-50"><mml:math id="mml-ieqn-50"><mml:mfrac><mml:mrow><mml:munderover><mml:mo>&#x220F;</mml:mo><mml:mrow><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>&#x03B1;</mml:mi></mml:mrow></mml:munderover><mml:mi>&#x03B2;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mrow><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mfrac></mml:math></inline-formula> is related in this instance, and the error is detected within the specified time interval. These terms are then used to calculate <inline-formula id="ieqn-51"><mml:math id="mml-ieqn-51"><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi></mml:math></inline-formula>.</p>
<p><disp-formula id="eqn-10">
<label>(10)</label>
<mml:math id="mml-eqn-10" display="block"><mml:mtable columnalign="left" rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd><mml:msub><mml:mi>&#x03C1;</mml:mi><mml:mrow><mml:mi>w</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03D1;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:mi>&#x03BE;</mml:mi></mml:mrow><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:munderover><mml:mi>&#x2205;</mml:mi><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mfrac></mml:mstyle><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:msubsup><mml:mo>&#x222B;</mml:mo><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mfrac><mml:mrow><mml:msub><mml:mrow><mml:mo>&#x12F;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo 
stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mrow><mml:mo>&#xE7;</mml:mo></mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mrow><mml:mo>&#xFE;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mo>&#xE5;</mml:mo></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mi>&#x03B1;</mml:mi></mml:mfrac></mml:mstyle><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mstyle></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mo>&#x2217;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mrow><mml:mo>&#x146;</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo stretchy="false">)</mml:mo><mml:mo>+</mml:mo><mml:msqrt><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi 
mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mfrac></mml:mstyle></mml:msqrt><mml:mo>&#x2217;</mml:mo><mml:mi>l</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03BE;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mstyle displaystyle="true" scriptlevel="0"><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mfrac></mml:mstyle><mml:mo>)</mml:mo></mml:mrow></mml:mtd></mml:mtr></mml:mtable></mml:math>
</disp-formula></p>
<p>Walking is identified <italic>via</italic> DF, which is used to perform the classification process; it is denoted by <inline-formula id="ieqn-52"><mml:math id="mml-ieqn-52"><mml:msub><mml:mi>&#x03C1;</mml:mi><mml:mrow><mml:mi>&#x03C9;</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03D1;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>. Both the serial and the current data are calculated, and incorporating an RF method improves accuracy. A time-based interval is used to make the decision in this method, and DF is observed more effectively.</p>
<sec id="s3_1">
<label>3.1</label>
<title>Random Forest (RF) for Data Fusion</title>
<p>RF, an ensemble of Decision Trees (DTs), is used here for DF as well as for classification, regression, and other tree-based tasks. Regression focuses on forecasting, whereas classification assigns class labels. In this study, classification-based RF is used to forecast the repeated analysis. The root and leaf nodes of the RF are described by classification and DF, and the root-node data features are split into <inline-formula id="ieqn-53"><mml:math id="mml-ieqn-53"><mml:mi>&#x03B3;</mml:mi><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:mi>d</mml:mi><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:mi>&#x03D1;</mml:mi></mml:math></inline-formula> according to the results. The first stage is to make a forecast, which is then assessed using <xref ref-type="disp-formula" rid="eqn-11">Eq. (11)</xref>.</p>
<p><disp-formula id="eqn-11">
<label>(11)</label>
<mml:math id="mml-eqn-11" display="block"><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:mo>=</mml:mo><mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo><mml:mn>1</mml:mn><mml:mo>,</mml:mo><mml:mspace width="1em" /><mml:mi>i</mml:mi><mml:mi>f</mml:mi><mml:mtable columnalign="left" rowspacing="0.9em 0.4em" columnspacing="1em"><mml:mtr><mml:mtd><mml:msubsup><mml:mrow><mml:mo movablelimits="false">&#x2211;</mml:mo></mml:mrow><mml:mrow><mml:mi>&#x03C1;</mml:mi></mml:mrow><mml:mrow><mml:mi>&#x03D1;</mml:mi></mml:mrow></mml:msubsup><mml:mo>&#x2061;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>l</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03B8;</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03C4;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mn>0</mml:mn><mml:mo>,</mml:mo><mml:mspace width="1em" /><mml:mrow><mml:mi mathvariant="italic">o</mml:mi><mml:mi mathvariant="italic">t</mml:mi><mml:mi mathvariant="italic">h</mml:mi><mml:mi mathvariant="italic">e</mml:mi><mml:mi mathvariant="italic">r</mml:mi><mml:mi mathvariant="italic">w</mml:mi><mml:mi mathvariant="italic">i</mml:mi><mml:mi mathvariant="italic">s</mml:mi><mml:mi mathvariant="italic">e</mml:mi></mml:mrow></mml:mtd></mml:mtr></mml:mtable><mml:mo>}</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
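As a rough, self-contained illustration (not the authors' implementation), the 1/0 decision in Eq. (11) can be read as a thresholded indicator over a weighted sum of fused features, gated by the margin between the two thresholds; the names `features`, `weights`, `theta`, and `tau` below are assumptions introduced for the sketch.

```python
# Hypothetical sketch of the binary decision indicator in Eq. (11):
# accumulate a weighted feature score and gate it by the (theta - tau)
# margin; Delta = 1 signals a positive forecast, 0 otherwise.
def delta_indicator(features, weights, theta, tau):
    """Return 1 when the weighted feature sum times the margin is positive."""
    score = sum(w * f for w, f in zip(weights, features))
    return 1 if score * (theta - tau) > 0 else 0
```

For example, with a positive score and theta above tau the indicator fires; swapping theta and tau flips the margin sign and suppresses it.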
<p>The computation of <inline-formula id="ieqn-54"><mml:math id="mml-ieqn-54"><mml:mi>&#x03C3;</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:mi>d</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mi>&#x03C4;</mml:mi></mml:math></inline-formula> is used to carry out the prediction process <inline-formula id="ieqn-55"><mml:math id="mml-ieqn-55"><mml:mi mathvariant="normal">&#x0394;</mml:mi></mml:math></inline-formula>, which incorporates both serial and current data. The computation of <inline-formula id="ieqn-56"><mml:math id="mml-ieqn-56"><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03C9;</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> represents the data features; hence, WB is observed. If an error arises during the preceding step, it is monitored again and then eliminated in a continual procedure. This is accomplished with a prediction-based technique in which <inline-formula id="ieqn-57"><mml:math id="mml-ieqn-57"><mml:mi>&#x03C3;</mml:mi><mml:mo>&#x2208;</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula> includes error-related data that are stored in the classification node. <xref ref-type="fig" rid="fig-3">Figs. 3a</xref>&#x02012;<xref ref-type="fig" rid="fig-3">3c</xref> depict the initial RF tree architecture, the training process, and the output process, respectively.</p>
<fig id="fig-3">
<label>Figure 3</label>
<caption>
<title>(a) Prediction process used for data fusion: random forest tree &#x0394; architecture. (b) Prediction process used for data fusion: training procedure. (c) Prediction process used for data fusion: splitting the root-node data into <inline-formula id="ieqn-58"><mml:math id="mml-ieqn-58"><mml:mi>&#x03B3;</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mi>a</mml:mi><mml:mi>n</mml:mi><mml:mi>d</mml:mi><mml:mtext>&#xA0;</mml:mtext><mml:mi>&#x03D1;</mml:mi></mml:math></inline-formula></title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_19706-fig-3.png"/>
</fig>
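To make the training stage of Fig. 3b concrete, the following is a minimal from-scratch sketch of the general random-forest idea applied to two posture classes: each tree is a one-feature decision stump fitted on a bootstrap sample, and the forest predicts by majority vote. All function names and the use of stumps rather than full trees are assumptions for illustration; this is not the authors' exact model.

```python
import random

def fit_stump(X, y):
    """Pick the (feature, threshold, flip) split with the fewest training errors."""
    best = None
    for j in range(len(X[0])):
        for t in sorted({row[j] for row in X}):
            pred = [1 if row[j] > t else 0 for row in X]
            err = sum(p != yi for p, yi in zip(pred, y))
            err, flip = min((err, 0), (len(y) - err, 1))  # allow inverted stump
            if best is None or err < best[0]:
                best = (err, j, t, flip)
    return best[1:]  # (feature index, threshold, flip flag)

def predict_stump(stump, row):
    j, t, flip = stump
    p = 1 if row[j] > t else 0
    return 1 - p if flip else p

def fit_forest(X, y, n_trees=7, seed=0):
    """Fit n_trees stumps, each on a bootstrap resample of the data."""
    rng = random.Random(seed)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(X)) for _ in range(len(X))]
        forest.append(fit_stump([X[i] for i in idx], [y[i] for i in idx]))
    return forest

def predict_forest(forest, row):
    """Majority vote over all stumps; class 1 wins ties."""
    votes = sum(predict_stump(s, row) for s in forest)
    return 1 if 2 * votes >= len(forest) else 0
```

With linearly separable one-dimensional data such as `X = [[0.1], [0.2], [0.8], [0.9]]`, `y = [0, 0, 1, 1]`, a single stump already finds the separating threshold; bootstrapping and voting are what give the ensemble its robustness to noisy samples.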
<p>DF yields <inline-formula id="ieqn-59"><mml:math id="mml-ieqn-59"><mml:mi>&#x03D1;</mml:mi><mml:mo>&#x2208;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03B3;</mml:mi><mml:mo>+</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo stretchy="false">)</mml:mo><mml:mi>&#x03C9;</mml:mi></mml:math></inline-formula>, and the posture data contain both error data and classification data. The main concept is to use ML to determine which node is best for DF processing in order to obtain higher precision. Accordingly, the RF is a training-based approach for avoiding a series of errors. The RF training is examined using <xref ref-type="disp-formula" rid="eqn-12">Eq. (12)</xref>.</p>
<p><disp-formula id="eqn-12">
<label>(12)</label>
<mml:math id="mml-eqn-12" display="block"><mml:mi>&#x03C1;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mrow><mml:mi>&#x03B8;</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03C4;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03B1;</mml:mi><mml:mo>&#x2217;</mml:mo><mml:mi>&#x03D1;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msubsup><mml:mo movablelimits="false">&#x220F;</mml:mo><mml:mrow><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B8;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math>
</disp-formula></p>
<p>The prediction-based technique used for the analysis is denoted <inline-formula id="ieqn-60"><mml:math id="mml-ieqn-60"><mml:mi>&#x03C1;</mml:mi><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. It is assessed by <inline-formula id="ieqn-61"><mml:math id="mml-ieqn-61"><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mrow><mml:mi>&#x03C3;</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03C4;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>, and a large number of posture-related data points are matched sequentially to anticipate the current data and enhance the process. This is accomplished by assessing <inline-formula id="ieqn-62"><mml:math id="mml-ieqn-62"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, which represents training data containing a sequential data error. For further processing of posture-related WB, the forecast is produced by calculating <inline-formula id="ieqn-63"><mml:math id="mml-ieqn-63"><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>. The assessment results are generated in <xref ref-type="disp-formula" rid="eqn-13">Eq. (13)</xref> by evaluating the DF and finding the optimal node for forecasting.</p>
<p><disp-formula id="eqn-13">
<label>(13)</label>
<mml:math id="mml-eqn-13" display="block"><mml:mi>&#x03D1;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03C1;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>=</mml:mo><mml:mrow><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo><mml:mtable columnalign="left" rowspacing="0.9em 0.4em" columnspacing="1em"><mml:mtr><mml:mtd><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:mi>&#x03B1;</mml:mi><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi></mml:mrow><mml:mi>&#x03BC;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:msubsup><mml:mo movablelimits="false">&#x220F;</mml:mo><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mrow></mml:msubsup><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03BE;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:msubsup><mml:mo movablelimits="false">&#x220F;</mml:mo><mml:mrow><mml:mi>&#x03B4;</mml:mi></mml:mrow><mml:mrow><mml:mi>l</mml:mi></mml:mrow></mml:msubsup><mml:mi>&#x03BE;</mml:mi><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:mi>&#x03B1;</mml:mi><mml:mo>&#x2217;</mml:mo><mml:mi>w</mml:mi></mml:mrow><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mi 
mathvariant="normal">&#x0394;</mml:mi></mml:mfrac></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub></mml:mtd></mml:mtr></mml:mtable><mml:mo>}</mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>The detection for improved prediction is calculated by assessing <xref ref-type="disp-formula" rid="eqn-13">Eq. (13)</xref>, in which <inline-formula id="ieqn-64"><mml:math id="mml-ieqn-64"><mml:mfrac><mml:mrow><mml:mi>&#x03B1;</mml:mi><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi></mml:mrow><mml:mi>&#x03BC;</mml:mi></mml:mfrac></mml:math></inline-formula> are utilized to represent the steps of WB as well as the repeated analysis. The outcome is associated after features are identified in a chaining way. The prediction technique for finding nodes on the RF is shown by the estimation <inline-formula id="ieqn-65"><mml:math id="mml-ieqn-65"><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:mi>&#x03B1;</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mi>&#x03C9;</mml:mi></mml:mrow><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mi mathvariant="normal">&#x0394;</mml:mi></mml:mfrac></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>. As a result, it incorporates the <inline-formula id="ieqn-66"><mml:math id="mml-ieqn-66"><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:mi>&#x03B4;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> least error prediction. To obtain higher precision, the prediction is performed using RF by formulating <xref ref-type="disp-formula" rid="eqn-14">Eq. (14)</xref>.</p>
<p><disp-formula id="eqn-14">
<label>(14)</label>
<mml:math id="mml-eqn-14" display="block"><mml:mi>&#x03B8;</mml:mi><mml:mo>=</mml:mo><mml:mrow><mml:mo>{</mml:mo><mml:mtable columnalign="left left" rowspacing=".2em" columnspacing="1em" displaystyle="false"><mml:mtr><mml:mtd><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>w</mml:mi></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03BC;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mi>&#x03C1;</mml:mi><mml:mrow><mml:mi>&#x03B8;</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03C4;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mo movablelimits="false">&#x220F;</mml:mo><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:msub><mml:mi>&#x03B4;</mml:mi><mml:mo>+</mml:mo><mml:mi>&#x03C1;</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mi>w</mml:mi><mml:mo>+</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mi>l</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi 
mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>l</mml:mi></mml:mrow></mml:mfrac><mml:mo>&#x003C;</mml:mo><mml:mn>1</mml:mn></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:munderover><mml:mo>&#x2211;</mml:mo><mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>w</mml:mi></mml:mrow></mml:munderover><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03BC;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>&#x2217;</mml:mo><mml:msub><mml:mo movablelimits="false">&#x220F;</mml:mo><mml:mrow><mml:mi>o</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2217;</mml:mo><mml:mtext>&#xA0;</mml:mtext><mml:mi>&#x03BE;</mml:mi><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:mi>w</mml:mi><mml:mo>+</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mi>l</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mrow><mml:mover><mml:mi>o</mml:mi><mml:mo stretchy="false">&#x00AF;</mml:mo></mml:mover></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>w</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mtext>\raisebox{1.5pt}{--}\kern-7ptX</mml:mtext><mml:mrow><mml:mrow><mml:mi mathvariant="script">T</mml:mi></mml:mrow></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>l</mml:mi></mml:mrow></mml:mfrac><mml:mo>&#x003E;</mml:mo><mml:mn>1</mml:mn></mml:mtd></mml:mtr></mml:mtable><mml:mo fence="true" stretchy="true" symmetric="true"></mml:mo></mml:mrow></mml:math>
</disp-formula></p>
<p>In <xref ref-type="disp-formula" rid="eqn-14">Eq. (14)</xref>, when these two criteria are utilized, the RF is used to obtain higher precision, and the outcome is either larger than or less than 1. The computation <inline-formula id="ieqn-67"><mml:math id="mml-ieqn-67"><mml:mi>&#x03C9;</mml:mi><mml:mo>+</mml:mo><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mi mathvariant="normal">&#x2202;</mml:mi></mml:mfrac></mml:math></inline-formula> depicts WB, calculates the error, and reduces the calculated error from the subsequent posture identification. The identification is based on frequent data analysis in a timely manner. In <inline-formula id="ieqn-68"><mml:math id="mml-ieqn-68"><mml:mfrac><mml:mi>&#x03C1;</mml:mi><mml:mrow><mml:mi>&#x03C3;</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03C4;</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula>, the detection denotes the serial and current data. The first condition is preferable to the other since it fulfills the DF and improves precision more. The output processing of the classification is shown in <xref ref-type="fig" rid="fig-3">Fig. 3c</xref>.</p>
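If RF here denotes a random-forest-style ensemble (an assumption on our part; the acronym is not expanded in this section), the node-selection idea behind Eqs. (13) and (14) can be pictured with a minimal sketch: bootstrap-sampled one-feature decision stumps whose majority vote yields the posture prediction. Every name below is illustrative, not the authors' implementation.

```python
import random

def train_stump(rows, labels):
    """Exhaustively fit a one-feature threshold classifier (decision stump)."""
    best = None  # (error, feature, threshold, left_label, right_label)
    classes = sorted(set(labels))
    for f in range(len(rows[0])):
        values = sorted({r[f] for r in rows})
        for a, b in zip(values, values[1:]):
            thr = (a + b) / 2.0
            for left in classes:
                for right in classes:
                    err = sum(1 for r, y in zip(rows, labels)
                              if (left if r[f] < thr else right) != y)
                    if best is None or err < best[0]:
                        best = (err, f, thr, left, right)
    if best is None:  # degenerate sample: no split point exists
        maj = max(set(labels), key=labels.count)
        return (0, float("-inf"), maj, maj)
    return best[1:]

def fit_forest(rows, labels, n_trees=15, seed=0):
    """Train n_trees stumps, each on a bootstrap resample of the data."""
    rng = random.Random(seed)
    n = len(rows)
    forest = []
    for _ in range(n_trees):
        idx = [rng.randrange(n) for _ in range(n)]
        forest.append(train_stump([rows[i] for i in idx],
                                  [labels[i] for i in idx]))
    return forest

def predict_forest(forest, row):
    """Majority vote over the stumps' individual predictions."""
    votes = [(left if row[f] < thr else right)
             for f, thr, left, right in forest]
    return max(set(votes), key=votes.count)
```

The two criteria of Eq. (14) would then correspond to choosing between ensemble branches by their error ratio; the sketch only shows the generic bootstrap-and-vote pattern.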
<p>If there is a decrease in DF and prediction during classification, <inline-formula id="ieqn-69"><mml:math id="mml-ieqn-69"><mml:mi>&#x03B2;</mml:mi><mml:mo>&#x2208;</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03BC;</mml:mi><mml:mrow><mml:mtext>&#xA0;</mml:mtext></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mtext>&#xA0;</mml:mtext><mml:mi mathvariant="normal">&#x0394;</mml:mi></mml:math></inline-formula> is called to update the features. By assessment, it optimizes the model with less error for recurrent analysis utilizing RF. It also offers higher posture accuracy and identifies the WB of the patient.</p>
</sec>
</sec>
<sec id="s4">
<label>4</label>
<title>Results and Discussion</title>
<p>Through a comparative study, this section evaluates the performance of the proposed EDF approach. The comparison considers three metrics: recognition accuracy, fusion error, and detection time. The proposed EDF is compared with the existing techniques TCNN [<xref ref-type="bibr" rid="ref-20">20</xref>], MVEA [<xref ref-type="bibr" rid="ref-22">22</xref>], and LSTM-CNN [<xref ref-type="bibr" rid="ref-15">15</xref>]. The material in [<xref ref-type="bibr" rid="ref-34">34</xref>&#x2013;<xref ref-type="bibr" rid="ref-36">36</xref>] is used to analyze the proposed approach. The metrics are estimated using Dataset A from this source, which is 16 MB in size and contains the posture patterns of 20 participants. Five of the 20 participants were chosen to examine the recognition of 12 occurrences each. The subjects were named Sub1, Sub2, &#x2026;, Sub5; Sub1 and Sub4 were young subjects. During training, 110 posture patterns were assessed from the front and back of the observation point, with a maximum of 20 features extracted over a 4 s observation interval.</p>
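As a rough illustration of the experimental setup described above, the sketch below segments a wearable-sensor stream into 4 s observation windows and extracts a small, capped feature set per window. The 50 Hz sampling rate and the specific per-axis statistics are assumptions; the study states only the 4 s interval and the 20-feature maximum.

```python
from statistics import mean, stdev

SAMPLE_RATE_HZ = 50          # assumed; not stated in the study
WINDOW_SECONDS = 4           # observation interval from the study
WINDOW = SAMPLE_RATE_HZ * WINDOW_SECONDS
MAX_FEATURES = 20            # feature cap from the study

def windows(samples):
    """Yield consecutive non-overlapping 4 s windows of (x, y, z) samples."""
    for start in range(0, len(samples) - WINDOW + 1, WINDOW):
        yield samples[start:start + WINDOW]

def extract_features(window):
    """Per-axis mean/std/min/max plus magnitude statistics, capped at MAX_FEATURES."""
    feats = []
    for axis in range(3):
        vals = [s[axis] for s in window]
        feats += [mean(vals), stdev(vals), min(vals), max(vals)]
    mags = [(x * x + y * y + z * z) ** 0.5 for x, y, z in window]
    feats += [mean(mags), stdev(mags), min(mags), max(mags)]
    return feats[:MAX_FEATURES]
```

With these choices each window yields 16 features, comfortably under the 20-feature cap.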
<sec id="s4_1">
<label>4.1</label>
<title>Recognition Accuracy</title>
<p>The recognition accuracy is examined in <?A3B2 "fig4",5,"anchor"?><xref ref-type="fig" rid="fig-4">Fig. 4</xref> by changing the patterns, features, and time intervals. While calculating <inline-formula id="ieqn-70"><mml:math id="mml-ieqn-70"><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>q</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>, when the posture is detected and the WB is acquired, the accuracy is excellent. The feature data <inline-formula id="ieqn-71"><mml:math id="mml-ieqn-71"><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula> is evaluated using the posture patterns. The posture features are retrieved, and the patterns are detected at a specified period <inline-formula id="ieqn-72"><mml:math id="mml-ieqn-72"><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>g</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mfrac><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mi>&#x03C9;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>. 
By comparing the WB with the posture database <inline-formula id="ieqn-73"><mml:math id="mml-ieqn-73"><mml:mi>&#x03C9;</mml:mi><mml:mo>+</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, the accuracy of the derived features is improved. LFs are retrieved from WB. This generated information is analyzed <italic>via</italic> eliminating errors. Following FE, classification is performed in order to eliminate error and achieve the LF error. The computation of <inline-formula id="ieqn-74"><mml:math id="mml-ieqn-74"><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo><mml:mo>+</mml:mo><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow></mml:mrow><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mi>a</mml:mi></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> is performed in a specified time duration for the sequence of data inputs, <inline-formula id="ieqn-75"><mml:math id="mml-ieqn-75"><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:mi>c</mml:mi><mml:mrow><mml:mi 
mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>u</mml:mi><mml:mi mathvariant="normal">&#x2032;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. For each set time interval, the identification accuracy varies for features. As a result, it displays different patterns and features for each time period, <inline-formula id="ieqn-76"><mml:math id="mml-ieqn-76"><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>. The suggested EDF improves the recognition of posture patterns.</p>
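How recognition accuracy varies with the number of retrieved features can be illustrated with a toy nearest-centroid classifier restricted to the first n features. This is a stand-in chosen for brevity, not the EDF classifier, and the data in the test is synthetic.

```python
def fit_centroids(X, y):
    """Per-class mean vector over the training windows."""
    centroids = {}
    for label in sorted(set(y)):
        pts = [x for x, lab in zip(X, y) if lab == label]
        centroids[label] = [sum(c) / len(pts) for c in zip(*pts)]
    return centroids

def predict(centroids, x, n_features):
    """Classify using only the first n_features dimensions."""
    def sqdist(c):
        return sum((a - b) ** 2
                   for a, b in zip(c[:n_features], x[:n_features]))
    return min(centroids, key=lambda lab: sqdist(centroids[lab]))

def recognition_accuracy(centroids, X, y, n_features):
    hits = sum(predict(centroids, x, n_features) == lab
               for x, lab in zip(X, y))
    return hits / len(y)
```

When the informative dimension is excluded from the feature budget, accuracy falls to chance level, mirroring the feature-count dependence discussed above.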
<p>The above data was obtained by monitoring the patient&#x2019;s movement at different time intervals. The patient wore a wearable device connected to the client&#x2019;s computer system, and data collection was enabled through IoT technology. The IoT connects patients with hospitals, ensuring a sufficient information flow between patients and doctors. The wearable device mounted on the patient recorded important information, such as posture, at various intervals, revealing the trend in the distribution of data, features, and time intervals. The data proves more authentic than that of the benchmark paper, and the resulting dataset has components that healthcare organizations can use to make important decisions about patient treatment. It also indicates how the fusion error can be reduced by adopting new technology to support patient medication.</p>
</sec>
<sec id="s4_2">
<label>4.2</label>
<title>Fusion Error</title>
<p>By assessing<inline-formula id="ieqn-77"><mml:math id="mml-ieqn-77"><mml:mrow><mml:mo>[</mml:mo><mml:mfrac><mml:mrow><mml:mi>&#x03C9;</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>y</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mrow></mml:mfrac><mml:mo>]</mml:mo></mml:mrow></mml:math></inline-formula>, and detecting the patients&#x0027; WB from that classification, the fusion error is reduced. The fusion of walking patterns is detected by minimizing the error and LFE <inline-formula id="ieqn-78"><mml:math id="mml-ieqn-78"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>&#x2208;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. The error minimization at the initial step of data acquisition is represented by the formulation of <inline-formula id="ieqn-79"><mml:math id="mml-ieqn-79"><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B2;</mml:mi><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi></mml:math></inline-formula>. As a consequence, the detection comprises <inline-formula id="ieqn-80"><mml:math id="mml-ieqn-80"><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:mi>&#x2205;</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula>, for a data sequence, and analyzes data from the devices, which leads to NT being avoided. 
The categorization of feature is linked to the integrity, chaining, and data patterns that are calculated,<inline-formula id="ieqn-81"><mml:math id="mml-ieqn-81"><mml:mrow><mml:mo>[</mml:mo><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>i</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>c</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo>+</mml:mo><mml:msub><mml:mi>p</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mfrac><mml:mi>&#x03B1;</mml:mi><mml:mi>&#x03B3;</mml:mi></mml:mfrac><mml:mo>]</mml:mo></mml:mrow></mml:math></inline-formula>. The suggested technique reduces the fusion error when compared to the other techniques, TCNN, MVEA, and LSTM-CNN. Training data are derived through feature classification and are collected at a certain time period. By comparing the feature, error, and time interval, as <inline-formula id="ieqn-82"><mml:math id="mml-ieqn-82"><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>, the fusion error is reduced. 
By deriving <inline-formula id="ieqn-83"><mml:math id="mml-ieqn-83"><mml:mrow><mml:mo>(</mml:mo><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mo>+</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula> the time-based DF error is reduced for changing features, time intervals, and training data, as shown in <?A3B2 "fig5",5,"anchor"?><xref ref-type="fig" rid="fig-5">Fig. 5</xref>.</p>
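A minimal sketch of the fusion-error metric itself, assuming confidence-weighted averaging of per-sensor estimates and mean absolute deviation from ground truth (the paper does not spell out the exact formula, so both choices are assumptions):

```python
def fuse(estimates, weights):
    """Confidence-weighted fusion of per-sensor estimates for one window."""
    total = sum(weights)
    return sum(e * w for e, w in zip(estimates, weights)) / total

def fusion_error(fused_series, reference_series):
    """Mean absolute deviation of the fused output from ground truth."""
    return (sum(abs(f - r) for f, r in zip(fused_series, reference_series))
            / len(reference_series))
```

Under this reading, reducing fusion error amounts to choosing weights that pull the fused series toward the reference posture values.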
<fig id="fig-4">
	<label>Figure 4</label>
	<caption>
		<title>(a) Recognition accuracy for varying patterns. (b) Recognition accuracy for varying features. (c) Recognition accuracy for varying time intervals</title>
	</caption>
	<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_19706-fig-4.png"/>
	</fig>
</sec>
<sec id="s4_3">
<label>4.3</label>
<title>Detection Time</title>
<p>The proposed EDF demonstrates posture recognition in a short time span, with the LFE indicating DF from classification. It searches the walking patterns and training data at the particular time period,<inline-formula id="ieqn-84"><mml:math id="mml-ieqn-84"><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03B3;</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mfrac><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mi>&#x03B2;</mml:mi></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula>, as shown in <?A3B2 "fig6",5,"anchor"?><xref ref-type="fig" rid="fig-6">Fig. 6</xref>. The starting time and finishing time of processing are marked as <inline-formula id="ieqn-85"><mml:math id="mml-ieqn-85"><mml:mrow><mml:mo>(</mml:mo><mml:mi>&#x03C1;</mml:mi><mml:mo>+</mml:mo><mml:mi>&#x03B4;</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mi>&#x03B1;</mml:mi><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> in the derivation of this posture recognition. <inline-formula id="ieqn-86"><mml:math id="mml-ieqn-86"><mml:mn>1</mml:mn><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mo stretchy="false">(</mml:mo><mml:msub><mml:mi>m</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>o</mml:mi><mml:mrow><mml:mi mathvariant="normal">&#x2032;</mml:mi></mml:mrow><mml:mo stretchy="false">)</mml:mo></mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub></mml:mfrac></mml:math></inline-formula> is the result of computing the consequence of a data series. 
To monitor and eliminate the continuous process, the detection is acquired <inline-formula id="ieqn-87"><mml:math id="mml-ieqn-87"><mml:mrow><mml:mo>(</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03C9;</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>x</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi></mml:mrow></mml:mfrac><mml:mo>)</mml:mo></mml:mrow></mml:math></inline-formula> for features at the appropriate time. The categorization of error data is, <inline-formula id="ieqn-88"><mml:math id="mml-ieqn-88"><mml:mi>&#x03D1;</mml:mi><mml:mo>&#x2208;</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03B3;</mml:mi><mml:mo>+</mml:mo><mml:mi mathvariant="normal">&#x2202;</mml:mi><mml:mo stretchy="false">)</mml:mo><mml:mi>&#x03C9;</mml:mi></mml:math></inline-formula>. It shows improvement while training the data at the acquisition time. It associates the forecast with the posture pattern and sequential data processing. <inline-formula id="ieqn-89"><mml:math id="mml-ieqn-89"><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:mo>+</mml:mo><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03C3;</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03C4;</mml:mi><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula> is included in the analysis where they are linked to the <inline-formula id="ieqn-90"><mml:math id="mml-ieqn-90"><mml:msub><mml:mi>r</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo stretchy="false">(</mml:mo><mml:mi>&#x03C9;</mml:mi><mml:mo>+</mml:mo><mml:msub><mml:mi>s</mml:mi><mml:mrow><mml:mi>e</mml:mi></mml:mrow></mml:msub><mml:mo stretchy="false">)</mml:mo></mml:math></inline-formula>. 
From this, the detection is calculated as <inline-formula id="ieqn-91"><mml:math id="mml-ieqn-91"><mml:mfrac><mml:mi>&#x03C1;</mml:mi><mml:mrow><mml:mi>&#x03C3;</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mi>&#x03C4;</mml:mi></mml:mrow></mml:mfrac></mml:math></inline-formula> to the updating of features by <inline-formula id="ieqn-92"><mml:math id="mml-ieqn-92"><mml:mi>&#x03B2;</mml:mi><mml:mo>&#x2208;</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mn>0</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mi>&#x03BC;</mml:mi><mml:mrow><mml:mo>&#x2217;</mml:mo></mml:mrow><mml:mi mathvariant="normal">&#x0394;</mml:mi></mml:math></inline-formula>. At a predetermined time interval, the prediction of current and successive posture is examined, as shown in <xref ref-type="fig" rid="fig-6">Fig. 6</xref>. Sub3&#x2019;s accuracy and error displaying 20 features are shown in <?A3B2 "tbl1",5,"anchor"?><xref ref-type="table" rid="table-1">Tab. 1</xref>.</p>
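Detection time of the kind reported here can be measured by timing each classification call. The sketch below assumes wall-clock timing with `time.perf_counter` and an arbitrary classifier function; it is a measurement harness, not part of the EDF method.

```python
import time

def timed_detection(classify, window):
    """Classify one window and measure the wall-clock detection time."""
    start = time.perf_counter()
    label = classify(window)
    return label, time.perf_counter() - start

def mean_detection_time(classify, windows, repeats=5):
    """Average detection time over several passes to smooth out timer jitter."""
    times = [timed_detection(classify, w)[1]
             for _ in range(repeats) for w in windows]
    return sum(times) / len(times)
```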
<fig id="fig-5">
	<label>Figure 5</label>
	<caption>
		<title>(a) Fusion error for varying features. (b) Fusion error for varying time intervals. (c) Fusion error for varying training data</title>
	</caption>
	<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_19706-fig-5.png"/>
	</fig>
<fig id="fig-6">
<label>Figure 6</label>
<caption>
<title>Detection time for inconsistent patterns and training data</title>
</caption>
<graphic mimetype="image" mime-subtype="png" xlink:href="CMC_19706-fig-6.png"/>
</fig>
<p>The result of the study shows the relationship between the accuracy ratio and the error. Accuracy and error are inversely proportional, although the relationship varies with the pattern the subject displays in each direction. This estimation takes into account the subject&#x2019;s height, speed, and pause time, as well as variations in orientation and motion angle, all of which affect precision and error. <?A3B2 "tbl2",5,"anchor"?><xref ref-type="table" rid="table-2">Tab. 2</xref> shows five subjects&#x2019; accuracy under various features.</p>
<p>Sub1 and Sub5 have extremely precise movement patterns, and this precision holds across a variety of features. This is due to the subjects&#x2019; near-identical patterns and maximally fused sensor data. Furthermore, because Sub1 and Sub4 are young, they take less time to display the various patterns; consequently, the number of features extracted for these two subjects is high. The correlation between these patterns is typically strong, and the accuracy is therefore higher. The error and patterns for the subjects are shown in <?A3B2 "tbl3",5,"anchor"?><xref ref-type="table" rid="table-3">Tab. 3</xref>.</p>
<p>The inverse relationship between feature extraction and error is evident from the results shown in <xref ref-type="table" rid="table-3">Tab. 3</xref>. The error is reduced as the number of FEs rises, which is achieved by attaining a large count of fused patterns. Increasing the feature count improves the correlation between the multiple stored inputs, yielding reliable posture identification across various scenarios. As a result, the accuracy of posture recognition improves and the error is reduced. The computation time is long for highly fused patterns, and it is especially long for Sub2 because of the multiple iterations involved.</p>
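The reported drop in error as the count of fused patterns grows is consistent with elementary averaging statistics: the mean of n independent noisy observations has an error that shrinks roughly as 1/&#x221A;n. The following Monte-Carlo check is a hypothetical illustration on synthetic Gaussian noise, not a computation on the study's data.

```python
import random

def mean_abs_error(n_patterns, noise=0.3, trials=2000, seed=1):
    """Monte-Carlo estimate of |error| after averaging n_patterns
    independent noisy observations of the same posture value."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        obs = [rng.gauss(0.0, noise) for _ in range(n_patterns)]
        total += abs(sum(obs) / n_patterns)
    return total / trials
```

Averaging 16 fused patterns instead of 1 cuts the expected absolute error by roughly a factor of four in this synthetic setting.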
<p>The result of the research study indicates that the errorless data fusion approach can help address the unique needs of the patients. It allows the medical practitioners to monitor the posture of the patients and address their needs. The data collected from advanced medical devices is likely to improve patient care and enhance the delivery of healthcare.</p>
<table-wrap id="table-1">
	<label>Table 1</label>
	<caption><title>Accuracy and error in a different direction (for Sub3)</title></caption>
	
	<table>
		<colgroup>
			<col/>
				<col/>
					<col/>
						<col/>
							<col/>
								<col/>
								</colgroup>
								<thead>
									<tr>
										<th align="left" rowspan="2">Features</th>
										<th align="left" rowspan="2">Time interval</th>
										<th align="left" colspan="2">Left</th>
										<th align="left" colspan="2">Right</th>
									</tr>
									<tr>
										<th align="left">Accuracy</th>
										<th align="left">Error</th>
										<th align="left">Accuracy</th>
										<th align="left">Error</th>
									</tr>
										</thead>
										<tbody>
											<tr>
												<td align="left">5</td>
												<td align="left">12.51</td>
												<td align="left">85.65</td>
												<td align="left">0.0518</td>
												<td align="left">84.601</td>
												<td align="left">0.0681</td>
											</tr>
											<tr>
												<td align="left">6</td>
												<td align="left">17.39</td>
												<td align="left">88.73</td>
												<td align="left">0.0358</td>
												<td align="left">82.943</td>
												<td align="left">0.1272</td>
											</tr>
											<tr>
												<td align="left">7</td>
												<td align="left">9.64</td>
												<td align="left">91.43</td>
												<td align="left">0.1123</td>
												<td align="left">81.991</td>
												<td align="left">0.1684</td>
											</tr>
											<tr>
												<td align="left">8</td>
												<td align="left">12.61</td>
												<td align="left">79.36</td>
												<td align="left">0.0976</td>
												<td align="left">84.312</td>
												<td align="left">0.1316</td>
											</tr>
											<tr>
												<td align="left">9</td>
												<td align="left">7.34</td>
												<td align="left">79.52</td>
												<td align="left">0.0186</td>
												<td align="left">84.788</td>
												<td align="left">0.1999</td>
											</tr>
											<tr>
												<td align="left">10</td>
												<td align="left">7.34</td>
												<td align="left">84.81</td>
												<td align="left">0.1129</td>
												<td align="left">94.768</td>
												<td align="left">0.0689</td>
											</tr>
											<tr>
												<td align="left">11</td>
												<td align="left">13.34</td>
												<td align="left">82.6</td>
												<td align="left">0.0234</td>
												<td align="left">85.947</td>
												<td align="left">0.0574</td>
											</tr>
											<tr>
												<td align="left">12</td>
												<td align="left">9.66</td>
												<td align="left">84</td>
												<td align="left">0.1312</td>
												<td align="left">85.558</td>
												<td align="left">0.1528</td>
											</tr>
											<tr>
												<td align="left">13</td>
												<td align="left">15.83</td>
												<td align="left">95.58</td>
												<td align="left">0.1797</td>
												<td align="left">77.732</td>
												<td align="left">0.0561</td>
											</tr>
											<tr>
												<td align="left">14</td>
												<td align="left">5.68</td>
												<td align="left">85.93</td>
												<td align="left">0.1286</td>
												<td align="left">85.943</td>
												<td align="left">0.1498</td>
											</tr>
											<tr>
												<td align="left">15</td>
												<td align="left">14.36</td>
												<td align="left">91.6</td>
												<td align="left">0.047</td>
												<td align="left">90.244</td>
												<td align="left">0.046</td>
											</tr>
											<tr>
												<td align="left">16</td>
												<td align="left">10.63</td>
												<td align="left">90.57</td>
												<td align="left">0.0757</td>
												<td align="left">82.913</td>
												<td align="left">0.1117</td>
											</tr>
											<tr>
												<td align="left">17</td>
												<td align="left">16.98</td>
												<td align="left">93.87</td>
												<td align="left">0.0708</td>
												<td align="left">84.192</td>
												<td align="left">0.0634</td>
											</tr>
											<tr>
												<td align="left">18</td>
												<td align="left">15.56</td>
												<td align="left">92.57</td>
												<td align="left">0.1599</td>
												<td align="left">92.843</td>
												<td align="left">0.0142</td>
											</tr>
											<tr>
												<td align="left">19</td>
												<td align="left">10.2</td>
												<td align="left">86.49</td>
												<td align="left">0.0932</td>
												<td align="left">84.314</td>
												<td align="left">0.1463</td>
											</tr>
											<tr>
												<td align="left">20</td>
												<td align="left">18.09</td>
												<td align="left">94.731</td>
												<td align="left">0.0336</td>
												<td align="left">92.035</td>
												<td align="left">0.1503</td>
											</tr>
											</tbody>
</table>
</table-wrap>
<table-wrap id="table-2">
<label>Table 2</label>
<caption>
<title>Accuracy (%) of the five subjects for different numbers of features</title>
</caption>
<table>
<colgroup>
<col/>
<col/>
<col/>
<col/>
<col/>
<col/>
</colgroup>
<thead>
<tr>
<th>Features</th>
<th>Sub1</th>
<th>Sub2</th>
<th>Sub3</th>
<th>Sub4</th>
<th>Sub5</th>
</tr>
</thead>
<tbody>
<tr>
<td>5</td>
<td>79.6669</td>
<td>82.2481</td>
<td>89.4749</td>
<td>80.469</td>
<td>92.793</td>
</tr>
<tr>
<td>6</td>
<td>80.7802</td>
<td>82.3904</td>
<td>88.5191</td>
<td>80.269</td>
<td>92.042</td>
</tr>
<tr>
<td>7</td>
<td>81.1904</td>
<td>85.6005</td>
<td>85.1466</td>
<td>81.693</td>
<td>93.888</td>
</tr>
<tr>
<td>8</td>
<td>80.5274</td>
<td>85.1991</td>
<td>86.328</td>
<td>82.65</td>
<td>90.316</td>
</tr>
<tr>
<td>9</td>
<td>78.9743</td>
<td>84.8541</td>
<td>89.8168</td>
<td>82.727</td>
<td>90.981</td>
</tr>
<tr>
<td>10</td>
<td>79.5009</td>
<td>81.9002</td>
<td>85.974</td>
<td>81.196</td>
<td>95.166</td>
</tr>
<tr>
<td>11</td>
<td>80.6072</td>
<td>81.9016</td>
<td>85.3281</td>
<td>81.105</td>
<td>93.209</td>
</tr>
<tr>
<td>12</td>
<td>80.2143</td>
<td>87.2282</td>
<td>87.1611</td>
<td>82.272</td>
<td>90.933</td>
</tr>
<tr>
<td>13</td>
<td>79.6867</td>
<td>83.6132</td>
<td>88.0086</td>
<td>82.825</td>
<td>95.884</td>
</tr>
<tr>
<td>14</td>
<td>81.3808</td>
<td>85.6749</td>
<td>87.9456</td>
<td>80.625</td>
<td>91.789</td>
</tr>
<tr>
<td>15</td>
<td>78.5056</td>
<td>83.785</td>
<td>89.6364</td>
<td>80.905</td>
<td>92.738</td>
</tr>
<tr>
<td>16</td>
<td>80.9325</td>
<td>82.9618</td>
<td>88.8079</td>
<td>82.927</td>
<td>90.397</td>
</tr>
<tr>
<td>17</td>
<td>79.247</td>
<td>85.1505</td>
<td>88.812</td>
<td>82.257</td>
<td>92.145</td>
</tr>
<tr>
<td>18</td>
<td>80.0675</td>
<td>86.0757</td>
<td>87.8409</td>
<td>81.795</td>
<td>91.011</td>
</tr>
<tr>
<td>19</td>
<td>80.3892</td>
<td>82.8792</td>
<td>88.7359</td>
<td>81.004</td>
<td>91.601</td>
</tr>
<tr>
<td>20</td>
<td>81.0892</td>
<td>82.4575</td>
<td>89.8023</td>
<td>82.147</td>
<td>92.77</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>The movement patterns of Sub1 and Sub5 show high accuracy. This is due to their comparable patterns and the combined sensor information from the subjects. Sub1 and Sub4 are both young, and each subject is therefore treated as a separate case in <xref ref-type="table" rid="table-3">Tab. 3</xref> to measure precision and accuracy. Consequently, the number of extracted features for Sub1 and Sub4 is high. The correlation of these patterns is expectedly high, and the accuracy improves accordingly.</p>
<table-wrap id="table-3">
	<label>Table 3</label>
	<caption><title>Error ratio and fused patterns for different subjects</title></caption>
	
	
	<table>
		<colgroup>
			<col/>
				<col/>
					<col/>
						<col/>
							<col/>
							</colgroup>
							<thead>
								<tr>
									<th align="left">S's</th>
									<th align="left">Features</th>
									<th align="left">Error (%)</th>
									<th align="left">Computation time</th>
									<th align="left">Fused patterns</th>
								</tr>
							</thead>
							<tbody>
								<tr>
									<td align="left">1</td>
									<td align="left">18</td>
									<td align="left">0.019</td>
									<td align="left">0.468</td>
									<td align="left">91</td>
								</tr>
								<tr>
									<td align="left">2</td>
									<td align="left">11</td>
									<td align="left">0.0952</td>
									<td align="left">0.961</td>
									<td align="left">49</td>
								</tr>
								<tr>
									<td align="left">3</td>
									<td align="left">10</td>
									<td align="left">0.154</td>
									<td align="left">0.157</td>
									<td align="left">89</td>
								</tr>
								<tr>
									<td align="left">4</td>
									<td align="left">16</td>
									<td align="left">0.055</td>
									<td align="left">0.693</td>
									<td align="left">78</td>
								</tr>
								<tr>
									<td align="left">5</td>
									<td align="left">14</td>
									<td align="left">0.171</td>
									<td align="left">0.219</td>
									<td align="left">56</td>
								</tr>
</tbody>
</table>
</table-wrap>
<p>The inverse relationship between the number of extracted features and the error is evident from the results in <xref ref-type="table" rid="table-3">Tab. 3</xref>. As more features are extracted, the error decreases; this is achieved through a high number of fused patterns. The feature-check adjustments improve the correlation between the various information sources, and the recognition of posture across different patterns is therefore high. Accurate localization of the posture enhances the precision and consequently lowers the error rate. The computation time of a highly fused model is generally high, and in some cases (for Sub2) it is high because of numerous iterations.</p>
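<p>The benefit of a larger pool of fused patterns can be illustrated with a small simulation (a conceptual sketch only, not the authors&#x2019; implementation): when several independent, better-than-chance posture decisions are fused by majority vote, the fused error falls rapidly as the number of fused patterns grows, mirroring the trend in Tab. 3. The per-decision error rate of 0.15 and the pattern counts below are illustrative values, not the measured ones.</p>

```python
import random

def fused_error(n_patterns, per_pattern_error, trials=20000, seed=7):
    """Estimate the error of majority-vote fusion over n_patterns
    independent decisions, each wrong with probability per_pattern_error."""
    rng = random.Random(seed)
    wrong = 0
    for _ in range(trials):
        # Count how many of the fused decisions are wrong in this trial.
        mistakes = sum(per_pattern_error > rng.random() for _ in range(n_patterns))
        if mistakes * 2 > n_patterns:  # a majority of the fused decisions is wrong
            wrong += 1
    return wrong / trials

# Error shrinks as the number of fused patterns grows.
for n in (1, 9, 49, 91):
    print(n, round(fused_error(n, 0.15), 4))
```

<p>With a single pattern the fused error matches the per-decision error; with dozens of fused patterns it becomes negligible, which is consistent with the direction of the trend reported for the EDF approach.</p>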
<p>The research study results indicate that the EDF approach can help medical practitioners to monitor the posture of the patients and address their unique needs.</p>
</sec>
</sec>
<sec id="s5">
<label>5</label>
<title>Limitations of the Proposed EDF Method</title>
<p>The EDF method has certain limitations, such as its inability to handle uncertainty and inconsistency. Combining data from many sources with a multisensor DF algorithm exploits data redundancy to help minimize uncertainty; however, these sources can also produce inconsistent data and poor fusion when the multisensor DF performs worse than each individual sensor.</p>
<p>The other limitation is the inability of EDF to address the diverse needs of patients. The research focuses on a specific group of patients, which implies that it cannot address the needs of all patients requiring specific treatment. The research also focuses only on posture recognition accuracy and cannot be applied directly in other areas. It is thus important to extend the study to ensure that EDF covers other areas and concerns of the patients.</p>
</sec>
<sec id="s6">
<label>6</label>
<title>Conclusion</title>
<p>This paper discusses the EDF approach for increasing the accuracy of posture identification through multi-feature analysis. First, the patients&#x2019; walking patterns are observed at various time intervals. The extracted features are then evaluated against the stored data using an RF classifier. This procedure is repeated over several time periods so that the iterations can efficiently detect classification mistakes. Finally, conditional training is used to fuse the disaggregated, errorless data and find the posture pattern that fits the stored pattern. Patterns and features are frequently evaluated in this classification process, and conditional training is computed based on the prior error in order to improve identification accuracy. The results reveal that the proposed approach improves accuracy while reducing fusion errors, detection errors, and computation time.</p>
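<p>The pipeline summarized above, filtering out misclassified windows and fusing the remainder, can be sketched as follows. This is a minimal illustration with hypothetical helper names; a simple distance test stands in for the paper&#x2019;s RF classifier, and the threshold and pattern values are illustrative.</p>

```python
def window_error(features, stored_pattern, threshold=0.2):
    """Flag a gait window as misclassified when its mean absolute
    deviation from the stored pattern exceeds the threshold.
    (Stand-in for the RF classification step; names are hypothetical.)"""
    distance = sum(abs(f - s) for f, s in zip(features, stored_pattern))
    return distance / len(features) > threshold

def edf_fuse(windows, stored_pattern):
    """Discard erroneous windows, then fuse the survivors by averaging,
    mirroring the fuse-only-errorless-data step described above."""
    kept = [w for w in windows if not window_error(w, stored_pattern)]
    if not kept:
        return None
    return [sum(w[i] for w in kept) / len(kept) for i in range(len(kept[0]))]

stored = [0.5, 0.5, 0.5]                      # stored reference pattern
windows = [[0.48, 0.52, 0.50],                # consistent window: kept
           [0.90, 0.10, 0.95],                # deviating window: filtered out
           [0.52, 0.49, 0.51]]                # consistent window: kept
print(edf_fuse(windows, stored))
```

<p>Only the two consistent windows contribute to the fused pattern; the deviating window is rejected before fusion, which is the mechanism by which the approach keeps the fusion and detection errors low.</p>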
</sec>
</body>
<back>
<ack>
<p>Thanks to our families and colleagues who gave us moral support.</p>
</ack>
<fn-group>
<fn fn-type="other">
<p><bold>Funding Statement:</bold> The authors received no specific funding for this study.</p>
</fn>
<fn fn-type="conflict">
<p><bold>Conflicts of Interest:</bold> The authors declare that they have no conflicts of interest to report regarding the present study.</p>
</fn>
</fn-group>
<ref-list content-type="authoryear">
<title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>H.</given-names> <surname>Zhang</surname></string-name> and <string-name><given-names>Y.</given-names> <surname>Li</surname></string-name></person-group>, &#x201C;<article-title>A low-power dynamic-range relaxed analog front end for wearable heart rate and blood oximetry sensor</article-title>,&#x201D; <source>IEEE Sensors Journal</source>, vol. <volume>19</volume>, no. <issue>19</issue>, pp. <fpage>8387</fpage>&#x2013;<lpage>8392</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>F.</given-names> <surname>Sun</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Mao</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Fan</surname></string-name> and <string-name><given-names>Y.</given-names> <surname>Li</surname></string-name></person-group>, &#x201C;<article-title>Accelerometer-based speed-adaptive gait authentication method for wearable IoT devices</article-title>,&#x201D; <source>IEEE Internet of Things Journal</source>, vol. <volume>6</volume>, no. <issue>1</issue>, pp. <fpage>820</fpage>&#x2013;<lpage>830</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Jafarzadeh</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Brooks</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Yu</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Prabhakaran</surname></string-name> and <string-name><given-names>Y.</given-names> <surname>Tadesse</surname></string-name></person-group>, &#x201C;<article-title>A wearable sensor vest for social humanoid robots with GPGPU, IoT, and modular software architecture</article-title>,&#x201D; <source>Robotics and Autonomous Systems</source>, vol. <volume>139</volume>, pp. <fpage>103536</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>C. P.</given-names> <surname>Walmsley</surname></string-name>, <string-name><given-names>S. A.</given-names> <surname>Williams</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Grisbrook</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Elliott</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Imms</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Measurement of upper limb range of motion using wearable sensors: A systematic review</article-title>,&#x201D; <source>Sports Medicine Open</source>, vol. <volume>4</volume>, no. <issue>1</issue>, pp. <fpage>1</fpage>&#x2013;<lpage>22</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Guo</surname></string-name> and <string-name><given-names>Z.</given-names> <surname>Wang</surname></string-name></person-group>, &#x201C;<article-title>Segmentation and recognition of human motion sequences using wearable inertial sensors</article-title>,&#x201D; <source>Multimedia Tools and Applications</source>, vol. <volume>77</volume>, no. <issue>16</issue>, pp. <fpage>21201</fpage>&#x2013;<lpage>21220</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>C.</given-names> <surname>Cui</surname></string-name>, <string-name><given-names>G. B.</given-names> <surname>Bian</surname></string-name>, <string-name><given-names>Z. G.</given-names> <surname>Hou</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Zhao</surname></string-name>, <string-name><given-names>G.</given-names> <surname>Su</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Simultaneous recognition and assessment of post-stroke hemiparetic gait by fusing kinematic, kinetic, and electrophysiological data</article-title>,&#x201D; <source>IEEE Transactions on Neural Systems &#x0026; Rehabilitation Engineering</source>, vol. <volume>26</volume>, no. <issue>4</issue>, pp. <fpage>856</fpage>&#x2013;<lpage>864</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>E.</given-names> <surname>Dorschky</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Nitschke</surname></string-name>, <string-name><given-names>A. K.</given-names> <surname>Seifer</surname></string-name>, <string-name><given-names>A. J. V. D.</given-names> <surname>Bogert</surname></string-name> and <string-name><given-names>B. M.</given-names> <surname>Eskofier</surname></string-name></person-group>, &#x201C;<article-title>Estimation of posture kinematics and kinetics from inertial sensor data using optimal control of musculoskeletal models</article-title>,&#x201D; <source>Journal of Biomechanics</source>, vol. <volume>95</volume>, no. <issue>8</issue>, pp. <fpage>109278</fpage>&#x2013;<lpage>109288</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K.</given-names> <surname>Hyodo</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Kanamori</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Kadone</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Takahashi</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Kajiwara</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Posture analysis comparing kinematic, kinetic, and muscle activation data of modern and conventional total knee arthroplasty</article-title>,&#x201D; <source>Arthroplasty Today</source>, vol. <volume>6</volume>, no. <issue>3</issue>, pp. <fpage>338</fpage>&#x2013;<lpage>342</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>E.</given-names> <surname>Gianaria</surname></string-name> and <string-name><given-names>M.</given-names> <surname>Grangetto</surname></string-name></person-group>, &#x201C;<article-title>Robust posture identification using Kinect dynamic skeleton data</article-title>,&#x201D; <source>Multimedia Tools and Applications</source>, vol. <volume>78</volume>, no. <issue>10</issue>, pp. <fpage>13925</fpage>&#x2013;<lpage>13948</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Qiu</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Li</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Zhao</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Body sensor network-based posture quality assessment for clinical decision-support via multi-sensor fusion</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>7</volume>, pp. <fpage>59884</fpage>&#x2013;<lpage>59894</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Zhang</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Huang</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Sun</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Zhao</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Guo</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Static and dynamic human arm/hand gesture capturing and recognition via multi-information fusion of flexible strain sensors</article-title>,&#x201D; <source>IEEE Sensors Journal</source>, vol. <volume>20</volume>, no. <issue>12</issue>, pp. <fpage>6450</fpage>&#x2013;<lpage>6459</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Muzammal</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Talat</surname></string-name>, <string-name><given-names>A. H.</given-names> <surname>Sodhro</surname></string-name> and <string-name><given-names>S.</given-names> <surname>Pirbhulal</surname></string-name></person-group>, &#x201C;<article-title>A multi-sensor data fusion enabled ensemble approach for medical data from body sensor networks</article-title>,&#x201D; <source>Information Fusion</source>, vol. <volume>53</volume>, pp. <fpage>155</fpage>&#x2013;<lpage>164</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>F. M.</given-names> <surname>Castro</surname></string-name>, <string-name><given-names>M. J.</given-names> <surname>Mar&#x00ED;n-Jim&#x00E9;nez</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Guil</surname></string-name> and <string-name><given-names>N. P. D. L.</given-names> <surname>Blanca</surname></string-name></person-group>, &#x201C;<article-title>Multimodal feature fusion for CNN-based posture recognition: An empirical comparison</article-title>,&#x201D; <source>Neural Computing and Applications</source>, vol. <volume>60</volume>, no. <issue>2</issue>, pp. <fpage>1</fpage>&#x2013;<lpage>9</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T. M.</given-names> <surname>Ghazal</surname></string-name>, <string-name><given-names>M. K.</given-names> <surname>Hasan</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Hassan</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Islam</surname></string-name>, <string-name><given-names>S. N. H. S.</given-names> <surname>Abdullah</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Security vulnerabilities, attacks, threats and the proposed countermeasures for the internet of things applications</article-title>,&#x201D; <source>Solid State Technology</source>, vol. <volume>63</volume>, no. <issue>1</issue>, pp. <fpage>2513</fpage>&#x2013;<lpage>2521</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Ghazvini</surname></string-name>, <string-name><given-names>S. N. H. S.</given-names> <surname>Abdullah</surname></string-name>, <string-name><given-names>M. K.</given-names> <surname>Hasan</surname></string-name> and <string-name><given-names>D. Z. A. B.</given-names> <surname>Kasim</surname></string-name></person-group>, &#x201C;<article-title>Crime spatiotemporal prediction with fused objective function in time delay neural network</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>115167</fpage>&#x2013;<lpage>115183</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. K.</given-names> <surname>Hasan</surname></string-name>, <string-name><given-names>M. M.</given-names> <surname>Ahmed</surname></string-name>, <string-name><given-names>S. S.</given-names> <surname>Musa</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Islam</surname></string-name> and <string-name><given-names>S. N.</given-names> <surname>Abdullah</surname></string-name></person-group>, &#x201C;<article-title>An improved dynamic thermal current rating model for PMU-based wide area measurement framework for reliability analysis utilizing sensor cloud system</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>18</volume>, no. <issue>9</issue>, pp. <fpage>14446</fpage>&#x2013;<lpage>14458</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A. H. M.</given-names> <surname>Aman</surname></string-name>, <string-name><given-names>E.</given-names> <surname>Yadegaridehkordi</surname></string-name>, <string-name><given-names>Z. S.</given-names> <surname>Attarbashi</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Hassan</surname></string-name> and <string-name><given-names>Y. J.</given-names> <surname>Park</surname></string-name></person-group>, &#x201C;<article-title>A survey on trend and classification of internet of things reviews</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>111763</fpage>&#x2013;<lpage>111782</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>R.</given-names> <surname>Hassan</surname></string-name>, <string-name><given-names>F.</given-names> <surname>Qamar</surname></string-name>, <string-name><given-names>M. K.</given-names> <surname>Hasan</surname></string-name>, <string-name><given-names>A. H. M.</given-names> <surname>Aman</surname></string-name> and <string-name><given-names>A. S.</given-names> <surname>Ahmed</surname></string-name></person-group>, &#x201C;<article-title>Internet of things and its applications: A comprehensive survey</article-title>,&#x201D; <source>Symmetry</source>, vol. <volume>12</volume>, no. <issue>10</issue>, pp. <fpage>1674</fpage>&#x2013;<lpage>1684</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Meri</surname></string-name>, <string-name><given-names>M. K.</given-names> <surname>Hasa</surname></string-name> and <string-name><given-names>N.</given-names> <surname>Safie</surname></string-name></person-group>, &#x201C;<article-title>Success factors affecting the healthcare professionals to utilize cloud computing services</article-title>,&#x201D; <source>Asia-Pacific Journal of Information Technology and Multimedia</source>, vol. <volume>6</volume>, no. <issue>2</issue>, pp. <fpage>31</fpage>&#x2013;<lpage>42</lpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-20"><label>[20]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Cang</surname></string-name> and <string-name><given-names>H.</given-names> <surname>Yu</surname></string-name></person-group>, &#x201C;<article-title>A data fusion-based hybrid sensory system for older people&#x2019;s daily activity and daily routine recognition</article-title>,&#x201D; <source>IEEE Sensors Journal</source>, vol. <volume>18</volume>, no. <issue>16</issue>, pp. <fpage>6874</fpage>&#x2013;<lpage>6888</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-21"><label>[21]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>J.</given-names> <surname>Gao</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Gu</surname></string-name>, <string-name><given-names>Q.</given-names> <surname>Ren</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Zhang</surname></string-name> and <string-name><given-names>X.</given-names> <surname>Song</surname></string-name></person-group>, &#x201C;<article-title>Abnormal posture recognition algorithm based on LSTM-CNN fusion network</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>7</volume>, pp. <fpage>163180</fpage>&#x2013;<lpage>163190</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-22"><label>[22]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S. B.</given-names> <surname>Zahra</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Abbas</surname></string-name>, <string-name><given-names>K. M.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Ghamdi</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Marker-based and marker-less motion capturing video data: Person and activity identification comparison based on machine learning approaches</article-title>,&#x201D; <source>Computers, Materials &#x0026; Continua</source>, vol. <volume>66</volume>, no. <issue>2</issue>, pp. <fpage>1269</fpage>&#x2013;<lpage>1282</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-23"><label>[23]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Al-Amin</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Tao</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Doell</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Lingard</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Yin</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Action recognition in manufacturing assembly using multimodal sensor fusion</article-title>,&#x201D; <source>Procedia Manufacturing</source>, vol. <volume>39</volume>, no. <issue>3</issue>, pp. <fpage>158</fpage>&#x2013;<lpage>167</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-24"><label>[24]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>X.</given-names> <surname>Wang</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Zhang</surname></string-name></person-group>, &#x201C;<article-title>Posture feature extraction and posture classification using two-branch CNN</article-title>,&#x201D; <source>Multimedia Tools and Applications</source>, vol. <volume>79</volume>, no. <issue>3&#x2013;4</issue>, pp. <fpage>2917</fpage>&#x2013;<lpage>2930</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-25"><label>[25]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>E.</given-names> <surname>Fendri</surname></string-name>, <string-name><given-names>I.</given-names> <surname>Chtourou</surname></string-name> and <string-name><given-names>M.</given-names> <surname>Hammami</surname></string-name></person-group>, &#x201C;<article-title>Posture-based person re-identification under covariate factors</article-title>,&#x201D; <source>Pattern Analysis and Applications</source>, vol. <volume>22</volume>, no. <issue>4</issue>, pp. <fpage>1629</fpage>&#x2013;<lpage>1642</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-26"><label>[26]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>H. F.</given-names> <surname>Nweke</surname></string-name>, <string-name><given-names>Y. W.</given-names> <surname>Teh</surname></string-name>, <string-name><given-names>G.</given-names> <surname>Mujtaba</surname></string-name>, <string-name><given-names>U. R.</given-names> <surname>Alo</surname></string-name> and <string-name><given-names>M. A. A.</given-names> <surname>Garadi</surname></string-name></person-group>, &#x201C;<article-title>Multi-sensor fusion based on multiple classifier systems for human activity identification</article-title>,&#x201D; <source>Human Centric Computing and Information Sciences</source>, vol. <volume>9</volume>, no. <issue>1</issue>, pp. <fpage>1</fpage>&#x2013;<lpage>14</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-27"><label>[27]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Abbas</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Athar</surname></string-name>, <string-name><given-names>S. A.</given-names> <surname>Shan</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Saeed</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Enabling smart city with intelligent congestion control using hops with a hybrid computational approach</article-title>,&#x201D; <source>The Computer Journal</source>, vol. <volume>12</volume>, no. <issue>8</issue>, pp. <fpage>286</fpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-28"><label>[28]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>L.</given-names> <surname>Tran</surname></string-name> and <string-name><given-names>D.</given-names> <surname>Choi</surname></string-name></person-group>, &#x201C;<article-title>Data augmentation for inertial sensor-based posture deep neural network</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>12364</fpage>&#x2013;<lpage>12378</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-29"><label>[29]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Hanif</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Abbas</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Khan</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Iqbal</surname></string-name>, <string-name><given-names>Z. U.</given-names> <surname>Rehman</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>A novel and efficient multiple RGB images cipher based on chaotic system and circular shift operations</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, no. <issue>8</issue>, pp. <fpage>146408</fpage>&#x2013;<lpage>146427</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-30"><label>[30]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>N.</given-names> <surname>Dawar</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Ostadabbas</surname></string-name> and <string-name><given-names>N.</given-names> <surname>Kehtarnavaz</surname></string-name></person-group>, &#x201C;<article-title>Data augmentation in deep learning-based fusion of depth and inertial sensing for action recognition</article-title>,&#x201D; <source>IEEE Sensors Letters</source>, vol. <volume>3</volume>, no. <issue>1</issue>, pp. <fpage>1</fpage>&#x2013;<lpage>4</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-31"><label>[31]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Q.</given-names> <surname>Zou</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Ni</surname></string-name>, <string-name><given-names>Q.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>Q.</given-names> <surname>Li</surname></string-name> and <string-name><given-names>S.</given-names> <surname>Wang</surname></string-name></person-group>, &#x201C;<article-title>Robust posture recognition by integrating inertial and RGBD sensors</article-title>,&#x201D; <source>IEEE Transactions on Cybernetics</source>, vol. <volume>48</volume>, no. <issue>4</issue>, pp. <fpage>1136</fpage>&#x2013;<lpage>1150</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-32"><label>[32]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Fan</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Jin</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Ge</surname></string-name> and <string-name><given-names>N.</given-names> <surname>Wang</surname></string-name></person-group>, &#x201C;<article-title>Wearable motion attitude detection and data analysis based on internet of things</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>1327</fpage>&#x2013;<lpage>1338</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-33"><label>[33]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Islam</surname></string-name>, <string-name><given-names>A. H.</given-names> <surname>Abdalla</surname></string-name> and <string-name><given-names>M. K.</given-names> <surname>Hasan</surname></string-name></person-group>, &#x201C;<article-title>Novel multihoming-based flow mobility scheme for proxy NEMO environment: A numerical approach to analyse handoff performance</article-title>,&#x201D; <source>Science Asia</source>, vol. <volume>1</volume>, no. <issue>43</issue>, pp. <fpage>27</fpage>&#x2013;<lpage>34</lpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-34"><label>[34]</label><mixed-citation publication-type="other"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Zheng</surname></string-name></person-group>, &#x201C;<article-title>Gait dataset</article-title>,&#x201D; <year>2009</year>. <comment>Retrieved 31 December 2020</comment>. [Online]. Available: <uri>http://www.cbsr.ia.ac.cn/users/szheng/?page_id&#x003D;71</uri>.</mixed-citation></ref>
<ref id="ref-35"><label>[35]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Akhtaruzzaman</surname></string-name>, <string-name><given-names>M. K.</given-names> <surname>Hasan</surname></string-name>, <string-name><given-names>S. R.</given-names> <surname>Kabir</surname></string-name>, <string-name><given-names>S. N.</given-names> <surname>Abdullah</surname></string-name> and <string-name><given-names>M. J.</given-names> <surname>Sadeq</surname></string-name></person-group>, &#x201C;<article-title>HSIC bottleneck based distributed deep learning model for load forecasting in smart grid with a comprehensive survey</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>7</volume>, pp. <fpage>34567</fpage>&#x2013;<lpage>34589</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-36"><label>[36]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. K.</given-names> <surname>Hasan</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Islam</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Sulaiman</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Khan</surname></string-name> and <string-name><given-names>A. H.</given-names> <surname>Hashim</surname></string-name></person-group>, &#x201C;<article-title>Lightweight encryption technique to enhance medical image security on internet of medical things applications</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>9</volume>, pp. <fpage>47731</fpage>&#x2013;<lpage>47742</lpage>, <year>2021</year>.</mixed-citation></ref>
</ref-list>
</back>
</article>
