<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xml:lang="en" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">CMC</journal-id>
<journal-id journal-id-type="nlm-ta">CMC</journal-id>
<journal-id journal-id-type="publisher-id">CMC</journal-id>
<journal-title-group>
<journal-title>Computers, Materials &#x0026; Continua</journal-title>
</journal-title-group>
<issn pub-type="epub">1546-2226</issn>
<issn pub-type="ppub">1546-2218</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">35360</article-id>
<article-id pub-id-type="doi">10.32604/cmc.2023.035360</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Relative-Position Estimation Based on Loosely Coupled UWB&#x2013;IMU Fusion for Wearable IoT Devices</article-title>
<alt-title alt-title-type="left-running-head">Relative-Position Estimation Based on Loosely Coupled UWB&#x2013;IMU Fusion for Wearable IoT Devices</alt-title>
<alt-title alt-title-type="right-running-head">Relative-Position Estimation Based on Loosely Coupled UWB&#x2013;IMU Fusion for Wearable IoT Devices</alt-title>
</title-group>
<contrib-group>
<contrib id="author-1" contrib-type="author">
<name name-style="western"><surname>Sharifuzzaman Sagar</surname><given-names>A. S. M.</given-names></name><xref ref-type="aff" rid="aff-1">1</xref></contrib>
<contrib id="author-2" contrib-type="author">
<name name-style="western"><surname>Kim</surname><given-names>Taein</given-names></name><xref ref-type="aff" rid="aff-1">1</xref></contrib>
<contrib id="author-3" contrib-type="author">
<name name-style="western"><surname>Park</surname><given-names>Soyoung</given-names></name><xref ref-type="aff" rid="aff-1">1</xref></contrib>
<contrib id="author-4" contrib-type="author">
<name name-style="western"><surname>Lee</surname><given-names>Hee Seh</given-names></name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-5" contrib-type="author" corresp="yes">
<name name-style="western"><surname>Kim</surname><given-names>Hyung Seok</given-names></name><xref ref-type="aff" rid="aff-1">1</xref><email>hyungkim@sejong.edu</email></contrib>
<aff id="aff-1"><label>1</label><institution>Sejong University</institution>, <addr-line>Seoul, 05006</addr-line>, <country>Korea</country></aff>
<aff id="aff-2"><label>2</label><institution>Samsung Advanced Institute of Technology</institution>,<addr-line> Suwon, 16678</addr-line>, <country>Korea</country></aff>
</contrib-group>
<author-notes>
<corresp id="cor1"><label>&#x002A;</label>Corresponding Author: Hyung Seok Kim. Email: <email>hyungkim@sejong.edu</email></corresp>
</author-notes>
<pub-date date-type="collection" publication-format="electronic"><year>2023</year></pub-date>
<pub-date date-type="pub" publication-format="electronic"><day>24</day><month>1</month><year>2023</year></pub-date>
<volume>75</volume>
<issue>1</issue>
<fpage>1941</fpage>
<lpage>1961</lpage>
<history>
<date date-type="received"><day>18</day><month>8</month><year>2022</year></date>
<date date-type="accepted"><day>15</day><month>11</month><year>2022</year></date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2023 Sharifuzzaman Sagar et al.</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>Sharifuzzaman Sagar et al.</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_CMC_35360.pdf"></self-uri>
<abstract><p>Relative positioning is one of the key techniques in collaborative robotics, autonomous vehicles, and virtual/augmented reality (VR/AR) applications. Recently, ultra-wideband (UWB) has been utilized to calculate relative position because, unlike a camera, it does not require a line of sight to measure the range between two objects with centimeter-level accuracy. However, a single UWB range measurement cannot provide the relative position and attitude of a device in three dimensions (3D) because it lacks bearing information. In this paper, we propose a UWB&#x2013;IMU fusion-based relative-position system that provides an accurate relative position and attitude between wearable Internet of Things (IoT) devices in 3D. We introduce a distributed Euler angle antenna orientation that can be mounted on a mobile structure to enable relative positioning. Moving-average and min-max sample-removal preprocessing filters are introduced to reduce the standard deviation of the range measurements. The standard multilateration method is modified to calculate the relative position between mobile structures. We combine UWB and IMU measurements in a probabilistic framework that enables users to calculate the relative position between two nodes with less error. We carried out different experiments to illustrate the advantages of fusing IMU and UWB ranges for relative-positioning systems, achieving a mean accuracy of 0.31&#x2005;m for 3D relative positioning in indoor line-of-sight conditions.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>Relative position</kwd>
<kwd>UWB</kwd>
<kwd>IMU</kwd>
<kwd>trilateration</kwd>
<kwd>IoT</kwd>
<kwd>Bayesian filter</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1"><label>1</label><title>Introduction</title>
<p>Establishing the relative positions of objects is critical for various collaborative applications, including robot movement tracking, virtual/augmented reality (VR/AR), line following, and autonomous vehicles. Establishing the relative positioning between two objects is usually dependent on the deployment of fixed or mobile anchors based on Bluetooth [<xref ref-type="bibr" rid="ref-1">1</xref>], Wi-Fi [<xref ref-type="bibr" rid="ref-2">2</xref>], Global Positioning System (GPS) [<xref ref-type="bibr" rid="ref-3">3</xref>], or ultra-wideband (UWB) radio technology [<xref ref-type="bibr" rid="ref-4">4</xref>]. Most of the solutions applied in relative-position systems use range-measurement techniques. The most common range-measurement techniques are two-way ranging (TWR), time of arrival, time difference of arrival (TDOA), and angle of arrival (AOA). These range-measurement techniques use radiofrequency and acoustic signals to calculate the range between the two devices [<xref ref-type="bibr" rid="ref-5">5</xref>]. Range-measurement devices are especially appealing because they are often affordable, lightweight, computationally simple, and may be used in settings in which GPS is unavailable. This minimizes architecture and processing needs, allowing for the deployment of huge swarms of tiny, low-cost tracking devices in indoor or subterranean locations.</p>
<p>There are numerous existing approaches that seek to integrate range measurements into positioning systems. Most techniques for indoor localization have typically assumed the availability of an infrastructure consisting of four or more anchors with defined coordinates [<xref ref-type="bibr" rid="ref-6">6</xref>,<xref ref-type="bibr" rid="ref-7">7</xref>]. In [<xref ref-type="bibr" rid="ref-8">8</xref>], relative positioning was accomplished using a single anchor under the assumption of fixed motion; in other research, measurements of relative positioning have been accomplished using a range-based simultaneous localization and mapping (SLAM) technique. Other techniques use displacement data obtained from optical sensors, including infrared (IR) and visible-light communication (VLC) devices, to calculate the relative positioning of two objects. However, these methods do not provide a precise position, as IR methods suffer from interference from fluorescent light and sunlight [<xref ref-type="bibr" rid="ref-9">9</xref>], and VLC requires an environment with a clear line of sight (LOS) [<xref ref-type="bibr" rid="ref-10">10</xref>]. Moreover, the hardware and maintenance costs of these solutions are very high. Therefore, researchers have commonly opted for a radio-signal-based range-measurement approach for positioning due to the wide availability of inexpensive and lightweight sensors.</p>
<p>Most range-measurement techniques require constant relative movement between an anchor and an agent to acquire their relative positions [<xref ref-type="bibr" rid="ref-11">11</xref>&#x2013;<xref ref-type="bibr" rid="ref-14">14</xref>]. Many studies have been conducted in robotics examining relative-location measurement using vision or laser-based sensors. These techniques are restricted in their use in complex scenarios due to field-of-view problems, sensor data association, and ambient illumination issues [<xref ref-type="bibr" rid="ref-14">14</xref>]. Other researchers have used radio signal strength (RSS) techniques with wireless devices through a variety of model-based or fingerprint-based solutions to estimate an object&#x2019;s location. In contrast to the above-mentioned visual-based sensors, RSS may be used even when there is no LOS. However, the accuracy of RSS-based techniques may vary because radio propagation in an indoor environment can be affected by severe multipath and other site-specific characteristics, and fingerprinting methods require the dataset to be updated if the environment changes [<xref ref-type="bibr" rid="ref-15">15</xref>,<xref ref-type="bibr" rid="ref-16">16</xref>].</p>
<p>Recently, researchers have proposed UWB-based positioning systems that can provide very precise range measurements for both LOS and non-LOS (NLOS) environments. UWB-based positioning employs an approach using TWR to determine the range between two objects and can provide positioning precision within a few centimeters, yielding better results than RSS-based location-tracking methods. UWB-based range-measurement devices use two tags to calculate the relative positioning of objects, such as unmanned aerial vehicles (UAVs) [<xref ref-type="bibr" rid="ref-17">17</xref>] and autonomous vehicles [<xref ref-type="bibr" rid="ref-18">18</xref>].</p>
<p>Some researchers have proposed single range measurement methods to enable localization between two devices [<xref ref-type="bibr" rid="ref-19">19</xref>,<xref ref-type="bibr" rid="ref-20">20</xref>]. These single range measurement based methods involve initiating range measurement between a single anchor and a tag device and calculating the relative position using the acquired single range measurement information. However, such methods require continuous relative motion between two objects, and the relative position accuracy can deteriorate due to parallel motion [<xref ref-type="bibr" rid="ref-21">21</xref>]. Other studies have used three tags to calculate relative positions [<xref ref-type="bibr" rid="ref-22">22</xref>,<xref ref-type="bibr" rid="ref-23">23</xref>]. Both strategies retrieve two-dimensional data about the objects&#x2019; relative positions. Only a few studies have been conducted examining three-dimensional (3D) relative-positioning schemes using UWB and other complementary measurement units [<xref ref-type="bibr" rid="ref-24">24</xref>,<xref ref-type="bibr" rid="ref-25">25</xref>]. However, these solutions require the UWB anchors to be placed at fixed sites to enable real-time positioning, and massive installation of UWB anchors is costly and complicated.</p>
<p>Some researchers have proposed on-board UWB-based relative-position systems to solve the above-mentioned issue [<xref ref-type="bibr" rid="ref-26">26</xref>,<xref ref-type="bibr" rid="ref-27">27</xref>]. Although their solutions were found to achieve higher accuracy, they only considered mobile machinery in which the objects move slowly, and they did not consider standard-deviation mitigation in the range measurement, which is one of the main sources of error in UWB-based relative-position systems. Moreover, a UWB-based positioning system can only provide relative position information and not any attitude information, which is a very important factor in most collaborative tasks. An inertial navigation system (INS) can provide attitude information and compensate for positioning errors due to an NLOS environment. Inertial measurement units (IMUs) are generally used in INS-based solutions to provide positioning. However, the data from IMU sensors may be inaccurate due to the accumulation of errors; this must be mitigated or reduced by other sources, such as UWB, to enable precise positioning.</p>
<p>Therefore, a robust relative-positioning system should combine UWB and IMU sensors to enable precise positioning. The range measurement between two objects is the primary means of calculating their relative position. A UWB device can serve as the range-measurement sensor because, unlike Wi-Fi, Bluetooth, or Zigbee devices, it provides centimeter-level accuracy; moreover, recently developed UWB devices are cost-effective compared to earlier ones. Available UWB devices support several ranging methods, such as TWR, TDOA, and AOA. Among these, TWR provides more precise range measurement in a dynamic environment because it mitigates the clock-drift and time-synchronization issues found in UWB-based range-measurement methods; TWR is therefore adopted in this study. One limitation of UWB devices is the lack of pose estimation and the deterioration of range accuracy in NLOS conditions, which can be addressed with an IMU sensor. IMU sensors provide information about the displacement of the objects, which can be fused with UWB measurements to reduce the standard deviation of the relative-positioning error. An IMU also provides pose estimation, and the pose estimates from both objects can be combined to obtain the relative pose between them. Therefore, this study integrates UWB and IMU devices into a relative-positioning system for wearable IoT devices. The UWB-based range-measurement technique estimates the range between two devices; the estimated range is then filtered using moving-average and min-max filtering methods before the 3D relative position is calculated. UWB antennas are distributed on a square mobile structure using the Euler angle orientation method.
The distributed Euler angle orientation is responsible for acquiring the 3D relative position with precise accuracy. The multilateration method is modified according to the structural requirements of the proposed relative-positioning system. Moreover, a novel UWB and IMU fusion method is proposed to reduce position error and increase the update rate of the system. The main contributions of this study are as follows:
<list list-type="order">
<list-item><p>A distributed Euler angle antenna orientation is proposed to enable 3D relative positioning using a mobile structure.</p></list-item>
<list-item><p>The original multilateration technique is modified to accommodate the proposed antenna orientation.</p></list-item>
<list-item><p>The standard deviation of the UWB range measurements is mitigated using the moving average filter and min-max sample removal filter. Therefore, accurate range measurement is achieved prior to the relative position calculation.</p></list-item>
<list-item><p>A small multi-anchor mobile structure is designed that can simultaneously provide the relative position and orientation of mobile nodes without requiring any fixed anchor placement in an indoor environment. Moreover, an IoT platform is utilized to transmit the relative-position data to the Unity software for real-time position representation.</p></list-item>
<list-item><p>Real-world experiments demonstrate that the proposed system outperforms other available solutions in terms of 3D relative-position accuracy in an indoor environment.</p></list-item>
</list></p>
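<p>The TWR principle adopted above can be sketched numerically. The following Python fragment is an illustrative model of single-sided two-way ranging, not the firmware used in this study; the timestamps are assumed values.</p>

```python
# Illustrative single-sided two-way ranging (SS-TWR) model.
# The initiator measures the round-trip time t_round; the responder
# reports its turnaround time t_reply; half the difference is the
# one-way time of flight, which scales to distance by the speed of light.

C_AIR = 299_702_547.0  # approximate speed of light in air, m/s

def ss_twr_distance(t_round_s: float, t_reply_s: float) -> float:
    """Estimate the anchor-tag distance from one TWR exchange."""
    time_of_flight = (t_round_s - t_reply_s) / 2.0
    return C_AIR * time_of_flight

# Assumed timestamps giving a 20 ns one-way flight time (roughly 6 m).
d = ss_twr_distance(t_round_s=1.04e-6, t_reply_s=1.00e-6)
```

<p>Double-sided TWR variants additionally cancel relative clock drift between the two devices, which is the property that motivates choosing TWR over TDOA here.</p>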
<p>The rest of the paper is organized as follows: Section 2 discusses the related literature; Section 3 presents the preliminaries; Section 4 is dedicated to the materials and methods of the proposed solution; Section 5 analyzes the experimental results; Section 6 discusses the proposed solution in contrast to existing solutions; and Section 7 draws the conclusion.</p>
</sec>
<sec id="s2"><label>2</label><title>Related Works</title>
<p>Due to the increasing need for location-aware services, there has been a rise in research examining indoor localization over the past few years. Numerous studies have concentrated on indoor positioning using a static infrastructure. Wireless-network-based infrastructures, for instance, often include many wireless access points to provide full network coverage. Numerous off-the-shelf technologies are capable of providing RSS indicators, which can be used to estimate the position of a mobile node [<xref ref-type="bibr" rid="ref-28">28</xref>,<xref ref-type="bibr" rid="ref-29">29</xref>]. Obtaining <italic>a priori</italic> information about a structure is impractical in various situations, such as when workers are exploring disaster zones, during collaborative tasks, and for VR/AR agents. Therefore, researchers are now concentrating on relative positioning as opposed to indoor positioning. The authors of [<xref ref-type="bibr" rid="ref-30">30</xref>] described a placement technique for static sensor nodes in a sensor network. During placement, location uncertainty occurs in some nodes [<xref ref-type="bibr" rid="ref-31">31</xref>,<xref ref-type="bibr" rid="ref-32">32</xref>], and this will spread to neighboring nodes, resulting in inaccurate positioning.</p>
<p>To address these issues, a UWB-based relative-positioning scheme can be deployed, which can concurrently perform positioning and transmission with low latency. In comparison to the aforementioned conventional positioning technologies, UWB-based relative-positioning technology is much more flexible to complex scenarios and possesses all of the benefits of collaborative positioning applications. Although UWB is a kind of wireless-based positioning system, its greater temporal resolution allows it to deliver significantly more precise locations than other wireless-based solutions. UWB-based approaches have been extensively employed in transportation in recent years, but in most situations, UWB anchors must be placed beside pathways to determine the exact location of a vehicle [<xref ref-type="bibr" rid="ref-33">33</xref>,<xref ref-type="bibr" rid="ref-34">34</xref>]. However, the cost of placing UWB anchors on a large scale is high, and their installation can be very complex.</p>
<p><xref ref-type="table" rid="table-1">Table 1</xref> compares the existing solutions with the proposed method. The comparison shows that most works treated the relative range measurement alone as the relative position, while others lack precise accuracy, a mobile structure, or cost-effectiveness. It is therefore evident from this literature review that the relative-position accuracy of these approaches, for both 2D and 3D applications, is still not suitable for real-time implementation in scenarios such as VR and collaborative tasks. Moreover, most of these approaches do not provide relative attitude information between two mobile nodes, which is crucial for life-critical collaborative tasks. Therefore, we propose a UWB&#x2013;IMU fusion-based relative-position system to accurately calculate the relative position between two objects using a small mobile structure. The relative attitude is also calculated using a low-cost on-board IMU sensor.</p>
<table-wrap id="table-1"><label>Table 1</label><caption><title>The comparison of the existing relative positioning solutions and the proposed study</title></caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Reference</th>
<th align="left">Methods</th>
<th align="left">Contributions and limitations</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">[<xref ref-type="bibr" rid="ref-35">35</xref>]</td>
<td align="left">UWB (TWR)</td>
<td align="left">&#x2022; Real time Collison avoidance system using UWB wasproposed.<break/>&#x2022; The mean error of 0.75&#x2005;m was achieved.<break/>&#x2022; Only the relative distance between two objects wasconsidered as a relative position.</td>
</tr>
<tr>
<td align="left">[<xref ref-type="bibr" rid="ref-36">36</xref>]</td>
<td align="left">UWB (TWR)</td>
<td align="left">&#x2022; Virtual pedestrian traffic light using UWB wasproposed to provide distance between vehicles.<break/>&#x2022; Only range measurement was considered as the relativeposition between two vehicles.</td>
</tr>
<tr>
<td align="left">[<xref ref-type="bibr" rid="ref-37">37</xref>]</td>
<td align="left">Monocular camera&#x002B;IMU&#x002B;UWB</td>
<td align="left">&#x2022; Simultaneous localization and mapping algorithm wasproposed to reduce location accuracy.<break/>&#x2022; The 3D error of 0.602&#x2005;m was achieved in theexperiments.<break/>&#x2022; The computational complexity of the proposed devicewas high.</td>
</tr>
<tr>
<td align="left">[<xref ref-type="bibr" rid="ref-38">38</xref>]</td>
<td align="left">Camera&#x002B;IMU&#x002B;UWB</td>
<td align="left">&#x2022; Range based localization with IMU and a visual camerawere proposed to estimate unknown object location.<break/>&#x2022; Keyframe based localization method was proposed toreduce drift error.<break/>&#x2022; The computational complexity of the proposed devicewas high.</td>
</tr>
<tr>
<td align="left">[<xref ref-type="bibr" rid="ref-39">39</xref>]</td>
<td align="left">UWB&#x002B;IMU</td>
<td align="left">&#x2022; UWB and IMU fusion based method was proposed forcooperative positioning.<break/>&#x2022; A dual particle filter was utilized to reduce error andacquired 2.2&#x2005;m mean error in positioning.<break/>&#x2022; The acquired accuracy is not suitable for indoorapplications.</td>
</tr>
<tr>
<td align="left">[<xref ref-type="bibr" rid="ref-40">40</xref>]</td>
<td align="left">UWB</td>
<td align="left">&#x2022; An extended Kalman filter based relative positioningmethod was proposed using UWB devices.&#x2022; The average relative positioning error of 0.8&#x2005;m wasobserved during the experiment.&#x2022; The pose estimation was not considered in this study.</td>
</tr>
<tr>
<td align="left">[<xref ref-type="bibr" rid="ref-41">41</xref>]</td>
<td align="left">UWB&#x002B;IMU</td>
<td align="left">&#x2022; UWB and IMU fusion based on MEKF was proposedfor relative positioning.<break/>&#x2022; A mean accuracy between 40&#x2013;50&#x2005;cm was observedduring the experiment.<break/>&#x2022; A decentralized mobile structure was not considered.</td>
</tr>
<tr>
<td align="left">This study</td>
<td align="left">UWB&#x002B;IMU</td>
<td align="left">&#x2022; A novel UWB and IMU fusion for relative positioningis proposed to increase the sampling rate of positionupdates.<break/>&#x2022; A distributed Euler antenna orientation is proposed toenable mobile structure based 3D relative position.<break/>&#x2022; Moving average and min-max removing filters areproposed to mitigate the standard deviation of therange measurement.<break/>&#x2022; A modified multilateration method is proposed toenable relative positioning between two objects for theproposed antenna orientation.</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="s3"><label>3</label><title>Preliminaries</title>
<sec id="s3_1"><label>3.1</label><title>Basic Concepts</title>
<p>A schematic of the basic system architecture can be seen in <xref ref-type="fig" rid="fig-1">Fig. 1</xref>. Multiple UWB anchors are placed in a small mobile structure to accurately estimate other mobile nodes&#x2019; relative positions. There are four UWB anchors and one UWB tag on each mobile structure (node); this enables the measurement of a 3D relative position between them.</p>
<fig id="fig-1"><label>Figure 1</label><caption><title>Overall structure of the mobile units (nodes) in the proposed relative-position system. Each node is equipped with four UWB anchors and one UWB tag</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-1.tif"/></fig>
<p>The body frame of each mobile structure is denoted <inline-formula id="ieqn-1"><mml:math id="mml-ieqn-1"><mml:msub><mml:mi>C</mml:mi><mml:mrow><mml:mi>B</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, and the 3D attitude of mobile structure <italic>j</italic> at time <italic>t</italic> can be expressed in the reference frame <inline-formula id="ieqn-2"><mml:math id="mml-ieqn-2"><mml:msub><mml:mi>C</mml:mi><mml:mrow><mml:mi>R</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> using the transformation matrix <italic>T</italic>. The body frame is an orthogonal axis set aligned with the mobile structure&#x2019;s attitude; in this study, forward, right, and down correspond to the <italic>x</italic>, <italic>y</italic>, and <italic>z</italic> axes, respectively. The navigation frame is a set of coordinates fixed to the Earth&#x2019;s surface with its origin at the navigation system; the north-east-down convention is used as the navigation frame. The transformation matrix of the mobile structure is defined as:
<disp-formula id="eqn-1"><label>(1)</label><mml:math id="mml-eqn-1" display="block"><mml:msubsup><mml:mrow><mml:mi>T</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mtable columnalign="left left" rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd><mml:msubsup><mml:mrow><mml:mi>R</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msubsup></mml:mtd><mml:mtd><mml:msubsup><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msubsup></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:mn>0</mml:mn></mml:mtd><mml:mtd><mml:mn>1</mml:mn></mml:mtd></mml:mtr></mml:mtable><mml:mo>]</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:msubsup><mml:mrow><mml:mi>R</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>&#x2208;</mml:mo><mml:msup><mml:mi>M</mml:mi><mml:mrow><mml:mn>3</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mn>3</mml:mn></mml:mrow></mml:msup><mml:mo>,</mml:mo><mml:mspace width="thinmathspace" /><mml:mspace width="thinmathspace" /><mml:msubsup><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msubsup><mml:mo>&#x2208;</mml:mo><mml:msup><mml:mi>M</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msup><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-3"><mml:math id="mml-ieqn-3"><mml:msubsup><mml:mrow><mml:mi>R</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msubsup></mml:math></inline-formula> represents the rotation matrix of the mobile structure in the reference frame and <inline-formula id="ieqn-4"><mml:math 
id="mml-ieqn-4"><mml:msubsup><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:mi>j</mml:mi></mml:mrow><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msubsup></mml:math></inline-formula> represents the position of the mobile structure in the reference frame. The reference frame can be arbitrary because the attitude of the mobile structure will remain the same in the case of mapping into Euclidean space. The relative attitude can be represented as <inline-formula id="ieqn-5"><mml:math id="mml-ieqn-5"><mml:msubsup><mml:mrow><mml:mi>T</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>B</mml:mi><mml:mrow><mml:mi>j</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>B</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:msubsup></mml:math></inline-formula>, which is the transformation of the mobile structure&#x2019;s body frame to the reference frame. Therefore, the position of mobile structure <italic>k</italic> in an indoor environment can be defined as:
<disp-formula id="eqn-2"><label>(2)</label><mml:math id="mml-eqn-2" display="block"><mml:msubsup><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mrow><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:msubsup><mml:mo>=</mml:mo><mml:msubsup><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:msubsup><mml:mrow><mml:mi>R</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mi>t</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:msubsup><mml:mspace width="thinmathspace" /><mml:mo>&#x22C5;</mml:mo><mml:mspace width="thinmathspace" /><mml:msubsup><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mrow><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>B</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:msubsup><mml:mo>,</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-6"><mml:math id="mml-ieqn-6"><mml:msubsup><mml:mrow><mml:mi>P</mml:mi></mml:mrow><mml:mrow><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mrow><mml:msub><mml:mi>L</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:mrow></mml:msub></mml:mrow><mml:mrow><mml:msub><mml:mi>B</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:mrow></mml:msubsup></mml:math></inline-formula> represents the previous position of the mobile structure in its own body frame, which the rotation matrix maps into the reference frame. 
Using this method, the relative position and attitude of a mobile reference structure and another mobile structure can be related geometrically, and their relevant relative position and attitude can be estimated in the same frame.</p>
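<p>The homogeneous-transform composition in Eq. (1) can be illustrated with a short NumPy sketch. The rotation angle and translation used below are assumed example values, not data from the paper.</p>

```python
import numpy as np

def make_T(R: np.ndarray, p: np.ndarray) -> np.ndarray:
    """Build the 4x4 homogeneous transform [R p; 0 1] of Eq. (1)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = p
    return T

# Assumed example pose: 90-degree yaw and a translation of (1, 2, 0).
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
T_ref_j = make_T(Rz, np.array([1.0, 2.0, 0.0]))  # body frame j -> reference

# A point one metre ahead in j's body frame, in homogeneous coordinates:
p_body = np.array([1.0, 0.0, 0.0, 1.0])
p_ref = T_ref_j @ p_body  # rotated offset plus translation
```

<p>Composing two such transforms (matrix multiplication) yields the relative pose between two body frames, which is how the relative attitude and position are related in the same frame.</p>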
</sec>
<sec id="s3_2"><label>3.2</label><title>UWB Range Measurement Constraints</title>
<p>A UWB-based range measurement in an LOS environment can be defined as:
<disp-formula id="eqn-3"><label>(3)</label><mml:math id="mml-eqn-3" display="block"><mml:mrow><mml:mover><mml:mi>d</mml:mi><mml:mo>&#x20DB;</mml:mo></mml:mover></mml:mrow><mml:mo>=</mml:mo><mml:mi>d</mml:mi><mml:mo>+</mml:mo><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:mi>d</mml:mi><mml:mo>+</mml:mo><mml:mi>&#x03B5;</mml:mi><mml:mo>,</mml:mo></mml:math></disp-formula>where the reported distance <inline-formula id="ieqn-7"><mml:math id="mml-ieqn-7"><mml:mrow><mml:mover><mml:mi>d</mml:mi><mml:mo>&#x20DB;</mml:mo></mml:mover></mml:mrow></mml:math></inline-formula> provided by the device consists of the actual distance <italic>d</italic>, the range error due to the antenna delay <inline-formula id="ieqn-8"><mml:math id="mml-ieqn-8"><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:mi>d</mml:mi></mml:math></inline-formula>, and the standard deviation due to the nature of the UWB device <inline-formula id="ieqn-9"><mml:math id="mml-ieqn-9"><mml:mi>&#x03B5;</mml:mi></mml:math></inline-formula>. The error of a range measurement with a UWB device is mainly due to the difference in the antenna delay. Thus, this error can be minimized by calibrating the antenna delay between the anchor and the tag. The antenna can be calibrated by initiating a range measurement and manually adjusting the antenna delay until the desired range measurement is achieved. The standard deviation of the range measurement is mitigated using a moving-average filter and a min-max sample removal filter.</p>
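<p>The preprocessing mentioned above, min-max sample removal followed by a moving average, can be sketched as follows. This Python fragment is a hypothetical illustration: the window size and the exact removal rule (dropping one minimum and one maximum sample per window) are assumptions, since the section does not fix these parameters.</p>

```python
from collections import deque

def minmax_filter(window):
    """Drop the single smallest and largest sample from the window
    (assumed variant of the min-max sample removal step)."""
    if len(window) <= 2:
        return list(window)
    ordered = sorted(window)
    return ordered[1:-1]

def moving_average(samples):
    return sum(samples) / len(samples)

class RangeFilter:
    """Sliding-window preprocessing of raw UWB ranges: remove the
    extreme samples, then average the remainder."""
    def __init__(self, size=5):
        self.buf = deque(maxlen=size)

    def update(self, raw_range):
        self.buf.append(raw_range)
        return moving_average(minmax_filter(self.buf))

f = RangeFilter(size=5)
out = [f.update(r) for r in [3.02, 2.98, 3.50, 3.01, 2.55]]
# The final estimate discards the 3.50 and 2.55 outliers before averaging.
```
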
</sec>
</sec>
<sec id="s4"><label>4</label><title>Materials and Methods</title>
<p>The proposed system uses an IoT platform to transfer data from the remote device to the Unity software. <xref ref-type="fig" rid="fig-2">Fig. 2</xref> shows the overall architecture of the UWB&#x2013;IMU-based relative-position system. The UWB and IMU sensors collect data and send it to a microcontroller over the Serial Peripheral Interface and I<sup>2</sup>C protocols, respectively. The Qorvo DWS1000 UWB module and the InvenSense MPU-9250 IMU module were selected because they are inexpensive and more accurate than other modules available at a similar price. The microcontroller collects the data and performs TWR to determine the distance between the two objects; data processing is also carried out before the relative position is calculated. The STM32-based Nucleo F429ZI microcontroller development board was used, following Decawave's instructions. The acquired range measurements are then sent over a universal asynchronous receiver-transmitter (UART) link to a Raspberry Pi development board, which calculates the relative position by fusing the UWB and IMU data. The relative-position data is then sent to the base station via the hypertext transfer protocol (HTTP) and rendered in the Unity software, which serves as a VR platform to represent the relative position between the two objects in an indoor environment. In this work, a laptop served as the base station to collect and display the relative-position data. The UWB&#x2013;IMU fusion-based relative-position system can be divided into two parts: a UWB-based relative-position system and a UWB&#x2013;IMU fusion-based relative-position system. Detailed descriptions of both are presented in this section.</p>
<fig id="fig-2"><label>Figure 2</label><caption><title>Overall architecture of the UWB&#x2013;IMU-based relative-position system</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-2.tif"/></fig>
<p><xref ref-type="fig" rid="fig-3">Fig. 3</xref> shows the proposed UWB&#x2013;IMU fusion method that is used to calculate the relative positioning of the two mobile nodes. The range measurement from the UWB devices and the IMU information from the two nodes are fused to perform this calculation. Then, an extended Kalman filter is used to process the calculated relative position, and a filtered relative position is obtained.</p>
<fig id="fig-3"><label>Figure 3</label><caption><title>Overall architecture of UWB&#x2013;IMU fusion-based 3D relative-position system</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-3.tif"/></fig>
<p>The IMU can provide displacement information about a device without relying on other devices. However, as noted, IMU measurements suffer from errors that must be minimized before the IMU information is fused with that from the UWB devices. An extended Kalman filter is used to mitigate the Euler-angle and gyro-bias errors of the IMU. The filtered attitude from the IMU and the relative position acquired from the UWB system are then sent to the relative position and attitude estimation unit to be fused. The device&#x2019;s attitude (<italic>&#x03D5;</italic>, <italic>&#x03B8;</italic>, and <italic>&#x03C8;</italic>) is calculated according to the north-east-down convention.</p>
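As one illustration of the north-east-down attitude computation, the sketch below derives (&#x03D5;, &#x03B8;, &#x03C8;) from raw accelerometer and magnetometer readings using the common tilt-compensation formulas; the axis conventions and sensor alignment are assumptions, not taken from the paper.

```python
import math

# Hedged sketch: Euler angles (phi, theta, psi) in the north-east-down
# frame from accelerometer and magnetometer readings, one common way to
# initialize the attitude described in the text. The sensor axes are
# assumed to be already aligned with the body frame.
def ned_attitude(ax, ay, az, mx, my, mz):
    phi = math.atan2(ay, az)                        # roll
    theta = math.atan2(-ax, math.hypot(ay, az))     # pitch
    # tilt-compensate the magnetometer before taking the heading
    mxh = mx * math.cos(theta) + mz * math.sin(theta)
    myh = (mx * math.sin(phi) * math.sin(theta) + my * math.cos(phi)
           - mz * math.sin(phi) * math.cos(theta))
    psi = math.atan2(-myh, mxh)                     # yaw (heading)
    return phi, theta, psi
```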
<sec id="s4_1"><label>4.1</label><title>UWB Relative-Position System</title>
<p>The proposed method uses UWB-based range measurements from four anchors and one tag to calculate the relative position between two objects. The overall concept of this system is presented in <xref ref-type="fig" rid="fig-4">Fig. 4</xref>. Range measurements are taken between the four anchors of one node and the single tag of another node. The UWB range-measurement system uses a TWR technique to acquire the range between each anchor and the tag. The acquired range measurements are then sent to data-processing algorithms to filter and stabilize the data; moving-average and min-max sample removal filters are used. The processed data are then used to calculate the 3D relative position of the nodes. A modified multilateration technique with a novel antenna orientation is used to calculate the 3D relative position with reasonable accuracy. Although available positioning algorithms achieve good accuracy, the position error cannot be eliminated entirely, so a Kalman filter was developed for the proposed method to reduce it.</p>
<fig id="fig-4"><label>Figure 4</label><caption><title>Proposed 3D relative-position measurement using UWB</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-4.tif"/></fig>
<sec id="s4_1_1"><label>4.1.1</label><title>Range Measurement</title>
<p>UWB devices offer many approaches for determining the distance between mobile nodes, including TDOA, TWR, and AOA. In our proposed system, the asymmetric double-sided TWR method is used to calculate the range between two nodes by exchanging four messages [<xref ref-type="bibr" rid="ref-18">18</xref>]. The deployed ranging approach lowers the error due to clock drift on both nodes, and it does not require matching reply delays. <xref ref-type="fig" rid="fig-5">Fig. 5</xref> shows a complete range cycle being used to derive the range between two nodes. The tag starts the process and collects all the timestamps with the fourth message to calculate the range. The UWB device&#x2019;s antenna delay is also calibrated according to the instructions in the Decawave UWB chip user manual. The relevant equations are:
<disp-formula id="eqn-4"><label>(4)</label><mml:math id="mml-eqn-4" display="block"><mml:msub><mml:mrow><mml:mover><mml:mi>T</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mrow><mml:mtext>prop</mml:mtext></mml:mrow></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>T</mml:mi><mml:mrow><mml:mrow><mml:mtext>round</mml:mtext></mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x00D7;</mml:mo><mml:msub><mml:mi>T</mml:mi><mml:mrow><mml:mrow><mml:mtext>round</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>T</mml:mi><mml:mrow><mml:mrow><mml:mtext>reply</mml:mtext></mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x00D7;</mml:mo><mml:msub><mml:mi>T</mml:mi><mml:mrow><mml:mrow><mml:mtext>reply</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>T</mml:mi><mml:mrow><mml:mrow><mml:mtext>round</mml:mtext></mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>T</mml:mi><mml:mrow><mml:mrow><mml:mtext>round</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>T</mml:mi><mml:mrow><mml:mrow><mml:mtext>reply</mml:mtext></mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>T</mml:mi><mml:mrow><mml:mrow><mml:mtext>reply</mml:mtext></mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow></mml:mfrac><mml:mo>,</mml:mo></mml:math></disp-formula>
<disp-formula id="eqn-5"><label>(5)</label><mml:math id="mml-eqn-5" display="block"><mml:mi>d</mml:mi><mml:mo>=</mml:mo><mml:msub><mml:mrow><mml:mover><mml:mi>T</mml:mi><mml:mo stretchy="false">&#x005E;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mrow><mml:mtext>prop</mml:mtext></mml:mrow></mml:mrow></mml:msub><mml:mo>&#x00D7;</mml:mo><mml:mi>c</mml:mi><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<fig id="fig-5"><label>Figure 5</label><caption><title>Double-sided TWR used in our proposed system</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-5.tif"/></fig>
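Eqs. (4) and (5) can be sketched directly in code; the four timestamps are assumed to have already been converted from device ticks to seconds.

```python
# Hedged sketch of Eqs. (4)-(5): asymmetric double-sided TWR, which
# cancels first-order clock drift and needs no matching reply delays.
C = 299_792_458.0  # speed of light, m/s

def ds_twr_range(t_round1, t_reply1, t_round2, t_reply2):
    """Estimate the propagation time from the two round-trip/reply
    pairs (Eq. 4) and convert it to a range in metres (Eq. 5)."""
    t_prop = ((t_round1 * t_round2 - t_reply1 * t_reply2)
              / (t_round1 + t_round2 + t_reply1 + t_reply2))
    return t_prop * C
```

In the drift-free case, where each round-trip time equals twice the propagation time plus the reply delay, the formula recovers the propagation time exactly.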
</sec>
<sec id="s4_1_2"><label>4.1.2</label><title>Min&#x2013;Max Removal and Moving-Average Filter</title>
<p>As noted, the estimated range measurement from the UWB device is also affected by the standard-deviation problem, which makes positioning very unstable. To mitigate this problem, we used min-max sample removal followed by a moving-average filter. The update rate of the UWB range samples is 30&#x2005;Hz. Thus, 30 samples are collected in a list to find the maximum and minimum values, and these values are removed from the list prior to it being sent to the moving-average filter.</p>
<p>Moving-average filters are widely used to regulate many types of collected data and signals; they take <italic>M</italic> input samples at a time and average them to get a single output point. The smoothness of the output increases as the filter length grows, and any sharp modulations in the data become progressively flatter. Initially, 30 samples are taken to smooth the range measurement, but after the min-max sample removal, 28 samples are used to perform the moving-average filter. The moving average (MA) is:
<disp-formula id="eqn-6"><label>(6)</label><mml:math id="mml-eqn-6" display="block"><mml:mrow><mml:mtext>MA</mml:mtext></mml:mrow><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msub><mml:mrow><mml:mover><mml:mi>d</mml:mi><mml:mo>&#x20DB;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mrow><mml:mover><mml:mi>d</mml:mi><mml:mo>&#x20DB;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:mo>&#x2026;</mml:mo><mml:mo>+</mml:mo><mml:msub><mml:mrow><mml:mover><mml:mi>d</mml:mi><mml:mo>&#x20DB;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:mrow><mml:mi>n</mml:mi></mml:mfrac><mml:mo>,</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-10"><mml:math id="mml-ieqn-10"><mml:msub><mml:mrow><mml:mover><mml:mi>d</mml:mi><mml:mo>&#x20DB;</mml:mo></mml:mover></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> represents the estimated range-measurement samples, and <italic>n</italic> represents the sample number.</p>
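The min-max sample removal followed by the moving average of Eq. (6) can be sketched as a per-window filter, here applied to one 30-sample window collected at the 30 Hz update rate.

```python
# Hedged sketch of the data-processing step: drop the minimum and
# maximum of each window, then average the remaining samples (Eq. 6).
# With the paper's 30-sample windows, 28 samples enter the average.
def filtered_range(samples):
    if len(samples) < 3:
        raise ValueError("need at least 3 samples")
    trimmed = sorted(samples)[1:-1]       # min-max sample removal
    return sum(trimmed) / len(trimmed)    # moving average, Eq. (6)
```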
</sec>
<sec id="s4_1_3"><label>4.1.3</label><title>UWB-Based Relative Position</title>
<p>Each node has four anchors at a fixed distance, and a tag is placed between them. In UWB-based devices, the antenna orientation affects the positioning performance, and researchers have found that a vertical orientation yields better results than a horizontal orientation [<xref ref-type="bibr" rid="ref-42">42</xref>]. Therefore, we placed the antennas of the four anchors in a vertical orientation, with the antennas of anchors 3 and 4 rotated by 90 degrees. <xref ref-type="fig" rid="fig-6">Fig. 6</xref> shows a diagram of this arrangement. The equations required to calculate the relative position using the four anchors are:
<disp-formula id="eqn-7"><label>(7)</label><mml:math id="mml-eqn-7" display="block"><mml:msup><mml:mi>x</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mi>y</mml:mi><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>k</mml:mi><mml:mi>y</mml:mi></mml:mrow><mml:mn>2</mml:mn></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mi>z</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>,</mml:mo></mml:math></disp-formula>
<disp-formula id="eqn-8"><label>(8)</label><mml:math id="mml-eqn-8" display="block"><mml:msup><mml:mi>x</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mi>y</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mfrac><mml:mrow><mml:mi>k</mml:mi><mml:mi>y</mml:mi></mml:mrow><mml:mn>2</mml:mn></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mi>z</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>,</mml:mo></mml:math></disp-formula>
<disp-formula id="eqn-9"><label>(9)</label><mml:math id="mml-eqn-9" display="block"><mml:msup><mml:mi>x</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mi>y</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mi>z</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mfrac><mml:mrow><mml:mi>k</mml:mi><mml:mi>z</mml:mi></mml:mrow><mml:mn>2</mml:mn></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>,</mml:mo></mml:math></disp-formula>
<disp-formula id="eqn-10"><label>(10)</label><mml:math id="mml-eqn-10" display="block"><mml:msup><mml:mi>x</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mi>y</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>+</mml:mo><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mi>z</mml:mi><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>k</mml:mi><mml:mi>z</mml:mi></mml:mrow><mml:mn>2</mml:mn></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>=</mml:mo><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>4</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>,</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-11"><mml:math id="mml-ieqn-11"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-12"><mml:math id="mml-ieqn-12"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, <inline-formula id="ieqn-13"><mml:math id="mml-ieqn-13"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula>, and <inline-formula id="ieqn-14"><mml:math id="mml-ieqn-14"><mml:msub><mml:mi>d</mml:mi><mml:mrow><mml:mn>4</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> are the range measurements from the four anchor devices. The above equations are formulated to calculate the relative position to accommodate the anchor placement on the device. These equations are then solved to obtain the other agent&#x2019;s <italic>x</italic>, <italic>y</italic>, and <italic>z</italic> coordinates in three dimensions. The following equations are the solution of <xref ref-type="disp-formula" rid="eqn-7">(7)</xref>&#x2013;<xref ref-type="disp-formula" rid="eqn-10">(10)</xref>:
<disp-formula id="eqn-11"><label>(11)</label><mml:math id="mml-eqn-11" display="block"><mml:mi>y</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow><mml:mrow><mml:mn>4</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mi>k</mml:mi><mml:mi>y</mml:mi><mml:mrow><mml:mo>/</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:mfrac><mml:mo>,</mml:mo></mml:math></disp-formula>
<disp-formula id="eqn-12"><label>(12)</label><mml:math id="mml-eqn-12" display="block"><mml:mi>z</mml:mi><mml:mo>=</mml:mo><mml:mfrac><mml:mrow><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>4</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>3</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup></mml:mrow><mml:mrow><mml:mn>4</mml:mn><mml:mo>&#x00D7;</mml:mo><mml:mi>k</mml:mi><mml:mi>z</mml:mi><mml:mrow><mml:mo>/</mml:mo></mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:mfrac><mml:mo>,</mml:mo></mml:math></disp-formula>
<disp-formula id="eqn-13"><label>(13)</label><mml:math id="mml-eqn-13" display="block"><mml:mi>x</mml:mi><mml:mo>=</mml:mo><mml:msqrt><mml:msubsup><mml:mrow><mml:mi>d</mml:mi></mml:mrow><mml:mrow><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msubsup><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mrow><mml:mo>(</mml:mo><mml:mi>y</mml:mi><mml:mo>+</mml:mo><mml:mfrac><mml:mrow><mml:mi>k</mml:mi><mml:mi>y</mml:mi></mml:mrow><mml:mn>2</mml:mn></mml:mfrac><mml:mo>)</mml:mo></mml:mrow><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup><mml:mo>&#x2212;</mml:mo><mml:msup><mml:mi>z</mml:mi><mml:mrow><mml:mn>2</mml:mn></mml:mrow></mml:msup></mml:msqrt><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<fig id="fig-6"><label>Figure 6</label><caption><title>The proposed architecture of the 3D relative-position system using UWB between two devices</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-6.tif"/></fig>
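A minimal sketch of the closed-form solution, Eqs. (11)&#x2013;(13), with ky and kz denoting the anchor baselines along the y- and z-axes of the anchor node:

```python
import math

# Hedged sketch of Eqs. (11)-(13): closed-form relative position of the
# tag from the four anchor ranges d1..d4. Anchors 1/2 are offset by
# -ky/2 and +ky/2 on the y-axis, anchors 3/4 by +kz/2 and -kz/2 on z.
def relative_position(d1, d2, d3, d4, ky, kz):
    y = (d1**2 - d2**2) / (2 * ky)              # Eq. (11)
    z = (d4**2 - d3**2) / (2 * kz)              # Eq. (12)
    x2 = d1**2 - (y + ky / 2)**2 - z**2         # inside Eq. (13)
    return math.sqrt(max(x2, 0.0)), y, z        # clamp noise below zero
```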
</sec>
</sec>
<sec id="s4_2"><label>4.2</label><title>UWB&#x2013;IMU Fusion Relative-Position System</title>
<sec id="s4_2_1"><label>4.2.1</label><title>Fusion Method</title>
<p>The relative positions acquired from the UWB measurements and the attitude acquired from the IMU are sent to the relative position and attitude estimation unit, where the data is fused using an extended Kalman filter. The extended Kalman filter can be applied to nonlinear problems and is widely used to estimate the positions and attitudes of objects. The relative position and attitude at time <italic>k</italic> can be defined as state <inline-formula id="ieqn-15"><mml:math id="mml-ieqn-15"><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula>, which can be calculated as a function of the state <inline-formula id="ieqn-16"><mml:math id="mml-ieqn-16"><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> at the (<italic>k</italic>&#x2212;1)th timestep:
<disp-formula id="eqn-14"><label>(14)</label><mml:math id="mml-eqn-14" display="block"><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-17"><mml:math id="mml-ieqn-17"><mml:msub><mml:mi>f</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> is a nonlinear function. The measurement model can be defined using the measurement <inline-formula id="ieqn-18"><mml:math id="mml-ieqn-18"><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> obtained at state <inline-formula id="ieqn-19"><mml:math id="mml-ieqn-19"><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> with measurement noise <italic>n</italic> at time <italic>k</italic> using:
<disp-formula id="eqn-15"><label>(15)</label><mml:math id="mml-eqn-15" display="block"><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>,</mml:mo><mml:msub><mml:mi>n</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>,</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-20"><mml:math id="mml-ieqn-20"><mml:msub><mml:mi>h</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub></mml:math></inline-formula> can be a linear or nonlinear function. Based on the previous mathematical steps, the state vector can be defined using:
<disp-formula id="eqn-16"><label>(16)</label><mml:math id="mml-eqn-16" display="block"><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>V</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>,</mml:mo></mml:math></disp-formula>where <inline-formula id="ieqn-21"><mml:math id="mml-ieqn-21"><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> is the transition matrix and <inline-formula id="ieqn-22"><mml:math id="mml-ieqn-22"><mml:msub><mml:mi>V</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub></mml:math></inline-formula> is the random distribution vector. The prediction of the error covariance of the extended Kalman filter can be defined as:
<disp-formula id="eqn-17"><label>(17)</label><mml:math id="mml-eqn-17" display="block"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>F</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>F</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow><mml:mrow><mml:mrow><mml:mtext>T</mml:mtext></mml:mrow></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:msub><mml:mi>Q</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<p>The Kalman gain can be calculated using:
<disp-formula id="eqn-18"><label>(18)</label><mml:math id="mml-eqn-18" display="block"><mml:msub><mml:mi>K</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>H</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mtext>T</mml:mtext></mml:mrow></mml:mrow></mml:msubsup><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>H</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:msubsup><mml:mrow><mml:mi>H</mml:mi></mml:mrow><mml:mrow><mml:mi>k</mml:mi></mml:mrow><mml:mrow><mml:mrow><mml:mtext>T</mml:mtext></mml:mrow></mml:mrow></mml:msubsup><mml:mo>+</mml:mo><mml:msub><mml:mi>R</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>]</mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<p>The state vector and error covariance are then updated at time <italic>k</italic> using:
<disp-formula id="eqn-19"><label>(19)</label><mml:math id="mml-eqn-19" display="block"><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>+</mml:mo><mml:msub><mml:mi>K</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>X</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:mi>H</mml:mi><mml:mrow><mml:mo>(</mml:mo><mml:msub><mml:mi>S</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>)</mml:mo></mml:mrow><mml:mo>]</mml:mo></mml:mrow><mml:mo>,</mml:mo></mml:math></disp-formula>
<disp-formula id="eqn-20"><label>(20)</label><mml:math id="mml-eqn-20" display="block"><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mo>=</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>K</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:mrow><mml:mo>[</mml:mo><mml:msub><mml:mi>H</mml:mi><mml:mrow><mml:mi>k</mml:mi></mml:mrow></mml:msub><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mi>k</mml:mi><mml:mo>,</mml:mo><mml:mi>k</mml:mi><mml:mo>&#x2212;</mml:mo><mml:mn>1</mml:mn></mml:mrow></mml:msub><mml:mo>]</mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:math></disp-formula></p>
<p>The relative position and attitude are then fused to calculate the final relative position and attitude between two mobile nodes using an extended Kalman filter. The state variable <italic>s</italic> of the extended filter consists of the relative position from the IMU <inline-formula id="ieqn-23"><mml:math id="mml-ieqn-23"><mml:msup><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula>, the velocity <inline-formula id="ieqn-24"><mml:math id="mml-ieqn-24"><mml:msup><mml:mi>V</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula>, and attitude <inline-formula id="ieqn-25"><mml:math id="mml-ieqn-25"><mml:msubsup><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msubsup></mml:math></inline-formula>:</p>
<p>State variable, <inline-formula id="ieqn-26"><mml:math id="mml-ieqn-26"><mml:mi>s</mml:mi><mml:mo>=</mml:mo><mml:msup><mml:mrow><mml:mo>[</mml:mo><mml:mtable columnalign="left left left" rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd><mml:mi>&#x03B4;</mml:mi><mml:msup><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msup></mml:mtd><mml:mtd><mml:mi>&#x03B4;</mml:mi><mml:msup><mml:mi>V</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msup></mml:mtd><mml:mtd><mml:mi>&#x03B4;</mml:mi><mml:msubsup><mml:mrow><mml:mi>B</mml:mi></mml:mrow><mml:mrow><mml:mi>f</mml:mi></mml:mrow><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msubsup></mml:mtd></mml:mtr></mml:mtable><mml:mo>]</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mtext>T</mml:mtext></mml:mrow></mml:mrow></mml:msup><mml:mo>.</mml:mo></mml:math></inline-formula></p>
<p>The state transition matrix of the fusion method is defined as a 3&#x2009;&#x00D7;&#x2009;3 block matrix composed of identity matrices, zero matrices, and the measurement time step:</p>
<p>State transition matrix, <inline-formula id="ieqn-27"><mml:math id="mml-ieqn-27"><mml:mi>F</mml:mi><mml:mo>=</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mtable columnalign="left center center" rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd><mml:msub><mml:mi>I</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:mi>t</mml:mi></mml:mtd><mml:mtd><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:msub><mml:mi>I</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:mi mathvariant="normal">&#x0394;</mml:mi><mml:mi>t</mml:mi></mml:mtd></mml:mtr><mml:mtr><mml:mtd><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:msub><mml:mi>I</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd></mml:mtr></mml:mtable><mml:mo>]</mml:mo></mml:mrow><mml:mo>.</mml:mo></mml:math></inline-formula></p>
<p>Two measurement values are defined to update the position and velocity of the positioning module. The first measurement value is the relative position from the UWB system subtracted from the relative position of the IMU. The second is initialized with the velocity acquired from the IMU sensors.</p>
<p>Measurement, <inline-formula id="ieqn-28"><mml:math id="mml-ieqn-28"><mml:mi>x</mml:mi><mml:mo>=</mml:mo><mml:msup><mml:mrow><mml:mo>[</mml:mo><mml:msup><mml:mi>P</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msup><mml:mo>&#x2212;</mml:mo><mml:msub><mml:mi>P</mml:mi><mml:mrow><mml:mrow><mml:mtext>UWB</mml:mtext></mml:mrow></mml:mrow></mml:msub><mml:mo>]</mml:mo></mml:mrow><mml:mrow><mml:mrow><mml:mtext>T</mml:mtext></mml:mrow></mml:mrow></mml:msup></mml:math></inline-formula> and <inline-formula id="ieqn-29"><mml:math id="mml-ieqn-29"><mml:mi>x</mml:mi><mml:mo>=</mml:mo><mml:msup><mml:mi>V</mml:mi><mml:mrow><mml:mi>n</mml:mi></mml:mrow></mml:msup></mml:math></inline-formula>.</p>
<p><italic>H</italic> is defined as two measurement matrices to calculate the Kalman gain and the update of the measurement from the positioning system. The measurement matrix consists of identity and zero matrices:</p>
<p>Measurement matrix, <inline-formula id="ieqn-30"><mml:math id="mml-ieqn-30"><mml:mi>H</mml:mi><mml:mo>=</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mtable columnalign="left left left" rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd><mml:msub><mml:mi>I</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd></mml:mtr></mml:mtable><mml:mo>]</mml:mo></mml:mrow></mml:math></inline-formula> and <inline-formula id="ieqn-31"><mml:math id="mml-ieqn-31"><mml:mi>H</mml:mi><mml:mo>=</mml:mo><mml:mrow><mml:mo>[</mml:mo><mml:mtable columnalign="left left left" rowspacing="4pt" columnspacing="1em"><mml:mtr><mml:mtd><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:msub><mml:mi>I</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd><mml:mtd><mml:msub><mml:mi>O</mml:mi><mml:mrow><mml:mn>3</mml:mn></mml:mrow></mml:msub></mml:mtd></mml:mtr></mml:mtable><mml:mo>]</mml:mo></mml:mrow></mml:math></inline-formula>.</p>
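One predict-update cycle of the fusion filter, Eqs. (16)&#x2013;(20), can be sketched with the block matrices above. The 9-dimensional state ordering [position, velocity, attitude] and the noise values in the usage example are assumptions, not values from the paper.

```python
import numpy as np

# Hedged sketch of the loosely coupled update, Eqs. (16)-(20), over a
# 9-state vector [dP, dV, dB] with the block transition matrix F and
# the two measurement matrices H from the text.
def make_F(dt):
    I, O = np.eye(3), np.zeros((3, 3))
    return np.block([[I, dt * I, O],
                     [O, I, dt * I],
                     [O, O, I]])

H_POS = np.hstack([np.eye(3), np.zeros((3, 6))])                  # [I3 O3 O3]
H_VEL = np.hstack([np.zeros((3, 3)), np.eye(3), np.zeros((3, 3))])  # [O3 I3 O3]

def ekf_step(s, P, x, dt, Q, R, H):
    F = make_F(dt)
    s_pred = F @ s                                            # Eq. (16)
    P_pred = F @ P @ F.T + Q                                  # Eq. (17)
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)    # Eq. (18)
    s_new = s_pred + K @ (x - H @ s_pred)                     # Eq. (19)
    P_new = P_pred - K @ H @ P_pred                           # Eq. (20)
    return s_new, P_new
```

In use, the position measurement is the UWB-minus-IMU relative position with `H_POS`, and the velocity measurement uses `H_VEL`, matching the two measurement matrices defined above.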
</sec>
<sec id="s4_2_2"><label>4.2.2</label><title>Data Transmission</title>
<p>The acquired relative position is transferred to a base station, in this case a laptop, to be represented in the Unity software with avatars. In this study, Wi-Fi was used to transmit the data to the laptop via HTTP. The data was first stored on a Raspberry Pi and then aggregated and encoded into a data frame comprising the device ID, 3D position value, and 3D attitude value. The Unity VR simulation platform was programmed so that the relative position could be demonstrated in real time as avatar-to-avatar relative positioning.</p>
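A minimal sketch of the data-frame encoding before the HTTP transfer; the JSON field names are illustrative assumptions, not taken from the paper.

```python
import json

# Hedged sketch of the transmitted data frame described in the text:
# device ID plus 3D position and 3D attitude, serialized as JSON to
# form the HTTP request body sent to the base station.
def encode_frame(device_id, position, attitude):
    frame = {
        "id": device_id,
        "pos": {"x": position[0], "y": position[1], "z": position[2]},
        "att": {"roll": attitude[0], "pitch": attitude[1],
                "yaw": attitude[2]},
    }
    return json.dumps(frame).encode("utf-8")   # body for the HTTP POST
```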
</sec>
</sec>
</sec>
<sec id="s5"><label>5</label><title>Experimental Results</title>
<p>An extensive experimental study was conducted to evaluate the performance of the proposed UWB&#x2013;IMU fusion-based relative-position scheme. This section presents experimental results and relevant plots to show the potential of fusing UWB and IMU to provide a smooth relative position.</p>
<sec id="s5_1"><label>5.1</label><title>Experimental Environment and Setup</title>
<p>All experiments were performed in an indoor environment with two mobile nodes. Decawave DW1000 wireless transceiver chips were used, as these devices are inexpensive and can be programmed for development. Researchers have conducted extensive experiments on different UWB devices, such as Decawave, BeSpoon, and Ubisense, in indoor localization scenarios and found that the Decawave UWB device outperforms the others [<xref ref-type="bibr" rid="ref-43">43</xref>]. Therefore, Decawave DW1000 UWB chips were used throughout the study to acquire range measurements. The anchor height was 12&#x2005;cm, and the width was 65&#x2005;cm. The anchors were placed vertically relative to the tag: the first two anchors were placed vertically at a 90-degree angle, and the last two anchors were rotated 90 degrees to the left. InvenSense MPU-9250 nine-axis IMU sensors were used to provide the acceleration, gyroscope, and magnetic information. Both stationary and moving experiments were conducted, and the results were analyzed by calculating the root-mean-square error (RMSE).</p>
</sec>
<sec id="s5_2"><label>5.2</label><title>UWB-Based Relative-Position Experiments</title>
<p>For the experiments, two scenarios were considered in which two mobile nodes were placed 5&#x2005;m apart: in scenario 1, one of the nodes was moved 50&#x2005;cm in the <italic>x</italic> direction; in scenario 2, it was moved 50&#x2005;cm in the <italic>y</italic> direction. The initial position of the anchor mobile node was fixed at (0, 0, 0), and the target mobile node was initially fixed at (5, 0, 0).</p>
<p><xref ref-type="fig" rid="fig-7">Fig. 7</xref> illustrates the results of scenario 1, in which the mobile node was moved 50&#x2005;cm in the <italic>x</italic> direction. The target node&#x2019;s relative position was acquired with respect to the anchor node. It can be seen from the plot that the mobile node was stable for the initial few seconds prior to moving. It should be noted that the calculated position of the mobile node fluctuated between 0 and 20&#x2005;cm on the <italic>y</italic>-axis and between &#x2212;10 and &#x2212;20&#x2005;cm on the <italic>z</italic>-axis, even though there was no actual movement in these directions. The movement of the target node by 50&#x2005;cm on the <italic>x</italic>-axis can be seen in the upper plot. The new position should correspond to 450&#x2005;cm, but the mobile node&#x2019;s calculated relative position varied between 450 and 470&#x2005;cm. Furthermore, the mobile node&#x2019;s relative <italic>y</italic>-axis and <italic>z</italic>-axis positions continued to vary throughout the experiment.</p>
<fig id="fig-7"><label>Figure 7</label><caption><title>Relative position estimation using the UWB module with a 50&#x2005;cm movement on the <italic>x</italic>-axis (from top to bottom, the plots show positions in the <italic>x</italic>, <italic>y</italic>, and <italic>z</italic> directions, respectively)</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-7.tif"/></fig>
<p><xref ref-type="table" rid="table-2">Table 2</xref> shows the RMSE of the 3D position from the UWB method for scenario 1. The mobile node was stationary for the initial 3000 samples, and the RMSE values were calculated using those samples. The RMSE values were then recalculated using the samples acquired after the node had been moved.</p>
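<p>The stationary/moving RMSE split described above can be sketched as follows, assuming per-sample 3D position estimates and known reference positions for the two phases:</p>

```python
import numpy as np

def rmse(estimates, reference):
    """Per-axis RMSE (same units as the input) of Nx3 position estimates."""
    err = np.asarray(estimates) - np.asarray(reference)
    return np.sqrt(np.mean(err ** 2, axis=0))

def split_rmse(positions, ref_stationary, ref_moving, n_stationary=3000):
    """RMSE over the first n_stationary samples, then over the remainder."""
    pos = np.asarray(positions)
    return (rmse(pos[:n_stationary], ref_stationary),
            rmse(pos[n_stationary:], ref_moving))
```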
<table-wrap id="table-2"><label>Table 2</label><caption><title>RMSE (in cm) of the UWB-based system for scenario 1 (50&#x2005;cm movement on the <italic>x</italic>-axis)</title></caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Scenario</th>
<th align="left">X</th>
<th align="left">Y</th>
<th align="left">Z</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Stationary</td>
<td align="left">7.756</td>
<td align="left">19.147</td>
<td align="left">21.024</td>
</tr>
<tr>
<td align="left">Moving</td>
<td align="left">20.773</td>
<td align="left">27.648</td>
<td align="left">26.353</td>
</tr>
</tbody>
</table>
</table-wrap>
<p><xref ref-type="fig" rid="fig-8">Fig. 8</xref> illustrates the results of scenario 2, in which the mobile node was moved 50&#x2005;cm in the <italic>y</italic> direction. The target mobile node&#x2019;s relative position was again acquired with respect to the anchor node. It can be seen from the plot that both nodes were kept stationary for the initial few seconds prior to moving. Again, it can be noted that the measured <italic>x</italic>-axis position fluctuated between 490 and 500&#x2005;cm, and the <italic>y</italic>-axis position fluctuated between &#x2212;15 and &#x2212;35&#x2005;cm, even though there was no movement. The movement of the target node by 50&#x2005;cm on the <italic>y</italic>-axis can be seen in the central plot. The new position should correspond to 50&#x2005;cm, but the mobile node&#x2019;s calculated relative position varied between 45 and 65&#x2005;cm. Furthermore, the mobile node&#x2019;s relative <italic>x</italic>-axis and <italic>z</italic>-axis positions again continued to vary throughout the experiment. As with scenario 1, <xref ref-type="table" rid="table-3">Table 3</xref> shows the RMSE of the 3D position from the UWB method for scenario 2.</p>
<fig id="fig-8"><label>Figure 8</label><caption><title>Relative position estimation using the UWB module with a 50&#x2005;cm movement on the <italic>y</italic>-axis (from top to bottom, the plots show positions in the <italic>x</italic>, <italic>y</italic>, and <italic>z</italic> directions, respectively)</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-8.tif"/></fig><table-wrap id="table-3"><label>Table 3</label><caption><title>RMSE (in cm) of the UWB-based system for scenario 2 (50&#x2005;cm movement on the <italic>y</italic>-axis)</title></caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Scenario</th>
<th align="left">X</th>
<th align="left">Y</th>
<th align="left">Z</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Stationary</td>
<td align="left">8.325</td>
<td align="left">16.546</td>
<td align="left">19.621</td>
</tr>
<tr>
<td align="left">Moving</td>
<td align="left">11.457</td>
<td align="left">29.324</td>
<td align="left">28.324</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="s5_3"><label>5.3</label><title>UWB&#x2013;IMU-Based Relative-Position Experiments</title>
<p>This section presents the UWB&#x2013;IMU fusion experimental results and relevant plots to show the potential of fusing UWB and IMU to provide a smooth relative position. As with the experiments in the previous section, two scenarios were considered. Two mobile nodes were again placed 5&#x2005;m apart: in scenario 1, one of the nodes was moved 50&#x2005;cm in the <italic>x</italic> direction; in scenario 2, it was moved 50&#x2005;cm in the <italic>y</italic> direction. Again, the initial position of the anchor mobile node was fixed at (0, 0, 0), and the target mobile node was initially fixed at (5, 0, 0). The UWB position measurement frequency was 1&#x2005;Hz, and the IMU sensor measurement frequency was 50&#x2005;Hz.</p>
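<p>A loosely coupled loop of this kind predicts with the IMU at 50&#x2005;Hz and corrects with a UWB fix at 1&#x2005;Hz. The following one-dimensional constant-acceleration sketch illustrates the idea only; the state model and noise values are assumptions, not the paper&#x2019;s exact filter:</p>

```python
import numpy as np

DT_IMU = 1.0 / 50.0   # IMU prediction rate (50 Hz)
UWB_EVERY = 50        # one UWB position fix per 50 IMU samples (1 Hz)

def predict(x, P, acc, dt, q=1e-3):
    """IMU-driven prediction for a 1D [position, velocity] state."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])        # acceleration enters as a control input
    x = F @ x + B * acc
    P = F @ P @ F.T + q * np.eye(2)
    return x, P

def update(x, P, z_pos, r=0.05**2):
    """UWB position update (measurement noise r is an assumed value)."""
    H = np.array([[1.0, 0.0]])
    S = H @ P @ H.T + r
    K = (P @ H.T) / S
    x = x + (K * (z_pos - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

<p>In a run loop, <monospace>predict</monospace> would be called for every IMU sample and <monospace>update</monospace> once every <monospace>UWB_EVERY</monospace> samples, matching the 50&#x2005;Hz/1&#x2005;Hz rates above.</p>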
<p><xref ref-type="fig" rid="fig-9">Fig. 9</xref> shows the experimental results of scenario 1, in which the mobile node was moved 50&#x2005;cm in the <italic>x</italic> direction. The target mobile node&#x2019;s relative position was acquired with respect to the anchor node. Again, there were fluctuations in the positions even though the nodes were stationary; however, the mobile node&#x2019;s position fluctuated only between 5 and 15&#x2005;cm on the <italic>y</italic>-axis and between 0 and &#x2212;5&#x2005;cm on the <italic>z</italic>-axis. The movement of the mobile node on the <italic>x</italic>-axis can be seen in the upper plot. The node was moved by 50&#x2005;cm, which should correspond to a position of 450&#x2005;cm, but the calculated relative position varied from 435 to 450&#x2005;cm. Furthermore, the relative positions on the <italic>y</italic> and <italic>z</italic> axes again continued to vary throughout the experiments. As with the previous experiments, <xref ref-type="table" rid="table-4">Table 4</xref> shows the RMSE values of the UWB&#x2013;IMU fusion method for scenario 1.</p>
<fig id="fig-9"><label>Figure 9</label><caption><title>Relative position estimation using the UWB&#x2013;IMU fusion module with a 50&#x2005;cm movement on the <italic>x</italic>-axis (from top to bottom, the plots show positions in the <italic>x</italic>, <italic>y</italic>, and <italic>z</italic> directions, respectively)</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-9.tif"/></fig><table-wrap id="table-4"><label>Table 4</label><caption><title>RMSE (in cm) of the UWB&#x2013;IMU-based system for scenario 1 (50&#x2005;cm movement on the <italic>x</italic>-axis)</title></caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Scenario</th>
<th align="left">X</th>
<th align="left">Y</th>
<th align="left">Z</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Stationary</td>
<td align="left">3.023</td>
<td align="left">11.972</td>
<td align="left">11.516</td>
</tr>
<tr>
<td align="left">Moving</td>
<td align="left">15.546</td>
<td align="left">11.321</td>
<td align="left">20.948</td>
</tr>
</tbody>
</table>
</table-wrap>
<p><xref ref-type="fig" rid="fig-10">Fig. 10</xref> illustrates the results of scenario 2, in which the mobile node was moved 50&#x2005;cm in the <italic>y</italic> direction. Once more, the target mobile node&#x2019;s relative position was acquired with respect to the anchor node. This time, it can be seen that the <italic>x</italic>-axis position fluctuated between 490 and 500&#x2005;cm, and the <italic>z</italic>-axis position fluctuated between 0 and &#x2212;15&#x2005;cm even though there was no movement. The movement of the target node by 50&#x2005;cm on the <italic>y</italic>-axis can be seen in the central plot. The new position should correspond to 50&#x2005;cm, but the mobile node&#x2019;s calculated relative position varied between 50 and 65&#x2005;cm. Furthermore, the mobile node&#x2019;s relative <italic>x</italic>-axis and <italic>z</italic>-axis positions once more varied throughout the experiment. As with scenario 1, <xref ref-type="table" rid="table-5">Table 5</xref> shows the RMSE of the 3D position from the UWB&#x2013;IMU fusion method for scenario 2.</p>
<fig id="fig-10"><label>Figure 10</label><caption><title>Relative position estimation using the UWB&#x2013;IMU fusion module with a 50&#x2005;cm movement on the <italic>y</italic>-axis (from top to bottom, the plots show positions in the <italic>x</italic>, <italic>y</italic>, and <italic>z</italic> directions, respectively)</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-10.tif"/></fig><table-wrap id="table-5"><label>Table 5</label><caption><title>RMSE (in cm) of the UWB&#x2013;IMU-based system for scenario 2 (50&#x2005;cm movement on the <italic>y</italic>-axis)</title></caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Scenario</th>
<th align="left">X</th>
<th align="left">Y</th>
<th align="left">Z</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Stationary</td>
<td align="left">5.684</td>
<td align="left">11.546</td>
<td align="left">14.996</td>
</tr>
<tr>
<td align="left">Moving</td>
<td align="left">5.330</td>
<td align="left">23.616</td>
<td align="left">32.171</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="s5_4"><label>5.4</label><title>UWB&#x2013;IMU-Based Relative-Position Trajectory Experiment</title>
<p>The structure of the mobile nodes is shown in <xref ref-type="fig" rid="fig-11">Fig. 11</xref>. The experiments were conducted indoors with the two mobile nodes placed 5&#x2005;m apart, and the tag node was moved around the anchor node to evaluate the system&#x2019;s performance in establishing a relative-position trajectory. The tag was moved toward the anchor node by 4&#x2005;m, translated laterally by 2&#x2005;m, and then returned to its original position by a symmetrical movement. The experimental results are shown in <xref ref-type="fig" rid="fig-12">Fig. 12</xref>.</p>
<fig id="fig-11"><label>Figure 11</label><caption><title>Experimental device and environment used to evaluate the performance of our proposed system. Each mobile node is equipped with four UWB devices and one IMU sensor. The body frame of the mobile node is a plane with four anchor devices placed at a specific distance from the center</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-11.tif"/></fig><fig id="fig-12"><label>Figure 12</label><caption><title>Trajectory from the proposed relative-position system. The blue line represents the position estimated using the proposed system, and the true trajectory is represented by the red line</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-12.tif"/></fig>
<p>It can be seen that the estimated trajectory matches the ground-truth position very well. The RMSE values were also calculated to evaluate the performance of the relative-position trajectory, and the results are shown in <xref ref-type="table" rid="table-6">Table 6</xref>. It can be seen that the proposed system can achieve centimeter-level accuracy with a long-distance trajectory.</p>
<table-wrap id="table-6"><label>Table 6</label><caption><title>RMSE values (in cm) of the UWB&#x2013;IMU-based system for the trajectory experiment</title></caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left"><italic>x</italic></th>
<th align="left"><italic>y</italic></th>
<th align="left"><italic>z</italic></th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">9.077</td>
<td align="left">31.548</td>
<td align="left">24.379</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="s5_5"><label>5.5</label><title>Relative Position Representation in Unity</title>
<p>VR and AR are promising technologies that can be used for different collaborative tasks, such as telemedicine and teleoperation. The main purpose of this study was to enable relative-position sensing for VR/AR scenarios; the proposed system can calculate the relative positions of two or more avatars. As such, the positions obtained by the proposed system were represented in Unity to evaluate its real-time usability, and the above-described scenarios were implemented in Unity using the IoT platform. <xref ref-type="fig" rid="fig-13">Fig. 13</xref> shows the resulting relative positions of avatars in Unity. The ground truth (GT) locations represent the positions to which the objects move during the experiment; <xref ref-type="fig" rid="fig-13">Fig. 13a</xref> represents scenario 1, in which the objects move 50&#x2005;cm in the <italic>x</italic> direction; <xref ref-type="fig" rid="fig-13">Fig. 13b</xref> illustrates scenario 2, in which the objects move 50&#x2005;cm in the <italic>y</italic> direction. Lastly, <xref ref-type="fig" rid="fig-13">Fig. 13c</xref> shows a demonstration of the trajectory experiment on the VR platform. It can be seen from <xref ref-type="fig" rid="fig-13">Fig. 13</xref> that the data from the proposed system can be applied to a VR platform to represent the relative positions of two avatars, and it can thus be used in collaborative tasks.</p>
<fig id="fig-13"><label>Figure 13</label><caption><title>Relative position representation in Unity to demonstrate the proposed relative position for VR/AR: (a) scenario 1, a 50&#x2005;cm move in the <italic>x</italic> direction; (b) scenario 2, a 50&#x2005;cm move in the <italic>y</italic> direction; (c) trajectory scenario, illustrating a move around another object</title></caption><graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_35360-fig-13.tif"/></fig>
</sec>
</sec>
<sec id="s6"><label>6</label><title>Discussion</title>
<p>The average RMSE of the proposed system was compared with those of other relative-position systems in the literature. The purpose of this comparison was to assess the performance of different hardware and algorithm implementations. From <xref ref-type="table" rid="table-7">Table 7</xref>, it can be seen that the performance varies with the algorithms and devices used. As the number of UWB devices increases, the accuracy also improves owing to the additional range measurements provided. Data-preprocessing methods further improve the accuracy of relative-position systems by reducing bias and standard deviation. The comparison shows that the proposed method has an RMSE of 0.31&#x2005;m, which is lower than the RMSE values of the other solutions found in the literature. Therefore, the proposed relative-position system has high accuracy and can thus be implemented for collaborative tasks such as VR/AR applications.</p>
<table-wrap id="table-7"><label>Table 7</label><caption><title>Comparison of the position RMSE of the proposed system with other relative-position systems in the literature</title></caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead>
<tr>
<th align="left">Method</th>
<th align="left">Hardware</th>
<th align="left">Position RMSE (m)</th>
</tr>
</thead>
<tbody>
<tr>
<td align="left">Sliding window [<xref ref-type="bibr" rid="ref-38">38</xref>]</td>
<td align="left">1 UWB&#x2009;&#x002B;&#x2009;1 IMU</td>
<td align="left">0.68</td>
</tr>
<tr>
<td align="left">Particle filter [<xref ref-type="bibr" rid="ref-39">39</xref>]</td>
<td align="left">1 UWB&#x2009;&#x002B;&#x2009;1 IMU</td>
<td align="left">2.11</td>
</tr>
<tr>
<td align="left">Extended Kalman filter [<xref ref-type="bibr" rid="ref-40">40</xref>]</td>
<td align="left">1 UWB</td>
<td align="left">0.80</td>
</tr>
<tr>
<td align="left">Extended Kalman filter [<xref ref-type="bibr" rid="ref-41">41</xref>]</td>
<td align="left">2 UWB&#x2009;&#x002B;&#x2009;1 IMU</td>
<td align="left">0.42</td>
</tr>
<tr>
<td align="left">Extended Kalman filter (proposed)</td>
<td align="left">4 UWB&#x2009;&#x002B;&#x2009;1 IMU</td>
<td align="left"><bold>0.31</bold></td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="s7"><label>7</label><title>Conclusions</title>
<p>This paper presents a UWB&#x2013;IMU fusion-based method for acquiring the relative positions of two mobile nodes. The anchor placement and multilateration methods were modified to enable mobile-infrastructure-based measurements of relative position. In the present method, measurements from an IMU are fused with UWB measurements to mitigate the error caused by the variance characteristics of the UWB range measurements. The data from the proposed method were also sent to a VR platform to demonstrate its potential for use in VR/AR scenarios. A three-dimensional relative-position RMSE of 0.31&#x2005;m was achieved. The proposed method yields better results than previous methods; thus, it could be deployed in collaborative robotics tasks and VR/AR applications.</p>
<p>The UWB&#x2013;IMU fusion-based relative-position system remains prone to position variations caused by the standard deviation of the range measurements. This standard deviation can be mitigated using filters such as a complementary filter, a moving-average filter, or min&#x2013;max removal, but these filters decrease the effective measurement rate of the sensor. The measurement rates of the UWB and IMU devices are critical for enabling real-time location-aware services in VR/AR applications. Moreover, more complex scenarios are needed to evaluate the performance of the proposed relative-position system. Deep-learning methods could also be explored to mitigate the range error in both LOS and NLOS environments.</p>
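<p>For illustration, the moving-average and min&#x2013;max-removal filters mentioned above could be sketched as follows (the window size is an assumption); an averaged output effectively lags the raw samples, which reflects the rate trade-off noted above:</p>

```python
from collections import deque

class MovingAverage:
    """Moving-average filter over the most recent range samples."""
    def __init__(self, window=5):  # window size is an assumed value
        self.buf = deque(maxlen=window)

    def filter(self, r):
        self.buf.append(r)
        return sum(self.buf) / len(self.buf)

def minmax_removal_mean(samples):
    """Average a batch of range samples after dropping the min and max."""
    s = sorted(samples)
    return sum(s[1:-1]) / (len(s) - 2)
```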
</sec>
</body>
<back>
<sec><title>Funding Statement</title>
<p>This work was supported by the <funding-source>Samsung Advanced Institute of Technology</funding-source> and partly supported by the <funding-source>National Research Foundation of Korea (NRF)</funding-source> grant funded by the <funding-source>Korean government (MSIT)</funding-source> (<award-id>2022R1F1A1063662</award-id>).</p></sec>
<sec sec-type="COI-statement"><title>Conflicts of Interest</title>
<p>The authors declare that they have no conflicts of interest to report regarding the present study.</p></sec>
<ref-list content-type="authoryear"><title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>G.</given-names> <surname>Bahle</surname></string-name>, <string-name><given-names>V.</given-names> <surname>Fortes Rey</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Bian</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Bello</surname></string-name> and <string-name><given-names>P.</given-names> <surname>Lukowicz</surname></string-name></person-group>, &#x201C;<article-title>Using privacy respecting sound analysis to improve bluetooth based proximity detection for COVID-19 exposure tracing and social distancing</article-title>,&#x201D; <source>Sensors</source>, vol. <volume>21</volume>, no. <issue>16</issue>, pp. <fpage>5604</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Guo</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Lu</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Wen</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Zhou</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>From point to space: 3D moving human pose estimation using commodity WiFi</article-title>,&#x201D; <source>IEEE Communications Letters</source>, vol. <volume>25</volume>, no. <issue>7</issue>, pp. <fpage>2235</fpage>&#x2013;<lpage>2239</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Mohanty</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Wu</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Bhamidipati</surname></string-name> and <string-name><given-names>G.</given-names> <surname>Gao</surname></string-name></person-group>, &#x201C;<article-title>Precise relative positioning via tight-coupling of GPS carrier phase and multiple UWBs</article-title>,&#x201D; <source>IEEE Robotics and Automation Letters</source>, vol. <volume>7</volume>, no. <issue>2</issue>, pp. <fpage>5757</fpage>&#x2013;<lpage>5762</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Chen</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Jin</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Lv</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Wang</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>A novel V2V cooperative collision warning system using UWB/DR for intelligent vehicles</article-title>,&#x201D; <source>Sensors</source>, vol. <volume>21</volume>, no. <issue>10</issue>, pp. <fpage>3485</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>P.</given-names> <surname>Pascacio</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Casteleyn</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Torres-Sospedra</surname></string-name>, <string-name><given-names>E. S.</given-names> <surname>Lohan</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Nurmi</surname></string-name></person-group>, &#x201C;<article-title>Collaborative indoor positioning systems: A systematic review</article-title>,&#x201D; <source>Sensors</source>, vol. <volume>21</volume>, no. <issue>3</issue>, pp. <fpage>1002</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>V.</given-names> <surname>Mai</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Kamel</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Krebs</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Schaffner</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Meier</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Local positioning system using UWB range measurements for an unmanned blimp</article-title>,&#x201D; <source>IEEE Robotics and Automation Letters</source>, vol. <volume>3</volume>, no. <issue>4</issue>, pp. <fpage>2971</fpage>&#x2013;<lpage>2978</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>J.</given-names> <surname>Cano</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Chidami</surname></string-name> and <string-name><given-names>J. L.</given-names> <surname>Ny</surname></string-name></person-group>, &#x201C;<article-title>A kalman filter-based algorithm for simultaneous time synchronization and localization in UWB networks</article-title>,&#x201D; in <conf-name>Proc. IEEE Intl. Conf. Robot. Automat.</conf-name>, <conf-loc>Montreal, QC, Canada</conf-loc>, pp. <fpage>1431</fpage>&#x2013;<lpage>1437</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Cao</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Yang</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Li</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Knoll</surname></string-name> and <string-name><given-names>G.</given-names> <surname>Beltrame</surname></string-name></person-group>, &#x201C;<article-title>Accurate position tracking with a single UWB anchor</article-title>,&#x201D; in <conf-name>Proc. Int. Conf. Robot. Automat.</conf-name>, <conf-loc>Paris, France</conf-loc>, pp. <fpage>2344</fpage>&#x2013;<lpage>2350</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>H.</given-names> <surname>Obeidat</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Shuaieb</surname></string-name>, <string-name><given-names>O.</given-names> <surname>Obeidat</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Abd-Alhameed</surname></string-name></person-group>, &#x201C;<article-title>A review of indoor localization techniques and wireless technologies</article-title>,&#x201D; <source>Wireless Personal Communications</source>, vol. <volume>119</volume>, no. <issue>1</issue>, pp. <fpage>289</fpage>&#x2013;<lpage>327</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M. A.</given-names> <surname>Dawood</surname></string-name>, <string-name><given-names>S. S.</given-names> <surname>Saleh</surname></string-name>, <string-name><given-names>E. -S. A.</given-names> <surname>El-Badawy</surname></string-name> and <string-name><given-names>M. H.</given-names> <surname>Aly</surname></string-name></person-group>, &#x201C;<article-title>A comparative analysis of localization algorithms for visible light communication</article-title>,&#x201D; <source>Optical and Quantum Electronics</source>, vol. <volume>53</volume>, no. <issue>2</issue>, pp. 1&#x2013;25, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K.</given-names> <surname>Mannay</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Ure&#x00F1;a</surname></string-name>, <string-name><given-names>&#x00C1;.</given-names> <surname>Hern&#x00E1;ndez</surname></string-name>, <string-name><given-names>J. M.</given-names> <surname>Villadangos</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Machhout</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Evaluation of multi-sensor fusion methods for ultrasonic indoor positioning</article-title>,&#x201D; <source>Applied Sciences</source>, vol. <volume>11</volume>, no. <issue>15</issue>, pp. <fpage>6805</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T. M.</given-names> <surname>Nguyen</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Qiu</surname></string-name>, <string-name><given-names>T. H.</given-names> <surname>Nguyen</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Cao</surname></string-name> and <string-name><given-names>L.</given-names> <surname>Xie</surname></string-name></person-group>, &#x201C;<article-title>Distance-based cooperative relative localization for leader-following control of MAVs</article-title>,&#x201D; <source>IEEE Robotics and Automation Letters</source>, vol. <volume>4</volume>, no. <issue>4</issue>, pp. <fpage>3641</fpage>&#x2013;<lpage>3648</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>van der Helm</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Coppola</surname></string-name>, <string-name><given-names>K. N.</given-names> <surname>McGuire</surname></string-name> and <string-name><given-names>G. C.</given-names> <surname>de Croon</surname></string-name></person-group>, &#x201C;<article-title>On-board range-based relative localization for micro air vehicles in indoor leader&#x2013;follower flight</article-title>,&#x201D; <source>Autonomous Robots</source>, vol. <volume>44</volume>, no. <issue>3&#x2013;4</issue>, pp. <fpage>415</fpage>&#x2013;<lpage>441</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Zheng</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Li</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Yin</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Zhang</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Multi-robot relative positioning and orientation system based on UWB range and graph optimization</article-title>,&#x201D; <source>Measurement</source>, vol. <volume>195</volume>, pp. <fpage>111068</fpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>F.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>Y.</given-names> <surname>Yin</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Hu</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Survey on WiFi-based indoor positioning techniques</article-title>,&#x201D; <source>IET Communications</source>, vol. <volume>14</volume>, no. <issue>9</issue>, pp. <fpage>1372</fpage>&#x2013;<lpage>1383</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Shang</surname></string-name> and <string-name><given-names>L.</given-names> <surname>Wang</surname></string-name></person-group>, &#x201C;<article-title>Overview of WiFi fingerprinting-based indoor positioning</article-title>,&#x201D; <source>IET Communications</source>, vol. <volume>16</volume>, no. <issue>7</issue>, pp. <fpage>725</fpage>&#x2013;<lpage>733</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Shule</surname></string-name>, <string-name><given-names>C. M.</given-names> <surname>Almansa</surname></string-name>, <string-name><given-names>J. P.</given-names> <surname>Queralta</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Zou</surname></string-name> and <string-name><given-names>T.</given-names> <surname>Westerlund</surname></string-name></person-group>, &#x201C;<article-title>UWB-based localization for multi-UAV systems and collaborative heterogeneous multi-robot systems</article-title>,&#x201D; <source>Procedia Computer Science</source>, vol. <volume>175</volume>, pp. <fpage>357</fpage>&#x2013;<lpage>364</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>J.</given-names> <surname>San Mart&#x00ED;n</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Cort&#x00E9;s</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Zamora-Cadenas</surname></string-name> and <string-name><given-names>B. J.</given-names> <surname>Svensson</surname></string-name></person-group>, &#x201C;<article-title>Precise positioning of autonomous vehicles combining UWB ranging estimations with on-board sensors</article-title>,&#x201D; <source>Electronics</source>, vol. <volume>9</volume>, no. <issue>8</issue>, pp. <fpage>1238</fpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T. H.</given-names> <surname>Nguyen</surname></string-name>, <string-name><given-names>T. -M.</given-names> <surname>Nguyen</surname></string-name> and <string-name><given-names>L.</given-names> <surname>Xie</surname></string-name></person-group>, &#x201C;<article-title>Range-focused fusion of camera-IMU-UWB for accurate and drift-reduced localization</article-title>,&#x201D; <source>IEEE Robotics and Automation Letters</source>, vol. <volume>6</volume>, no. <issue>2</issue>, pp. <fpage>1678</fpage>&#x2013;<lpage>1685</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-20"><label>[20]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Papalia</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Thumma</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Leonard</surname></string-name></person-group>, &#x201C;<article-title>Prioritized planning for cooperative range-only localization in multi-robot networks</article-title>,&#x201D; in <conf-name>Proc. 2022 Int. Conf. on Robotics and Automation (ICRA)</conf-name>, <conf-loc>Philadelphia, PA, USA</conf-loc>, pp. <fpage>10753</fpage>&#x2013;<lpage>10759</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-21"><label>[21]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K.</given-names> <surname>Long</surname></string-name></person-group>, &#x201C;<article-title>Single UWB anchor aided PDR heading and step length correcting indoor localization system</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>9</volume>, pp. <fpage>11511</fpage>&#x2013;<lpage>11522</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-22"><label>[22]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Wang</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Chen</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Lv</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Jin</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Wang</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>UWB based relative planar localization with enhanced precision for intelligent vehicles</article-title>,&#x201D; <source>Actuators</source>, vol. <volume>10</volume>, no. <issue>7</issue>, pp. <fpage>144</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-23"><label>[23]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Guler</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Abdelkader</surname></string-name> and <string-name><given-names>J. S.</given-names> <surname>Shamma</surname></string-name></person-group>, &#x201C;<article-title>Infrastructure-free multi-robot localization with ultrawideband sensors</article-title>,&#x201D; in <conf-name>Proc. 2019 American Control Conf. (ACC)</conf-name>, <conf-loc>Philadelphia, PA, USA</conf-loc>, pp. <fpage>13</fpage>&#x2013;<lpage>18</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-24"><label>[24]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D.</given-names> <surname>Shi</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Mi</surname></string-name>, <string-name><given-names>E. G.</given-names> <surname>Collins</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Wu</surname></string-name></person-group>, &#x201C;<article-title>An indoor low-cost and high-accuracy localization approach for AGVs</article-title>,&#x201D; <source>IEEE Access</source>, vol. <volume>8</volume>, pp. <fpage>50085</fpage>&#x2013;<lpage>50090</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-25"><label>[25]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>W. W.</given-names> <surname>Xue</surname></string-name> and <string-name><given-names>P.</given-names> <surname>Jiang</surname></string-name></person-group>, &#x201C;<article-title>The research on navigation technology of dead reckoning based on UWB localization</article-title>,&#x201D; in <conf-name>Proc. 2018 Eighth Int. Conf. on Instrumentation &#x0026; Measurement, Computer, Communication and Control (IMCCC)</conf-name>, <conf-loc>Harbin, China</conf-loc>, pp. <fpage>339</fpage>&#x2013;<lpage>343</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-26"><label>[26]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Monica</surname></string-name> and <string-name><given-names>G.</given-names> <surname>Ferrari</surname></string-name></person-group>, &#x201C;<article-title>Low-complexity UWB-based collision avoidance system for automated guided vehicles</article-title>,&#x201D; <source>ICT Express</source>, vol. <volume>2</volume>, no. <issue>2</issue>, pp. <fpage>53</fpage>&#x2013;<lpage>56</lpage>, <year>2016</year>.</mixed-citation></ref>
<ref id="ref-27"><label>[27]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>E. J.</given-names> <surname>Theussl</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Ninevski</surname></string-name> and <string-name><given-names>P.</given-names> <surname>O&#x2019;Leary</surname></string-name></person-group>, &#x201C;<article-title>Measurement of relative position and orientation using UWB</article-title>,&#x201D; in <conf-name>Proc. 2019 IEEE Int. Instrumentation and Measurement Technology Conf. (I2MTC)</conf-name>, <conf-loc>Auckland, New Zealand</conf-loc>, pp. <fpage>1</fpage>&#x2013;<lpage>6</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-28"><label>[28]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>D.</given-names> <surname>Ko</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Kim</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Son</surname></string-name> and <string-name><given-names>D.</given-names> <surname>Han</surname></string-name></person-group>, &#x201C;<article-title>Passive fingerprinting reinforced by active radiomap for WLAN indoor positioning system</article-title>,&#x201D; <source>IEEE Sensors Journal</source>, vol. <volume>22</volume>, no. <issue>6</issue>, pp. <fpage>5238</fpage>&#x2013;<lpage>5247</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-29"><label>[29]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>B. H.</given-names> <surname>Pinto</surname></string-name>, <string-name><given-names>H. A.</given-names> <surname>de Oliveira</surname></string-name> and <string-name><given-names>E. J.</given-names> <surname>Souto</surname></string-name></person-group>, &#x201C;<article-title>Factor optimization for the design of indoor positioning systems using a probability-based algorithm</article-title>,&#x201D; <source>Journal of Sensor and Actuator Networks</source>, vol. <volume>10</volume>, no. <issue>1</issue>, pp. <fpage>16</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-30"><label>[30]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Luo</surname></string-name>, <string-name><given-names>G.</given-names> <surname>Wei</surname></string-name> and <string-name><given-names>H.</given-names> <surname>Liu</surname></string-name></person-group>, &#x201C;<article-title>Node localization algorithm for wireless sensor networks based on static anchor node location selection strategy</article-title>,&#x201D; <source>Computer Communications</source>, vol. <volume>192</volume>, pp. <fpage>289</fpage>&#x2013;<lpage>298</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-31"><label>[31]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Li</surname></string-name>, <string-name><given-names>B.</given-names> <surname>Jelfs</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Kealy</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Wang</surname></string-name> and <string-name><given-names>B.</given-names> <surname>Moran</surname></string-name></person-group>, &#x201C;<article-title>Cooperative localization using distance measurements for mobile nodes</article-title>,&#x201D; <source>Sensors</source>, vol. <volume>21</volume>, no. <issue>4</issue>, pp. <fpage>1507</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-32"><label>[32]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>G.</given-names> <surname>Shenkai</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Li</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Jing</surname></string-name> and <string-name><given-names>L.</given-names> <surname>Xianglong</surname></string-name></person-group>, &#x201C;<article-title>An improved approach for iterative nodes localization by using artificial bee colony</article-title>,&#x201D; in <conf-name>Proc. 2021 Int. Conf. on Neural Networks, Information and Communication Engineering</conf-name>, <conf-loc>Qingdao, China</conf-loc>, pp. <fpage>442</fpage>&#x2013;<lpage>450</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-33"><label>[33]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Xue</surname></string-name> and <string-name><given-names>P.</given-names> <surname>Jiang</surname></string-name></person-group>, &#x201C;<article-title>The research on navigation technology of dead reckoning based on UWB localization</article-title>,&#x201D; in <conf-name>Proc. IMCCC</conf-name>, <conf-loc>Harbin, China</conf-loc>, pp. <fpage>339</fpage>&#x2013;<lpage>343</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-34"><label>[34]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Xue</surname></string-name> and <string-name><given-names>P.</given-names> <surname>Jiang</surname></string-name></person-group>, &#x201C;<article-title>The research on navigation technology of dead reckoning based on UWB localization</article-title>,&#x201D; in <conf-name>Proc. IMCCC</conf-name>, <conf-loc>Harbin, China</conf-loc>, pp. <fpage>339</fpage>&#x2013;<lpage>343</lpage>, <year>2018</year>.</mixed-citation></ref>
<ref id="ref-35"><label>[35]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Pittokopiti</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Grammenos</surname></string-name></person-group>, &#x201C;<article-title>Infrastructureless UWB based collision avoidance system for the safety of construction workers</article-title>,&#x201D; in <conf-name>Proc. ICT</conf-name>, <conf-loc>Hanoi, Vietnam</conf-loc>, pp. <fpage>490</fpage>&#x2013;<lpage>495</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-36"><label>[36]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>R.</given-names> <surname>Zhang</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Song</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Jaiprakash</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Talty</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Alanzai</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Using ultrawideband technology in vehicles for infrastructure-free localization</article-title>,&#x201D; in <conf-name>Proc. WF-IoT</conf-name>, <conf-loc>Limerick, Ireland</conf-loc>, pp. <fpage>122</fpage>&#x2013;<lpage>127</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-37"><label>[37]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Cao</surname></string-name> and <string-name><given-names>G.</given-names> <surname>Beltrame</surname></string-name></person-group>, &#x201C;<article-title>Vir-SLAM: Visual, inertial, and ranging slam for single and multi-robot systems</article-title>,&#x201D; <source>Autonomous Robots</source>, vol. <volume>45</volume>, no. <issue>6</issue>, pp. <fpage>905</fpage>&#x2013;<lpage>917</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-38"><label>[38]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K.</given-names> <surname>Guo</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Qiu</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Meng</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Xie</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Teo</surname></string-name></person-group>, &#x201C;<article-title>Ultra-wideband based cooperative relative localization algorithm and experiments for multiple unmanned aerial vehicles in GPS denied environments</article-title>,&#x201D; <source>International Journal of Micro Air Vehicles</source>, vol. <volume>9</volume>, no. <issue>3</issue>, pp. <fpage>169</fpage>&#x2013;<lpage>186</lpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-39"><label>[39]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>R.</given-names> <surname>Liu</surname></string-name>, <string-name><given-names>C.</given-names> <surname>Yuen</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Do</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Jiao</surname></string-name>, <string-name><given-names>X.</given-names> <surname>Liu</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Cooperative relative positioning of mobile users by fusing IMU inertial and UWB ranging information</article-title>,&#x201D; in <conf-name>Proc. ICRA</conf-name>, <conf-loc>Singapore</conf-loc>, pp. <fpage>5623</fpage>&#x2013;<lpage>5629</lpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-40"><label>[40]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K.</given-names> <surname>Guo</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Qiu</surname></string-name>, <string-name><given-names>W.</given-names> <surname>Meng</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Xie</surname></string-name> and <string-name><given-names>R.</given-names> <surname>Teo</surname></string-name></person-group>, &#x201C;<article-title>Ultra-wideband based cooperative relative localization algorithm and experiments for multiple unmanned aerial vehicles in GPS denied environments</article-title>,&#x201D; <source>International Journal of Micro Air Vehicles</source>, vol. <volume>9</volume>, no. <issue>3</issue>, pp. <fpage>169</fpage>&#x2013;<lpage>186</lpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-41"><label>[41]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Shalaby</surname></string-name>, <string-name><given-names>C. C.</given-names> <surname>Cossette</surname></string-name>, <string-name><given-names>J. R.</given-names> <surname>Forbes</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Le Ny</surname></string-name></person-group>, &#x201C;<article-title>Relative position estimation in multi-agent systems using attitude-coupled range measurements</article-title>,&#x201D; <source>IEEE Robotics and Automation Letters</source>, vol. <volume>6</volume>, no. <issue>3</issue>, pp. <fpage>4955</fpage>&#x2013;<lpage>4961</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-42"><label>[42]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>P.</given-names> <surname>Chansamood</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Wisadsud</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Maneerat</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Sanpechuda</surname></string-name>, <string-name><given-names>K.</given-names> <surname>Chinda</surname></string-name> <etal>et al.,</etal></person-group> &#x201C;<article-title>Effects of antenna orientation in ultra wideband indoor positioning system</article-title>,&#x201D; in <conf-name>Proc. 2019 16th Int. Conf. on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology (ECTI-CON)</conf-name>, <conf-loc>Pattaya, Thailand</conf-loc>, pp. <fpage>397</fpage>&#x2013;<lpage>400</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-43"><label>[43]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A. R.</given-names> <surname>Jim&#x00E9;nez Ruiz</surname></string-name> and <string-name><given-names>F.</given-names> <surname>Seco Granja</surname></string-name></person-group>, &#x201C;<article-title>Comparing ubisense, BeSpoon, and DecaWave UWB location systems: Indoor performance analysis</article-title>,&#x201D; <source>IEEE Transactions on Instrumentation and Measurement</source>, vol. <volume>66</volume>, no. <issue>8</issue>, pp. <fpage>2106</fpage>&#x2013;<lpage>2117</lpage>, <year>2017</year>.</mixed-citation></ref>
</ref-list>
</back>
</article>