<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD JATS (Z39.96) Journal Publishing DTD v1.1 20151215//EN" "http://jats.nlm.nih.gov/publishing/1.1/JATS-journalpublishing1.dtd">
<article xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:mml="http://www.w3.org/1998/Math/MathML" xml:lang="en" article-type="research-article" dtd-version="1.1">
<front>
<journal-meta>
<journal-id journal-id-type="pmc">CMC</journal-id>
<journal-id journal-id-type="nlm-ta">CMC</journal-id>
<journal-id journal-id-type="publisher-id">CMC</journal-id>
<journal-title-group>
<journal-title>Computers, Materials &#x0026; Continua</journal-title>
</journal-title-group>
<issn pub-type="epub">1546-2226</issn>
<issn pub-type="ppub">1546-2218</issn>
<publisher>
<publisher-name>Tech Science Press</publisher-name>
<publisher-loc>USA</publisher-loc>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">36266</article-id>
<article-id pub-id-type="doi">10.32604/cmc.2023.036266</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Article</subject>
</subj-group>
</article-categories>
<title-group>
<article-title>Smart Shoes Safety System for the Blind People Based on (IoT) Technology</article-title>
<alt-title alt-title-type="left-running-head">Smart Shoes Safety System for the Blind People Based on (IoT) Technology</alt-title>
<alt-title alt-title-type="right-running-head">Smart Shoes Safety System for the Blind People Based on (IoT) Technology</alt-title>
</title-group>
<contrib-group>
<contrib id="author-1" contrib-type="author" corresp="yes">
<name name-style="western"><surname>Almomani</surname><given-names>Ammar</given-names></name><xref ref-type="aff" rid="aff-1">1</xref><xref ref-type="aff" rid="aff-2">2</xref><email>Ammarnav6@bau.edu.jo</email></contrib>
<contrib id="author-2" contrib-type="author">
<name name-style="western"><surname>Alauthman</surname><given-names>Mohammad</given-names></name><xref ref-type="aff" rid="aff-3">3</xref></contrib>
<contrib id="author-3" contrib-type="author">
<name name-style="western"><surname>Malkawi</surname><given-names>Amal</given-names></name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-4" contrib-type="author">
<name name-style="western"><surname>Shwaihet</surname><given-names>Hadeel</given-names></name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-5" contrib-type="author">
<name name-style="western"><surname>Aldigide</surname><given-names>Batool</given-names></name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-6" contrib-type="author">
<name name-style="western"><surname>Aldabeek</surname><given-names>Donia</given-names></name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<contrib id="author-7" contrib-type="author">
<name name-style="western"><surname>Hamoodeh</surname><given-names>Karmen Abu</given-names></name><xref ref-type="aff" rid="aff-2">2</xref></contrib>
<aff id="aff-1"><label>1</label><institution>Research and Innovation Department, Skyline University College</institution>, <addr-line>P. O. Box 1797, Sharjah</addr-line>, <country>UAE</country>.</aff>
<aff id="aff-2"><label>2</label><institution>IT-Department-Al-Huson University College, Al-Balqa Applied University</institution>, <addr-line>P. O. Box 50, Irbid</addr-line>, <country>Jordan</country></aff>
<aff id="aff-3"><label>3</label><institution>Department of Information Security, Faculty of Information Technology, University of Petra</institution>, <addr-line>Amman</addr-line>, <country>Jordan</country></aff>
</contrib-group>
<author-notes>
<corresp id="cor1"><label>&#x002A;</label>Corresponding Author: Ammar Almomani. Email: <email>Ammarnav6@bau.edu.jo</email></corresp>
</author-notes>
<pub-date date-type="collection" publication-format="electronic">
<year>2023</year></pub-date>
<pub-date date-type="pub" publication-format="electronic"><day>09</day>
<month>6</month>
<year>2023</year></pub-date>
<volume>76</volume>
<issue>1</issue>
<fpage>415</fpage>
<lpage>436</lpage>
<history>
<date date-type="received">
<day>23</day>
<month>9</month>
<year>2022</year>
</date>
<date date-type="accepted">
<day>20</day>
<month>3</month>
<year>2023</year>
</date>
</history>
<permissions>
<copyright-statement>&#x00A9; 2023 Almomani et al.</copyright-statement>
<copyright-year>2023</copyright-year>
<copyright-holder>Almomani et al.</copyright-holder>
<license xlink:href="https://creativecommons.org/licenses/by/4.0/">
<license-p>This work is licensed under a <ext-link ext-link-type="uri" xlink:type="simple" xlink:href="https://creativecommons.org/licenses/by/4.0/">Creative Commons Attribution 4.0 International License</ext-link>, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.</license-p>
</license>
</permissions>
<self-uri content-type="pdf" xlink:href="TSP_CMC_36266.pdf"></self-uri>
<abstract>
<p>People&#x2019;s lives have become easier and simpler as technology has proliferated, especially with the Internet of Things (IoT). The biggest problem blind people face is finding their way to where they want to go, and they frequently need the help of people with good eyesight. Smart shoes are a technology that helps blind people find their way while walking. Accordingly, a special shoe has been developed to help blind people walk safely without worrying about colliding with other people or solid objects. In this research, we present a new safety system and a smart shoe for blind people. The system is based on IoT technology and uses three ultrasonic sensors, together with a microprocessor, to measure the distance to obstacles so that users can hear and react to barriers. Water and flame sensors were also included, and a sound alerts the wearer whenever an obstacle is near. Global Positioning System (GPS) technology tracks the wearer&#x2019;s movements from almost every side to keep an eye on them and ensure they are safe. To test our proposal, we gave a questionnaire of eleven questions to 100 people; 99.1% of the respondents said that the product meets their needs.</p>
</abstract>
<kwd-group kwd-group-type="author">
<kwd>IoT</kwd>
<kwd>smart shoe</kwd>
<kwd>sensors</kwd>
<kwd>GSM</kwd>
<kwd>GPS</kwd>
<kwd>Arduino</kwd>
<kwd>blind people</kwd>
<kwd>safety system</kwd>
</kwd-group>
<funding-group>
<award-group id="awg1">
<funding-source>Research and Innovation Department, Skyline University College</funding-source>
<award-id>1-2-2022</award-id>
</award-group>
</funding-group>
</article-meta>
</front>
<body>
<sec id="s1">
<label>1</label>
<title>Introduction</title>
<p>People who are visually impaired or blind have limited eyesight and are unable to see fine details. The World Health Organization (WHO) reports that approximately 70 million people globally, or 1% of the population, have visual impairments. Of these, 7 million are considered blind, and 63 million have poor vision. These figures reflect the magnitude and temporal trends of the global prevalence of blindness [<xref ref-type="bibr" rid="ref-1">1</xref>,<xref ref-type="bibr" rid="ref-2">2</xref>].</p>
<p>The main problem blind people face is traveling to wherever they want to go. They frequently need help from others with good eyesight. According to the WHO, 10 percent of the visually impaired have no functional eyesight at all, leaving them unable to move around safely without assistance.</p>
<p>Learning to interpret non-visual sensory signals is a significant barrier to independent movement. Blind people have to learn how to move safely in their environment [<xref ref-type="bibr" rid="ref-3">3</xref>]. Mobility is the capacity to detect obstacles and avoid them while walking. An effective transition to independent mobility requires mastering both skills, which is why technologies have been introduced to enhance obstacle detection and avoidance while walking, including a vibration alert that triggers when the blind person faces an obstruction or a danger such as fire.</p>
<p>Although technology is evolving rapidly, there is still no cost-effective gadget for visually challenged people. For them, tasks that sighted people consider routine and easy can be impossible or dangerous. Blind people often cannot walk without the help of others. People generally want to live independently, yet blind people remain dependent on others. Advanced technology is hence required to help them: smart shoes can aid their movement and make their everyday tasks much easier [<xref ref-type="bibr" rid="ref-4">4</xref>&#x2013;<xref ref-type="bibr" rid="ref-6">6</xref>].</p>
<p>Blind individuals face significant challenges in navigating and moving safely through their environment. To address this problem, we developed an intelligent shoe safety system based on IoT technology to improve the safety and independence of blind individuals by providing real-time guidance and assistance in navigating their environment. The system utilizes a combination of sensors and connectivity, including GPS, obstacle detection, and emergency alert features. It is also connected to a social network or support group to allow the wearer to connect with others and share their location and status. The system was evaluated through user testing with blind individuals, and results demonstrated a significant improvement in the safety and independence of the participants. The smart shoes safety system has the potential to greatly improve the lives of blind individuals by helping them navigate their surroundings with greater confidence and autonomy.</p>
<p>The main objectives of this research include: (1) to develop a safety system for an intelligent shoe based on IoT technology; (2) to develop an application that considers and manages the information of blind people and helps them cope with their physical barriers; (3) to assist blind people in overcoming water obstacles; (4) to assist blind people in coping with fire obstacles; and (5) to assist blind people in following a path.</p>
<p>The main contributions of the proposed smart shoe safety system for blind people based on IoT technology are as follows:
<list list-type="order">
<list-item>
<p>Improved safety: The system can help blind individuals avoid obstacles and tripping, increasing their safety when moving through their environment.</p></list-item>
<list-item>
<p>Enhanced independence: By providing real-time guidance and assistance in navigating their environment, the system can help blind individuals move more confidently and autonomously, increasing their independence.</p></list-item>
<list-item>
<p>Connected community: The system can be connected to a social network or support group, allowing blind individuals to connect with others and share their location and status. This can provide a sense of community and support for the wearer.</p></list-item>
<list-item>
<p>Convenience: The system can provide real-time guidance and assistance, eliminating the need for blind individuals to rely on others to navigate unfamiliar areas.</p></list-item>
<list-item>
<p>Innovation: The use of IoT technology in a smart shoe safety system is a novel approach to addressing the challenges faced by blind individuals in navigating their environment.</p></list-item>
</list></p>
<p>The rest of the paper is organized as follows: Section 2 reviews the relevant works and literature; Section 3 explains the proposed method for designing the safety system; Section 4 presents the safety system implementation, testing, and results; and Section 5 concludes with suggestions for future research.</p>
</sec>
<sec id="s2">
<label>2</label>
<title>Literature Review</title>
<p>In India, about 40 million people are blind, 1.6 million of them children. Traveling alone is very challenging for blind people, and they have to rely on others to perform most of their daily tasks. Walking on the road is particularly difficult because, even with a stick in hand, they cannot see the obstacles they may encounter. In this regard, an intelligent shoe design offers the blind a long-term solution for walking independently on roads. These shoes ease their journey to a destination unassisted. The shoes are equipped with IoT technology comprising various sensors, microcontrollers, and integrated buzzers; a buzzer sounds when the user nears an obstacle. Smart glasses designed with IoT to enhance efficiency are also incorporated alongside the sensors to help detect objects over a wider field. The intelligent footwear and the intelligent lenses communicate with each other and coordinate to ensure that the user does not encounter any obstacles [<xref ref-type="bibr" rid="ref-3">3</xref>,<xref ref-type="bibr" rid="ref-7">7</xref>&#x2013;<xref ref-type="bibr" rid="ref-9">9</xref>].</p>
<p>For the elderly, technology is increasingly needed to enable more independent living. The MATUROLIFE project, funded by the EU Horizon 2020 programme, aimed to develop solutions that integrate intelligent fabrics to help older adults improve their well-being and independence. Accordingly, a qualitative study by Callari explored an &#x2018;intelligent&#x2019; footwear integration technology [<xref ref-type="bibr" rid="ref-10">10</xref>]. A total of 37 older adults participated in the study, with co-creation sessions involving a further 56 older adults. These participants were from Belgium, France, Germany, Italy, Poland, Spain, Turkey, and the United Kingdom. The authors also discussed the co-creation priorities and concept ideas in considering how footwear harmonizes with autonomous aging.</p>
<p>Smart shoes are equipped with smart technology and represent a promising direction for internet-connected future healthcare. Given that the ability to walk in various conditions is one of the key aspects of life, the smart shoe was chosen for this study context. Smart footwear enables consistent gait and mobility assessment for prevention, diagnostic workup, specific disease monitoring, and therapy decisions. Innovative solutions and services to promote and reinvent healthy living and health care are conjectured to take the form of coherent and wearable computing systems [<xref ref-type="bibr" rid="ref-11">11</xref>].</p>
<p>Chandekar et al. [<xref ref-type="bibr" rid="ref-12">12</xref>] presented a paper examining the existing solutions designed to ensure independent mobility for people with disabilities, along with a new design that would guide a visually impaired person during navigation using the sensor embedded in the smart shoes and warn the person of incoming obstacles. Specifically, the authors attempted to create an easy-to-use Android application that extends the characteristics of the smart shoes to meet users&#x2019; specific requirements.</p>
<p>Dr&#x0103;gulinescu et al. [<xref ref-type="bibr" rid="ref-13">13</xref>] presented methods of smart shoe use in special medical applications for gait and foot pressure analysis. The Pedar in-shoe system was presented together with validation and repeatability studies for Pedar and other in-shoe systems. Pedar applications, mainly in medicine and sports, were then presented. The authors offered a valuable way to survey and select information in this field, perceived their study as a pioneer in systems design and functionality improvements, and hoped it would inspire more studies on the use of sensors in intelligent textiles and in-shoe systems in other domains of application.</p>
<p>In their study, Jung et al. [<xref ref-type="bibr" rid="ref-14">14</xref>] developed a self-powered intelligent shoe to monitor a user&#x2019;s body weight changes. Polyvinylidene fluoride ribbons and nanopowders were applied to form a voltage-type energy harvester and strain sensor. The stretchable sensors were formed from two conductive nanomaterial systems (carbon black and multi-walled carbon nanotubes). These circuits harvest energy, transfer data, and switch power sources. Reddy et al. [<xref ref-type="bibr" rid="ref-15">15</xref>] introduced a solution based on radio-frequency identification (RFID) and infrared (IR) sensor technology. The unit was placed inside a blind person&#x2019;s shoe. Each time the shoe is worn, the device is switched on with a hand-held button. Voice and prerecorded messages were used, similar to a museum&#x2019;s tourist guide system. MATLAB identifies the voice command and produces the correct voice instruction to follow to the destination.</p>
<p>Seo et al. [<xref ref-type="bibr" rid="ref-16">16</xref>] developed a module that counts steps from Arduino-based wearable smart shoes by delivering data to Android-based smartphones, ensuring accurate measurement of steps. Moving distance and speed can be measured using GPS to increase the accuracy of the momentum estimate. Truong et al. [<xref ref-type="bibr" rid="ref-17">17</xref>] demonstrated the application of an off-the-shelf smart band and two smart shoes in monitoring and identifying daily tasks. The authors attempted to present a tool that addresses the problems related to body-part sensor placement. The safety systems are combined with multimodal sensors and features for certain activities.</p>
<p>Wu et al. [<xref ref-type="bibr" rid="ref-18">18</xref>] introduced a system controlled by an STM32L432KC (an STMicroelectronics microcontroller) and powered by a rechargeable lithium-ion battery. A gait event recognition algorithm was used to detect the feet&#x2019;s motion status. When the user&#x2019;s foot is in the stance phase (ST), obstacles can be detected. A fall would prompt the smart shoes to connect to the mobile phone and notify emergency contacts. The experimental results suggested that the smart shoes&#x2019; performance was stable in real time, with low false alarm rates.</p>
<p>Yang et al. [<xref ref-type="bibr" rid="ref-19">19</xref>] demonstrated smart shoes built as a wearable sensing system combining a handy soft-instrumented sole and two 3D motion sensors. A new data structure for the measured ground reaction and foot motion functioned as a &#x201C;sensor image&#x201D;. A convolutional autoencoder was applied to merge the multisensory datasets and extract the concealed characteristics of the sensor images. The proposed method showed its ability to learn joint torques and exhibited satisfactory generalization properties.</p>
<p>In their study, Zhou et al. [<xref ref-type="bibr" rid="ref-20">20</xref>] rationally designed a smart insole as a composite structure. The purpose was to make full use of the pressure distribution of a footfall and deliver a power output of up to 580 &#x03BC;W. In addition, the insole could be operated without affecting power output consistency in harsh environments, including rainy conditions. It lit 260 light-emitting diodes on the floor, and an 88 &#x03BC;F capacitor was charged to 2.5 V in 900 s. The authors thus demonstrated a practical approach to creating a very efficient and heavy-duty intelligent insole as a viable power source for wearable bioelectronics.</p>
<p>In their study, Zou et al. [<xref ref-type="bibr" rid="ref-21">21</xref>] examined smart power-generating shoes based on a triboelectric nanogenerator (TENG). The shoes demonstrated the ability to scavenge biomechanical energy from ambulatory motion and to support rhythm tracking and pace biomonitoring of theragnostic health parameters. <xref ref-type="table" rid="table-1">Table 1</xref> shows the main contributions of our safety system compared with the current systems described in these related works.</p>
<table-wrap id="table-1">
<label>Table 1</label>
<caption>
<title>Summary of the gap of knowledge in previous research</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th>Current research</th>
<th>Existing shoe features</th>
<th>Improvements we have developed</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>[<xref ref-type="bibr" rid="ref-12">12</xref>]</td>
<td>The shoes were designed to ensure independent mobility for people with disabilities.<break/>The shoes incorporate a modern design to guide the visually impaired person on the move using sensors.</td>
<td>The shoes developed in this study are suitable for people of all age groups, regardless of whether the disability is visual, physical, or motor, as the shoes are equipped with sensors such as sound, inclination, and others.</td>
</tr>
<tr>
<td>[<xref ref-type="bibr" rid="ref-7">7</xref>]</td>
<td>Smart shoes allow blind people to reach their destination independently; they were designed with Internet of Things technology, with many sensors and buzzers integrated into the shoe. The shoe warns the user of an obstacle ahead by sounding a buzzer.</td>
<td>The shoes were improved by including instructions and warnings in Arabic and English.</td>
</tr>
<tr>
<td>[<xref ref-type="bibr" rid="ref-10">10</xref>]</td>
<td>The shoes were designed to help older adults improve their well-being and independence.</td>
<td>The shoes were improved by expanding their usability to people of all ages.</td>
</tr>
<tr>
<td>[<xref ref-type="bibr" rid="ref-13">13</xref>]</td>
<td>The shoes were designed to focus on health through the smart use of gait analysis and foot pressure.</td>
<td>We have improved this health aspect by adding air and body temperature measurements.</td>
</tr>
<tr>
<td>[<xref ref-type="bibr" rid="ref-11">11</xref>]</td>
<td>The shoes were designed for walking, mobility, prevention, diagnostic workup, treatment decisions, and individual disease monitoring.</td>
<td>The shoes were improved to be used for therapy and disease diagnoses.</td>
</tr>
<tr>
<td>[<xref ref-type="bibr" rid="ref-15">15</xref>]</td>
<td>This safety system is based on RFID and IR technology. The unit is inside the blind person&#x2019;s shoe. Each time the shoe is put on, the device is operated with a hand-held button. Voice and prerecorded messages can be used, like a museum guide system.</td>
<td>This safety system relies on playing sounds through prerecorded voice messages as a guide safety system for a tourist to the museum and to warn him of any danger.</td>
</tr>
<tr>
<td>[<xref ref-type="bibr" rid="ref-16">16</xref>]</td>
<td>This model was built to measure movement distance and velocity using GPS to make momentum estimates more accurate. The average calories expended per distance traveled were also calculated.</td>
<td>We have improved this model using the Global Positioning System (GPS) to allow the user&#x2019;s family to monitor him/her and know the destination the user is heading to.</td>
</tr>
</tbody>
</table>
</table-wrap>
<p>The problem that a smart shoe safety system for blind people based on IoT technology is designed to address is the challenges faced by blind individuals in navigating and moving safely through their environment. Blind individuals may struggle to detect and avoid obstacles and rely on others for assistance navigating unfamiliar areas. This can limit their independence and mobility and increase the risk of accidents or injuries. The problem of improving the safety and independence of blind individuals in navigating their environment is a significant one, as it impacts the quality of life and well-being of these individuals. A smart shoe safety system for blind people based on IoT technology has the potential to address this problem by providing real-time guidance and assistance in navigating the environment, helping blind individuals move more confidently and autonomously.</p>
<p>A smart shoe safety system for blind people based on IoT technology could fill an important gap in the current options available for improving the safety and independence of blind individuals. While some existing technologies can assist with navigation, such as canes or guide dogs, a smart shoe system could provide real-time guidance and assistance more conveniently and discreetly. One potential research gap in this area is the lack of studies evaluating the effectiveness of smart shoe systems in improving the safety and independence of blind individuals. While user testing has demonstrated positive results, more rigorous studies are needed to fully understand these systems&#x2019; impact. Finally, there is a need for more research on the long-term use and maintenance of smart shoe systems for blind individuals. It is essential to understand the durability and reliability of these systems over time, as well as the resources and support that may be needed to maintain them.</p>
<p>Based on the triboelectric nanogenerator (TENG), the study by Zou et al. [<xref ref-type="bibr" rid="ref-21">21</xref>] systematically evaluated the rational design, practical applications, scenario analysis, and performance assessment of wearable electricity production in smart shoes. The prospect of developing smart energy shoes as a sustainable and comprehensive energy solution for the next era of the Internet of Things was also discussed.</p>
</sec>
<sec id="s3">
<label>3</label>
<title>Proposed Methodology</title>
<p>The main purpose of this study was to produce a safety system that detects objects or obstacles and warns users through voice and vibration. Ultrasonic sensors and a microcontroller are combined to detect obstacles in front of the user and measure the distance to them. Furthermore, this safety system sends the location of the shoe worn by blind and disabled people to help their movement and improve safety. Limited independent mobility and navigation constitute a major difficulty for visually impaired people. They primarily use the white cane as a mobility aid for detecting close barriers on the ground, but the cane cannot detect hazards such as water, fire, or extreme temperature, which are major obstacles for them. Developments in embedded systems have opened a broad field of research and development for the physically challenged by providing them with affordable mobile assistance devices.</p>
<p>This safety system was designed and implemented to extend the functionality of the commonly used white cane and allow further detection of obstacles. This removable unit comprises an ultrasonic sensor and a microcontroller that controls a sound that intensifies as an obstacle comes within the three-meter detection range. The distance to obstacles is conveyed to the user through multi-tone sonic signals that indicate their proximity. The unit can also identify rapidly moving barriers. The location of the blind person&#x2019;s shoe is transmitted to increase safety while the user is walking to other places. <xref ref-type="table" rid="table-2">Table 2</xref> summarizes the hardware components used in this study and their specifications.</p>
<table-wrap id="table-2">
<label>Table 2</label>
<caption>
<title>Hardware requirements and specifications</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
</colgroup>
<tbody valign="top">
<tr>
<td>Arduino UNO</td>
<td>Arduino Uno is a microcontroller board based on the ATmega328P (see its datasheet). It has 14 digital input/output pins, six analog inputs, and a 16 MHz ceramic resonator. &#x201C;Uno&#x201D; means one in Italian, and the name was chosen to mark the release of Arduino Software (IDE) 1.0. The Uno board is the first in a series of USB Arduino boards.</td>
</tr>
<tr>
<td>Ultrasonic</td>
<td>Ultrasonic sensors measure distance and identify the existence of an object by emitting and monitoring an ultrasonic echo. No physical contact is necessary. Based on the sensor and object properties, the effective range in air is from a few centimeters to several meters.</td>
</tr>
<tr>
<td>Water level sensor</td>
<td>The water sensor brick detects water and is thus a standard tool for detecting rainfall, water level, and liquid seepage. The water sensor is connected to an Arduino to detect water-related events such as rain, leaks, and floods. The presence, level, volume, and/or absence of water can be detected using this sensor.</td>
</tr>
<tr>
<td>Arduino nano</td>
<td>Arduino Nano is a microcontroller board based on the ATmega328P (Arduino Nano V3.x) or ATmega168 (earlier Arduino Nano versions), with functionality similar to that of the Arduino Uno except that the Nano is smaller. In addition, the Arduino Nano is compatible, flexible, and breadboard friendly.</td>
</tr>
<tr>
<td>Flame sensor</td>
<td>Flame sensors are used in firefighting robots and flame alarms. High temperatures can easily damage this sensor. These sensors respond faster and more accurately than a heat/smoke detector because they detect the flame directly.</td>
</tr>
<tr>
<td>GSM</td>
<td>The A6 GSM/GPRS module is a miniature GSM modem that can be integrated into many IoT systems. It can be used to send and receive texts and make phone calls.</td>
</tr>
<tr>
<td>GPS module</td>
<td>The module used in this study is a new, improved GPS module with a built-in antenna and memory backup. It has low power consumption and high sensitivity and is ideal for navigation systems, distance measurement, vehicle monitoring, and recording.</td>
</tr>
<tr>
<td>DHT11 humidity &#x0026; temperature sensor</td>
<td>DHT11 temperature &#x0026; humidity sensor features a temperature &#x0026; humidity sensor complex with a calibrated digital signal output. It uses an exclusive digital signal acquisition technique and temperature and humidity sensing technology. It is small in size, has low power consumption, and can transmit signals at a distance of up to 20 meters, making it the best choice for various applications.</td>
</tr>
<tr>
<td>BY-8001</td>
<td>The BY8001-16P is an MP3 module. It operates on MicroSD cards and supports MP3 and WAV audio files. A 3 W power amplifier is fitted to this module, allowing it to drive a single 3 W speaker directly. The module can be controlled through five input pins or by a microcontroller via serial communication.</td>
</tr> 
<tr>
<td>DS18B20</td>
<td>The DS18B20 temperature sensor has good precision and can function without external components. With &#x00B1;0.5&#x00B0;C accuracy, this sensor can detect temperatures from &#x2212;55&#x00B0;C to &#x002B;125&#x00B0;C. Its resolution is user-configurable to 9, 10, 11, or 12 bits, with a default resolution at power-up of 12 bits (i.e., 0.0625&#x00B0;C precision).</td>
</tr>
</tbody>
</table>
</table-wrap>
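<p>To make the ultrasonic ranging in Table 2 concrete, the following sketch (written in Python for illustration; the on-board firmware would run on the Arduino) converts an echo round-trip time into a distance and maps it to a warning-beep interval within the three-meter detection range described in this section. The 0.0343 cm/&#x00B5;s speed of sound and the specific beep timings are illustrative assumptions, not values reported in this paper.</p>

```python
# Illustrative model of ultrasonic ranging and proximity alerts.
# Assumptions: speed of sound ~0.0343 cm/us at room temperature;
# beep timings are placeholders to be calibrated on real hardware.

SPEED_OF_SOUND_CM_PER_US = 0.0343  # approximate, at about 20 degrees C
MAX_RANGE_CM = 300                 # three-meter detection range

def echo_to_distance_cm(echo_us):
    """Convert a round-trip echo time (microseconds) to a one-way distance (cm)."""
    return echo_us * SPEED_OF_SOUND_CM_PER_US / 2

def beep_interval_ms(distance_cm):
    """Closer obstacles produce faster beeps; obstacles beyond range produce none."""
    if distance_cm > MAX_RANGE_CM:
        return None  # silent: nothing within the detection range
    # Scale linearly from 1000 ms at 3 m down to a 50 ms floor near contact.
    return max(50, int(distance_cm / MAX_RANGE_CM * 1000))

if __name__ == "__main__":
    for echo in (500, 5000, 20000):
        d = echo_to_distance_cm(echo)
        print(f"echo {echo} us -> {d:.1f} cm, beep interval {beep_interval_ms(d)} ms")
```

<p>On the actual hardware, the echo time would come from the ultrasonic sensor&#x2019;s echo pin, and the interval would pace the buzzer or the BY8001-16P voice prompts.</p>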
<sec id="s3_1">
<label>3.1</label>
<title>Safety System Design</title>
<p>Implementing the proposed ideas required the use of several components, which were divided into hardware and software parts. These components were essential for the functioning of the system. The hardware components included physical components that made up the system, while the software components comprised the programming and computational elements. Both hardware and software components are described in detail in this section, with their arrangement illustrated in <xref ref-type="fig" rid="fig-1">Fig. 1</xref>.</p>
<fig id="fig-1">
<label>Figure 1</label>
<caption>
<title>Smart shoe for the blind and the normal people based on IoT technology</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-1.tif"/>
</fig>
<p>The Arduino is the system&#x2019;s brain, to which all other components are connected. <xref ref-type="fig" rid="fig-2">Figs. 2</xref> and <xref ref-type="fig" rid="fig-3">3</xref> represent the proposed safety system 1 and the proposed safety system 2, respectively.</p>
<fig id="fig-2">
<label>Figure 2</label>
<caption>
<title>Block diagram of proposed safety system 1</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-2.tif"/>
</fig><fig id="fig-3">
<label>Figure 3</label>
<caption>
<title>Block diagram of proposed safety system 2</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-3.tif"/>
</fig>
</sec>
<sec id="s3_2">
<label>3.2</label>
<title>Connections of Circuits</title>
<p>This section discusses the real implementation of our safety system step by step, showing how it works and the connections between all sensors and the software. The system uses two main circuits: the first connects the various sensors, while the second connects the module responsible for Global System for Mobile (GSM) communications together with the sensor responsible for GPS positioning.</p>
</sec>
<sec id="s3_3">
<label>3.3</label>
<title>The Overall Connections of the Sensors</title>
<p><xref ref-type="fig" rid="fig-4">Fig. 4</xref> below illustrates the pins of the Arduino Uno board and their connections to various components and sensors, as well as the connections between all other sensors.</p>
<fig id="fig-4">
<label>Figure 4</label>
<caption>
<title>Arduino Uno connections</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-4.tif"/>
</fig>
</sec>
<sec id="s3_4">
<label>3.4</label>
<title>The Overall Connections of the Second Circuit</title>
<p><xref ref-type="fig" rid="fig-5">Fig. 5</xref> below shows all the Arduino Uno pins used, what each is connected to, and the connections of all other components.</p>
<fig id="fig-5">
<label>Figure 5</label>
<caption>
<title>Arduino Uno connections</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-5.tif"/>
</fig>
<p><bold>The sensors circuit</bold> is the first circuit, consisting of two ultrasonic sensors, a water level sensor, and the BY8001-16P module.</p>
<p><bold>Ultrasonic sensor</bold> is one of the most important sensors used in building the stick. Accordingly, two such sensors were used, as follows:</p>
<p>a) <bold>Ultrasonic sensor 1</bold>, located in the middle of the stick, is used to detect forward obstructions by calculating the distance to them; its connection to the Arduino Nano is shown in <xref ref-type="fig" rid="fig-6">Fig. 6</xref> below.</p>
<fig id="fig-6">
<label>Figure 6</label>
<caption>
<title>The Ultrasonic 1 connection</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-6.tif"/>
</fig>
<p>As shown in <xref ref-type="fig" rid="fig-6">Fig. 6</xref> above, ultrasonic sensor 1 is connected to the Arduino Nano as follows: the 5 V pin of the Arduino Nano is connected to the Voltage Common Collector (VCC) pin of the ultrasonic sensor through the red wire; the Ground (GND) pin of the Arduino Nano connects to the GND pin of the sensor through the blue wire; the grey wire connects the sensor&#x2019;s trigger pin to digital pin 7 of the Arduino Nano; and the pink wire connects the sensor&#x2019;s echo pin to digital pin 8.</p>
<p>b) <bold>Ultrasonic sensor 2</bold>, mounted at the end of the stick, is used to identify lower obstructions such as holes; its connection to the Arduino Nano is shown in <xref ref-type="fig" rid="fig-7">Fig. 7</xref> below.</p>
<fig id="fig-7">
<label>Figure 7</label>
<caption>
<title>The ultrasonic 2 connections</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-7.tif"/>
</fig>
<p>The circuit in <xref ref-type="fig" rid="fig-7">Fig. 7</xref> above shows how we connected ultrasonic sensor 2 to the Arduino Nano: the 5 V pin of the Arduino Nano is connected to the VCC pin of the sensor through the red wire; the GND pin of the Arduino Nano is connected to the GND pin of the sensor through a blue wire; the sensor&#x2019;s trigger pin is connected to digital pin 6 via a gray wire; and the sensor&#x2019;s echo pin is connected to digital pin 5 through a pink wire.</p>
<p>c) <bold>Water level sensor:</bold> As shown in <xref ref-type="fig" rid="fig-8">Fig. 8</xref> below, the water level sensor is connected to the Arduino Nano: the 5 V pin of the Arduino Nano is connected to the VCC pin of the sensor through the red wire; the GND pin of the Arduino Nano connects to the GND pin of the sensor through the blue wire; and the yellow wire connects the sensor&#x2019;s data pin to analog pin A0 of the Arduino Nano.</p>
<fig id="fig-8">
<label>Figure 8</label>
<caption>
<title>Water level sensor</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-8.tif"/>
</fig>
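<p>The water level sensor reports an analog voltage on A0, which the Arduino Nano&#x2019;s 10-bit ADC maps to a value between 0 and 1023; the Water() routine in the sensors algorithm raises the alarm at readings of 500 or more. A hedged sketch of that decision (the function name is ours):</p>

```cpp
// The Arduino Nano's 10-bit ADC maps 0-5 V on pin A0 to 0-1023.
// The sensors algorithm raises the water alarm at readings >= 500
// (roughly 2.4 V), i.e., when the probe is substantially submerged.
bool waterAlarm(int analogReading) {
    const int kThreshold = 500;  // threshold used in the Water() routine
    return analogReading >= kThreshold;
}
```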
<p>d) <bold>BY8001-16P:</bold> The circuit in <xref ref-type="fig" rid="fig-9">Fig. 9</xref> below shows how we connected the BY8001-16P to the Arduino Uno: module pin 2 is connected to TX on the Arduino and pin 3 to RX; 5 V is connected to pin 8 and ground to pin 14; and pins 6 and 7 are connected to the speaker.</p>
<fig id="fig-9">
<label>Figure 9</label>
<caption>
<title>BY8001-16P connection</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-9.tif"/>
</fig>
</sec>
<sec id="s3_5">
<label>3.5</label>
<title>Connecting the DHT11 With Arduino Uno</title>
<p>Based on <xref ref-type="fig" rid="fig-10">Fig. 10</xref>, three connections to the DHT11 are required: voltage, ground, and data. Pin 1 (on the left) of the sensor is connected to &#x002B;5 V. However, if a board with 3.3 V logic, such as an Arduino Due, is used, pin 1 is connected to 3.3 V instead of 5 V. Pin 2 of the sensor is connected to the selected DHTPIN, pin 4 (on the right) is connected to ground, and a 10 K pull-up resistor connects pin 2 (data) to pin 1 (power) of the sensor.</p>
<fig id="fig-10">
<label>Figure 10</label>
<caption>
<title>Connecting the DHT11 sensor with Arduino Uno</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-10.tif"/>
</fig>
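<p>The single data line described above carries a 40-bit frame of five bytes: humidity integer and decimal parts, temperature integer and decimal parts, and a checksum equal to the low byte of the sum of the first four. A minimal decoder sketch (the struct and function names are ours, not from the DHT library):</p>

```cpp
struct Dht11Reading {
    int humidity;     // % RH (integer part; DHT11 decimal bytes are 0)
    int temperature;  // degrees C (integer part)
    bool valid;       // checksum passed
};

// Decodes the 5-byte DHT11 frame: [RH int, RH dec, T int, T dec, checksum].
// The checksum is the low 8 bits of the sum of the first four bytes.
Dht11Reading decodeDht11(const unsigned char b[5]) {
    unsigned char sum = (unsigned char)(b[0] + b[1] + b[2] + b[3]);
    return Dht11Reading{ b[0], b[2], sum == b[4] };
}
```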
</sec>
<sec id="s3_6">
<label>3.6</label>
<title>Connecting the Flame Sensor with Arduino Uno</title>
<p>As shown in <xref ref-type="fig" rid="fig-11">Fig. 11</xref> below, three connections to the flame sensor are required: voltage, data, and ground, all connected directly to the Arduino Uno.</p>
<fig id="fig-11">
<label>Figure 11</label>
<caption>
<title>Flame sensor</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-11.tif"/>
</fig>
</sec>
<sec id="s3_7">
<label>3.7</label>
<title>The Circuits Responsible for Positioning and Making Calls</title>
<p>As displayed in <xref ref-type="fig" rid="fig-12">Fig. 12</xref> below, the GPS module is connected to the Arduino MEGA: GND is connected to GND on the Arduino through the blue wire; VCC is connected to the 5 V supply through the red wire; TX is connected to digital pin 0 through the orange wire; and RX is connected to digital pin 1 through the gray wire.</p>
<fig id="fig-12">
<label>Figure 12</label>
<caption>
<title>The GPS connection</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-12.tif"/>
</fig>
<p>In <xref ref-type="fig" rid="fig-13">Fig. 13</xref> below, the GSM module is connected to the Arduino MEGA: TX is connected to digital pin 3 through the green wire, and RX is connected to digital pin 2 through the yellow wire.</p>
<fig id="fig-13">
<label>Figure 13</label>
<caption>
<title>The GSM connection</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-13.tif"/>
</fig>
</sec>
</sec>
<sec id="s4">
<label>4</label>
<title>Safety System Testing and Evaluation</title>
<p>Innovators or other technology stakeholders may have predetermined preferences for a certain idea at this point, notwithstanding the outcomes of the decision-matrix method; therefore, in our experimental section we test our safety system practically in the real world. We then collected information from 100 blind people and analyzed the results in the next section. We begin with the hardware and software requirements and discuss in full detail how the safety system was built.</p>
<sec id="s4_1">
<label>4.1</label>
<title>Hardware Requirements</title>
<p>In our system, the hardware components are shown in <xref ref-type="table" rid="table-2">Table 2</xref>.</p>

</sec>
<sec id="s4_2">
<label>4.2</label>
<title>Software Requirements</title>
<p>The target system&#x2019;s features and functionalities are defined in the software specifications, while software metrics quantify and represent the software&#x2019;s various characteristics. Software metrics provide measures of different aspects of software processes and products; such measurement is a basic requirement of software engineering, as it helps manage the development process and maintain the final product&#x2019;s high quality.</p>
</sec>
<sec id="s4_3">
<label>4.3</label>
<title>The Open-Source Arduino Software IDE</title>
<p>The open-source Arduino Software (IDE) runs on Windows, Mac OS X, and Linux and eases code writing and uploading. The environment is written in Java and based on Processing and other open-source software. The IDE is usable with any Arduino board.</p>
<p><bold>Blynk</bold> is a platform with iOS and Android apps designed for the IoT. It enables control of Arduino, Raspberry Pi, and other comparable microcontroller boards over the Internet. Blynk can remotely control hardware, display and store sensor data, visualize data, and perform several other tasks. The three key components of Blynk are:</p>
<p><bold>Blynk App</bold>&#x2014;This component enables the creation of incredible interfaces for systems using different kinds of widgets.</p>
<p><bold>Blynk Server</bold>&#x2014;This component handles smartphone&#x2013;hardware communication. Users can use the shared Blynk Cloud or run their own private Blynk server locally. The Blynk server is open-source, can handle a large number of devices, and can even be launched on a Raspberry Pi.</p>
<p><bold>Blynk Libraries</bold>&#x2014;This component allows hardware platforms to communicate with the server and process all inbound and outbound commands. When the user presses a Button in the Blynk app, the message travels to the Blynk Cloud space and quickly reaches the user&#x2019;s hardware. The opposite process also takes a very short moment to complete, as illustrated in <xref ref-type="fig" rid="fig-14">Fig. 14</xref>.</p>
<fig id="fig-14">
<label>Figure 14</label>
<caption>
<title>Blynk</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-14.tif"/>
</fig>
</sec>
<sec id="s4_4">
<label>4.4</label>
<title>Data Acquisition (Collection, Information Gathering)</title>
<p>We conducted further research online. Since most people hope to move more comfortably and safely, we prepared a questionnaire for blind people. The questionnaire consists of eleven items and was distributed to a sample of 100 individuals. <xref ref-type="table" rid="table-3">Table 3</xref> shows the questionnaire results.</p>
<table-wrap id="table-3">
<label>Table 3</label>
<caption>
<title>Questionnaire results</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
<col align="left"/>
</colgroup>
<tbody valign="top">
<tr>
<th align="left">Item #1</th>
<th>I am the person concerned</th>
<th>Parent</th>
<th>Brother and sister</th>
<th align="center" colspan="2">Friend</th>
<th>Other</th>
</tr>
<tr>
<td>What is your relationship with the person with these needs?</td>
<td>5.6%</td>
<td>4.7%</td>
<td>1.9%</td>
<td align="center" colspan="2">29%</td>
<td>58.9%</td>
</tr>
<tr>
<td align="left">Item #2</td>
<td>1&#x2013;10 years old</td>
<td>10&#x2013;20 years old</td>
<td>20&#x2013;30 years old</td>
<td align="center" colspan="3">30 and over</td>
</tr>
<tr>
<td>If you have one of the groups mentioned in the previous question, their ages range from:</td>
<td>9.3%</td>
<td>15.9%</td>
<td>36.4%</td>
<td align="center" colspan="3">38.4%</td>
</tr>
<tr>
<td align="left" colspan="2">Item #3</td>
<td>Yes</td>
<td align="center" colspan="4">No</td>
</tr>
<tr>
<td colspan="2">Will this category&#x2019;s attention, care, and care continue from the surrounding community?</td>
<td>72.9%</td>
<td align="center" colspan="4">27.1%</td>
</tr>
<tr>
<td align="left">Item #4</td>
<td>Vision loss</td>
<td>Partial memory loss</td>
<td>Inability to rely on oneself</td>
<td align="left" colspan="3">Anxiety and fear of leaving the house</td>
</tr>
<tr>
<td>What kind of problems do these groups suffer from?</td>
<td>19.6%</td>
<td>14%</td>
<td>37.4%</td>
<td align="center" colspan="3">29%</td>
</tr>
<tr>
<td align="left" colspan="2">Item #5</td>
<td>Yes</td>
<td align="center" colspan="4">No</td>
</tr>
<tr>
<td align="left" colspan="2">After you know the characteristics of this shoe, will it help parents with routine care and alleviate their anxiety?</td>
<td>98.1%</td>
<td align="center" colspan="4">1.9%</td>
</tr>
<tr>
<td align="left" colspan="2">Item #6</td>
<td>Yes</td>
<td align="center" colspan="4">No</td>
</tr>
<tr>
<td align="left" colspan="2">Do you think this shoe meets your needs/wants?</td>
<td>99.1%</td>
<td align="center" colspan="4">0.9%</td>
</tr>
<tr>
<td align="left" colspan="2">Item #7</td>
<td>Yes</td>
<td align="center" colspan="4">No</td>
</tr>
<tr>
<td align="left" colspan="2">Is there an urgent need to allocate a clothing industry and tools to help improve these groups&#x2019; livelihood to allow them to naturally integrate with society?</td>
<td>98.1%</td>
<td align="center" colspan="4">1.9%</td>
</tr>
<tr>
<td align="left" colspan="2">Item #8</td>
<td>I trust</td>
<td align="center" colspan="4">I do not trust</td>
</tr>
<tr>
<td align="left" colspan="2">With the rapid development of technology, do you trust this smart shoe to meet the needs of these groups and in increasing the flexibility of their daily lives?</td>
<td>96.3%</td>
<td align="center" colspan="4">3.7%</td>
</tr>
<tr>
<td align="left" colspan="2">Item #9</td>
<td>Easy</td>
<td align="left" colspan="2">Average but needs training</td>
<td align="left" colspan="2">Complicated that I can&#x2019;t use it</td>
</tr>
<tr>
<td align="left" colspan="2">How easy is this shoe to wear and use in routine life?</td>
<td>44.9%</td>
<td colspan="2">55.1%</td>
<td colspan="2">0.0%</td>
</tr>
<tr>
<td align="left" colspan="2">Item #10</td>
<td>Very likely; I highly recommend</td>
<td align="center" colspan="4">Not likely; I do not recommend</td>
</tr>
<tr>
<td align="left" colspan="2">Given the services the smart shoe offers, how likely are you to recommend its use?</td>
<td>98.1%</td>
<td align="center" colspan="4">1.9%</td>
</tr>
<tr>
<td align="left" colspan="2">Item #11</td>
<td colspan="5">Answers</td>
</tr>
<tr>
<td colspan="2">Do you have anything to add to improve your shoe?</td>
<td colspan="5">&#x25CF; It must be flexible in order not to cause other symptoms or &#x2002;&#x2002;diseases such as a disc.<break/>&#x25CF; It should be equipped with a phone charging base.<break/>&#x25CF; The instruction should be in both Arabic and English.</td>
</tr>
</tbody>
</table>
</table-wrap>
</sec>
<sec id="s4_5">
<label>4.5</label>
<title>Flow Chart of GPS &#x0026; GSM</title>
<p>As shown in <xref ref-type="fig" rid="fig-15">Fig. 15</xref>, the flow chart covers connecting to the Internet and to the global positioning system.</p>
<fig id="fig-15">
<label>Figure 15</label>
<caption>
<title>Operation of GPS &#x0026; GSM</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-15.tif"/>
</fig>
<p>The flowchart in <xref ref-type="fig" rid="fig-16">Fig. 16</xref> below shows all sensor operations.</p>
<fig id="fig-16">
<label>Figure 16</label>
<caption>
<title>Operation of sensors</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-16.tif"/>
</fig>
<table-wrap id="table-4">
<label>Table 4</label>
<caption>
<title>The full sensor functions used in our system</title>
</caption>
<table frame="hsides">
<colgroup>
<col align="left"/>
<col align="left"/>
</colgroup>
<thead valign="top">
<tr>
<th align="left" colspan="2">Sensors algorithm</th>
</tr>
</thead>
<tbody valign="top">
<tr>
<td>Initialize trigPin1 &#x003D; 2; echoPin1 &#x003D; 3; trigPin2 &#x003D; 4; echoPin2 &#x003D; 5; trigPin3 &#x003D; 6; echoPin3 &#x003D; 7;<break/><bold>Main procedures</bold><break/><bold>While ()</bold><break/> {<break/>Disup();<break/> Disright();<break/> Disleft();<break/> Fire();<break/> Water();<break/> Temperature ();<break/>}<break/><bold>Disup()</bold><break/>{<break/>// Reads the echoPin and returns the sound wave<break/>Duration1&#x003D; pulseIn(echoPin1, HIGH);<break/>distance1&#x003D; Duration1 &#x002A; 0.034/2; // Calculating the distance<break/>if (distance1&#x003C;&#x003D;10) Play.alarm()<break/>}<break/><bold>Disright()</bold><break/>{<break/>// Reads the echoPin and returns the sound wave<break/>Duration2&#x003D; pulseIn(echoPin2, HIGH);<break/>Distance2&#x003D; Duration2&#x002A;0.034/2; // Calculating the distance<break/>if (distance2&#x003C;&#x003D;10)<break/>{<break/>Play.alarm("right.wav");<break/>delay(1700);<break/>}<break/>}</td>
<td><bold>Disleft()</bold><break/>{<break/>// Reads the echoPin and returns the sound wave<break/>Duration3 &#x003D; pulseIn(echoPin3, HIGH);<break/>distance3&#x003D; Duration3 &#x002A; 0.034/2; // Calculating the distance<break/>if (distance3&#x003C;&#x003D;10)<break/>{<break/> Play.alarm (&#x201C;left.wav&#x201D;);<break/>delay(1700);<break/>}}<break/><bold>Fire()</bold><break/>{<break/>fire&#x003D; digitalRead();<break/>if (fire&#x003D;&#x003D;0)<break/>{<break/>Play.alarm ("Fire.wav")<break/> delay(1700);<break/>}}<break/><bold>Water()</bold><break/>{<break/>if (water&#x003E;&#x003D;500)<break/>{<break/>Play water alarm<break/>delay(1700);<break/>}}<break/><bold>Temperature ()</bold><break/>{<break/> float t &#x003D; dht.readTemperature();<break/> delay(2000);<break/>}</td>
</tr>
</tbody>
</table>
</table-wrap>
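<p>The pseudocode in the table above can be condensed into a single decision routine. The sketch below is our own compilable distillation, not the firmware itself: it applies the same 10 cm ultrasonic threshold, the flame sensor&#x2019;s active-low reading, and the 500 analog water threshold, with one possible prioritization of the alerts:</p>

```cpp
#include <string>

// Condenses the Disup()/Disright()/Disleft()/Fire()/Water() checks from the
// sensors algorithm into one decision; returns the alert to play, or "" if
// nothing needs reporting. The ordering (fire first) is our own choice; the
// firmware simply checks each sensor in turn.
std::string selectAlert(double upCm, double rightCm, double leftCm,
                        bool flameDetected, int waterReading) {
    if (flameDetected)       return "Fire.wav";  // flame sensor reads LOW on fire
    if (waterReading >= 500) return "water";     // analog threshold from Water()
    if (upCm <= 10.0)        return "up";        // forward obstacle within 10 cm
    if (rightCm <= 10.0)     return "right.wav";
    if (leftCm <= 10.0)      return "left.wav";
    return "";                                   // nothing to report
}
```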
<p><xref ref-type="fig" rid="fig-17">Fig. 17</xref> shows the overall connections of the sensors.</p>
<fig id="fig-17">
<label>Figure 17</label>
<caption>
<title>Overall connections of the sensors</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-17.tif"/>
</fig>
<p>Finally, the completed safety system is depicted in <xref ref-type="fig" rid="fig-18">Fig. 18</xref>, where all the sensors are integrated into the shoe.</p>
<fig id="fig-18">
<label>Figure 18</label>
<caption>
<title>Final system</title>
</caption>
<graphic mimetype="image" mime-subtype="tif" xlink:href="CMC_36266-fig-18.tif"/>
</fig>
<p>The specific parameters of the algorithms used in a smart shoe safety system for blind people based on IoT technology depend on the specific design of the system and the goals of the researchers or developers. Potential parameters in the design of such a system include:
<list list-type="order">
<list-item>
<p>Sensitivity of sensors: The sensitivity of the sensors used in the system, such as GPS or obstacle detection sensors, could be adjusted to optimize their performance and accuracy.</p></list-item>
<list-item>
<p>Algorithm performance: The performance of the algorithms used in the system, such as those for navigation or obstacle detection, could be evaluated and optimized to ensure their effectiveness.</p></list-item>
<list-item>
<p>Connectivity: The system could be designed to connect to different networks or devices, such as a social network or a caregiver&#x2019;s phone, with the parameters of these connections optimized for reliable and secure communication.</p></list-item>
<list-item>
<p>User preferences: The system could allow the user to customize specific parameters, such as the intensity of vibrations used to alert the wearer to obstacles or the types of alerts received.</p></list-item>
</list>
Ultimately, these parameters depend on the specific goals and needs of the system and would be optimized to improve the safety and independence of the wearer.</p>
<p>Several reasons why a smart shoe safety system for blind people utilizing IoT technology could be considered innovative and need further research. First, such a system has the potential to significantly improve the safety and independence of blind individuals by providing real-time guidance and assistance in navigating their environment. By leveraging the capabilities of connected devices and sensors, the system can help blind individuals avoid obstacles and tripping and allow them to move more confidently and autonomously through their surroundings. This can significantly improve the quality of life and well-being of blind individuals and reduce reliance on others for assistance in navigating their environment. Second, the use of IoT technology in designing a smart shoe safety system for blind people could be considered novel and need further research due to the potential for real-time updates and connectivity. The use of connected devices and sensors can provide real-time information and alerts to the wearer, which can enhance the system&#x2019;s effectiveness in improving the safety and independence of the wearer. Finally, there may be a need for further research on a smart shoe safety system for blind people based on IoT technology due to the ongoing challenges faced by blind individuals in navigating and moving safely through their environment. While there have been some advancements in assistive technologies for blind individuals, there is still a need for more effective and innovative solutions that can improve the safety and independence of these individuals. A smart shoe safety system for blind people based on IoT technology could be a promising approach to addressing these challenges, and further research could help refine and improve such a system&#x2019;s effectiveness.</p>
<p>A smart shoe safety system for blind people based on IoT technology is a type of assistive technology designed to improve the safety and independence of blind individuals by providing real-time guidance and assistance in navigating their environment. The system could be equipped with sensors and connectivity, such as GPS and obstacle detection, to provide real-time information and alerts to the wearer. It could also be connected to a social network or support group, allowing the wearer to connect with others and share their location and status. The goal of a smart shoe safety system for blind people based on IoT technology is to improve the safety and independence of blind individuals by helping them navigate their environment with greater confidence and autonomy. By providing real-time guidance and assistance, the system can help the wearer avoid obstacles and tripping and allow them to move more confidently and independently through their surroundings. The use of IoT technology in the system design allows for real-time updates and connectivity, which can enhance the system&#x2019;s effectiveness in improving the safety and independence of the wearer.</p>
<p>Several innovations could be incorporated into a smart shoes safety system for blind people based on IoT technology:
<list list-type="order">
<list-item>
<p>Real-time guidance and assistance: Using sensors and connectivity, such as GPS, can provide real-time guidance and assistance to the wearer in navigating their environment.</p></list-item>
<list-item>
<p>Obstacle detection: Sensors, such as ultrasonic or infrared sensors, could detect obstacles in the wearer&#x2019;s path and alert them through vibrations or other cues.</p></list-item>
<list-item>
<p>Emergency alerts: The system could send an alert to a caregiver or emergency services, including a location tracking feature to help responders locate the wearer.</p></list-item>
<list-item>
<p>Social connectivity: The system could be connected to a social network or support group, allowing the wearer to connect with others and share their location and status.</p></list-item>
<list-item>
<p>Customization: The system could be designed to allow the user to customize certain features, such as the intensity of vibrations used to alert the wearer to obstacles or the types of alerts they receive.</p></list-item>
</list></p>
<p>Overall, the innovations in a smart shoe safety system for blind people based on IoT technology would improve the safety and independence of the wearer by providing real-time guidance and assistance in navigating their environment and connecting them with others for support and communication.</p>
<p>There are several potential limitations of a smart shoe safety system for blind people based on IoT technology:
<list list-type="order">
<list-item>
<p>Cost: The development and implementation of such a system could be expensive, which may limit its accessibility to some individuals.</p></list-item>
<list-item>
<p>Maintenance: The system may require regular maintenance and updates, which could burden users.</p></list-item>
<list-item>
<p>Battery life: The system may rely on a battery to power its sensors and connectivity, which could limit its use if the battery runs out.</p></list-item>
<list-item>
<p>Sensors: The accuracy of the sensors used in the system, such as GPS or obstacle detection sensors, may be limited by environmental factors or other factors, which could affect the system&#x2019;s effectiveness.</p></list-item>
<list-item>
<p>Connectivity: The system may rely on a connection to a network or other device, which could be disrupted if the connection is lost or compromised.</p></list-item>
<list-item>
<p>User adoption: Some individuals may resist using new technology or prefer other navigation and assistance methods.</p></list-item>
</list></p>
<p>Overall, while a smart shoe safety system for blind people based on IoT technology can improve the safety and independence of blind individuals significantly, some limitations may affect its effectiveness and adoption.</p>
</sec>
</sec>
<sec id="s5">
<label>5</label>
<title>Conclusion</title>
<p>The principal objective of this study is to create a safety system that detects objects or obstacles in front of users and warns them through voice messages and vibrations. The safety system employs a mix of ultrasonic sensors and a microcontroller for sensing and detecting obstacles. It also transmits the smart shoe&#x2019;s location to facilitate the movement of blind users and enhance their safety. Limits to independent mobility and navigation are major problems facing visually challenged individuals. The white cane is used primarily as a mobility aid to detect obstacles close to the ground; detecting objects above knee height with it is almost impossible, which is a significant impediment. Developments in embedded systems have opened a wide range of research into affordable and portable devices for physically challenged individuals. This safety system was designed and implemented to increase the functionality of the existing shoe, enabling the detection of knee-high obstacles. The unit comprises an ultrasonic guard and microcontroller-controlled sound with a three-meter detection range. The proximity of obstacles is communicated to the user through an interference-free multi-sound signal that conveys the nearness of barriers. The system is also capable of rapidly detecting moving obstacles. The experimental outcomes demonstrate the proposed safety system&#x2019;s ability to decrease risks and injuries for blind people when walking in public. It should be noted that people of all ages can have visual impairment problems; visual impairment is a serious condition that puts the sufferer at significant risk of injury and impedes blind people from walking independently. In this regard, the modern blind shoe can become a basic platform for the next generation of people with visual issues, helping them safely navigate indoors and outdoors.
These practical and affordable shoes can detect obstacles on the user&#x2019;s path within two meters. The shoes are also lightweight even though they are fitted with sensors and other components. Other aspects of this safety system could be improved via wireless connectivity between the system components, thereby increasing the range of the ultrasonic sensor. Furthermore, technology could be incorporated to determine the speed of approaching obstacles, and a camera together with an MPU-6050 could be used to track movement.</p>
</sec>
</body>
<back>
<sec><title>Funding Statement</title>
<p>This work is supported by the Research and Innovation Department, Skyline University College, University City of Sharjah&#x2014;P. O. Box 1797-Sharjah, UAE. Grant numbers: 1-2-2022, Dr. Ammar Almomani <ext-link ext-link-type="uri" xlink:href="https://www.skylineuniversity.ac.ae/">https://www.skylineuniversity.ac.ae/</ext-link>.</p>
</sec>
<sec sec-type="COI-statement"><title>Conflicts of Interest</title>
<p>The authors declare that they have no conflicts of interest to report regarding the present study.</p>
</sec>
<ref-list content-type="authoryear">
<title>References</title>
<ref id="ref-1"><label>[1]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>M.</given-names> <surname>Anisha</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Kirthika</surname></string-name>, <string-name><given-names>D. J.</given-names> <surname>Harline</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Thenmozhi</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Rubala</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Low-cost smart shoe for visually impaired</article-title>,&#x201D; in <conf-name>Third Int. Conf. on Intelligent Communication Technologies and Virtual Mobile Networks (ICICV)</conf-name>, <publisher-loc>Tirunelveli, India</publisher-loc>, pp. <fpage>1108</fpage>&#x2013;<lpage>1111</lpage>, <year>2021</year>. </mixed-citation></ref>
<ref id="ref-2"><label>[2]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S.</given-names> <surname>Suman</surname></string-name>, <string-name><given-names>S.</given-names> <surname>Mishra</surname></string-name>, <string-name><given-names>K. S.</given-names> <surname>Sahoo</surname></string-name> and <string-name><given-names>A. J. M. I. S.</given-names> <surname>Nayyar</surname></string-name></person-group>, &#x201C;<article-title>Vision navigator: A smart and intelligent obstacle recognition model for visually impaired users</article-title>,&#x201D; <source>Mobile Information Systems</source>, vol. <volume>2022</volume>, no. <issue>4</issue>, pp. <fpage>1</fpage>&#x2013;<lpage>15</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-3"><label>[3]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>R. K.</given-names> <surname>Dharme</surname></string-name>, <string-name><given-names>J. R.</given-names> <surname>Surywanshi</surname></string-name>, <string-name><given-names>H. C.</given-names> <surname>Kunwar</surname></string-name> and <string-name><given-names>Y. H.</given-names> <surname>Palve</surname></string-name></person-group>, &#x201C;<article-title>Smart shoe provides vision to visionless person</article-title>,&#x201D; in <conf-name>ICT Systems and Sustainability</conf-name>, <publisher-loc>Singapore</publisher-loc>, pp. <fpage>131</fpage>&#x2013;<lpage>137</lpage>, <year>2022</year>. </mixed-citation></ref>
<ref id="ref-4"><label>[4]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>R.</given-names> <surname>Biswas</surname></string-name> and <string-name><given-names>S. D.</given-names> <surname>Joy</surname></string-name></person-group>, &#x201C;<article-title>The assisting pair-a new approach for assist the blind people</article-title>,&#x201D; <source>International Research Journal of Modernization in Engineering Technology and Science</source>, vol. <volume>3</volume>, no. <issue>5</issue>, pp. <fpage>280</fpage>&#x2013;<lpage>286</lpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-5"><label>[5]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>F.</given-names> <surname>Kamaruddin</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Mahmood</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Razak</surname></string-name> and <string-name><given-names>N.</given-names> <surname>Zakaria</surname></string-name></person-group>, &#x201C;<article-title>Smart assistive shoes with internet of things implementation for visually impaired people</article-title>,&#x201D; in <conf-name>Journal of Physics: Conf. Series</conf-name>, <publisher-loc>Perlis, Malaysia</publisher-loc>, pp. <fpage>012030</fpage>, <year>2021</year>. </mixed-citation></ref>
<ref id="ref-6"><label>[6]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K.</given-names> <surname>Kumar</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Kumar</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Kumar</surname></string-name>, <string-name><given-names>M. A.</given-names> <surname>Mohammed</surname></string-name>, <string-name><given-names>A. S.</given-names> <surname>Al-Waisy</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Dimensions of internet of things: technological taxonomy architecture applications and open challenges&#x2014;A systematic review</article-title>,&#x201D; <source>Wireless Communications and Mobile Computing</source>, vol. <volume>2022</volume>, no. <issue>3</issue>, pp. <fpage>9148373</fpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-7"><label>[7]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>T.</given-names> <surname>Chava</surname></string-name>, <string-name><given-names>A. T.</given-names> <surname>Srinivas</surname></string-name>, <string-name><given-names>A. L.</given-names> <surname>Sai</surname></string-name> and <string-name><given-names>V.</given-names> <surname>Rachapudi</surname></string-name></person-group>, &#x201C;<article-title>IoT based smart shoe for the blind</article-title>,&#x201D; in <conf-name>6th Int. Conf. on Inventive Computation Technologies (ICICT)</conf-name>, <publisher-loc>Coimbatore, India</publisher-loc>, pp. <fpage>220</fpage>&#x2013;<lpage>223</lpage>, <year>2021</year>. </mixed-citation></ref>
<ref id="ref-8"><label>[8]</label><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><given-names>N. G.</given-names> <surname>Bourbakis</surname></string-name>, <string-name><given-names>I. P.</given-names> <surname>Ktistakis</surname></string-name> and <string-name><given-names>P.</given-names> <surname>Khursija</surname></string-name></person-group>, &#x201C;<chapter-title>Smart shoes for assisting people: A short survey</chapter-title>,&#x201D; in <source>Advances in Assistive Technologies</source>. <publisher-name>Springer</publisher-name>, pp. <fpage>183</fpage>&#x2013;<lpage>202</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-9"><label>[9]</label><mixed-citation publication-type="book"><person-group person-group-type="author"><string-name><given-names>N. G.</given-names> <surname>Bourbakis</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Esposito</surname></string-name>, <string-name><given-names>G. A.</given-names> <surname>Tsihrintzis</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Virvou</surname></string-name> and <string-name><given-names>L. C.</given-names> <surname>Jain</surname></string-name></person-group>, &#x201C;<chapter-title>Introduction to advances in assistive technologies</chapter-title>,&#x201D; in <source>Advances in Assistive Technologies</source>. <publisher-name>Springer</publisher-name>, pp. <fpage>1</fpage>&#x2013;<lpage>7</lpage>, <year>2022</year>.</mixed-citation></ref>
<ref id="ref-10"><label>[10]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T. C.</given-names> <surname>Callari</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Moody</surname></string-name>, <string-name><given-names>P.</given-names> <surname>Magee</surname></string-name> and <string-name><given-names>D.</given-names> <surname>Yang</surname></string-name></person-group>, &#x201C;<article-title>Smart-not only intelligent!&#x2019;co-creating priorities and design direction for &#x2018;smart&#x2019;footwear to support independent ageing</article-title>,&#x201D; <source>International Journal of Fashion Design, Technology and Education</source>, vol. <volume>12</volume>, no. <issue>3</issue>, pp. <fpage>313</fpage>&#x2013;<lpage>324</lpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-11"><label>[11]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>B. M.</given-names> <surname>Eskofier</surname></string-name>, <string-name><given-names>S. I.</given-names> <surname>Lee</surname></string-name>, <string-name><given-names>M.</given-names> <surname>Baron</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Simon</surname></string-name>, <string-name><given-names>C. F.</given-names> <surname>Martindale</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>An overview of smart shoes in the internet of health things: Gait and mobility assessment in health promotion and disease monitoring</article-title>,&#x201D; <source>Applied Sciences</source>, vol. <volume>7</volume>, no. <issue>10</issue>, pp. <fpage>986</fpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-12"><label>[12]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>T.</given-names> <surname>Chandekar</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Chouhan</surname></string-name>, <string-name><given-names>R.</given-names> <surname>Gaikwad</surname></string-name>, <string-name><given-names>H.</given-names> <surname>Gosavi</surname></string-name> and <string-name><given-names>S.</given-names> <surname>Darade</surname></string-name></person-group>, &#x201C;<article-title>Implementation of obstacle detection and navigation system for visually impaired using smart shoes</article-title>,&#x201D; <source>International Research Journal of Engineering and Technology (IRJET)</source>, vol. <volume>4</volume>, no. <issue>4</issue>, pp. <fpage>2125</fpage>&#x2013;<lpage>2129</lpage>, <year>2017</year>.</mixed-citation></ref>
<ref id="ref-13"><label>[13]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>A.</given-names> <surname>Dr&#x0103;gulinescu</surname></string-name>, <string-name><given-names>A. -M.</given-names> <surname>Dr&#x0103;gulinescu</surname></string-name>, <string-name><given-names>G.</given-names> <surname>Zinc&#x0103;</surname></string-name>, <string-name><given-names>D.</given-names> <surname>Bucur</surname></string-name>, <string-name><given-names>V.</given-names> <surname>Feie&#x0219;</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Smart socks and in-shoe systems: State-of-the-art for two popular technologies for foot motion analysis, sports, and medical applications</article-title>,&#x201D; <source>Sensors</source>, vol. <volume>20</volume>, no. <issue>15</issue>, pp. <fpage>4316</fpage>, <year>2020</year>; <pub-id pub-id-type="pmid">32748872</pub-id></mixed-citation></ref>
<ref id="ref-14"><label>[14]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>K. C.</given-names> <surname>Jung</surname></string-name>, <string-name><given-names>J. H.</given-names> <surname>Son</surname></string-name> and <string-name><given-names>S. H.</given-names> <surname>Chang</surname></string-name></person-group>, &#x201C;<article-title>Self-powered smart shoes with tension-type ribbon harvesters and sensors</article-title>,&#x201D; <source>Advanced Materials Technologies</source>, vol. <volume>6</volume>, no. <issue>2</issue>, pp. <fpage>2000872</fpage>, <year>2021</year>.</mixed-citation></ref>
<ref id="ref-15"><label>[15]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>V. R.</given-names> <surname>Reddy</surname></string-name> and <string-name><given-names>K. R.</given-names> <surname>Reddy</surname></string-name></person-group>, &#x201C;<article-title>Navigation aid with object positioning for blind using Rfid and Ir sensor: A proposal</article-title>,&#x201D; <source>i-Manager&#x2019;s Journal on Communication Engineering and Systems</source>, vol. <volume>8</volume>, no. <issue>2</issue>, pp. <fpage>22</fpage>, <year>2019</year>.</mixed-citation></ref>
<ref id="ref-16"><label>[16]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>S. -H</given-names> <surname>Seo</surname></string-name> and <string-name><given-names>S. -W</given-names> <surname>Jang</surname></string-name></person-group>, &#x201C;<article-title>Design and implementation of a smart shoes module based on arduino</article-title>,&#x201D; <source>Journal of the Korea Institute of Information and Communication Engineering</source>, vol. <volume>19</volume>, no. <issue>11</issue>, pp. <fpage>2697</fpage>&#x2013;<lpage>2702</lpage>, <year>2015</year>.</mixed-citation></ref>
<ref id="ref-17"><label>[17]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>P. H.</given-names> <surname>Truong</surname></string-name>, <string-name><given-names>S.</given-names> <surname>You</surname></string-name>, <string-name><given-names>S. -H.</given-names> <surname>Ji</surname></string-name> and <string-name><given-names>G. -M.</given-names> <surname>Jeong</surname></string-name></person-group>, &#x201C;<article-title>Wearable system for daily activity recognition using inertial and pressure sensors of a smart band and smart shoes</article-title>,&#x201D; <source>International Journal of Computers Communications &#x0026; Control</source>, vol. <volume>14</volume>, no. <issue>6</issue>, pp. <fpage>726</fpage>&#x2013;<lpage>742</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-18"><label>[18]</label><mixed-citation publication-type="conf-proc"><person-group person-group-type="author"><string-name><given-names>W.</given-names> <surname>Wu</surname></string-name>, <string-name><given-names>N.</given-names> <surname>Lei</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Tang</surname></string-name></person-group>, &#x201C;<article-title>Smart shoes for obstacle detection</article-title>,&#x201D; in <conf-name>Int. Conf. on Computer Engineering and Networks</conf-name>, <publisher-loc>Xi&#x2019;an, China</publisher-loc>, pp. <fpage>1319</fpage>&#x2013;<lpage>1326</lpage>, <year>2020</year>. </mixed-citation></ref>
<ref id="ref-19"><label>[19]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>J.</given-names> <surname>Yang</surname></string-name> and <string-name><given-names>Y.</given-names> <surname>Yin</surname></string-name></person-group>, &#x201C;<article-title>Novel soft smart shoes for motion intent learning of lower limbs using LSTM with a convolutional autoencoder</article-title>,&#x201D; <source>IEEE Sensors Journal</source>, vol. <volume>21</volume>, no. <issue>2</issue>, pp. <fpage>1906</fpage>&#x2013;<lpage>1917</lpage>, <year>2020</year>.</mixed-citation></ref>
<ref id="ref-20"><label>[20]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Z.</given-names> <surname>Zhou</surname></string-name>, <string-name><given-names>L.</given-names> <surname>Weng</surname></string-name>, <string-name><given-names>T.</given-names> <surname>Tat</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Libanori</surname></string-name>, <string-name><given-names>Z.</given-names> <surname>Lin</surname></string-name> <etal>et al.</etal></person-group><italic>,</italic> &#x201C;<article-title>Smart insole for robust wearable biomechanical energy harvesting in harsh environments</article-title>,&#x201D; <source>ACS Nano</source>, vol. <volume>14</volume>, no. <issue>10</issue>, pp. <fpage>14126</fpage>&#x2013;<lpage>14133</lpage>, <year>2020</year>; <pub-id pub-id-type="pmid">33044812</pub-id></mixed-citation></ref>
<ref id="ref-21"><label>[21]</label><mixed-citation publication-type="journal"><person-group person-group-type="author"><string-name><given-names>Y.</given-names> <surname>Zou</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Libanori</surname></string-name>, <string-name><given-names>J.</given-names> <surname>Xu</surname></string-name>, <string-name><given-names>A.</given-names> <surname>Nashalian</surname></string-name> and <string-name><given-names>J.</given-names> <surname>Chen</surname></string-name></person-group>, &#x201C;<article-title>Triboelectric nanogenerator enabled smart shoes for wearable electricity generation</article-title>,&#x201D; <source>Research</source>, vol. <volume>2020</volume>, no. <issue>17</issue>, pp. <fpage>7158953</fpage>, <year>2020</year>; <pub-id pub-id-type="pmid">33623909</pub-id></mixed-citation></ref>
</ref-list>
</back>
</article>

