Smart Mind-based Approach to Control Wheelchair Wirelessly

The number of elderly and disabled people worldwide has increased, and their day-to-day activities depend on the help of others. Improving the quality of life of these people has become an important responsibility of society, and it is the role of technology specialists to make their lives as normal and easy as possible so that they can perform their day-to-day activities at the relevant time without others' help. Many researchers have proposed solutions, but these have limitations such as poor performance and usability. Hence, there is a need to design a wheelchair that is smart and easy to use for the elderly and people with partial or full quadriplegia. In this paper, we propose a smart mind-based wireless controlled wheelchair (SMWCW) approach that controls the motion of a wheelchair entirely wirelessly through brain signals and eye blinks to help partially quadriplegic individuals perform their daily activities easily. In the SMWCW, NeuroSky's electroencephalography (EEG) device, worn by the user sitting in the wheelchair, captures brain activity and the frequency of eye blinks, and these data are translated into movement commands by an Arduino microcontroller unit that is directly connected to the wheelchair. The proposed approach has been simulated, and the results show its effectiveness and applicability for use by most physically disabled persons.


Introduction
Users of wheelchairs are the most visible among physically disabled people. The elderly and partially quadriplegic individuals are the groups with the highest rates of both manual and electric wheelchair use. (1) Wheelchair users, particularly the elderly and partially quadriplegic individuals, face difficulties in their day-to-day activities, such as maneuvering their electric or mechanical wheelchair.
Nowadays, a wide range of technologies that help the disabled and physically challenged are available, and many things around us depend heavily on technologies that make life easier and more flexible. (1) These technologies give solutions to problems that are difficult for humans to solve. Control systems are the most recent of these technologies designed to help the physically challenged. (5) According to the results of a recently completed clinical trial, assistive technology enables individuals to maneuver a powered wheelchair via a variety of guided systems, such as a mouse cursor or joystick and a tactile screen, and systems based on voice recognition can be operated by individuals with a certain amount of upper body mobility. (2) Moreover, technologies provide solutions in medical fields, which are our concern in this paper. Therefore, many attempts have been made to develop various solutions that help those with disabilities to move and to minimize their dependence on others.
The elderly and partially quadriplegic individuals suffering from severe paralysis may not be able to use present technologies, since these require accurate control. Earlier systems aimed to improve the lifestyle of the physically challenged by moving a wheelchair in accordance with signals obtained from hand movements through an accelerometer sensor. Since hand movements are limited for these users, we aim to explore brain signals instead to help maneuver the wheelchair.
An accelerometer sensor acquires specific activities, converts them to analog signals using encoder and decoder protocols, and sends them to other connected devices through an RF transmitter and an RF receiver. A microcontroller (AT89C51) is used to analyze the encoded signals and translate them into useful commands. An Arduino microcontroller (ATMEGA328P-PU) is directly connected to the devices to fully control them, wirelessly or by wire. A brain-computer interface (BCI) provides a direct interface between the human brain and the computer. A BCI enables communication between the human brain and physical devices by translating different brain signals into commands in real time. (2) BCI techniques are either invasive or noninvasive. Noninvasive techniques have become a hot research area and are increasingly popular. Various techniques are available, such as single-photon emission computed tomography (SPECT), functional magnetic resonance imaging (fMRI), positron emission tomography (PET), magnetoencephalography (MEG), optical brain imaging, single-neuron recording using microelectrodes, and electroencephalography (EEG). (3,13) EEG is the most common tool for a noninvasive BCI. NeuroSky's EEG device can be placed on the user's scalp to acquire and measure brain signals related to the scalp potential differences for various complex actions. EEG signal acquisition techniques and devices have been made more compact, handy, and wireless.
In this paper, using the above-mentioned techniques, we devise a smart mind-based wireless controlled wheelchair (SMWCW) approach to solving the challenges encountered by the physically disabled and assisting the elderly and fully or partially quadriplegic people to perform their daily life activities on their own. The SMWCW consists of a model that performs in collaboration between NeuroSky's EEG device and the Arduino microcontroller unit, which is directly connected to the wheelchair and moves it. Brain signals and eye blinks of people are used to fully control the wheelchair wirelessly.
The major objective of the SMWCW is to overcome physical disability challenges and to improve the quality of life of the elderly and partially quadriplegic people without the need for help from others.
The main contributions of the SMWCW are as follows.
(1) We introduce a new SMWCW approach to improving the day-to-day quality of life of elderly or partially quadriplegic people. (2) We provide a cost-effective and easy solution that gives users full wireless control of a wheelchair using mind signals and eye blinks. (3) We use a microcontroller to identify and analyze mind signals and eye blinks and translate them into wheelchair control commands without the need for a whole computer.
The SMWCW is an efficient and hybrid approach. The basic approaches available today make use of traditional wiring techniques that are difficult for elderly or partially quadriplegic people to use. However, our approach is wireless and a hybrid of the two above-mentioned mechanisms and addresses the problems in the use and control of a wheelchair by the elderly and disabled users. The rest of the paper is organized as follows. In Sect. 2, we explain the existing works. In Sect. 3, we present the proposed approach (SMWCW). In Sect. 4, we describe the simulation, implementation and discussion. The results and discussion are provided in Sect. 5, and we conclude the article in Sect. 6.

Literature Review
Over the years, in several research studies, approaches such as joystick-based, gesture-based, and chain-based control mechanisms have been proposed to help people with physical disabilities move and to reduce their dependence on others. A mind-controlled wheelchair solution (3) has been proposed to control a wheelchair. A headset that measures brain activity and eSense meter values has been used, while a BCI translates a user's wishes or commands into device commands to accomplish the user's intent. The core drawback of this solution is the wired transmission of signals and control commands, which makes it difficult for elderly and disabled people to handle and operate the wheelchair.
In Ref. 4, the authors proposed a solution that provides direct communication between patients and doctors by translating different patterns of brain activity into commands in real time. Doctors can easily obtain information about a patient's brain waves using this solution. The main drawback of this solution is that the transmission range over which brainwave and movement information can be sent is very small, since the patient is seated in the wheelchair, and the signal levels are normally low for disabled people.
Brain-based control of the wheelchair using a NeuroSky brainwave sensor in a headset device has been proposed in Ref. 5. The proposed sensor integrates brain activity and the frequency of eye blinks to control the wheelchair using wired devices. The directions of the wheelchair are limited in this solution, and full wired control is necessary.
Previously, the authors attempted to integrate spoken words with brainwaves to develop a model to control a wheelchair. (6) EEG was employed in the proposed model to examine the neuroelectrical correlates of speech quality perception. The model is cost-effective but has low reliability, and it is inapplicable for fully quadriplegic people.
A highly accurate device developed by EMOTIV was used to measure brain signals, enabling wheelchair control through thoughts after some training of the patient or user. (7) Unlike previous solutions, this approach also relies on wireless connection between the device and the computer. However, a computer is still required as an intermediary between the device measuring the signals of the brain and the microcontroller. In addition, placing the device on the user and putting liquids on the sensors of the device were inconvenient for the users. Moreover, this device is also more expensive than the others.
In the device presented in Ref. 8, multiple brain signals were used to control the movement of the wheelchair. According to the results of this study, this device has high accuracy. However, it requires a computer to read brain signals and wired connection. Therefore, it is difficult for people with disabilities to use, in addition to being high in cost and requiring periodic maintenance.
Attempts have also been made to develop a wheelchair connected to a computer and controlled by brain signals. (9,10) The proposed system is limited to reading only two signals from the brain, which restricts the number of moves required to control the chair, and therefore, complete control of the wheelchair is not achieved. The use of eye movements instead of brain signals in the system to increase the number of control commands was also attempted, but the accuracy was low and sufficient training of the user was required.
A wheelchair robot was designed in Ref. 11. The proposed model uses the beta wave from a MindWave sensor and an Arduino to control the wheelchair. Controlling a wheelchair with eye movement was proposed in Ref. 12. That system requires a camera and intermediate software that translates eye movement into commands and then sends these commands to a computer or a microcontroller for execution.
A smart wheelchair prototype based on hand gestures was proposed in Ref. 14 to assist disabled people in controlling the wheelchair. In this prototype, many wired hardware devices are required, making it inconvenient for users.
In Ref. 15, a hand-gesture-based approach using a touch sensor has been proposed for users to control a wheelchair at a low cost. This approach uses wired hand gesture hardware, which makes its use difficult for the elderly and physically disabled people.
In Ref. 16, the authors proposed a hand-gesture-based approach to control a wheelchair. The idea of this approach relies on using GPS and GSM to identify locations, which makes it unreliable in noisy environments.
A hand-gesture-based approach using a Raspberry Pi and image processing techniques to control a wheelchair was proposed in Ref. 17. This approach relies on a USB web camera to recognize gestures. Its main drawback is that it is not easy for users to handle and operate, because it requires a wired web camera to observe the operations. A similar approach was proposed in Ref. 18, in which the model reduces the supply voltage and operational cost of using and controlling the wheelchair with an AVR microcontroller.
In Ref. 19, an accelerometer-based gesture approach was proposed to control a wheelchair using GPS and GSM navigation. In this approach, wheelchair information is transmitted to a control room and the location of the user is determined using a navigation application.
A framework to help those who are unable to walk owing to a physiological or physical illness is presented in Ref. 20. This framework relies on a computer to control a wheelchair, so it will be difficult to use for the elderly and physically disabled people.
In Ref. 21, a model for a hand gesture-wired control user interface with hand movements detected with an accelerometer sensor was proposed to control the direction of a wheelchair. The main drawback of this model is that the transmission control is wired, which makes it difficult for users to handle and operate. Table 1 shows the core limitations and drawbacks of the current solutions proposed in the literature.

Proposed Approach
In this paper, we propose the SMWCW approach to solve the challenges faced by the elderly and fully or partially quadriplegic people and to assist them to perform their daily life activities on their own.
We take the following set of assumptions in our approach: (1) The NeuroSky EEG system to capture the brain activity and frequency of eye blinks is operated by the user sitting in a wheelchair. (2) Brain signals and the frequency of eye blinks are translated into movement commands by the Arduino microcontroller unit, which is directly connected to the wheelchair, to move the wheelchair. (3) There are five commands to provide adequate and proper control, namely, the forward, backward, turn right, turn left, and stop commands. (4) The number of brain signals that we were able to measure and analyze using the microcontroller is three, namely, focus, indication of meditation, and eye blink signals.

Core components of SMWCW
In the SMWCW approach, the wheelchair is controlled using EEG signals acquired from the brain. The signals, together with the eye blink signals, are then translated into commands by the ATMEGA328P-PU AVR microcontroller, programmed with the Arduino environment, to drive the servo motors and control their speed. The microcontroller is connected to a sensor to determine the distance at which the wheelchair should stop in the case of edges and other obstacles. The brain signal data and data processing status are shown on the LCD connected to the microcontroller. Figure 1 shows the core components and working mechanism of the SMWCW.
The following subsections explain the process of reading brain signals using the NeuroSky unit and translating them into commands to control the wheelchair.

Synchronization between NeuroSky unit and ATMEGA328P-PU microcontroller
First, the NeuroSky protocol with the eSense technique is used to measure the intensities of concentration and meditation. This protocol is also used to send packets of bytes, which are analyzed by reading their data using a high-speed 8-bit microcontroller.
Packets of bytes consist of three parts.
• Packet Header: the synchronization part, which consists of 3 bytes: two synchronization bytes and one byte giving the length of the main part (PLENGTH).
• Packet Payload: the main part carrying the concentration and meditation data. The length of this part should be between 0 and 169 bytes.
• Packet Checksum: one byte used to validate the sent data using the checksum algorithm.
Table 2 shows the main parts of the packets that are sent, in addition to the definitions of the bytes and the values of the configured bytes for each part.
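As an illustrative sketch in C, the packet validation described above can be implemented as follows. The function assumes the NeuroSky convention that the checksum byte is the ones' complement of the lowest 8 bits of the sum of the payload bytes; the bound of 169 bytes follows the payload limit stated above.

```c
#include <stdint.h>
#include <stddef.h>

/* Validate a packet payload against its checksum byte.
 * The checksum is the ones' complement of the lowest 8 bits of the
 * sum of all payload bytes. Returns 1 if valid, 0 otherwise. */
int payload_checksum_ok(const uint8_t *payload, size_t plength, uint8_t checksum)
{
    if (plength > 169)          /* PLENGTH must be 0..169 per the protocol */
        return 0;
    uint32_t sum = 0;
    for (size_t i = 0; i < plength; i++)
        sum += payload[i];
    return (uint8_t)(~sum & 0xFF) == checksum;
}
```

For example, a two-byte payload {0x04, 0x32} sums to 0x36, so the expected checksum byte is 0xC9.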
The bytes sent are used as representations of a code that performs a specific function. These bytes can be represented by a decimal or a hexadecimal value. Table 3 shows the important byte values and their functions, including the command byte used to start the connection and the following status values:
• 208 (0xD0): the NeuroSky signal measurement device is connected.
• 209 (0xD1): the device was not found in the surrounding area.
• 210 (0xD2): the device for measuring brain signals was not connected.
• 211 (0xD3): the connection request and data acquisition were denied.

Synchronization and data reading algorithm
An accurate algorithm was designed to work with the protocol used by the brain signal measurement device. Data on the intensities of concentration, meditation, and weak signals are read. Figure 2 shows the flowchart of the proposed algorithm for achieving synchronization and communication between the ATMega328P-PU microcontroller and the NeuroSky headset, as well as for reading the bytes of brain signals that will be used to control the wheelchair.
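The synchronization and reading loop of Fig. 2 can be sketched in C as below. This is a simplified illustration that scans a received byte buffer for the two-sync-byte header, verifies PLENGTH and the checksum, and extracts the single-byte eSense rows; the sync byte 0xAA and the row codes 0x02 (poor signal), 0x04 (attention), and 0x05 (meditation) are those of NeuroSky's serial protocol, and multi-byte rows are not handled here.

```c
#include <stdint.h>
#include <stddef.h>

#define TG_SYNC 0xAA  /* synchronization byte */

typedef struct {
    uint8_t attention;   /* concentration eSense, 0..100 */
    uint8_t meditation;  /* meditation eSense, 0..100 */
    uint8_t poor_signal; /* weak-signal indicator, 0 = good contact */
} tg_values;

/* Scan a received byte stream for one valid packet and extract the
 * single-byte eSense values. Returns 1 when a packet was parsed. */
int tg_parse_stream(const uint8_t *buf, size_t len, tg_values *out)
{
    for (size_t i = 0; i + 3 < len; i++) {
        if (buf[i] != TG_SYNC || buf[i + 1] != TG_SYNC)
            continue;                       /* wait for [SYNC][SYNC] */
        uint8_t plen = buf[i + 2];
        if (plen > 169 || i + 3 + plen + 1 > len)
            continue;                       /* bad length or truncated */
        const uint8_t *p = &buf[i + 3];
        uint32_t sum = 0;
        for (size_t j = 0; j < plen; j++)
            sum += p[j];
        if ((uint8_t)(~sum & 0xFF) != p[plen])
            continue;                       /* checksum mismatch */
        /* walk [CODE][VALUE] pairs; only single-byte rows are assumed */
        for (size_t j = 0; j + 1 < plen; j += 2) {
            switch (p[j]) {
            case 0x02: out->poor_signal = p[j + 1]; break;
            case 0x04: out->attention   = p[j + 1]; break;
            case 0x05: out->meditation  = p[j + 1]; break;
            }
        }
        return 1;
    }
    return 0;
}
```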

Identification of eye blinks and their conversion into commands
The main contribution of this paper is the identification of eye blinks and their analysis using a microcontroller, without the need for a computer, to control the wheelchair. We employ the eSense technique in the NeuroSky device to measure the intensities of concentration, meditation, and signal quality; that is, the signal intensity is read and analyzed by the microcontroller to detect the occurrence of an eye blink and the closure of the eye, which are used in choosing the movement direction of the wheelchair to the right or left, as illustrated in the following steps.
(1) A USB dongle device was connected to a computer, before being modified to connect directly to a microcontroller, to monitor and record brain signals during eye blinks.
(2) The recorded signals were used to determine the degree of signal intensity reduction, so that the appropriate eye blink intensity could be determined. The low-intensity signal was then compared with the byte value of the low-intensity signal in the eSense protocol.
(3) The USB dongle device was modified to connect it to the ATMega328P-PU microcontroller, and the code instructions for analyzing the brain signals were stored in the microcontroller, with the possibility of displaying the read and analyzed data on a computer using a serial monitor. This confirms when the eyes are closed and allows the blink intensity to be calculated from the intensity of the weak signal. These data are sent from the microcontroller through the RX port to the USB port of the computer using a USB 2.0 to RS232/UART converter.
(4) The byte value that represents the intensity of weak signals matching the required eye closure intensity is determined by comparing the change in the concentration intensity with the intensity of weak signals. The comparison shows that the appropriate value for this byte is greater than 23 and less than 53.
Figure 3 illustrates the EEG signals for eye blinks and the EEG frequency bands.
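The byte-value window determined in step (4) can be expressed as a small C predicate; the bounds 23 and 53 are taken directly from the comparison described above.

```c
/* Classify a weak-signal intensity reading, following the byte-value
 * window determined experimentally: values of 23 or less indicate a
 * natural blink, values of 53 or more indicate a detached sensor or a
 * read error, and only the range in between counts as a deliberate blink. */
int is_deliberate_blink(int weak_signal)
{
    return weak_signal > 23 && weak_signal < 53;
}
```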

Conversion of brain signals into commands
Brain signals are converted into commands using brain electrodes through which the intensities of concentration, meditation, and eye closure are measured. To convert signals into commands, we must first know the number of commands required to control the wheelchair. There are five commands to provide adequate and proper control:
• forward command,
• backward command,
• turn right command,
• turn left command, and
• stop command.
On the other hand, the number of brain signals that we were able to measure and analyze using the microcontroller is three, as follows:
• focus signal,
• indication of meditation signal, and
• eye blink signal.
Therefore, we modified and merged some commands to control the wheelchair using the three brain signals analyzed by the microcontroller. After modification, the commands are as follows:
• forward command,
• backward command,
• direction commands (left, right, front, back), and
• stop command.
Each command will be discussed separately, and how the brain signal is translated to match the command to be executed will be explained.

(a) Forward Command
The appropriate brain signal for this command is the focus signal, because it corresponds to the thought of moving forward. As mentioned earlier, the concentration intensities range from 1 to 100. To facilitate their calculation, we convert the intensities into decimal numbers by dividing them by 100. Thus, the initial intensity of concentration is determined, and when the measured concentration intensity is greater than the given initial intensity, a forward command is sent to the two motors. The initial intensity of concentration was initially set to 0.45, but when the device was tested by more than one user, the concentration intensity was found to vary from person to person. Therefore, a variable resistance was used to control the initial intensity. One side of the variable resistance was connected to an analog port of the ATMega328P-PU microcontroller and the other side was connected to the ground.
Equation (1) was used to determine the initial intensity from the analog reading of the variable resistance and to limit it between the low and high intensities; it should not be equal to zero:

I_initial = V_analog / 1023,  0 < I_initial ≤ 1.  (1)

Figure 4 shows the flowchart for translating the focus signal into a command to move the motors forward.
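A minimal C sketch of the forward-command decision, under the assumptions above: the 10-bit ADC reading of the variable resistance (0 to 1023 on the ATMega328P-PU) is normalized to give the adjustable initial intensity, and the clamp limits 0.01 and 0.99 are illustrative values chosen only to keep the threshold nonzero and below the maximum.

```c
/* Compute the adjustable initial (threshold) intensity from the 10-bit
 * analog reading of the variable resistance, clamped so it is never zero. */
double threshold_from_adc(int adc)              /* adc in 0..1023 */
{
    double t = (double)adc / 1023.0;
    if (t < 0.01) t = 0.01;                     /* must not be zero */
    if (t > 0.99) t = 0.99;
    return t;
}

/* Issue the forward command when the normalized concentration intensity
 * exceeds the threshold set by the variable resistance. */
int forward_command(int attention, int adc)     /* attention in 1..100 */
{
    return (attention / 100.0) > threshold_from_adc(adc);
}
```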

(b) Backward Command
The appropriate brain signal for this command is the meditation signal, which has the same properties as the focus signal. For the backward command, a fixed initial intensity is used because all the analog control ports are in use. Therefore, a constant initial intensity was set after experiments on a number of persons, and it can be modified by the user depending on the measured intensity to facilitate the control of the wheelchair. This intensity is equal to 0.85, which means that a high level of meditation is needed to move the motors backward.
Because of the overlap of the focus and meditation signals, we designed the following strategy to separate the signals and accurately determine the signal intended by the user.
• Store the intensity of concentration in one variable and the intensity of meditation in another variable.
• Compare the two values: if the intensity of concentration is greater than that of meditation, the user is focusing, indicating the forward command; if the intensity of meditation is higher than that of focus, the user is meditating, indicating the backward command.
• For the backward command, the intensity of meditation is compared with the initial intensity of meditation. If the intensity of meditation is greater than the initial value (0.85), the motors are moved backward; otherwise, the motors are stopped.
Figure 5 shows the flowchart used to implement the strategy of comparing the focus and meditation signals and moving the motors backward and forward.
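The separation strategy above reduces to a small decision function; a sketch in C, with intensities normalized to the 0 to 1 range used in the text:

```c
typedef enum { CMD_STOP, CMD_FORWARD, CMD_BACKWARD } motion_cmd;

/* Separate the overlapping focus and meditation signals: whichever
 * intensity is larger indicates the user's intent, and motion is only
 * triggered when that intensity also exceeds its own initial (threshold)
 * value. Intensities are normalized to 0..1. */
motion_cmd decide_motion(double focus, double meditation,
                         double focus_thr, double med_thr)
{
    if (focus > meditation)
        return focus > focus_thr ? CMD_FORWARD : CMD_STOP;
    if (meditation > focus)
        return meditation > med_thr ? CMD_BACKWARD : CMD_STOP;
    return CMD_STOP;
}
```

With the thresholds from the text (0.45 for focus, 0.85 for meditation), a focus intensity of 0.6 against a meditation intensity of 0.3 yields the forward command.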

(c) Direction Commands
Because of the lack of a further brain signal that can be analyzed using the microcontroller, the two separate commands for moving the wheelchair to the right or left were removed. To realize these two movements, they were replaced by a command that determines the direction first, after which the motors are driven by the forward-movement command. The blinking of the eyes is used, and since the blink of each eye cannot be detected separately, the direction is defined as follows.
(1) When there is no blinking of the eyes, the direction is forward, so when the motor-move command is used, the two motors move forward.
(2) When a single blink occurs, the direction is changed to the right. If the motor-move command is used, the left motor rotates while the right motor stops rotating to act as the axis that turns the wheelchair to the right.
(3) When two consecutive eye blinks occur, the direction is changed to the left. In contrast to the right turn, the right motor rotates forward whereas the left motor remains stopped, turning the SMWCW to the left when the motors move forward.
(4) To change the direction to backward, the blinking of the eyes is not used, because the meditation signal is used instead. Thus, the command to change the direction to backward is independent of the rest of the commands.
(5) To return to the forward direction, one or more blinks should occur while the movement direction is to the left or right.
After enabling movement in the three directions (front, right, left) using the number of eye blinks, we designed a number of algorithms and experimented with a number of cases to achieve simple and effective direction selection, as illustrated in Fig. 6, which shows the flowchart for determining the directions using the eye blink signals. The following method was used to determine the number of eye blinks and the time between the first and second blinks.
(1) When a blink occurs, its intensity is checked to confirm that the blink was intended by the user and is not a natural blink, by examining the intensity of weak signals. An intensity of less than 23 means a natural eye blink, not a deliberate one, whereas an intensity greater than 53 indicates separation of the sensor from the forehead or an erroneous reading of the signal.
(2) After making sure that the blink was intended by the user, the time at which it occurred is recorded using special code instructions.
(3) The time recorded at the first blink is used to provide a suitable period for the user to make a second blink to select the left direction, and to ensure that sufficient time has passed before concluding that there was only one blink, indicating the right direction.
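The blink-counting rules above can be sketched as a pure C function over recorded blink timestamps. The 1500 ms waiting window is an assumed illustrative value; the actual period is tuned experimentally as described in the text.

```c
#include <stdint.h>

typedef enum { DIR_FRONT, DIR_RIGHT, DIR_LEFT } heading;

#define BLINK_WINDOW_MS 1500  /* assumed waiting window for a second blink */

/* Decide the heading from the deliberate blinks observed: no blink keeps
 * the forward heading, one blink selects right, and a second blink that
 * falls within the waiting window after the first selects left. */
heading heading_from_blinks(const uint32_t *blink_times_ms, int n_blinks)
{
    if (n_blinks == 0)
        return DIR_FRONT;
    if (n_blinks >= 2 &&
        blink_times_ms[1] - blink_times_ms[0] <= BLINK_WINDOW_MS)
        return DIR_LEFT;
    return DIR_RIGHT;
}
```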

(d) Stop Command
Since the motors are motionless unless the measured intensity of focus or meditation is higher than the corresponding initial intensity, the stop command does not need a special signal. If the user wants to stop forward motion, the measured focus intensity should drop below the initial concentration intensity. Similarly, for the backward motion of the wheelchair, the measured intensity of meditation is reduced to a level below the initial intensity of meditation. To control the wheelchair correctly, the user must undergo training in the use of the device, especially to achieve the stop command, which requires the ability to control the intensity of focus or meditation. Therefore, a 4×20 LCD screen is used to facilitate the process and make the commands clear to the user by showing the following information.
• Att: the intensity of concentration
• Med: the intensity of meditation
• Dir: the direction of movement
• Blink: the number of blinks
• Threshold: the change in the initial concentration intensity
Figure 7 shows the information displayed on the LCD screen.

Simulation, Implementation, and Discussion
To evaluate the performance and efficiency of the SMWCW, simulation and implementation were carried out using the C programming language, the Arduino C IDE, and the Proteus simulation software. The IDE has several built-in functions and supports AVR-embedded C development based on Processing.

Connecting and simulating the USB Bluetooth dongle with microcontroller
Several modifications were made to the Bluetooth dongle attached to the EEG device, which is normally used to connect the EEG device to a computer. The main aim of our modifications is to connect the Bluetooth device directly to the microcontroller so that the data are transmitted directly from the EEG device to the microcontroller without the need for a computer.
When connecting, the voltages of the wireless communication device, both the supply voltage and the transmission voltage, must be considered as follows.
• The dongle needs a 3.3 V input voltage, so a regulator is used to convert the 7.4 V battery voltage to the required supply voltage. The supply port is connected to the voltage regulator output, and the ground is shared with the voltage divider.
• The TX port of the wireless communication device is connected to the RX receiving port of the ATMega328P-PU microcontroller. In this case, we do not need to change the voltage, because when the dongle transmits, the 3.3 V level applied to the microcontroller's receiving port is tolerated, so there is no effect on the wireless communication device.
• When connecting the receiver (RX) of the dongle to the TX port of the microcontroller, the output voltage of the transmitting port must be reduced, because it is 5 V whereas the voltage that the receiver can withstand is 3.3 V. If an overvoltage occurs, the unit may be damaged. To solve this problem, we use a voltage divider to provide the appropriate voltage to the receiving port.
Figure 8 shows the configuration for connecting the transceiver (RX and TX) ports using the voltage divider. Figure 9 shows the simulation of the USB Bluetooth dongle connection with the ATMega328 microcontroller.
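The voltage divider step can be checked with the standard divider relation V_out = V_in × R2 / (R1 + R2). A small C helper follows; the 1 kΩ and 2 kΩ values in the test are illustrative assumptions, since any pair with roughly a 1:2 ratio steps 5 V down to about 3.3 V.

```c
/* Voltage divider used to step the microcontroller's 5 V TX level down
 * to the 3.3 V the dongle's RX pin can withstand:
 * Vout = Vin * R2 / (R1 + R2), with Vout taken across R2. */
double divider_out(double vin, double r1, double r2)
{
    return vin * r2 / (r1 + r2);
}
```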

Connecting and simulating servo motors with microcontroller
The control port of the right motor is connected to the second digital pin (D2), whereas the control port of the left motor is connected to the third digital pin (D3) of the microcontroller. Both pins have a PWM (pulse-width modulation) feature to control the direction and speed of the motors. Figure 10 shows the simulation of the servo motor connection to the ATmega328P-PU microcontroller.
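The mapping from the control commands onto the two motor outputs can be sketched as follows. The signed ±255 values stand in for the PWM duty extremes and are illustrative; the turning rules follow the pivot behavior described earlier, in which the stopped motor acts as the turning axis.

```c
typedef enum { GO_FORWARD, GO_BACKWARD, TURN_RIGHT, TURN_LEFT, HALT } wc_cmd;
typedef struct { int left; int right; } motor_speeds; /* signed PWM duty, -255..255 */

/* Map a wheelchair command to the two motor duties. A turn stops the
 * inner motor so that it acts as the pivot axis. */
motor_speeds motors_for(wc_cmd c)
{
    motor_speeds m = {0, 0};
    switch (c) {
    case GO_FORWARD:  m.left =  255; m.right =  255; break;
    case GO_BACKWARD: m.left = -255; m.right = -255; break;
    case TURN_RIGHT:  m.left =  255; m.right =    0; break; /* right motor pivots */
    case TURN_LEFT:   m.left =    0; m.right =  255; break; /* left motor pivots */
    case HALT:        break;                                /* both motors stopped */
    }
    return m;
}
```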

Connecting and simulating distance sensor with microcontroller
For the analog output of the sensor, pin 3 of the sensor output is connected to one of the analog input ports of the ATMega328P-PU microcontroller; Analog Input 1 is used for this connection. Figure 11 shows the simulation of the Sharp 2Y0A21 sensor connection to the microcontroller.
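Since the Sharp 2Y0A21's analog output voltage increases as an obstacle gets closer, the 20 cm stopping behavior reduces to comparing the ADC reading against a calibration constant. The constant ADC_AT_20CM below is a hypothetical placeholder, not a value from this work; it must be measured for the specific sensor and supply voltage.

```c
/* Edge/obstacle guard for the Sharp 2Y0A21 distance sensor: readings at
 * or above the calibration point mean the obstacle is within 20 cm and
 * the wheelchair should stop. ADC_AT_20CM is a hypothetical placeholder
 * to be replaced by a measured calibration value. */
#define ADC_AT_20CM 430

int obstacle_too_close(int adc_reading)
{
    return adc_reading >= ADC_AT_20CM;
}
```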

Connecting and simulating LCD screen with microcontroller
Only four of the screen's data ports are connected to the microcontroller, so that data are sent to the screen in four-bit (nibble) format instead of eight-bit (byte) format, owing to the limited number of ports available on the microcontroller. Figure 12 shows the simulation of the 4×20 LCD connection to the ATMega328P-PU microcontroller.

General simulation of component integration and installation
The core components of the SMWCW have been integrated and installed as shown in Fig. 13. The simulation of the full schematic of the proposed approach yields five commands resulting from the analysis of the data coming from the brain signal detection device. These commands are shown in Table 4.

Results and Discussion
The effectiveness of the SMWCW approach was validated through extensive simulations. In this section, we present the performance metrics and simulation results and analysis.

Performance metrics
The performance of the SMWCW is evaluated using the following metrics: (1) thought-controlled change in wheelchair direction, (2) response accuracy of control commands, (3) response time of control commands, (4) issuance of distress alert if needed, (5) edge detection and avoidance, and (6) obstacle detection.

Analysis of results
After integration, simulation, implementation, and experiments, all components of the SMWCW show that the wheelchair model works with good performance and efficiency in accordance with the brain signals and eye blinks. Table 5 shows the reaction times resulting from several experiments and trial runs of the wheelchair. The results were calculated using SR = (ST × 100) / ToT.
Here,
• SR refers to the success rate,
• ST refers to the number of successful trials of wheelchair runs, and
• ToT refers to the total number of trials of wheelchair runs.
Table 5 shows the high response accuracy and short response time; both features result in high reliability and user confidence and prove the efficiency of the wireless control of the SMWCW. The results indicate the usability of the SMWCW by the elderly with a wide range of disabilities and by partially or fully quadriplegic people. The results also show the capability of the SMWCW to detect edges and obstacles and to stop as much as 20 cm away from them.
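The success-rate calculation SR = (ST × 100) / ToT is reproduced below as a small helper for clarity:

```c
/* Success rate as defined above: SR = (ST * 100) / ToT,
 * where ST is the number of successful trials and ToT the total trials. */
double success_rate(int st, int tot)
{
    return tot > 0 ? (st * 100.0) / tot : 0.0;
}
```

For example, 19 successful runs out of 20 trials give a success rate of 95%.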

Baseline approach
The performance of the SMWCW is compared with that of wireless smart hand gesture wheelchair control (WSHGWC). The WSHGWC approach is hand-movement-based control of a wheelchair. A comparison over all performance metrics was carried out and discussed in Sect. 5.1. The baseline approach and its objectives are shown in Table 6. Figure 14 shows a comparison of the SMWCW and WSHGWC under all scenarios of the performance metrics. As shown in Fig. 14, the SMWCW outperforms the WSHGWC in terms of movement change and response accuracy, whereas the WSHGWC outperforms the SMWCW in terms of response time. This means that the SMWCW approach is highly reliable and is recommended and applicable for all cases of disability.

Conclusions
A NeuroSky EEG headset with an Arduino microcontroller is used; the headset measures brain activity and is integrated with eye blink detection and eSense meter values. The NeuroSky EEG device translates the user's wishes, expressed through eye blinks, into device commands to realize the user's intent. The wheelchair can be controlled merely by brain signals and eye blinks with almost 95% accuracy. In the SMWCW, the wireless technologies used will enhance the confidence, usability, and willpower of the elderly and physically challenged people, as they help them to be self-reliant without the need for any extra devices. The SMWCW approach was simulated and implemented using the C programming language, the Arduino C IDE, and the Proteus simulation software. The results of the SMWCW approach show its cost-effectiveness, competitive performance, accuracy, and efficiency. We compared the SMWCW and WSHGWC and found that the SMWCW outperforms the WSHGWC in terms of movement change and response accuracy. Although the SMWCW is an efficient approach, its response time needs to be improved. In future work, we will consider the response time and response accuracy under all scenarios of disability cases.

Data Availability Statement
The simulation data used to support the findings of this study will be available from the corresponding author upon request, 6-12 months after the publication of this article.

Conflicts of Interest
The authors declare that there are no conflicts of interest.
Mohammad Alamgeer was born on January 1, 1980 in Madhepur, a small village in the Madhubani district in Bihar (India). He received his first religious education in Madarsa Darul-Olum, Madhepur. After some time, owing to his keen interest, he was admitted to a private convent school (National Childrenʼs Academy) and after his seventh grade, he entered the