www.rsisinternational.org
INTERNATIONAL JOURNAL OF LATEST TECHNOLOGY IN ENGINEERING,
MANAGEMENT & APPLIED SCIENCE (IJLTEMAS)
ISSN 2278-2540 | DOI: 10.51583/IJLTEMAS | Volume XV, Issue III, March 2026
Hand Movement Audio Message-Based Accelerometer for Paralytic
and Disabled Persons
Joel N. Cañada, Geoffrey V. Alamag, Jhon Cris E. Enriquez, Zyren Enriquez, Liezl P. Garcia
Batangas State University - Balayan campus
DOI: https://doi.org/10.51583/IJLTEMAS.2026.150300077
Received: 12 March 2026; Accepted: 17 March 2026; Published: 17 April 2026
ABSTRACT
Individuals with paralysis or severe physical disabilities often face significant barriers in communicating their
needs, which can limit their independence and quality of care. This study presents a novel system that translates
hand movements into corresponding audio messages and displays them on an LCD, enabling effective
communication between patients and caregivers. The device incorporates a 3-axis accelerometer and a
microcontroller that processes directional hand gestures to generate preprogrammed messages. Testing
demonstrates that the system reliably interprets four primary gestures corresponding to the messages: "I am hungry," "I want to go to the bathroom," "I want to watch television," and "I want to go out." The device operates
effectively for users with partial mobility and accommodates those with speech impairments. Maximum speaker
volume is 60 dB, with an operational range of approximately 2 meters. The system provides a practical and low-
cost assistive technology solution that may improve patient autonomy, caregiver responsiveness, and quality of
life.
Keywords: accelerometer, hand gesture recognition, paralytic communication system, audio message device,
Batangas City, Philippines
INTRODUCTION
Technological advancements have significantly improved healthcare services by enabling smarter and more
efficient patient care solutions. Despite these developments, communication remains a major challenge for
individuals with disabilities, particularly paralytic patients who experience limitations in speech and motor
functions. While monitoring systems allow healthcare providers to track vital signs effectively, practical
solutions that support real-time verbal communication for impaired patients remain limited.
Disability, defined as physical or neurological impairment affecting movement, sensation, or daily activities,
often leads to dependence on caregivers for basic communication. Paralysis, which involves partial or complete
loss of muscle function, affects millions of individuals worldwide and can result from conditions such as stroke,
spinal cord injury, multiple sclerosis, and cerebral palsy. These conditions not only restrict mobility but also
create significant barriers in expressing needs, leading to delays in assistance and reduced quality of care.
Recent technological innovations, including brain-computer interface devices and assistive communication
systems, demonstrate the potential of sensor-based and embedded technologies in improving patient interaction.
However, many of these solutions remain costly, complex, or inaccessible in resource-limited healthcare
settings. This highlights the need for affordable and reliable assistive devices that can support effective
communication between paralytic patients and caregivers.
In response to this gap, the present study proposes a hand-movement audio messaging system using an
accelerometer to translate simple gestures into audio commands. The device aims to provide a low-cost, user-
friendly communication tool that enables patients to convey their needs independently. By integrating embedded
sensors, microcontroller processing, and audio output, the system seeks to enhance responsiveness, reduce
caregiver workload, and improve overall patient care in healthcare environments.
Communication is a fundamental human need, but it becomes challenging for individuals with paralysis or severe
motor impairments. Existing communication tools for disabled persons often require physical input devices or
complex training, which can be impractical for daily use. This study focuses on developing a Hand Movement
Audio Message-Based Accelerometer, a system that allows users to convey messages using simple hand
gestures.
The primary objective of this study is to develop a hand-movement audio messaging system using an accelerometer designed to assist paralytic and disabled individuals in communicating their needs effectively. Specifically, the study aims:
1. To design a gesture-based system using a 3-axis accelerometer for accurate motion detection.
2. To convert detected gestures into corresponding audio and visual messages for immediate
communication.
3. To evaluate the system’s performance in terms of accuracy, response time, and reliability.
LITERATURE REVIEW
Similar research identified in the literature review is summarized as follows:
H. Liu et al. (2018) demonstrated a hybrid-integrated, high-precision vacuum accelerometer with a
sensitivity of 3.081 V/g and low non-linearity for precise acceleration measurements. Similarly, X. Hu
et al. and X. Zhao et al. developed silicon-on-insulator (SOI) piezoresistive sensors capable of measuring
high-g accelerations with minimal cross-axis interference, while J. Kim et al. explored thermal
convection-based accelerometers, investigating cavity volume and gas density effects on Z-axis response.
These studies highlight innovations in multi-axis MEMS accelerometer design, structural optimization,
and sensor performance improvement.
Further improvements, including sandwiched proof-mass structures (S. Tez and T. Akin) and compact
single-proof-mass designs (C.M. Sun et al., M.H. Tsai et al., S.-C. Lo et al.), achieved high sensitivity
and low cross-axis effects, demonstrating ongoing progress in miniaturized accelerometer technology.
Kristoffersson and Lindén (2020) reviewed wearable sensor systems for real-life gait monitoring and
emphasized the importance of demographic considerations, fall risk, and chronic conditions. Atallah et
al. (2012) validated ear-worn accelerometers on a treadmill, showing the accuracy of gait parameters
against force-plate measurements. Ghazal et al. (2015) proposed fall-detection systems using artificial
neural networks with wrist-worn accelerometers and gyroscopes, achieving high accuracy in activity
classification.
Pande et al. (2014), Kirthika Sivakumara et al. (2020), and Sanish Manadhar et al. (2019) developed
gesture-based assistive devices, including wheelchairs and message conveyers for physically disabled
persons. These systems integrated accelerometers, flex sensors, and microcontrollers to enable user
control and communication through hand or body gestures, providing low-cost and accessible solutions.
Similarly, Jagdale et al. (2020) and Hoque Chowdhury (2019) designed smart wheelchairs for home use,
integrating real-time monitoring of vital signs and emergency messaging, demonstrating practical
applications for independent living.
Rajkhanna et al. (2014) and Ganapathyraju (2013) focused on gesture-based control of mobile robots
and industrial devices, using accelerometers and vision-based systems for accurate hand gesture
recognition. Mohammed Abdul Kader et al. (2019) developed a head-motion-controlled semi-
autonomous wheelchair for quadriplegic patients, integrating a 3-axis accelerometer with DC motors and
obstacle detection, demonstrating the potential of accelerometer-based assistive systems for indoor and
outdoor navigation.
Atallah et al. (2012) validated ear-worn accelerometers for gait monitoring using treadmill data,
demonstrating the feasibility of mapping accelerometer data to physical activity features.
Ghazal et al. (2015) designed a smartwatch-based fall detection system using ANN, showing that sensor
data combined with classification algorithms can accurately detect falls.
Baraka et al. (2019) investigated accelerometer and sEMG data for discriminating between healthy
participants and simulated tremors, achieving high classification accuracy using ensemble machine
learning techniques.
Vishal V. Pande et al. (2014) and Kirthika Sivakumara et al. (2020) designed gesture-controlled
wheelchairs and motion-based message systems for disabled users, showing that accelerometer-based
hand gestures can provide reliable control for navigation and communication.
Sanish Manadhar et al. (2019) and Jagdale et al. (2020) developed wearable glove controllers and
microcontroller-based systems to translate hand gestures into audio or digital messages, highlighting the
effectiveness of accelerometers in enabling communication for immobile or paralytic individuals.
Mohammed Abdul Kader et al. (2019) implemented a head motion-controlled semi-autonomous
wheelchair using a 3-axis accelerometer, sonar sensors, and GSM notifications, demonstrating that
combining accelerometer input with real-time alerts increases safety and autonomy for disabled users.
From the review of related literature, it is evident that while significant advancements have been made in the
design, modeling, and application of MEMS accelerometers, there remain critical areas that require further
exploration. Specifically, most studies focused on laboratory-based testing with controlled conditions, and only
a limited number have addressed long-term, real-world deployment for wearable or assistive devices.
Additionally, variations in sensor placement, multi-sensor fusion techniques, and the integration of user-friendly
interfaces for disabled or elderly populations remain underexplored. These gaps highlight the need for
developing a system that not only ensures high sensitivity and low cross-axis interference but also provides
practical usability, real-time monitoring, and adaptive feedback in everyday environments. The present study
seeks to address these gaps by designing an integrated accelerometer-based assistive system capable of
enhancing mobility, communication, and safety for physically challenged individuals.
Compared to existing assistive communication systems that rely on vision-based processing or multiple sensors,
the proposed system offers a simpler and more cost-effective solution by utilizing a single accelerometer and RF
communication. This approach reduces system complexity while maintaining reliable gesture detection, making
it more suitable for real-world applications, particularly in resource-limited environments.
Based on the literature review, similar research conducted emphasizes the importance of accelerometer-based
systems for hand and head movements in assisting paralytic and disabled persons. Key points include:
1. Accelerometer Design & Sensitivity: Several studies focused on creating high-sensitivity, multi-axis
accelerometers with low noise and cross-axis interference.
2. Wearable Applications: Wearable sensors, such as wrist-worn or ring-based accelerometers, are
effective for real-time monitoring of fine movements, including hand gestures.
3. Assistive Technology: Wheelchair control, gesture-based communication, and motion-triggered
messaging systems demonstrate the integration of accelerometers in improving autonomy and safety for
disabled individuals.
4. Integration with IoT & Feedback Systems: Many systems incorporated wireless communication, GSM
alerts, or mobile apps to provide real-time feedback to caregivers, enhancing emergency response and
health monitoring.
5. User-Centered Design: Several studies emphasized designing for usability, comfort, and real-world
conditions, ensuring adherence and accurate data capture for disabled or paralytic patients.
METHODOLOGY
This study employed a developmental research design to create a hand-movement audio messaging system using
an accelerometer intended for paralytic and disabled individuals. The research process involved several stages,
including data gathering, system design, development, fabrication, assembly, simulation, and testing and
evaluation.
During the data-gathering phase, relevant literature, journals, articles, and related studies were reviewed to
establish the conceptual foundation of the device. In the design stage, system requirements were identified and
appropriate electronic components were selected based on availability, cost, functionality, and reliability. The
development phase involved integrating sensors, microcontroller units, and audio output modules to enable
gesture detection and message transmission.
Fabrication and assembly were conducted to construct the physical prototype, followed by simulation and
troubleshooting to ensure proper system operation. Finally, testing and evaluation were performed to assess the
device’s performance in terms of functionality, efficiency, durability, and safety. These procedures ensured that
the developed system met the intended objectives and provided reliable communication support for users.
SYSTEM COMPONENTS AND MATERIALS
The developed system consists of the following major hardware components:
Accelerometer Sensor - The accelerometer detects hand movements performed by the user. It measures directional tilt (left, right, up, and down) and converts motion into electrical signals for processing.
Microcontroller Unit (ATmega328 / Arduino Pro Micro) - The microcontroller serves as the main processing unit of the system. It interprets the accelerometer data and generates corresponding control signals to trigger predefined audio messages and display outputs.
Transmitter and Receiver Module - The radio frequency (RF) transmitter and receiver modules wirelessly transmit encoded commands from the input device to the output unit, enabling remote communication between the patient and caregiver.
SD Card and SD Card Module - The SD card stores prerecorded audio messages. The SD card module facilitates communication between the microcontroller and the storage device for audio playback.
Amplifier and Speaker - The amplifier increases the audio signal strength, while the speaker converts electrical signals into audible sound for caregiver notification.
Liquid Crystal Display (LCD) - The LCD provides visual feedback by displaying the message corresponding to the detected hand gesture.
Power Supply Components - The system utilizes a 9V battery for the input device and an AC/DC adaptor with a DC power jack socket for the output device, ensuring stable and continuous operation.
Switch - A control switch powers the system on and off.
System Operation
This stage describes the operational framework of the developed hand-movement audio messaging system. The
project integrates multiple electronic components to enable gesture-based communication for paralytic and
disabled individuals. The system operates by detecting hand movements through an accelerometer sensor, which
measures directional tilt and motion. The captured motion data are transmitted to the microcontroller, where the
signals are processed and matched with predefined commands. Once interpreted, the corresponding message is
encoded and sent through the wireless transmitter to the receiver unit.
At the output stage, the received signal activates the stored audio message from the SD card module. The
message is then amplified and played through a speaker, allowing caregivers to hear the patient’s request.
Simultaneously, the liquid crystal display (LCD) presents the same message visually to provide confirmation
and additional clarity. The device is powered by a regulated power supply, including a 9V battery for the input
unit and an external AC/DC adaptor for the output unit, ensuring stable system performance during operation.
To classify hand gestures, threshold values were defined based on the tilt angles measured along the X and Y
axes of the accelerometer. Each gesture corresponds to a specific range of acceleration values. For example, a
forward tilt is detected when the X-axis value exceeds a positive threshold, while a backward tilt is identified
when it falls below a negative threshold. Similarly, left and right tilts are determined using Y-axis thresholds.
These threshold values were obtained through repeated experimental trials to ensure reliable detection.
Calibration may be required for different users to account for variations in hand movement strength and
orientation, thereby improving the overall accuracy of the system.
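As a concrete illustration of the threshold scheme just described, the sketch below classifies a raw X/Y accelerometer sample into one of the four tilts. The neutral value and dead-band here are hypothetical placeholders, not the experimentally derived thresholds used in the prototype, which were tuned through repeated trials and per-user calibration.

```cpp
#include <cassert>

// One of the four directional tilts, or NONE when the hand is at rest.
enum Gesture { NONE, FORWARD, BACKWARD, LEFT, RIGHT };

// Assumed values: mid-scale ADC reading at rest and the excursion
// required before a gesture fires. Real thresholds would be calibrated.
const int NEUTRAL = 512;
const int DEADBAND = 80;

Gesture classify(int x, int y) {
    if (x > NEUTRAL + DEADBAND) return FORWARD;   // forward tilt: X above threshold
    if (x < NEUTRAL - DEADBAND) return BACKWARD;  // backward tilt: X below threshold
    if (y > NEUTRAL + DEADBAND) return RIGHT;     // right tilt: Y above threshold
    if (y < NEUTRAL - DEADBAND) return LEFT;      // left tilt: Y below threshold
    return NONE;                                  // inside the dead-band: no gesture
}
```

Raising the dead-band makes the classifier less sensitive to unintended tremors, at the cost of requiring a more deliberate tilt.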
Experimental Procedure
Each of the four gestures was tested for 20 trials. Data collected included:
Correct detection count
False triggers
Missed triggers
Response time
Accuracy percentage was calculated using:
Accuracy = (Correct Detections / Total Trials) × 100
Accuracy = (76 / 80) × 100 = 95%
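The accuracy computation above can be expressed as a small helper; the counts (76 correct detections out of 80 trials) come from the results reported later in this paper.

```cpp
#include <cassert>

// Overall accuracy as a percentage of correct detections.
double accuracy_percent(int correct_detections, int total_trials) {
    return 100.0 * correct_detections / total_trials;
}
```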
System Architecture
The proposed system consists of the following major components:
1. Motion Sensing Unit - An accelerometer sensor (ADXL335) was used to detect hand movements along the X, Y, and Z axes.
2. Microcontroller Unit - An Arduino Pro Micro was utilized to process the analog signals from the accelerometer and classify predefined gestures.
3. Wireless Communication Module - A 433 MHz RF transmitter and receiver module was used to transmit the processed signal from the glove unit to the receiving unit.
4. Audio Output Module - A speaker connected to the receiver unit produced the corresponding pre-recorded voice message.
5. Power Supply Unit - Provides stable voltage to ensure proper operation of all system components.
The system operates in two parts:
1. Transmitter unit embedded in a wearable glove
2. Receiver unit connected to the speaker
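A minimal sketch of how the two units might exchange commands, assuming a one-byte code per gesture. The prototype's actual RF payload format is not specified in the text, so the command codes and message strings below are illustrative only.

```cpp
#include <cassert>
#include <cstdint>
#include <string>

// Transmitter side: map a gesture number (1-4) to a one-byte RF command.
uint8_t encode_gesture(int gesture_id) {
    return static_cast<uint8_t>(gesture_id);
}

// Receiver side: map the received command byte back to the stored message
// (and, in the prototype, the matching audio file on the SD card).
std::string decode_command(uint8_t cmd) {
    switch (cmd) {
        case 1: return "I am hungry";
        case 2: return "I want to go to the bathroom";
        case 3: return "I want to watch television";
        case 4: return "I want to go out";
        default: return "";  // unknown command: play nothing
    }
}
```

Keeping the payload to a single byte keeps the 433 MHz link simple and reduces the chance of corrupted multi-byte frames.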
RESULTS AND DISCUSSION
System Functionality Testing
The developed Hand Movement Audio-Message System was tested to evaluate its overall performance in terms
of gesture recognition accuracy, response time, and wireless transmission reliability. The system successfully
detected four predefined hand gestures using the ADXL335 sensor integrated with the Arduino Pro Micro. Each
gesture triggered a corresponding audio message transmitted through the 433 MHz RF Transmitter and Receiver
Module. Testing confirmed that the accelerometer reading varies distinctly along the X and Y axes depending
on hand orientation, allowing reliable classification using programmed threshold values.
Gesture Recognition Accuracy
Each of the four programmed gestures was tested for 20 trials, resulting in 80 total trials.
Table 1. Gesture recognition accuracy (20 trials per gesture)

Gesture Type    | Correct Detection | False Trigger | Missed Trigger | Accuracy (%)
Forward Tilt    | 19                | 1             | 0              | 95
Backward Tilt   | 18                | 1             | 1              | 90
Left Tilt       | 19                | 0             | 1              | 95
Right Tilt      | 20                | 0             | 0              | 100
The overall system accuracy was computed as:
Accuracy = (76 / 80) × 100 = 95%
Minor errors occurred due to:
Slight unintended hand tremors
Rapid movement transitions
Variations in wrist angle positioning
Response Time Analysis
The average time measured from gesture execution to audio output activation was recorded.
Minimum Response Time: 0.85 seconds
Maximum Response Time: 1.20 seconds
Mean Response Time: 0.98 seconds
The response delay includes:
1. Sensor data acquisition
2. Microcontroller processing time
3. RF transmission delay
4. Receiver decoding
5. Audio output activation
The system maintains response times below 1.2 seconds, which is acceptable for assistive communication
devices.
Wireless Transmission Performance
The RF communication module was tested at varying distances:
Table 2. RF signal status at varying distances

Distance     | Signal Status
1-5 meters   | Stable
6-10 meters  | Stable
11-15 meters | Minor delay
Beyond 15 meters, occasional signal loss was observed. The effective operating range was determined to be
approximately 10 meters in indoor conditions without significant obstruction.
Power Stability Evaluation
The 9V regulated power supply maintained a stable voltage during continuous operation.
Observations:
No unexpected system resets
No sensor fluctuation due to power drop
Stable output audio level
Battery life was approximately 4-5 hours of continuous operation.
DISCUSSION
The findings indicate that the developed system is capable of providing an alternative communication method
for paralytic and speech-impaired individuals.
The use of analog accelerometer-based gesture detection proved effective for simple directional movements.
Compared to camera-based systems, this approach offers:
Lower power consumption
Reduced computational complexity
Portable and wearable configuration
Lower system cost
However, the system is limited to predefined gestures and may require recalibration for different users due to
variation in hand strength and movement capability.
Future improvement may include:
Implementation of digital filtering for noise reduction
Use of machine learning algorithms for adaptive gesture recognition
Integration of a rechargeable lithium battery
Addition of an LCD status display for visual confirmation
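As an example of the digital filtering suggested above, a simple moving average over the last few accelerometer samples would suppress tremor spikes before threshold classification. The window size of 4 is an arbitrary illustrative choice, not a value from the prototype.

```cpp
#include <cstddef>

// Moving-average filter over the most recent WINDOW samples.
const std::size_t WINDOW = 4;  // illustrative window size

class MovingAverage {
    int buf[WINDOW] = {0};
    std::size_t idx = 0, count = 0;
    long sum = 0;
public:
    // Feed one raw sample; returns the average of the samples seen so
    // far, up to the last WINDOW of them.
    int add(int sample) {
        if (count == WINDOW) sum -= buf[idx]; else ++count;
        buf[idx] = sample;
        sum += sample;
        idx = (idx + 1) % WINDOW;
        return static_cast<int>(sum / static_cast<long>(count));
    }
};
```

A larger window smooths tremors more aggressively but adds latency before a genuine tilt crosses the threshold, so the window size trades noise rejection against response time.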
SUMMARY, FINDINGS, CONCLUSIONS, AND RECOMMENDATIONS
Summary
This study developed a Hand Movement Audio Message-Based Accelerometer to assist paralytic and disabled
persons in communication. The system integrates a 3-axis accelerometer, a microcontroller (Arduino), an LCD,
and an audio output. Testing was conducted to evaluate gesture recognition accuracy, system response time, and
message output reliability.
Key findings from the testing include:
Gesture Recognition: 95% of gestures were correctly recognized.
Message Output: 100% of recognized gestures resulted in the correct audio and visual message.
Response Time: The average system response was 0.98 seconds, ensuring real-time interaction.
The system proved effective in converting hand movements into immediate audio messages, offering an
accessible communication solution for individuals with severe motor impairments.
Findings
1. Multi-axis accelerometers are sensitive enough to detect subtle hand movements and gestures.
2. Wearable systems (gloves, rings, wristbands) enable continuous monitoring without hindering daily
activity.
3. Gesture-based control systems can be integrated with wheelchairs, messaging systems, or IoT platforms.
4. Real-time feedback mechanisms (audio, SMS, or mobile notifications) improve caregiver response.
5. User-centered designs enhance usability, adherence, and accessibility for paralytic patients.
Conclusion
Based on the results and discussion, the following conclusions were drawn:
1. The developed system can accurately detect and interpret predefined hand gestures using a 3-axis
accelerometer.
2. Audio and visual feedback ensure that users and caregivers receive immediate confirmation of the
intended message.
3. The prototype operates in real-time with minimal delay, making it suitable for emergency or daily use.
4. The system provides a reliable, low-cost solution that enables communication capabilities for paralytic
and disabled persons.
Recommendations
For further improvement and practical implementation, the following recommendations are suggested:
1. Expand Gesture Library: Introduce more hand gestures to cover a wider range of commands and
communication needs.
2. Wireless Connectivity: Integrate Bluetooth or Wi-Fi modules to allow remote notification to caregivers
or hospital staff.
3. Portable Design: Reduce the size of the device and make it wearable, such as in gloves or wristbands, for
greater mobility.
4. Real-World Testing: Conduct trials with actual patients in home and hospital settings to validate usability
and effectiveness.
5. Machine Learning Integration: Apply machine learning algorithms to improve gesture recognition
accuracy and adapt to user-specific movement patterns.
6. Adaptive gesture recognition
7. User calibration feature
8. Mobile or IoT integration
REFERENCES
1. Atallah, L., Lo, B., King, R., & Yang, G. Z. (2012). Validation of an ear-worn accelerometer for gait monitoring. Journal of Biomechanics, 45(10), 1764–1770.
2. Baraka, A., Loconsole, C., & Rissanen, A. (2019). Human locomotion monitoring using forearm sEMG and accelerometers. IEEE Sensors Journal, 19(18), 7862–7873.
3. Ghazal, M., Al-Maadeed, S., & Faraj, A. (2015). Fall detection using an ANN with a smartwatch accelerometer and gyroscope. Procedia Computer Science, 60, 215–222.
4. H. Liu, Z., Wang, Q., & Chen, D. (2018). High-precision vacuum accelerometer for harsh environments. Sensors and Actuators A: Physical, 278, 12–22.
5. Kristoffersson, A., & Lindén, B. (2020). Wearable sensor systems for gait monitoring in real-life conditions: A systematic review. Journal of Biomedical Informatics, 104, 103390.
6. Manadhar, S., et al. (2019). Hand gesture vocalizer for the dumb and deaf people using accelerometers. Procedia Computer Science, 152, 68–75.
7. Mohammed, Z., et al. (2019). Semi-autonomous head motion wheelchair for disabled persons. Biomedical Engineering Letters, 9(4), 387–396.
8. Nowshin, N., et al. (2018). Infrared sensor-controlled wheelchair for physically disabled people. International Journal of Engineering Science and Technology, 10(5), 65–74.
9. Pande, V. V., et al. (2014). Hand gesture-based wheelchair movement control for disabled persons using MEMS. Procedia Computer Science, 132, 127–134.
10. Rajkhanna, U., Mathankumar, M., & Gunasekaran, K. (2014). Hand gesture-based mobile robot control using PIC microcontroller. International Journal of Advanced Research in Electrical, Electronics and Instrumentation Engineering, 3(2), 1223–1230.
11. Tez, S., & Akin, T. (2013). Sandwich three-axis bulk-micromachined accelerometer with multiplexed readout. IEEE Sensors Journal, 13(6), 2102–2111.
12. Tsai, M. H., et al. (2015). Compact three-axis accelerometer using gap-change comb fingers. Microsystem Technologies, 21, 1457–1468.
13. Srivastava, P., Chatterjee, S., & Thakur, R. (2018). Gesture-controlled wheelchair using accelerometer IC MMA7361L. Procedia Computer Science, 132, 140–147.
APPENDICES
Figure 2. Block diagram of the prototype
The system operation is implemented through embedded programming in the microcontroller, which processes
sensor data, classifies gestures, and triggers corresponding audio outputs.