‘Signing’ the Future: AI-Powered ASL Recognition System Revolutionizes Deaf Communication

American Sign Language (ASL) recognition systems often struggle with accuracy due to similar gestures, poor image quality, and inconsistent lighting. To address this, researchers developed a system that translates gestures into text with 98.2% accuracy, operating in real time under varying conditions. Using a standard webcam and advanced tracking, it offers a scalable solution for real-world use, with MediaPipe tracking 21 keypoints on each hand and YOLOv11 classifying ASL letters precisely.

The world is on the cusp of a revolution in communication for the deaf and hard-of-hearing community. Traditional sign language interpreters are often scarce, expensive, and dependent on human availability. In an increasingly digital world, the demand for smart, assistive technologies that offer real-time, accurate, and accessible communication solutions is growing.

Enter American Sign Language (ASL), one of the most widely used sign languages worldwide. ASL comprises distinct hand gestures representing letters, words, and phrases, yet recognition systems for it have long struggled with real-time performance, accuracy, and robustness across diverse environments.

Researchers from Florida Atlantic University’s College of Engineering and Computer Science have developed an innovative AI-powered ASL interpretation system that tackles these challenges head-on. Combining the object detection power of YOLOv11 with MediaPipe’s precise hand tracking, the system accurately recognizes ASL alphabet letters in real time.

Using advanced deep learning and hand keypoint tracking, it translates ASL gestures into text, enabling users to interactively spell names, locations, and more with remarkable accuracy. The built-in webcam serves as a contact-free sensor, capturing live visual data that is converted into digital frames for gesture analysis.
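
The article does not include the team’s code, but the overall shape of such a pipeline can be sketched with off-the-shelf tools. The snippet below is a minimal, illustrative sketch rather than the researchers’ implementation: it grabs webcam frames with OpenCV, tracks the 21 hand landmarks with MediaPipe Hands, and queries a YOLO letter detector through the Ultralytics API. The weights file `asl_letters_yolo11.pt` is a hypothetical placeholder, not the published model.

```python
# Illustrative sketch: webcam capture + MediaPipe hand landmarks + a YOLO letter
# detector. "asl_letters_yolo11.pt" is a hypothetical weights file, not the FAU
# team's published model.
import cv2
import mediapipe as mp
from ultralytics import YOLO

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.5)
detector = YOLO("asl_letters_yolo11.pt")  # hypothetical ASL-letter weights

cap = cv2.VideoCapture(0)  # built-in webcam as a contact-free sensor
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # MediaPipe expects RGB and returns 21 (x, y, z) landmarks per detected hand.
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if result.multi_hand_landmarks:
        # Draw the skeletal map of the hand so the tracking is visible live.
        mp.solutions.drawing_utils.draw_landmarks(
            frame, result.multi_hand_landmarks[0],
            mp.solutions.hands.HAND_CONNECTIONS)

        # YOLO proposes letter classes for the frame; the first detection
        # becomes the next character of the spelled-out word.
        detections = detector(frame, verbose=False)[0]
        if len(detections.boxes) > 0:
            letter = detections.names[int(detections.boxes.cls[0])]
            cv2.putText(frame, letter, (10, 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1.2, (0, 255, 0), 2)

    cv2.imshow("ASL recognition sketch", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```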

The system’s effectiveness has been confirmed by results published in the journal Sensors: it achieved 98.2% mean Average Precision (mAP@0.5) with minimal latency. This finding highlights the system’s ability to deliver high precision in real time, making it an ideal solution for applications that require fast and reliable performance.
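
For context, mAP@0.5 averages detection precision across classes, counting a predicted bounding box as correct only when it overlaps the labeled box with an Intersection-over-Union (IoU) of at least 0.5. The short sketch below shows that overlap test, with boxes given as (x1, y1, x2, y2); the coordinates are made up for illustration.

```python
def iou(box_a, box_b):
    """Intersection-over-Union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# At the mAP@0.5 threshold, a predicted letter box counts as a true positive
# only if it overlaps the ground-truth box by at least 50%.
print(iou((10, 10, 60, 60), (15, 15, 65, 65)) >= 0.5)  # True (IoU is about 0.68)
```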

The ASL Alphabet Hand Gesture Dataset includes a wide variety of hand gestures captured under different conditions to help models generalize better. These conditions cover diverse lighting environments (bright, dim, and shadowed), backgrounds (both outdoor and indoor scenes), and various hand angles and orientations.

Each image is carefully annotated with 21 keypoints, highlighting essential hand structures such as fingertips, knuckles, and the wrist. These annotations provide a skeletal map of the hand, allowing models to distinguish between similar gestures with exceptional accuracy.
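
MediaPipe’s hand model indexes those 21 landmarks in a fixed order (wrist at index 0, fingertips at indices 4, 8, 12, 16, and 20), which makes it easy to flatten an annotation into a feature vector for a classifier. The structure below is only an illustrative sketch of what one labeled sample might look like; the dataset’s actual schema is not described in the article.

```python
from dataclasses import dataclass
from typing import List, Tuple

# MediaPipe hand-landmark convention: 21 points per hand, with wrist = 0 and
# fingertips at 4 (thumb), 8 (index), 12 (middle), 16 (ring), and 20 (pinky).
FINGERTIPS = (4, 8, 12, 16, 20)

@dataclass
class AnnotatedSample:
    """Hypothetical layout for one labeled image; not the dataset's real schema."""
    image_path: str
    letter: str                           # ASL letter label, e.g. "A"
    keypoints: List[Tuple[float, float]]  # 21 (x, y) points, normalized to [0, 1]

    def feature_vector(self) -> List[float]:
        # Flatten the skeletal map into 42 numbers for a downstream classifier.
        return [coord for point in self.keypoints for coord in point]
```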

This project demonstrates how cutting-edge AI can be applied to serve humanity. By fusing deep learning with hand landmark detection, the team created a system that not only achieves high accuracy but also remains accessible and practical for everyday use.

The significance of this research lies in its potential to transform communication for the deaf community by providing an AI-driven tool that translates ASL gestures into text, enabling smoother interactions across education, workplaces, healthcare, and social settings.

“Paws-itive Progress: Amphibious Robotic Dog Breaks Ground in Mobility and Efficiency”

A team of researchers has unveiled a cutting-edge Amphibious Robotic Dog capable of roving across both land and water with remarkable efficiency.

The field of robotics has taken a significant leap forward with the development of an amphibious robotic dog, capable of efficiently navigating both land and water. This innovative creation was inspired by the remarkable mobility of mammals in aquatic environments.

Unlike existing amphibious robots that often draw inspiration from reptiles or insects, this robotic canine is based on the swimming style of dogs. This design choice has allowed it to overcome several limitations faced by insect-inspired designs, such as reduced agility and load capacity.

The key to the amphibious robot’s water mobility lies in its unique paddling mechanism, modeled after the natural swimming motion of dogs. By carefully balancing weight and buoyancy, the engineers have ensured stable and effective aquatic performance.

To test its capabilities, the researchers developed and experimented with three distinct paddling gaits:

* A doggy paddle method that prioritizes speed
* A trot-like style that focuses on stability
* A third gait that combines elements of both

Through extensive experimentation, it was found that the doggy paddle method proved superior for speed, achieving a maximum water speed of 0.576 kilometers per hour (kph). On land, the amphibious robotic dog reaches speeds of 1.26 kph, offering versatile mobility in amphibious environments.

“This innovation marks a big step forward in designing nature-inspired robots,” says Yunquan Li, corresponding author of the study. “Our robot dog’s ability to efficiently move through water and on land is due to its bioinspired trajectory planning, which mimics the natural paddling gait of real dogs.”

The implications of this technology are vast and exciting, with potential applications in environmental research, military vehicles, rescue missions, and more. As we continue to push the boundaries of what’s possible with robotics, it’s clear that the future holds much promise for innovation and discovery.

“Revolutionizing Hospital Disinfection: Autonomous Robots for Efficient Sanitation”

A research team develops a disinfection robot that combines physical wiping and UV-C sterilization.

The COVID-19 pandemic has brought to the forefront the critical importance of thorough disinfection, particularly within hospital environments. However, traditional manual disinfection methods have inherent limitations, including labor shortages due to physical fatigue and risk of exposure to pathogens, inconsistent human performance, and difficulty in reaching obscured or hard-to-reach areas.

To address these challenges, a team of researchers from Pohang University of Science and Technology (POSTECH) has developed an “Intelligent Autonomous Wiping and UV-C Disinfection Robot” that can automate hospital disinfection processes. This innovative robot is capable of navigating through hospital environments and performing disinfection tasks with precision and consistency.

The key feature of this robot is its dual disinfection system, which combines physical wiping and UV-C irradiation to effectively remove contaminants from surfaces. The robotic manipulator uses a wiping mechanism to physically clean high-touch areas, while the UV-C light ensures thorough disinfection of hard-to-reach corners and narrow spaces.

Real-world testing at Pohang St. Mary’s Hospital validated the robot’s performance, with bacterial culture experiments confirming its effectiveness in disinfecting surfaces. Repeated autonomous operations were carried out to verify its long-term usability in clinical settings.

The significance of this technology lies in its ability to automate time-consuming and repetitive disinfection tasks, allowing healthcare professionals to devote more attention to patient care. Additionally, the robot’s precision control algorithms minimize operational failures, while its integration with a self-sanitizing station and wireless charging system ensures sustained disinfection operations.

Professor Keehoon Kim emphasized that, even as COVID-19 transitions into an endemic phase, it remains essential to prepare for future pandemics by extending this disinfection robot technology beyond hospitals to public facilities, social infrastructure, and everyday environments, further reducing infection risks. This work was supported by a National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT).

The Power of Robot Design: How Service Robots’ Gender Characteristics Influence Customer Decisions

While service robots with male characteristics can be more persuasive when interacting with some women who have a low sense of decision-making power, ‘cute’ design features — such as big eyes and raised cheeks — affect both men and women similarly, according to new research.

The hospitality industry is taking a cue from new research in the Penn State School of Hospitality Management, which suggests that service robots can be designed to influence customers’ decisions based on the robots’ portrayed gender characteristics. The study found that service robots with characteristics typically associated with males may be more persuasive when interacting with women who have a low sense of power.

Led by researchers Lavi Peng, Anna Mattila, and Amit Sharma, the team conducted two studies to explore how the gender portrayed in service robots can affect customers’ decisions. In the first study, participants were asked to imagine visiting a new restaurant and receiving a menu recommendation from a service robot. The results showed that women with a low sense of power were more likely to accept recommendations from male robots.

“For men with a low sense of power, we found the difference was less obvious,” said Peng. “Based on our findings, consumers with high power tend to make their own judgment without relying on societal expectations.”

The researchers suggested that businesses could leverage these findings by using male robots to recommend new menu items or persuade customers to upgrade their rooms.

To mitigate gender stereotypes in robot design, the team conducted a second study and found that “cute” features, such as big eyes and raised cheeks, can reduce the effect of portrayed robot gender on persuasiveness. Both male and female customers responded similarly to robots with these features, suggesting that businesses could consider using cute designs to mitigate gender stereotypes.

The Marriott Foundation supported this research, highlighting the importance of understanding how service robots can influence customer decisions in the hospitality industry.
