
A Step Closer to Reality: Scientists Develop Advanced Brain-Computer Interface for Restoring the Sense of Touch

While exploring digitally represented objects through an artificially created sense of touch, brain-computer interface users described the warm fur of a purring cat, the smooth, rigid surface of a door key, and the cool roundness of an apple.


Scientists at the University of Pittsburgh School of Medicine are on the cusp of a groundbreaking innovation – a brain-computer interface (BCI) that enables individuals with tetraplegia to regain their lost sense of touch. This revolutionary technology has the potential to transform the lives of millions, allowing people to interact with the world in a more natural and meaningful way.

In a recent study published in Nature Communications, researchers demonstrated that BCI users can design distinct tactile experiences for different objects displayed on a computer screen. Participants were able to recreate sensations such as petting a cat, touching an apple, or holding a door key – a far cry from the indistinct buzzing or tingling often associated with earlier experiments.

The key innovation behind this breakthrough lies in giving BCI users control over the details of the electrical stimulation that creates tactile sensations. This personalized approach allows participants to make interactions with objects feel more realistic and meaningful, bringing researchers closer to a neuroprosthetic that feels pleasant and intuitive to use.

The study involved three participants who lost sensation in their hands due to spinal cord injuries. They were tasked with finding combinations of stimulation parameters that felt like petting a cat or touching an apple, key, towel, or toast – all while exploring objects presented digitally. The results were nothing short of remarkable: participants described objects in rich and vivid terms that made logical sense but were also unique and subjective.
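The article does not describe the participants' actual interface, but the shape of the task can be sketched in code. The toy loop below sweeps stimulation settings and keeps whichever combination the user rates as closest to the target sensation; the parameter names, ranges, and rating callback are illustrative assumptions, not the study's real setup.

```python
# Toy sketch of the participants' task: search over stimulation parameters
# for the combination whose evoked sensation best matches a target object.
# Parameter names, ranges, and the rating callback are assumptions.
import random

PARAM_RANGES = {
    "amplitude_uA": (10, 100),   # stimulation current, microamps
    "frequency_hz": (20, 300),   # pulse rate
    "train_ms":     (50, 1000),  # duration of each pulse train
}

def random_setting():
    return {k: random.uniform(lo, hi) for k, (lo, hi) in PARAM_RANGES.items()}

def explore(target_object, rate_sensation, trials=50):
    """Keep the setting the user rates as closest to the target sensation."""
    best, best_score = None, float("-inf")
    for _ in range(trials):
        setting = random_setting()
        score = rate_sensation(target_object, setting)  # user feedback, 0-10
        if score > best_score:
            best, best_score = setting, score
    return best

if __name__ == "__main__":
    # Stand-in for a participant's verbal rating.
    def mock_rating(obj, setting):
        return 10 - abs(setting["frequency_hz"] - 100) / 30

    print(explore("petting a cat", mock_rating))
```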

When the image was taken away and participants had to rely on stimulation alone, they correctly identified one of five objects 35% of the time – better than the 20% expected by chance, but far from perfect. While this may seem like a small step forward, it represents an important milestone toward evoking an accurate sense of touch in a person's paralyzed hand.

The study is an exciting step towards creating an artificial limb that seamlessly integrates into a person’s unique sensory world. As researchers continue to push the boundaries of BCI technology, we can expect to see even more innovative applications in the future – transforming the lives of individuals with tetraplegia and beyond.


Shedding Light on Shadow Branches: Revolutionizing Computing Efficiency in Modern Data Centers

Researchers have developed a new technique called ‘Skia’ to help computer processors better predict future instructions and improve computing performance.


The collaboration between trailblazing engineers and industry professionals has led to a groundbreaking technique called Skia, which may transform the future of computing efficiency for modern data centers.

In data centers, large computers process massive amounts of data but often struggle to keep up under taxing workloads. The result is slower performance – search engines, for example, return answers more slowly or not at all. To address this issue, researchers at Texas A&M University developed Skia in collaboration with Intel, AheadComputing, and Princeton University.

The team includes Dr. Paul V. Gratz, a professor in the Department of Electrical and Computer Engineering, Dr. Daniel A. Jiménez, a professor in the Department of Computer Science and Engineering, and Chrysanthos Pepi, a graduate student in the Department of Electrical and Computer Engineering.

“Processing instructions has become a major bottleneck in modern processor design,” Gratz said. “We developed Skia to better predict what’s coming next and alleviate that bottleneck.” Skia not only helps predict future instructions more accurately but also improves the system’s instruction throughput, leading to quicker performance and lower power consumption for the data center.

“Think of throughput in terms of being a server in a restaurant,” Gratz said. “You have lots and lots of jobs to do. How many tasks can you complete or how many instructions can you execute per unit time? You want high throughput, especially for computing.”

Making data centers even 10% more efficient means a company that previously needed to build 100 data centers around the country now only needs to build 90. That is significant: these facilities cost millions of dollars each and consume roughly the equivalent of an entire power plant’s output.

Modern processors rely on a structure called the branch target buffer (BTB) to predict where upcoming branch instructions will jump, but it only knows about branches the processor has already decoded. “Shadow branches” are branch instructions sitting in the unused bytes of instruction regions the processor has already fetched. Skia identifies and decodes these shadow branches and stores them in a memory area called the Shadow Branch Buffer, which can be accessed alongside the BTB. “What makes this technique interesting is that most of the future instructions were already available. We demonstrate that Skia, with a minimal hardware budget, can make data centers more efficient – we observe nearly twice the performance improvement versus adding the same amount of storage to the existing hardware,” Pepi said.
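As a rough illustration of the idea – not the authors' hardware design, and with all structure sizes, names, and policies assumed – the front end can be modeled as two lookup tables consulted side by side: the conventional BTB for branches seen on the executed path, plus a shadow branch buffer filled by pre-decoding the unused bytes of fetched cache lines.

```python
# Illustrative software model of a BTB plus a shadow branch buffer (SBB).
# Capacities, eviction policy, and the 64-byte fetch granularity are assumptions.

class BranchBuffer:
    def __init__(self, capacity):
        self.capacity = capacity
        self.entries = {}  # branch PC -> predicted target

    def insert(self, pc, target):
        if pc not in self.entries and len(self.entries) >= self.capacity:
            self.entries.pop(next(iter(self.entries)))  # naive FIFO eviction
        self.entries[pc] = target

    def lookup(self, pc):
        return self.entries.get(pc)

btb = BranchBuffer(capacity=4096)  # branches decoded on the executed path
sbb = BranchBuffer(capacity=1024)  # "shadow" branches from unused fetched bytes

def predict_next_fetch(pc):
    """Consult the BTB first; on a miss, fall back to the shadow buffer."""
    for name, table in (("btb", btb), ("sbb", sbb)):
        target = table.lookup(pc)
        if target is not None:
            return target, name
    return pc + 64, "fallthrough"  # no known branch: fetch the next line

def scan_fetched_line(decoded_branches):
    """On each fetch, pre-decode branches found in the line's unused bytes
    and stash them so later control flow through them hits immediately."""
    for branch_pc, target in decoded_branches:
        if btb.lookup(branch_pc) is None:
            sbb.insert(branch_pc, target)
```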

Their findings, “Skia: Exposing Shadow Branches,” were published at one of the leading computer architecture conferences, the ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS). The team also traveled to the Netherlands to present their work to colleagues from around the globe.

Funding for this research is administered by the Texas A&M Engineering Experiment Station (TEES), the official research agency for Texas A&M Engineering.



Estimating Biological Age with AI: A New Frontier in Cancer Care and Beyond

Researchers developed FaceAge, an AI tool that calculates a patient’s biological age from a photo of their face. In a new study, the researchers tied FaceAge results to health outcomes in people with cancer: when FaceAge estimated a younger age than a cancer patient’s chronological age, the patient did significantly better after cancer treatment, whereas patients with older FaceAge estimates had worse survival outcomes.



Imagine having a tool that can accurately predict a person’s biological age based on their facial appearance. This might seem like science fiction, but a team of investigators at Mass General Brigham has made this concept a reality using artificial intelligence (AI) technology called FaceAge.

FaceAge uses deep learning algorithms to analyze photographs of individuals and estimate their biological age. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets and tested in a cohort of 6,196 cancer patients from two centers, who had photographs taken at the start of radiotherapy treatment.
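The authors' released model details are beyond this article, but the pipeline described – a deep network mapping a face photo to an age estimate – can be sketched as follows. The ResNet-50 backbone, input size, and preprocessing are assumptions for illustration, not the published architecture, and the weights here are untrained.

```python
# Minimal sketch of a deep-learning age regressor of the kind FaceAge
# describes. Backbone, preprocessing, and (untrained) weights are assumed.
import torch
import torch.nn as nn
from PIL import Image
from torchvision import models, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),  # assumed input resolution
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

class AgeRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet50(weights=None)  # would be trained on face photos
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):
        return self.backbone(x).squeeze(-1)  # estimated biological age, years

model = AgeRegressor().eval()

def estimate_biological_age(photo_path):
    img = preprocess(Image.open(photo_path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        return model(img).item()
```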

The results were striking: patients with cancer appeared significantly older than those without cancer, with an average FaceAge that was about five years older than their chronological age. Moreover, older FaceAge predictions were associated with worse overall survival outcomes across multiple cancer types.
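How such a survival association is commonly quantified can be sketched with a Cox proportional-hazards model on the FaceAge gap (estimated minus chronological age). The synthetic data, column names, and use of the lifelines package below are illustrative assumptions, not the study's analysis code.

```python
# Hedged sketch: testing whether a larger FaceAge gap (looking older than
# one's chronological age) predicts worse survival. Data here are synthetic.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 500
gap = rng.normal(0.0, 5.0, n)                             # FaceAge minus chronological age
months = rng.exponential(24.0, n) * np.exp(-0.05 * gap)   # older look, shorter survival
died = (rng.random(n) < 0.7).astype(int)                  # 1 = death observed, 0 = censored

df = pd.DataFrame({"faceage_gap": gap, "months": months, "died": died})
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="died")
cph.print_summary()  # hazard ratio per additional year of FaceAge gap
```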

This technology has significant implications for cancer care. By using FaceAge to inform clinical decision-making, healthcare providers may be able to tailor treatment plans more effectively to individual patients based on their estimated biological age and health status.

The researchers also found that FaceAge outperformed clinicians in predicting short-term life expectancies of patients receiving palliative radiotherapy. This highlights the potential for AI-driven tools like FaceAge to provide valuable insights in high-pressure situations, where accurate predictions are crucial.

While further research is needed before this technology can be used in real-world clinical settings, the possibilities are vast and exciting. By expanding its capabilities to predict diseases, general health status, and lifespan, FaceAge has the potential to revolutionize our approach to healthcare.

The co-senior authors of the study, Hugo Aerts, PhD, and Ray Mak, MD, emphasize that this technology opens up new avenues for biomarker discovery from photographs, with implications far beyond cancer care or predicting age. As we increasingly consider chronic diseases as diseases of aging, accurately predicting an individual’s aging trajectory becomes crucial for developing effective treatment plans.

The door to a whole new realm of biomedical innovation has been opened. It is essential that this technology be developed within a strong regulatory and ethical framework to ensure its safe use across applications – and, ultimately, to help save lives.



Ping Pong Robot Aces High-Speed Precision Shots

Engineers developed a ping-pong-playing robot that quickly estimates the speed and trajectory of an incoming ball and precisely hits it to a desired location on the table.


The MIT engineers’ latest creation is a powerful and lightweight ping pong bot that returns shots with high-speed precision. This table tennis tech has come a long way since the 1980s, when researchers first started building robots to play ping pong. The problem requires a unique combination of technologies, including high-speed machine vision, fast and nimble motors and actuators, precise manipulator control, and accurate real-time prediction.

The team’s new design comprises a multijointed robotic arm that is fixed to one end of a standard ping pong table and wields a standard ping pong paddle. Aided by several high-speed cameras and a high-bandwidth predictive control system, the robot quickly estimates the speed and trajectory of an incoming ball and executes one of several swing types – loop, drive, or chop – to precisely hit the ball to a desired location on the table with various types of spin.
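The control stack itself is MIT's, but the prediction step it depends on can be sketched simply: fit the ball's initial state from a handful of camera observations, then extrapolate to the paddle plane. The drag-free ballistic model and all names below are simplifying assumptions; the real system must also account for spin, air resistance, and the bounce.

```python
# Simplified sketch of the ball-prediction step: least-squares fit of initial
# position/velocity from tracked points, then ballistic extrapolation to the
# paddle plane. Drag and spin are ignored here; the real system models both.
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])  # m/s^2, z points up

def fit_state(times, positions):
    """Fit p0, v0 assuming p(t) = p0 + v0*t + 0.5*g*t^2."""
    t = np.asarray(times, dtype=float)
    p = np.asarray(positions, dtype=float)       # shape (n, 3), camera tracks
    A = np.stack([np.ones_like(t), t], axis=1)   # columns: [1, t]
    resid = p - 0.5 * GRAVITY * t[:, None] ** 2  # remove the gravity term
    coef, *_ = np.linalg.lstsq(A, resid, rcond=None)
    return coef[0], coef[1]                      # p0, v0

def intercept(p0, v0, x_paddle):
    """Where and when the ball crosses the paddle plane x = x_paddle."""
    t = (x_paddle - p0[0]) / v0[0]
    return p0 + v0 * t + 0.5 * GRAVITY * t**2, t
```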

In tests, the engineers threw 150 balls at the robot, one after the other, from across the ping pong table. The bot successfully returned the balls with a hit rate of about 88 percent across all three swing types. The robot’s strike speed approaches the top return speeds of human players and is faster than that of other robotic table tennis designs.

The researchers have since tuned the robot’s reaction time and found the arm hits balls faster than existing systems, at velocities of 20 meters per second. Advanced human players have been known to return balls at speeds of between 21 and 25 meters per second.

“Some of the goal of this project is to say we can reach the same level of athleticism that people have,” Nguyen says. “And in terms of strike speed, we’re getting really, really close.”

The team’s design has several implications for robotics and AI research. It could be adapted to improve the speed and responsiveness of humanoid robots, particularly for search-and-rescue scenarios, or situations where a robot would need to quickly react or anticipate.

This technology also has potential applications in smart robotic training systems. A robot like this could mimic the maneuvers that an opponent would do in a game environment, in a way that helps humans play and improve.

The researchers plan to further develop their system, enabling it to cover more of the table and return a wider variety of shots. This research is supported, in part, by the Robotics and AI Institute.

