The Hidden Barrier to Advanced Robotic Touch

Researchers argue that a problem lurking in the margins of many papers on touch sensors lies in the robotic skin itself.

The development of advanced robotic touch has been hindered by a seemingly innocuous yet critical issue: an insulating layer that forms on robotic skin. Researchers at Northwestern University and Tel Aviv University have overcome this barrier, paving the way for low-cost robotic skins that let robots mimic human touch.

In their study, the researchers observed that the inexpensive silicone rubber composites used to make robotic skin form an insulating layer on both their top and bottom surfaces. This layer prevents direct electrical contact between the sensing polymer and the surface electrodes that monitor it, making accurate and repeatable measurements impossible. With the barrier removed, cheap robotic skins can let robots sense an object’s curves and edges, which is essential for proper grasping.

The research team, consisting of electrical engineers and polymer materials scientists, shed light on this problem in a paper published in Advanced Electronic Materials. The study highlights the importance of validating electrical contacts, which might unknowingly obscure device performance.

“A lot of scientists misunderstand their sensor response because they lump together the behavior of the contacts with the behavior of the sensor material, resulting in inconsistent data,” said Matthew Grayson, professor of electrical and computer engineering at Northwestern’s McCormick School of Engineering. “Our work identifies the exact problem, quantifies its extent both microscopically and electrically, and gives a clear step-by-step trouble-shooting manual to fix the problem.”

The researchers found that adding electrically conducting fillers such as carbon nanotubes to rubber composites makes them ideal candidates for touch sensors, because the composite’s electrical resistance changes when it is pressed. Reading that signal, however, requires good electrical contact, which the insulating layer blocks. By sanding away the ultrathin insulating layer, the team achieved a stronger electrical contact and characterized how thick the layer was.
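
To see why the layer corrupts measurements, consider a minimal two-terminal model (an illustration, not the paper’s analysis): the meter reads the sensing composite’s resistance in series with a contact resistance at each electrode. If the insulating layer makes those contact resistances large and variable, they swamp the pressure-dependent signal; removing the layer shrinks them until the sensor dominates.

```python
# Minimal sketch (illustrative, not the paper's model): a piezoresistive
# composite read through two electrode contacts in series.
import random

def sensor_resistance(pressure_kpa: float) -> float:
    """Toy piezoresistive law: resistance drops as pressure rises."""
    return 10_000 / (1.0 + 0.5 * pressure_kpa)  # ohms

def measured_resistance(pressure_kpa: float, contact_ohms: float) -> float:
    """The meter sees the sensor plus two contact resistances in series."""
    return sensor_resistance(pressure_kpa) + 2 * contact_ohms

for label, contact in [("insulating layer intact", 1e6), ("layer sanded away", 50.0)]:
    # Contact resistance scatters from trial to trial; the insulating
    # layer makes it both huge and irreproducible.
    readings = [measured_resistance(p, contact * random.uniform(0.5, 1.5))
                for p in (0.0, 10.0, 20.0)]
    print(label, [f"{r:,.0f} ohm" for r in readings])
```

With the layer intact, the readings are dominated by megohm-scale contact scatter and barely track pressure; with it removed, the pressure response stands out clearly.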

The collaboration between Northwestern University and Tel Aviv University was essential in addressing this “contact preparation” challenge. The researchers relied on each other’s expertise to prepare the materials and to study their electrical properties, producing results that remained consistent as samples and measurement conditions varied.

As awareness of this reproducibility issue spreads among researchers, new publications in the touch-sensing literature can be relied upon more rigorously to advance the field. The research was supported by several organizations, including the U.S. National Science Foundation, Northwestern University, and Tel Aviv University through the Center for Nanoscience & Nanotechnology.

The breakthrough has significant implications for robotics, enabling robots to sense and interact with their environment more effectively. By overcoming this barrier, researchers have opened new possibilities for advanced robotic touch and for future innovations in robotics and beyond.

Shedding Light on Shadow Branches: Revolutionizing Computing Efficiency in Modern Data Centers

Researchers have developed a new technique called ‘Skia’ to help computer processors better predict future instructions and improve computing performance.

The collaboration between trailblazing engineers and industry professionals has led to a groundbreaking technique called Skia, which may transform the future of computing efficiency for modern data centers.

In data centers, large computers process massive amounts of data, but often struggle to keep up due to taxing workloads. This results in slower performance, causing search engines to generate answers more slowly or not at all. To address this issue, researchers at Texas A&M University have developed Skia in collaboration with Intel, AheadComputing, and Princeton.

The team includes Dr. Paul V. Gratz, a professor in the Department of Electrical and Computer Engineering, Dr. Daniel A. Jiménez, a professor in the Department of Computer Science and Engineering, and Chrysanthos Pepi, a graduate student in the Department of Electrical and Computer Engineering.

“Processing instructions has become a major bottleneck in modern processor design,” Gratz said. “We developed Skia to better predict what’s coming next and alleviate that bottleneck.” Skia not only helps predict future instructions but also improves instruction throughput, leading to quicker performance and lower power consumption for the data center.

“Think of throughput in terms of being a server in a restaurant,” Gratz said. “You have lots and lots of jobs to do. How many tasks can you complete or how many instructions can you execute per unit time? You want high throughput, especially for computing.”

These gains compound at scale: making data centers just 10% more efficient means a company that previously needed to build 100 data centers around the country now needs only 90. That is significant, because these facilities cost millions of dollars each and consume roughly the equivalent of an entire power plant’s output.

Modern processors keep a record of upcoming branch instructions in a structure called the branch target buffer (BTB), but branches that sit in fetched-but-never-decoded bytes remain invisible to it. Skia identifies and decodes these “shadow branches” in the unused bytes, storing them in a memory area called the Shadow Branch Buffer, which can be accessed alongside the BTB. “What makes this technique interesting is that most of the future instructions were already available. We demonstrate that Skia, with a minimal hardware budget, can make data centers more efficient, with nearly twice the performance improvement we observe from adding the same amount of storage to the existing hardware,” Pepi said.
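
A rough way to picture the mechanism (a conceptual sketch under assumed simplifications, not the published microarchitecture): the BTB maps a branch’s address to its predicted target and is filled only by branches the core has executed, while the Shadow Branch Buffer is a second, cheaper table filled by pre-decoding branches out of the unused bytes of cache lines the core has already fetched.

```python
# Conceptual sketch of the idea behind Skia (simplified assumption-level
# model, not the actual hardware design).

class BranchBuffer:
    """A tiny stand-in for a branch target buffer: PC -> predicted target."""
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.entries: dict[int, int] = {}

    def insert(self, pc: int, target: int) -> None:
        if len(self.entries) >= self.capacity:
            # Crude FIFO eviction; real BTBs use set-associative replacement.
            self.entries.pop(next(iter(self.entries)))
        self.entries[pc] = target

    def lookup(self, pc: int):
        return self.entries.get(pc)

btb = BranchBuffer(capacity=4096)  # filled by branches the core executed
sbb = BranchBuffer(capacity=1024)  # filled by pre-decoded "shadow" branches

def predict_target(pc: int):
    # Probe both structures; a shadow hit supplies a target that a plain
    # BTB miss would have stalled the front end to discover.
    hit = btb.lookup(pc)
    return hit if hit is not None else sbb.lookup(pc)

# A branch seen so far only as raw bytes in a fetched cache line can still
# be predicted, because it was decoded into the shadow buffer.
sbb.insert(0x401A2C, 0x4020F0)
print(hex(predict_target(0x401A2C)))  # -> 0x4020f0
```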

Their findings, “Skia: Exposing Shadow Branches,” were published at one of the leading computer architecture conferences, the ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS). The team also traveled to the Netherlands to present their work to colleagues from around the globe.

Funding for this research is administered by the Texas A&M Engineering Experiment Station (TEES), the official research agency for Texas A&M Engineering.

Estimating Biological Age with AI: A New Frontier in Cancer Care and Beyond

Researchers developed FaceAge, an AI tool that calculates a patient’s biological age from a photo of their face. In a new study, the researchers tied FaceAge results to health outcomes in people with cancer: when FaceAge estimated a younger age than a cancer patient’s chronological age, the patient did significantly better after cancer treatment, whereas patients with older FaceAge estimates had worse survival outcomes.

Imagine having a tool that can accurately predict a person’s biological age based on their facial appearance. This might seem like science fiction, but a team of investigators at Mass General Brigham has made this concept a reality using artificial intelligence (AI) technology called FaceAge.

FaceAge uses deep learning algorithms to analyze photographs of individuals and estimate their biological age. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets and tested in a cohort of 6,196 cancer patients from two centers, who had photographs taken at the start of radiotherapy treatment.
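
The article does not detail FaceAge’s architecture, but a face-to-age regressor of this general kind can be sketched as a small convolutional network trained to minimize the error between predicted and known age. Everything below (layer sizes, input resolution, loss) is an illustrative assumption, not the published model:

```python
# Illustrative sketch of a generic CNN age regressor; architecture and
# hyperparameters are assumptions, not FaceAge's actual design.
import torch
import torch.nn as nn

class AgeRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 1)  # one output: estimated age in years

    def forward(self, x):  # x: (batch, 3, 224, 224) cropped face photos
        return self.head(self.features(x).flatten(1)).squeeze(-1)

model = AgeRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # mean absolute error, in years

faces = torch.randn(8, 3, 224, 224)  # stand-ins for training photos
ages = torch.tensor([34.0, 61.0, 45.0, 29.0, 70.0, 52.0, 48.0, 66.0])

optimizer.zero_grad()
loss = loss_fn(model(faces), ages)
loss.backward()
optimizer.step()
print(f"batch MAE: {loss.item():.1f} years")
```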

The results were striking: patients with cancer appeared significantly older than those without cancer, with an average FaceAge that was about five years older than their chronological age. Moreover, older FaceAge predictions were associated with worse overall survival outcomes across multiple cancer types.
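
In other words, the study’s biomarker is simply the gap between estimated and chronological age. A minimal sketch of that bookkeeping, with invented numbers rather than study data:

```python
# Toy illustration of the FaceAge "age gap"; values are invented,
# not study data.
patients = [
    {"id": "A", "chronological": 62, "faceage": 71.0},  # looks older
    {"id": "B", "chronological": 58, "faceage": 52.5},  # looks younger
]
for p in patients:
    gap = p["faceage"] - p["chronological"]
    # Per the study, positive gaps (older-looking faces) were associated
    # with worse overall survival across multiple cancer types.
    print(f"patient {p['id']}: FaceAge gap {gap:+.1f} years")
```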

This technology has significant implications for cancer care. By using FaceAge to inform clinical decision-making, healthcare providers may be able to tailor treatment plans more effectively to individual patients based on their estimated biological age and health status.

The researchers also found that FaceAge outperformed clinicians in predicting short-term life expectancies of patients receiving palliative radiotherapy. This highlights the potential for AI-driven tools like FaceAge to provide valuable insights in high-pressure situations, where accurate predictions are crucial.

While further research is needed before this technology can be used in real-world clinical settings, the possibilities are vast and exciting. By expanding its capabilities to predict diseases, general health status, and lifespan, FaceAge has the potential to revolutionize our approach to healthcare.

The co-senior authors of the study, Hugo Aerts, PhD, and Ray Mak, MD, emphasize that this technology opens up new avenues for biomarker discovery from photographs, with implications far beyond cancer care or predicting age. As we increasingly consider chronic diseases as diseases of aging, accurately predicting an individual’s aging trajectory becomes crucial for developing effective treatment plans.

The door to a whole new realm of biomedical innovation has been opened, and it is essential that this technology be developed within a strong regulatory and ethical framework to ensure its safe use in various applications, ultimately helping to save lives.

Ping Pong Robot Aces High-Speed Precision Shots

Engineers developed a ping-pong-playing robot that quickly estimates the speed and trajectory of an incoming ball and precisely hits it to a desired location on the table.

The MIT engineers’ latest creation is a powerful and lightweight ping pong bot that returns shots with high-speed precision. This table tennis tech has come a long way since the 1980s, when researchers first started building robots to play ping pong. The problem requires a unique combination of technologies, including high-speed machine vision, fast and nimble motors and actuators, precise manipulator control, and accurate real-time prediction.

The team’s new design comprises a multijointed robotic arm that is fixed to one end of a standard ping pong table and wields a standard ping pong paddle. Aided by several high-speed cameras and a high-bandwidth predictive control system, the robot quickly estimates the speed and trajectory of an incoming ball and executes one of several swing types – loop, drive, or chop – to precisely hit the ball to a desired location on the table with various types of spin.
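
The prediction step can be pictured in miniature: fit the incoming ball’s initial position and velocity to a handful of camera samples, then extrapolate to the paddle’s strike plane. The sketch below assumes gravity-only flight for brevity; the actual robot must also account for drag and the spin it both reads and imparts:

```python
# Minimal incoming-ball predictor (assumed physics: gravity only; a real
# table-tennis robot must also model air drag and Magnus forces from spin).
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2 (along -z)

def fit_trajectory(times, positions):
    """Least-squares fit of initial position p0 and velocity v0 from
    camera samples, using p(t) = p0 + v0*t - 0.5*G*t^2 in z."""
    t = np.asarray(times, dtype=float)
    p = np.array(positions, dtype=float)
    p[:, 2] += 0.5 * G * t**2            # remove the known gravity term
    A = np.stack([np.ones_like(t), t], axis=1)
    coef, *_ = np.linalg.lstsq(A, p, rcond=None)
    return coef[0], coef[1]              # p0 (m), v0 (m/s)

def position_at(p0, v0, t):
    return p0 + v0 * t - np.array([0.0, 0.0, 0.5 * G * t**2])

# Three camera samples of a ball flying toward the strike plane at x = 0.
p0, v0 = fit_trajectory(
    [0.00, 0.01, 0.02],
    [[2.00, 0.10, 0.300], [1.90, 0.11, 0.301], [1.80, 0.12, 0.300]],
)
t_hit = -p0[0] / v0[0]                   # x(t) is linear, so solve x(t) = 0
print(f"intercept in {t_hit:.3f} s at {position_at(p0, v0, t_hit)}")
```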

In tests, the engineers threw 150 balls at the robot, one after the other, from across the ping pong table. The bot successfully returned the balls with a hit rate of about 88 percent across all three swing types. The robot’s strike speed approaches the top return speeds of human players and is faster than that of other robotic table tennis designs.

The researchers have since tuned the robot’s reaction time and found that the arm hits balls faster than existing robotic systems, at velocities of 20 meters per second. Advanced human players have been known to return balls at speeds between 21 and 25 meters per second.

“Some of the goal of this project is to say we can reach the same level of athleticism that people have,” Nguyen says. “And in terms of strike speed, we’re getting really, really close.”

The team’s design has several implications for robotics and AI research. It could be adapted to improve the speed and responsiveness of humanoid robots, particularly for search-and-rescue scenarios, or situations where a robot would need to quickly react or anticipate.

This technology also has potential applications in smart robotic training systems. A robot like this could mimic the maneuvers that an opponent would do in a game environment, in a way that helps humans play and improve.

The researchers plan to further develop their system, enabling it to cover more of the table and return a wider variety of shots. This research is supported, in part, by the Robotics and AI Institute.
