We’re experimenting with AI-generated content to help deliver information faster and more efficiently.
While we try to keep things accurate, this content is part of an ongoing experiment and may not always be reliable.
Please double-check important details — we’re not responsible for how the information is used.

Communications

Smart Bandage Takes Another Step Forward: Revolutionizing Chronic Wound Care with Real-Time Monitoring and Treatment

The iCares bandage uses innovative microfluidic components, sensors, and machine learning to sample and analyze wounds and provide data to help patients and caregivers make treatment decisions.

Smart Bandages have long been envisioned as a “lab on skin” that could monitor and treat chronic wounds in patients. Caltech Professor of Medical Engineering Wei Gao and his colleagues are now one step closer to achieving this goal. After successfully demonstrating the efficacy of their smart bandage, iCares, in animal models, they have now cleared another hurdle by showing its ability to continually sample fluid from human patients with chronic wounds.

The improved version of the smart bandage, which integrates three microfluidic components, is designed to clear excess moisture from wounds while providing real-time data about biomarkers present. The innovative microfluidics system ensures that only fresh samples are analyzed, allowing for accurate measurements of biomarkers such as nitric oxide and hydrogen peroxide.

Gao’s team has demonstrated the potential of their smart bandage to detect signs of inflammation and infection in patients up to three days before symptoms appear. Furthermore, they have developed a machine-learning algorithm that can accurately classify wounds and predict healing time with a level of accuracy comparable to expert clinicians.
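To make the classification idea concrete, here is a minimal sketch of how biomarker readings could feed a wound-status score. This is purely illustrative: the actual iCares model, its features, and its weights are not described in this article, so the feature names, weights, and threshold below are all assumptions.

```python
import math

# Toy sketch of a biomarker-based wound classifier (illustrative only --
# the real iCares machine-learning model is not published here).
# Features: hypothetical normalized levels of nitric oxide (NO) and
# hydrogen peroxide (H2O2) measured in sampled wound fluid.

WEIGHTS = {"no": 1.8, "h2o2": 2.3}  # assumed weights, not from the paper
BIAS = -2.0                          # assumed bias term

def infection_risk(no_level: float, h2o2_level: float) -> float:
    """Logistic score in [0, 1]; higher suggests inflammation/infection."""
    z = WEIGHTS["no"] * no_level + WEIGHTS["h2o2"] * h2o2_level + BIAS
    return 1.0 / (1.0 + math.exp(-z))

# A quiet, healing wound vs. one with elevated biomarker levels:
print(round(infection_risk(0.1, 0.1), 3))  # low score
print(round(infection_risk(0.9, 0.9), 3))  # high score
```

In practice such a score would be one input among many; the article notes the algorithm also predicts healing time, which would require a regression output rather than a single risk score.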

The iCares system consists of a flexible, biocompatible polymer strip that can be 3D printed at low cost. It pairs single-use nanoengineered biomarker sensor arrays with reusable electronics that handle signal processing and wirelessly transmit data to a user interface such as a smartphone. The three microfluidic modules comprise a membrane that draws wound fluid from the surface, a bioinspired component that shuttles it to the sensor array for analysis, and a micropillar module that carries the sampled fluid away from the bandage.

The implications of this innovation are vast, with potential applications extending beyond chronic wound care. By integrating real-time monitoring and treatment capabilities into wearable devices, we may soon see significant improvements in patient outcomes and quality of life.

Artificial Intelligence

World’s First Petahertz-Speed Phototransistor Achieved in Ambient Conditions

Researchers demonstrated a way to manipulate electrons using pulses of light lasting less than a trillionth of a second, recording electrons bypassing a physical barrier almost instantaneously — a feat that redefines the potential limits of computer processing power.

Imagine a world where computers can process information at speeds a million times faster than today’s fastest processors. A team of scientists from the University of Arizona, in collaboration with international researchers, has made this vision a reality by creating the world’s first petahertz-speed phototransistor that operates in ambient conditions.

The breakthrough was achieved through a groundbreaking experiment where researchers used pulses of light to manipulate electrons in graphene, a material composed of a single layer of carbon atoms. By leveraging a quantum effect known as tunneling, they recorded electrons bypassing a physical barrier almost instantaneously, redefining the potential limits of computer processing power.

“This achievement represents a huge leap forward in the development of ultrafast computer technologies,” said Mohammed Hassan, an associate professor of physics and optical sciences at the University of Arizona. “By leaning on the discovery of quantum computers, we can develop hardware that matches the current revolution in information technology software. Ultrafast computers will greatly assist discoveries in space research, chemistry, health care, and more.”

The team was originally studying the electrical conductivity of modified samples of graphene when they discovered the unexpected result of electrons slipping through the material almost instantly. This near-instant tunneling effect was captured and tracked in real time, paving the way for the development of a petahertz-speed transistor.

Using a commercially available graphene phototransistor that was modified to introduce a special silicon layer, the researchers used a laser that switches on and off every 638 attoseconds to create what Hassan called “the world’s fastest petahertz quantum transistor.”

A transistor is a device that acts as an electronic switch or amplifier, controlling the flow of electricity between two points; it is fundamental to modern electronics. For reference, a single attosecond is one-quintillionth of a second.
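A quick back-of-envelope conversion shows why a 638-attosecond switching period puts the device in petahertz territory; the period comes from the article, and the arithmetic below is just the standard frequency–period relationship.

```python
# Back-of-envelope check: a switching period of 638 attoseconds
# corresponds to a frequency in the petahertz (10^15 Hz) range.
ATTOSECOND = 1e-18  # seconds

period_s = 638 * ATTOSECOND
frequency_hz = 1.0 / period_s         # ~1.57e15 Hz
frequency_phz = frequency_hz / 1e15   # express in petahertz

print(f"{frequency_phz:.2f} PHz")     # ~1.57 PHz
```

For comparison, today’s fastest consumer processors run at a few gigahertz (10^9 Hz), roughly six orders of magnitude slower — consistent with the “million times faster” framing above.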

While some scientific advancements occur under strict conditions, including temperature and pressure, this new transistor performed in ambient conditions – opening the way to commercialization and use in everyday electronics. Hassan is working with Tech Launch Arizona to patent and market innovations related to this invention, aiming to collaborate with industry partners to realize a petahertz-speed transistor on a microchip.

The University of Arizona is already known for developing the world’s fastest electron microscope, and they hope to also be recognized for creating the first petahertz-speed transistor. This achievement has the potential to revolutionize computing as we know it, enabling faster processing speeds, improved efficiency, and breakthroughs in various fields.


Artificial Intelligence

Empowering Robots with Human-Like Perception to Navigate Complex Terrain

Researchers have developed a novel framework named WildFusion that fuses vision, vibration and touch to enable robots to ‘sense’ and navigate complex outdoor environments much like humans do.

The human body is incredibly adept at navigating its surroundings. Our senses work together in harmony to help us avoid obstacles, predict potential dangers, and make our way through even the most challenging environments. We can feel the roughness of tree bark beneath our fingers, smell the sweet scent of blooming flowers, hear the gentle rustle of leaves, and see the intricate details of the world around us.

Robots, on the other hand, have long relied solely on visual information to move through their environment. However, outside of Hollywood, multisensory navigation has remained a significant challenge for machines. The forest, with its dense undergrowth, fallen logs, and ever-changing terrain, is a maze of uncertainty for traditional robots.

Researchers at Duke University have developed a novel framework called WildFusion that fuses vision, vibration, and touch to enable robots to “sense” complex outdoor environments much like humans do. This groundbreaking work has been accepted to the IEEE International Conference on Robotics and Automation (ICRA 2025), which will be held May 19-23, 2025, in Atlanta, Georgia.

WildFusion is built on a quadruped robot that integrates multiple sensing modalities, including an RGB camera, lidar, inertial sensors, contact microphones, and tactile sensors. The camera and lidar capture the environment’s geometry, color, distance, and other visual details. However, what makes WildFusion special is its use of acoustic vibrations and touch.

As the robot walks, contact microphones record the unique vibrations generated by each step, capturing subtle differences such as the crunch of dry leaves versus the soft squish of mud. Meanwhile, the tactile sensors measure how much force is applied to each foot, helping the robot sense stability or slipperiness in real-time. These added senses are complemented by the inertial sensor that collects acceleration data to assess how much the robot is wobbling, pitching, or rolling as it traverses uneven ground.

Each type of sensory data is then processed through specialized encoders and fused into a single, rich representation. At the heart of WildFusion is a deep learning model based on the idea of implicit neural representations. Unlike traditional methods that treat the environment as a collection of discrete points, this approach models complex surfaces and features continuously, allowing the robot to make smarter, more intuitive decisions about where to step, even when its vision is blocked or ambiguous.
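The encode-then-fuse pattern described above can be sketched in a few lines. This is a stand-in illustration only: WildFusion’s real encoders are learned deep networks feeding an implicit neural representation, whereas the functions and feature choices below are invented for clarity.

```python
import numpy as np

# Sketch of the fusion pattern: each modality has its own encoder that maps
# raw readings to a fixed-size feature vector, and the vectors are fused
# (here by simple concatenation) into one representation. The encoders
# below are hand-written stand-ins, not WildFusion's learned networks.

rng = np.random.default_rng(0)

def encode_audio(vibration: np.ndarray) -> np.ndarray:
    # Summary statistics of the contact-microphone signal.
    return np.array([vibration.mean(), vibration.std(), np.abs(vibration).max()])

def encode_tactile(foot_forces: np.ndarray) -> np.ndarray:
    # Per-foot force readings, normalized.
    return foot_forces / (np.linalg.norm(foot_forces) + 1e-8)

def encode_inertial(accel: np.ndarray) -> np.ndarray:
    # Wobble proxy: variance of acceleration along each axis.
    return accel.var(axis=0)

def fuse(*features: np.ndarray) -> np.ndarray:
    return np.concatenate(features)

vibration = rng.normal(size=256)              # contact-microphone samples
foot_forces = np.array([4.0, 3.5, 4.2, 3.8])  # one force reading per leg
accel = rng.normal(size=(100, 3))             # accelerometer stream (x, y, z)

representation = fuse(encode_audio(vibration),
                      encode_tactile(foot_forces),
                      encode_inertial(accel))
print(representation.shape)  # (10,) -> 3 audio + 4 tactile + 3 inertial
```

The payoff of fusing into one vector is that a downstream model sees all modalities jointly, so a confident tactile reading can compensate for a noisy or occluded visual one.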

WildFusion was tested at the Eno River State Park in North Carolina near Duke’s campus, successfully helping a robot navigate dense forests, grasslands, and gravel paths. The team plans to expand the system by incorporating additional sensors, such as thermal or humidity detectors, to further enhance a robot’s ability to understand and adapt to complex environments.

One of the key challenges for robotics today is developing systems that not only perform well in the lab but reliably function in real-world settings. WildFusion provides vast potential applications beyond forest trails, including disaster response across unpredictable terrains, inspection of remote infrastructure, and autonomous exploration.

WildFusion’s multimodal approach lets the robot “fill in the blanks” when sensor data is sparse or noisy, much like humans do.



Communications

Hexagons for Data Protection: A New Method for Location Proofing without Personal Data Disclosure

Location data is considered particularly sensitive — its misuse can have serious consequences. Researchers have now developed a method that allows individuals to cryptographically prove their location — without revealing it. The foundation of this method is the so-called zero-knowledge proof with standardized floating-point numbers.

Hexagons for data protection is a novel method that protects individuals’ privacy while still providing verifiable location data. This innovative approach uses a hierarchical hexagonal grid system to divide the Earth’s surface into cells that can be represented at various resolutions, from broad regional levels down to individual street segments. The key feature of this method is its ability to combine precision and privacy in a practically usable way.

The researchers behind this breakthrough used zero-knowledge proofs, a mathematical concept that verifies the truth of a statement without revealing the underlying data. This was combined with standardized floating-point numbers, which ensured computational accuracy and avoided unintended deviations during complex operations. The proof can be computed in less than a second, making it efficient for practical use cases.

One example of an application is Peer-to-Peer Proximity Testing, where two people can determine whether they are in close physical proximity without revealing their exact position. In a prototype, a user can prove in just 0.26 seconds that they are near a specific region, with the desired level of precision adjustable to demonstrate being in a particular neighborhood or park.
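The hierarchical-grid idea behind proximity testing can be illustrated with a toy model. To be clear, this is not the paper’s zero-knowledge construction (which proves the statement cryptographically without revealing either location); it only shows how a hierarchical cell identifier lets precision be dialed up or down. The digit-string cell IDs below are invented for illustration.

```python
# Toy illustration of hierarchical grid cells (NOT the zero-knowledge
# proof from the research). Model a cell ID as a digit string where each
# extra digit refines the location; truncating yields a coarser ancestor
# cell, so revealing fewer digits reveals less precision.

def parent_cell(cell_id: str, resolution: int) -> str:
    """Coarsen a cell ID to the given resolution (number of digits kept)."""
    if resolution > len(cell_id):
        raise ValueError("cell ID has no such fine resolution")
    return cell_id[:resolution]

def nearby(cell_a: str, cell_b: str, resolution: int) -> bool:
    """True if both locations fall in the same cell at `resolution`."""
    return parent_cell(cell_a, resolution) == parent_cell(cell_b, resolution)

# Two users in the same neighborhood but on different streets
# (hypothetical fine-resolution cell IDs):
alice = "8723456"
bob   = "8723499"

print(nearby(alice, bob, 5))  # True  -- same neighborhood-level cell
print(nearby(alice, bob, 7))  # False -- different street-level cells
```

In the actual system, the comparison would happen inside a zero-knowledge circuit, so neither party learns the other’s cell ID — only the yes/no answer at the agreed resolution.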

This research contributes not only to location proofing but also to the broader field of cryptography. The developed floating-point zero-knowledge circuits are reusable and could be applied in other areas, such as verifying physical measurement data or secure machine learning systems. This opens up new possibilities for trusted systems, including digital healthcare, mobility applications, or identity protection.

Overall, hexagons for data protection offers a promising solution for preserving individuals’ privacy while still providing verifiable location data, making it an essential tool for various industries and applications.
