
Computers & Math

Revolutionizing Superconductivity: 3D Nanostructures Pave the Way for Reconfigurable Devices

An international team has pioneered a nano-3D printing method to create superconducting nanostructures, leading to groundbreaking technological advancements.


A groundbreaking study has successfully created three-dimensional superconducting nanostructures, akin to a nanoscale “3D printer.” This achievement opens doors to unprecedented control over the superconducting state, enabling researchers to switch it on and off in different parts of the structure by rotating it in a magnetic field.

The research team, led by scientists at the Max Planck Institute for Chemical Physics of Solids, has demonstrated the creation of complex 3D geometries at the nanoscale, a feat that was previously considered impossible. This breakthrough has significant implications for the development of new superconducting technologies and devices.

Superconductors are materials that, below a critical temperature, exhibit zero electrical resistance and expel magnetic fields. This striking behavior stems from the formation of Cooper pairs, bound pairs of electrons that move coherently through the material without scattering. Controlling the superconducting state at the nanoscale, however, has proven a significant challenge, hindering the exploration of novel effects and future technological developments.

By patterning superconductors into 3D nanogeometries, the researchers achieved localized control over the superconducting state. This enables reconfigurable superconducting devices whose individual parts can be switched on and off simply by rotating the structure in a magnetic field.
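To make the rotation mechanism concrete, here is a toy numerical sketch, not the study’s actual model: assume each nanowire segment stays superconducting only while the magnetic-field component perpendicular to it remains below a critical value, so rotating the whole structure in a fixed field switches individual segments on and off. The geometry and critical field below are invented for illustration.

```python
import numpy as np

# Toy model (illustrative only): a nanowire segment is taken to remain
# superconducting while the field component perpendicular to it stays
# below a hypothetical critical value B_C. Rotating the structure in a
# fixed external field then changes the state segment by segment.

B_C = 0.4    # hypothetical critical perpendicular field (tesla)
B_MAG = 0.6  # magnitude of the applied field (tesla)

# Unit direction vectors for three segments of an invented 3D geometry.
segments = {
    "vertical leg":    np.array([0.0, 0.0, 1.0]),
    "horizontal arm":  np.array([1.0, 0.0, 0.0]),
    "diagonal bridge": np.array([1.0, 0.0, 1.0]) / np.sqrt(2),
}

def is_superconducting(direction, field):
    """True if the field component perpendicular to the segment is below B_C."""
    parallel = np.dot(field, direction) * direction
    perpendicular = np.linalg.norm(field - parallel)
    return perpendicular < B_C

for angle_deg in (0, 45, 90):
    theta = np.radians(angle_deg)
    # Fixed-magnitude field, rotated in the x-z plane relative to the structure.
    field = B_MAG * np.array([np.sin(theta), 0.0, np.cos(theta)])
    states = {name: "ON" if is_superconducting(d, field) else "off"
              for name, d in segments.items()}
    print(f"rotation {angle_deg:3d} deg: {states}")
```

With these toy numbers, each rotation angle leaves a different segment superconducting, which is the flavor of orientation-controlled switching the study reports.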

The implications of this breakthrough are far-reaching, offering a new platform for building adaptive or multi-purpose superconducting components. The ability to propagate defects of the superconducting state also opens the door to complex superconducting logic and neuromorphic architectures, setting the stage for a new generation of reconfigurable superconducting technologies.

This study has been published in the journal Advanced Functional Materials and represents a significant step forward in the field of nanotechnology. The researchers involved have demonstrated their ability to push the boundaries of what was previously thought possible, paving the way for further innovations and discoveries.

Artificial Intelligence

Shedding Light on Shadow Branches: Revolutionizing Computing Efficiency in Modern Data Centers

Researchers have developed a new technique called ‘Skia’ to help computer processors better predict future instructions and improve computing performance.


A collaboration between trailblazing engineers and industry partners has produced a groundbreaking technique called Skia, which may transform computing efficiency in modern data centers.

In data centers, large computers process massive amounts of data but often struggle to keep up with taxing workloads. The result is slower performance: search engines, for example, return answers more slowly or not at all. To address this, researchers at Texas A&M University developed Skia in collaboration with Intel, AheadComputing, and Princeton University.

The team includes Dr. Paul V. Gratz, a professor in the Department of Electrical and Computer Engineering, Dr. Daniel A. Jiménez, a professor in the Department of Computer Science and Engineering, and Chrysanthos Pepi, a graduate student in the Department of Electrical and Computer Engineering.

“Processing instructions has become a major bottleneck in modern processor design,” Gratz said. “We developed Skia to better predict what’s coming next and alleviate that bottleneck.” Skia not only helps predict future instructions better but also improves the system’s instruction throughput.

“Think of throughput in terms of being a server in a restaurant,” Gratz said. “You have lots and lots of jobs to do. How many tasks can you complete, or how many instructions can you execute, per unit time? You want high throughput, especially for computing.”

Improving throughput means quicker performance and lower power consumption for the data center. Making data centers even 10% more efficient means a company that previously needed to build 100 of them now needs only 90. That is significant: each data center costs millions of dollars and consumes roughly the equivalent of an entire power plant’s output.

Shadow branches are branch instructions that are brought into the processor in fetched cache lines but never decoded, because they lie off the executed path. Skia identifies and decodes these shadow branches in the otherwise unused bytes, storing them in a memory area called the Shadow Branch Buffer (SBB), which can be accessed alongside the branch target buffer (BTB). “What makes this technique interesting is that most of the future instructions were already available, and we demonstrate that Skia, with a minimal hardware budget, can make data centers more efficient, as we observe nearly twice the performance improvement versus adding the same amount of storage to the existing hardware,” Pepi said.
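In software terms, the mechanism can be caricatured as follows. This is a deliberately simplified model: real front ends use fixed-size, set-associative structures and variable-length x86 decoding, and the paper’s actual design should be consulted for specifics.

```python
# Simplified software model of the shadow-branch idea (illustrative only).
# A cache line fetched for one branch also carries bytes for other, not-yet-
# executed branches; Skia-style hardware decodes those "shadow" branches and
# keeps them in a separate buffer consulted alongside the BTB.

class BranchBuffers:
    def __init__(self):
        self.btb = {}  # PC -> target, filled only when a branch is executed
        self.sbb = {}  # PC -> target, shadow branches decoded from spare bytes

    def on_fetch(self, line_branches, executed_pc):
        """line_branches maps PC -> target for every branch in a fetched line;
        only the branch at executed_pc is on the executed path right now."""
        for pc, target in line_branches.items():
            if pc == executed_pc:
                self.btb[pc] = target   # normal path: decoded and installed
            else:
                self.sbb[pc] = target   # shadow branch: decoded from bytes
                                        # that were fetched but never executed

    def predict_target(self, pc):
        """The front end looks up the BTB and the SBB side by side."""
        if pc in self.btb:
            return self.btb[pc], "BTB hit"
        if pc in self.sbb:
            return self.sbb[pc], "SBB hit (a miss without the shadow buffer)"
        return None, "miss"

bufs = BranchBuffers()
# A line holds two branches; only the one at 0x1004 executes on this pass.
bufs.on_fetch({0x1004: 0x2000, 0x1020: 0x3000}, executed_pc=0x1004)
print(bufs.predict_target(0x1004))  # BTB hit
print(bufs.predict_target(0x1020))  # SBB hit: known before first execution
```

The point of the sketch is the last line: the second branch has never executed, yet its target is already known the first time the front end asks for it.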

Their findings, “Skia: Exposing Shadow Branches,” were published at the ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS), one of the leading computer architecture conferences. The team also traveled to the Netherlands to present the work to colleagues from around the globe.

Funding for this research is administered by the Texas A&M Engineering Experiment Station (TEES), the official research agency for Texas A&M Engineering.


Artificial Intelligence

Estimating Biological Age with AI: A New Frontier in Cancer Care and Beyond

Researchers developed FaceAge, an AI tool that calculates a patient’s biological age from a photo of their face. In a new study, the researchers tied FaceAge results to health outcomes in people with cancer: when FaceAge estimated a younger age than a cancer patient’s chronological age, the patient did significantly better after cancer treatment, whereas patients with older FaceAge estimates had worse survival outcomes.


Imagine having a tool that can accurately predict a person’s biological age based on their facial appearance. This might seem like science fiction, but a team of investigators at Mass General Brigham has made this concept a reality using artificial intelligence (AI) technology called FaceAge.

FaceAge uses deep learning algorithms to analyze photographs of individuals and estimate their biological age. The tool was trained on 58,851 photos of presumed healthy individuals from public datasets and tested in a cohort of 6,196 cancer patients from two centers, who had photographs taken at the start of radiotherapy treatment.
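The paper should be consulted for the actual architecture, but the general shape of such a face-to-age regressor can be sketched as below; the backbone, input size, and loss here are illustrative assumptions, not the study’s implementation.

```python
import torch
import torch.nn as nn
from torchvision import models

# Minimal sketch of a FaceAge-style regressor: a CNN backbone whose final
# layer outputs a single number, the estimated biological age in years.
# Backbone choice, input size, and loss are assumptions for illustration.

class AgeRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.backbone = models.resnet50(weights=None)
        self.backbone.fc = nn.Linear(self.backbone.fc.in_features, 1)

    def forward(self, x):                   # x: (N, 3, 224, 224) face crops
        return self.backbone(x).squeeze(1)  # (N,) predicted ages in years

model = AgeRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.L1Loss()  # mean absolute error, in years

# One illustrative training step on a dummy batch of stand-in data.
photos = torch.randn(8, 3, 224, 224)
ages = torch.tensor([34., 61., 45., 28., 70., 52., 39., 66.])
loss = loss_fn(model(photos), ages)
optimizer.zero_grad()
loss.backward()
optimizer.step()
print(f"dummy-batch error: {loss.item():.1f} years")
```

In the study, the clinically meaningful signal is the gap between the predicted FaceAge and the patient’s chronological age, not the raw prediction itself.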

The results were striking: patients with cancer appeared significantly older than those without cancer, with an average FaceAge that was about five years older than their chronological age. Moreover, older FaceAge predictions were associated with worse overall survival outcomes across multiple cancer types.

This technology has significant implications for cancer care. By using FaceAge to inform clinical decision-making, healthcare providers may be able to tailor treatment plans more effectively to individual patients based on their estimated biological age and health status.

The researchers also found that FaceAge outperformed clinicians in predicting short-term life expectancies of patients receiving palliative radiotherapy. This highlights the potential for AI-driven tools like FaceAge to provide valuable insights in high-pressure situations, where accurate predictions are crucial.

While further research is needed before this technology can be used in real-world clinical settings, the possibilities are vast and exciting. By expanding its capabilities to predict diseases, general health status, and lifespan, FaceAge has the potential to revolutionize our approach to healthcare.

The co-senior authors of the study, Hugo Aerts, PhD, and Ray Mak, MD, emphasize that this technology opens up new avenues for biomarker discovery from photographs, with implications far beyond cancer care or predicting age. As we increasingly consider chronic diseases as diseases of aging, accurately predicting an individual’s aging trajectory becomes crucial for developing effective treatment plans.

The door to a whole new realm of biomedical innovation has been opened, and it is essential that this technology be developed within a strong regulatory and ethical framework to ensure its safe use in various applications, ultimately helping to save lives.


Artificial Intelligence

Ping Pong Robot Aces High-Speed Precision Shots

Engineers developed a ping-pong-playing robot that quickly estimates the speed and trajectory of an incoming ball and precisely hits it to a desired location on the table.


The MIT engineers’ latest creation is a powerful and lightweight ping pong bot that returns shots with high-speed precision. This table tennis tech has come a long way since the 1980s, when researchers first started building robots to play ping pong. The problem requires a unique combination of technologies, including high-speed machine vision, fast and nimble motors and actuators, precise manipulator control, and accurate real-time prediction.

The team’s new design comprises a multijointed robotic arm that is fixed to one end of a standard ping pong table and wields a standard ping pong paddle. Aided by several high-speed cameras and a high-bandwidth predictive control system, the robot quickly estimates the speed and trajectory of an incoming ball and executes one of several swing types – loop, drive, or chop – to precisely hit the ball to a desired location on the table with various types of spin.
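To give a flavor of the prediction step, here is a minimal ballistic fit. This is our simplification: a system like the one described would also model air drag, spin (the Magnus force), and the bounce off the table, and all the numbers below are invented.

```python
import numpy as np

G = np.array([0.0, 0.0, -9.81])  # gravity (m/s^2), z pointing up

def fit_trajectory(times, positions):
    """Least-squares fit of p(t) = p0 + v0*t + 0.5*g*t^2 to camera samples."""
    times = np.asarray(times)
    # Subtract the known gravity term, leaving a model linear in p0 and v0.
    targets = np.asarray(positions) - 0.5 * G * times[:, None] ** 2
    A = np.column_stack([np.ones_like(times), times])
    coef, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return coef[0], coef[1]  # p0 (m), v0 (m/s)

def predict_crossing(p0, v0, x_paddle):
    """Time and position at which the ball reaches the plane x = x_paddle."""
    t = (x_paddle - p0[0]) / v0[0]  # x-velocity is constant in this model
    return t, p0 + v0 * t + 0.5 * G * t ** 2

# Synthetic camera samples from the first 60 ms of flight (invented numbers).
t_obs = np.array([0.00, 0.02, 0.04, 0.06])
p0_true, v0_true = np.array([0.0, 0.1, 0.3]), np.array([8.0, -0.5, 1.0])
p_obs = p0_true + v0_true * t_obs[:, None] + 0.5 * G * t_obs[:, None] ** 2

p0, v0 = fit_trajectory(t_obs, p_obs)
t_hit, p_hit = predict_crossing(p0, v0, x_paddle=2.4)
print(f"intercept in {t_hit * 1000:.0f} ms at y={p_hit[1]:.2f} m, z={p_hit[2]:.2f} m")
```

Given the predicted crossing point and time, the controller can select a swing type and plan the arm motion to meet the ball there.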

In tests, the engineers threw 150 balls at the robot, one after the other, from across the ping pong table. The bot successfully returned the balls with a hit rate of about 88 percent across all three swing types. The robot’s strike speed approaches the top return speeds of human players and is faster than that of other robotic table tennis designs.

The researchers have since tuned the robot’s reaction time and found the arm hits balls faster than existing systems, at velocities of 20 meters per second. Advanced human players have been known to return balls at speeds between 21 and 25 meters per second.

“Some of the goal of this project is to say we can reach the same level of athleticism that people have,” Nguyen says. “And in terms of strike speed, we’re getting really, really close.”

The team’s design has several implications for robotics and AI research. It could be adapted to improve the speed and responsiveness of humanoid robots, particularly for search-and-rescue scenarios, or situations where a robot would need to quickly react or anticipate.

This technology also has potential applications in smart robotic training systems. A robot like this could mimic the maneuvers that an opponent would do in a game environment, in a way that helps humans play and improve.

The researchers plan to further develop their system, enabling it to cover more of the table and return a wider variety of shots. This research is supported, in part, by the Robotics and AI Institute.

