
Computers & Math

A Breakthrough in AR Glasses: One Glass, Full Color

Augmented-reality (AR) technology is rapidly finding its way into everyday life, from education and healthcare to gaming and entertainment. Yet AR headsets remain bulky and heavy, making prolonged wear uncomfortable. A research team has now slashed both thickness and weight using a single-layer waveguide.


POSTECH researchers have made a breakthrough in augmented-reality (AR) technology that could change how AR fits into everyday life. The core optical component, typically bulky and heavy, can now be made thin and light, making prolonged wear comfortable.

One of the main hurdles to commercializing AR glasses was the waveguide, a crucial component that guides virtual images directly to the user’s eye. Conventional designs required separate layers for red, green, and blue light, leading to increased weight and thickness. POSTECH researchers have eliminated this need by developing an achromatic metagrating that handles all colors in a single glass layer.
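
To see why earlier designs needed one layer per color, consider the ordinary grating equation: the angle at which light diffracts into the glass depends on its wavelength, so a single grating period steers red, green, and blue along different paths. A minimal sketch of that dispersion (the period and refractive index below are illustrative assumptions, not the paper's parameters):

```python
import math

# Toy illustration (not the paper's design): first-order diffraction
# angles inside a glass waveguide for a conventional grating with a
# single period. Period and index are assumed values.
n_glass = 1.5          # refractive index of the waveguide (assumed)
period_nm = 500.0      # grating period (assumed)

for color, wavelength_nm in [("red", 635.0), ("green", 532.0), ("blue", 450.0)]:
    # Grating equation at normal incidence, first order:
    # n_glass * sin(theta) = wavelength / period
    s = wavelength_nm / (n_glass * period_nm)
    theta = math.degrees(math.asin(s))
    print(f"{color:5s} ({wavelength_nm:.0f} nm) -> {theta:5.1f} deg")

# Output: red ~57.9 deg, green ~45.2 deg, blue ~36.9 deg. Because each
# color bends differently, conventional waveguides used one layer per
# color; the achromatic metagrating compensates this within one layer.
```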

The key to this innovation lies in an array of nanoscale silicon-nitride pillars whose geometry was finely tuned with a stochastic topology-optimization algorithm to steer light with maximum efficiency. In experiments, the researchers produced vivid full-color images through a single-layer waveguide just 500 µm (0.5 mm) thick.
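
The paper's optimization runs on full electromagnetic simulations; as a hedged sketch of the general idea only, the toy below randomly perturbs pillar widths and keeps changes that improve a stand-in figure of merit (the objective function and all parameters are invented for illustration):

```python
import random

# Hedged sketch of stochastic optimization in the spirit of the paper's
# approach; the real figure of merit comes from electromagnetic
# simulation, replaced here by a stand-in function.

def figure_of_merit(widths):
    # Stand-in objective: reward pillar widths near a hypothetical
    # optimum of 120 nm. Purely illustrative.
    target = 120.0
    return -sum((w - target) ** 2 for w in widths) / len(widths)

def stochastic_optimize(n_pillars=32, iters=5000, step=5.0, seed=0):
    rng = random.Random(seed)
    widths = [rng.uniform(60.0, 200.0) for _ in range(n_pillars)]  # nm
    best = figure_of_merit(widths)
    for _ in range(iters):
        i = rng.randrange(n_pillars)
        old = widths[i]
        widths[i] = min(200.0, max(60.0, old + rng.gauss(0.0, step)))
        new = figure_of_merit(widths)
        if new >= best:
            best = new          # keep the random perturbation
        else:
            widths[i] = old     # revert it
    return widths, best

widths, score = stochastic_optimize()
print(f"best figure of merit: {score:.3f}")
```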

The new design erases color blur while outperforming multilayer optics in brightness and color uniformity. This breakthrough has significant implications for the commercialization of AR glasses, which could become as thin and light as ordinary eyewear. The era of truly everyday AR is now within reach.

“This work marks a key milestone for next-generation AR displays,” said Prof. Junsuk Rho. “Coupled with scalable, large-area fabrication, it brings commercialization within reach.”

The study was carried out by POSTECH’s Departments of Mechanical, Chemical and Electrical Engineering and the Graduate School of Interdisciplinary Bioscience & Bioengineering, in collaboration with the Visual Team at Samsung Research. It was published online on April 30, 2025, in Nature Nanotechnology.

This research was supported by POSCO Holdings N.EX.T Impact, Samsung Research, the Ministry of Trade, Industry and Energy’s Alchemist Project, the Ministry of Science and ICT’s Global Convergence Research Support Program, and the Mid-Career Researcher Program.

Computers & Math

Quantum Computers Just Beat Classical Ones – Exponentially and Unconditionally

A research team has achieved the holy grail of quantum computing: an exponential speedup that’s unconditional. By using clever error correction and IBM’s powerful 127-qubit processors, they tackled a variation of Simon’s problem, showing quantum machines are now breaking free from classical limitations, for real.


Quantum computers have been touted as potential game-changers for computation, medicine, coding, and materials discovery – but only if they can be made to work reliably. A major obstacle has been the noise, or errors, produced during computations, which has kept quantum machines less powerful than classical computers – until recently.

Daniel Lidar, holder of the Viterbi Professorship in Engineering and Professor of Electrical & Computer Engineering at the USC Viterbi School of Engineering, has made significant strides in quantum error correction. In a recent study with collaborators at USC and Johns Hopkins, he demonstrated an exponential scaling advantage using two of IBM's 127-qubit Quantum Eagle processors, accessed over the cloud.

The key milestone for quantum computing, Lidar says, is to demonstrate that we can execute entire algorithms with a scaling speedup relative to ordinary “classical” computers. An exponential speedup means that as you increase a problem’s size by including more variables, the gap between the quantum and classical performance keeps growing – roughly doubling for every additional variable.
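
In stylized numbers (not the study's measurements), if classical cost grows like 2^n while quantum cost grows like n, the ratio between them doubles with each added variable:

```python
# Stylized illustration of an exponential scaling gap (not measured
# data): classical cost grows like 2**n, quantum like n.
for n in range(10, 61, 10):
    classical = 2 ** n       # stylized exponential classical cost
    quantum = n              # stylized polynomial quantum cost
    print(f"n={n:2d}  gap = classical/quantum = {classical / quantum:.3e}")
# Each extra variable roughly doubles the gap, which is what an
# "exponential speedup" means in scaling terms.
```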

Lidar clarifies that this type of speedup is unconditional, meaning it doesn’t rely on unproven assumptions. Prior speedup claims required assuming there was no better classical algorithm against which to benchmark the quantum algorithm. This study used an algorithm modified for the quantum computer to solve a variation of “Simon’s problem,” an early example of quantum algorithms that can solve tasks exponentially faster than any classical counterpart, unconditionally.

Simon’s problem involves finding a hidden repeating pattern in a mathematical function and is considered the precursor to Shor’s factoring algorithm, which can be used to break codes. Quantum players can win this game exponentially faster than classical players.
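
A toy sketch of the classical side of this game (small n and the hidden string below are assumptions for the demo): the oracle's promise f(x) = f(x XOR s) means a classical player must hunt for a collision, which takes on the order of 2^(n/2) queries on average, while Simon's quantum algorithm needs only about n:

```python
# Toy classical attack on Simon's problem. The oracle hides a string s
# and satisfies f(x) = f(x XOR s); finding a collision reveals s.

def make_simon_oracle(n, s):
    table, next_val = {}, 0
    for x in range(2 ** n):
        if x not in table:
            table[x] = next_val
            table[x ^ s] = next_val   # enforce the promise f(x) = f(x ^ s)
            next_val += 1
    return lambda x: table[x]

def classical_find_s(f, n):
    seen = {}
    for x in range(2 ** n):           # exhaustive collision search
        y = f(x)
        if y in seen:
            return x ^ seen[y]        # a collision reveals s
        seen[y] = x
    return 0                          # no collision: s = 0

n, s = 8, 0b10110101                  # hidden string (assumed for demo)
f = make_simon_oracle(n, s)
print(f"recovered s = {classical_find_s(f, n):#010b}, true s = {s:#010b}")
```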

The team achieved their exponential speedup by squeezing every ounce of performance from the hardware: shorter circuits, smarter pulse sequences, and statistical error mitigation. They limited data input, compressed quantum logic operations using transpilation, applied dynamical decoupling to detach qubits from noise, and used measurement error mitigation to correct errors left over after dynamical decoupling.
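
As a hedged illustration of the last of those techniques, measurement error mitigation in its simplest single-qubit form estimates a confusion matrix from calibration runs and inverts it to un-skew the observed counts (the calibration numbers below are assumed; the study's pipeline is more elaborate):

```python
import numpy as np

# Single-qubit measurement error mitigation sketch. Calibration runs
# estimate a confusion matrix A, where A[i, j] = P(measure i | prepared j).

# Assumed calibration: preparing |0> reads 0 with prob 0.97,
# preparing |1> reads 1 with prob 0.94.
A = np.array([[0.97, 0.06],
              [0.03, 0.94]])

observed = np.array([0.58, 0.42])        # noisy measured distribution
mitigated = np.linalg.solve(A, observed)  # apply A^{-1}
mitigated = np.clip(mitigated, 0, None)
mitigated /= mitigated.sum()             # renormalize to a distribution

print("observed :", observed)
print("mitigated:", np.round(mitigated, 4))
```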

Lidar says this result shows that today’s quantum computers firmly lie on the side of a scaling quantum advantage. Because the exponential speedup is unconditional, the performance separation cannot be reversed, making it increasingly difficult to dispute. Next steps include reducing noise and decoherence in ever-larger quantum computers and extending the speedup beyond oracle-based problems to ones with direct real-world applications.


Computational Biology

A Quantum Leap Forward – New Amplifier Boosts Efficiency of Quantum Computers 10x

Chalmers engineers built a pulse-driven qubit amplifier that’s ten times more efficient, stays cool, and safeguards quantum states—key for bigger, better quantum machines.


Quantum computers have long been touted as revolutionary machines capable of solving complex problems that stymie conventional supercomputers. However, their full potential has been hindered by the limitations of qubit amplifiers – essential components required to read and interpret quantum information. Researchers at Chalmers University of Technology in Sweden have taken a significant step forward with the development of an ultra-efficient amplifier that reduces power consumption by 90%, paving the way for more powerful quantum computers with enhanced performance.

The new amplifier is pulse-operated, meaning it’s activated only when needed to amplify qubit signals, minimizing heat generation and decoherence. This innovation has far-reaching implications for scaling up quantum computers, as larger systems require more amplifiers, leading to increased power consumption and decreased accuracy. The Chalmers team’s breakthrough offers a solution to this challenge, enabling the development of more accurate readout systems for future generations of quantum computers.
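
A back-of-the-envelope sketch of why pulsed operation saves power (all numbers are assumed, and the 10% duty cycle is chosen to match the reported 90% reduction):

```python
# Illustrative only: pulsing the amplifier during readout windows cuts
# its average power draw roughly in proportion to the duty cycle.
p_on_mW = 1.0            # power while amplifying (assumed)
readout_us = 1.0         # readout window length (assumed)
cycle_us = 10.0          # full measurement cycle (assumed)

duty = readout_us / cycle_us
avg_pulsed = p_on_mW * duty
print(f"duty cycle      : {duty:.0%}")
print(f"continuous draw : {p_on_mW:.2f} mW")
print(f"pulsed draw     : {avg_pulsed:.2f} mW  "
      f"({1 - avg_pulsed / p_on_mW:.0%} reduction)")
```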

One of the key challenges in developing pulse-operated amplifiers is ensuring they respond quickly enough to keep pace with qubit readout. To address this, the researchers employed genetic programming to develop a smart control system that enables rapid response times – just 35 nanoseconds. This achievement has significant implications for the future of quantum computing, as it paves the way for more accurate and powerful calculations.
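
The team used genetic programming for the real control system; as a loose sketch of the evolutionary idea only, the toy below evolves two hypothetical pulse parameters against an invented settling-time model:

```python
import random

# Minimal genetic-algorithm sketch; the objective is a toy stand-in,
# not the Chalmers team's model. Each genome holds two hypothetical
# pulse parameters: ramp time (ns) and overdrive level.

def settle_time_ns(genome):
    ramp_ns, overdrive = genome
    # Toy model: fast ramps and mild overdrive settle sooner, but too
    # much overdrive causes ringing that slows settling.
    return ramp_ns + 30.0 * abs(overdrive - 0.6) + 5.0

def evolve(pop_size=40, generations=60, seed=1):
    rng = random.Random(seed)
    pop = [(rng.uniform(5, 100), rng.uniform(0, 1)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=settle_time_ns)
        survivors = pop[: pop_size // 4]              # selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = rng.sample(survivors, 2)
            ramp = (a[0] + b[0]) / 2 + rng.gauss(0, 2.0)      # crossover
            over = (a[1] + b[1]) / 2 + rng.gauss(0, 0.05)     # + mutation
            children.append((max(1.0, ramp), min(1.0, max(0.0, over))))
        pop = survivors + children
    best = min(pop, key=settle_time_ns)
    return best, settle_time_ns(best)

best, t = evolve()
print(f"best genome {best} -> settle time ~{t:.1f} ns")
```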

The new amplifier was developed in collaboration with industry partner Low Noise Factory AB and draws on the expertise of researchers at Chalmers’ Terahertz and Millimeter Wave Technology Laboratory. The study, published in IEEE Transactions on Microwave Theory and Techniques, demonstrates a novel approach to ultra-efficient amplifiers for qubit readout and offers promising prospects for future research.

The development of this highly efficient amplifier represents a significant leap forward for quantum computing. By cutting power consumption by 90%, the researchers have opened the door to more powerful and accurate calculations, unlocking new possibilities in fields such as drug development, encryption, AI, and logistics.


Artificial Intelligence

AI Uncovers Hidden Heart Risks in CT Scans: A Game-Changer for Cardiovascular Care

What if your old chest scans—taken years ago for something unrelated—held a secret warning about your heart? A new AI tool called AI-CAC, developed by Mass General Brigham and the VA, can now comb through routine CT scans to detect hidden signs of heart disease before symptoms strike.


Researchers at Mass General Brigham have developed an innovative artificial intelligence (AI) tool called AI-CAC that analyzes previously collected CT scans to identify individuals with high coronary artery calcium (CAC) levels, a marker of elevated risk for cardiovascular events. Their research, published in NEJM AI, demonstrated the tool's high accuracy and its predictive value for future heart attacks and 10-year mortality.

Millions of chest CT scans are taken each year, often in healthy people, to screen for lung cancer or other conditions. However, this study reveals that these scans can also provide valuable information about cardiovascular risk, which has been going unnoticed. The researchers found that AI-CAC had a high accuracy rate (89.4%) at determining whether a scan contained CAC or not.

The gold standard for quantifying CAC uses “gated” CT scans, synchronized to the heartbeat to reduce motion artifacts. However, most chest CT scans obtained for routine clinical purposes are “nongated.” The researchers developed AI-CAC, a deep-learning algorithm, to comb through these nongated scans and quantify CAC.

The AI-CAC model was 87.3% accurate at determining whether the score was above or below 100, the threshold indicating moderate cardiovascular risk. Importantly, AI-CAC was also predictive of 10-year all-cause mortality: patients with a CAC score above 400 had a 3.49-fold higher risk of death over a 10-year period.
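
For readers unfamiliar with the scale, here is a small sketch of how a continuous CAC (Agatston) score maps onto risk bands; the cutoffs at 100 and 400 come from the article, while the band labels follow common clinical convention:

```python
# Map a continuous CAC (Agatston) score onto coarse risk bands.
# Band edges 100 and 400 appear in the article; labels are the
# commonly used clinical convention.

def cac_risk_band(score: float) -> str:
    if score == 0:
        return "no detectable calcium"
    if score < 100:
        return "mild"
    if score < 400:
        return "moderate"
    return "high (>400: ~3.49x higher 10-year mortality in this study)"

for s in [0, 42, 150, 612]:
    print(f"CAC {s:>4} -> {cac_risk_band(s)}")
```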

The researchers hope to conduct future studies in the general population and test whether the tool can assess the impact of lipid-lowering medications on CAC scores. This could lead to the implementation of AI-CAC in clinical practice, enabling physicians to engage with patients earlier, before their heart disease advances to a cardiac event.

As Dr. Raffi Hagopian, first author and cardiologist at the VA Long Beach Healthcare System, emphasized, “Using AI for tasks like CAC detection can help shift medicine from a reactive approach to the proactive prevention of disease, reducing long-term morbidity, mortality, and healthcare costs.”

