
Computers & Math

Breaking the Rules: Scientists Create ‘Hot’ Quantum States in Less Perfect Conditions

Quantum states can usually be prepared and observed only under highly controlled conditions. A research team has now succeeded in creating so-called hot Schrödinger cat states in a superconducting microwave resonator. The study shows that quantum phenomena can also be observed and used in less perfect, warmer conditions.



Quantum phenomena have long been associated with extremely cold conditions. However, a team of researchers from Innsbruck, Austria, has now succeeded in creating so-called “hot” Schrödinger cat states in a superconducting microwave resonator. This breakthrough, published in Science Advances, shows that quantum effects can persist even in less perfect, warmer environments.

Schrödinger’s thought experiment, featuring a cat that is alive and dead at the same time, has fascinated physicists for decades. In real experiments, this simultaneity has been observed in the positions of atoms and molecules and in the oscillations of electromagnetic resonators. Previous studies created these analogues of Schrödinger’s cat by first cooling the quantum object to its ground state, the state with the lowest possible energy.

However, researchers led by Gerhard Kirchmair and Oriol Romero-Isart have demonstrated for the first time that it is indeed possible to create quantum superpositions from thermally excited states. This achievement has significant implications for the development of quantum technologies.

The team used a transmon qubit in a microwave resonator to generate the cat states, achieving temperatures of up to 1.8 Kelvin – sixty times hotter than the ambient temperature in the cavity. “Our results show that it is possible to generate highly mixed quantum states with distinct quantum properties,” explains Ian Yang, who performed the experiments reported in the study.

The researchers employed two special protocols to create the hot Schrödinger cat states, adapting techniques previously used to produce cat states starting from the ground state of the system. These protocols generated distinct quantum interferences, opening up new opportunities for the creation and use of quantum superpositions, particularly in nanomechanical oscillators where achieving the ground state can be technically challenging.
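For readers who want to experiment with the idea, a minimal numerical sketch of a “hot” cat state can be built with the open-source QuTiP library by coherently superposing two displaced copies of a thermal state. This is an illustration of the concept, not the team’s actual protocol, and the occupation and displacement values below are illustrative rather than taken from the paper:

```python
import numpy as np
from qutip import thermal_dm, displace, wigner

# Illustrative parameters (not values from the paper)
N = 60          # Fock-space truncation
n_th = 2.0      # mean thermal occupation of the starting state
alpha = 2.0     # displacement amplitude of the cat

rho_th = thermal_dm(N, n_th)        # thermally excited ("hot") initial state

# Coherently superpose two opposite displacements of the thermal state
S = displace(N, alpha) + displace(N, -alpha)
rho_cat = S * rho_th * S.dag()
rho_cat = rho_cat / rho_cat.tr()    # renormalize

# Wigner function: interference fringes between the two displaced blobs
# signal quantum coherence surviving the thermal occupation
xvec = np.linspace(-6, 6, 201)
W = wigner(rho_cat, xvec, xvec)
print("most negative Wigner value:", W.min())
```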

“Our measurements confirm that quantum interference can persist even at high temperature,” says Thomas Agrenius, who helped develop the theoretical understanding of the experiment. This research finding could benefit the development of quantum technologies, as it reveals that it is possible to observe and use quantum phenomena even in less ideal, warmer environments.

“We wanted to know whether these quantum effects can also be generated if we don’t start from the ‘cold’ ground state,” remarks Gerhard Kirchmair. “Our work reveals that it is indeed possible to create the necessary interactions in a system, regardless of temperature.”

The study was funded by the Austrian Research Fund FWF and the European Union, among others.


Computer Programming

The Limits of Precision: How AI Can Help Us Reach the Edge of What Physics Allows

Scientists have uncovered how close we can get to perfect optical precision using AI, despite the physical limitations imposed by light itself. By combining physics theory with neural networks trained on distorted light patterns, they showed it’s possible to estimate object positions with nearly the highest accuracy allowed by nature. This breakthrough opens exciting new doors for applications in medical imaging, quantum tech, and materials science.


The concept of precision has been a cornerstone of physics for centuries. For around 150 years, it has been known that no matter how advanced our technology becomes, there are fundamental limits to how precisely physical quantities can be measured. The position of a particle, for instance, can never be measured with infinite precision; a certain amount of blurring is unavoidable.

Recently, an international team of researchers from TU Wien (Vienna), the University of Glasgow, and the University of Grenoble posed a question: where is the absolute limit of precision that is possible with optical methods? And how can this limit be approached as closely as possible? The team’s findings have significant implications for a wide range of fields, including medicine.

To address this question, the researchers employed a theoretical measure known as Fisher information. This measure describes how much information an optical signal contains about an unknown parameter – such as the object position. By using Fisher information, the team was able to calculate an upper limit for the theoretically achievable precision in different experimental scenarios.
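As a toy illustration of how Fisher information bounds precision, consider a simple Gaussian-spot model under pixel-wise Gaussian noise (a deliberately simplified stand-in for the team’s scattering setup, with illustrative parameters). The Fisher information can be computed directly and inverted via the Cramér-Rao bound:

```python
import numpy as np

# Toy model: a Gaussian spot on a pixel grid with additive Gaussian noise.
# A, w and sigma are illustrative, not parameters from the study.
x = np.linspace(-5, 5, 256)
A, w, sigma = 1.0, 0.8, 0.05    # amplitude, spot width, per-pixel noise std

def model(theta):
    """Expected pixel values for an object at position theta."""
    return A * np.exp(-(x - theta) ** 2 / (2 * w ** 2))

# Sensitivity of each pixel to the position (central difference)
theta0, eps = 0.0, 1e-6
d_mu = (model(theta0 + eps) - model(theta0 - eps)) / (2 * eps)

# Fisher information for independent Gaussian noise, and the Cramer-Rao
# bound: no unbiased estimator can beat this standard deviation
I_fisher = np.sum(d_mu ** 2) / sigma ** 2
crb = 1.0 / np.sqrt(I_fisher)
print(f"Fisher information: {I_fisher:.1f}")
print(f"best achievable position std: {crb:.2e}")
```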

However, knowing this limit does not guarantee that it can actually be reached in practice. A corresponding experiment designed by Dorian Bouchet from the University of Grenoble, together with Ilya Starshynov and Daniele Faccio from the University of Glasgow, showed that machine-learning algorithms based on neural networks can come very close to it.

In the experiment, a laser beam was directed at a small, reflective object located behind a turbid liquid. The turbidity was varied, and with it the difficulty of extracting precise position information from the signal: the recorded images showed only highly distorted light patterns that looked random to the human eye.

These images were then fed into a neural network trained on many such patterns, each with a known object position, so that the network could learn which patterns correspond to which positions. After sufficient training, the network was able to determine the object position very precisely, even for new, previously unseen patterns.
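The training pipeline can be sketched with standard tools. In the hypothetical toy version below, a fixed random mixing matrix stands in for the turbid medium and scikit-learn’s MLPRegressor stands in for the team’s network; none of these choices come from the study itself:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical stand-in for the optics: a clean Gaussian spot at a random
# position is "scrambled" by a fixed random mixing matrix (the turbid medium)
n_pix, n_samples = 128, 2000
grid = np.linspace(-5, 5, n_pix)
mixing = rng.normal(size=(n_pix, n_pix))

positions = rng.uniform(-2, 2, n_samples)
spots = np.exp(-(grid[None, :] - positions[:, None]) ** 2)
images = spots @ mixing.T + 0.05 * rng.normal(size=(n_samples, n_pix))

# Train on distorted patterns with known positions, test on unseen ones
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=1000, random_state=0)
net.fit(images[:1500], positions[:1500])
pred = net.predict(images[1500:])
print("RMS position error:", np.sqrt(np.mean((pred - positions[1500:]) ** 2)))
```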

The precision achieved by the AI-supported algorithm was only minimally worse than the theoretical maximum calculated from the Fisher information. In other words, the algorithm is not only effective but almost optimal, reaching nearly exactly the precision permitted by the laws of physics.

This realisation has far-reaching consequences: with the help of intelligent algorithms, optical measurement methods could be significantly improved in a wide range of areas – from medical diagnostics to materials research and quantum technology. In future projects, the research team wants to work with partners from applied physics and medicine to investigate how these AI-supported methods can be used in specific systems.


Computer Science

Sharper than Lightning: Oxford’s Groundbreaking Quantum Breakthrough

Physicists at the University of Oxford have set a new global benchmark for the accuracy of controlling a single quantum bit, achieving the lowest-ever error rate for a quantum logic operation: just 0.000015%, or one error in 6.7 million operations. This record-breaking result represents nearly an order of magnitude improvement over the previous benchmark, set by the same research group a decade ago.


The University of Oxford has achieved a major milestone in the field of quantum computing. Physicists at the institution have successfully set a new global benchmark for controlling a single quantum bit (qubit), reducing the error rate to an astonishing 0.000015% – or one error in 6.7 million operations. This achievement represents nearly an order of magnitude improvement over the previous record, which was also held by the same research group.

To put this remarkable result into perspective, it’s more likely for a person to be struck by lightning in a given year (1 in 1.2 million) than for one of Oxford’s quantum logic gates to make a mistake. This breakthrough has significant implications for the development of practical and robust quantum computers that can tackle real-world problems.

The researchers utilized a trapped calcium ion as the qubit, which is a natural choice for storing quantum information due to its long lifetime and robustness. Unlike conventional methods, which rely on lasers, the Oxford team employed electronic (microwave) signals to control the quantum state of the ions. This approach offers greater stability and other benefits for building practical quantum computers.

The experiment was conducted at room temperature without magnetic shielding, simplifying the technical requirements for a working quantum computer. The previous best single-qubit error rate achieved by the Oxford team in 2014 was 1 in 1 million. The group’s expertise led to the launch of the spinout company Oxford Ionics in 2019, which has become an established leader in high-performance trapped-ion qubit platforms.

While this record-breaking result marks a significant milestone, the researchers caution that it is part of a larger challenge. Quantum computing requires both single- and two-qubit gates to function together. Currently, two-qubit gates still have significantly higher error rates – around 1 in 2000 in the best demonstrations to date – so reducing these will be crucial to building fully fault-tolerant quantum machines.
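A back-of-the-envelope calculation makes the point concrete. Using the reported error rates with purely illustrative gate counts (the counts are not from the study), two-qubit gates dominate the failure probability:

```python
# Back-of-the-envelope error budget with the reported rates and
# hypothetical gate counts for a small algorithm
p1 = 1.5e-7            # single-qubit error: ~1 in 6.7 million (Oxford result)
p2 = 1 / 2000          # two-qubit error in the best demonstrations to date
n1, n2 = 10_000, 1_000 # illustrative numbers of gates of each type

fail_1q = 1 - (1 - p1) ** n1
fail_2q = 1 - (1 - p2) ** n2
success = (1 - p1) ** n1 * (1 - p2) ** n2

print(f"chance of any single-qubit error: {fail_1q:.2e}")  # ~1.5e-03
print(f"chance of any two-qubit error:    {fail_2q:.2e}")  # ~3.9e-01
print(f"overall success probability:      {success:.3f}")
```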

The experiments were carried out by a team of researchers from the University of Oxford’s Department of Physics, including Molly Smith, Aaron Leu, Dr Mario Gely, and Professor David Lucas, together with a visiting researcher, Dr Koichiro Miyanishi, from the University of Osaka’s Centre for Quantum Information and Quantum Biology. The Oxford scientists are part of the UK Quantum Computing and Simulation (QCS) Hub, which is a part of the ongoing UK National Quantum Technologies Programme.


Computer Programming

Boosting AI with Green Quantum Chips: A Breakthrough in Photonic Quantum Computing

A team of researchers has shown that even small-scale quantum computers can enhance machine learning performance, using a novel photonic quantum circuit. Their findings suggest that today’s quantum technology isn’t just experimental: it can already outperform classical systems in specific tasks. Notably, this photonic approach could also drastically reduce energy consumption, offering a sustainable path forward as machine learning’s power needs soar.


The integration of artificial intelligence (AI) and quantum computing has been a topic of intense research in recent years. A team of international researchers from the University of Vienna has made a significant breakthrough in this field by demonstrating that small-scale quantum computers can enhance the performance of machine learning algorithms. Their study, published in Nature Photonics, showcases promising applications for optical quantum computers.

Machine learning and AI have revolutionized our lives with their ability to perform complex tasks and drive scientific research. Quantum computing, on the other hand, has emerged as a new paradigm for computation. The combination of these two fields has given rise to the field of Quantum Machine Learning, which aims to find enhancements in speed, efficiency, or accuracy when running algorithms on quantum platforms.

However, achieving such advantages with current technology is still an open challenge. The University of Vienna team took this next step by designing a novel experiment featuring a photonic quantum circuit and a machine learning algorithm. Their goal was to classify data points using a photonic quantum computer and understand the contribution of quantum effects in comparison to classical computers.

The results were promising: even small-scale quantum processors can outperform conventional algorithms on certain tasks. “We found that for specific tasks our algorithm commits fewer errors than its classical counterpart,” explained Philip Walther of the University of Vienna, who led the project. This implies that existing quantum computers can already show good performance without necessarily going beyond state-of-the-art technology.
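The paper’s photonic circuit is not reproduced here, but the general flavour of kernel-based quantum machine learning can be sketched in a classical simulation: encode each data point in the phases of a small state vector and use state overlaps as a support-vector-machine kernel. Everything below (the encoding, the mode count, the synthetic labels) is illustrative rather than the Vienna team’s method:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Encode each 2-D data point in the phases of a small simulated state;
# the encoding and all parameters are illustrative, not the paper's circuit
def feature_state(x, modes=4):
    phases = np.outer(x, np.arange(1, modes + 1))
    psi = np.exp(1j * phases).sum(axis=0)
    return psi / np.linalg.norm(psi)

X = rng.uniform(-np.pi, np.pi, (200, 2))
y = (np.sin(X[:, 0]) * np.cos(X[:, 1]) > 0).astype(int)  # nonlinear labels

states = np.array([feature_state(x) for x in X])
K = np.abs(states @ states.conj().T) ** 2    # fidelity ("overlap") kernel

# Support-vector classifier on the precomputed overlap kernel
clf = SVC(kernel="precomputed").fit(K[:150, :150], y[:150])
print("test accuracy:", clf.score(K[150:, :150], y[150:]))
```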

Another significant aspect of this research is that photonic platforms can consume less energy than standard computers. Given the high energy demands of machine learning algorithms, this could prove crucial in the future. Co-author Iris Agresti emphasized that new algorithms inspired by quantum architectures could be designed to reach better performance while reducing energy consumption.

This breakthrough has a significant impact on both quantum computation and standard computing. It identifies tasks that benefit from quantum effects and opens up possibilities for designing more efficient and eco-friendly algorithms. The integration of AI and quantum computing is an exciting area of research, and this study takes us one step closer to making AI smarter and greener.

