Computers & Math

Breaking Ground in 6G Technology: A Leap Forward in Semiconductor Innovation

Self-driving cars that eliminate traffic jams, instant healthcare diagnoses without leaving your home, or feeling the touch of loved ones on the other side of the continent may sound like the stuff of science fiction. But new research could bring all this and more a step closer to reality, thanks to a radical breakthrough in semiconductor technology.

The world is on the cusp of a technological revolution that could transform the way we live, work, and interact with one another. A team of researchers from the University of Bristol has made a breakthrough in semiconductor technology that could supercharge 6G delivery, paving the way for previously unimaginable advancements in fields such as healthcare, transportation, and communication.

Led by Professor Martin Kuball, the research team has developed an innovative way to accelerate data transfer between scores of users, potentially across the globe. This achievement is set to revolutionize industries that rely on fast and efficient data transmission, including remote diagnostics and surgery, virtual classrooms, and advanced driver assistance systems.
Co-lead author Dr Akhil Shaji explained: “We have piloted a device technology called superlattice castellated field effect transistors (SLCFETs), in which more than 1,000 fins with sub-100 nm width help drive the current. Although SLCFETs have demonstrated the highest performance in the W-band frequency range, equating to 75-110 gigahertz (GHz), the physics behind it was unknown.”
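For context, here is a back-of-the-envelope illustration (not part of the study): converting the W-band range quoted above into free-space wavelengths shows why these transistors operate in the millimetre-wave regime targeted for 6G.

```python
# Convert the W-band frequencies quoted above (75-110 GHz) into free-space
# wavelengths. Illustrative only; the values are not taken from the paper.
C = 299_792_458  # speed of light, m/s

for f_ghz in (75, 110):
    wavelength_mm = C / (f_ghz * 1e9) * 1e3
    print(f"{f_ghz} GHz -> {wavelength_mm:.2f} mm free-space wavelength")
# 75 GHz -> ~4.0 mm, 110 GHz -> ~2.7 mm: firmly millimetre-wave territory.
```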
The researchers then needed to pinpoint exactly where this effect, known as the latch effect, occurs, using ultra-precision electrical measurements and optical microscopy simultaneously, so it could be studied and understood further. After analysing more than 1,000 fins, they traced the effect to the widest fin.
Prof Kuball added: “We also developed a 3D model using a simulator to further verify our observations. The next challenge was to study the reliability aspects of latch effect for practical applications. The rigorous testing of the device over a long duration of time showed it has no detrimental effect on device reliability or performance.”
Next steps for the work include further increasing the power density the devices can deliver, so they can offer even higher performance and serve wider audiences. Industry partners will also bring such next-generation devices to the commercial market.

The potential benefits of this research are far-reaching, with applications in healthcare, transportation, communication, and more. The researchers at the University of Bristol are at the forefront of improving electrical performance and efficiency across a wide range of applications and settings.
Professor Kuball leads the Centre for Device Thermography and Reliability (CDTR), which is developing next generation semiconductor electronic devices for net zero, and for communications and radar technology. It also works on improving device thermal management, electrical performance and reliability, using wide and ultra-wide bandgap semiconductors.

In conclusion, the breakthrough in semiconductor technology made by the University of Bristol’s research team has the potential to revolutionize various industries and transform human experiences in many different ways. The future is exciting, and it’s not hard to imagine a world where flying cars, virtual reality contact lenses, and other futuristic technologies become a reality.

Computer Programming

The Limits of Precision: How AI Can Help Us Reach the Edge of What Physics Allows

Scientists have uncovered how close we can get to perfect optical precision using AI, despite the physical limitations imposed by light itself. By combining physics theory with neural networks trained on distorted light patterns, they showed it’s possible to estimate object positions with nearly the highest accuracy allowed by nature. This breakthrough opens exciting new doors for applications in medical imaging, quantum tech, and materials science.

The concept of precision has been a cornerstone of physics for centuries. For 150 years, it has been known that no matter how advanced our technology becomes, there are fundamental limits to the precision with which we can measure physical phenomena. The position of a particle, for instance, can never be measured with infinite precision; a certain amount of blurring is unavoidable.

Recently, an international team of researchers from TU Wien (Vienna), the University of Glasgow, and the University of Grenoble posed a question: where is the absolute limit of precision that is possible with optical methods? And how can this limit be approached as closely as possible? The team’s findings have significant implications for a wide range of fields, including medicine.

To address this question, the researchers employed a theoretical measure known as Fisher information. This measure describes how much information an optical signal contains about an unknown parameter – such as the object position. By using Fisher information, the team was able to calculate an upper limit for the theoretically achievable precision in different experimental scenarios.
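As a rough illustration of the principle (a toy model, not the team's actual calculation), the sketch below computes the Fisher information for a position estimate built from noisy readings and turns it into the Cramér-Rao bound, the variance that no unbiased estimator can beat.

```python
# Toy model: estimate an object position from N noisy readings, each modelled
# as position + Gaussian noise with standard deviation sigma. The noise model
# and numbers are assumptions for illustration only.
import numpy as np

sigma = 0.5   # measurement noise (arbitrary units)
N = 1000      # number of independent readings

# For y_i ~ Normal(position, sigma^2), the Fisher information per reading is
# 1/sigma^2, so N independent readings give:
fisher_info = N / sigma**2

# Cramér-Rao bound: the smallest standard deviation any unbiased estimate can have.
crb_std = (1.0 / fisher_info) ** 0.5
print(f"best achievable std of the position estimate: {crb_std:.4f}")

# Sanity check: in this simple model the sample mean actually attains the bound.
rng = np.random.default_rng(0)
true_position = 1.23
estimates = [rng.normal(true_position, sigma, N).mean() for _ in range(2000)]
print(f"empirical std of the sample-mean estimator:   {np.std(estimates):.4f}")
```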

However, calculating this limit does not mean it is out of reach in practice. In fact, a corresponding experiment designed by Dorian Bouchet from the University of Grenoble, together with Ilya Starshynov and Daniele Faccio from the University of Glasgow, showed that artificial intelligence (AI) algorithms based on neural networks can come very close to this limit.

In the experiment, a laser beam was directed at a small, reflective object located behind a turbid liquid. The measurement conditions varied depending on the turbidity – and therefore so did the difficulty of extracting precise position information from the signal. The recorded images showed only highly distorted light patterns that looked random to the human eye.

But when these images were fed into a neural network trained on many such patterns, each with a known object position, the network learned which patterns correspond to which positions. After sufficient training, it could determine the object position very precisely, even for new, previously unseen patterns.
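The sketch below conveys the general recipe (a simplified stand-in, not the team's actual pipeline): synthetic "distorted" intensity patterns are generated from known positions by a fixed random map standing in for the turbid medium, and a small neural network is trained to invert them.

```python
# Simplified sketch: train a small neural network to recover an object position
# from synthetic speckle-like patterns. The "scattering" is faked with a fixed
# random nonlinear map; in the real experiment the patterns came from a laser
# scattered by a turbid liquid. All sizes and noise levels are assumptions.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
n_pixels, n_train, n_test = 64, 5000, 500

A = rng.normal(size=(n_pixels, 2))  # fixed random "scattering medium"

def pattern(position):
    phase = A @ np.array([position, 1.0])
    return np.cos(phase) ** 2 + rng.normal(scale=0.05, size=n_pixels)  # detector noise

positions = rng.uniform(-1.0, 1.0, n_train + n_test)
images = np.array([pattern(p) for p in positions])

model = MLPRegressor(hidden_layer_sizes=(128, 64), max_iter=500, random_state=0)
model.fit(images[:n_train], positions[:n_train])

predictions = model.predict(images[n_train:])
rmse = np.sqrt(np.mean((predictions - positions[n_train:]) ** 2))
print(f"position error (RMSE) on unseen patterns: {rmse:.4f}")
```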

The precision achieved by the AI-supported algorithm was only minimally worse than the theoretical maximum calculated using Fisher information. In other words, the algorithm is not only effective but almost optimal, achieving almost exactly the precision permitted by the laws of physics.

This realisation has far-reaching consequences: with the help of intelligent algorithms, optical measurement methods could be significantly improved in a wide range of areas – from medical diagnostics to materials research and quantum technology. In future projects, the research team wants to work with partners from applied physics and medicine to investigate how these AI-supported methods can be used in specific systems.

Computer Science

Sharper than Lightning: Oxford’s Groundbreaking Quantum Breakthrough

Physicists at the University of Oxford have set a new global benchmark for the accuracy of controlling a single quantum bit, achieving the lowest-ever error rate for a quantum logic operation: just 0.000015%, or one error in 6.7 million operations. This record-breaking result represents nearly an order of magnitude improvement over the previous benchmark, set by the same research group a decade ago.

The University of Oxford has achieved a major milestone in the field of quantum computing. Physicists at the institution have successfully set a new global benchmark for controlling a single quantum bit (qubit), reducing the error rate to an astonishing 0.000015% – or one error in 6.7 million operations. This achievement represents nearly an order of magnitude improvement over the previous record, which was also held by the same research group.

To put this remarkable result into perspective, it’s more likely for a person to be struck by lightning in a given year (1 in 1.2 million) than for one of Oxford’s quantum logic gates to make a mistake. This breakthrough has significant implications for the development of practical and robust quantum computers that can tackle real-world problems.

The researchers utilized a trapped calcium ion as the qubit, which is a natural choice for storing quantum information due to its long lifetime and robustness. Unlike conventional methods, which rely on lasers, the Oxford team employed electronic (microwave) signals to control the quantum state of the ions. This approach offers greater stability and other benefits for building practical quantum computers.

The experiment was conducted at room temperature without magnetic shielding, simplifying the technical requirements for a working quantum computer. The previous best single-qubit error rate achieved by the Oxford team in 2014 was 1 in 1 million. The group’s expertise led to the launch of the spinout company Oxford Ionics in 2019, which has become an established leader in high-performance trapped-ion qubit platforms.

While this record-breaking result marks a significant milestone, the researchers caution that it is part of a larger challenge. Quantum computing requires both single- and two-qubit gates to function together. Currently, two-qubit gates still have significantly higher error rates – around 1 in 2000 in the best demonstrations to date – so reducing these will be crucial to building fully fault-tolerant quantum machines.
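A quick back-of-the-envelope calculation (illustrative numbers only; just the two quoted error rates come from the article) shows why the two-qubit gates dominate the error budget of any sizeable circuit:

```python
# Chance that a hypothetical circuit runs with no gate errors at all, using the
# error rates quoted above. Gate counts are made up purely for illustration.
p1 = 1 / 6.7e6   # single-qubit gate error (the new Oxford record)
p2 = 1 / 2000    # two-qubit gate error (best demonstrations to date)

n1 = n2 = 1000   # hypothetical gate counts

p_error_free = (1 - p1) ** n1 * (1 - p2) ** n2
print(f"probability the circuit runs error-free: {p_error_free:.3f}")
# ~0.61, and essentially all of the failure risk comes from the two-qubit
# gates, which is why lowering their error rate is the next big hurdle.
```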

The experiments were carried out by a team of researchers from the University of Oxford’s Department of Physics, including Molly Smith, Aaron Leu, Dr Mario Gely, and Professor David Lucas, together with a visiting researcher, Dr Koichiro Miyanishi, from the University of Osaka’s Centre for Quantum Information and Quantum Biology. The Oxford scientists are part of the UK Quantum Computing and Simulation (QCS) Hub, which is a part of the ongoing UK National Quantum Technologies Programme.

Computer Programming

Boosting AI with Green Quantum Chips: A Breakthrough in Photonic Quantum Computing

A team of researchers has shown that even small-scale quantum computers can enhance machine learning performance, using a novel photonic quantum circuit. Their findings suggest that today's quantum technology isn't just experimental; it can already outperform classical systems in specific tasks. Notably, this photonic approach could also drastically reduce energy consumption, offering a sustainable path forward as machine learning's power needs soar.

The integration of artificial intelligence (AI) and quantum computing has been a topic of intense research in recent years. A team of international researchers from the University of Vienna has made a significant breakthrough in this field by demonstrating that small-scale quantum computers can enhance the performance of machine learning algorithms. Their study, published in Nature Photonics, showcases promising applications for optical quantum computers.

Machine learning and AI have revolutionized our lives with their ability to perform complex tasks and drive scientific research. Quantum computing, on the other hand, has emerged as a new paradigm for computation. The combination of these two fields has given rise to the field of Quantum Machine Learning, which aims to find enhancements in speed, efficiency, or accuracy when running algorithms on quantum platforms.

However, achieving such advantages with current technology is still an open challenge. The University of Vienna team took this next step by designing a novel experiment featuring a photonic quantum circuit and a machine learning algorithm. Their goal was to classify data points using a photonic quantum computer and understand the contribution of quantum effects in comparison to classical computers.

The results were promising: the team found that even small quantum processors can perform better than conventional algorithms. “We found that for specific tasks our algorithm commits fewer errors than its classical counterpart,” explained Philip Walther from the University of Vienna, who led the project. This implies that existing quantum computers can show good performance without necessarily going beyond state-of-the-art technology.
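To make the general idea concrete, here is a deliberately simple, classically simulated sketch, not the Vienna team's circuit: each data point is encoded as phases in a single-photon Mach-Zehnder interferometer, the detection probabilities serve as features, and an ordinary classifier is trained on top of them. Everything in it (data, labels, circuit) is an illustrative assumption.

```python
# Toy "photonic feature map": encode a data point as phase shifts in a simulated
# single-photon Mach-Zehnder interferometer and classify on the click statistics.
import numpy as np
from sklearn.linear_model import LogisticRegression

def mzi_click_probs(phases):
    """Detection probabilities of one photon after phase-encoded interferometers."""
    bs = np.array([[1, 1j], [1j, 1]]) / np.sqrt(2)       # 50:50 beam splitter
    probs = []
    for phi in phases:
        phase = np.diag([1.0, np.exp(1j * phi)])          # data-dependent phase
        state = bs @ phase @ bs @ np.array([1.0, 0.0])    # photon enters mode 0
        probs.extend(np.abs(state) ** 2)                  # output click probabilities
    return np.array(probs)

rng = np.random.default_rng(1)
X = rng.uniform(-np.pi, np.pi, size=(400, 2))             # synthetic 2-feature data
y = (np.cos(X[:, 0]) + np.cos(X[:, 1]) > 0).astype(int)   # synthetic labels

features = np.array([mzi_click_probs(x) for x in X])      # interferometer outputs
clf = LogisticRegression(max_iter=1000).fit(features[:300], y[:300])
print("test accuracy:", clf.score(features[300:], y[300:]))
```

The synthetic labels happen to become linearly separable in the interferometer's output probabilities, so the toy classifier does well; the sketch only illustrates the encode-measure-classify pattern and says nothing about the quantum enhancement reported in the study.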

Another significant aspect of this research is that photonic platforms can consume less energy than standard computers. Given the high energy demands of machine learning algorithms, this could prove crucial in the future. Co-author Iris Agresti emphasized that new algorithms inspired by quantum architectures could be designed, achieving better performance while reducing energy consumption.

This breakthrough has a significant impact on both quantum computation and standard computing. It identifies tasks that benefit from quantum effects and opens up possibilities for designing more efficient and eco-friendly algorithms. The integration of AI and quantum computing is an exciting area of research, and this study takes us one step closer to making AI smarter and greener.
