
Computer Science

Revolutionary Cooling Technology Could Slash Data Center Energy Use and Water Consumption

UC San Diego engineers have created a passive evaporative cooling membrane that could dramatically cut energy use in data centers. As demand for AI and cloud computing soars, traditional cooling systems struggle to keep up efficiently. The fiber membrane uses capillary action to draw liquid across its surface and evaporate it, carrying heat away without fans or pumps. It delivers record-breaking heat flux and remains stable under sustained high-stress operation.


Researchers at the University of California San Diego have developed a new evaporative cooling technology that could significantly reduce the energy and water consumed by traditional data center cooling systems, offering a more sustainable and efficient way to cool high-powered electronics.

As artificial intelligence (AI) and cloud computing continue to grow, demand for data processing and storage has skyrocketed. That growth comes with a significant drawback: heat. Cooling systems already account for up to 40% of a data center’s total energy use, a share that could more than double by 2030 if current trends continue.

The new evaporative cooling technology uses a specially engineered fiber membrane that passively removes heat through evaporation. This innovative design features interconnected pores that draw cooling liquid across its surface using capillary action, efficiently dissipating heat from the electronics underneath – all without requiring extra energy.

“We were able to achieve record-breaking performance with our fiber membrane, managing heat fluxes exceeding 800 watts of heat per square centimeter,” said Renkun Chen, professor in the Department of Mechanical and Aerospace Engineering at UC San Diego. “This success showcases the potential of reimagining materials for entirely new applications.”
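As a rough sanity check on that figure, the snippet below estimates how much water would have to evaporate each second to carry away 800 watts per square centimeter, assuming all of the heat leaves as latent heat of vaporization of water. This is a back-of-envelope idealization, not a calculation from the paper; the actual coolant and losses may differ.

```python
# Back-of-envelope check (not from the paper): water that must evaporate
# per second to carry away 800 W/cm^2, assuming all heat goes into the
# latent heat of vaporization.

H_FG = 2260.0      # J/g, latent heat of vaporization of water near 100 C
HEAT_FLUX = 800.0  # W/cm^2, the figure quoted by the researchers

mass_flux = HEAT_FLUX / H_FG  # g of water per second per cm^2
print(f"Required evaporation rate: {mass_flux:.2f} g/(s*cm^2)")
# -> about 0.35 g/(s*cm^2), which the membrane's pores must continuously resupply
```

The result underscores why the interconnected pore structure matters: capillary action must keep pulling fresh liquid to the surface fast enough to sustain that evaporation rate.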

The team’s results are promising, but they say the technology is still operating well below its theoretical limit. Next steps include refining the membrane and optimizing performance to integrate it into prototypes of cold plates, which attach to chips like CPUs and GPUs to dissipate heat.

The implications for the data center industry are significant: if the approach scales, evaporative cooling membranes could become a widely adopted way to cool high-power electronics while cutting both energy consumption and water usage. The team has founded a startup to commercialize the technology and is continuing to refine the membrane in pursuit of even better performance.

Computational Biology

A Quantum Leap Forward – New Amplifier Boosts Efficiency of Quantum Computers 10x

Chalmers engineers built a pulse-driven qubit amplifier that’s ten times more efficient, stays cool, and safeguards quantum states—key for bigger, better quantum machines.


Quantum computers have long been touted as revolutionary machines capable of solving complex problems that stymie conventional supercomputers. However, their full potential has been hindered by the limitations of qubit amplifiers – essential components required to read and interpret quantum information. Researchers at Chalmers University of Technology in Sweden have taken a significant step forward with the development of an ultra-efficient amplifier that reduces power consumption by 90%, paving the way for more powerful quantum computers with enhanced performance.

The new amplifier is pulse-operated: it is activated only when needed to amplify qubit signals, minimizing heat generation and the decoherence it causes. This matters for scaling, because larger quantum computers require more amplifiers, and with conventional always-on designs that means more power consumption and reduced readout accuracy. The Chalmers team’s design sidesteps this trade-off, enabling more accurate readout systems for future generations of quantum computers.
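A toy duty-cycle estimate shows why pulsed operation saves so much power. The 10% duty cycle below is an assumed number for illustration, not a figure from the study, but it reproduces the reported 90% reduction.

```python
# Toy duty-cycle estimate (assumed numbers, not from the study): a
# pulse-operated amplifier draws power only while a qubit is being read out.

P_ON = 1.0         # power drawn while amplifying, in arbitrary units
DUTY_CYCLE = 0.10  # assumed fraction of time spent on readout

avg_continuous = P_ON           # an always-on amplifier
avg_pulsed = P_ON * DUTY_CYCLE  # active only during readout windows

saving = 1 - avg_pulsed / avg_continuous
print(f"Average power saving: {saving:.0%}")  # -> 90% at a 10% duty cycle
```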

One of the key challenges in developing pulse-operated amplifiers is ensuring they respond quickly enough to keep pace with qubit readout. To address this, the researchers employed genetic programming to develop a smart control system that enables rapid response times – just 35 nanoseconds. This achievement has significant implications for the future of quantum computing, as it paves the way for more accurate and powerful calculations.

The new amplifier was developed in collaboration with industry partner Low Noise Factory AB and draws on the expertise of researchers at Chalmers’ Terahertz and Millimeter Wave Technology Laboratory. The study, published in IEEE Transactions on Microwave Theory and Techniques, demonstrates a novel approach to ultra-efficient amplifiers for qubit readout and opens promising avenues for future research.

The development of this highly efficient amplifier represents a significant step forward for quantum computing. By cutting power consumption by 90%, the researchers have opened the door to larger, more accurate quantum machines, with potential payoffs in fields such as drug development, encryption, AI, and logistics.


Computer Modeling

Harnessing True Randomness from Entangled Photons: The Colorado University Randomness Beacon (CURBy)

Scientists at NIST and the University of Colorado Boulder have created CURBy, a cutting-edge quantum randomness beacon that draws on the intrinsic unpredictability of quantum entanglement to produce true random numbers. Unlike traditional methods, CURBy is traceable, transparent, and verifiable thanks to quantum physics and blockchain-like protocols. This breakthrough has real-world applications ranging from cybersecurity to public lotteries—and it’s open source, inviting the world to use and build upon it.


The Colorado University Randomness Beacon (CURBy) is a pioneering service that harnesses the true randomness of entangled photons to produce unguessable strings of numbers. This breakthrough was made possible by the work of scientists at the National Institute of Standards and Technology (NIST) and their colleagues at the University of Colorado Boulder.

“True randomness is something that nothing in the universe can predict in advance,” said Krister Shalm, a physicist at NIST. “If God does play dice with the universe, then you can turn that into the best random number generator that the universe allows.”

The CURBy system uses a Bell test to measure pairs of entangled photons whose properties are correlated even when separated by vast distances. When researchers measure an individual particle, the outcome is random, but the properties of the pair are more correlated than classical physics allows, enabling researchers to verify the randomness.
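A minimal sketch of the kind of Bell test involved is the CHSH inequality: any classical (local hidden variable) model is capped at S ≤ 2, while entangled photons can reach about 2.83. The angles below are the standard textbook optimum, not necessarily CURBy’s actual measurement settings.

```python
# Illustrative CHSH Bell test with textbook polarization settings
# (standard optimal angles, not necessarily CURBy's configuration).
import math

def E(a: float, b: float) -> float:
    """Quantum correlation of entangled-photon polarization measurements at angles a, b."""
    return -math.cos(2 * (a - b))

a1, a2 = 0.0, math.pi / 4              # first lab's two measurement angles
b1, b2 = math.pi / 8, 3 * math.pi / 8  # second lab's two measurement angles

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))
print(f"CHSH value S = {S:.3f}")  # ~2.828; any classical model is capped at S <= 2
```

Exceeding the classical bound is what certifies that the individual measurement outcomes could not have been predetermined, which is exactly the property a randomness beacon needs.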

CURBy is the first random number generator service to use quantum nonlocality as its source, and the most transparent source of random numbers to date: its results are certifiable and traceable to a greater extent than ever before.

At the heart of the CURBy system, a nonlinear crystal generates entangled photons, which travel via optical fiber to separate labs at opposite ends of a hall. There, each photon’s polarization is measured, and the outcomes of these measurements are truly random.

NIST passes millions of these quantum coin flips to a computer program at the University of Colorado Boulder, where special processing steps and strict protocols are used to turn the outcomes into 512 random bits of binary code (0s and 1s). The result is a set of random bits that no one, not even Einstein, could have predicted.

The CURBy system has been operational for several months now, with an impressive success rate of over 99.7%. The ability to verify the data behind each random number was made possible by the Twine protocol, a novel set of quantum-compatible blockchain technologies developed by NIST and its collaborators.

“The Twine protocol lets us weave together all these other beacons into a tapestry of trust,” said Jasper Palfree, a research assistant on the project at the University of Colorado Boulder. This allows any user to verify the data behind each random number, providing security and traceability.
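To make the traceability idea concrete, here is a minimal illustration of hash chaining, the blockchain-like mechanism underlying such verification. This is a simplified sketch; the actual Twine protocol is more elaborate and interlinks multiple independent beacons.

```python
# Minimal sketch of hash chaining (illustrative only; the real Twine
# protocol is more elaborate and weaves together multiple beacons).
import hashlib

def chain(prev_hash: bytes, output: bytes) -> bytes:
    """Commit a beacon output by hashing it together with its predecessor."""
    return hashlib.sha256(prev_hash + output).digest()

genesis = b"\x00" * 32
h1 = chain(genesis, b"<first 512-bit beacon output>")
h2 = chain(h1, b"<second 512-bit beacon output>")
# Altering any past output would change every later hash, so a user
# replaying the public record can detect retroactive tampering.
```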

The CURBy system can be used anywhere an independent, public source of random numbers would be useful, such as selecting jury candidates, making a random selection for an audit, or assigning resources through a public lottery.

“I wanted to build something that is useful. It’s this cool thing that is the cutting edge of fundamental science,” said Gautam Kavuri, a graduate student on the project. The whole process is open source and available to the public, allowing anyone to not only check their work but even build on the beacon to create their own random number generator.

The CURBy system has the potential to revolutionize fields such as cryptography, gaming, and finance, where true randomness is essential. By harnessing the power of entangled photons, scientists have created a truly independent source of random numbers that can be trusted.


Artificial Intelligence

A Quantum Leap Forward: “Magic States” Get Easier, Faster, and Less Noisy

Quantum computing just got a significant boost thanks to researchers at the University of Osaka, who developed a much more efficient way to create “magic states”—a key component for fault-tolerant quantum computers. By pioneering a low-level, or “level-zero,” distillation method, they dramatically reduced the number of qubits and computational resources needed, overcoming one of the biggest obstacles: quantum noise. This innovation could accelerate the arrival of powerful quantum machines capable of revolutionizing industries from finance to biotech.


A team of researchers from the Graduate School of Engineering Science at Osaka University has made a discovery that could bring practical quantum computers one step closer to reality. In an article published in PRX Quantum, they describe a method for preparing high-fidelity “magic states” with unprecedented accuracy and significantly less overhead.

Quantum computers are machines that harness the unique properties of quantum mechanics to perform certain calculations far faster than any classical computer, with the potential to revolutionize fields like engineering, finance, and biotechnology. However, one significant obstacle has been holding them back: noise.

Noise is an enemy of quantum computers because even the slightest disturbance can corrupt a quantum state and ruin a computation. To overcome this challenge, scientists have been designing fault-tolerant quantum computers that can keep computing accurately despite noise.

One popular method for creating such systems is called magic state distillation. This process involves preparing a single high-fidelity quantum state from many noisy ones. However, traditional magic state distillation is computationally expensive and requires many qubits (the basic units of quantum information).
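For a sense of those costs, the standard 15-to-1 protocol consumes fifteen noisy magic states with error rate p to produce one state with error roughly 35p³. The quick calculation below uses an assumed input error rate for illustration; it describes the traditional method, not the team’s new level-zero approach.

```python
# Cost of standard 15-to-1 magic state distillation (the traditional
# method, not the Osaka team's level-zero protocol): 15 noisy inputs with
# error rate p yield one output with error roughly 35 * p**3.
p = 1e-3                     # assumed physical error rate per input state
round_1 = 35 * p**3          # error after one round of 15-to-1
round_2 = 35 * round_1**3    # second round needs 15 inputs per input: 225 total
print(f"one round: {round_1:.1e}, two rounds: {round_2:.1e}")
# -> ~3.5e-08 after one round, ~1.5e-21 after two, at steep qubit overhead
```

The overhead compounds quickly, which is why reducing the qubit and time cost of distillation matters so much for scalability.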

The research team developed a new version of magic state distillation, which they call “level-zero”: a fault-tolerant circuit is built at the physical level of qubits rather than at higher, more abstract levels of encoding. This design yields a significant decrease in spatial and temporal overhead, that is, in the number of qubits and the computation time required, compared to traditional methods.

According to lead researcher Tomohiro Itogawa, this breakthrough could bring quantum computers closer to reality: “Noise is absolutely the number one enemy of quantum computers. We’re optimistic that our technique will help make large-scale quantum computers more feasible.”

Keisuke Fujii, senior author of the study, added: “We wanted to explore if there was any way of expediting the preparation of high-fidelity states necessary for quantum computation. Our results show that this is indeed possible, and we’re excited about the potential implications.”

