
Computers & Math

Dropcountr: A Smart Water-Use App That Nudges Households Towards Conservation

A new study has found that Dropcountr, a smartphone app that tracks household water use and alerts users to leaks or excessive consumption, offers a promising tool for helping California water agencies meet state-mandated conservation goals. Use of the app reduced average household water use by 6%, with even greater savings among the highest water users.


Dropcountr, a smartphone app that tracks household water use and alerts users to leaks or excessive consumption, has been found to be an effective tool in helping California water agencies meet state-mandated conservation goals. Led by Mehdi Nemati, an assistant professor of public policy at the University of California, Riverside (UCR), the study found that use of the app reduced average household water use by 6%, with even greater savings among high-volume users.

The app works by interpreting data from smart water meters and providing real-time feedback to consumers. This type of digital feedback gives users a “nudge” – a timely prompt to take water-saving actions, such as taking shorter showers or fixing leaks. Utilities can also use the app to send customers tips for cutting use and notify them of rebate programs.

The research focused on the City of Folsom in Northern California, where Dropcountr was offered to residential customers beginning in late 2014. About 3,600 households volunteered for the program, which collected smart meter data from 2013 to 2019. The findings showed that participating households reduced their daily consumption by an average of 6.2% compared to a control group. The reduction was greater among high-volume users.

One major advantage of Dropcountr is its ability to detect leaks quickly and notify customers before damage or costly bills occur. The app also uses behavioral science concepts, especially the power of social norms, to encourage conservation. Users receive personalized water-use summaries that show how their consumption stacks up against more efficient nearby households.
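
To illustrate the kind of logic behind such alerts, here is a minimal Python sketch of a continuous-flow leak check and a social-norm comparison message. The thresholds, data format, and function names are hypothetical assumptions for illustration; the study does not describe Dropcountr's actual detection rules.

```python
# Hypothetical sketch of nudge-style feedback from hourly smart-meter readings.
# Thresholds and rules are illustrative assumptions, not Dropcountr's actual logic.

def check_for_leak(hourly_gallons, min_flow=0.5):
    """Flag a possible leak if water never stops flowing for a full day.

    Continuous overnight flow is a common heuristic for leaks, since most
    households have at least a few hours with zero use.
    """
    last_24h = hourly_gallons[-24:]
    return len(last_24h) == 24 and all(g >= min_flow for g in last_24h)

def social_comparison(household_daily_avg, efficient_neighbors_daily_avg):
    """Return a social-norm style message comparing use with efficient neighbors."""
    if household_daily_avg <= efficient_neighbors_daily_avg:
        return "Great job: you used less water than similar efficient homes nearby."
    pct_over = 100 * (household_daily_avg / efficient_neighbors_daily_avg - 1)
    return f"You used {pct_over:.0f}% more water than similar efficient homes nearby."

# Example: a constant 0.8 gallons/hour for 24 hours suggests a running toilet or leak.
readings = [0.8] * 24
if check_for_leak(readings):
    print("Leak alert: continuous flow detected over the past 24 hours.")
print(social_comparison(household_daily_avg=310, efficient_neighbors_daily_avg=240))
```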

The study found that these behavioral changes lasted, with sustained reductions in water use even six days after a leak alert was sent. “We looked at water use 50 months out and still found sustained reductions,” Nemati said. “People weren’t just reacting once and forgetting. They stayed engaged.”

With California preparing to enforce stricter drought and efficiency standards, Nemati said more utilities should consider deploying digital tools like Dropcountr. “We have the data,” he said. “Now we just need to use it in smarter ways. This study shows how a relatively inexpensive solution can help homeowners conserve and ease pressure on our water systems.”

Computer Programming

The Limits of Precision: How AI Can Help Us Reach the Edge of What Physics Allows

Scientists have uncovered how close we can get to perfect optical precision using AI, despite the physical limitations imposed by light itself. By combining physics theory with neural networks trained on distorted light patterns, they showed it’s possible to estimate object positions with nearly the highest accuracy allowed by nature. This breakthrough opens exciting new doors for applications in medical imaging, quantum tech, and materials science.


The concept of precision has been a cornerstone in physics for centuries. For 150 years, it has been known that no matter how advanced our technology becomes, there are fundamental limits to the precision with which we can measure physical phenomena. The position of a particle, for instance, can never be measured with infinite precision; a certain amount of blurring is unavoidable.

Recently, an international team of researchers from TU Wien (Vienna), the University of Glasgow, and the University of Grenoble posed a question: where is the absolute limit of precision that is possible with optical methods? And how can this limit be approached as closely as possible? The team’s findings have significant implications for a wide range of fields, including medicine.

To address this question, the researchers employed a theoretical measure known as Fisher information. This measure describes how much information an optical signal contains about an unknown parameter – such as the object position. By using Fisher information, the team was able to calculate an upper limit for the theoretically achievable precision in different experimental scenarios.
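
The concrete link between Fisher information and achievable precision is the Cramér-Rao bound: the variance of any unbiased estimator is at least the inverse of the Fisher information. The Python sketch below illustrates this with a toy Gaussian model; the model and parameter values are assumptions chosen for illustration, not the optical setup used in the study.

```python
import numpy as np

# Toy illustration of how Fisher information bounds the precision of a
# position estimate (the Cramer-Rao bound). The Gaussian "spot" model is
# an illustrative assumption, not the experiment described in the article.

rng = np.random.default_rng(0)

sigma = 1.0          # width of the blurred spot (detector blur)
n_photons = 1000     # number of detected photons per measurement
true_x = 0.3         # unknown object position we want to estimate

# For n independent Gaussian samples, the Fisher information about the mean
# is n / sigma^2, so the Cramer-Rao bound on the variance is sigma^2 / n.
fisher_info = n_photons / sigma**2
crb_std = np.sqrt(1.0 / fisher_info)

# Simulate many measurements and use the sample mean as the estimator.
estimates = [rng.normal(true_x, sigma, n_photons).mean() for _ in range(5000)]
achieved_std = np.std(estimates)

print(f"Cramer-Rao limit on std:      {crb_std:.4f}")
print(f"Std of sample-mean estimator: {achieved_std:.4f}")
# The sample mean saturates the bound in this toy case; in the optical
# experiment the question is how close a neural-network estimator can get
# for far more complicated, speckled light patterns.
```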

However, calculating this theoretical limit does not mean that it can actually be reached in practice. In fact, a corresponding experiment designed by Dorian Bouchet from the University of Grenoble, together with Ilya Starshynov and Daniele Faccio from the University of Glasgow, showed that AI algorithms based on neural networks can come very close to this limit.

In the experiment, a laser beam was directed at a small, reflective object located behind a turbid liquid. The measurement conditions varied depending on the turbidity, and with them the difficulty of obtaining precise position information from the signal. The recorded images showed only highly distorted light patterns that looked random to the human eye.

When these images were fed into a neural network trained on many such patterns, each paired with a known object position, the network learned which patterns are associated with which positions. After sufficient training, it was able to determine the object position very precisely, even for new, previously unseen patterns.
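
A minimal sketch of how such a position-regression network could be set up is shown below, using PyTorch. The architecture, image size, and random placeholder data are assumptions chosen for illustration; they are not the network or training data used in the study.

```python
import torch
import torch.nn as nn

# Illustrative sketch: a small convolutional regressor that maps distorted
# light patterns (speckle images) to an object position. Architecture and
# data are placeholder assumptions, not the study's actual network.

class SpecklePositionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),
        )
        self.head = nn.Linear(32 * 4 * 4, 1)  # predict a single coordinate

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = SpecklePositionNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder batch: 64 speckle images (1 channel, 64x64) with known positions.
images = torch.randn(64, 1, 64, 64)
positions = torch.rand(64, 1)

for step in range(100):  # in practice: many epochs over real measured patterns
    optimizer.zero_grad()
    loss = loss_fn(model(images), positions)
    loss.backward()
    optimizer.step()
```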

The precision achieved by the AI-supported algorithm was only minimally worse than the theoretical maximum calculated using Fisher information. In other words, the algorithm is not only effective but nearly optimal, achieving almost exactly the precision permitted by the laws of physics.

This realisation has far-reaching consequences: with the help of intelligent algorithms, optical measurement methods could be significantly improved in a wide range of areas – from medical diagnostics to materials research and quantum technology. In future projects, the research team wants to work with partners from applied physics and medicine to investigate how these AI-supported methods can be used in specific systems.


Computer Science

Sharper than Lightning: Oxford’s Groundbreaking Quantum Breakthrough

Physicists at the University of Oxford have set a new global benchmark for the accuracy of controlling a single quantum bit, achieving the lowest-ever error rate for a quantum logic operation: just 0.000015%, or one error in 6.7 million operations. This record-breaking result represents nearly an order of magnitude improvement over the previous benchmark, set by the same research group a decade ago.


The University of Oxford has achieved a major milestone in the field of quantum computing. Physicists at the institution have successfully set a new global benchmark for controlling a single quantum bit (qubit), reducing the error rate to an astonishing 0.000015% – or one error in 6.7 million operations. This achievement represents nearly an order of magnitude improvement over the previous record, which was also held by the same research group.

To put this remarkable result into perspective, it’s more likely for a person to be struck by lightning in a given year (1 in 1.2 million) than for one of Oxford’s quantum logic gates to make a mistake. This breakthrough has significant implications for the development of practical and robust quantum computers that can tackle real-world problems.
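
For readers who want to check the comparison, a quick back-of-the-envelope calculation using only the figures already quoted in the article looks like this:

```python
# Sanity check of the quoted figures (values taken from the article above).
error_rate = 0.000015 / 100        # 0.000015% expressed as a fraction
ops_per_error = 1 / error_rate     # roughly 6.7 million operations per error
lightning_odds = 1.2e6             # quoted annual odds of a lightning strike: 1 in 1.2 million

print(round(ops_per_error))                      # 6666667
print(round(ops_per_error / lightning_odds, 1))  # ~5.6x rarer than being struck by lightning
```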

The researchers utilized a trapped calcium ion as the qubit, which is a natural choice for storing quantum information due to its long lifetime and robustness. Unlike conventional methods, which rely on lasers, the Oxford team employed electronic (microwave) signals to control the quantum state of the ions. This approach offers greater stability and other benefits for building practical quantum computers.

The experiment was conducted at room temperature without magnetic shielding, simplifying the technical requirements for a working quantum computer. The previous best single-qubit error rate achieved by the Oxford team in 2014 was 1 in 1 million. The group’s expertise led to the launch of the spinout company Oxford Ionics in 2019, which has become an established leader in high-performance trapped-ion qubit platforms.

While this record-breaking result marks a significant milestone, the researchers caution that it is part of a larger challenge. Quantum computing requires both single- and two-qubit gates to function together. Currently, two-qubit gates still have significantly higher error rates – around 1 in 2000 in the best demonstrations to date – so reducing these will be crucial to building fully fault-tolerant quantum machines.

The experiments were carried out by a team of researchers from the University of Oxford’s Department of Physics, including Molly Smith, Aaron Leu, Dr Mario Gely, and Professor David Lucas, together with a visiting researcher, Dr Koichiro Miyanishi, from the University of Osaka’s Centre for Quantum Information and Quantum Biology. The Oxford scientists are part of the UK Quantum Computing and Simulation (QCS) Hub, which is a part of the ongoing UK National Quantum Technologies Programme.


Computer Programming

Boosting AI with Green Quantum Chips: A Breakthrough in Photonic Quantum Computing

A team of researchers has shown that even small-scale quantum computers can enhance machine learning performance, using a novel photonic quantum circuit. Their findings suggest that today's quantum technology isn't just experimental; it can already outperform classical systems in specific tasks. Notably, this photonic approach could also drastically reduce energy consumption, offering a sustainable path forward as machine learning's power needs soar.


The integration of artificial intelligence (AI) and quantum computing has been a topic of intense research in recent years. A team of international researchers from the University of Vienna has made a significant breakthrough in this field by demonstrating that small-scale quantum computers can enhance the performance of machine learning algorithms. Their study, published in Nature Photonics, showcases promising applications for optical quantum computers.

Machine learning and AI have revolutionized our lives with their ability to perform complex tasks and drive scientific research. Quantum computing, on the other hand, has emerged as a new paradigm for computation. The combination of these two fields has given rise to the field of Quantum Machine Learning, which aims to find enhancements in speed, efficiency, or accuracy when running algorithms on quantum platforms.

However, achieving such advantages with current technology is still an open challenge. The University of Vienna team took this next step by designing a novel experiment featuring a photonic quantum circuit and a machine learning algorithm. Their goal was to classify data points using a photonic quantum computer and understand the contribution of quantum effects in comparison to classical computers.

The results were promising: the team found that even small quantum processors can perform better than conventional algorithms. “We found that for specific tasks our algorithm commits fewer errors than its classical counterpart,” explained Philip Walther of the University of Vienna, who led the project. This implies that existing quantum computers can already show good performance without necessarily going beyond state-of-the-art technology.

Another significant aspect of this research is that photonic platforms can consume less energy than standard computers. Given the high energy demands of machine learning algorithms, this could prove crucial in the future. Co-author Iris Agresti emphasized that new algorithms inspired by quantum architectures could be designed that achieve better performance while reducing energy consumption.

This breakthrough has a significant impact on both quantum computation and standard computing. It identifies tasks that benefit from quantum effects and opens up possibilities for designing more efficient and eco-friendly algorithms. The integration of AI and quantum computing is an exciting area of research, and this study takes us one step closer to making AI smarter and greener.

