
Computers & Math

Wearable Tech Helps People with Type 2 Diabetes Stay Active and Manage Condition

Wearable mobile health technology could help people with Type 2 Diabetes (T2D) to stick to exercise regimes that help them to keep the condition under control, a new study reveals. An international team studied the behavior of recently-diagnosed T2D patients in Canada and the UK as they followed a home-based physical activity program, with some participants wearing a smartwatch paired with a health app on their smartphone.


A new study suggests that wearable mobile health technology can help people with Type 2 Diabetes (T2D) stick to exercise regimes that keep the condition under control. Researchers from Lancaster University and other international partners conducted a feasibility trial in Canada and the UK involving recently diagnosed T2D patients.

The “Mobile Health Biometrics to Enhance Exercise and Physical Activity Adherence in Type 2 Diabetes” (MOTIVATE-T2D) trial recruited participants aged 40-75 years who had been diagnosed with T2D within the previous 5-24 months. The study aimed to investigate whether wearable technology could encourage people with T2D to engage in purposeful exercise, a crucial aspect of managing the condition.

The trial’s results are encouraging: participants who used wearable technology were more likely to start and maintain purposeful exercise, with improvements in blood sugar levels, systolic blood pressure, and quality of life. The study recruited 125 participants and retained 82% of them after 12 months, demonstrating the feasibility of this type of intervention.

Professor Céu Mateus, Professor of Health Economics at Lancaster University, highlighted the significance of these findings: “The results of this study can contribute to changing the lives of many people around the world. There are millions of people suffering from Diabetes type 2 without access to non-pharmacological interventions with sustained results in the long term.”

The MOTIVATE-T2D programme used biofeedback and data sharing to support the development of personalized physical activity programmes, incorporating wearable technologies like smartwatches and online coaching platforms. Participants gradually increased purposeful exercise of moderate-to-vigorous intensity, aiming for a target of 150 minutes per week by the end of 6 months.
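
As a purely illustrative sketch (the actual MOTIVATE-T2D progression rules are not detailed in the article), a gradual ramp toward the 150-minute weekly target over roughly six months might look like:

```python
# Illustrative sketch only: a linear ramp of weekly exercise targets toward
# 150 minutes/week of moderate-to-vigorous activity over ~6 months (26 weeks).
# The starting target of 30 minutes is an assumption, not a study parameter.

def weekly_target(week: int, start: int = 30, goal: int = 150, ramp_weeks: int = 26) -> int:
    """Return a target number of purposeful-exercise minutes for a given week."""
    if week >= ramp_weeks:
        return goal
    # Interpolate linearly from the starting target to the 150-minute goal.
    return round(start + (goal - start) * week / ramp_weeks)

targets = [weekly_target(w) for w in range(0, 27, 6)]
print(targets)  # targets at weeks 0, 6, 12, 18, 24 -> [30, 58, 85, 113, 141]
```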

Co-author Dr Katie Hesketh from the University of Birmingham emphasized: “We found that using biometrics from wearable technologies offered great promise for encouraging people with newly diagnosed T2D to maintain a home-delivered, personalized exercise programme with all the associated health benefits.”

The MOTIVATE-T2D trial has demonstrated the potential for non-pharmacological interventions to improve equity in access to healthcare services, particularly in managing Type 2 Diabetes. As Professor Mateus noted: “In a time where savings to health services budgets are of paramount importance, non-pharmacological interventions contributing to improve equity in access by patients are very valuable for society.”

Computers & Math

Quantum Computers Just Beat Classical Ones – Exponentially and Unconditionally

A research team has achieved the holy grail of quantum computing: an exponential speedup that’s unconditional. By using clever error correction and IBM’s powerful 127-qubit processors, they tackled a variation of Simon’s problem, showing quantum machines are now breaking free from classical limitations, for real.


Quantum computers have been touted as potential game-changers for computation, medicine, coding, and material discovery – but only if they can be made to work reliably. A major obstacle has been noise: the errors produced during computation on a quantum machine, which until recently left these machines less powerful than classical computers.

Daniel Lidar, holder of the Viterbi Professorship in Engineering and Professor of Electrical & Computer Engineering at the USC Viterbi School of Engineering, has made significant strides in quantum error correction. In a recent study with collaborators at USC and Johns Hopkins, he demonstrated a quantum exponential scaling advantage using two 127-qubit IBM Quantum Eagle processor-powered quantum computers over the cloud.

The key milestone for quantum computing, Lidar says, is to demonstrate that we can execute entire algorithms with a scaling speedup relative to ordinary “classical” computers. An exponential speedup means that as you increase a problem’s size by including more variables, the gap between the quantum and classical performance keeps growing – roughly doubling for every additional variable.

Lidar clarifies that this type of speedup is unconditional, meaning it doesn’t rely on unproven assumptions. Prior speedup claims required assuming there was no better classical algorithm against which to benchmark the quantum algorithm. This study used an algorithm modified for the quantum computer to solve a variation of “Simon’s problem,” an early example of quantum algorithms that can solve tasks exponentially faster than any classical counterpart, unconditionally.

Simon’s problem involves finding a hidden repeating pattern in a mathematical function and is considered a precursor to Shor’s factoring algorithm, which can be used to break codes. A quantum computer can solve it exponentially faster than any classical one.
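
To make the classical side of that gap concrete, here is a minimal sketch (not the paper’s quantum algorithm): Simon’s problem hides a bit-mask s inside a function f with f(x) == f(x ^ s), and a classical solver can only hunt for a colliding pair of inputs, which on average requires exponentially many queries in the bit-width n.

```python
# Classical brute-force solver for Simon's problem (illustration only).
# A quantum computer finds s with roughly n oracle queries; the classical
# collision search below needs exponentially many in the worst case.

def make_simon_oracle(s: int, n: int):
    """Build a function f on n-bit inputs with f(x) == f(x ^ s) for all x."""
    table = {}
    labels = {}
    next_label = 0
    for x in range(2 ** n):
        key = min(x, x ^ s)        # x and x ^ s share one label
        if key not in table:
            table[key] = next_label
            next_label += 1
        labels[x] = table[key]
    return lambda x: labels[x]

def classical_find_mask(f, n: int) -> int:
    """Recover s by querying until two inputs collide (exponential in n)."""
    seen = {}
    for x in range(2 ** n):
        y = f(x)
        if y in seen:
            return seen[y] ^ x     # the collision reveals the hidden mask
        seen[y] = x
    return 0                       # s == 0 means f was one-to-one

n, s = 6, 0b101101
oracle = make_simon_oracle(s, n)
print(classical_find_mask(oracle, n))  # prints 45, i.e. the hidden mask 0b101101
```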

The team achieved their exponential speedup by squeezing every ounce of performance from the hardware: shorter circuits, smarter pulse sequences, and statistical error mitigation. They limited data input, compressed quantum logic operations using transpilation, applied dynamical decoupling to detach qubits from noise, and used measurement error mitigation to correct errors left over after dynamical decoupling.
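
One of the techniques named above, measurement error mitigation, can be sketched in miniature for a single qubit (the study itself used IBM’s 127-qubit hardware and far more sophisticated tooling; the flip probabilities below are invented for illustration). Calibration estimates how often a prepared |0⟩ reads as “1” and vice versa; the observed counts are then the true counts multiplied by a confusion matrix, and inverting that matrix undoes the readout error.

```python
# Single-qubit measurement error mitigation via confusion-matrix inversion.
# p01 = P(read "1" | prepared |0>), p10 = P(read "0" | prepared |1>).

def mitigate_counts(observed: dict, p01: float, p10: float) -> dict:
    """Invert the 2x2 readout confusion matrix to estimate true counts."""
    # M = [[1 - p01, p10], [p01, 1 - p10]] maps true -> observed counts.
    det = (1 - p01) * (1 - p10) - p01 * p10
    o0, o1 = observed.get("0", 0), observed.get("1", 0)
    t0 = ((1 - p10) * o0 - p10 * o1) / det
    t1 = ((1 - p01) * o1 - p01 * o0) / det
    return {"0": t0, "1": t1}

# Example: 1000 shots of a state that is truly always |0>, with 5% readout flips.
observed = {"0": 950, "1": 50}
print(mitigate_counts(observed, p01=0.05, p10=0.05))  # ~ {'0': 1000.0, '1': 0.0}
```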

Lidar says that this result shows today’s quantum computers firmly lie on the side of a scaling quantum advantage. The performance separation cannot be reversed because the exponential speedup is unconditional – making it increasingly difficult to dispute. Next steps include demonstrating practical real-world applications, reducing noise and decoherence in ever larger quantum computers, and extending such speedups beyond oracle-based problems.


Computational Biology

A Quantum Leap Forward – New Amplifier Boosts Efficiency of Quantum Computers 10x

Chalmers engineers built a pulse-driven qubit amplifier that’s ten times more efficient, stays cool, and safeguards quantum states—key for bigger, better quantum machines.


Quantum computers have long been touted as revolutionary machines capable of solving complex problems that stymie conventional supercomputers. However, their full potential has been hindered by the limitations of qubit amplifiers – essential components required to read and interpret quantum information. Researchers at Chalmers University of Technology in Sweden have taken a significant step forward with the development of an ultra-efficient amplifier that reduces power consumption by 90%, paving the way for more powerful quantum computers with enhanced performance.

The new amplifier is pulse-operated, meaning it’s activated only when needed to amplify qubit signals, minimizing heat generation and decoherence. This innovation has far-reaching implications for scaling up quantum computers, as larger systems require more amplifiers, leading to increased power consumption and decreased accuracy. The Chalmers team’s breakthrough offers a solution to this challenge, enabling the development of more accurate readout systems for future generations of quantum computers.

One of the key challenges in developing pulse-operated amplifiers is ensuring they respond quickly enough to keep pace with qubit readout. To address this, the researchers employed genetic programming to develop a smart control system that enables rapid response times – just 35 nanoseconds. This achievement has significant implications for the future of quantum computing, as it paves the way for more accurate and powerful calculations.

The new amplifier was developed in collaboration with industry partner Low Noise Factory AB and draws on the expertise of researchers at Chalmers’ Terahertz and Millimeter Wave Technology Laboratory. The study, published in IEEE Transactions on Microwave Theory and Techniques, demonstrates a novel approach to developing ultra-efficient amplifiers for qubit readout and offers promising prospects for future research.

In conclusion, the development of this highly efficient amplifier represents a significant leap forward for quantum computing. By reducing power consumption by 90%, researchers have opened doors to more powerful and accurate calculations, unlocking new possibilities in fields such as drug development, encryption, AI, and logistics. As the field continues to evolve, it will be exciting to see how this innovation shapes the future of quantum computing.


Artificial Intelligence

AI Uncovers Hidden Heart Risks in CT Scans: A Game-Changer for Cardiovascular Care

What if your old chest scans—taken years ago for something unrelated—held a secret warning about your heart? A new AI tool called AI-CAC, developed by Mass General Brigham and the VA, can now comb through routine CT scans to detect hidden signs of heart disease before symptoms strike.


Researchers at Mass General Brigham have developed an innovative artificial intelligence (AI) tool called AI-CAC that analyzes previously collected CT scans to identify individuals with high coronary artery calcium (CAC) levels, a marker of elevated risk for cardiovascular events. Their research, published in NEJM AI, demonstrated the tool’s high accuracy and its predictive value for future heart attacks and 10-year mortality.

Millions of chest CT scans are taken each year, often in healthy people, to screen for lung cancer or other conditions. However, this study reveals that these scans can also provide valuable information about cardiovascular risk, which has been going unnoticed. The researchers found that AI-CAC had a high accuracy rate (89.4%) at determining whether a scan contained CAC or not.

The gold standard for quantifying CAC uses “gated” CT scans, synchronized to the heartbeat to reduce motion during the scan. However, most chest CT scans obtained for routine clinical purposes are “nongated.” The researchers developed AI-CAC, a deep learning algorithm, to probe through these nongated scans and quantify CAC.

The AI-CAC model was 87.3% accurate at determining whether the score was higher or lower than 100, the threshold indicating a moderate cardiovascular risk. Importantly, AI-CAC was also predictive of 10-year all-cause mortality: participants with a CAC score over 400 had a 3.49 times higher risk of death over that period.
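
The thresholds cited above (100 for moderate risk, 400 for high risk) follow the conventional Agatston score bands. As an illustrative sketch (not the AI-CAC model itself, and cut-points vary slightly across guidelines), mapping a score to a band looks like:

```python
# Illustrative mapping of an Agatston-style CAC score to conventional risk
# bands; this is a sketch of the thresholds mentioned in the article, not
# the deep learning model described in the study.

def cac_risk_band(score: float) -> str:
    """Map a coronary artery calcium score to a conventional risk band."""
    if score == 0:
        return "none"      # no detectable calcification
    if score < 100:
        return "mild"
    if score <= 400:
        return "moderate"  # the >100 threshold cited in the article
    return "high"          # the >400 band linked to higher 10-year mortality

for s in (0, 50, 250, 600):
    print(s, cac_risk_band(s))
```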

The researchers hope to conduct future studies in the general population and test whether the tool can assess the impact of lipid-lowering medications on CAC scores. This could lead to the implementation of AI-CAC in clinical practice, enabling physicians to engage with patients earlier, before their heart disease advances to a cardiac event.

As Dr. Raffi Hagopian, first author and cardiologist at the VA Long Beach Healthcare System, emphasized, “Using AI for tasks like CAC detection can help shift medicine from a reactive approach to the proactive prevention of disease, reducing long-term morbidity, mortality, and healthcare costs.”
