
Computers & Math

DNA Under Threat: Scientists Warn of Growing Cyber-Risks in Next-Generation Sequencing

According to new research, next-generation DNA sequencing (NGS), the same technology that powers tailor-made medicines, cancer diagnostics, infectious disease tracking, and gene research, could become a prime target for hackers.


Scientists warn that next-generation DNA sequencing (NGS) technology could become a prime target for hackers. The NGS workflow involves complex steps, from sample preparation to sequencing to data analysis, each vulnerable to potential security breaches. Because many DNA datasets are openly accessible online, cybercriminals could misuse the information for surveillance, manipulation, or malicious experimentation.

The research study, led by Dr Nasreen Anjum from the University of Portsmouth’s School of Computing, is the first comprehensive investigation into cyber-biosecurity threats across the entire NGS workflow. The study warns that protecting genomic data isn’t just about encryption; it requires anticipating attacks that don’t yet exist. Dr Anjum emphasizes the need for a paradigm shift in securing the future of precision medicine.

The research team identified new and emerging methods that hackers can use to exploit or attack systems, such as synthetic DNA-encoded malware, AI-driven manipulation of genome data, and identity tracing through re-identification techniques. These threats pose risks to individual privacy, scientific integrity, and national security.
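
To make the first of those threats concrete: sequencing software ultimately treats reads as data, and a crafted sequence can smuggle arbitrary bytes into downstream tools. Below is a minimal illustrative sketch of how a pipeline might screen decoded reads for executable-like signatures; the 2-bit base encoding and the signature list are hypothetical choices for illustration, not details from the study.

```python
# Illustrative sketch: screen DNA reads for byte patterns that decode to
# executable-like signatures. The 2-bit base encoding and the signature
# list are hypothetical choices for illustration, not from the study.

BASE_TO_BITS = {"A": "00", "C": "01", "G": "10", "T": "11"}
SUSPICIOUS_SIGNATURES = [b"MZ", b"\x7fELF"]  # common executable headers

def decode_read(read: str) -> bytes:
    """Decode a DNA read into bytes, four bases per byte."""
    bits = "".join(BASE_TO_BITS.get(base, "00") for base in read.upper())
    usable = len(bits) - len(bits) % 8
    return bytes(int(bits[i:i + 8], 2) for i in range(0, usable, 8))

def flag_read(read: str) -> bool:
    """Return True if the decoded read contains a suspicious signature."""
    payload = decode_read(read)
    return any(sig in payload for sig in SUSPICIOUS_SIGNATURES)

# "CATCCCGG" decodes to b"MZ" (a DOS/PE header) under this encoding.
assert flag_read("CATCCCGG")
assert not flag_read("ACGTACGT")
```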

Dr Mahreen-Ul-Hassan, a microbiologist and co-author from Shaheed Benazir Bhutto Women University, notes that genomic data is one of the most personal forms of data we have; if compromised, the consequences go far beyond those of a typical data breach.

The research team recommends practical solutions, including secure sequencing protocols, encrypted storage, and AI-powered anomaly detection. They urge governments, regulatory bodies, funding agencies, and academic institutions to prioritize this field and invest in dedicated research, education, and policy development before it’s too late.
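
As one concrete reading of the anomaly-detection recommendation, a lab could monitor per-run quality metrics and flag runs that drift from historical norms. Here is a sketch using scikit-learn's IsolationForest; the choice of metrics, their distributions, and the thresholds are invented assumptions for illustration, not prescriptions from the study.

```python
# Sketch: flag anomalous sequencing runs from routine QC metrics using
# an isolation forest. The metrics and their distributions are invented
# assumptions for illustration, not recommendations from the study.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Historical runs: [reads (millions), mean Phred quality, GC fraction, error rate]
normal_runs = np.column_stack([
    rng.normal(400, 30, 500),
    rng.normal(35, 1.5, 500),
    rng.normal(0.41, 0.02, 500),
    rng.normal(0.002, 0.0005, 500),
])

detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_runs)

# A run with a wildly unusual GC fraction and error rate, as tampered or
# corrupted data might show, should be flagged as an outlier (-1).
new_runs = np.array([
    [395.0, 35.2, 0.41, 0.0021],  # typical run
    [410.0, 34.8, 0.62, 0.0150],  # anomalous run
])
print(detector.predict(new_runs))  # expected: [ 1 -1]
```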

Without coordinated action, genomic data could be exploited for surveillance, discrimination, or even bioterrorism. The study highlights the need for interdisciplinary cooperation between computer scientists, bioinformaticians, biotechnologists, and security professionals to prevent these threats.

The study was funded by the British Council’s UK-Saudi Challenge Fund and a Quality Related Research Grant from the University of Portsmouth.

Computers & Math

Quantum Computers Just Beat Classical Ones – Exponentially and Unconditionally

A research team has achieved the holy grail of quantum computing: an exponential speedup that’s unconditional. By using clever error correction and IBM’s powerful 127-qubit processors, they tackled a variation of Simon’s problem, showing quantum machines are now breaking free from classical limitations, for real.


Quantum computers have been touted as potential game-changers for computation, medicine, coding, and material discovery, but only if they can be made to work reliably. One major obstacle has been the noise, or errors, produced during computation, which until recently kept quantum machines from outperforming classical computers.

Daniel Lidar, holder of the Viterbi Professorship in Engineering and Professor of Electrical and Computer Engineering at the USC Viterbi School of Engineering, has made significant strides in quantum error correction. In a recent study with collaborators at USC and Johns Hopkins, he demonstrated an exponential scaling advantage for quantum computing using two 127-qubit IBM Quantum Eagle processors accessed over the cloud.

The key milestone for quantum computing, Lidar says, is to demonstrate that we can execute entire algorithms with a scaling speedup relative to ordinary “classical” computers. An exponential speedup means that as you increase a problem’s size by including more variables, the gap between the quantum and classical performance keeps growing – roughly doubling for every additional variable.

Lidar clarifies that this type of speedup is unconditional, meaning it doesn’t rely on unproven assumptions. Prior speedup claims required assuming there was no better classical algorithm against which to benchmark the quantum algorithm. This study used an algorithm modified for the quantum computer to solve a variation of “Simon’s problem,” an early example of a quantum algorithm that solves a task exponentially faster than any classical counterpart, unconditionally.

Simon’s problem involves finding a hidden repeating pattern in a mathematical function and is considered the precursor to Shor’s factoring algorithm, which can be used to break codes. Framed as a pattern-finding game, quantum players can win it exponentially faster than classical players.
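
To see why the problem is classically hard, consider the black-box version: a function f hides a secret bitmask s such that f(x) = f(x XOR s), and the only way to learn anything is to query f. A classical solver must hunt for a collision, which takes exponentially many queries as the bit-width n grows, while Simon's quantum algorithm needs only about n queries. Here is a small Python sketch of the classical side; the oracle construction is a generic textbook formulation, not the modified algorithm the team ran on hardware.

```python
# Classical side of Simon's problem: find the hidden mask s satisfying
# f(x) == f(x XOR s), given only black-box access to f. A classical
# search needs exponentially many queries in n; Simon's quantum
# algorithm needs only O(n).
import random

def make_oracle(n: int, s: int):
    """Build a 2-to-1 black-box f with hidden period s (s != 0)."""
    keys = sorted({min(x, x ^ s) for x in range(2 ** n)})
    values = random.sample(range(2 ** n), len(keys))  # distinct outputs per pair
    label = dict(zip(keys, values))
    return lambda x: label[min(x, x ^ s)]

def find_period_classically(f, n: int) -> int:
    """Query f until two inputs collide; their XOR reveals s."""
    seen = {}
    for queries, x in enumerate(range(2 ** n), start=1):
        y = f(x)
        if y in seen:
            print(f"collision after {queries} classical queries")
            return seen[y] ^ x
        seen[y] = x
    raise ValueError("no hidden period found (s may be 0)")

n, s = 10, 0b1011001101
assert find_period_classically(make_oracle(n, s), n) == s
```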

The team achieved their exponential speedup by squeezing every ounce of performance from the hardware: shorter circuits, smarter pulse sequences, and statistical error mitigation. They limited data input, compressed quantum logic operations using transpilation, applied dynamical decoupling to detach qubits from noise, and used measurement error mitigation to correct errors left over after dynamical decoupling.
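
Of those mitigation layers, measurement error mitigation is the simplest to illustrate outside a quantum stack: calibrate how often each basis state is misread, then invert that calibration to correct the measured statistics. Below is a toy single-qubit sketch in numpy; the error rates are invented for illustration and have nothing to do with the Eagle processors' actual calibration.

```python
# Toy measurement error mitigation for a single qubit. Calibration:
# prepare |0> and |1> many times and record how often each is misread.
# The error rates below are invented for illustration.
import numpy as np

p01 = 0.02  # P(read 1 | prepared 0)
p10 = 0.05  # P(read 0 | prepared 1)

# Confusion matrix A, where A[i, j] = P(measure i | true state j)
A = np.array([[1 - p01, p10],
              [p01, 1 - p10]])

measured = np.array([0.52, 0.48])  # noisy distribution from an experiment

# Invert the calibration, then clip and renormalize so the estimate
# remains a valid probability distribution.
mitigated = np.linalg.solve(A, measured)
mitigated = np.clip(mitigated, 0.0, None)
mitigated /= mitigated.sum()
print(mitigated)  # estimate of the noise-free distribution
```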

Lidar says this result shows that today’s quantum computers firmly lie on the side of a scaling quantum advantage. Because the exponential speedup is unconditional, the performance separation cannot be reversed, making it increasingly difficult to dispute. Next steps include demonstrating practical real-world applications, reducing noise and decoherence in ever larger quantum computers, and pursuing speedups that do not rely on oracles.


Computational Biology

A Quantum Leap Forward – New Amplifier Boosts Efficiency of Quantum Computers 10x

Chalmers engineers built a pulse-driven qubit amplifier that’s ten times more efficient, stays cool, and safeguards quantum states—key for bigger, better quantum machines.


Quantum computers have long been touted as revolutionary machines capable of solving complex problems that stymie conventional supercomputers. However, their full potential has been hindered by the limitations of qubit amplifiers – essential components required to read and interpret quantum information. Researchers at Chalmers University of Technology in Sweden have taken a significant step forward with the development of an ultra-efficient amplifier that reduces power consumption by 90%, paving the way for more powerful quantum computers with enhanced performance.

The new amplifier is pulse-operated, meaning it’s activated only when needed to amplify qubit signals, minimizing heat generation and decoherence. This innovation has far-reaching implications for scaling up quantum computers, as larger systems require more amplifiers, leading to increased power consumption and decreased accuracy. The Chalmers team’s breakthrough offers a solution to this challenge, enabling the development of more accurate readout systems for future generations of quantum computers.
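
The power saving from pulse operation follows directly from the duty cycle: if the amplifier draws full power only during brief readout windows, average consumption scales with the fraction of time it is on. A bit of arithmetic makes the point; all numbers below are illustrative assumptions chosen only to show how a small duty cycle yields a roughly 90% saving, not Chalmers' measured figures.

```python
# Back-of-the-envelope average power for a pulse-operated amplifier.
# All numbers are illustrative assumptions, not Chalmers' figures.
P_ON = 1.0          # normalized power while actively amplifying
P_STANDBY = 0.02    # normalized power while idle
DUTY_CYCLE = 0.08   # fraction of time spent amplifying readout pulses

pulsed = DUTY_CYCLE * P_ON + (1 - DUTY_CYCLE) * P_STANDBY
print(f"average power: {pulsed:.3f} vs. 1.0 always-on")
print(f"reduction: {100 * (1 - pulsed):.0f}%")  # ~90% with these numbers
```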

One of the key challenges in developing pulse-operated amplifiers is ensuring they respond quickly enough to keep pace with qubit readout. To address this, the researchers employed genetic programming to develop a smart control system that enables rapid response times – just 35 nanoseconds. This achievement has significant implications for the future of quantum computing, as it paves the way for more accurate and powerful calculations.
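
Genetic programming proper evolves whole program structures, but the core evolutionary loop of selection, crossover, and mutation is easy to sketch with a plain genetic algorithm. The toy below evolves a parameter vector toward a hypothetical target response; the fitness function and target values are invented for illustration and bear no relation to the actual Chalmers control system.

```python
# Minimal genetic-algorithm sketch: evolve a control parameter vector
# toward a target response. Purely illustrative; the real control
# system and its fitness criteria are far more involved.
import random

TARGET = [0.2, 0.8, 0.5]  # hypothetical ideal control settings

def fitness(genome):
    """Negative squared distance to the target (higher is better)."""
    return -sum((g - t) ** 2 for g, t in zip(genome, TARGET))

def evolve(pop_size=50, generations=100, mutation=0.1):
    pop = [[random.random() for _ in TARGET] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 4]           # selection
        children = []
        while len(children) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(len(TARGET))  # one-point crossover
            child = [g + random.gauss(0, mutation) for g in a[:cut] + b[cut:]]
            children.append(child)                # mutated offspring
        pop = children
    return max(pop, key=fitness)

best = evolve()
print([round(g, 2) for g in best])  # approaches TARGET
```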

The new amplifier was developed in collaboration with industry partner Low Noise Factory AB and draws on the expertise of researchers at Chalmers’ Terahertz and Millimeter Wave Technology Laboratory. The study, published in IEEE Transactions on Microwave Theory and Techniques, demonstrates a novel approach to developing ultra-efficient amplifiers for qubit readout and offers promising prospects for future research.

In conclusion, the development of this highly efficient amplifier represents a significant leap forward for quantum computing. By reducing power consumption by 90%, researchers have opened doors to more powerful and accurate calculations, unlocking new possibilities in fields such as drug development, encryption, AI, and logistics. As the field continues to evolve, it will be exciting to see how this innovation shapes the future of quantum computing.


Artificial Intelligence

AI Uncovers Hidden Heart Risks in CT Scans: A Game-Changer for Cardiovascular Care

What if your old chest scans—taken years ago for something unrelated—held a secret warning about your heart? A new AI tool called AI-CAC, developed by Mass General Brigham and the VA, can now comb through routine CT scans to detect hidden signs of heart disease before symptoms strike.


Researchers at Mass General Brigham have developed an innovative artificial intelligence (AI) tool called AI-CAC to analyze previously collected CT scans and identify individuals with high coronary artery calcium (CAC) levels, which indicate a greater risk for cardiovascular events. Their research, published in NEJM AI, demonstrated the tool’s high accuracy and its predictive value for future heart attacks and 10-year mortality.

Millions of chest CT scans are taken each year, often in otherwise healthy people, to screen for lung cancer or other conditions. This study shows that these scans also carry valuable, and largely overlooked, information about cardiovascular risk. The researchers found that AI-CAC was 89.4% accurate at determining whether a scan contained CAC.

The gold standard for quantifying CAC uses “gated” CT scans, synchronized to the heartbeat to reduce motion artifacts. Most chest CT scans obtained for routine clinical purposes, however, are “nongated.” The researchers developed AI-CAC, a deep learning algorithm, to analyze these nongated scans and quantify CAC.
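
For context on what quantifying CAC involves: the standard Agatston method thresholds the CT image at 130 Hounsfield units and weights each calcified lesion’s area by its peak density. A simplified single-slice sketch of that scoring rule follows; real scoring segments lesions as connected components across slices and accounts for slice spacing, which this toy version omits, and it is not the AI-CAC model itself.

```python
# Simplified Agatston-style CAC scoring for a single CT slice. Real
# scoring segments lesions as connected components across slices and
# multiplies by slice spacing; this toy treats the thresholded region
# of one slice as a single lesion.
import numpy as np

HU_THRESHOLD = 130  # standard calcium threshold in Hounsfield units

def density_weight(peak_hu: float) -> int:
    """Standard Agatston density factor from a lesion's peak HU."""
    if peak_hu >= 400:
        return 4
    if peak_hu >= 300:
        return 3
    if peak_hu >= 200:
        return 2
    return 1 if peak_hu >= HU_THRESHOLD else 0

def slice_score(hu: np.ndarray, pixel_area_mm2: float) -> float:
    """Calcified area (mm^2) weighted by the lesion's peak density."""
    mask = hu >= HU_THRESHOLD
    if not mask.any():
        return 0.0
    area_mm2 = mask.sum() * pixel_area_mm2
    return area_mm2 * density_weight(float(hu[mask].max()))
```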

The AI-CAC model was 87.3% accurate at determining whether the score was higher or lower than 100, the threshold indicating moderate cardiovascular risk. Importantly, AI-CAC was also predictive of 10-year all-cause mortality: patients with a CAC score over 400 had a 3.49 times higher risk of death over a 10-year period.
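
Those cutoffs line up with the conventional Agatston risk bands, which a small helper makes explicit; the band labels are the commonly used clinical categories, not definitions introduced by this study.

```python
# Conventional CAC (Agatston) risk bands; the labels are commonly used
# clinical categories, not definitions introduced by this study.
def cac_risk_category(score: float) -> str:
    if score == 0:
        return "no identifiable calcium"
    if score < 100:
        return "mild"
    if score < 400:
        return "moderate"
    return "severe"

assert cac_risk_category(150) == "moderate"  # the >100 cutoff above
assert cac_risk_category(450) == "severe"    # the >400 mortality band
```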

The researchers hope to conduct future studies in the general population and test whether the tool can assess the impact of lipid-lowering medications on CAC scores. This could lead to the implementation of AI-CAC in clinical practice, enabling physicians to engage with patients earlier, before their heart disease advances to a cardiac event.

As Dr. Raffi Hagopian, first author and cardiologist at the VA Long Beach Healthcare System, emphasized, “Using AI for tasks like CAC detection can help shift medicine from a reactive approach to the proactive prevention of disease, reducing long-term morbidity, mortality, and healthcare costs.”
