Intelligent CIO North America Issue 62 | Page 36

FEATURE: QUANTUM COMPUTING
The end of classical scaling and the opening for quantum
For decades, HPC performance gains were driven by transistor scaling, energy density improvements and architectural innovation. Yet in the past 10 years, the physical limits of transistor size and chip energy capacity have slowed progress. Each new generation of supercomputer is harder, more expensive and less efficient to build.
What eFTQC means
The term ‘early fault-tolerant Quantum Computing’ refers to machines with hundreds to a thousand logical qubits, operating with logical error rates between 10⁻⁶ and 10⁻¹⁰. Unlike today’s noisy intermediate-scale quantum (NISQ) devices, which can only run approximate algorithms of uncertain value, eFTQC machines will deliver deterministic results through error correction.
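To give a sense of where the 10⁻⁶ to 10⁻¹⁰ figures come from, the sketch below uses the standard surface-code scaling model, in which the logical error rate falls exponentially with the code distance once physical error rates are below a threshold. The prefactor, threshold value and code distances are illustrative assumptions, not figures from the report or from any vendor.

```python
# Illustrative sketch of quantum error-correction scaling (not from the report).
# Assumes the common surface-code model p_L ~ A * (p / p_th)^((d + 1) / 2),
# with assumed constants A = 0.1 and threshold p_th = 1e-2.

def logical_error_rate(p_phys: float, distance: int,
                       A: float = 0.1, p_th: float = 1e-2) -> float:
    """Approximate logical error rate for a distance-d surface code."""
    return A * (p_phys / p_th) ** ((distance + 1) / 2)

# With physical error rates around 1e-3, modest code distances already land
# in the 1e-6 to 1e-10 logical range the report associates with eFTQC.
for d in (9, 13, 17):
    print(f"distance {d}: logical error rate ~ {logical_error_rate(1e-3, d):.0e}")
```

Under these assumed numbers, distance 9 gives roughly 10⁻⁶ and distance 17 roughly 10⁻¹⁰, which is why error correction, rather than better physical qubits alone, is what separates eFTQC from NISQ devices.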
At the same time, the report says, the quantum horizon has been drawing closer. Algorithms once thought to require astronomical quantum resources have become feasible thanks to advances in error correction, algorithm design and qubit encodings. For example, the estimated number of qubits required to run Shor’s algorithm for breaking RSA-2048 encryption has fallen by three orders of magnitude in the past decade.
The report projects these devices will arrive between 2027 and 2032, depending on hardware progress and vendor roadmaps. While limited in scale, they will be powerful enough to run rigorous algorithms such as quantum phase estimation (QPE) and quantum time evolution – methods that unlock new frontiers in chemistry, materials science and physics.
This dual shift – the slowing of classical HPC scaling and the acceleration of quantum feasibility – has created a critical inflection point. For many workloads, supercomputers are already approaching practical limits while quantum processors are nearing utility. eFTQC is the bridge between the present and the future: a stage where Quantum Computers are not yet universal machines but are already powerful enough to accelerate meaningful subsets of HPC workloads, the report says.
“HPC users will see benefits in accuracy, time-to-solution and computational cost as hybrid workflows shift subproblems to quantum processors,” explained Théau Peronnin, CEO, Alice & Bob. “HPC centers that want to lead have to co-design these hybrid workflows with users and vendors, shape efficient infrastructure and deploy prototypes early.”
The promise for scientific domains
Materials Science
Materials Science is identified in the report as the first major domain expected to benefit from eFTQC. With just a few hundred logical qubits, Quantum Computers will be able to solve the Hubbard model in regimes that are classically intractable. This opens the door to breakthroughs in superconductivity, magnetic materials and spin models – fields with enormous industrial implications from lossless power transmission to ultra-efficient computing, the report says.
At leading facilities like the National Energy Research Scientific Computing Center (NERSC) and Los Alamos National Laboratory (LANL), materials simulations already account for up to 20% of workloads. The report says that with eFTQC, those simulations could be accelerated dramatically, freeing resources and unlocking new research questions.
Quantum Chemistry
The next frontier identified in the report is Quantum Chemistry. Complex molecules with strong electronic correlations, such as FeMoco – the active site of nitrogenase enzymes critical to biological nitrogen fixation – are beyond the reach of classical full