Quantum computing: beyond the hype

Developing this revolutionary technology is challenging. But there are grounds for optimism.

Jonathan Jones

Quantum computers are the most famous example of the new quantum technologies predicted to transform the 21st century.

Their development would mark a significant advance over classical computers. The latter use ‘bits’, which can be either 0 or 1, as their basic unit of data, storing information in very long strings of such bits. Quantum computers, however, use ‘quantum bits’ or ‘qubits’, which can be both 0 and 1 at the same time. A register of qubits can hold enormously many possibilities at once, allowing a quantum computer to perform some calculations far more rapidly than any classical computer can.
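To see what ‘both 0 and 1 at the same time’ means, here is a minimal sketch of the standard textbook picture, in which a qubit is described by a pair of amplitudes:

```python
# A minimal sketch of the textbook amplitude picture of a qubit.
import numpy as np

ket0 = np.array([1.0, 0.0])                   # the definite state |0>
hadamard = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

qubit = hadamard @ ket0                       # equal parts |0> and |1>
print(np.abs(qubit) ** 2)                     # [0.5 0.5]: until measured,
                                              # both outcomes coexist
```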

The speed with which a quantum computer could tackle certain problems, and the range of tasks this would unlock, could have remarkable applications, for good or ill. But there’s a problem. Building quantum computers is proving challenging, leading to growing claims that they have been overhyped.

So is the bubble about to burst? Sceptics regularly point out that quantum computing – like nuclear fusion – always seems to be 10 years away. Optimism has been replaced by pessimism, and some now even talk of a ‘quantum winter’.

But while this new-found realism is welcome, it may have gone too far. The public face of quantum computing has certainly been one of unparalleled hype. But the more private face of academic research has shown extraordinary progress towards building practical working devices.

Quantum information processing has been rediscovered many times for various reasons. Richard Feynman wanted to simulate physics with computers, after concluding that ‘nature isn’t classical, dammit’. David Deutsch wanted to use quantum computing to prove the existence of parallel universes. And Stephen Wiesner invented what later became quantum cryptography, but was so disappointed by its reception that he retreated to growing vegetables.

These early proposals for quantum computing were met with considerable scepticism, with most experts believing that actually building a quantum computer would be almost impossible. The main reason given was that, like all devices, computers are prone to errors. In classical computers, such errors are kept in check by error-correcting codes, which work by copying information and checking it. Unfortunately, these classical codes cannot be used with quantum devices: quantum information cannot be copied, and measuring a qubit to check it destroys its delicate state. Sceptics therefore concluded that a quantum computer would have to operate essentially without errors, which seemed impossible.

This problem was solved by Peter Shor and Andrew Steane, who independently discovered quantum error correction in the mid-1990s. The field has since flourished: the online Error Correction Zoo currently exhibits more than 300 quantum codes, optimised for different conditions. This means that quantum computers don’t have to be perfect, but they still have to be very good, and building them remains a challenge.
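The flavour of the idea can be seen in its simplest ingredient, the three-qubit bit-flip code. The sketch below simulates only the classical part of that code (bit-flip noise and majority voting, an illustrative simplification); the quantum insight is that the same parity checks can be measured without disturbing the encoded superposition:

```python
# A sketch of the three-qubit bit-flip code, simulated classically;
# real quantum codes measure these parities without reading the data.
import random

def encode(bit):
    return [bit, bit, bit]                 # repetition encoding

def add_noise(codeword, p):
    return [b ^ (random.random() < p) for b in codeword]

def decode(codeword):
    return int(sum(codeword) >= 2)         # majority vote

p, trials = 0.05, 100_000
fails = sum(decode(add_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"logical error ~ {fails / trials:.4f} vs physical {p}")
# Roughly 3*p**2 ~ 0.007: encoding wins whenever p is small enough.
```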

Quantum computations with a handful of quantum bits have been demonstrated for decades, and simple devices can be bought as desktop toys, but these are very much toys. One early triumph was using Peter Shor’s quantum-factoring algorithm to find the prime factors of 15, which is hardly a challenging problem.
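Factoring 15 this way is easy to mimic. In Shor’s algorithm, the only quantum step is finding the period of modular exponentiation; everything else is classical number theory. In the sketch below, that period is found by brute force, standing in for the quantum computer:

```python
# A sketch of the classical scaffolding of Shor's algorithm; here the
# period is found by brute force, the one step a quantum computer
# performs exponentially faster.
from math import gcd

def shor_factor(N, a):
    assert gcd(a, N) == 1
    r = 1
    while pow(a, r, N) != 1:       # find the period of a**x mod N
        r += 1
    if r % 2:
        return None                # odd period: retry with another a
    x = pow(a, r // 2, N)
    return gcd(x - 1, N), gcd(x + 1, N)

print(shor_factor(15, 7))          # (3, 5): period 4, factors recovered
```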

A practical quantum computer requires large numbers (thousands to millions) of high-quality quantum bits (error rates under one part in a thousand, preferably around one part in a million). Both targets are challenging, and achieving them simultaneously is even more difficult. The obvious strategy is to tackle the problem in two stages, but which one should be tackled first? Here, the approaches of industry and academia have diverged.
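A back-of-envelope calculation suggests why the two targets interact. Assuming the commonly quoted surface-code scaling, in which logical errors fall rapidly once physical errors are below a threshold (the constants below are illustrative assumptions, not figures from this article), the overhead per reliable logical qubit can be estimated:

```python
# A rough sketch of error-correction overhead, assuming the commonly
# quoted surface-code scaling; all constants here are illustrative.
P_TH = 1e-2               # assumed fault-tolerance threshold
p = 1e-3                  # physical error rate: one part in a thousand

d = 3                     # code distance (odd for the surface code)
while 0.1 * (p / P_TH) ** ((d + 1) // 2) > 1e-12:  # target logical rate
    d += 2

print(d, 2 * d * d)       # distance ~ low twenties, roughly a thousand
                          # physical qubits per logical qubit: a thousand
                          # good logical qubits needs about a million.
```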

Major industrial efforts, undertaken by the likes of Google and IBM, have mostly concentrated on superconducting technologies. These are well suited to building large numbers of quantum bits, using approaches familiar from other nanotechnologies. However, these qubits are normally of low quality, and the devices are frequently called Noisy Intermediate-Scale Quantum (NISQ) computers, to distinguish them from true quantum computers.

Whether these NISQ devices are genuinely useful remains controversial, and most of the accusations of quantum hype relate to this question. Claims are regularly made that NISQ devices have achieved quantum supremacy – that is, that they have performed calculations beyond the capacity of any conventional computer. But such claims are just as regularly debunked by researchers who reproduce the results with clever simulations on ordinary classical computers.
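Why such debunkings are possible at all is worth spelling out. Brute-force simulation tracks the full quantum state, whose memory cost doubles with every added qubit, as the sketch below shows; the debunkers succeed by exploiting the structure of particular circuits rather than storing the whole state:

```python
# A sketch of why brute-force classical simulation hits a wall: the
# state vector of n qubits holds 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16                   # one complex128 number

for n in (30, 40, 50):
    gib = 2**n * BYTES_PER_AMPLITUDE / 2**30
    print(f"{n} qubits: {gib:,.0f} GiB of memory")
# 30 qubits fits on a laptop (16 GiB); 40 needs ~16 TiB; 50 needs
# ~16 PiB, beyond any machine that stores the state explicitly.
```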

Academics, and academic spinout companies such as IonQ, have taken a different approach. They have largely pursued quality over quantity, assuming that it is easier to scale up a good, small implementation than to improve the quality of an unreliable heap. Most of this effort has involved trapped ion technologies, building on the experience gained in the development of atomic clocks. Several years ago, these experiments reached the ‘fault-tolerant threshold’, which means that they are good enough to start on the next stage. Scaling them up remains a formidable problem, but there are no fundamental reasons why this could not be done.

This progress with trapped ions has led to renewed academic enthusiasm, with some believing that quantum computers might really be only a decade away. But a third approach, based on trapped atoms rather than trapped ions, has recently leapt ahead. This approach shows remarkable promise.

Trapping and manipulating weakly interacting uncharged atoms is much more complicated than doing the same for strongly interacting charged ions. But these weak interactions bring their own advantages. In March 2024, researchers at Caltech reported trapping and controlling over 6,000 atoms, using technologies adapted from optical tweezers, and performing simple manipulations at the precision required for quantum computing. Although several key requirements are still missing, this rapid progress means that there is now a clear third player in the quantum game.

It is still too early to predict the future for computing. But just maybe, quantum spring is in the air.

Jonathan Jones is a professor of physics at the University of Oxford, where he works on nuclear magnetic resonance (NMR) quantum computation in the Department of Atomic and Laser Physics.
