Quantum computing’s daunting challenges

By Martyn Warwick

Jul 24, 2023

  • Utility quantum computing is edging closer to commercial reality, but serious problems are still to be solved
  • Error correction tops the list
  • Compatibility and interoperability are also stumbling blocks 
  • Agreed standards and protocols for hardware and software for apps and comms interfaces are required 
  • There’s a need to optimise data transfer between quantum and classical computers, which are likely to co-exist forever

Over the past five years, quantum computing has come on in leaps and bounds. However, we are still a long way from the era of scalable “utility quantum computing”, when the number of qubits in a device will be counted in the many thousands rather than a few hundred, and when machines will be robust enough to routinely, and very quickly, solve pressing problems that, while not strictly beyond the capabilities of classical binary computers, would take them centuries to work through.

Quantum computers are hugely complex and incredibly susceptible to ‘noise’ (such as heat, electronics, magnetic fields, cosmic radiation and even stray light) impinging on the immensely delicate environment in which they operate. They are error-prone and fault-intolerant: as the processor works – even if only for a few milliseconds – errors are introduced and accumulate to the point that the quantum state itself decoheres. That’s why so much of a quantum computer is devoted to protecting its qubits as thoroughly as possible, so that errors are minimised and the quantum state persists for as long as possible.
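To see why even millisecond-scale computations are so fragile, a rough back-of-envelope calculation helps. The sketch below is purely illustrative – the per-gate error rate and circuit depth are hypothetical figures, not measurements from any real device – but it shows how quickly independent errors compound:

```python
# Illustrative only: chance that a qubit gets through N gate operations
# error-free, assuming each gate fails independently with probability p.
def survival_probability(p: float, n_gates: int) -> float:
    """Probability of zero errors after n_gates operations."""
    return (1 - p) ** n_gates

# With a hypothetical 0.1% error rate per gate, a 10,000-gate circuit
# is almost guaranteed to suffer at least one error:
print(survival_probability(0.001, 10_000))  # roughly 4.5e-05
```

Even a seemingly tiny per-operation error rate leaves essentially no chance of a long computation finishing cleanly, which is why error correction dominates quantum hardware design.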

Many forms of day-to-day technology already require error protection – telecoms and datacentre operations, for example – but in quantum computing, error correction is such an enormous problem that it has been likened to juggling with loose soot whilst trying to herd cats. The remedy seems to be the “logical qubit” – a set of physical qubits operating together – but logical qubits are themselves very hard to construct and manage.

Depending on the error-correction regime used in a particular set of circumstances and the error rates of each physical qubit, a single logical qubit might easily comprise 1,000 or more physical qubits, of which the great majority will be dedicated to identifying and correcting errors in real time, while only a few qubits actually do the computational processing. In early tests of logical qubits, groups were chained in sets of nine where, for one data qubit acting as the processor, eight ancillary qubits identified and corrected errors: the overhead is enormous, as is the energy required.
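The intuition behind that redundancy can be sketched with a classical analogy to a repetition code. This is not the actual quantum protocol – real codes use syndrome measurements on ancilla qubits rather than reading the data directly – and the nine-copy encoding and 5% flip rate below are illustrative choices, but the trade-off is the same: many physical carriers are spent to make one logical value far more reliable.

```python
import random

# Classical analogy of a repetition code: one logical bit is encoded
# redundantly, noise flips individual copies at random, and a majority
# vote recovers the intended value.
def encode(bit: int, copies: int = 9) -> list[int]:
    """Encode one logical bit into several physical copies."""
    return [bit] * copies

def apply_noise(code: list[int], flip_prob: float) -> list[int]:
    """Flip each copy independently with probability flip_prob."""
    return [b ^ (random.random() < flip_prob) for b in code]

def decode(code: list[int]) -> int:
    """Majority vote over the surviving copies."""
    return int(sum(code) > len(code) / 2)

random.seed(0)
trials = 10_000
flip_prob = 0.05  # hypothetical physical error rate

# Unprotected: a single bit is simply wrong 5% of the time.
raw_errors = sum(random.random() < flip_prob for _ in range(trials))

# Protected: the logical bit fails only if a majority of copies flip.
coded_errors = sum(
    decode(apply_noise(encode(1), flip_prob)) != 1 for _ in range(trials)
)

print(raw_errors / trials)    # around 0.05
print(coded_errors / trials)  # orders of magnitude lower
```

Nine physical carriers per logical bit, most of them doing nothing but redundancy – scaled up to thousands of logical qubits, the overhead the article describes becomes clear.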

Nonetheless, the search for answers to the problem of quantum error correction is a long-term proposition, and it continues apace. So too does the race to scale up quantum computers to thousands of qubits while keeping coherence as high, and decoherence and error rates as low, as possible.

Furthermore, given that quantum computers and classical computers are going to co-exist, presumably forever, the race is on to develop ways and means to optimise the transfer of data between the two very different technologies: such methodologies will be vital to maximising the value of practical, complementary and compatible applications.

That will require the design and development of standards and protocols for hardware, for software and for applications and communications interfaces that will facilitate interoperability between different quantum computing platforms, of which there are a surprisingly large number. There will also be a need for benchmarking standards to measure and compare performance between quantum computers.

Unified R&D, skills shortages and immense costs are also major hurdles

As if such challenges were not enough, quantum computing expert Lawrence Gasman (pictured below) highlighted a number of others to TelecomTV during a recent interview. Gasman is a former senior fellow in telecommunications at the Washington DC-headquartered thinktank the Cato Institute, and the founder and president of research and consultancy house Inside Quantum Technology. He has plenty to say about the challenges and opportunities associated with the sector – we already know from our previous article on this topic that he isn’t keen on major tech companies claiming bragging rights and using terms such as “quantum supremacy” – see Google reignites the ‘quantum supremacy’ debate – again.

Lawrence Gasman, founder and president of research and consultancy house Inside Quantum Technology.

Talking to TelecomTV, Gasman stressed that current approaches to the development of specialist hardware and software are holding back advances in utility quantum computing, as there is currently no common or unified approach to the challenges of developing scalable, fault-tolerant qubit control technology. 

It is generally agreed that there are seven primary qubit technologies for quantum computing. These are: superconducting qubits; semiconductor quantum dots; trapped ion qubits; photonic qubits; defect-based qubits; topological nanowire qubits; and nuclear magnetic resonance qubits. Different companies and institutions researching and making quantum computers use different qubit technologies, each of which has its own strengths and weaknesses.

Meanwhile, the software side of the equation is equally problematic: new programming languages and compilers have to be developed, and quantum algorithms are in their infancy.

Add to that heady brew the global lack of trained and/or experienced quantum scientists and engineers, and the sheer overall expense of the entire quantum computing enterprise, and the challenges seem daunting. But then so was (and still is) going to the moon and beyond. The problems of quantum computing are big, but the science and technology are advancing very quickly.

While he’s practical about the many challenges, Gasman is optimistic about the increasing number of applications that are the direct result of quantum computing. He told TelecomTV, “Five or six years ago, quantum computers were just R&D devices and now they are moving to drug discovery and materials design. Just about every big drug company now works with quantum computers.”  

He added that quantum devices “are moving from the hundreds into the thousands [of qubits]” and are capable of doing highly advanced work. For example, “Quantum chemistry using quantum computers is a real thing. It’s still R&D-ish, but with the emphasis on the D.” 

Elsewhere, vehicle and aircraft manufacturers are using them in developing new coatings for cars and aircraft, noted Gasman.

Moving on to other sectors, he added, “We are also close to, but not yet quite at the point of, using quantum computers in financial services in areas such as arbitrage between currencies by simulation, something that is simply not possible with classical computers. What quantum computers do, and do very well, is run all the possibilities for a given set of circumstances and parameters, via billions of system scenarios, to come up with an optimal answer. Most big banks now have quantum teams – today it’s still mainly R&D, but tomorrow it’ll be for operational purposes.”

Asked about the possibility that many organisations and enterprises will have their own quantum computers at some time in the future, Gasman said, “Look at healthcare – Cleveland Clinic [the international hospital group] has just bought its own quantum computer from IBM. This is like what happened with classical computing. At the start of the era of the mainframes, organisations used them under time-sharing agreements. And what happened later? If you timeshare, generally, you’ll want your own machine.” 

Gasman believes that when the cost of quantum computing comes down (as it will) the devices will become more easily available and smaller organisations will get better access to the technology. Then the age of the end-user mini-quantum computer will dawn, and the world will change again – likely much sooner, he says, than we might expect.

- Martyn Warwick, Editor in Chief, TelecomTV