Quantum computing has its limits

In a lecture at the Massachusetts Institute of Technology in 1981, Richard Feynman talked about “simulating physics with computers”. That was already being done at the time, but Feynman said he wanted to talk about “the possibility that there is to be an exact simulation, that the computer will do exactly the same as nature”. And since nature is quantum mechanical, he pointed out, what you need for that is a quantum computer.

The rest is history, but that history is still in the making. When I recently asked David Deutsch, the visionary physicist who in 1985 explained what quantum computing might look like, whether he was surprised at how quickly the idea has become a practical technology, he replied with characteristic laconicism: “It hasn’t.” You can see his point. Sure, in October President Joe Biden visited IBM’s new quantum data center in Poughkeepsie, New York, to see an entire room full of the company’s quantum computers. And on November 9, IBM announced its 433-quantum-bit (qubit) Osprey processor, although it seems only yesterday that we were excited about Google’s 53-qubit Sycamore chip – with which the Google team claimed in 2019 to have demonstrated “quantum supremacy”, meaning it could perform in minutes a calculation that would take the best classical computer several millennia.1 That claim has since been disputed.

Deutsch’s reluctance to accept that practical quantum computing has arrived probably stems from the question of whether it can yet do anything truly useful. Sure, one can construct a problem that is very difficult for a classical device but perfectly suited to a quantum computer, and then demonstrate that just a few dozen qubits are enough to achieve “supremacy”. But how useful is that in the proverbial real world? When Feynman described the idea of quantum computing, he had in mind that such a machine would be used to simulate systems governed by quantum laws, such as molecules and materials. Instead of using cumbersome classical approximations such as standard ab initio quantum-chemical methods, one would represent the quantum states of atoms and molecules in their own terms to calculate properties such as energy spectra, electronic band structures and stabilities.
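
To make that idea concrete, here is a minimal sketch – entirely illustrative, and not drawn from any of the work discussed here – of what “calculating an energy spectrum” means once a molecule has been boiled down to a couple of qubits. The Pauli-term coefficients are invented placeholders; a real calculation would derive them from the molecule’s electronic structure. The classical route below requires diagonalising a 2^n × 2^n matrix, which is exactly what becomes hopeless for large systems and is what a quantum computer is supposed to sidestep.

```python
import numpy as np

# Single-qubit Pauli matrices
I = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

# Toy two-qubit Hamiltonian written as a sum of Pauli terms:
# H = c0*II + c1*ZI + c2*IZ + c3*ZZ + c4*XX  (coefficients are made up)
coeffs = [-1.05, 0.39, 0.39, 0.01, 0.18]
terms = [np.kron(I, I), np.kron(Z, I), np.kron(I, Z),
         np.kron(Z, Z), np.kron(X, X)]
H = sum(c * t for c, t in zip(coeffs, terms))

# The classical route: diagonalise the full 2^n x 2^n matrix to get the
# energy spectrum -- the step that blows up for big molecules.
energies = np.linalg.eigvalsh(H)
print("Energy levels (toy units):", np.round(energies, 3))
```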

Small steps

Quantum computers have been doing this for several years now – sort of. Alán Aspuru-Guzik, then at the University of California, Berkeley, and his colleagues showed in 2005 that it should be possible to simulate simple molecules such as water and lithium hydride with just a few qubits.2 And in 2017, an IBM team used a simple six-qubit circuit to simulate LiH and BeH2 – the latter being the first triatomic species to be simulated in this way.3 The results were modest, but they established the feasibility of doing quantum chemistry in a quantum way.
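
The IBM demonstration used a hybrid “variational” scheme, in which a quantum circuit prepares a trial state and a classical optimiser tunes its parameters to minimise the energy. The sketch below captures only that spirit: it simulates a one-parameter trial state classically, on the same toy Hamiltonian with invented coefficients as above, and is not the circuit the IBM team actually ran.

```python
import numpy as np
from scipy.optimize import minimize

# Toy two-qubit Hamiltonian (invented coefficients, as before)
I2, X = np.eye(2), np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])
H = (-1.05 * np.kron(I2, I2) + 0.39 * np.kron(Z, I2) + 0.39 * np.kron(I2, Z)
     + 0.01 * np.kron(Z, Z) + 0.18 * np.kron(X, X))

def trial_state(theta):
    # One-parameter entangled ansatz: cos(theta)|00> + sin(theta)|11>
    psi = np.zeros(4)
    psi[0], psi[3] = np.cos(theta), np.sin(theta)
    return psi

def energy(params):
    # On real hardware this expectation value comes from repeated measurements
    psi = trial_state(params[0])
    return float(psi @ H @ psi)

result = minimize(energy, x0=[0.1], method="Nelder-Mead")
print(f"Variational estimate: {result.fun:.4f}")
print(f"Exact ground state:   {np.linalg.eigvalsh(H)[0]:.4f}")
```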

Given the growth in available resources over the past few years, one might expect that we could do much more by now. But a new study by Garnet Chan of the California Institute of Technology and colleagues puts that – and Deutsch’s comment – into perspective.4 They used Google’s 53-qubit Sycamore chip to simulate a molecule and a material of real interest, choosing their test cases without seeking out problems well suited to a quantum approach. One was the cluster of eight iron and sulfur atoms in the catalytic core of the enzyme nitrogenase, which fixes atmospheric nitrogen into biologically usable forms. Understanding this process could be useful for developing artificial nitrogen-fixation catalysts. The other was the crystalline material alpha-ruthenium trichloride (α-RuCl3), a compound of great interest in the field of quantum materials because it is believed to display an exotic low-temperature phase called a spin liquid.5

How did the chip perform? Frankly, rather indifferently. Chan admits he initially thought that, with 53 qubits at their disposal, they would be able to simulate these systems with aplomb. But getting to grips with the problem disabused him of that idea. By mapping the problems onto the quantum circuit, the researchers could reasonably attempt to calculate, for example, the energy spectra of the FeS cluster and the heat capacity of α-RuCl3 – but nothing that conventional methods can’t do at least as well. One of the main problems is noise: current qubits are error-prone, and there are as yet no practical ways of correcting these quantum errors. So if a calculation involves too many logical steps, noise overwhelms the result, producing gibberish. For these reasons, the team was only able to exploit a fraction – around a fifth – of the resources offered by the processor.
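
A back-of-the-envelope calculation shows why depth is the killer. Assume, purely for illustration, that every two-qubit gate fails with probability 0.5% – a number in the right ballpark for today’s hardware, not one taken from the Sycamore experiments. The chance of an error-free run then decays exponentially with the number of gates.

```python
# Illustrative only: an assumed per-gate error rate, not a measured one.
error_per_gate = 0.005

for n_gates in (100, 500, 1000, 5000):
    p_clean = (1 - error_per_gate) ** n_gates
    print(f"{n_gates:5d} gates -> {100 * p_clean:.3g}% chance of an error-free run")
```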

It’s a sobering reminder of where we are now. Sure, things will keep getting better, but you’re not going to throw away your old quantum chemical algorithms just yet.

References

1 F Arute et al, Nature, 2019, 574, 505 (DOI: 10.1038/s41586-019-1666-5)

2 A Aspuru-Guzik et al, Science, 2005, 309, 1704 (DOI: 10.1126/science.1113479)

3 A Kandala et al, Nature, 2017, 549, 242 (DOI: 10.1038/nature23879)

4 R N Tazhigulov et al, PRX Quantum, 2022, 3, 040318 (DOI: 10.1103/PRXQuantum.3.040318)

5 H Li et al, Nat. Commun., 2021, 12, 3513 (DOI: 10.1038/s41467-021-23826-1)
