I was fortunate to get tickets to the Economist's "Commercialising Quantum" conference on 13/14th May 2025. Over two days there were presentations and panels from manufacturers, researchers and other industry figures. Like all conferences, it was a bit of a mixed bag. But here are the main points that struck me.
When Will it be Useful?
Realistically, "quantum superiority" for a limited class of applications is probably 3-5 years away. IBM said 3, though they're probably being optimistic. "Q-Moore's Law" seems to hold - the number of qubits roughly doubles each year.
Strong consensus that the likely first application is chemistry and materials science (which is just a special case of chemistry anyway). For this you don't need anything spectacular: as soon as you have enough reliable ("logical") qubits to do more than conventional computers can (say 30-40), you have something you can use.
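A rough way to see why 30-40 reliable qubits is about where brute-force classical simulation gives up (this framing and the 16-bytes-per-amplitude figure are my own illustration, not from the conference): simulating n qubits exactly means storing 2^n complex amplitudes.

    # Memory needed to hold the full state vector of n qubits classically,
    # assuming 16 bytes per complex amplitude. Illustrative only - cleverer
    # simulation methods can do better for some circuits.
    for n in (30, 40, 50):
        gib = 16 * 2**n / 2**30
        print(f"{n} qubits -> {gib:,.0f} GiB of state vector")

Around 40 qubits you are into tens of terabytes of state, which is roughly where even large classical machines stop being practical.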
Error Correction
Raw physical qubits are extremely unreliable - error rates of 10^-2 or worse. To make them useful, you need very powerful error correction, and there are masses of effort going into this right now. The problem is that, first, it takes a lot of physical qubits to create a single usefully-reliable logical qubit, and second, error correction is (relatively) slow. That matters, because for some technologies you only have microseconds before the whole thing turns to mush due to decoherence.
People talked about "logical qubits" without saying what they meant in terms of physical qubits per logical qubit. You can build a logical qubit with nine physical qubits, but it won't get you anywhere close to the reliability you need. Current estimates are that it will take 100-1000 physical qubits to make one useful logical qubit. So if your problem needs 100 logical qubits, you need at least 10,000 physical qubits. It's worse than that: to achieve an overall reliability of, say, 10^-2, the reliability you need from each logical qubit increases with the number of them. So the number of physical qubits you need grows with the square of the complexity of the problem.
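To make that arithmetic concrete, here is a back-of-envelope sketch. The code-distance formula, the threshold value and the 2*d^2 cost are textbook surface-code-style assumptions of mine, not numbers from the conference; only the 10^-2 physical error rate and the overall reliability target come from the discussion above.

    # Rough error-correction overhead: how many physical qubits for n reliable
    # logical qubits? Assumes a surface-code-like code where the logical error
    # rate is about (p_physical / p_threshold) ** ((d + 1) / 2) at code
    # distance d, and each logical qubit costs about 2 * d**2 physical qubits.
    def physical_qubits_needed(n_logical, p_physical=1e-2, p_threshold=1e-1,
                               target_total_error=1e-2):
        # Each logical qubit must be reliable enough that all of them together
        # stay inside the overall error budget.
        per_qubit_target = target_total_error / n_logical
        d = 3
        while (p_physical / p_threshold) ** ((d + 1) / 2) > per_qubit_target:
            d += 2  # code distance stays odd
        return d, n_logical * 2 * d * d

    for n in (10, 100, 1000):
        d, total = physical_qubits_needed(n)
        print(f"{n:>5} logical qubits -> distance {d}, ~{total:,} physical qubits")

With these made-up constants, 100 logical qubits lands close to the 10,000-physical figure above, and the per-logical overhead keeps creeping up as the problem grows.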
One company (Riverlane) has built an off-line error correction box, a 1U package using lots of FPGAs for speed. It sounds improbable, but they claim it works. You still have to program all the guard-bits in your quantum program, which is non-trivial, but they take care of the decoding.
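Riverlane didn't go into the internals, so to make "decoding" concrete here is a toy classical sketch of the simplest possible case - a three-bit repetition code, where the decoder reads two parity checks (the "syndrome") and decides which bit to flip. Real quantum decoders work on far larger codes and have to keep up with microsecond correction cycles, which is why they end up on FPGAs; none of this code is theirs.

    import random

    # Toy decoder for a 3-bit repetition code protecting one logical bit
    # against independent bit-flips. Purely illustrative.
    def encode(bit):
        return [bit, bit, bit]

    def noisy(codeword, p=0.1):
        # Flip each physical bit independently with probability p.
        return [b ^ (random.random() < p) for b in codeword]

    def decode(codeword):
        # Syndrome = the two parity checks; they point at the flipped bit.
        s1 = codeword[0] ^ codeword[1]
        s2 = codeword[1] ^ codeword[2]
        if s1 and not s2:
            codeword[0] ^= 1   # first bit flipped
        elif s1 and s2:
            codeword[1] ^= 1   # middle bit flipped
        elif s2 and not s1:
            codeword[2] ^= 1   # last bit flipped
        return codeword[0]

    trials = 100_000
    failures = sum(decode(noisy(encode(0))) != 0 for _ in range(trials))
    print(f"logical error rate ~ {failures / trials:.4f} (physical rate was 0.1)")

Even this trivial code pushes the error rate down from 10^-1 to about 3x10^-2; getting to genuinely useful reliability is what drives the large overheads above.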
How - Quantum Technologies
There are several competing raw physics approaches to building a quantum computer: superconducting, neutral atom, photonic, topological, trapped ion, quantum dot. There is a good survey article here. They have all been made to work at "toy" scale, and they all have $Bs of investment and serious companies (MS, Google, IBM, ...) behind them. They all have serious drawbacks, and none has been made to work at useful scale.
Clearly there will eventually be a winner, maybe two. I think it's fair to say that at this point nobody (who doesn't have a dog in the race) has the faintest idea which. It's not even certain that ANY of them can be made to work at scale, though given the $$$ and sheer number of quantum physics PhDs being thrown at them, something will probably emerge.
Crypto
THE talked-about QC application is cryptography, or rather breaking it. The nirvana is (relatively) rapid factorization of 2048-bit RSA public keys. It's still a long way off. Right now the estimate is that it will take about 10 million physical qubits. Assuming Q-Moore's Law, that is 15 years away. Supposing improvements in algorithms and error correction reduce that by a factor of 10, call it 10 years. Still a long way off. (And "rapid" means several hours, not seconds or microseconds.)
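The timeline is just doubling arithmetic. In the sketch below, the 10-million-qubit target and the factor-of-10 improvement are the figures above; the assumption that today's largest machines have around a thousand physical qubits is mine.

    import math

    # Extrapolating "Q-Moore's Law" (physical qubit counts doubling each year).
    # Starting point of ~1,000 physical qubits today is an assumption; the
    # targets are the estimates quoted above. Rough arithmetic, not a forecast.
    qubits_today = 1_000
    targets = [(10_000_000, "break RSA-2048, current estimate"),
               (1_000_000, "same, if algorithms/error correction improve 10x")]
    for target, label in targets:
        years = math.log2(target / qubits_today)
        print(f"{label}: ~{years:.0f} years of doubling")

That lands in the same ballpark as the 10-15 years above; move the starting point around and the answer only shifts by a year or two, since everything is logarithmic.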
Even so, there has been a lot of work on "post-quantum cryptography" (PQC), i.e. cryptography which is resilient to attack using quantum techniques. NIST in the US has blessed several techniques. My guess is that the world will move fairly rapidly to using these, say within the next 3 years. That matters because for really critical stuff (like defence designs), state actors are already storing encrypted data hoping they'll be able to decode it "one day".
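One common migration pattern (not something spelled out at the conference) is "hybrid" key establishment: derive the session key from both a classical key exchange and a post-quantum KEM, so you stay safe as long as either holds up. A minimal sketch in Python, using the cryptography package for the classical half; the post-quantum shared secret is faked with random bytes because I'm not assuming any particular PQC library - in practice it would come from one of the NIST-standardised KEMs.

    import os
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    # Classical half: an ordinary X25519 key exchange.
    alice = X25519PrivateKey.generate()
    bob = X25519PrivateKey.generate()
    classical_secret = alice.exchange(bob.public_key())

    # Post-quantum half: placeholder bytes standing in for a shared secret
    # from a NIST-approved KEM (no PQC library assumed here).
    pq_secret = os.urandom(32)

    # Combine both secrets so the session key survives if either scheme does.
    session_key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                       info=b"hybrid classical+pq session key"
                       ).derive(classical_secret + pq_secret)
    print(session_key.hex())

The point of the hybrid construction is exactly the "store now, decrypt later" threat: even if the classical half is broken in 10-15 years, the recorded traffic stays opaque.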
QC and AI
There were several sessions claiming to talk about this, but all the ones I attended were content-free. Everyone agrees that QC will be part of the AI toolbox, but beyond that nobody had much to say.
Deployment
There were several companies proposing different kinds of "make QC usable for you" services and techniques. I'm 100% sure that this is the only way any "normal" software engineers will ever be able to use it, so this will be a big market. One day.
Non-Computing Stuff - Quantum Navigation
Several sessions were about quantum-based navigation, i.e. getting GPS-like benefits without needing external radio signals. This is a huge deal because there are large parts of the world where GPS is unusable due to jamming and spoofing. There are two techniques, both dependent on fairly miraculous quantum technology. One is quantum INS - building accelerometers and gyroscopes that are orders of magnitude more stable than you get with classical techniques. The other is quantum gravimetry - measuring the local strength of the earth's gravitational field. Using that, you can relate your position to a gravity map (which are available) and get a position accurate to within a few hundred metres.
The technology of quantum gravimetry is really mind-boggling, but a bit long to fit in the margin here. Basically it involves measuring the difference in position between two superposed quantum states of the same atom. If that seems reasonable to you, you have your head around quantum stuff a lot better than I do.
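The map-matching half of it, at least, is ordinary signal processing. Here is a toy sketch of the idea: slide a short profile of measured gravity anomalies over a stored map and see where it fits best. The map values, noise levels and cell size are all invented, and a real system fuses this with the inertial side rather than doing a brute-force search.

    import numpy as np

    # Toy gravimetric map-matching: carry a map of local gravity anomalies,
    # measure the anomaly along your path, and find where that profile fits
    # on the map. All numbers here are made up for illustration.
    rng = np.random.default_rng(1)
    map_grid = rng.normal(0.0, 30.0, size=(200, 200))  # fictional anomaly map
    cell_size_m = 200                                  # assumed map resolution

    # Vehicle travels east along row 120, starting at column 50, taking 10 readings.
    true_row, true_col, track_len = 120, 50, 10
    readings = (map_grid[true_row, true_col:true_col + track_len]
                + rng.normal(0.0, 2.0, track_len))     # noisy gravimeter readings

    # Slide the measured profile over every east-west track on the map.
    best_err, best_pos = np.inf, None
    for r in range(map_grid.shape[0]):
        for c in range(map_grid.shape[1] - track_len):
            err = np.sum((map_grid[r, c:c + track_len] - readings) ** 2)
            if err < best_err:
                best_err, best_pos = err, (r, c)

    offset_m = cell_size_m * np.hypot(best_pos[0] - true_row, best_pos[1] - true_col)
    print(f"estimated start {best_pos}, true ({true_row}, {true_col}), error ~{offset_m:.0f} m")

A single reading is ambiguous; it is matching a whole profile of readings against the map that pins the position down, which is why the quoted accuracy is a few hundred metres rather than "one map cell".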
2 comments:
I am reminded of the gestation of semiconductor memory. Philips Redhill was one of the first, if not the first, to build it. They were very concerned about building dies in which 100% of the bits were correct and working, so they spent a long time on error detection and correction. Meanwhile they massively missed the market to a pioneering US semiconductor firm that focused on getting the process right and so produced chips in which all the bits worked as advertised.
I wonder which approach will take the market with qubits.
The nav idea is interesting and I can see the concept but did they explain how they detected the positions?
The interesting bit about QC is that it is a threat only to the "session setup / key exchange" part, e.g. RSA/EC. It seems not to be a threat to the subsequent data encryption (in X509), e.g. AES-256. So by avoiding the whole PK crypto stuff you sidestep the risk - at the cost of using a shared-key system, which has other problems.