What forces decoherence and classical behaviour is not an entangled system getting "large" in the sense that once you have x+1 qubits, then no matter how you cool or arrange them it will never work because you've hit some kind of quantum-physics speed limit.
Macroscopic quantum phenomena have been observed in systems like superconductors and superfluids, which shows that collective quantum behaviour can be maintained at large scale. Basically, what we're talking about is a sort of thermodynamic robustness requirement that QEC is designed to meet, not a failure of quantum mechanics at scale. (It comes down to engineering again, not any sort of natural limit like the mass density something can accrue before it collapses into a black hole.)
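For a feel of what that robustness requirement means, here's a toy sketch (my own illustration, not anything from the post): a classical repetition code with majority voting. Below a threshold error rate, adding redundancy suppresses the logical error rate; above it, redundancy makes things worse. Real QEC like the surface code is far more involved, but that threshold behaviour is the trade it's designed to win.

```python
# Toy illustration of the robustness QEC is after: a classical repetition code
# with majority voting. For this toy code the threshold is p = 1/2; below it,
# more redundancy (larger distance) drives the logical error rate down.
import random

def logical_error_rate(p: float, distance: int, trials: int = 100_000) -> float:
    """Estimate how often majority voting over `distance` noisy copies fails."""
    failures = 0
    for _ in range(trials):
        flips = sum(random.random() < p for _ in range(distance))
        if flips > distance // 2:   # majority of copies corrupted
            failures += 1
    return failures / trials

for p in (0.01, 0.1, 0.4):
    rates = [f"d={d}: {logical_error_rate(p, d):.5f}" for d in (1, 3, 7, 15)]
    print(f"physical error {p}: " + ", ".join(rates))
```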
The solution is to break things into multiple smaller, independent quantum processing units. Each has a manageable number of qubits, say 50-100 (we've already done 50 logical qubits, in all likelihood). These are easier to isolate and cool and, as a bonus, each can run its own localized error correction.
Modules communicate with each other not by some kind of ET-phone-home deal, but by generating entanglement between a communication qubit in module A and one in module B (and some other ways too).
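A bare-bones state-vector sketch of that inter-module link (my own illustration; the module names and layout are made up): entangle a communication qubit in module A with one in module B, then check that their measurement outcomes are perfectly correlated.

```python
# Create a Bell pair between module A's and module B's communication qubits
# and sample joint measurements. This is a plain numpy simulation, not any
# particular hardware or SDK.
import numpy as np

rng = np.random.default_rng()

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
I2 = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                   # control = module A's qubit

# Two communication qubits, one per module, both starting in |0>.
state = np.zeros(4)
state[0] = 1.0

# Generate the Bell pair (|00> + |11>)/sqrt(2): H on module A's qubit, then CNOT.
state = np.kron(H, I2) @ state
state = CNOT @ state

# Sample joint measurements in the computational basis.
probs = np.abs(state) ** 2
for outcome in rng.choice(4, size=10, p=probs):
    a, b = divmod(outcome, 2)
    print(f"module A qubit: {a}, module B qubit: {b}")   # always equal
```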
There's a lot more to it, and there are other avenues being explored that do away with qubits as we know them altogether. Basically, right now there are 50 or 60 different areas in any one of which an unexpected (though not at all implausible) breakthrough would change everything. That's kind of where we are. Like AI pre-AlphaGo.
Replies (1)
Modularity doesn’t matter.
Shor needs one giant, non-equilibrium, actively corrected, constantly measured quantum state, not a bunch of small ones stitched together. It's not MapReduce (see the sketch below).
Superconductivity and superfluidity are passive, symmetric, equilibrium ground states.
They are not that kind of macroscopic quantum system.
Engineering doesn’t get to invent new physics.
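A minimal numpy check of the "one global state" point (my own illustration, not from either comment): even the two-qubit quantum Fourier transform at the heart of Shor's period finding cannot be written as a tensor product of independent single-qubit operations, i.e. its operator-Schmidt rank is greater than 1.

```python
# Check whether a two-qubit unitary factors into independent single-qubit
# operations, using the operator-Schmidt (realignment) decomposition.
import numpy as np

def qft(n_qubits: int) -> np.ndarray:
    """Return the QFT unitary on n_qubits (the normalized DFT matrix)."""
    dim = 2 ** n_qubits
    omega = np.exp(2j * np.pi / dim)
    j, k = np.meshgrid(np.arange(dim), np.arange(dim))
    return omega ** (j * k) / np.sqrt(dim)

def operator_schmidt_rank(u: np.ndarray) -> int:
    """Schmidt rank of a 4x4 operator on qubit A (x) qubit B.

    Rank 1 would mean u = A (x) B for some single-qubit operators A and B.
    """
    # Realign the 4x4 matrix so its singular values are the
    # operator-Schmidt coefficients.
    m = u.reshape(2, 2, 2, 2).transpose(0, 2, 1, 3).reshape(4, 4)
    singular_values = np.linalg.svd(m, compute_uv=False)
    return int(np.sum(singular_values > 1e-10))

print(operator_schmidt_rank(qft(2)))                      # > 1: not a product of local ops
print(operator_schmidt_rank(np.kron(np.eye(2), qft(1))))  # 1: a genuinely local operation
```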