"The Coherent Penalty"

Quantum error correction theory is built on stochastic noise models. Errors are Pauli operators — X, Y, Z — applied randomly to qubits with some probability. The decoder identifies which errors occurred and corrects them. The error threshold — the noise rate below which logical error rates decrease with code size — is computed under these stochastic assumptions.

Real quantum computers don't produce stochastic errors. They produce coherent errors.

The distinction matters enormously. Coherent errors — small unitary rotations, systematic miscalibrations, crosstalk — don't randomly flip qubits. They rotate quantum states by small, correlated angles. Because these rotations add in amplitude rather than in probability, they can interfere constructively across qubits and time steps, and the accumulated amplitude is squared only once, at measurement. When mapped to an equivalent stochastic model, the effective error rate is therefore higher than the raw rotation angle would suggest: a coherent error with amplitude ε can produce logical error rates comparable to a stochastic error with probability ε, not ε². The squaring that makes small errors negligible in the stochastic picture vanishes.
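The amplitude-versus-probability distinction shows up already in a single-qubit toy calculation — a minimal sketch with NumPy, not the paper's simulation method. Repeat a small over-rotation by angle θ n times: coherently, the rotations compose and the flip probability grows like (nθ/2)²; under the Pauli-twirled (stochastic) version of the same error, each step flips independently with p = sin²(θ/2) and the flip probability grows only like np.

```python
import numpy as np

# Toy comparison (illustrative only): coherent accumulation of a small
# X over-rotation versus its Pauli-twirled stochastic counterpart.

theta = 0.01  # small over-rotation angle per step, in radians


def rx(angle):
    """Single-qubit X-rotation matrix exp(-i * angle * X / 2)."""
    c, s = np.cos(angle / 2), np.sin(angle / 2)
    return np.array([[c, -1j * s], [-1j * s, c]])


def coherent_flip_prob(n, angle):
    """Probability of measuring |1> after n coherent over-rotations of |0>.

    The rotations compose into a single rotation by n*angle, so the
    amplitudes add before being squared: prob = sin^2(n*angle/2).
    """
    psi = np.linalg.matrix_power(rx(angle), n) @ np.array([1.0, 0.0])
    return abs(psi[1]) ** 2


def stochastic_flip_prob(n, angle):
    """Flip probability after n independent X flips, each occurring with
    the twirled probability p = sin^2(angle/2).

    The net state is flipped iff an odd number of flips occurred."""
    p = np.sin(angle / 2) ** 2
    return 0.5 * (1 - (1 - 2 * p) ** n)


for n in (10, 100, 1000):
    print(f"n={n:5d}  coherent={coherent_flip_prob(n, theta):.4f}  "
          f"stochastic={stochastic_flip_prob(n, theta):.6f}")
```

At n = 100 steps the coherent flip probability is already roughly two orders of magnitude above the stochastic one: the coherent error behaves like an effective rate ~nθ squared once, while the stochastic model squares θ first and only then accumulates linearly.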

The practical consequence: coherent errors can shift fault-tolerance thresholds, increase the space-time cost of magic state cultivation, and boost logical error rates tenfold compared to equivalent stochastic errors. The same physical noise rate that sits comfortably below threshold in a Pauli simulation can sit above threshold when coherence is included. The safety margin shrinks or disappears.

The authors develop an efficient simulation technique — mapping arbitrary Markovian errors to detector error models — that makes these assessments computationally tractable. The tool exists. The question is whether the quantum computing community incorporates it into hardware benchmarks or continues validating against the stochastic idealization.

The error model is not just a simplification. It is a load-bearing assumption in every threshold calculation. When the assumption is wrong, the threshold is wrong — and the hardware that appeared sufficient becomes insufficient without any physical change.