Regular CPUs don't have error correction. The logic gates simply don't make mistakes, even after trillions of operations. The reason is that the gates are digital. Manufacturing imperfections and noise don't matter, as long as the signal levels stay within their bounds.
On the other hand, current quantum computers are analog. Each quantum gate will get the coefficients of its output states a little bit wrong, and after a few tens of operations only noise is left in the qubits. In principle, quantum error correction could be used to measure and compensate for these errors. But none of the technologies demonstrated so far are anywhere near good enough for this.
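The "after a few tens of operations only noise is left" point can be illustrated with a crude toy model, assuming (purely for illustration, the numbers are made up) that each gate multiplies the state's fidelity by a fixed per-gate fidelity:

```python
# Toy model: error compounding in an uncorrected quantum (or analog) computer.
# The per-gate fidelity figures below are illustrative, not measured values.

def remaining_fidelity(per_gate_fidelity: float, n_gates: int) -> float:
    """Crude model: total fidelity decays geometrically with circuit depth."""
    return per_gate_fidelity ** n_gates

# Even a seemingly good gate (99.9% fidelity) degrades noticeably by 50 gates:
f50 = remaining_fidelity(0.999, 50)   # roughly 0.95
# At 99% per-gate fidelity, a 70-gate circuit is already mostly noise:
f70 = remaining_fidelity(0.99, 70)    # below 0.5
```

The model ignores the structure of quantum errors entirely; it only shows why multiplicative per-gate error puts a hard ceiling on circuit depth without correction.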
> The logic gates simply don't make mistakes, even after trillions of operations.
The error rates are not zero. Low, but not zero. Trillions of operations is what, a split-second of runtime on a five-year-old GPU?
"High-reliability" systems invariably use some form of error correction or error detection. You can do this at different levels of abstraction. At a low level, you can build redundant gates. At a company I used to work for, this was a product we sold--it would synthesize ICs from an HDL and incorporate error correction. (This particular feature forced the company to get ITAR export licenses.) At a different company I worked for, we did our error correction at a high level using software. We encountered hardware errors on a regular basis. I'm not even talking about ECC--I'm talking about CPU errors.
The only reason you can think imperfections and noise don't matter is that there isn't much noise in your environment and you aren't dealing with enough data to notice any errors.
Deal with a large enough amount of data, enough CPUs, enough RAM, and errors become a certainty.
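This is just the arithmetic of scale. A sketch, with deliberately made-up rates, of how a negligible per-operation error probability becomes near-certain across a fleet:

```python
import math

def p_at_least_one_error(per_op_error_rate: float, n_ops: float) -> float:
    """P(at least one error) = 1 - (1 - p)^n, computed stably for tiny p
    via expm1/log1p."""
    return -math.expm1(n_ops * math.log1p(-per_op_error_rate))

# Illustrative (not measured) numbers: a 1e-15 per-operation error rate
# is invisible in a single operation...
p_single = p_at_least_one_error(1e-15, 1)
# ...but across 1e17 operations (lots of machines, lots of hours),
# at least one error is essentially guaranteed:
p_fleet = p_at_least_one_error(1e-15, 1e17)
```

The `expm1`/`log1p` dance matters here: naively computing `1 - (1 - 1e-15) ** n` would lose the answer to floating-point rounding for small `n`.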
The "quantum computers are analog" line is at best profoundly misleading. If your definition of "analog" extends to quantum computers, then I'd say that digital computers are also analog. Which is not an incorrect thing to say.
Sure, if your CPU is controlling an airplane, it had better be redundant. Still, the CPUs inside your laptop, smartphone, or even a compute server do their jobs pretty well without any error correction.
The transistors inside a digital CPU are of course analog devices. However, the digital interpretation of the logic levels means that analog errors do not propagate between gates. If the voltage fluctuations are small enough, the probability of triggering a logic gate incorrectly becomes astronomically small. Analog computers (such as all current quantum computers) do not have this feature. Each operation adds a little bit of error to the quantity that is being processed. This could be the voltage in an electronic analog computer, or the state coefficients in a quantum computer.
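"Astronomically small" can be made concrete. Assuming (as a standard back-of-the-envelope model, not data for any real process) Gaussian voltage noise and a fixed noise margin, the per-operation flip probability is just a Gaussian tail:

```python
import math

def gate_flip_probability(noise_margin_volts: float,
                          noise_sigma_volts: float) -> float:
    """Probability that zero-mean Gaussian noise exceeds the noise margin:
    P(X > m) = 0.5 * erfc(m / (sigma * sqrt(2)))."""
    return 0.5 * math.erfc(noise_margin_volts / (noise_sigma_volts * math.sqrt(2)))

# Illustrative numbers: a 0.4 V noise margin against 25 mV RMS noise
# puts the threshold 16 sigma away -- the tail probability is ~1e-57,
# i.e. effectively zero over any conceivable number of operations.
p = gate_flip_probability(0.4, 0.025)
```

The point of the sketch: the error probability falls off as exp(-m²/2σ²), so a modest ratio of margin to noise buys an absurd number of nines. An analog computer has no such threshold to hide behind; every bit of noise lands directly in the computed quantity.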
> Each operation adds a little bit of error to the quantity that is being processed. This could be the voltage in an electronic analog computer, or the state coefficients in a quantum computer.
This brings us back to the quantum threshold theorem, which says that it's possible to design a quantum computer so that you can perform as many operations as you like without accumulating additional error, as long as the error rate of the individual gates stays below a fixed threshold.
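The threshold behavior can be sketched with the scaling model commonly quoted for surface codes, p_L ≈ A·(p/p_th)^((d+1)/2); the threshold and prefactor below are illustrative placeholders, not measured values for any device:

```python
def logical_error_rate(p_physical: float, p_threshold: float = 1e-2,
                       distance: int = 3, prefactor: float = 0.1) -> float:
    """Toy surface-code scaling: p_L ~ A * (p / p_th) ** ((d + 1) / 2).
    All parameters are illustrative."""
    return prefactor * (p_physical / p_threshold) ** ((distance + 1) / 2)

# Below threshold (p = 1e-3 < p_th = 1e-2), growing the code distance
# suppresses the logical error rate exponentially:
below = [logical_error_rate(1e-3, distance=d) for d in (3, 5, 7)]
# Above threshold (p = 3e-2 > p_th), adding more qubits makes it worse:
above = [logical_error_rate(3e-2, distance=d) for d in (3, 5, 7)]
```

This is exactly why "below threshold" is the milestone hardware groups chase: on the wrong side of it, error correction actively hurts.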
This is one of the reasons why thinking of a quantum computer as "analog" will lead you astray. Quantum computers are more like classical digital computers than like analog computers.
They do. They have physical error correction in the form of a majority vote: whichever way the majority of electrons goes is what the digital signal will be. And the total number of electrons is large enough that you get maybe a handful of errors in years of operation, if that.
SEUs are more of a problem in space. CPUs used for space missions mitigate the errors in a few different ways. In the past, the feature size on the CPU substrate was large enough to absorb most cosmic radiation without incident. As the feature sizes have shrunk and the old foundries are no longer producing the old chips, many missions have adopted "multiple voting" architectures. Satellites often use "triple voted" processors. The Space Shuttle had a system with five votes.
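The voting scheme itself is trivial; here is a minimal sketch of triple modular redundancy (TMR) at the level of replicated outputs, which masks any single faulty replica:

```python
from collections import Counter

def tmr_vote(a: int, b: int, c: int) -> int:
    """Triple modular redundancy: return the majority value of three
    replicated computations. A single faulty replica is outvoted."""
    return Counter((a, b, c)).most_common(1)[0][0]

# One corrupted replica does not change the result:
result = tmr_vote(1, 1, 0)
```

In real flight hardware the voting happens in dedicated logic (and the voter itself becomes the single point of failure to harden), but the principle is the same.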
I'm not a quantum computing expert, but I wonder if a SEU would even matter in some parts of a quantum system. The bits are in an undefined state during operation anyway.
Not my area, but you hear about bit flips due to cosmic rays, and I've heard of mainframes that run everything on two CPUs as a form of error correction. I think your point stands; just pointing out that there are occasional errors in CPUs.