Okay
@TENET, I was trying to keep it really simple, but you're probably gonna push me past my own understanding if I follow you as far as you took it lol.
Yes, it's an abstraction, but it also reflects a fundamental physical limitation. A low state can be 0.2 V because you're dealing with large numbers of electrons in each bit of the system. Transistors keep shrinking (Moore's Law and all), but at some point you reach fundamental limits where your transistors are at atomic scale. Eventually you hit an absolute limit because there is a discrete number of free (semi-free?) electrons in the semiconductor, and they can only be measured in a fixed number of states. That is a hard limit on how much information can be held. (In fact, the hard limit may come earlier than counting individual electrons, due to fundamental limitations of semiconductor materials and quantum interactions between the electrons.)
But there's another level to the limitation too. Even if your transistor holds today's large number of electrons and can represent a whole spectrum of voltages, you still only have one bit of information: that voltage is a single discrete value. It's one piece of information and can't be more than that. If you combine it with another piece of information, you've still only combined two pieces of information.
Yes, I agree that classical computation architectures/models are limited to having singular deterministic values at their core, and quantum computers are not.
A couple of side points. There is a belief out there that stacking ("3D") of silicon circuits is a way to get around the quantum-effects limitation (which causes tunnelling, for example). Also, I think the Apple M1 is 5 nm, which in itself was thought unfeasible not too long ago.
"It is the first
personal computer chip built using a
5 nm process"
Apple M1 - Wikipedia
The number of states per "bit" is more of an architecture and computing-model question than an engineering one, from a computational point of view. I graduated a while ago so I don't know how they split those formally. So "classical" computing (as in your typical PC/server) is von Neumann architecture with a Turing machine / lambda calculus model of computation. The quantum stuff must have a new architecture, and I suppose the model of computation stays the same... somewhat.
Binary was a good choice with old circuits as it gives the maximum possible room for voltage overrun. If you had tried to run old computers on decimal it would not have worked, because the stored values were not accurate enough to distinguish ten levels reliably. So now in 2020 computing is stuck on base 2. It just happens that the most efficient number encoding system (by radix economy) is base e, but I imagine that would be too awkward for most humans to work in, so base 2 it is.
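For anyone curious what "most efficient" means there, it's the radix-economy argument: cost ≈ (distinct symbols per digit) × (digit positions needed). A rough toy sketch in Python (my own throwaway function, nothing formal) comparing bases:

```python
import math

def radix_economy(base: float, max_value: int) -> float:
    """Approximate cost of representing values up to max_value in a given base:
    (symbols per digit position) * (digit positions needed)."""
    return base * math.ceil(math.log(max_value + 1, base))

N = 10**6
for b in (2, 3, math.e, 10):
    print(f"base {b:>6.3f}: economy = {radix_economy(b, N):.1f}")
```

Running it shows base e (and its nearest integer, 3) edging out base 2, with base 10 well behind, which is the usual argument for why e is "optimal" even though nobody builds hardware around it.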
That's completely fair - I was trying to explain it too simplistically. Another way of saying it is that a qubit is not a number; a better visual metaphor is a three-dimensional map of unlimited complexity. With qubits you're not combining two numbers to get a third number, you're combining two waveforms that each carry far more than a single bit and interact in a quantum state that does not just become a "1" or "0" but rather its own incredibly complex result.
I view it like this: if you have N qubits then you have 2^N possible simultaneous values.
see: "For a system of
n components, a complete description of its state in classical physics requires only
n bits, whereas in quantum physics it requires 2^
n complex numbers.
[3]"
Qubit - Wikipedia
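To make that 2^n bookkeeping concrete, here's a toy sketch in plain NumPy (no quantum hardware or quantum library involved, just simulating the state vector): three qubits need 2^3 = 8 complex amplitudes, and applying a Hadamard to each qubit puts weight on all eight basis states at once.

```python
import numpy as np

n = 3                                    # number of qubits
state = np.zeros(2**n, dtype=complex)    # 2^n complex amplitudes
state[0] = 1.0                           # start in |000>

# Apply a Hadamard to every qubit: the single state |000> becomes an
# equal superposition over all 2^n basis states.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
for qubit in range(n):
    op = np.array([[1]], dtype=complex)
    for k in range(n):
        op = np.kron(op, H if k == qubit else np.eye(2, dtype=complex))
    state = op @ state

print(len(state))   # 8 complex numbers describe just 3 qubits
print(state)        # each amplitude is 1/sqrt(8)
```

The point of the sketch is only the bookkeeping: describing n qubits classically takes a vector that doubles in length with every qubit you add.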
So (in principle, because this is NOT literally how it works): in classical computing, if you want to find a 64-bit key you have to generate and try the keys one after another (trying them in parallel is equivalent in this sense, since they all consume computing resources, including time), whereas with a quantum computer all possible combinations exist in your 64 qubits at the same time and can, loosely speaking, be tried together.
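To hedge that heavily: the "all at once" picture is intuition, not the cost model. A classical exhaustive search looks like the toy below (a made-up 16-bit key and a hash check standing in for a real decryption test), trying up to 2^k candidates; Grover's algorithm would need on the order of sqrt(2^k) oracle queries, not one.

```python
import hashlib

# Toy stand-in for "does this key decrypt the message?" - in reality this
# would be an AES decryption plus validity check; here it's just a hash match.
SECRET_KEY = 0xBEEF
TARGET = hashlib.sha256(SECRET_KEY.to_bytes(2, "big")).hexdigest()

def is_correct(key: int) -> bool:
    return hashlib.sha256(key.to_bytes(2, "big")).hexdigest() == TARGET

# Classical exhaustive search: worst case tries all 2^16 candidates one by one.
# Grover would need roughly sqrt(2^16) = 256 oracle calls for the same search.
for key in range(2**16):
    if is_correct(key):
        print(f"found key: {key:#06x}")
        break
```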
Now... to go full circle, it's important to note that none of this relates directly to the OP at all. The breakthrough in the OP is not a computing breakthrough; it's a demonstration of the ability to maintain the fidelity of quantum states over distance, which as I pointed out before is generally pursued with the goal of creating unbreakable encryption. For quantum computing, I would guess at some point it may allow you to run quantum computations and transfer intermediate results over a network rather than simply running the operation within a single computer... but I'm not certain how feasible that is.
That is how I think they will be used initially, in a similar fashion to how jobs used to be run on mainframes: you package up your job and send it to the QC for processing. That's a common model in finance with pricing/simulation services, in research with supercomputers, and even Google/Apple used that model with their translation/AI-assistant services.
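As a purely hypothetical sketch of that batch-job model (the service URL, endpoints, and payload format below are invented for illustration, not any real provider's API):

```python
import json
import time
import requests  # the service and payload shape below are made up

QC_SERVICE = "https://qc.example.com/api/v1"   # hypothetical quantum "mainframe"

# Package up the job: a circuit described as a list of gate instructions.
job = {
    "circuit": [
        {"gate": "h", "target": 0},
        {"gate": "cx", "control": 0, "target": 1},
        {"gate": "measure", "targets": [0, 1]},
    ],
    "shots": 1024,
}

# Submit, then poll for results - the same submit-and-wait model as old
# mainframe job queues.
resp = requests.post(f"{QC_SERVICE}/jobs", data=json.dumps(job),
                     headers={"Content-Type": "application/json"})
job_id = resp.json()["id"]

while True:
    status = requests.get(f"{QC_SERVICE}/jobs/{job_id}").json()
    if status["state"] in ("done", "error"):
        break
    time.sleep(5)

print(status.get("results"))
```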
-
Back in the early days of modern computer programming you had to know a lot more about the physical machine. Early quantum programming, in a similar way, is going to be far more complex at first. It's a cross-disciplinary field spanning physicists, engineers, mathematicians, and computer scientists, and for at least three of those it sits at the advanced boundary of the discipline.