By today’s standards, the supercomputer is one of our greatest technological feats, with processing power sometimes compared, loosely, to the complexity of the human brain. But even these machines won’t stand up to what quantum computing could bring to the table.
Conventional computers use transistors that operate on binary code, a series of ones and zeros, to make sense of data. A quantum computer would instead operate on qubits (quantum bits), which make direct use of quantum-mechanical phenomena such as entanglement and superposition that arise at the atomic scale (hence the name). The standard theoretical model is the quantum Turing machine, in which a qubit can exist in a superposition of more than one state at once, each with an associated probability. A pair of qubits can exist in a superposition of 4 states, 3 qubits in 8, 4 in 16, and in general n qubits in 2^n states at once.
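To see that doubling in action, here is a minimal sketch in plain NumPy (a classical simulation, not real quantum hardware; the name `plus` is just an illustrative label) that tensors together qubits in equal superposition and prints how the number of amplitudes grows:

```python
import numpy as np

# One qubit in an equal superposition of |0> and |1>: two amplitudes.
plus = np.array([1.0, 1.0]) / np.sqrt(2.0)

state = plus
for n in range(1, 4):
    print(f"{n} qubit(s) -> {state.size} amplitudes")
    state = np.kron(state, plus)  # tensor on one more superposed qubit
```

Running this prints 2, 4, and 8 amplitudes for 1, 2, and 3 qubits; every added qubit doubles the size of the state a classical simulator has to track, which is exactly why simulating large quantum systems classically becomes intractable.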
Yuri Manin proposed the theory of quantum computing in 1980, and Paul Benioff is credited with first applying quantum theory to a model of computation around the same time. Today, quantum computing is still in its infancy.
One of the major questions in the field is whether quantum particles really behave according to the current rules of quantum mechanics. This is often tested with a boson sampler, an experiment that sends photons through an optical network and checks whether the pattern of outcomes matches what quantum theory predicts. It can get rather confusing, and scientists often refer to it as “spooky physics”, after Einstein’s famous description of entanglement as “spooky action at a distance”. However, it is crucial to our understanding of how entanglement works and how we can apply it to the logic of a computer.
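As a toy picture of the correlations entanglement produces, the short NumPy simulation below samples measurements of a Bell state, the simplest entangled pair of qubits. This is a classical simulation for illustration only, and the variable names are mine rather than any standard API:

```python
import numpy as np

rng = np.random.default_rng(0)

# Amplitudes of the Bell state (|00> + |11>)/sqrt(2), ordered |00>,|01>,|10>,|11>.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
probs = np.abs(bell) ** 2  # Born rule: measurement probabilities

# Sample ten joint measurements of the two qubits.
for outcome in rng.choice(4, size=10, p=probs):
    a, b = outcome >> 1, outcome & 1  # split the outcome into the two qubit bits
    print(f"qubit A = {a}, qubit B = {b}")
```

Each qubit on its own looks like a fair coin flip, yet the two results always agree (0,0 or 1,1): that perfect correlation with individually random outcomes is the “spooky” behavior the paragraph above describes.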
For decades, the number of transistors we can put on a microprocessor has grown exponentially, roughly doubling every two years as transistors shrink. The next step will be computers that use quantum mechanics to process data so quickly that problems which would take a current supercomputer months or years could be solved in seconds. What the future processing power of quantum computers can bring to the table may be a bit much for some to wrap their heads around, but it will be the next step in cracking some of science’s most demanding computations and will change our understanding of how we use technology.
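To put a rough number on that kind of speedup, here is a back-of-the-envelope comparison using Grover’s search algorithm, one well-known example of a quantum advantage (the article doesn’t name a specific algorithm, so this is purely illustrative): a classical brute-force search over N possibilities needs on the order of N tries, while Grover’s algorithm needs roughly the square root of N.

```python
import math

for bits in (20, 40, 60):
    n = 2 ** bits                # size of the search space
    classical = n                # worst-case classical queries: ~N
    quantum = math.isqrt(n)      # Grover iterations: ~sqrt(N)
    print(f"N = 2^{bits}: classical ~{classical:.2e} queries, quantum ~{quantum:.2e}")
```

For a search space of 2^60 items, that is the difference between about 10^18 classical queries and about 10^9 quantum ones, which is the gap between years of supercomputer time and something far more tractable.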