The word “quantum” gained currency in the late 20th century as a descriptor for something so significant that ordinary adjectives fall short. A “quantum leap,” for example, is a dramatic advancement (and also an early-’90s television series starring Scott Bakula).
At best, that is an imprecise (though entertaining) definition. When “quantum” is applied to “computing,” however, we are indeed entering an era of dramatic advancement.
Quantum computing is technology based on the principles of quantum theory, which explains the behavior of energy and matter at the atomic and subatomic levels. It relies on mind-bending quantum-mechanical phenomena such as superposition and entanglement.
Erwin Schrödinger’s famous 1930s thought experiment, involving a cat that was both dead and alive at the same time, was intended to highlight the apparent absurdity of superposition: the principle that a quantum system can exist in multiple states simultaneously until it is observed or measured. Today’s quantum computers contain dozens of qubits (quantum bits) that take advantage of that very principle. Each qubit exists in a superposition of zero and one (that is, it has non-zero probabilities of being measured as either) until it is measured. The ability to work with massive amounts of data and reach previously unattainable levels of computing efficiency is the tantalizing potential that qubits bring to quantum computing.
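To make the idea of superposition a little more concrete, here is a minimal sketch in plain Python (no quantum hardware or quantum libraries assumed) that represents a single qubit as two amplitudes and simulates repeated measurements; the amplitude values and function names are illustrative, not drawn from any particular quantum computer.

```python
import random

# A qubit's state can be written as two complex amplitudes (alpha, beta)
# for the basis states |0> and |1>. A measurement yields 0 with
# probability |alpha|^2 and 1 with probability |beta|^2.

def measure(alpha: complex, beta: complex) -> int:
    """Simulate one measurement of a qubit with amplitudes (alpha, beta)."""
    p0 = abs(alpha) ** 2          # probability of reading 0
    return 0 if random.random() < p0 else 1

# Equal superposition: |alpha|^2 = |beta|^2 = 0.5.
alpha = beta = 2 ** -0.5

counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure(alpha, beta)] += 1

print(counts)  # roughly {0: 5000, 1: 5000}
```

Each individual measurement collapses the qubit to a definite 0 or 1; only across many repetitions do the underlying probabilities of the superposition become visible.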