Definition Series: Quantum Computing

Continuing the Definition series started this month, today's post is quite an interesting and long one, and pretty close to my own background. It is on the subject of Quantum Computing.

Quantum Physics is the part of Physics that witnessed enormous success as a scientific theory in the 20th century. It is a somewhat odd scientific achievement, as it did not develop like a conventional scientific theory. Normally, theorists formulate a hypothesis and then test its validity through empirical enquiry and peer-review scrutiny; on top of this, theories are supposed to be falsifiable by further hypotheses and empirical tests. With Quantum Physics the process was somewhat turned around: it was an empirical achievement, and bafflement at the weird outcomes of experiments triggered the process of hypothesis testing and sophisticated mathematical formulation. The end result was nevertheless a theoretical edifice that has stood the test of time and all sorts of scrutiny to become the most successful scientific theory thus far.

The empirical bafflement and amazement of the very first scientists working on quantum physics led, not surprisingly, to widespread controversy as to the right interpretation of the findings. Furthermore, in order to build a proper framework for a scientific theory, as mentioned above, there has to be a way for the theory to be falsifiable; but quantum physics operates with particles at subatomic scale, which renders empirical scrutiny difficult and onerous. The best way to check this theory empirically is to observe the behavior of particles at very low temperatures, on the order of -273°C, and this makes many experiments impractical. The effects of this theory that began to be theorized as useful to Computer Science and IT were found at these temperature levels.

The long article on quantum computing is quite useful from the point of view of the application of Quantum Physics to modern computational issues, and the emergent research field of Quantum Computing is one of the most exciting and interesting in Computer Science and Information Technology. It is a long piece with many subsections, but I will highlight here the paragraphs that I find most relevant:

Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter on the quantum (atomic and subatomic) level. Development of a quantum computer, if practical, would mark a leap forward in computing capability far greater than that from the abacus to a modern-day supercomputer, with performance gains in the billion-fold realm and beyond. The quantum computer, following the laws of quantum physics, would gain enormous processing power through the ability to be in multiple states, and to perform tasks using all possible permutations simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and the Los Alamos National Laboratory.

The essential elements of quantum computing originated with Paul Benioff, working at Argonne National Laboratory, in 1981. He theorized a classical computer operating with some quantum mechanical principles. But it is generally accepted that David Deutsch of Oxford University provided the critical impetus for quantum computing research. In 1984, he was at a computation theory conference and began to wonder about the possibility of designing a computer based exclusively on quantum rules, then published his breakthrough paper a few months later. With this, the race began to exploit his ideas. However, before we delve into what he started, it is beneficial to have a look at the background of the quantum world.

Quantum Theory

Quantum theory’s development began in 1900 with a presentation by Max Planck to the German Physical Society, in which he introduced the idea that energy exists in individual units (which he called “quanta”), as does matter. Further developments by a number of scientists over the following thirty years led to the modern understanding of quantum theory.

The Essential Elements of Quantum Theory:

  • Energy, like matter, consists of discrete units, rather than existing solely as a continuous wave.
  • Elementary particles of both energy and matter, depending on the conditions, may behave like either particles or waves.
  • The movement of elementary particles is inherently random, and, thus, unpredictable.
  • The simultaneous measurement of two complementary values, such as the position and momentum of an elementary particle, is inescapably flawed; the more precisely one value is measured, the more flawed will be the measurement of the other value.

Further Developments of Quantum Theory

Niels Bohr proposed the Copenhagen interpretation of quantum theory, which asserts that a particle is whatever it is measured to be (for example, a wave or a particle) but that it cannot be assumed to have specific properties, or even to exist, until it is measured. In short, Bohr was saying that objective reality does not exist. This translates to a principle called superposition that claims that while we do not know what the state of any object is, it is actually in all possible states simultaneously, as long as we don’t look to check.

The second interpretation of quantum theory is the multiverse or many-worlds theory. It holds that as soon as a potential exists for any object to be in any state, the universe of that object transmutes into a series of parallel universes equal to the number of possible states in which the object can exist, with each universe containing a unique single possible state of that object. Furthermore, there is a mechanism for interaction between these universes that somehow permits all states to be accessible in some way and for all possible states to be affected in some manner. Stephen Hawking and the late Richard Feynman are among the scientists who have expressed a preference for the many-worlds theory. Whichever argument one chooses, the principle that, in some way, one particle can exist in numerous states opens up profound implications for computing.

A Comparison of Classical and Quantum Computing

Classical computing relies, at its most fundamental level, on principles expressed by Boolean algebra, usually operating with a 7-mode logic-gate principle, though it is possible to make do with only three modes (AND, NOT, and COPY). Data must be processed in an exclusive binary state at any point in time: either 0 (off/false) or 1 (on/true). These values are binary digits, or bits. The millions of transistors and capacitors at the heart of computers can only be in one state at any point. While the time that each transistor or capacitor needs to be in either the 0 or 1 state before switching is now measurable in billionths of a second, there is still a limit to how quickly these devices can be made to switch state. As we progress to smaller and faster circuits, we begin to reach the physical limits of materials and the threshold for the classical laws of physics to apply. Beyond this, the quantum world takes over, which opens a potential as great as the challenges it presents.
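As a small illustration of the three-mode point above, here is a sketch (in Python, function names mine) showing that AND, NOT and COPY suffice to derive the other familiar Boolean gates:

```python
# The minimal classical gate set mentioned above: AND, NOT and COPY
# are enough to build every other Boolean gate.
def AND(a, b):   # 1 only when both inputs are 1
    return a & b

def NOT(a):      # invert a single bit
    return a ^ 1

def COPY(a):     # fan-out: duplicate a bit onto two wires
    return a, a

# OR and XOR derived purely from the primitives (via De Morgan's laws):
def OR(a, b):
    return NOT(AND(NOT(a), NOT(b)))

def XOR(a, b):
    return AND(OR(a, b), NOT(AND(a, b)))
```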

The quantum computer, by contrast, can work with a two-mode logic gate: XOR and a mode we'll call QO1 (the ability to change 0 into a superposition of 0 and 1, a logic gate which cannot exist in classical computing). In a quantum computer, a number of elemental particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a quantum bit, or qubit; the nature and behavior of these particles form the basis of quantum computing. The two most relevant aspects of quantum physics are the principles of superposition and entanglement.
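The "QO1" mode described above corresponds, in standard quantum-computing notation, to the Hadamard gate. A minimal NumPy sketch of how it turns a definite 0 into an equal superposition:

```python
import numpy as np

# The Hadamard gate: maps |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)

ket0 = np.array([1.0, 0.0])   # a classical 0, written as a state vector
superposed = H @ ket0         # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement probabilities are the squared amplitudes: 50/50.
probs = np.abs(superposed) ** 2
```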


Think of a qubit as an electron in a magnetic field. The electron's spin may be either in alignment with the field, which is known as a spin-up state, or opposite to the field, which is known as a spin-down state. Changing the electron's spin from one state to another is achieved by using a pulse of energy, such as from a laser – let's say that we use 1 unit of laser energy. But what if we only use half a unit of laser energy and completely isolate the particle from all external influences? According to quantum law, the particle then enters a superposition of states, in which it behaves as if it were in both states simultaneously. Each qubit utilized could take a superposition of both 0 and 1. Thus, the number of computations that a quantum computer could undertake is 2^n, where n is the number of qubits used. A quantum computer of 500 qubits would have the potential to do 2^500 calculations in a single step. This is an awesome number: 2^500 is vastly more than the number of atoms in the known universe (this is true parallel processing; classical computers today, even so-called parallel processors, still only truly do one thing at a time: there are just two or more of them doing it). But how will these particles interact with each other? They would do so via quantum entanglement.
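The 2^n scaling can be made concrete with a small classical simulation (a toy with n = 10 qubits; 500 qubits, as in the text, is far beyond what any classical machine could hold):

```python
import numpy as np

n = 10  # toy number of qubits; the state vector grows as 2**n
H = np.array([[1, 1],
              [1, -1]]) / np.sqrt(2)
ket0 = np.array([1.0, 0.0])

# Build the n-qubit uniform superposition by tensoring H|0> with itself.
state = np.array([1.0])
for _ in range(n):
    state = np.kron(state, H @ ket0)

# The register now holds 2**n amplitudes simultaneously,
# each equal to 1/sqrt(2**n).
```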

Entanglement

Particles (such as photons, electrons, or qubits) that have interacted at some point retain a type of connection and can be entangled with each other in pairs, in a process known as correlation. Knowing the spin state of one entangled particle – up or down – allows one to know that the spin of its mate is in the opposite direction. Even more amazing is the knowledge that, due to the phenomenon of superposition, the measured particle has no single spin direction before being measured, but is simultaneously in both a spin-up and spin-down state. The spin state of the particle being measured is decided at the time of measurement and communicated to the correlated particle, which simultaneously assumes the opposite spin direction to that of the measured particle. This is a real phenomenon (Einstein called it "spooky action at a distance"), the mechanism of which cannot, as yet, be explained by any theory; it simply must be taken as given. Quantum entanglement allows qubits separated by incredible distances to interact with each other instantaneously (not limited to the speed of light). No matter how great the distance between the correlated particles, they will remain entangled as long as they are isolated.
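The opposite-spin correlation described above can be sketched with a classical simulation of the singlet Bell state: every joint measurement yields opposite results for the two particles (a toy state-vector simulation, not a claim about any particular hardware):

```python
import numpy as np

rng = np.random.default_rng(0)

# Singlet Bell state (|01> - |10>)/sqrt(2), written in the basis
# [00, 01, 10, 11]: the two spins are always measured opposite.
bell = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def measure_pair(state, rng):
    """Sample one joint measurement of the two entangled qubits."""
    probs = np.abs(state) ** 2          # Born rule: squared amplitudes
    outcome = rng.choice(4, p=probs)    # index into [00, 01, 10, 11]
    return outcome >> 1, outcome & 1    # (first qubit, second qubit)

# Sampled outcomes are always 01 or 10, never 00 or 11.
samples = [measure_pair(bell, rng) for _ in range(1000)]
```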

Taken together, quantum superposition and entanglement create an enormously enhanced computing power. Where a 2-bit register in an ordinary computer can store only one of four binary configurations (00, 01, 10, or 11) at any given time, a 2-qubit register in a quantum computer can store all four numbers simultaneously, because each qubit represents two values. If more qubits are added, the capacity expands exponentially.


The Problems – And Some Solutions


The above sounds promising, but there are tremendous obstacles still to be overcome. Some of the problems with quantum computing are as follows:

  • Interference – During the computation phase of a quantum calculation, the slightest disturbance in a quantum system (say a stray photon or wave of EM radiation) causes the quantum computation to collapse, a process known as decoherence. A quantum computer must be totally isolated from all external interference during the computation phase. Some success has been achieved with the use of qubits in intense magnetic fields, with the use of ions.
  • Error correction – Because truly isolating a quantum system has proven so difficult, error correction systems for quantum computations have been developed. Qubits are not digital bits of data, thus they cannot use conventional (and very effective) error correction, such as the triple-redundancy method. Given the nature of quantum computing, error correction is critical: even a single error in a calculation can cause the validity of the entire computation to collapse. There has been considerable progress in this area, with an error correction algorithm developed that utilizes 9 qubits (1 computational and 8 correctional). More recently, there was a breakthrough by IBM that makes do with a total of 5 qubits (1 computational and 4 correctional).
  • Output observance – Closely related to the above two, retrieving output data after a quantum calculation is complete risks corrupting the data. For a quantum computer with 500 qubits, we have a 1 in 2^500 chance of observing the right output if we simply measure it. Thus, what is needed is a method to ensure that, as soon as all calculations are made and the act of observation takes place, the observed value corresponds to the correct answer. This has been achieved by Grover with his database search algorithm, which relies on the special "wave" shape of the probability curve inherent in quantum computers and ensures that, once all calculations are done, the act of measurement will see the quantum state decohere into the correct answer.
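Grover's idea can be sketched with a small state-vector simulation: repeating the oracle and "inversion about the mean" steps roughly (π/4)·√N times concentrates the probability on the searched-for item, so a single measurement then returns the right answer with high probability. (Toy parameters mine; this is a classical simulation of the algorithm, not a quantum implementation.)

```python
import numpy as np

# Grover search over N = 2**n items, with one marked index.
n = 4
N = 2 ** n
marked = 11  # the index we are searching for (arbitrary choice)

state = np.full(N, 1 / np.sqrt(N))  # start in the uniform superposition

iterations = int(round(np.pi / 4 * np.sqrt(N)))  # ~ (pi/4) * sqrt(N)
for _ in range(iterations):
    state[marked] *= -1               # oracle: flip the marked amplitude
    state = 2 * state.mean() - state  # diffusion: inversion about the mean

# The probability mass is now concentrated on the marked item,
# so the most likely measurement outcome is the correct answer.
best = int(np.argmax(np.abs(state) ** 2))
```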

Despite the practical problems and challenges facing the full implementation of quantum computing, developments over the last three years or so give some hope that it is closer to reality than we might think. The broad range of applications, namely the promise of improvements in DNA modeling, complex materials-science analysis and the decryption of cryptographic keys via brute-force search, makes quantum computing a worthy effort to pursue. Whether it will be fully exploited and seized is still a lingering question, with a further layer of uncertainty beyond technological feasibility, entering also the realm of ethical and political balances to take into account, if for no other reason than that massive technological improvement normally comes with massive societal or sociological issues attached.

 Featured Image:  Quantum Computing Promises Fast Solutions to Huge Problems
