QUANTUM COMPUTING: Points for discussion / Test Material

In Canada, for the last 10 years, the main centre for quantum computing research has been the University of Waterloo.

The Institute for Quantum Computing is in the process of moving into the Mike & Ophelia Lazaridis Quantum-Nano Centre.

“Classical computing” is a highly developed system built on binary digits: zeros and ones.

“Moore’s law” is the observation that technological innovation causes computing power to double, on average, every two years.
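The doubling claim is easy to sanity-check with a few lines of code. (The starting figure and time span below are arbitrary illustrations, not measured data.)

```python
# Sketch of Moore's law: computing power doubles every two years.
def moores_law(power_now: float, years: int, doubling_period: float = 2.0) -> float:
    """Project computing power after `years`, doubling every `doubling_period` years."""
    return power_now * 2 ** (years / doubling_period)

# Over 20 years there are ten doublings, so power grows by 2**10 = 1024.
print(moores_law(1.0, 20))  # 1024.0
```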

By 2020-2030, processing circuits will have become so small — as small as atoms — that they will be regulated by the same physical imperative that governs atoms: the inability to get any smaller.

A laptop in 2025 will be vastly more powerful than any laptop today, but that will be the end of it. Classical computing will have reached a natural limit and will cease to evolve.

A quantum bit is known as a “qubit” and can represent a 0 and a 1 at the same time: it can be simultaneously off and on.

The uncertainty principle is critical to the nature of atomic and subatomic particles (electrons, photons, protons, neutrinos and other minuscule dots of nature): they exist in a haze of possibility rather than in any fixed state.

A microscopic particle can be both here and there at the same instant, which means it can also be in every intermediate location between here and there, all at the same time.

Even though an atom-sized classical bit would be extremely small, it would still be restricted to just two possible states — off and on. By contrast, an atom-sized qubit would possess a near infinity of states, simultaneously.
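The contrast between a two-state bit and a qubit can be sketched numerically. (This is a pencil-and-paper illustration of the standard textbook model, not a simulation of any real device: a qubit's state is a pair of complex amplitudes whose squared magnitudes sum to 1.)

```python
import math

# A classical bit allows only (1, 0) = "off" or (0, 1) = "on".
# A qubit allows any normalized pair of amplitudes (alpha, beta),
# e.g. an equal superposition of both values at once:
alpha = beta = 1 / math.sqrt(2)

# Squared magnitudes give the probability of measuring 0 or 1.
p0 = abs(alpha) ** 2
p1 = abs(beta) ** 2
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```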

By a common calculation, a quantum computer consisting of just 300 atoms would possess more sheer calculating power than a conventional machine harnessing every last atom in the universe.
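The arithmetic behind that claim is simple to verify: n qubits span 2**n basis states, and the observable universe is commonly estimated to hold on the order of 10**80 atoms (that estimate is itself an assumption, not a measured figure).

```python
# 300 qubits span 2**300 basis states; compare with the commonly cited
# estimate of ~10**80 atoms in the observable universe.
states_300_qubits = 2 ** 300   # roughly 2 x 10**90
atoms_in_universe = 10 ** 80

print(states_300_qubits > atoms_in_universe)  # True
```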

Such a machine could instantly perform difficult calculations — for example, the factoring of very large numbers — that the largest conventional computer would need billions of years to figure out, if it could figure them out at all.

Classical computers are not good at factoring large numbers, a weakness that has long been exploited by cryptographers to safeguard data. It is easy to multiply two prime numbers to produce a much larger number, but it is very difficult to run the same process in reverse and find the two prime divisors of a large number, a process called factoring.

The only way classical computers can address the challenge is by systematic trial and error: trying two numbers to see if they work, discarding them, trying two different numbers, and so on. This weakness in conventional computers is what secures your banking information on the Internet. Even with powerful computers, would-be hackers cannot expose the key (the two original prime numbers used to build the code that protects your data) except by trial and error, examining every possible combination of divisors one pair at a time, which could take forever.
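The asymmetry above can be sketched in a few lines (the primes below are tiny illustrations; real cryptographic keys use primes hundreds of digits long):

```python
# Multiplying two primes is one instruction; recovering them by trial
# division means scanning candidate divisors one at a time.
def trial_factor(n: int):
    """Find a prime divisor pair of n by brute-force search."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    return n, 1  # n itself is prime

p, q = 101, 113          # small example primes
n = p * q                # easy direction: one multiplication
print(trial_factor(n))   # (101, 113) -- hard direction: many divisions
```

Each extra digit in the primes multiplies the search work, which is why this stays infeasible for classical machines at cryptographic sizes.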

A quantum computer could crack such privacy barriers in an instant by testing every possible combination of divisors, not one by one but all at once, something no conventional computer can do. The right answer would reveal itself almost immediately. This is recognized as the potential “hydrogen bomb” of cyberwarfare.

Governments around the world are pouring huge amounts of money into quantum computing. Whoever is first to build a quantum machine would be able to crack just about every secret code in cyberspace. (Next James Bond or Mission Impossible theme...)

Quantum computers will ignite a revolution in a variety of fields other than cryptography: the development of new drugs, the creation of new superconducting materials, and a range of other innovations currently impossible to predict. (Think the singularity...)

Quantum computers do now exist, but they are extremely limited in complexity and capability, restricted to no more than a dozen qubits and able to perform only the simplest mathematics.

The highest number factored so far by a quantum device, using an algorithm developed in the 1990s by U.S. scientist Peter Shor, is 21.
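Shor's algorithm works by reducing factoring to finding the period of modular exponentiation; the quantum hardware finds that period exponentially faster. The idea can be re-enacted classically for N = 21 (this is an illustrative classical loop, not a quantum run):

```python
from math import gcd

def shor_style_factor(N: int, a: int):
    """Factor N by finding the period r of a**x mod N (classical re-enactment)."""
    assert gcd(a, N) == 1
    # Find the smallest r > 0 with a**r % N == 1 -- the step a
    # quantum computer performs exponentially faster.
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    if r % 2:
        return None  # odd period: retry with a different a
    y = pow(a, r // 2, N)
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_style_factor(21, 2))  # (7, 3)
```

For a = 2 and N = 21 the period is r = 6, so 2**3 = 8 gives the divisors gcd(7, 21) = 7 and gcd(9, 21) = 3.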

Lab facilities at IQC’s existing premises fabricate qubit chips from aluminum and niobium. The slivers of metal are about the size of a baby’s fingernail and are extremely difficult to produce. (It can take 1,000 tries to create a chip that works.)

Current 12-qubit chips can add two plus two but cannot add large numbers. The boundary is about 50 qubits, beyond which classical computing will be left behind.
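Rough arithmetic suggests why the boundary sits near 50 qubits: simulating n qubits classically requires storing 2**n complex amplitudes, and at an assumed 16 bytes per complex number the memory bill alone becomes impossible.

```python
# Memory needed to classically simulate an n-qubit state vector,
# assuming 16 bytes (two 64-bit floats) per complex amplitude.
n_qubits = 50
bytes_needed = (2 ** n_qubits) * 16

print(bytes_needed / 10 ** 15, "petabytes")  # roughly 18 petabytes
```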

The big obstacle to quantum computing is the inherent instability of the quantum universe, a territory where superposition — the ability of particles to occupy multiple locations at once — collapses the very instant a human observer interferes with the process in any way, even by looking at a particle through an electron microscope.

D-Wave Systems, located in Burnaby, British Columbia, claims to have created "the world's first commercially available quantum computer," using adiabatic quantum technology based on quantum annealing to solve optimization problems on a 128-qubit chip-set.