Is Quantum Computing Ready?
August 20, 2015

As Moore's law illustrates, the IT sector has been remarkably dynamic since its inception 70 years ago. While computing architectures have followed the von Neumann model from the start, researchers and R&D departments have always pursued the dream of next-generation computers. The quantum computer is one such dream, slowly becoming reality as the technical barriers are steadily overcome.

The search for a successor to current architectures has a technical origin. According to Moore's law, the size of transistors will approach that of the atom as soon as 2020. At this scale, quantum effects disrupt the operation of the components, and current photolithographic technology may well prevent any further architectural evolution by conventional means. Just as the multiplication of execution cores within processors succeeded the race for clock rate during the past decade, architectural innovation is at the heart of the quantum approach, which aims to multiply the capacity of future computers.

HOW A QUANTUM COMPUTER WORKS
Quantum computing is based on quantum bits, or qubits. Unlike the bits of traditional computers, which can only hold a value of zero or one, a qubit can be zero, one, or both values simultaneously. This representation of information allows qubits to process information in a manner no classical computer can, exploiting phenomena such as tunneling and quantum entanglement. As a result, quantum computers could theoretically solve in days certain problems that traditional architectures would take up to millions of years to solve. The promise is phenomenal! But the pitfalls are many.
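For intuition, a single qubit can be modeled numerically as a two-component complex vector of amplitudes (a minimal NumPy sketch; real quantum hardware is not programmed this way):

```python
import numpy as np

# A qubit state is a 2-component complex vector (amplitudes for |0> and |1>).
zero = np.array([1, 0], dtype=complex)

# The Hadamard gate puts a qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ zero
# Measurement probabilities are the squared magnitudes of the amplitudes.
probs = np.abs(state) ** 2
print(probs)  # [0.5 0.5]: equal chance of reading 0 or 1
```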

WHAT IS A QUANTUM COMPUTER USEFUL FOR?
Unlike current computers based on the von Neumann model, which revolves around exchanges between its four distinctive elements (the arithmetic logic unit, the control unit, the memory, and the input/output devices), a quantum computer requires little input and produces little output. It therefore lends itself to calculations whose complexity is combinatorial, as the toy example below illustrates. Such problems are found in scheduling, operations research, bioinformatics, and cryptography. This low volume of input/output further predisposes quantum computers to remote use over the Internet.
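To see why combinatorics, rather than data volume, is the bottleneck, consider a brute-force search over job orderings (a hypothetical toy problem; the cost function is made up): the input fits on one line, but the search space grows factorially.

```python
from itertools import permutations

# Hypothetical toy scheduling problem: order 8 jobs to minimize a made-up
# cost. The input is tiny, yet there are 8! = 40,320 orderings to check.
jobs = range(8)

def cost(order):
    # Made-up objective: heavier jobs are cheaper when scheduled early.
    return sum(position * job for position, job in enumerate(order))

best = min(permutations(jobs), key=cost)
print(best, cost(best))
# At 20 jobs the same search needs 20! ~ 2.4e18 evaluations: it is this
# combinatorial explosion, not the I/O, that defeats classical machines.
```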

A DIZZYING POTENTIAL
If large quantum computers (more than 300 qubits) could be built, which is unlikely at this moment, they would be able, according to David Deutsch, a British physicist and professor of physics at Oxford University, to simulate the behavior of the universe itself. They could also solve cryptanalysis problems in a much shorter time than a conventional computer, because the time required would grow only linearly with the size N of the key, and not exponentially (2^N, for example) as with sequential brute-force methods. Quantum computers require different programming techniques, while still relying on classical linear algebra. But the technology suffers from reliability issues.
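To give a rough sense of scale for that claim (an illustrative comparison only; actual quantum speedups depend on the algorithm, e.g., quadratic for Grover's search):

```python
# Illustrative scaling comparison (not any specific algorithm): an
# exponential brute-force cost 2^N versus a cost linear in the key size N.
for n in (56, 64, 128, 256):
    print(f"N={n:>3}  brute force ~ 2^{n} = {2**n:.2e} trials  vs  linear ~ {n}")
```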

The reason is that the qubit, the quantum information storage unit, is fragile and sensitive to changes in temperature and magnetic field, which generate errors. This occurs even when the work is carried out at a temperature close to absolute zero.

D-WAVE: THE HAMPERED PIONEER
On February 13, 2007, D-Wave officially announced the creation of the world's first 16-qubit quantum computer. This computer would, however, be limited to certain quantum-optimized calculations; combinatorial problems (such as Sudoku) are actually solved more slowly than on a regular computer. At the heart of the D-Wave Two system (adopted by NASA and Google) lies the SQUID, a proprietary quantum transistor that manipulates superconducting qubits, the basic building blocks of a quantum computer. SQUID is an acronym for Superconducting Quantum Interference Device. The term "interference" refers to the electrons, which behave as waves and create interference patterns that give rise to quantum effects.

The reason these quantum effects are held in place in the structure, allowing it to act as a qubit, lies in the properties of the material it is made of. The SQUID consists of metallic niobium (as opposed to conventional transistors, which are made from silicon). When the metal is cooled, it becomes superconducting and begins to exhibit quantum effects. The superconducting structure encodes the qubit states as tiny magnetic fields pointing up or down. These states are called +1 and -1 and correspond to the two states the qubit can adopt. Using quantum mechanics, one, several, or all of the qubits can be placed in a superposition of these two states.

Although the D-Wave Two contains 512 qubits, it suffers from two flaws: the qubits' lifespan is limited, and the computational scope is restricted to specialized tasks such as machine learning, pattern recognition and anomaly detection, financial analysis, and verification and validation processes. Furthermore, it is not independent and must operate alongside a traditional computer, which makes the D-Wave Two a quantum coprocessor, still far from the universal quantum computer pursued by Google, IBM and Microsoft.
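The kind of problem such a machine targets can be written down compactly: quantum annealers look for the lowest-energy assignment of an Ising objective over spins valued +1 or -1. Below, a classical brute-force version of that search (the couplings J and fields h are made up for illustration):

```python
import itertools
import numpy as np

# Ising objective over spins s_i in {+1, -1}:
#   E(s) = sum_{i<j} J[i][j] * s_i * s_j  +  sum_i h[i] * s_i
# A quantum annealer searches for the spin assignment of minimum energy.
J = np.array([[0, 1, -1],
              [0, 0,  2],
              [0, 0,  0]])   # made-up couplings (upper triangle)
h = np.array([1, -1, 0])     # made-up local fields

def energy(s):
    s = np.array(s)
    return float(s @ J @ s + h @ s)

# Classical brute force: 2^n configurations; this is what blows up with n.
best = min(itertools.product((-1, 1), repeat=3), key=energy)
print(best, energy(best))
```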

GOOGLE: TACKLING ERROR CORRECTION AND PHASE CORRUPTION
The University of California, working with Google, has developed an error-correction mechanism for qubits, the quantum equivalent of bits. These errors are today one of the main obstacles to the design of quantum computers. The team of physics professor John Martinis at the University of California, associated with Google since last September, has just cleared a significant hurdle: the reliability of quantum computations.

The University of California and Google teams have managed to program a chip containing 9 qubits able to monitor each other to detect bit-inversion errors. The method devised does not correct these errors but prevents them from contaminating the subsequent steps of a calculation. John Martinis's team explains that the mechanism reduced the error rate by a factor of 2.7 when 5 qubits are used, and by a factor of 8.5 when all 9 elements are active. "Other studies have yet to be conducted before we can say error-free quantum calculations are possible," says Daniel Gottesman, who works on error correction at the Perimeter Institute in Canada. While the bit inversions John Martinis addresses can be handled by classical algorithms, another type of error, the alteration of a qubit property called the phase, requires much more complex calculations. In the MIT Technology Review, Austin Fowler, an engineer at Google, confirms that the Mountain View and University of California teams are working specifically on phase alteration, as well as on an error-detection mechanism spanning more than 9 qubits.
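The flavor of bit-flip detection can be conveyed with its simplest classical analogue, a three-copy repetition code with majority voting (a didactic sketch, not the scheme the Martinis group actually implements):

```python
import random

# Encode one logical bit as three physical copies; a single bit flip
# can then be detected (and outvoted) by comparing the copies.
def encode(bit):
    return [bit, bit, bit]

def noisy(copies, p=0.1):
    # Each copy independently flips with probability p.
    return [b ^ (random.random() < p) for b in copies]

def majority(copies):
    return int(sum(copies) >= 2)

random.seed(0)
received = noisy(encode(1))
print(received, "->", majority(received))
# Quantum codes must do this without reading the data qubits directly:
# neighboring qubits measure parities (do two copies agree?), which
# reveals an error without collapsing the encoded state.
```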

IBM: SUPERCONDUCTIVITY AND SQUARE DESIGN
As we can see, a quantum computer will only work once quantum decoherence is eliminated, namely the errors that creep into calculations because of ambient temperature or electromagnetic radiation. Qubits are extremely sensitive: merely measuring them can change their state. Bit-flip errors are quite possible, yielding the opposite state (1 instead of 0, for example). One can also run into a phase-flip error, which flips the sign of the state (+ instead of -). Quantum error correction is necessary in any reliable large-scale quantum computer design, yet until now it was only possible to detect one of these two phenomena at a time. The IBM solution is a quantum bit circuit based on a square lattice of four supercooled superconducting qubits placed on a chip of about 60 mm². The square arrangement is what allows the circuit to detect both kinds of quantum error, and this shape also enables scaling by adding more qubits.
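The two error types have a compact mathematical form: in a state-vector picture, the Pauli X matrix models a bit flip and the Pauli Z matrix a phase (sign) flip (a minimal NumPy illustration):

```python
import numpy as np

# Pauli matrices: X models a bit flip, Z a phase (sign) flip.
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

plus = np.array([1, 1]) / np.sqrt(2)   # superposition state |+>

print(X @ np.array([1, 0]))  # |0> -> |1>: a bit-flip error
print(Z @ plus)              # |+> -> |->: a phase-flip error (sign change)
# A code that detects only X or only Z misses the other kind; a square
# lattice lets a circuit measure both parities at once.
```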

Previous work in this area, using linear arrangements, "only allowed observation of bit-inversion errors, hence providing incomplete information on the quantum state of a system," says Jay Gambetta of IBM's Quantum Computing Group. "Our work overcomes this obstacle by detecting both types of quantum errors. Better still, the approach is transferable to larger systems, since the qubits are arranged in a square configuration rather than in a linear array. The next step is to design and manufacture a handful of reliable superconducting qubits with low error rates. Once that is done, we could well be on the way to a complete quantum computer. If we could build a quantum computer with only 50 quantum bits (qubits) instead of four, no TOP500 supercomputer today could surpass its performance, which would be absolutely amazing."

INRIA: MISSION QUANTIC
Mazyar Mirrahimi, research director at the French research institute INRIA and head of the Quantic mission, estimates that obtaining a full-blown quantum computer requires coupling several quantum systems, which seems very difficult in quantum optics. This problem has led researchers in the field of mesoscopic physics to prefer low-temperature superconducting circuits for building the logic gates and memories a quantum computer needs. What emerges is the need for a systems engineering that obeys quantum rules, what we might call "quantum engineering laws".

In terms of impact, Mazyar Mirrahimi considers that a whole field of applications opens up, including improved measurement accuracy in metrology, as was achieved for the atomic clock. One could consider, for example, improving and stabilizing the amplitude measurements of a magnetic field. Some applications, such as quantum cryptography and quantum communications, rest on these same laws and are comparatively easier to achieve. There are already industrial prototypes that transmit information over optical fiber in a quantum-encrypted way by polarizing photons of light (see the sketch at the end of this section).

Solving problems out of reach of current computers involves manipulating thousands of qubits. However, quantum superposition (the possibility for a qubit to be simultaneously 0 and 1) is very fragile. Going beyond 9 qubits increases the risk of introducing new noise sources, in particular correlated noise (which can affect many qubits at a time). This is the problem of "scalability": how can the size of the quantum system be increased significantly without degrading the properties of its subsystems? It is the heart of the challenge in developing a universal quantum computer.

As we can see, the potential of the quantum computer is such that it alone justifies the investments made by the world's biggest players and research institutes. Even if it does not revolutionize day-to-day computing, the acceleration potential in research and in specific workloads is mind-blowing. The state of research is encouraging and is bringing the academic and industrial sectors together to clear the remaining obstacles. The results may still be years away, but for such a young field, the progress made in just one decade is very encouraging!
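As referenced above, the polarization-based key exchange those prototypes rely on can be sketched classically (a minimal, idealized BB84-style simulation with no channel noise and no eavesdropper; in the real protocol, measuring a photon in the wrong basis yields a random result, which is what exposes interception):

```python
import random

# Idealized BB84-style key exchange. Polarization bases: '+' rectilinear,
# 'x' diagonal. A photon measured in the wrong basis gives a random bit.
random.seed(1)
n = 16
alice_bits  = [random.randint(0, 1) for _ in range(n)]
alice_bases = [random.choice("+x") for _ in range(n)]
bob_bases   = [random.choice("+x") for _ in range(n)]

bob_bits = [
    bit if a == b else random.randint(0, 1)   # wrong basis -> random outcome
    for bit, a, b in zip(alice_bits, alice_bases, bob_bases)
]

# Publicly compare the bases (never the bits) and keep matching positions.
key = [bit for bit, a, b in zip(bob_bits, alice_bases, bob_bases) if a == b]
print(key)
```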
