Quantum computing
The Bloch sphere is a representation of a qubit, the fundamental building block of quantum computers.
Quantum computing is computing using quantum-mechanical phenomena, such as superposition and entanglement.[1] A quantum computer is a device that performs quantum computing. Such computers are different from binary digital electronic computers based on transistors. Whereas common digital computing requires that the data be encoded into binary digits (bits), each of which is always in one of two definite states (0 or 1), quantum computation uses quantum bits, which can be in superpositions of states. A quantum Turing machine is a theoretical model of such a computer, and is also known as the universal quantum computer. The field of quantum computing was initiated by the work of Paul Benioff[2] and Yuri Manin in 1980,[3] Richard Feynman in 1982,[4] and David Deutsch in 1985.[5] A quantum computer with spins as quantum bits was also formulated for use as a quantum spacetime in 1968.[6]
As of 2017, the development of actual quantum computers is still in its infancy, but experiments have been carried out in which quantum computational operations were executed on a very small number of quantum bits.[7] Both practical and theoretical research continues, and many national governments and military agencies fund quantum computing research in an effort to develop quantum computers for civilian, business, trade, environmental and national security purposes, such as cryptanalysis.[8] A small 16-qubit quantum computer already exists and is available for hobbyists to experiment with via the IBM Quantum Experience project. Alongside IBM, a company called D-Wave has been developing its own version of a quantum computer, one that uses a process called quantum annealing.[9]
One of the more fascinating developments in the quantum computing field came in December 2017, when Microsoft released a preview version of a "Quantum Development Kit".[10] It includes Q#, a programming language Microsoft developed to take advantage of the unusual and potentially enormous power of quantum computers. The remarkable thing about this release is that no quantum computer yet exists that can run programs beyond the trivial ones possible on the very small, experimental IBM and D-Wave machines described in the previous paragraph. As Lewis D. Eigen has said, "We can write, debug, and perfect quantum programs that not only cannot be run today, but might not be able to be run for a decade or more. The fact that Microsoft has invested substantial resources in a new language, and many programmers will actually write programs in that language that cannot now be run, is an indication of the enormous theoretical potential of quantum computing and the faith that someone soon will find a way to construct a large scale quantum computer. This is the future of computing, and Microsoft has given new meaning to the phrase 'getting ahead of the curve'."[11] To let these "quantum programmers" debug their programs and see their results, Microsoft has programmed an ordinary computer to simulate a quantum computer, albeit without the speed a real quantum computer would have. So quantum programmers will be writing their programs in "slow motion" until the hardware so many are awaiting finally arrives.
Large-scale quantum computers would theoretically be able to solve certain problems much more quickly than any classical computers that use even the best currently known algorithms, like integer factorization using Shor's algorithm or the simulation of quantum many-body systems. There exist quantum algorithms, such as Simon's algorithm, that run faster than any possible probabilistic classical algorithm.[12] A classical computer could in principle (with exponential resources) simulate a quantum algorithm, as quantum computation does not violate the Church–Turing thesis.[13]:202 On the other hand, quantum computers may be able to efficiently solve problems which are not practically feasible on classical computers.
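The "exponential resources" caveat can be made concrete: a classical simulator must track one complex amplitude per basis state, so n qubits require 2^n amplitudes. A minimal sketch in Python (the function name and the 16-bytes-per-amplitude figure are illustrative assumptions, based on double-precision complex numbers):

```python
def classical_sim_amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes a classical simulator must store."""
    return 2 ** n_qubits

# Every added qubit doubles the memory needed.
print(classical_sim_amplitudes(16))  # 65536 -- trivial, fits anywhere
print(classical_sim_amplitudes(50))  # 1125899906842624 -- 16 PiB at 16 bytes each
```

This doubling is why simulating even ~50 qubits strains the largest classical machines, while the quantum hardware itself carries the state natively.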
In a world where we rely increasingly on computing to share our information and store our most precious data, the idea of living without computers might baffle most people.
But if we continue to follow the trend that has been in place since computers were introduced, by 2040 we will not have the capability to power all of the machines around the globe, according to a recent report by the Semiconductor Industry Association.
To prevent this, the industry is focused on finding ways to make computing more energy efficient, but classical computers are limited by the minimum amount of energy it takes them to perform one operation.
This energy limit is named after IBM Research's Rolf Landauer, who in 1961 found that in any computer, each single-bit operation must dissipate an absolute minimum amount of energy. Landauer's formula - kT ln 2 per bit erased - gives the lowest limit of energy required for a computer operation, and in March this year researchers demonstrated it could be possible to make a chip that operates at this lowest energy.
It was called a "breakthrough for energy-efficient computing" and could cut the amount of energy used in computers by a factor of one million. However, it will take a long time before we see the technology used in our laptops; and even when it is, the energy will still be above the Landauer limit.
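Landauer's bound is easy to evaluate numerically. A minimal sketch in Python (the 300 K room-temperature figure is an assumption chosen for illustration):

```python
import math

def landauer_limit_joules(temperature_kelvin: float) -> float:
    """Landauer's minimum energy (joules) to erase one bit: k_B * T * ln 2."""
    k_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI value)
    return k_B * temperature_kelvin * math.log(2)

# At room temperature the bound is minuscule -- today's chips dissipate
# many orders of magnitude more energy per bit operation than this.
print(landauer_limit_joules(300.0))  # ~2.87e-21 J
```

The gap between this bound and real hardware is the "factor of one million" headroom the breakthrough above was chasing.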
This is why, in the long term, people are turning to radically different ways of computing, such as quantum computing, to find ways to cut energy use.
What is quantum computing?
Quantum computing takes advantage of the strange ability of subatomic particles to exist in more than one state at any time. Because of the way the tiniest of particles behave, certain operations can be done much more quickly, and with less energy, than on classical computers.
In classical computing, a bit is a single piece of information that can exist in two states – 1 or 0. Quantum computing uses quantum bits, or 'qubits' instead. These are quantum systems with two states. However, unlike a usual bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.
"Traditionally qubits are treated as separated physical objects with two possible distinguishable states, 0 and 1," Alexey Fedorov, physicist at the Moscow Institute of Physics and Technology, told WIRED.
"The difference between classical bits and qubits is that we can also prepare qubits in a quantum superposition of 0 and 1 and create nontrivial correlated states of a number of qubits, so-called 'entangled states'."
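Both ideas in Fedorov's description can be sketched in a few lines of Python, representing states as lists of complex amplitudes (the variable and function names here are illustrative, not from the article):

```python
import math

# A single qubit is a pair of complex amplitudes (a, b) with |a|^2 + |b|^2 = 1.
zero = [1 + 0j, 0 + 0j]  # the classical-like state |0>
one = [0 + 0j, 1 + 0j]   # the classical-like state |1>
plus = [1 / math.sqrt(2) + 0j, 1 / math.sqrt(2) + 0j]  # superposition (|0>+|1>)/sqrt(2)

def kron(u, v):
    """Tensor product: the joint state of two qubits."""
    return [a * b for a in u for b in v]

# A product state like kron(plus, zero) is NOT entangled; the Bell state
# (|00>+|11>)/sqrt(2) below cannot be written as kron(u, v) for any u, v.
bell = [1 / math.sqrt(2), 0, 0, 1 / math.sqrt(2)]

# Measurement outcome probabilities are the squared magnitudes of amplitudes:
# here ~0.5 each for outcomes 00 and 11, and exactly 0 for 01 and 10.
print([abs(a) ** 2 for a in bell])
```

The correlation in the Bell state - measuring one qubit instantly fixes the other's outcome - is exactly the "nontrivial correlated states" Fedorov refers to.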
A qubit can be thought of as a point on a sphere. Whereas a classical bit can be in only two states - the two poles of the sphere - a qubit can be at any point on the sphere's surface. This means a computer using these bits can store far more information while using less energy than a classical computer.
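This sphere picture (the Bloch sphere from the caption at the top of the article) has a standard formula behind it. A short sketch, with the angle names theta and phi chosen here for illustration:

```python
import cmath
import math

def bloch_to_state(theta: float, phi: float):
    """Map Bloch-sphere angles to qubit amplitudes:
    |psi> = cos(theta/2)|0> + e^(i*phi) * sin(theta/2)|1>."""
    return [complex(math.cos(theta / 2)), cmath.exp(1j * phi) * math.sin(theta / 2)]

# The north pole (theta = 0) is exactly |0>, the south pole (theta = pi)
# is |1>, and points on the equator are equal superpositions.
north = bloch_to_state(0.0, 0.0)
equator = bloch_to_state(math.pi / 2, 0.0)
print(abs(north[0]) ** 2)    # 1.0 -- all probability on |0>
print(abs(equator[1]) ** 2)  # ~0.5 -- a 50/50 superposition
```

The two continuous angles are why a qubit holds more information than a bit's two discrete poles, even though a measurement still returns only 0 or 1.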

Advances in quantum computing

Last year, a team of Google and Nasa scientists found a D-Wave quantum computer was 100 million times faster than a conventional computer. But moving quantum computing to an industrial scale is difficult.
IBM recently announced its Q division is developing quantum computers that can be sold commercially within the coming years. Commercial quantum computer systems "with ~50 qubits" will be created "in the next few years," IBM claims. Meanwhile, researchers at Google, in a Nature comment piece, say companies could start to make returns on elements of quantum computer technology within the next five years.
Computations occur when qubits interact with each other, so for a computer to function it needs many qubits. The main reason quantum computers are so hard to manufacture is that scientists still have not found a simple way to control complex systems of qubits.
Now, scientists from Moscow Institute of Physics and Technology and Russian Quantum Centre are looking into an alternative way of quantum computing. Not content with single qubits, the researchers decided to tackle the problem of quantum computing another way.
"In our approach, we observed that physical nature allows us to employ quantum objects with several distinguishable states for quantum computation," Fedorov, one of the authors of the study, told WIRED.
The team created quantum objects with several different energy "levels", which they have named qudits. The "d" stands for the number of different energy levels the qudit can take. The term "level" comes from the fact that each logic state of a qubit typically corresponds to a state with a certain value of energy - and these possible energies are called levels.
"In some sense, we can say that one qudit, quantum object with d possible states, may consist of several 'virtual' qubits, and operating qudit corresponds to manipulation with the 'virtual' qubits including their interaction," continued Fedorov.
"From the viewpoint of abstract quantum information theory everything remains the same but in concrete physical implementation many-level system represent potentially useful resource."
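The "several 'virtual' qubits" remark has a simple counting argument behind it: a d-level system spans the same state space as log2(d) qubits. A minimal sketch (this is standard information-theoretic counting, not code from the researchers):

```python
import math

def virtual_qubits(d: int) -> float:
    """How many qubits' worth of state space one d-level qudit provides."""
    return math.log2(d)

# A 4-level qudit carries as much quantum state as 2 qubits, and an
# 8-level qudit as much as 3 -- fewer physical objects to control.
print(virtual_qubits(4))  # 2.0
print(virtual_qubits(8))  # 3.0
```

This is the appeal of the qudit approach: packing more levels into each physical object shrinks the number of objects whose interactions must be engineered.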
Quantum computers are already in use in a limited sense: logic gates have been built from two qubits. Getting quantum computers to work at an industrial scale, however, remains the problem.
"The progress in that field is rather rapid but no one can promise when we come to wide use of quantum computation," Fedorov told WIRED.
Elsewhere, in a step towards quantum computing, researchers have guided electrons through semiconductors using incredibly short pulses of light.
These extremely short, configurable pulses of light could lead to computers that operate 100,000 times faster than they do today. Researchers, including engineers at the University of Michigan, can now control peaks within laser pulses just a few femtoseconds (quadrillionths of a second) long. The result is a step towards "lightwave electronics", which could eventually lead to a breakthrough in quantum computing.

Quantum computing and space

A bizarre discovery recently revealed that cold helium atoms in lab conditions on Earth abide by the same law of entropy that governs the behaviour of black holes.
The law, first developed by Professor Stephen Hawking and Jacob Bekenstein in the 1970s, describes how the entropy, or the amount of disorder, increases in a black hole when matter falls into it. It now seems this behaviour appears both at the huge scales of outer space and at the tiny scale of atoms, specifically those that make up superfluid helium.
"It's called an entanglement area law," explained Adrian Del Maestro, physicist at the University of Vermont. "It points to a deeper understanding of reality" and could be a significant step toward a long-sought quantum theory of gravity and new advances in quantum computing.