Monday, 12 May 2014

Quantum Computation



Quantum computing is the area of study focused on developing computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter at the quantum (atomic and subatomic) level. The development of a practical quantum computer would mark a leap forward in computing capability far greater than that from the abacus to a modern-day supercomputer, with performance gains in the billion-fold realm and beyond. A quantum computer, following the laws of quantum physics, would gain enormous processing power through its ability to exist in multiple states at once and to perform computations on all of those states simultaneously. Current centers of research in quantum computing include MIT, IBM, Oxford University, and Los Alamos National Laboratory.

Quantum entanglement is a quantum mechanical phenomenon in which the quantum states of two or more objects have to be described with reference to each other, even though the individual objects may be spatially separated. This leads to correlations between the observable physical properties of the systems. For example, it is possible to prepare two particles in a single quantum state such that when one is observed to be spin-up, the other will always be observed to be spin-down, and vice versa, even though quantum mechanics makes it impossible to predict which outcome will be observed. As a result, measurements performed on one system appear to influence the systems entangled with it instantaneously. Quantum entanglement does not, however, enable the transmission of classical information faster than the speed of light. Entanglement has applications in the emerging technologies of quantum computing and quantum cryptography, and has been used to realize quantum teleportation experimentally.
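
These correlations are easy to simulate classically for a single pair of particles. The following minimal Python/numpy sketch (an illustration, not a real experiment) prepares the singlet Bell state (|01> - |10>)/sqrt(2) and samples joint measurements; the two results always come out opposite, even though neither result is predictable on its own.

    import numpy as np

    # Singlet Bell state (|01> - |10>)/sqrt(2): measuring one qubit as
    # 0 ("spin-up") means the other is always 1 ("spin-down"), and vice versa.
    state = np.array([0, 1, -1, 0]) / np.sqrt(2)   # amplitudes for |00>,|01>,|10>,|11>
    probs = np.abs(state) ** 2                     # Born rule: probability of each outcome

    rng = np.random.default_rng(seed=1)
    for _ in range(5):
        outcome = rng.choice(4, p=probs)           # joint measurement of both qubits
        a, b = outcome >> 1, outcome & 1           # split the index into the two bits
        print(f"A = {a}, B = {b}")                 # always opposite: only 01 or 10 occur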

A qubit is a quantum bit , the counterpart in quantum computing to the binary digit or bit of classical computing. Just as a bit is the basic unit of information in a classical computer, a qubit is the basic unit of information in a quantum computer .In a quantum computer, a number of elemental particles such as electrons or photons can be used (in practice, success has also been achieved with ions), with either their charge or polarization acting as a representation of 0 and/or 1. Each of these particles is known as a qubit; the nature and behavior of these particles (as expressed in quantum theory ) form the basis of quantum computing. The two most relevant aspects of quantum physics are the principles of superposition and entanglement .
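
Concretely, a qubit's state can be written as two complex amplitudes over |0> and |1> whose squared magnitudes sum to 1. A minimal Python/numpy sketch of a qubit in equal superposition:

    import numpy as np

    ket0 = np.array([1, 0])             # |0>, analogous to the classical bit 0
    ket1 = np.array([0, 1])             # |1>, analogous to the classical bit 1

    # Equal superposition (|0> + |1>)/sqrt(2): the qubit carries both amplitudes
    # at once; measurement collapses it to 0 or 1 with these probabilities.
    psi = (ket0 + ket1) / np.sqrt(2)
    print(np.abs(psi) ** 2)             # [0.5 0.5] -> 50/50 chance of reading 0 or 1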

A quantum gate or quantum logic gate is a basic quantum circuit operating on a small number of qubits. Quantum gates are the analogue for quantum computers of classical logic gates in conventional digital computers. Unlike many classical logic gates, quantum logic gates are reversible. Some universal classical logic gates, such as the Toffoli gate, provide reversibility and can be mapped directly onto quantum logic gates. Quantum logic gates are represented by unitary matrices. The most common quantum gates operate on spaces of one or two qubits, which means that, as matrices, they can be described by 2 x 2 or 4 x 4 unitary matrices with orthonormal rows.
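
As a quick illustration, here are three common gates written out as matrices in Python/numpy, with a check that each is unitary (and hence reversible); the gates themselves are standard, though the snippet is only a sketch:

    import numpy as np

    X = np.array([[0, 1],
                  [1, 0]])                        # Pauli-X: the quantum NOT gate
    H = np.array([[1,  1],
                  [1, -1]]) / np.sqrt(2)          # Hadamard: creates superposition
    CNOT = np.array([[1, 0, 0, 0],                # controlled-NOT on two qubits:
                     [0, 1, 0, 0],                # flips the target qubit only
                     [0, 0, 0, 1],                # when the control qubit is 1
                     [0, 0, 1, 0]])

    for name, U in [("X", X), ("H", H), ("CNOT", CNOT)]:
        # Unitarity (U†U = I, i.e. orthonormal rows) is what makes each gate
        # a valid, reversible quantum operation: its inverse is just U†.
        assert np.allclose(U.conj().T @ U, np.eye(len(U)))
        print(name, "is unitary; its inverse is its conjugate transpose")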

Shor's algorithm is a quantum algorithm for factoring a number N in O((log N)^3) time and O(log N) space, named after Peter Shor. The algorithm is significant because it implies that RSA, a popular public-key cryptography method, could be easily broken given a sufficiently large quantum computer. RSA uses a public key N which is the product of two large prime numbers. One way to crack RSA encryption is by factoring N, but with classical algorithms factoring becomes increasingly time-consuming as N grows large; more specifically, no classical algorithm is known that can factor in polynomial time. Shor's algorithm, however, can factor N in polynomial time. Like many quantum algorithms, Shor's algorithm is probabilistic: it gives the correct answer with high probability, and the probability of failure can be decreased by repeating the algorithm. Shor's algorithm was discovered in 1994 by Peter Shor, but the classical part was known earlier and is credited to G. L. Miller. Seven years later, in 2001, it was demonstrated by a group at IBM, which factored 15 into 3 and 5 using a quantum computer with 7 qubits.
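
The division of labor can be sketched in ordinary Python: everything below is the classical part of the algorithm except the order-finding loop, which is done by brute force here and is precisely the step a quantum computer performs in polynomial time. This is an illustrative sketch for tiny N, not an implementation of the quantum algorithm itself.

    import math
    import random

    def shor_skeleton(N, seed=2):
        """Classical skeleton of Shor's algorithm for an odd composite N = p*q."""
        rng = random.Random(seed)
        while True:
            a = rng.randrange(2, N)
            g = math.gcd(a, N)
            if g > 1:
                return g, N // g             # lucky: a already shares a factor with N
            r = 1                            # order finding: smallest r with a^r = 1 (mod N);
            while pow(a, r, N) != 1:         # brute force here stands in for the one
                r += 1                       # step a quantum computer does in poly time
            x = pow(a, r // 2, N)
            if r % 2 == 1 or x == N - 1:
                continue                     # unlucky a: retry (the probabilistic part)
            p = math.gcd(x - 1, N)           # (x-1)(x+1) = 0 (mod N) yields a factor
            return p, N // p

    print(shor_skeleton(15))                 # e.g. (3, 5), matching IBM's 2001 demo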