Quantum Computers
Introduction
The history of computer technology has involved a sequence of changes from one type of physical realisation to another: from gears to relays to valves to transistors to integrated circuits and so on. Today's advanced lithographic techniques can squeeze logic gates and wires a fraction of a micron wide onto the surface of silicon chips. Soon they will yield even smaller parts and inevitably reach a point where logic gates are so small that they are made out of only a handful of atoms, i.e. the size of the logic gates becomes comparable to the size of atoms. On the atomic scale, matter obeys the rules of quantum mechanics, which are quite different from the classical rules that determine the properties of conventional logic gates. So if computers are to become smaller in the future, new quantum technology must replace or supplement what we have now. The point, however, is that quantum technology can offer much more than cramming more and more bits onto silicon and multiplying the clock speed of microprocessors. It can support an entirely new kind of computation with qualitatively new algorithms based on quantum principles!

The story of quantum computation started as early as 1982, when the physicist Richard Feynman considered the simulation of quantum-mechanical objects by other quantum systems. However, the unusual power of quantum computation was not really anticipated until 1985, when David Deutsch of the University of Oxford published a crucial theoretical paper in which he described a universal quantum computer. After Deutsch's paper, the hunt was on for something interesting for quantum computers to do. At the time, all that could be found were a few rather contrived mathematical problems, and the whole issue of quantum computation seemed little more than an academic curiosity. It all changed rather suddenly in 1994, when Peter Shor of AT&T's Bell Laboratories in New Jersey devised the first quantum algorithm that, in principle, can perform efficient factorisation. This became a `killer application': something very useful that only a quantum computer could do.
General Concept of Information
To explain what makes quantum computers so different from their classical counterparts, we begin by taking a closer look at a basic chunk of information, namely one bit. A bit is the basic unit of information in a digital computer. From a physical point of view, a bit is a physical system which can be prepared in one of two different states representing two logical values: no or yes, false or true, or simply 0 or 1. For example, in digital computers the voltage between the plates of a capacitor represents a bit of information: a charged capacitor denotes bit value 1 and an uncharged capacitor bit value 0. One bit of information can also be encoded using two different polarizations of light or two different electronic states of an atom. In any of the systems listed above, a bit can store a value of logical 1 or logical 0 using some method which depends on the system used.
Classical computation theory
Classical computation theory is sequential: it proceeds step by step through a procedure. The classical model assumes that if there is a finite number of inputs and a finite-step algorithm, then it is possible to produce a definite output with 100% accuracy. This means that every classical model of computation that we have, such as the universal Turing machine, DNA computing, or cellular automata, can be treated as a physical system behaving like a mathematical function f(x): it takes a single input at a time (or processes a single instruction at a time) and gives a definite output based on that input. The best example is the Turing machine.
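As a small illustration (an assumption added here, not part of the original report), one step of an elementary cellular automaton, one of the classical models just listed, is literally a deterministic function: the same configuration in always gives the same configuration out.

# Illustrative sketch: a classical model of computation behaves like a
# deterministic function f(x). Here, one step of an elementary cellular
# automaton (rule 110) over a circular row of cells.

RULE = 110  # the rule number encodes the output bit for each 3-cell neighbourhood

def step(cells):
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0, 0, 0, 1, 0, 0, 0]
print(step(row))        # deterministic: identical on every run
print(step(step(row)))  # step-by-step evolution, one configuration at a time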
Church-Turing principle
A Turing machine can be thought of as a finite control connected to a read/write head. It has one tape, which is divided into a number of cells.
Each cell can store only one symbol. The input to and the output from the finite state automaton are handled by the R/W head, which can examine one cell at a time. In one move, the machine examines the symbol currently under the head and the current state of the finite automaton in the finite control, and based on these it produces an output which is written directly onto the (conceptually infinite) tape associated with the Turing machine.
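To make this concrete, the short Python sketch below simulates such a one-tape machine; the transition-table format, the state names, and the sample "unary increment" machine are illustrative assumptions, not part of the original report.

# A minimal one-tape Turing machine simulator. The transition table
# maps (state, symbol) to (new_state, symbol_to_write, head_move).

def run_turing_machine(transitions, tape, state, blank="_", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape: cell index -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:  # no applicable rule: halt
            break
        state, write, move = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    visited = range(min(cells), max(cells) + 1)
    return state, "".join(cells.get(i, blank) for i in visited)

# Example machine: append a '1' to a unary number (increment by one).
rules = {
    ("scan", "1"): ("scan", "1", "R"),  # move right over the existing 1s
    ("scan", "_"): ("done", "1", "R"),  # write a 1 on the first blank cell
}
print(run_turing_machine(rules, "111", "scan"))  # -> ('done', '1111')

The dictionary stands in for the conceptually infinite tape: only cells that have actually been visited need to be stored.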
Concept of Information in Quantum Computers –
THE QUBIT
In quantum computers, too, the basic unit of information is a bit. The concept of quantum computing first arose when the use of an atom as a bit was suggested. If we choose an atom as a physical bit, then quantum mechanics tells us that apart from the two distinct electronic states (the excited state and the ground state), the atom can also be prepared in what is known as a coherent superposition of the two states. This means that the atom can be in both state 0 and state 1 simultaneously. It is at this point that the concept of a quantum bit, or qubit, arises. This concept is the backbone of the idea of quantum computing. For the same reason, let us look in detail at what a coherent superposition actually is.
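Before that, a rough sketch may help. The Python snippet below (an illustration added here, not from the original report) models a qubit as a pair of complex amplitudes and shows that measuring the equal superposition yields 0 and 1 with roughly equal frequency.

import random

# A qubit is described by complex amplitudes (a, b) for the states
# |0> and |1>, normalised so that |a|^2 + |b|^2 = 1.

def measure(a, b):
    """Collapse the qubit: return 0 with probability |a|^2, else 1."""
    return 0 if random.random() < abs(a) ** 2 else 1

# Equal (coherent) superposition: a = b = 1/sqrt(2).
a = b = 2 ** -0.5

counts = [0, 0]
for _ in range(10_000):
    counts[measure(a, b)] += 1
print(counts)  # roughly [5000, 5000]: each outcome with probability 1/2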
Classical Bit to Quantum Qubit
Classical computers use flip-flops (which are made up of transistors) to store one bit of information. Information is stored in the flip-flop as a voltage level: with active-low signalling a high voltage is treated as '0' and a low voltage as '1', and with active-high signalling it is exactly the reverse. Since maintaining a constant voltage level requires a continuous power supply, the primary memory of a classical computer needs continuous power. The information stored in a classical computer is in string format: a string of zeros and ones (like 1110001110100011...) defining the state of the computer.
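To see how such a memory cell holds its value, here is a hypothetical sketch (the gate model and function names are assumptions added here, not from the original report) of the cross-coupled NOR latch that underlies many flip-flops.

def nor(x, y):
    return int(not (x or y))

def sr_latch(s, r, q=0):
    # Two cross-coupled NOR gates; iterate until the outputs settle.
    qn = nor(s, q)
    for _ in range(4):
        q, qn = nor(r, qn), nor(s, q)
    return q

q = sr_latch(s=1, r=0)       # set   -> q = 1
q = sr_latch(s=0, r=0, q=q)  # hold  -> q stays 1
q = sr_latch(s=0, r=1, q=q)  # reset -> q = 0
print(q)                     # 0

The hold case shows why the stored bit survives only while the circuit is powered: the value is kept by the gates continuously driving each other, so cutting the supply loses the state, which is why classical primary memory is volatile.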