The Difference Between a Normal Computer and a Quantum Computer
Imagine a computer whose memory is exponentially larger than its apparent physical size, a machine that can manipulate an exponential set of inputs simultaneously, a computer that computes in the twilight zone. You are thinking of a quantum computer. Relatively few and simple concepts from quantum mechanics are needed to make quantum computing possible; the subtlety lies in learning how to manipulate those concepts. Is such a computer inevitable, or would it be prohibitively difficult to build?
Because of the strange laws of quantum mechanics, Volger, an editor at Discover, observes that an electron, a proton, or another subatomic particle can be "in more than one place at a time," because individual particles behave like waves, and these different places are different states.
What is so remarkable about quantum computing? Imagine you are in a large office building and have to retrieve a briefcase left on a desk picked at random in one of hundreds of offices. An ordinary computer must, in effect, walk through the building and open the doors one at a time, working through long strings of 1s and 0s until it reaches the answer. But what if, instead of searching yourself, you could instantly create as many copies of yourself as there are desks in the building? All the copies could glance at all the desks simultaneously, the copy that finds the briefcase would become the real you, and the rest would vanish.
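The briefcase analogy corresponds to quantum search, which in practice needs roughly the square root of the number of desks in repeated steps rather than one instantaneous look. As a minimal sketch (not real quantum hardware), the following Python simulation of Grover-style amplitude amplification tracks an amplitude for every "desk," marks the one holding the briefcase, and amplifies it; the values of `N` and `marked` are arbitrary choices for illustration.

```python
import math

# Toy simulation of Grover-style quantum search (a classical sketch).
# We track the amplitude of every "desk"; one marked desk holds the briefcase.
N = 64               # number of desks/offices (illustrative)
marked = 42          # index of the desk with the briefcase (arbitrary)

# Start in an equal superposition: every desk is equally likely.
amps = [1 / math.sqrt(N)] * N

# One Grover iteration: flip the marked amplitude, then reflect about the mean.
iterations = int(round(math.pi / 4 * math.sqrt(N)))  # ~ sqrt(N) steps, not N
for _ in range(iterations):
    amps[marked] = -amps[marked]          # oracle: mark the answer
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]   # diffusion: amplify the marked state

# Probability of finding the briefcase when we finally "open a door" (measure).
print(round(amps[marked] ** 2, 3))       # close to 1 after ~sqrt(N) iterations
```

After only about six iterations for 64 desks, measuring the system finds the marked desk with probability near 1, whereas a classical search would expect to open 32 doors.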
David Deutsch, a physicist at Oxford University, argued that it should be possible to build an extraordinarily powerful computer based on this weird fact. In 1994, Peter Shor, a mathematician at AT&T Bell Laboratories in New Jersey, proved, at least in theory, that a full-blown quantum computer could solve in seconds certain problems that are out of reach of even the fastest conventional computers. An epidemic of theorizing and discussion about the possibility of building a quantum computer has since swept through the quantum corners of technology and research.
Its roots go back to 1981, when Richard Feynman pointed out that physicists always seem to run into computational problems when they try to simulate a system in which quantum mechanics applies. Calculations involving the behavior of atoms, electrons, or photons take an enormous amount of time on today's computers. In 1985, in Oxford, England, David Deutsch published the first theoretical description of how a quantum computer would work. Such a device would not only be able to outpace today's computers, but would also be able to perform some logical operations that conventional machines cannot.
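Feynman's observation can be made concrete with a little arithmetic: the full quantum state of n two-level particles (qubits) requires 2^n complex amplitudes to store classically, so the memory cost doubles with every particle added. A minimal sketch, assuming 16 bytes per complex amplitude (a typical double-precision complex number):

```python
# Why simulating quantum systems overwhelms classical machines: storing the
# state of n qubits takes 2**n complex amplitudes. Assume 16 bytes each.
for n in (10, 30, 50):
    amplitudes = 2 ** n
    gib = amplitudes * 16 / 2 ** 30          # rough memory cost in GiB
    print(n, "qubits ->", amplitudes, "amplitudes,", gib, "GiB")
```

Thirty qubits already demand about 16 GiB of memory, and fifty qubits demand millions of GiB, which is why Feynman suggested that only a quantum device could simulate quantum physics efficiently.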
This research moved toward actually building such a device with further advances, and additional funding, at AT&T Bell Labs in Murray Hill, New Jersey, where a new member joined the effort: Peter Shor discovered that quantum computing could dramatically speed up the factoring of integers. That is more than an incremental step in computing technology; it bears directly on real-world applications such as encryption.
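The reason factoring speed matters for encryption is that widely used codes rest on factoring being slow. The number-theoretic core that Shor's algorithm exploits is that factoring N reduces to finding the period r of f(x) = a^x mod N; the quantum computer finds r exponentially faster, but the reduction itself is classical. A minimal sketch for the toy case N = 15 (the period search is brute-forced here, which only works for tiny numbers):

```python
from math import gcd

# Classical skeleton of Shor's factoring method: factoring N reduces to
# finding the period r of f(x) = a**x mod N. A quantum computer finds r
# exponentially faster; here we simply brute-force it for a tiny N.
N, a = 15, 7          # toy example; a must share no factor with N

# Find the period: the smallest r > 0 with a**r % N == 1.
r = 1
while pow(a, r, N) != 1:
    r += 1

# When r is even, gcd(a**(r//2) +/- 1, N) yields nontrivial factors of N.
p = gcd(pow(a, r // 2) - 1, N)
q = gcd(pow(a, r // 2) + 1, N)
print("period:", r, "factors:", p, q)
```

For a = 7 and N = 15 the period is 4, and the gcd step recovers the factors 3 and 5; it is only the period-finding step that a quantum computer accelerates.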
"There is hope at the end of the tunnel that quantum computers may one day become a reality," says Gilles Brassard of the Université de Montréal. Quantum mechanics gives an unexpectedly clear description of the behavior of atoms, electrons, and photons at the microscopic level. Although this knowledge has no obvious everyday household use, it applies to every interaction of matter that we can see, and the real benefits of that knowledge are only beginning to emerge.
In our computers, printed circuit boards are designed so that a 1 or a 0 is represented by different amounts of electricity, and the outcome of one possibility has no effect on the other. A problem arises, however, once quantum theory enters: the outcomes come from a single device existing in two distinct, overlapping realities, each affecting the other's result. Yet this very problem can become one of the greatest assets of the new computer, if the calculation can be programmed so that the unwanted effects cancel out while the desired effects reinforce each other.
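That cancel-and-reinforce behavior is interference between amplitudes, and it can be shown in miniature with a single qubit. In this sketch a qubit's state is just a pair of numbers (the amplitudes of 0 and 1); applying the standard Hadamard gate once puts the qubit into both states at once, and applying it again makes the paths leading to 1 cancel while the paths leading to 0 reinforce:

```python
from math import sqrt

# Interference in miniature: a qubit's state is a pair of amplitudes (a0, a1)
# for the values 0 and 1. The Hadamard gate mixes them; applied twice, the
# paths to 1 cancel (destructive) and the paths to 0 reinforce (constructive).
def hadamard(a0, a1):
    s = 1 / sqrt(2)
    return s * (a0 + a1), s * (a0 - a1)

state = (1.0, 0.0)            # start definitely at 0
state = hadamard(*state)      # now roughly (0.707, 0.707): both at once
state = hadamard(*state)      # back to (1.0, 0.0): the 1-amplitudes cancelled
print(state)
```

Programming a quantum computer amounts to choreographing exactly this kind of cancellation so that wrong answers interfere destructively and the right answer survives measurement.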
A quantum system must be able to encode the problem, carry out the calculation, and read out the result. The researchers have examined many potential physical systems, including one that draws on the work of Seth Lloyd of MIT.