Quantum computing - real or just hype?
Just looking at the results from one site, I would say it's not so far-fetched:
http://arstechnica.com/journals/science.ars/2008/03/28/encoding-more-than-one-bit-in-a-photon
http://arstechnica.com/journals/science.ars/2008/10/28/scalable-quantum-computing-in-the-next-5-years
http://arstechnica.com/news.ars/post/20080729-finding-lost-qubits.html
http://arstechnica.com/news.ars/post/20080509-new-quantum-dot-logic-gates-a-step-towards-quantum-computers.html
http://arstechnica.com/news.ars/post/20080626-three-dimensional-qubits-on-the-way.html
http://arstechnica.com/news.ars/post/20080527-molecular-magnets-in-soap-bubbles-could-lead-to-quantum-ram.html
For a more technical overview of why it's not as hard as it used to be, there is a four-part series on self-correcting quantum computers:
http://scienceblogs.com/pontiff/2008/08/selfcorrecting_quantum_compute.php
Quantum computing has not gone far beyond the "idea" stage. Sure, they can multiply two 2-bit integers, but it takes a dozen graduate students a week to get set up for the run, and another week to check the results.
Long-term, it probably has a lot of potential, although it may never be stable enough for use outside the tightly controlled laboratory environment of a supercomputer.
At this point, I would classify it more as physics than computer science. In a way, it's as if Charles Babbage got his hands on one of Michael Faraday's papers and started thinking that maybe, someday, electromagnetism could be used as a basis for computation.
Much has been written in Scientific American over the past few years about quantum computing, mostly by the primary researchers themselves: http://www.sciam.com
Error correction and decoherence are big problems in quantum computing, as I understand it. A lot of smart people are hard at work on these problems, but the last I read, it looked like the error-correction requirements might scale exponentially with the number of qubits, which rather detracts from the "we'll solve NP problems in an instant!" appeal of quantum computing.
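To give a flavor of what quantum error correction means in practice, here is a minimal sketch (my own illustration, not from any of the linked articles) of the 3-qubit bit-flip repetition code, simulated with a plain NumPy statevector. It shows why correction costs qubits: protecting one logical qubit against a single bit-flip already takes three physical qubits plus syndrome checks.

```python
import numpy as np

def encode(alpha, beta):
    # logical qubit a|0> + b|1> encoded as a|000> + b|111>
    state = np.zeros(8, dtype=complex)
    state[0b000] = alpha
    state[0b111] = beta
    return state

def bit_flip(state, qubit):
    # Pauli-X on one qubit (qubit 0 = least significant bit):
    # swap amplitudes of basis states differing in that bit
    new = np.empty_like(state)
    for i in range(8):
        new[i ^ (1 << qubit)] = state[i]
    return new

def syndrome(state):
    # parity checks Z0Z1 and Z1Z2; after a single X error, every
    # basis state in the support gives the same syndrome, so we can
    # read it off one nonzero-amplitude index
    i = int(np.flatnonzero(np.abs(state) > 1e-12)[0])
    b = [(i >> k) & 1 for k in range(3)]
    return (b[0] ^ b[1], b[1] ^ b[2])

def correct(state):
    # map syndrome -> which qubit (if any) was flipped, then undo it
    flipped = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(state))
    return bit_flip(state, flipped) if flipped is not None else state

alpha, beta = 0.6, 0.8
noisy = bit_flip(encode(alpha, beta), 1)   # channel flips qubit 1
recovered = correct(noisy)
assert np.allclose(recovered, encode(alpha, beta))
```

This toy code only handles a single bit-flip; real codes must also handle phase errors and faulty syndrome measurements, which is where the overhead the answer above worries about comes from.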
Nice, I'm getting a reusable answer from another SO question, one word at a time. :)
Several answers mention quantum computers as if they are still far off in the future, but I beg to differ.
There were vague references to the possibility of quantum computers in the 1970s and 1980s (see the timeline on Wikipedia), but the first "working" 3-qubit NMR quantum computer was built in 1998. The field is still in its infancy, and almost all of the advances are still theoretical and academic, but in 2007 a company called D-Wave Systems unveiled a prototype of a working 16-qubit, and later a 28-qubit, adiabatic quantum computer. Their efforts are notable because they claim their technology is commercially viable and scalable. As of 2010, they have 7 rigs, and the current generation of their chips has 128 qubits. They seem to be partnering with Google to find interesting problems to test their hardware on.
I recommend this short 24-minute video and the Wikipedia article on D-Wave for a quick overview, and there are a lot more resources on this blog, written by the founder and CTO of D-Wave.