D-Wave does a nice dog-and-pony show, but I won't be cheering until they actually do one of the long-promised "holy grail" things that quantum computers are supposed to be able to do.
Like what? Like running Shor's Algorithm (for integer factorization) on a big integer, in the expected amount of time.
I don't think you can run that algorithm on an adiabatic quantum computer. From what I recall, quantum adiabatic chips do hill-and-valley optimization, but they aren't generalized quantum computers.
So their computer does behave quantum mechanically. But I can't find any evidence that they have shown entanglement, which is necessary to realize the full promise of quantum computing, see e.g. http://www.scottaaronson.com/blog/?p=954#comment-40364
I think they only do adiabatic quantum computing, which is not what most people mean when speaking about quantum computing (it does not involve entanglement, only quantum annealing).
Edit: Wikipedia article: http://en.wikipedia.org/wiki/Quantum_annealing
This is correct. For those with a hazy understanding of quantum mechanics - or too lazy to read the link - the adiabatic process basically adds an extraneous (kinetic) term to the Hamiltonian, and the non-commutation of these terms allows for a process superior to thermal annealing. The D-Wave folks would like us to believe that this captures the vast majority of the computational improvements expected from QC, and it may yet create some performance improvements.
However, in 'true' QC, there is something very physically different happening; we expect to see universal superposition of Hamiltonian states (entangled vector bases), which allows for some remarkable parallelism in a gate architecture that mirrors programmable (classical) computers. Specifically, we would be able to perform an operation on 2^N different numbers with just one calculation, while in classical computing such a computation would actually require 2^N separate (sequential) iterations. D-Wave, although it has produced some evidence of entanglement, could not achieve anything on this magnitude of computational efficiency/parallelism. Footnote: as promising as the big picture of recombining said superpositions and 'hacking the multiverse' (a la Deutsch) is, the fragility & decoherence of said states is an extraordinary barrier to overcome in order to achieve a 'true' QC in the near or even medium term (as some critics such as Wolfram would argue)...
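For readers who want to poke at the adiabatic idea numerically, here is a toy sketch (plain numpy; the 3-qubit size, the diagonal cost function, and all numbers are made up for illustration): interpolate between a transverse-field "kinetic" term and a diagonal problem Hamiltonian, and watch the ground state migrate toward the bitstring with minimum cost.

```python
import numpy as np
from functools import reduce

sx = np.array([[0.0, 1.0], [1.0, 0.0]])  # Pauli X
I2 = np.eye(2)
n = 3  # toy register size

def op(single, site):
    # Single-qubit operator `single` acting on `site` of an n-qubit register.
    return reduce(np.kron, [single if i == site else I2 for i in range(n)])

# Driver ("kinetic") term: transverse field; its ground state is the uniform superposition.
H_driver = -sum(op(sx, i) for i in range(n))

# Problem term: a made-up diagonal cost over the 2^n bitstrings; minimum at index 3 ('011').
costs = np.array([5, 3, 7, 0, 6, 2, 8, 4], dtype=float)
H_problem = np.diag(costs)

# Interpolate H(s) = (1-s) H_driver + s H_problem, as in the adiabatic protocol,
# and inspect the instantaneous ground state at a few points.
for s in (0.0, 0.5, 1.0):
    H = (1 - s) * H_driver + s * H_problem
    vals, vecs = np.linalg.eigh(H)
    gs = vecs[:, 0]  # ground state (eigh sorts eigenvalues ascending)
    print(f"s={s}: most likely bitstring = {int(np.argmax(gs ** 2)):03b}")
```

At s=1 the ground state is exactly the minimum-cost bitstring; the hard part in practice is sweeping s slowly enough (relative to the spectral gap) that the system stays in the ground state.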
Your explanation of 'true' QC seems misleading as well. We can (and have) constructed the quantum gates necessary to build a computer with the classical gate structure. It is true that in doing so, if we make the input a superposition of states, then the output will be a superposition of the states that would result from passing each individual state into a classical computer. However, this is not sufficient for us to actually take advantage of the massive parallelization.
Let's assume you have an N-qubit input, which is a perfect superposition of all 2^N possible states, and as many internal qubits as you want. You have some boolean function, f(n), that is true for exactly one number (and that number fits in N bits). If you program your QC to apply f(n) to the superposition, then afterwards you will have a superposition of 2^N - 1 zeros and one 1, all with equal probability. When you go to measure it, you will read either 0 or 1. Not very useful.
You can fix this problem by using N+1 qubits and returning the input number along with the answer. Now you have one state where the 'answer' qubit is true. When you go to measure the system, you have a 1/2^N probability of seeing that state. If you see another state, then you have collapsed the superposition and need to redo the computation. This is equivalent to just randomly picking an input in the first place.
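To put rough numbers on the point above, here's a quick classical sketch of those measurement statistics (plain numpy; N=10 and the marked input are arbitrary choices):

```python
import numpy as np

N = 10                 # number of input qubits
dim = 2 ** N
marked = 42            # the single input for which f(n) is true (arbitrary)

# Uniform superposition over all 2^N inputs: amplitude 1/sqrt(2^N) each.
amps = np.full(dim, 1 / np.sqrt(dim))

# Applying f "in superposition" tags the marked basis state but does not change
# any amplitude's magnitude, so the measurement distribution stays uniform.
probs = amps ** 2
print(probs[marked])   # 1/2^N = 0.0009765625 -- no better than random guessing
```

So merely evaluating f on a superposition buys you nothing over picking a random input, which is exactly the parent's point.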
Quantum mechanics does not allow a direct way of taking a system and making one state more probable based on its value (i.e. guaranteeing that the one you measure is the one marked 'true'). However, there are some quantum mechanical mechanisms that can provide provable improvement.
One such algorithm is Shor's, which allows efficient factorization of integers.
In my opinion, there is a far more interesting result I have seen, which is a general algorithm that can take any problem with a solution space of 2^N and solve it in 2^(N/2).
Sadly, it has been a while since I dabbled in quantum computing, so I do not recall the details, and I very well might have gotten stuff wrong too.
I am however becoming convinced that P != NP is a physical law of the universe.
EDIT: I just read further down the thread. I think the algorithm I was referring to was Grover's search algorithm, which was mentioned by zaptheimpaler.
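For anyone curious, Grover's amplification is easy to simulate classically with an explicit statevector (toy sketch; N=8 and the marked index are arbitrary choices). The sign-flip oracle plus "inversion about the mean" drives the marked state's probability close to 1 after roughly (pi/4)*2^(N/2) iterations, which is where the 2^(N/2) figure comes from:

```python
import numpy as np

N = 8
dim = 2 ** N
marked = 77  # arbitrary marked item

# Start in the uniform superposition.
state = np.full(dim, 1 / np.sqrt(dim))

# Optimal number of Grover iterations: ~ (pi/4) * sqrt(2^N).
iters = int(np.floor(np.pi / 4 * np.sqrt(dim)))

for _ in range(iters):
    state[marked] *= -1                # oracle: flip the sign of the marked state
    state = 2 * state.mean() - state   # diffusion: inversion about the mean

print(state[marked] ** 2)  # probability of measuring the marked item, close to 1
```

Note this classical simulation still costs O(2^N) work per step; the point is only that on a real quantum device each iteration is one oracle call, so the total is O(2^(N/2)) calls.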
The folks at D-Wave have published some interesting & perfectly legitimate papers on techniques for optimizing parallel tempering simulations (including on non-quantum computers, but especially helpful if you have a quantum computer handy for quantum PT, which is how they vaguely describe their actual setup). That suggests to me that at the very least they have some legit people who are good at solving stochastic optimization problems & are not fraudulent.
This is false. I guarantee you, scientists would be very interested if they actually demonstrated entanglement. For example, in the slides of the linked talk, the presenter goes over a very general measure of entanglement which allows you to construct, after the fact, a "witness" of entanglement. It is respectable theory and a credible last-resort approach, but there are no numbers. His latest preprint[1] doesn't have a clear claim either. The New Scientist section refers to a conference 2 weeks ago, without published abstracts, proceedings, or video. I'll believe it when I see it (there's no indication Aram's quote is in context, but maybe he has seen it).
The first thing any QC lab does when they get their second qubit is make an entangled state and measure a Bell inequality violation. D-Wave's lack of interest in this, or in running Shor[2], is very telling. Everybody wants to run Shor; it's where the money is.
I remember many people, especially Scott Aaronson, questioned D-Wave back when I was in school. Has anything changed recently regarding the community's opinion?
Scott's visit to D-Wave shifted his opinion. He obviously can't come out and say he changed his mind, as a professional skeptic, but this comes about as close as he can to saying so.
This seems to me a very inaccurate and unfair characterization, especially the insinuation that he's dishonestly suppressing his newfound belief in D-Wave because he's a "professional skeptic".
Aaronson is clearly still unconvinced that D-Wave's machine actually does anything "sufficiently quantum" to get any speedup over what you can do classically. He says:
D-Wave does have something today that’s more computationally-useful than a roast-beef sandwich; the question is “merely” whether it’s ever more useful than your laptop.
and:
It remains true, as I’ve reiterated here for years, that we have no direct evidence that quantum coherence is playing a role in the observed speedup, or indeed that entanglement between qubits is ever present in the system.
On the other hand, he is willing to say that D-Wave have indeed made a machine that does Something Interesting, namely giving approximate solutions to a particular mathematical problem.
So it looks to me as if he (1) has changed his mind about the things he's been given actual evidence for, (2) is perfectly willing to say so despite his role "as a professional skeptic", and (3) hasn't so far been convinced that D-Wave have an actual useful quantum computer doing quantum-computer things that you couldn't do just as well with a classical machine. Which is approximately the exact opposite of what you say.
>especially the insinuation that he's dishonestly suppressing his newfound belief in D-Wave because he's a "professional skeptic".
That wasn't my intent; rather, I intended to convey that he is emphasizing scientific skepticism, which is greatly warranted in this case, to paraphrase his own terms. I don't think that is the opposite.
There was experimental confirmation of quantum entanglement[1].
Of course, this being science, there will be more tests in the future, and hopefully they will shed some light on whether it was a hoax or not.
However, people need to understand this isn't a quantum gate computer (i.e. a quantum version of modern PCs with gates), but an adiabatic quantum computer, which is designed to solve a particular class of problems (that's what I gather from that article).
Putting the cost of the machine aside, would a computer like this be able to mine coins at high speed using relatively little electricity? Seems like it could crush the market with inflation.
It's an interesting technical question: analogous to the quantum Fourier transform, is there an algorithm only quantum machines can run that computes sha-256 hashes faster? I don't have the answer. (I know sha-256 is O(N) normally.)
As for bitcoin, though, I don't think there's too much to worry about. If a quantum algorithm grants a complexity speedup (or some deal-breaking cryptographic weakness is discovered in sha-256, for that matter), I imagine the network could agree to update to a new protocol using a different hash, if that was the proper solution. If a quantum computer just gave constant-factor speedups, as ASICs are currently doing relative to GPUs and FPGAs, then I suspect that by the time a quantum computer is made that can efficiently mine bitcoins, enough can be made at once that, just as we're seeing with ASICs, quite a lot of miners will grab them (avoiding a 51% issue) and the difficulty will simply rise.
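As a point of reference for what miners are brute-forcing classically, here's a toy partial-preimage search (hashlib is standard library; the prefix and difficulty are made up). This loop is exactly the kind of unstructured search a Grover-style algorithm would, at best, speed up quadratically:

```python
import hashlib

def mine(prefix: bytes, difficulty: int) -> int:
    # Find a nonce whose sha-256 digest starts with `difficulty` zero hex digits,
    # a simplified stand-in for bitcoin's proof-of-work target check.
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(prefix + str(nonce).encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

# Each extra hex digit of difficulty multiplies the expected work by 16.
print(mine(b"block-header:", 3))
```

Real mining difficulty is tuned so the network as a whole needs ~10 minutes per block, so any uniform speedup just gets absorbed by a difficulty adjustment, as the comment above argues.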
I don't think it will work like that. As I repeated ad nauseam above, this is an adiabatic quantum computer, not a gate quantum computer. Here is a quote:
> Computers using D-Wave's devices will always differ from gate-based quantum computers in one key way: D-Wave's devices do not lead to universal computers that can be programmed with any algorithm, as an ordinary computer can. Instead, says D-Wave's Colin Williams, their chips focus on solving a particular class of optimisation problems. "That's such an important class that commercially it is a much better strategy than trying to make a universal machine."
Perhaps the problem of bitcoins can be reduced to hill/valley optimization, but I don't think that's how bitcoin works.
> It could be possible, for example, to tell instantly how the millions of lines of software running a network of satellites would react to a solar burst or a pulse from a nuclear explosion
Can anyone explain in understandable terms how a quantum computer would help solve this type of problem? Or is the article's author just dumping random affirmations to hype things up?
There's some truth to it. Despite how revolutionary quantum computing could be, there still aren't a lot of quantum algorithms that are asymptotically faster than their classical counterparts. The two important ones are Shor's factorization, which can factor an integer n in O((log n)^3) time, and Grover's search, which can search an unsorted collection of n items in O(sqrt(n)) time.
Grover's search is pretty amazing though - a lot of problems can be reduced to some form of search over the solution space, and sqrt(n) search will let us say "fuck it, let's just brute-force it" to much bigger search spaces. I could see that being applied to the example you mentioned - searching through a list of possibilities very quickly.
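Some back-of-envelope arithmetic on what that quadratic speedup buys (the ~(pi/4)*2^(b/2) oracle-call count for Grover is the standard estimate; the bit-sizes here are arbitrary):

```python
import math

# Classical brute force over a b-bit search space needs ~2^b evaluations in the
# worst case; Grover needs about (pi/4) * 2^(b/2) oracle calls.
for b in (32, 64, 128):
    grover = (math.pi / 4) * 2 ** (b / 2)
    print(f"{b}-bit space: classical ~2^{b} evals, Grover ~{grover:.3g} calls")
```

The takeaway: Grover effectively halves the exponent, so a 128-bit search starts to look like a (barely) thinkable 2^64-ish effort rather than a hopeless 2^128 one - which is also why symmetric key sizes are usually doubled to account for it.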
https://en.wikipedia.org/wiki/Shors_algorithm
(and I will be cheering, because they're based in Burnaby, where I live.)