Monday, December 21, 2015

More off Topic on Quantum Computing

The last post on quantum computers got more hits than all my previous posts combined! That shows pretty starkly that people are not interested in commodities. So I'll do one more post on the subject, then go back to what I actually know best.

I read through Google's paper on the D-Wave benchmarks (http://arxiv.org/abs/1512.02206). The results really do seem good. Compared to a classical computer running the same type of algorithm, the quantum machine was many orders of magnitude faster. Probably more important, its running time grew more slowly than the classical machine's as the problem got larger. So far, so good.

Now, the benchmark problem can be solved at about the same speed with special-purpose classical algorithms aimed at that problem alone. It could be solved even faster with special-purpose classical hardware. This type of attack, building the hardware and software for one purpose alone, was pioneered by Alan Turing and his colleagues, who built some of the earliest computing machines in order to break Nazi codes in WWII. But that's not really a fair comparison. Longer term, it's almost always cheaper to solve problems with a general-purpose machine.

It's also important to know that D-Wave is something of an outlier in the world of quantum computer research. Most of the research is going into quantum gate (or circuit) machines. These are truly general purpose, with qubits that are a rough analogue of traditional computer bits. IBM, Google itself, and many academic institutions are doing this research. Depending on whom you ask, practical machines are between eight and twenty years away. A very few people think they will never be practical. There has been a noticeable increase in research funding in the last year or so, which may be telling you something.

A number of the academic researchers dislike the D-Wave approach (also called the "adiabatic" approach). They have two issues. First, there is theory that limits the speedup that adiabatic machines can achieve. This has to do with the speed at which the quantum state can be changed without ruining the whole computation. Second, D-Wave made the design decision to emphasize the quantity of qubits over their quality. Some think D-Wave will have to reengineer its qubits from the ground up to make the machine practical.
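For readers who want something concrete: the problems an adiabatic machine like D-Wave's is built for can be written as finding the lowest-energy configuration of an Ising model. Here is a toy sketch of *classical* simulated annealing on a tiny Ising instance, my own illustration of the problem class, not D-Wave's algorithm; the quantum machine uses quantum fluctuations where this code uses random thermal flips, and the couplings and fields below are made up for the example.

```python
import math
import random

def ising_energy(spins, J, h):
    """Ising energy: E = -sum_ij J_ij * s_i * s_j - sum_i h_i * s_i."""
    e = -sum(h[i] * s for i, s in enumerate(spins))
    for (i, j), coupling in J.items():
        e -= coupling * spins[i] * spins[j]
    return e

def simulated_anneal(J, h, n, steps=20000, t_start=2.0, t_end=0.01):
    """Classical annealing on n spins: slowly lower the 'temperature'
    while accepting random spin flips, a rough classical stand-in for
    what an adiabatic machine does with quantum fluctuations."""
    random.seed(0)
    spins = [random.choice([-1, 1]) for _ in range(n)]
    energy = ising_energy(spins, J, h)
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)  # geometric cooling
        i = random.randrange(n)
        spins[i] = -spins[i]  # propose flipping one spin
        new_energy = ising_energy(spins, J, h)
        if new_energy <= energy or random.random() < math.exp((energy - new_energy) / t):
            energy = new_energy           # accept the flip
        else:
            spins[i] = -spins[i]          # reject: flip it back
    return spins, energy

# Tiny example: 3 spins in a chain; ferromagnetic couplings favor alignment
# and a field on spin 0 nudges everything toward +1.
J = {(0, 1): 1.0, (1, 2): 1.0}
h = [1.0, 0.0, 0.0]
best, e = simulated_anneal(J, h, 3)
print(best, e)  # should settle into all-up [1, 1, 1] with energy -3.0
```

The theoretical objection in the paragraph above maps onto the cooling schedule here: cool (or, for a quantum annealer, change the quantum state) too fast and you freeze into a wrong answer; cool slowly enough and the run time can blow up.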

It's hard to know what to think about this. However, I note that some of these same critics were pooh-poohing the idea that D-Wave's machine exhibited any quantum effects at all. It is now clear that it does. Also, I seem to remember the same type of objections when computers first became widely useful (yes, I am that old). They argued that IBM was sacrificing quality of design for size and cheapness. This was especially true for software. Guess who won?

Math geeks only: This also reminds me of the solving of the Poincaré Conjecture a few years back. The mainstream math community had a program in place that was working to solve it along established lines. Along came Grigori Perelman, an expert in Alexandrov spaces, something of a mathematical backwater. But it was just what was needed to solve the problem!

OK, can one make a buck on any of this stuff? In the last post I mentioned a back door to buying D-Wave. That's probably still the cleanest way. However, Google is quite active in both quantum gate and adiabatic hardware, so that would be an alternate way. IBM appears to be concentrating on gate machines, so the payoff may be further down the road. Microsoft is working on an even more speculative approach, using the topological properties of certain quasiparticles that exist only in two-dimensional systems to provide an ultra-stable qubit.

There are also a fair number of private companies who are doing research in the area, particularly in software that could run on future machines. These might be interesting for angel investors who understand what they are getting into and can afford to lose the entire investment. The hope is that the patents will eventually be bought by some of the big boys once a useful gate machine is developed.

OK, that's all on this. Next post: back to commodities.

