What started me on this whole complexity theory series was a question in

the comments about the difference between quantum computers and classical

computers. Taking the broadest possible view, in theory, a quantum computer

is a kind of non-deterministic machine – so in the best possible case,

a quantum machine can solve NP-complete problems in polynomial time. The set

of things computable on a quantum machine is not different from the set of

things computable on a classical machine – but the things that are *tractable* (solvable in a reasonable amount of time) on a quantum

computer may be different.

The current proposed designs for quantum computers fall well short of being general non-deterministic machines – so the set of problems that are

tractable on a quantum automaton (QA) may not be as large as the set of problems tractable on a general non-deterministic automaton (NDA).

To try to understand what we can do with a QA, there’s a complexity class

BQP – bounded-error quantum polynomial time – to represent the things

computable in polynomial time using a quantum computer. BQP is basically the equivalent of P for conventional computers – it is part of the nature of the quantum machine that its programs are *always* probabilistic.

To really understand what BQP means, we need to take a moment to look at what a quantum computer is, in contrast to a conventional computer. The canonical example of a conventional computer for the purposes of computation theory is the Turing machine. To make the contrast between the quantum automaton and the Turing machine clearer, we’ll use a restricted definition of the Turing machine called a binary Turing machine. (The binary Turing machine is equivalent in power to the general Turing machine; in fact, some people use this restricted definition as the canonical definition of a TM.)

A binary Turing machine consists of:

- A *tape*, which is an unbounded sequence of numbered cells, each of which may contain either a one or a zero.
- A *tape head*, which is positioned over some tape cell at any point in time, and in any computation step can read the value of a cell, write a new value into the cell, and move one step in either direction.
- A *state* consisting of a finite set of binary registers r_{1},…,r_{n}.
- A *state transition function* λ : (r_{1}…r_{n}) × (0|1) → (r’_{1}…r’_{n}) × (0|1) × (left|right), which takes the current state and the symbol under the tape head as input, and produces a new state, a symbol to be written to the tape cell, and a direction to move the head.

A computation starts with some input written onto the tape, with the tape head positioned at cell zero. Each step, it invokes the transition function,

and updates the state, tape cell value, and head position based on the result. It continues to do this until it enters the state where all registers contain the value 0, at which point it halts. When the machine halts, the *result* of the computation is the value on the tape.
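To make the definition concrete, here’s a minimal sketch of a binary Turing machine simulator in Python. All of the names (`run_binary_tm`, the tuple encoding of the register state, the example machine `flip3`) are my own illustrative choices, not part of any standard formulation:

```python
from collections import defaultdict

def run_binary_tm(transition, input_bits, n_registers=1, max_steps=10_000):
    """Simulate a binary Turing machine.

    transition maps (state, symbol) -> (new_state, write_symbol, move),
    where state is a tuple of register bits, symbol is 0 or 1, and
    move is -1 (left) or +1 (right). The machine starts with all
    registers set to 1 and halts when every register is 0."""
    tape = defaultdict(int, enumerate(input_bits))  # unbounded tape of 0/1 cells
    head = 0
    state = (1,) * n_registers
    for _ in range(max_steps):
        if all(r == 0 for r in state):   # halting condition: all registers zero
            break
        state, write, move = transition[(state, tape[head])]
        tape[head] = write
        head += move
    return [tape[i] for i in range(min(tape, default=0), max(tape, default=0) + 1)]

# A 2-register machine that inverts exactly three cells, then halts.
# The registers act as a step counter: (1,1) -> (1,0) -> (0,1) -> (0,0).
flip3 = {}
for b in (0, 1):
    flip3[((1, 1), b)] = ((1, 0), 1 - b, +1)
    flip3[((1, 0), b)] = ((0, 1), 1 - b, +1)
    flip3[((0, 1), b)] = ((0, 0), 1 - b, +1)

print(run_binary_tm(flip3, [1, 0, 1], n_registers=2))  # → [0, 1, 0]
```

The result of the computation is whatever is left on the tape when the all-zero register state is reached, just as described above.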

A basic quantum machine is very similar – it also has a tape, a register

set, and a transition function – but they’re all probabilistic. Instead of a

quantum register/tape cell containing a specific value of one or zero, it contains a probability function describing the quantum state of the register – effectively specifying the probability of seeing a *one* when the value of the tape cell/register is *observed*; and instead of the transition function specifying a transition from a fixed state to a fixed state, it specifies how the *probability function* for the registers/tape cell varies.

The result is something *similar* to BPP, but not quite the same. In BPP, you get to flip a coin to choose a path. But once you’ve chosen a path, you’re committed to that path. If that path turns out to be a dead end, there’s no way to get back the time you spent on the dead end. You’re committed to it, and you’ve already paid for it. In contrast, on a quantum machine, everything is happening in a sort of probability haze – until the end of the computation when you observe the result, it’s not committed to a *single* path. It doesn’t have to set the value of a register to zero; it can effectively just change the probability function associated with that register; if a computation is a dead end, its contribution to the probability function will *fade out* of the quantum state of the register.
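That “fade out” can be sketched numerically. Strictly speaking, a quantum state carries *amplitudes* whose squared magnitudes are probabilities, and it’s the amplitudes that can cancel; the toy example below (my own names, not any real quantum-programming API) holds a single quantum bit as a pair of amplitudes and applies two Hadamard steps, so that the one-valued “path” cancels itself entirely:

```python
import math

# A single quantum bit, held as a pair of amplitudes (amp0, amp1).
# The probability of observing a one is |amp1|^2 -- and, unlike
# classical probabilities, amplitudes can cancel each other out.
def hadamard(state):
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def prob_one(state):
    return abs(state[1]) ** 2

q = (1.0, 0.0)        # definitely a zero
q = hadamard(q)       # now 50/50 -- both "paths" coexist in the state
q = hadamard(q)       # the paths interfere: the one-valued path fades out
print(round(prob_one(q), 10))  # → 0.0
```

A BPP machine that flipped a fair coin twice could never get back to certainty this way; the quantum machine can, because the dead-end path’s contribution is subtracted out rather than committed to.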

But it’s not all good news for the quantum machine. Unfortunately, here’s where the direct correspondence between the quantum machine and the classical Turing machine breaks down. In general, a quantum machine is effectively *linear*. You don’t get to do something like a conditional in a quantum computation: at least for the technologies we’re looking at so far, they execute a *linear* set of instructions. So you can’t branch. If you want to explore multiple paths, you need to encode the paths together in the probability functions. So effectively, while the quantum transition function does get to alter the state and the tape cell, the machine does *not* have a single fixed transition function: each transition function has a specific successor, and no matter what happens to the tape and the registers,

the transition function has to go through a specific sequence of values for the computation. So instead of a single transition function λ, the quantum machine has a sequence of functions {λ_{1},…,λ_{n}}, which must all be used in sequence during the computation. So, effectively, you can do small local

branches – that is, choices within the duration of a single transition function; but you can’t make choices the way you could with a non-quantum machine.
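This linear structure can be illustrated with a toy single-bit quantum state: the program is literally a fixed list of transformations {λ_{1},…,λ_{n}}, applied unconditionally in order. (Again, a sketch under my own naming, not any real quantum-programming API.)

```python
import math

def hadamard(state):
    """Mix the two amplitudes of a single quantum bit."""
    a0, a1 = state
    s = 1 / math.sqrt(2)
    return (s * (a0 + a1), s * (a0 - a1))

def phase_flip(state):
    """Negate the amplitude of the one-valued path."""
    return (state[0], -state[1])

# The whole computation is a fixed sequence of steps -- lambda_1, ..., lambda_n --
# applied unconditionally in order. Nothing in between can branch on the
# (unobserved) state; choices only happen inside each step's arithmetic.
program = [hadamard, phase_flip, hadamard]

state = (1.0, 0.0)    # start as a definite zero
for step in program:
    state = step(state)
print(round(abs(state[1]) ** 2, 10))  # → 1.0 (the sequence flips the bit)
```

Note that the control flow of the loop never inspects `state`; all of the interesting behavior is encoded in how the fixed sequence of steps reshapes the amplitudes.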

So what’s the difference between BPP and BQP? BQP allows the *limited* exploration of multiple paths, instead of requiring an

a priori commitment to one of them.

In terms of the complexity hierarchy, P ⊆ BPP ⊆ BQP ⊆ PP. So BQP is *at least* as powerful as BPP, and possibly better – but it’s bounded from above by PP. And in fact, PP is sufficiently different from BQP that being able to get the solution to a BQP problem in *constant* time doesn’t grant *any* additional capability to PP. (In fact, according to Wikipedia, being able to solve BQP problems in *zero* time doesn’t give any additional capability to PP; but I haven’t seen the proof of that.) So from what we know, BQP is very close to BPP; if P ≠ NP (as many of us expect), then BQP may be slightly larger than BPP, but no larger than PP.

Dan Vanderkam:

“In the best possible case, a quantum machine can solve NP-complete problems in polynomial time.” => At best, it could actually do anything in NP in polynomial time, right? There’s no need for the complete part, that just makes things easier.

Blake Stacey:

Fun reading on the same topic: “Computational Complexity and the Anthropic Principle” by Scott Aaronson.

Also this:

Nice!

Jonathan Vos Post:

My mentor and coauthor Richard P. Feynman was not trying to analyze quantum computing when he invented Feynman integrals. Chronologically, he FIRST invented QED, and then, years later, became the great-grandfather of quantum computing.

Dr. George Hockney, formerly at Fermilab, and then researching quantum computing at JPL, contends that Feynman’s specific approach to quantum computing has proved to be wrong, but that doesn’t matter, as Deutsch and other grandfathers of quantum computing began their research because of Feynman’s pioneering.

In the same way, Feynman is the great-grandfather of nanotechnology. It doesn’t matter if the approach he describes in “There’s Plenty of Room at the Bottom” works; what matters is that about half of the grandfathers of nanotechnology began their research because of Feynman’s pioneering.

Mark C. Chu-Carroll:

Dan:

The reason I said NP-complete was because I thought it was clearer than something like “problems in NP for which we don’t know polynomial time solutions.” NP-complete is a set of hard problems for which we don’t know P-time solutions, and which most people who read this article are familiar with. So it was just an easy way of being a tad more concise.

nikita:

Isn’t “binary Turing machine” usually called “Post machine”? Also, the phrase “Unfortunately, here’s where the direct correspondence between the quantum machine and the classical turing machine.” seems to miss a verb at the end.

Blake Stacey:

Nikita:

No, no, you’re supposed to phrase that correction like this: This sentence no verb. (-;

Well, OK, unless you count the “‘s”, which naturally I won’t, so there.

Blake Stacey:

Jonathan Vos Post said, in part:

I think we’re all tolerably familiar with the chronology of QED (or at least those of us who know enough about it to even parse Aaronson’s statement). Hence the joke Aaronson makes: “From my perspective, …”

Blake Stacey:

But wait, there’s more!

This sentence needs to be corrected:

Somebody needs to multiply all those &lamba;s through by d.

Cat lover:

“Somebody needs to multiply all those &lamba;s through by d.” You’re assuming that d commutes with a;. Actually, you need to multiply by (a;)⁻¹d. (Did I mention I was a math nerd?)

Mark C. Chu-Carroll:

nikita:

The Post machine is slightly different. The binary machine is sometimes called a Post-Turing machine, but I prefer “binary Turing”, because to me, “Post-Turing” sounds like we’re talking about some kind of post-modern model of computation. 🙂

Thanks for the corrections, I’ve fixed the ones you pointed out.

Mark C. Chu-Carroll:

Blake:

You know what I really hate? When people tell the kind of jokes that make *me* laugh, but when I tell them to anyone else, I just get a confused blank stare. 🙂

John Armstrong:

Doesn’t this assume C-T? I still don’t see *any* proof that C-T really holds, beyond that we haven’t yet found anything that goes beyond it, and definitely no proof that *all* quantum algorithms are implementable on a classical Turing machine.

Mark C. Chu-Carroll:

John:

Yes, this assumes that the Church-Turing thesis is true. But pretty much *all* of the work in computational complexity, and most of the work in theoretical computer science, is predicated on Church-Turing, so we don’t tend to qualify every statement with “Assuming C-T is true”; instead, we specifically state when we *aren’t* assuming C-T.

Michael Ralston:

Also, simply assuming that CT isn’t true in and of itself doesn’t tell us all that much.

We still have to come up with models to describe what we’re doing, and most of them are, well … uninteresting, IMO.

Coin:

MarkCC, thanks so much! This is incredibly helpful and clears up a bunch of stuff I’ve been trying to work out for a long time.

I do have a couple unrelated random questions/comments:

1. You actually might want to cut&paste some chunk of this article into the wikipedia article on Quantum Turing Machine, as that article is like five lines long and gives no specifics.

2. I’m trying to understand clearly this idea of the quantum machine you describe being “linear”. While it seems clear what this would mean in the context of the real-world qubit-based systems we know how to build, I’m wondering what it means in the context of the quantum Turing machine model you outline here. As I understand the model you’ve described, the tape cells each have a probability of being 1 or 0, and the operations on cells change the probability distributions of those cells. But what about the tape head? That is, is there a tape head, and is it allowed to be “in more than one place at once” probabilistically? Say, if one entry in the state transition function says “go left 50% of the time, right 50% of the time”, then when this executes wouldn’t the hypothetical path where the head moves left and the hypothetical path where the head moves right be roughly analogous to different code branches? Or are the symbols addressed in some other way than a tape head, or is the head not allowed to “smear” the way the cells can, or what? Or is the idea that when you say the execution is “linear”, you just mean that branching effectively ceases to have any effect, since whenever you branch all the branches have to be taken at once?

What exactly is the type of the quantum turing machine’s analogue of the state transition function?

3. On a totally different note, in relation to the tangent about Feynman: When I was in the bookstore the other day I ran across a book called “The Feynman Lectures on Computation”. Does anyone know about this book? If anybody’s heard of it, is it worth reading, and would it still be informative or useful to someone who already has a working grasp on the theory of computation? Is the information on quantum computing still kind of relevant despite being from 1984?

4. Post Machine would be a great name for a band.

Mark C. Chu-Carroll:

Coin:

The tape head in the quantum Turing machine has a fixed position. Binary values – like the value in a tape cell, or a register bit – are in quantum flux, and described by probability functions. The tape head is *not* – that should probably have been made clearer in the post.

John Armstrong:

Okay, CT is a default assumption, but it was formulated well before quantum computation ever came on the scene. I don’t like unproven assumptions (the perils of being a mathematician), and CT has never really sat well with me. I’m actually hoping that CT turns out to be true for classical computation, and that quantum computation will go beyond it.

Markk:

Re: quantum computing beyond the CT thesis.

The Church-Turing Thesis is about computability, not efficiency. There is no Quantum model that we have that does not fall under the same conditions as classical computing in regard to Church-Turing. That is, if we can compute something on any quantum computer we know about, we could also compute it on a classical computer given enough resources. If you can come up with a NEW quantum computing model that cannot be emulated (even in beyond exponential time) on a regular Turing machine, then you have got it made and people will remember your name for a long time. So far no one knows about it.

What this means is that whether the Church-Turing Thesis is true or not doesn’t affect quantum computing any more than classical – since we can simulate quantum computers (as we know them now) on classical computers. If CT is false then there must be a totally new model of computing that we cannot map to either current-model quantum computers or current-model classical computers.

Blake Stacey:

The *Feynman Lectures on Computation* are an interesting read, despite hailing from 1984. He lays out finite state machines, introduces Turing machines, builds a few of them and then constructs a Universal Turing Machine before covering the halting problem. If you’re familiar with all of that already, the more valuable part of the book would be the stuff relating information to thermodynamics. He has a very good section on Maxwell’s Demon, for example.

His take on quantum computing was, roughly speaking, “Can we still do computations when our switches are the size of atoms?” Feynman develops a theory which indicates that we could do the same sort of computations on the quantum scale that we do on the macro-scale. Being pre-Shor’s Algorithm and all that jazz, he doesn’t take particular advantage of the new capabilities quantum computing has to offer.

All in all, I’d say that whether the book is worthwhile to buy depends on which ones you have in your library already.