The Elegance of Uncertainty

I was recently reading yet another botched explanation of Heisenberg’s uncertainty principle, and it ticked me off. It wasn’t a particularly interesting one, so I’m not going to disassemble it in detail. What it did was the usual crackpot quantum dance: Heisenberg said that quantum means observers affect the universe, therefore our thoughts can control the universe. Blah blah blah.

It’s not worth getting into the cranky details. But it inspired me to actually take some time and try to explain what uncertainty really means. Heisenberg’s uncertainty principle is fascinating. It’s an extremely simple concept, and yet when you realize what it means, it’s the most mind-blowingly strange thing that you’ve ever heard.

One of the beautiful things about it is that you can take the math of uncertainty and reduce it to one simple equation. It says that given any object or particle, the following equation is always true:

$\sigma_x \cdot \sigma_p \ge \hbar$

Where:

  • $\sigma_x$ is the amount of uncertainty about the position of the particle;
  • $\sigma_p$ is the uncertainty about the momentum of the particle; and
  • $\hbar$ is a fundamental constant, called the reduced Planck’s constant, which is roughly $1.05457173 \times 10^{-34}\,\frac{\mathrm{m}^2\,\mathrm{kg}}{\mathrm{s}}$.

That last constant deserves a bit of extra explanation. Planck’s constant describes the fundamental granularity of the universe. We perceive the world as being smooth. When we look at the distance between two objects, we can divide it in half, and in half again, and in half again. It seems like we should be able to do that forever. Mathematically we can, but physically we can’t! Eventually, we get to a point where there is no way to subdivide distance anymore. We hit the grain-size of the universe. The same goes for time: we can look at what happens in a second, or a millisecond, or a nanosecond. But eventually, it gets down to a point where you can’t divide time anymore! Planck’s constant essentially defines that smallest unit of time or space.

Back to that beautiful equation: what uncertainty says is that the product of the uncertainty about the position of a particle and the uncertainty about the momentum of a particle must be at least a certain minimum.
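
To get a feel for the size of that minimum, here’s a rough back-of-the-envelope sketch. The electron and baseball numbers are just illustrative choices of mine, and I’m using the bound in the form $\sigma_x \cdot \sigma_p \ge \hbar$:

    # Rough illustration: given an uncertainty in momentum, what's the smallest
    # uncertainty in position allowed by sigma_x * sigma_p >= hbar?
    # The masses, speeds, and the 1% momentum spread are illustrative guesses.
    hbar = 1.05457173e-34  # reduced Planck's constant, m^2 kg / s

    def min_position_uncertainty(mass_kg, speed_m_s, momentum_spread_fraction=0.01):
        sigma_p = mass_kg * speed_m_s * momentum_spread_fraction
        return hbar / sigma_p  # smallest sigma_x compatible with the bound

    # An electron moving at a typical atomic-scale speed (~10^6 m/s):
    print(min_position_uncertainty(9.11e-31, 1e6))   # ~1e-8 m: bigger than an atom!
    # A baseball (~0.145 kg) thrown at ~40 m/s:
    print(min_position_uncertainty(0.145, 40))       # ~2e-33 m: utterly unnoticeable

The bound is exactly the same for both; it’s just that $\hbar$ is so absurdly tiny that for anything macroscopic the required fuzziness is unimaginably small.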

Here’s where people go wrong. They take that to mean that our ability to measure the position and momentum of a particle is uncertain – that the problem is in the process of measurement. But no: it’s talking about a fundamental uncertainty. This is what makes it an incredibly crazy idea. It’s not just talking about our inability to measure something: it’s talking about the fundamental true uncertainty of the particle in the universe because of the quantum structure of the universe.

Let’s talk about an example. Look out the window. See the sunlight? It’s produced by fusion in the sun. But fusion should be impossible. Without uncertainty, the sun could not exist. We could not exist.

Why should it be impossible for fusion to happen in the sun? Because it’s nowhere near dense or hot enough.

There are two forces that you need to consider in the process of nuclear fusion. There’s the electromagnetic force, and there’s the strong nuclear force.

The electromagnetic force, we’re all familiar with. Like charges repel, opposite charges attract. The nucleus of an atom has a positive charge – so nuclei repel each other.

The nuclear force we’re less familiar with. The protons in a nucleus repel each other – they’ve still got like charges! But there’s another force – the strong nuclear force – that holds the nucleus together. The strong nuclear force is incredibly strong at extremely short distances, but it diminishes much, much faster than electromagnetism. So if you can get a proton close enough to the nucleus of an atom for the strong force to outweigh the electromagnetic, then that proton will stick to the nucleus, and you’ve got fusion!

The problem with fusion is that it takes a lot of energy to get two hydrogen nuclei close enough to each other for that strong force to kick in. In fact, it turns out that hydrogen nuclei in the sun are nowhere close to energetic enough to overcome the electromagnetic repulsion – they fall short by multiple orders of magnitude!
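
To see how big that gap is, here’s a rough order-of-magnitude sketch comparing the thermal energy of a proton in the sun’s core with the electrostatic barrier it would classically have to climb. The numbers – a core temperature of about 15 million degrees, and a strong-force range of about a femtometre – are standard rough figures, used only for scale:

    # Back-of-the-envelope: thermal energy of a solar-core proton vs. the Coulomb
    # barrier between two protons at roughly the range of the strong force.
    # All values are rough textbook figures, used purely for scale.
    k_B = 1.380649e-23       # Boltzmann constant, J/K
    e = 1.602176634e-19      # elementary charge, C
    k_e = 8.9875517873e9     # Coulomb constant, N m^2 / C^2

    T_core = 1.5e7           # solar core temperature, ~15 million K
    r_strong = 1e-15         # rough range of the strong force, ~1 femtometre

    thermal_energy = 1.5 * k_B * T_core       # average kinetic energy, ~2 keV
    coulomb_barrier = k_e * e**2 / r_strong   # repulsion energy at 1 fm, ~1 MeV

    print(f"thermal energy:  {thermal_energy:.2e} J")
    print(f"coulomb barrier: {coulomb_barrier:.2e} J")
    print(f"shortfall: about a factor of {coulomb_barrier / thermal_energy:.0f}")

A typical core proton falls short of that barrier by roughly a factor of a thousand.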

But this is where uncertainty comes into play. The core of the sun is a dense soup of hydrogen atoms. Each one can’t move around very much without the atoms around it moving. That means that its momentum is very constrained – $\sigma_p$ is very small, because there’s just not much possible variation in how fast it’s moving. But the product of $\sigma_p$ and $\sigma_x$ has to be greater than $\hbar$, which means that $\sigma_x$ needs to be pretty large to compensate for the certainty about the momentum.

If $\sigma_x$ is large, that means that the particle’s position is not very constrained at all. It’s not just that we can’t tell exactly where it is – its position is fundamentally fuzzy. It doesn’t have a precise position!

That uncertainty about the position allows a strange thing to happen. The fuzziness of the position of a hydrogen nucleus is large enough that it overlaps with the nucleus of another atom – and bang, they fuse.

This is an insane idea. A hydrogen nucleus doesn’t get pushed into a collision with another hydrogen nucleus. It randomly appears in a collided state, because its position wasn’t really fixed. The two nuclei that fused didn’t move: they simply didn’t have a precise position!
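
Here’s a rough sense of the scale of that fuzziness, using the same kind of back-of-the-envelope numbers as above. Treating the thermal momentum of a core proton as a stand-in for $\sigma_p$ is a crude approximation, but it gets the order of magnitude:

    # Rough scale of the position fuzziness of a proton in the solar core,
    # taking its thermal momentum spread as a crude stand-in for sigma_p and
    # applying the bound sigma_x * sigma_p >= hbar.
    import math

    hbar = 1.05457173e-34    # reduced Planck's constant, m^2 kg / s
    k_B = 1.380649e-23       # Boltzmann constant, J/K
    m_p = 1.6726e-27         # proton mass, kg
    T_core = 1.5e7           # solar core temperature, K

    sigma_p = math.sqrt(m_p * k_B * T_core)   # rough thermal momentum spread
    sigma_x = hbar / sigma_p                  # smallest position fuzziness allowed

    print(f"sigma_x is at least ~{sigma_x:.1e} m")  # ~2e-13 m
    print(f"that's ~{sigma_x / 1e-15:.0f} times the ~1e-15 m size of a nucleus")

That fuzziness is a couple of hundred times the size of a nucleus – and roughly comparable to how close two core protons can classically get to each other – which is what makes the overlap possible.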

So where does this uncertainty come from? It’s part of the hard-to-comprehend world of quantum physics. Particles aren’t really particles. They’re waves. But they’re not really waves. They’re particles. They’re both, and they’re neither. They’re something in between, or they’re both at the same time. But they’re not the precise things that we think of. They’re inherently fuzzy, probabilistic things. That’s the source of the uncertainty: at macroscopic scales, they behave as if they’re particles. But they aren’t really. So the properties that we associate with particles just don’t work. An electron doesn’t have an exact position and velocity. It has a haze of probability space where it could be. The uncertainty equation describes that haze – the inherent uncertainty that’s caused by the real particle/wave duality of the things we call particles.

29 thoughts on “The Elegance of Uncertainty”

  1. John Armstrong

    I would say it doesn’t even have to do with wave/particle duality, but just with the wave nature of quantum particles. Once you really think about waves, the UP is all but obvious.

    Look at a sine wave, as a solution of the wave equation. It’s got a single, well-defined frequency, and frequency (basically) is momentum. If we take the Fourier transform, we get a delta function — the FT is supported at a single point, which is the frequency. But the wave extends forever in both directions, so there’s no way of containing “where” the wave is.

    On the other extreme, a wave with a single, well-defined position is a delta function at that point. Its Fourier transform is (all but obviously) a sine wave in frequency (momentum) space, so there’s no control over the frequency.

    Everywhere in the middle there’s some trade-off between how “spread out” the wave is in position space and how spread out its FT is in momentum space. And this trade-off can be measured quantitatively in terms of the second moments (statistically: variances), which is the UP.
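
    If you want to see that trade-off concretely, here’s a small numerical sketch (just an illustration in numpy, nothing rigorous): make the Gaussian packet narrower in x and its Fourier transform spreads out in frequency, while the product of the two spreads stays essentially fixed at the minimum.

        import numpy as np

        # Narrow Gaussian in x  ->  wide Fourier transform in k, and vice versa.
        # For a Gaussian (the minimum-uncertainty case) the product of the two
        # spreads sits right at the lower bound, 1/2.
        x = np.linspace(-50, 50, 4096)
        dx = x[1] - x[0]

        def spreads(width):
            psi = np.exp(-x**2 / (2 * width**2))          # Gaussian packet
            prob_x = np.abs(psi)**2 / np.sum(np.abs(psi)**2)
            sigma_x = np.sqrt(np.sum(prob_x * x**2))      # spread in position

            phi = np.fft.fft(psi)
            k = 2 * np.pi * np.fft.fftfreq(x.size, d=dx)  # angular frequencies
            prob_k = np.abs(phi)**2 / np.sum(np.abs(phi)**2)
            sigma_k = np.sqrt(np.sum(prob_k * k**2))      # spread in frequency
            return sigma_x, sigma_k

        for w in (0.5, 1.0, 2.0, 4.0):
            sx, sk = spreads(w)
            print(f"width={w}: sigma_x={sx:.3f} sigma_k={sk:.3f} product={sx*sk:.3f}")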

  2. David Furcy

    Interesting post. Thanks!

    PS: You have a typo in your equation: the inequality sign needs to be reversed.

      1. dr24hours

        That is a typo which fundamentally changes the nature of the universe. WELL DONE.

        Now: what would happen if that inequality were reversed? Just as violating the Parallel Postulate leads to awesome geometries, what happens when uncertainty is reversed? How does the universe have to change?

        Let’s get all speculative on Uncertainty’s ass.

        1. ob

It’d probably get pretty boring, because it’d mean there’s a pretty tight range on which the values are defined. Unless you make the Planck constant much, much, much bigger 😉

        2. John Armstrong

          No. Changing the Parallel Postulate is possible because it’s a *postulate*. That is, it’s taken as true without proof. The UP is a theorem, which follows from huge swaths of real and functional analysis. Changing the direction of the inequality entails upending pretty much all of that, and probably the entire universe along with it.

          1. UserGoogol

            dr24hours: Broadly speaking I’d say the answer is just indeterminate. As long as the universe is described by wavefunctions, something like the uncertainty principle has to hold. If the uncertainty principle doesn’t hold, then the universe has to be described by some nebulous “something else.” But there’s infinitely many “something elses” to choose from.

            But one rather boring answer would just be a Newtonian universe. In classical physics uncertainty is zero, so trivially it satisfies the inverted uncertainty principle.

  3. Dan Riley

    Though (unlike special relativity) there is still quite a lot of debate about exactly what the uncertainty principle does mean, since it gets tied into the measurement problem…

  4. Harald M.

    By coincidence, the important German weekly newspaper “Die ZEIT” had a small notice two weeks ago, where it challenged its readers to explain the uncertainty principle in no more than 100 words.
    Although I am only a physicist by hobby, and the mathematics of quantum physics is way beyond my capabilities, I risked taking part – and I’ll risk translating my explanation into English here. I hope it is not laughably wrong …

    Electrons and other parts of atoms do not move on distinct trajectories – this can be shown by double-slit experiments. Therefore, such particles do not have velocity or position!
    However, it is possible to force them to interact with other, larger particles. Through this, one can measure values on the larger particles that have the same effects as velocity or position (velocity, e.g., carries momentum; and position defines the direction of electrostatic forces).
    But experiments that measure the “position value” more and more precisely will always measure the “velocity value” with some imprecision; whereas experiments that can measure the “velocity value” more and more precisely will never measure the “position value” exactly.
    No matter in which way these properties are measured: some of them will be fundamentally imprecise.

    (This English version has, of course, more than 100 words …). This text is not really my invention: It is my radical compression of the first chapter of the third volume of Landau-Lifshitz’s Course of Theoretical Physics – which, after this non-mathematical introduction, of course goes on to develop the whole necessary mathematical machinery. However, what I like about it is the blunt statement that these particles (Landau actually only talks about electrons – thus, he avoids the assumption that these particles must be like other particles we experience in macroscopic physics) *do not have* velocity and position: only through interaction with classical particles do we ascribe to them properties which have, in the purely classical world that defines what is meant by measurement, effects similar to that world’s velocity and position.

    That is the reason why I use, in my text, these terms “velocity value” and “position value”: By this, I try to circumvent formulations like your “An electron doesn’t have an exact position and velocity”: One can argue that the notion of exactness is and always has been *in* the terms “position” and “velocity”: A position of (0,0,0) in some coordinate system has always – up to 1926 – made sense; it was never thought to be something like (e,e,e) with |e| < h or the like. Therefore, instead of assuming that position now (after 1926) includes “inexact positions”, which requires the qualification “exact position” for all previous “position”, I think – with Landau (hopefully) – that one could proceed the other way round: “position” remains that in-principle-exact location definer; and for the electron we need new terms – in my text, “position value” and “velocity value” – for the reverse-engineered (from interaction with classical particles) values that behave very similarly to position and velocity in many formulas and effects.

    But it's just an attempt … and, being common-sense words for a non-common-sense area of the world, it must also fail in some sense.

  5. Harald M.

    Another comment: You write “…Here’s where people go wrong. They take that to mean that our ability to measure the position and momentum of a particle is uncertain – that the problem is in the process of measurement….”

    In defence of these people, one must say that, as far as I know, Heisenberg, who *found* the “(Heisenberg) uncertainty principle”, himself introduced “language” (as lawyers would say) in his original publication that can be read like that. Only over the following years – and finally, I believe, through people who showed that a few mathematical properties of the operations involved are sufficient to produce uncertainty (von Neumann?) – did it become clear that (and how) this is a fundamental property of nature.

  6. Anon

    I don’t see how you can reconcile this statement: “Planck’s constant essentially defines that smallest unit of time or space” with the fact that it’s the *product* of uncertainty in space and momentum (velocity) that has a lower bound. In other words, sigma_x can be zero as long as sigma_p is infinite (and vice versa). I understand what you’re trying to say (I think) about Planck’s constant and its relation to the granularity of the universe, but putting it in this way, I feel, is confusing.

    1. Timotheos

      “We perceive the world as being smooth. When we look at the distance between two objects, we can divide it in half, and in half again, and in half again. It seems like we should be able to do that forever. Mathematically we can, but physically we can’t!”

      What reasons do we have for thinking that this is true? Why can’t we say that nature is continuous and divisible ad infinitum?

      (Note that I’m not suggesting we can actually divide physical things into Euclidean points, but we could approach that situation as a limit)

  7. Alex Vincent

    Could this explain possibly why we’ve not yet had great successes in commercial fusion attempts? We don’t have nearly the amount of hydrogen in a tokamak that a star has…

    1. MarkCC Post author

      In short, yes.

      The reason that it’s so hard to generate power with a tokamak is that we need to push the hydrogen plasma up to a much higher energy level than the heart of the sun. The core of the sun has a temperature around 15 million degrees Celsius. The temperature of the plasma in a tokamak needs to be around 150 million degrees Celsius – ten times hotter than the sun’s core.

      That’s the whole problem: with the technology that they’re using, it takes so much energy to make fusion happen in a small chamber like a tokamak that the amount of energy produced by fusion isn’t enough to exceed breakeven and actually get usable energy out.

      This is also why real scientists are so skeptical of cold fusion claims. It’s really, really hard to force atoms to fuse. It takes an incredible amount of energy. How are you doing it without being able to explain how you’re overcoming that immense energy barrier?

  8. Arthur Dent

    I agree. This must be one of the most mind blowing, bizarre qualities of reality. But then again, it strangely makes sense. Kind of.

  9. Peter

    Back to your typo with the sign. Wouldn’t the matter underlying the “Heisenberg certainty” be the ideal candidate for dark matter? Leaving it nearly unable to move, in the ordinary sense, within our space-time would mean very limited possibilities to interact with the rest of the world – except gravitationally.

    1. MarkCC Post author

      Problems like the nature of dark matter aren’t things that can be solved with a one-sentence plain-English explanation.

      If you had particles that didn’t have particle/wave duality, that wouldn’t necessarily mean that they wouldn’t interact with “bright” matter via electromagnetic and nuclear forces. Dark matter needs to be stuff where there’s a reason it interacts gravitationally with bright matter, but not via the other forces, and not in energetic ways that produce anything that could affect bright matter.

      Futzing with uncertainty doesn’t even begin to explain that.

  10. Chip

    So the cat is both dead and alive – we don’t know? Surely the cat is too large for Planck’s constant to matter. And the cat certainly doesn’t do much spinning, live or dead.

    I’m guessing Schrodinger didn’t have a dog. If he’d had a dog, it would sense whether the cat were live or dead… bark if still alive, and walk off bored if the cat were dead. Dogs get right to the heart of it.

    So string theory is important because cats play with string??

    1. MarkCC Post author

      Schrodinger’s cat is an illustrative allegory, and it has absolutely nothing to do with the uncertainty principle.

  11. Pingback: ScienceSeeker Editors’ Selections: November 17-23, 2013 | ScienceSeeker Blog

  12. eric

    The HUP also bears on the question of “how can something come from nothing” and quantum foam. For any (finite) patch of space, the energy cannot be exactly zero because then the deltaP of any particle contained in that space would be zero.

  13. Chris P. Cogan

    One way to think of mutual uncertainty between two types of measurement is to suppose that the “particle” has a limited amount of information for both types of measurement, so that “measuring” one (momentum, say) reduces the information available for the other. This is a bit like the trick of using one 64-bit number to store, as a single sum, the information as to the value of two measurement-distinct values. Then, measuring one of the values “fixes” it, removing that value from the sum, so that measuring the other value produces a less-precise value. I’m not, of course, saying that this way of viewing the matter is true, but it’s one way to sorta-kinda think about it.

    Another, more realistic way is to suppose that, for each pair of mutually uncertain attributes, there is really an attribute that is neither of those attributes, and that the measurement process, in effect, produces the attribute being measured, and that the degree of certainty of that measurement means that the other type of measurement has less of that core attribute (whatever it is) to be measured in the other form.

    Again, let me emphasize that I’m not saying that this is true, but only that it is one way to think about this stuff in a way that is a little more intuitively reasonable than merely talking about the measured attributes as if they were truly distinct. If they are, in some way, measuring some deeper attribute but in a form determined by the type of measurement process we are using, then at least one level of counter-intuitiveness is (slightly) reduced.

    In addition, a similar view can be made of entanglement between widely separated particles. If two entangled particles are viewed as a single system with some common basis, then it makes some sense that measuring an attribute of one of them would affect the other one.

    Think of a single coin resting on a horizontal piece of glass with a video camera looking down on it and another looking up at the coin from beneath the glass. Then suppose that the feed from each camera appears on one of two widely-separated tv monitors. Now, suppose we pick up the coin and flip it so it lands on the glass. Now, one “measurement” (via one camera) will be heads and the other (via the other camera) will be tails. If we only look at the monitors, it appears as if there are two coins that magically always show the opposite of each other, when in fact, it is really only one coin being viewed in two different ways.

    That is, if we think of two entangled “particles” as really being one thing seen in two ways, then some (and only some) of the weirdness goes away. This view doesn’t really help with nonlocality, since, from our point of view, we are actually interacting with two separate and real particles, not merely viewing one particle on separated tv monitors. For nonlocality, we need to suppose that, in some sense, the two particles are connected via non-spatial means (or via some very peculiar spatial means, at least). That is, we need to think of the two particles as being somehow next to each other in a way that is outside of (or “behind”) normal three-dimensional space, so that interacting with one of an entangled pair affects the other one directly.

    I find the weirdness of nonlocality and entanglement fascinating, and far more important than many physicists seem to suppose. I suppose this apparent lack of deep interest in these bits of weirdness is partly the result of the silly notion of Feynman’s that we shouldn’t try to understand quantum mechanics but simply to calculate. Such a view cripples research, by, in effect, telling scientists that they should “move along; there is nothing here to see.” Potentially, the most important aspects of physical reality at the subatomic level are the very things that Feynman and others tell us not to bother trying to understand. For example, actually understanding these things could conceivably lead to the GUT that physicists talk of, or at least to radical advances in quantum computing.

    (To be fair to Feynman, he gave a dictum about understanding and calculating at a time when QM was younger, and when the hope of achieving actual understanding of the facts giving rise to the weirdness could have seemed not fundamentally wrong but merely premature. On the other hand, Bohm and others had made some progress in this area, so maybe I’m being “too fair” to Feynman (who I greatly admire despite the dictum I’m criticizing).)

    Please, don’t anyone think I’m saying that any of the above-described ways of thinking about these things is true. They are merely ways of looking at what we know in such a way as to make QM seem less weird and to encourage the view that maybe understanding the weirdness may be possible after all. Quite likely, the truth is very different, but these metaphors or analogies can give us a starting point (what could somehow be like these things and yet actually the way reality is?).

    And, I hope it goes without saying that none of what I have in mind here is intended to contradict the observational, predictive, and mathematical aspects of current QM. But, I hope that I may be encouraging the search for a deeper theory that will make everything clear (yeah, right) — while very possibly revealing even deeper weirdness, I suppose.

  14. David Rutter

    “Look out the window. See the sunlight?”

    Hmm, so your average reader isn’t staying up all night to read math blogs on the internet? Your average reader actually sleeps at /night/?

  15. Adam

    You’re wrong in saying that Planck’s constant defines a smallest unit of time or a smallest unit of space. In fact, in non-relativistic quantum mechanics (where position is an operator, and the uncertainty principle for position and momentum makes sense to assert), space is continuous. For that matter, the uncertainty principle by itself does not give a lower bound on how sharply position can be determined—in direct contradiction to the claim that space is granular.

  16. Gibbon1

    So this is the most interesting thing I’ve read this year. I’ll take a stab at an explanation.

    First, let’s ditch measurement. Too confusing. Let’s talk ‘fuzzy’: a particle’s position is fuzzy, and its _momentum_ is fuzzy. How fuzzy? Well, it turns out that depending on a particle’s environment, either the momentum’s or the position’s fuzziness can be very small, but the product of the two fuzzinesses is always greater than Planck’s reduced constant. So as the momentum uncertainty gets smaller, the position gets fuzzier.

    Leading to the factoid: in the sun, because of the density, a hydrogen atom’s momentum is constrained – constrained enough that its position gets fuzzy enough for protons to occasionally tunnel past their mutual electrostatic repulsion and be captured by the strong nuclear force. And *bam*, a photon appears.

