{"id":726,"date":"2008-12-31T08:53:39","date_gmt":"2008-12-31T08:53:39","guid":{"rendered":"http:\/\/scientopia.org\/blogs\/goodmath\/2008\/12\/31\/my-favorite-strange-number-classic-repost\/"},"modified":"2008-12-31T08:53:39","modified_gmt":"2008-12-31T08:53:39","slug":"my-favorite-strange-number-classic-repost","status":"publish","type":"post","link":"http:\/\/www.goodmath.org\/blog\/2008\/12\/31\/my-favorite-strange-number-classic-repost\/","title":{"rendered":"My Favorite Strange Number: &#937; (classic repost)"},"content":{"rendered":"<p><em>I&#8217;m away on vacation this week, taking my kids to Disney World. Since I&#8217;m not likely to<br \/>\nhave time to write while I&#8217;m away, I&#8217;m taking the opportunity to re-run an old classic series<br \/>\nof posts on numbers, which were first posted in the summer of 2006. These posts are mildly<br \/>\nrevised.<\/em><\/p>\n<p> &Omega; is my own personal favorite transcendental number. &Omega; isn&#8217;t really a specific number, but rather a family of related numbers with bizarre properties. It&#8217;s the one real transcendental number that I know of that comes from the theory of computation, that is important, and that expresses meaningful fundamental mathematical properties. It&#8217;s also deeply non-computable; meaning that not only is it non-computable, but even computing meta-information about it is non-computable. And yet, it&#8217;s <em>almost<\/em> computable. It&#8217;s just all around awfully cool. <\/p>\n<p><!--more--><\/p>\n<p>So. What is it &Omega;?<\/p>\n<p> It&#8217;s sometimes called the <em>halting probability<\/em>. The idea of it is that it encodes the <em>probability<\/em> that a given infinitely long random bit string contains a prefix that represents a halting program.<\/p>\n<p> It&#8217;s a strange notion, and I&#8217;m going to take a few paragraphs to try to elaborate on what that means, before I go into detail about how the number is generated, and what sorts of bizarre properties it has.<\/p>\n<p> Remember that in the theory of computation, one of the most fundamental results is the non-computability of <em>the halting problem<\/em>. The halting problem is the question &#8220;Given a program P and input I, if I run P on I, will it ever stop?&#8221; You cannot write a program that reads an arbitrary P and I and gives you the answer to the halting problem. It&#8217;s impossible. And what&#8217;s more, the statement that the halting problem is not computable is actually <em>equivalent<\/em> to the fundamental statement of G&ouml;del&#8217;s incompleteness theorem.<\/p>\n<p>&Omega; is something related to the halting problem, but stranger. The fundamental question of &Omega; is: if you hand me a string of 0s and 1s, and I&#8217;m allowed to look at it one bit at a time, what&#8217;s the probability that <em>eventually<\/em> the part that I&#8217;ve seen will be a program that eventually stops?<\/p>\n<p> When you look at this definition, your reaction <em>should<\/em> be &#8220;Huh? Who cares?&#8221;<\/p>\n<p> The catch is that this number &#8211; this probability &#8211; is a number which is easy to define; it&#8217;s not computable; it&#8217;s completely <em>uncompressible<\/em>; it&#8217;s <em>normal<\/em>.<\/p>\n<p>Let&#8217;s take a moment and look at those properties:<\/p>\n<ol>\n<li> Non-computable: no program can compute &Omega;. In fact, beyond a certain value N (which is non-computable!), you cannot compute the value of <em>any bit<\/em> of &Omega;. 
So let's assume we have a computing device, which we'll call φ. We'll write the result of running φ on a program encoded as the binary string p as φ(p). And let's assume that we've set up φ so that it only accepts programs in a prefix-free encoding, which we'll call ε; the set of programs coded in ε we'll call E; and we'll write φ(p)↓ to mean "φ(p) halts". Then we can define Ω as:

Ω_{φ,ε} = Σ_{p ∈ E, φ(p)↓} 2^(−len(p))

So: for each program in our prefix-free encoding, if it halts, we *add* 2^(−len(p)) to Ω.
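That sum also suggests how you'd approximate Ω from below: enumerate programs, run each one with ever-larger step budgets, and add 2^(−len(p)) every time one is seen to halt. Here's a sketch of that shape in Python. The "machine" is a toy I invented so the code actually runs (its programs are a run of k ones followed by a zero, and I simply decree that such a program halts after 2^k steps when k is even and loops forever otherwise), so the value it converges to isn't a real Ω; the point is just the structure of the lower-bound computation.

```python
from fractions import Fraction

def toy_programs(max_len):
    """A toy prefix-free program set: k ones followed by a terminating zero.
    This stands in for a real prefix-free program encoding."""
    return ["1" * k + "0" for k in range(max_len)]

def toy_halts_within(program, steps):
    """Invented toy semantics: '1'*k + '0' halts after 2**k steps when k is
    even, and never halts when k is odd.  A real Omega would need a
    universal prefix-free machine here, where this question is
    undecidable in general."""
    k = len(program) - 1
    return k % 2 == 0 and 2 ** k <= steps

def omega_lower_bound(max_len, step_budget):
    """Dovetailing pass: add 2**(-len(p)) for every program observed to halt
    within the step budget.  Larger budgets only ever add terms, so the
    result climbs monotonically toward the (toy) halting probability."""
    total = Fraction(0)
    for p in toy_programs(max_len):
        if toy_halts_within(p, step_budget):
            total += Fraction(1, 2 ** len(p))
    return total

for budget in (1, 4, 16, 256):
    print(budget, float(omega_lower_bound(max_len=12, step_budget=budget)))
```

For a genuine Ω this procedure still works as a source of lower bounds, which is the sense in which Ω is "almost" computable; but there is no computable way to tell how close any given bound is to the true value.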
Ω is an incredibly odd number. As I said before, it's random, uncompressible, and has a fully normal distribution of digits. But where it gets fascinatingly strange is when you start considering its computability properties.

Ω is *definable*. We can (and have) provided a specific, precise definition of it. We've even described a *procedure* by which you can conceptually generate it. But despite that, it's *deeply* uncomputable. There are procedures by which we can compute a finite prefix of it. But there's a limit: there's a point beyond which we cannot compute *any* digits of it. *And* there is no way to compute that point. So, for example, there's a very interesting paper where the authors [computed the value of Ω for a Minsky machine](http://www.expmath.org/expmath/volumes/11/11.3/Calude361_370.pdf); they were able to compute 84 bits of it, but the last 20 are *unreliable*, because it's uncertain whether or not they're actually beyond the limit, and so they may be wrong. But we can't tell!

What does Ω mean? It's actually something quite meaningful. It's a number that encodes some of the very deepest information about *what* it's possible to compute. It gives a way to measure the probability of computability. In a very real sense, it represents the overall nature and structure of computability in terms of a single probability.

Ω is actually even the basis of a halting oracle – that is, if you knew the value of Ω, then you could easily write a program which solves the halting problem! (There's a sketch of how that works at the end of this post.)

Ω is also an amazingly dense container of information – as an infinitely long, thoroughly non-compressible number, it contains an unmeasurable quantity of information. And we can't even figure out what most of it is!
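To close, here's a sketch of that halting-oracle claim. If an oracle hands you the first n bits of Ω, you can decide halting for any program of length at most n: dovetail over programs with growing step budgets, and once the observed lower bound climbs to within 2^(−n) of Ω, any such program that hasn't halted yet never will, because its contribution alone would push the sum too high. The code below reuses the same invented toy machine as in the earlier sketch (for that machine the halting probability happens to be exactly 2/3), so it runs; but it is only an illustration of the argument, not a real halting oracle.

```python
import math
from fractions import Fraction

# Same invented toy machine as before: program '1'*k + '0' halts after
# 2**k steps if k is even, and loops forever if k is odd.
def toy_programs(max_len):
    return ["1" * k + "0" for k in range(max_len)]

def toy_halts_within(program, steps):
    k = len(program) - 1
    return k % 2 == 0 and 2 ** k <= steps

# For this toy machine, Omega = sum over even k of 2**-(k+1) = 2/3 exactly.
TOY_OMEGA = Fraction(2, 3)

def halts(program, n_bits=20):
    """Decide halting for `program` (of length <= n_bits), given the first
    n_bits of the toy Omega as an oracle."""
    assert len(program) <= n_bits
    # The oracle: Omega truncated (rounded down) to n_bits binary places,
    # so omega_n <= Omega < omega_n + 2**-n_bits.
    omega_n = Fraction(math.floor(TOY_OMEGA * 2 ** n_bits), 2 ** n_bits)
    budget = 1
    while True:
        # A real construction would dovetail over programs of every length;
        # for this toy machine, lengths up to n_bits + 1 happen to be enough
        # for the lower bound to reach omega_n.
        halted = {p for p in toy_programs(n_bits + 1)
                  if toy_halts_within(p, budget)}
        lower_bound = sum(Fraction(1, 2 ** len(p)) for p in halted)
        if lower_bound >= omega_n:
            # Now Omega - lower_bound < 2**-n_bits.  If `program` halted
            # later, it would add at least 2**-n_bits more to the sum,
            # which is impossible; so its fate is already settled.
            return program in halted
        budget *= 2

print(halts("1" * 4 + "0"))  # 4 ones (even): halts -> True
print(halts("1" * 5 + "0"))  # 5 ones (odd):  loops -> False
```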