{"id":589,"date":"2008-01-30T13:26:17","date_gmt":"2008-01-30T13:26:17","guid":{"rendered":"http:\/\/scientopia.org\/blogs\/goodmath\/2008\/01\/30\/what-makes-linear-logic-linear\/"},"modified":"2008-01-30T13:26:17","modified_gmt":"2008-01-30T13:26:17","slug":"what-makes-linear-logic-linear","status":"publish","type":"post","link":"http:\/\/www.goodmath.org\/blog\/2008\/01\/30\/what-makes-linear-logic-linear\/","title":{"rendered":"What makes linear logic linear?"},"content":{"rendered":"<p> Sorry for the lack of posts this week. I&#8217;m traveling for work, and<br \/>\nI&#8217;m seriously jet-lagged, so I haven&#8217;t been able to find enough time<br \/>\nor energy to do the studying that I need to do to put together a solid<br \/>\npost.<\/p>\n<p> Fortunately, someone sent me a question that I can answer<br \/>\nrelatively easily, even in my jet-lagged state. (Feel free to ping me with more questions that can turn into easy but interesting posts. I appreciate it!)<\/p>\n<p> The question was about linear logic: specifically, what makes<br \/>\nlinear logic <em>linear<\/em>?<\/p>\n<p><!--more--><\/p>\n<p> For those who haven&#8217;t ever seen it before, linear logic is based<br \/>\non the idea of resource consumption. Where the normal propositional or<br \/>\npredicate logics that most of us are familiar with are focused around an<br \/>\nidea of <em>truth<\/em>, linear logic is focused on the idea of resource<br \/>\nposession and consumption. In standard propositional logic, if you&#8217;re given<br \/>\nthe proposition &#8220;A&#8221;, that means that A is true, and you can use the truth<br \/>\nof A in as many inferences as you want. In linear logic, if you&#8217;re given<br \/>\nthe proposition &#8220;A&#8221;, that means that you possess one instance of A,<br \/>\nand you can use it, <em>once<\/em>, in an inference.<\/p>\n<p> Think of a conventional logic, like simple propositional<br \/>\nlogic. 
Imagine that you had the following statements:<\/p>\n<ol>\n<li> A <\/li>\n<li> A &rArr; B<\/li>\n<li> A &rArr; C<\/li>\n<\/ol>\n<p> If you&#8217;re working in normal propositional logic, then you can use<br \/>\nthose statements to infer some more facts. Given &#8220;A&#8221;, and knowing &#8220;A<br \/>\n&rArr; B&#8221;, you can infer B; and given &#8220;A&#8221;, and knowing &#8220;A &rArr; C&#8221;,<br \/>\nyou can infer C. So given those three statements, we can infer that A<br \/>\nand B and C are all true.<\/p>\n<p> In linear logic, if you start with the equivalent of those three<br \/>\nstatements, you can infer <em>either<\/em> B or C, but not both: using<br \/>\n&#8220;A&#8221; in an inference <em>consumes<\/em> it. So once you use it, it&#8217;s<br \/>\ngone, and you can&#8217;t use it again.<\/p>\n<p> I&#8217;m not going to go into detail about linear logic here; I&#8217;ve written<br \/>\nabout it before, and if you&#8217;re interested, you can <a href=\"http:\/\/scientopia.org\/blogs\/goodmath\/2006\/07\/linear-logic\">go look at that post.<\/a> I&#8217;m just going<br \/>\nto quickly walk through one simple example of how it&#8217;s used, and then go back to<br \/>\nthe original question.<\/p>\n<p> Pretty much every introduction to linear logic that I&#8217;ve ever seen uses the same<br \/>\nfirst example: a vending machine. Suppose you want to describe how a vending machine<br \/>\nworks. For example, if I put 6 quarters into a soda machine, I get a soda. The key thing<br \/>\nthat I want to capture is that when I get the soda, the quarters are <em>gone<\/em>. I had<br \/>\nto give them up to get back the soda. 
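<\/p>\n<p> Incidentally, this use-it-once, give-it-up-to-get-something-back behavior is exactly what move semantics in a language like Rust enforce. Here&#8217;s a minimal sketch &#8211; the Quarter and Soda types and the buy_soda function are invented purely for illustration:<\/p>\n<pre>\nstruct Quarter;\nstruct Soda;\n\n\/\/ buy_soda takes ownership of the six quarters: they are moved into\n\/\/ the function and dropped, so the caller can never use them again.\nfn buy_soda(quarters: [Quarter; 6]) -> Soda {\n    let _ = quarters;\n    Soda\n}\n\nfn main() {\n    let quarters = [Quarter, Quarter, Quarter, Quarter, Quarter, Quarter];\n    let _soda = buy_soda(quarters);\n    \/\/ buy_soda(quarters); \/\/ compile error: use of moved value\n}\n<\/pre>\n<p> 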
In linear logic, I can write that as an<br \/>\nimplication:<\/p>\n<p>Quarter &otimes; Quarter &otimes; Quarter &otimes; Quarter &otimes; Quarter &otimes; Quarter -o Soda<\/p>\n<p> &#8220;&otimes;&#8221; means simultaneous possession: &#8220;A &otimes; B&#8221; means &#8220;I have an A and a B at the same time&#8221;; &#8220;-o&#8221; is linear implication, &#8220;A -o B&#8221; means &#8220;I can give up an A to get a B&#8221;. So the expression above means that if I have six quarters, I can give them up and get a soda.<\/p>\n<p> Try to think about what you&#8217;d need to do in conventional predicate<br \/>\nlogic to say that. You&#8217;d need to introduce a notion of counting, and addition and subtraction,<br \/>\nand possession, and consumption. It would be quite difficult &#8211; and the result would be a lot<br \/>\nharder to analyze, because it would rely on at least some amount of arithmetic. Linear logic makes things like that really easy.<\/p>\n<p> It should also be pretty obvious why computer scientists &#8211; and in<br \/>\nparticular, programming language designers &#8211; are so interested in<br \/>\nlinear logic. In implementing software, managing resources is always a<br \/>\nmajor issue. You can use linear logic in tools that analyze memory<br \/>\nusage, or management of resources like files. You can use linear<br \/>\nlogic-based analysis and inference in a functional language compiler<br \/>\nto figure out when you can convert an allocation into an in-place<br \/>\nupdate. It&#8217;s incredibly useful.<\/p>\n<p> But getting back to the question: why is it linear?<\/p>\n<p> There&#8217;s a bunch of different ways to explain it. I&#8217;m going to use<br \/>\none that&#8217;s informal, but easy to understand. Most logicians would hate<br \/>\nit as an explanation, but it&#8217;s easy.<\/p>\n<p> In linear logic, simultaneous possession of resources is treated as a kind of<br \/>\nmultiplication. 
If I have A and B at the same time, I can say that I have A*B.<\/p>\n<p> Now, suppose that each distinct proposition that we can use in an<br \/>\ninference has a unique label, so that we can distinguish them. Then<br \/>\nfor any sequence of inferences, we can describe the inference cost &#8211;<br \/>\nthat is, the set of resources consumed by those inferences &#8211; via a<br \/>\nmultiplicative expression. So for my example above with the vending<br \/>\nmachine, I can say that the six quarters are named<br \/>\nQuarter<sub>a<\/sub>, Quarter<sub>b<\/sub>, Quarter<sub>c<\/sub>,<br \/>\nQuarter<sub>d<\/sub>, Quarter<sub>e<\/sub>, and Quarter<sub>f<\/sub>.<\/p>\n<p> Then, to get a soda, I need to spend the six quarters. The inference cost<br \/>\nwill be Quarter<sub>a<\/sub>*Quarter<sub>b<\/sub>*Quarter<sub>c<\/sub>*Quarter<sub>d<\/sub>*<br \/>\nQuarter<sub>e<\/sub>*Quarter<sub>f<\/sub>.<\/p>\n<p> In linear logic, the inference cost is always a linear expression: no proposition ever appears raised to a power greater than one.<\/p>\n<p> Compare that to conventional propositional logic. In the example<br \/>\nat the beginning of this post, I used A twice to infer B and C. So the<br \/>\ninference cost there is A<sup>2<\/sup> &#8211; it&#8217;s quadratic. I can&#8217;t do that in<br \/>\nlinear logic.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Sorry for the lack of posts this week. I&#8217;m traveling for work, and I&#8217;m seriously jet-lagged, so I haven&#8217;t been able to find enough time or energy to do the studying that I need to do to put together a solid post. 
Fortunately, someone sent me a question that I can answer relatively easily, even [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"jetpack_post_was_ever_published":false,"_jetpack_newsletter_access":"","_jetpack_dont_email_post_to_subs":false,"_jetpack_newsletter_tier_id":0,"_jetpack_memberships_contains_paywalled_content":false,"_jetpack_memberships_contains_paid_content":false,"footnotes":"","jetpack_publicize_message":"","jetpack_publicize_feature_enabled":true,"jetpack_social_post_already_shared":false,"jetpack_social_options":{"image_generator_settings":{"template":"highway","default_image_id":0,"font":"","enabled":false},"version":2}},"categories":[33],"tags":[],"class_list":["post-589","post","type-post","status-publish","format-standard","hentry","category-logic"],"jetpack_publicize_connections":[],"jetpack_featured_media_url":"","jetpack_shortlink":"https:\/\/wp.me\/p4lzZS-9v","jetpack_sharing_enabled":true,"jetpack_likes_enabled":true,"_links":{"self":[{"href":"http:\/\/www.goodmath.org\/blog\/wp-json\/wp\/v2\/posts\/589","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/www.goodmath.org\/blog\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"http:\/\/www.goodmath.org\/blog\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"http:\/\/www.goodmath.org\/blog\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"http:\/\/www.goodmath.org\/blog\/wp-json\/wp\/v2\/comments?post=589"}],"version-history":[{"count":0,"href":"http:\/\/www.goodmath.org\/blog\/wp-json\/wp\/v2\/posts\/589\/revisions"}],"wp:attachment":[{"href":"http:\/\/www.goodmath.org\/blog\/wp-json\/wp\/v2\/media?parent=589"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"http:\/\/www.goodmath.org\/blog\/wp-json\/wp\/v2\/categories?post=589"},{"taxonomy":"post_tag","embeddable":true,"href":"http:\/\/www.goodmath.org\/blog\/
wp-json\/wp\/v2\/tags?post=589"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}