{"id":2231,"date":"2013-09-29T14:35:09","date_gmt":"2013-09-29T18:35:09","guid":{"rendered":"http:\/\/scientopia.org\/blogs\/goodmath\/?p=2231"},"modified":"2013-09-29T14:35:09","modified_gmt":"2013-09-29T18:35:09","slug":"combining-non-disjoint-probabilities","status":"publish","type":"post","link":"http:\/\/www.goodmath.org\/blog\/2013\/09\/29\/combining-non-disjoint-probabilities\/","title":{"rendered":"Combining Non-Disjoint Probabilities"},"content":{"rendered":"<p> In my previous post on probability, I talked about how you need to be careful about covering cases. To understand what I mean by that, it&#8217;s good to see some examples.<\/p>\n<p> And we can do that while also introducing an important concept which I haven&#8217;t discussed yet. I&#8217;ve frequently talked about independence, but equally important is the idea of <em>disjointness<\/em>.<\/p>\n<p> Two events are independent when they have no ability to influence one another. So two coin flips are independent. Two events are <em>disjoint<\/em> when they can&#8217;t possibly occur together. Flipping a coin, the event &#8220;rolled a head&#8221; and the event &#8220;rolled a tail&#8221; are disjoint: if you rolled a head, you <em>can&#8217;t<\/em> roll a tail, and vice versa. <\/p>\n<p> So let&#8217;s think about something abstract for a moment. Let&#8217;s suppose that we&#8217;ve got two events, A and B. We know that the probability of A is 1\/3 and the probability of B is also 1\/3. What&#8217;s the probability of A or B?<\/p>\n<p> Naively, we could say that it&#8217;s P(A) + P(B). But that&#8217;s not necessarily true. It depends on whether or not the two events are disjoint.<\/p>\n<p> Suppose that it turns out that the probability space we&#8217;re working in is rolling a six sided die. There are three basic scenarios that we could have:\n<\/p>\n<ol>\n<li><em>Scenario 1:<\/em> A is the event &#8220;rolled 1 or 2&#8221;, and B is &#8220;rolled 3 or 4&#8221;. That is,  A and B are disjoint.<\/li>\n<li><em>Scenario 2:<\/em> A is the event &#8220;rolled 1 or 2&#8221;, and B is &#8220;rolled 2 or 3&#8221;.  A and B are different, but they overlap.<\/li>\n<li><em>Scenario 3:<\/em> A is the event &#8220;rolled 1 or 2&#8221;, and B is the event &#8220;rolled 1 or 2&#8221;. A and B are really just different names for the same event.<\/li>\n<\/ol>\n<p> In scenario one, we&#8217;ve got disjoint events. So P(A or B) is P(A) + P(B). One way of checking that that makes sense is to look at how the probability of events work out. P(A) is 1\/3. P(B) is 1\/3. The probability of neither A nor B &#8211; that is, the probability of rolling either 5 or 6 &#8211; is 1\/3. The sum is 1, as it should be.<\/p>\n<p> But suppose that we looked at scenario 2. If we made a mistake and added them as if they were disjoint, how would things add up? P(A) is 1\/3. P(B) is 1\/3. P(neither A nor B) = P(4 or 5 or 6) = 1\/2. The total of these three probabilities is 1\/3 + 1\/3 + 1\/2 = 7\/6. So just from that addition, we can see that there&#8217;s a problem, and we did something wrong.<\/p>\n<p> If we know that A and B overlap, then we need to do something a bit more complicated to combine probabilities. 
If we know that A and B overlap, then we need to do something a bit more complicated to combine probabilities. The general equation is:

$$P(A \cup B) = P(A) + P(B) - P(A \cap B)$$

Using that equation, we'd get the right result. P(A) = 1/3; P(B) = 1/3; P(A and B) = 1/6. So the probability of A or B is 1/3 + 1/3 - 1/6 = 1/2. And P(neither A nor B) = P(4 or 5 or 6) = 1/2. The total is 1, as it should be.
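To see that formula in action on the same die example, here's a short sketch in the same spirit as the one above (again my own illustration, with made-up names like `DIE` and `prob`): it computes P(A) + P(B) - P(A and B) for the overlapping scenario and checks that it matches P(A or B) computed directly.

```python
from fractions import Fraction

DIE = {1, 2, 3, 4, 5, 6}

def prob(event):
    # Probability of an event on one roll of a fair six-sided die.
    return Fraction(len(event), len(DIE))

# Scenario 2: A = "rolled 1 or 2", B = "rolled 2 or 3".
a, b = {1, 2}, {2, 3}

direct = prob(a | b)                                    # P(A or B) by enumeration: 1/2
inclusion_exclusion = prob(a) + prob(b) - prob(a & b)   # 1/3 + 1/3 - 1/6 = 1/2

print(direct, inclusion_exclusion)  # 1/2 1/2
assert direct == inclusion_exclusion
```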
,"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}