{"id":314,"date":"2007-02-19T08:30:00","date_gmt":"2007-02-19T08:30:00","guid":{"rendered":"http:\/\/scientopia.org\/blogs\/goodmath\/2007\/02\/19\/building-towards-homology-vector-spaces-and-modules\/"},"modified":"2016-10-14T20:47:36","modified_gmt":"2016-10-15T00:47:36","slug":"building-towards-homology-vector-spaces-and-modules","status":"publish","type":"post","link":"http:\/\/www.goodmath.org\/blog\/2007\/02\/19\/building-towards-homology-vector-spaces-and-modules\/","title":{"rendered":"Building Towards Homology: Vector Spaces and Modules"},"content":{"rendered":"<p> One of the more advanced topics in topology that I&#8217;d like to get to is <em>homology<\/em>. Homology is a major topic that goes beyond just algebraic topology, and it&#8217;s really very interesting. But to understand it, it&#8217;s useful to have some understandings of some basics that I&#8217;ve never written about. In particular, homology uses <em>chains<\/em> of <em>modules<\/em>. Modules, in turn, are a generalization of the idea of a <em>vector space<\/em>. I&#8217;ve said a little bit about vector spaces when I was writing about the gluing axiom, but I wasn&#8217;t complete or formal in my description of them. (Not to mention the amount of confusion that I caused by sloppy writing in those posts!) So I think it&#8217;s a good idea to cover the idea in a fresh setting here.<\/p>\n<p> So, what&#8217;s a vector space? It&#8217;s yet another kind of abstract algebra. In this case, it&#8217;s an algebra built on top of a field (like the real numbers), where the values are a set of objects where there are two operations: addition of two vectors, and <em>scaling<\/em> a vector by a value from the field.<\/p>\n<p> To define a vector space, we start by taking something like the real numbers: a set whose values form a <em>field<\/em>. We&#8217;ll call that basic field F, and the elements of F we&#8217;ll call <em>scalars<\/em>. 
We can then define a <em>vector space over F</em> as a set <b>V</b>, whose members are called <em>vectors</em>, with two operations:</p>
<dl>
<dt>Vector Addition</dt>
<dd>An operation mapping two vectors to a third vector, +:<b>V</b>×<b>V</b>→<b>V</b></dd>
<dt>Scalar Multiplication</dt>
<dd>An operation mapping a scalar and a vector to another vector, *:F×<b>V</b>→<b>V</b></dd>
</dl>
<p>Vector addition forms an abelian group over <b>V</b>; scalar multiplication is distributive over vector addition and over addition in the scalar field, and compatible with multiplication in the field. To be complete, this means that the following properties hold:</p>
<ul>
<li><b>(<b>V</b>,+) is an abelian group</b>
<ul>
<li>Vector addition is associative: ∀a,b,c∈<b>V</b>: a+(b+c)=(a+b)+c</li>
<li>Vector addition has an identity element <em>0</em>: ∀a∈<b>V</b>: a+0=0+a=a.</li>
<li>Every vector has an additive inverse: ∀a∈<b>V</b>: ∃b∈<b>V</b>: a+b=0. The additive inverse of a vector a is normally written -a. (<em>Up to this point, this defines (<b>V</b>,+) as a group.</em>)</li>
<li>Vector addition is commutative: ∀a,b∈<b>V</b>: a+b=b+a. 
<em>(The addition of this commutativity rule is what makes it an abelian group.)</em></li>
</ul>
</li>
<li><b>Scalar Multiplication is Distributive and Compatible</b>
<ul>
<li>Scalar multiplication is distributive over vector addition: ∀a∈F, ∀b,c∈<b>V</b>: a*(b+c)=a*b+a*c</li>
<li>Scalar multiplication is distributive over addition in F: ∀a,b∈F, ∀c∈<b>V</b>: (a+b)*c = (a*c)+(b*c).</li>
<li>Scalar multiplication is compatible with multiplication in F: ∀a,b∈F, ∀c∈<b>V</b>: (a*b)*c = a*(b*c).</li>
<li>The multiplicative identity for multiplication in F is also the identity element for scalar multiplication: ∀a∈<b>V</b>: 1*a=a.</li>
</ul>
</li>
</ul>
<p>So what does all of this mean? It means that a vector space is a structure over a field whose elements can be added (vector addition) or scaled (scalar multiplication). Hey, isn't that exactly what I said at the beginning?</p>
<p>One obvious example of a vector space is a Euclidean space. Vectors are arrows from the origin to some point in the space, and so they can be represented as ordered tuples. For example, ℝ<sup>3</sup> is the three-dimensional Euclidean space; points (x,y,z) are vectors. Adding two vectors works coordinatewise: (a,b,c)+(d,e,f)=(a+d,b+e,c+f); and scalar multiplication scales each coordinate: x*(a,b,c)=(x*a,x*b,x*c).</p>
<p>Following the same basic idea as the Euclidean spaces, we can generalize to matrices of a particular size, which form a vector space under entrywise addition and scaling. There are also ways of creating vector spaces using polynomials, various kinds of functions, differential equations, etc.</p>
<p>In homology, we'll actually be interested in <em>modules</em>. A module is just a generalization of the idea of a vector space. 
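Before generalizing, the Euclidean example is worth making concrete. Here is a minimal Python sketch of ℝ<sup>3</sup> as a vector space; the tuple representation and the helper names <code>vadd</code> and <code>smul</code> are my own illustration, not something from the post:

```python
# A minimal sketch of R^3 as a vector space over the reals.
# Vectors are tuples; scalars are floats. Helper names are mine.

def vadd(u, v):
    """Vector addition: componentwise sum."""
    return tuple(a + b for a, b in zip(u, v))

def smul(x, v):
    """Scalar multiplication: scale every component by x."""
    return tuple(x * a for a in v)

u, v, w = (1.0, 2.0, 3.0), (4.0, 5.0, 6.0), (-2.0, 0.0, 7.0)
zero = (0.0, 0.0, 0.0)

# Spot-check the axioms listed above (exact for these values):
assert vadd(u, vadd(v, w)) == vadd(vadd(u, v), w)              # associativity
assert vadd(u, zero) == u                                      # additive identity
assert vadd(u, smul(-1.0, u)) == zero                          # additive inverse
assert vadd(u, v) == vadd(v, u)                                # commutativity
assert smul(2.0, vadd(u, v)) == vadd(smul(2.0, u), smul(2.0, v))  # distributivity
assert smul(2.0 * 3.0, u) == smul(2.0, smul(3.0, u))           # compatibility
assert smul(1.0, u) == u                                       # scalar identity
```

Of course a finite check like this doesn't prove the axioms; it just shows what each one says about concrete coordinates.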
But instead of drawing the scalars from a field, as you do in a vector space, in a module the scalars come from a general ring, which is less constrained: a field is a commutative ring in which every value except 0 has a multiplicative inverse, and whose additive and multiplicative identities are distinct. So a module does <em>not</em> require multiplicative inverses for its scalars, nor does it require multiplication of scalars to be commutative.</p>
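To see what weakening the field to a ring costs, here is a hedged sketch (again my own illustration, not from the post) of ℤ<sup>2</sup> as a module over the ring of integers ℤ. The same axioms hold, but because most integers have no multiplicative inverse, you can no longer "divide" a vector by a scalar:

```python
# Z^2 as a module over the ring Z: scalars are integers, a commutative
# ring where only 1 and -1 have multiplicative inverses.

def madd(u, v):
    """Addition in the module: componentwise integer sum."""
    return tuple(a + b for a, b in zip(u, v))

def smul(n, v):
    """Scale a module element by a ring element (an integer)."""
    return tuple(n * a for a in v)

u, v = (3, -1), (2, 5)

# The vector-space-style axioms still hold:
assert smul(2, madd(u, v)) == madd(smul(2, u), smul(2, v))  # distributivity
assert smul(2 * 3, u) == smul(2, smul(3, u))                # compatibility
assert smul(1, u) == u                                      # scalar identity

# But the scalar 1/2 does not exist in Z: no integer n recovers (1, 0)
# from (2, 0), so scaling cannot in general be undone.
assert all(smul(n, (2, 0)) != (1, 0) for n in range(-10, 11))
```

That missing divisibility is exactly why modules can behave much less rigidly than vector spaces (for example, not every module has a basis), which is part of what makes them the right raw material for homology.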