### Vector spaces: Conclusion

### Notes

The vector space (or linear space) is the central concept of linear algebra. It is the mathematical representation of our intuitive idea of space (as discussed in the section *Vector calculus in dimensions 2 and 3*), but the strength of the concept lies in the fact that it brings many mathematical settings that at first glance look totally different under a common denominator. An example of this is the similar treatment of systems of linear equations and of spanning vectors in a vector space.

The abstract formulation of the theory by means of the calculation rules in *the notion of vector space* is due to Giuseppe Peano (1858-1932). The concept of vector space, however, had already been in use for several decades by then, albeit in a less precise form.

As mentioned, vector spaces are used in different settings. We mention a few.

- Besides the use of vector spaces for modelling our space, vector spaces are used as a framework for physical concepts such as velocity, acceleration, momentum and force in mechanics and various fields in electromagnetism.
- The treatment of signals (signal analysis), and the quantum mechanical description of a variety of physical phenomena (e.g., the structure of atoms) make use of vector spaces of functions and demonstrate the relationship between linear algebra and calculus. Incidentally, in the context of such functions, spaces of infinite dimension are the rule rather than the exception.
- Regarding geometry, in vector spaces 'straight' objects such as lines and planes are definitely on the agenda. But vector spaces also play a fundamental role in the study of curved spaces, for instance as tangent spaces to such curved objects.

In our definition of vector space we let the scalars be real or complex numbers, but it is also possible (and useful) to allow other (fields of) scalars; almost all the results in this chapter remain valid for these other scalars, but their treatment is beyond the scope of this course. In coding theory and cryptology, vector spaces over the numbers **modulo 2** play a central role; these are the numbers #0# and #1# with the calculation rules #0+0=0#, #0+1=1+0=1#, #1+1=0#, #1\cdot 0=0\cdot 1 =0#, and #1\cdot 1 =1#. Essential for fields of scalars are the ordinary rules of calculation (associativity, distributivity, the existence of #0# and #1#, the inverse of a number distinct from #0#, the negative of a number, etc.; commutativity is not really needed).
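The calculation rules modulo 2 listed above can be checked with a short script; the helper names `add2` and `mul2` are illustrative choices, not part of the course material.

```python
# Arithmetic of the numbers modulo 2, i.e. the set {0, 1}.
# add2 and mul2 are just illustrative helper names.
def add2(a, b):
    return (a + b) % 2

def mul2(a, b):
    return (a * b) % 2

# Verify the calculation rules stated in the text.
assert add2(0, 0) == 0
assert add2(0, 1) == add2(1, 0) == 1
assert add2(1, 1) == 0          # 1 + 1 = 0: every element is its own negative
assert mul2(1, 0) == mul2(0, 1) == 0
assert mul2(1, 1) == 1
```

Note that #1+1=0# means subtraction coincides with addition in this field, which is what makes it so convenient in coding theory.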

The process of *coordinatization*, which we saw earlier for the plane #\mathbb{E}^2# and the space #\mathbb{E}^3#, carries over to each #n#-dimensional vector space. We give an overview of the correspondence between the coordinate space #\mathbb{R}^n# and any vector space #V# with basis #\basis{\vec{v}_1,\ldots,\vec{v}_n}#.

| name | #V# with basis #\basis{\vec{v}_1,\ldots,\vec{v}_n}# | #\mathbb{R}^n# |
|---|---|---|
| vector | #\vec{v}=\lambda_1\cdot\vec{v}_1+\cdots+\lambda_n\cdot\vec{v}_n# | #\rv{\lambda_1,\ldots,\lambda_n}# |
| sum | #\vec{v}+\vec{w}#, where #\vec{w}=\mu_1\cdot\vec{v}_1+\cdots+\mu_n\cdot\vec{v}_n# | #\rv{\lambda_1+\mu_1,\ldots,\lambda_n+\mu_n}# |
| scalar product | #\lambda\cdot\vec{v}# | #\rv{\lambda\cdot\lambda_1,\ldots,\lambda\cdot\lambda_n}# |
| basis element | #\vec{v}_i# | #\vec{e}_i=\rv{0,\ldots,0,1,0,\ldots,0}# (the #1# in position #i#) |
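The first row of the table, finding the coordinate vector #\rv{\lambda_1,\ldots,\lambda_n}# of a given vector, amounts to solving a linear system. A minimal sketch for #n=2#, with an illustrative basis and vector of our own choosing (not taken from the text):

```python
# Coordinatization in R^2: express v in terms of the basis (v1, v2).
# The basis and the vector below are illustrative choices.
v1 = (1.0, 1.0)
v2 = (1.0, -1.0)
v = (3.0, 1.0)

# Solve l1*v1 + l2*v2 = v with Cramer's rule (2x2 case).
det = v1[0] * v2[1] - v2[0] * v1[1]
l1 = (v[0] * v2[1] - v2[0] * v[1]) / det
l2 = (v1[0] * v[1] - v[0] * v1[1]) / det

# The coordinate vector of v with respect to the basis (v1, v2).
coords = (l1, l2)   # → (2.0, 1.0)

# Check the defining relation v = l1*v1 + l2*v2.
assert all(abs(l1 * a + l2 * b - c) < 1e-12 for a, b, c in zip(v1, v2, v))
```

Because the #\vec{v}_i# form a basis, the determinant is nonzero and the coordinates are unique, as the table presupposes.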

In particular, we learned about the correspondence between systems of linear equations and the vector space #L_n#. Later we will build upon this correspondence using the notion of duality. There, linear functionals are special (homogeneous) linear polynomial functions: they form the linear subspace of #L_n# spanned by #\basis{x_1,\ldots,x_n}#.

We got to know the following vector spaces:

- #F#: the functions on a given set #X#
- #P_n#: the polynomial functions in one variable on #\mathbb{R}# of degree at most #n#
- #L_n#: the linear polynomial functions on #\mathbb{R}^n#
- #M_{m\times n}#: the #(m\times n)#-matrices
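As a quick sketch of why a set like #P_n# is a vector space, one can represent polynomials of degree at most #2# by their coefficient lists and observe that sums and scalar multiples stay within #P_2#; the representation and the sample polynomials below are illustrative assumptions.

```python
# P_2 sketch: polynomials of degree at most 2, stored as coefficient
# lists [a0, a1, a2] representing a0 + a1*x + a2*x^2.
def poly_add(p, q):
    return [a + b for a, b in zip(p, q)]

def poly_scale(c, p):
    return [c * a for a in p]

p = [1, 0, 2]    # 1 + 2x^2
q = [0, 3, -2]   # 3x - 2x^2

# Sums and scalar multiples again have degree at most 2,
# so P_2 is closed under both vector space operations.
assert poly_add(p, q) == [1, 3, 0]       # 1 + 3x
assert poly_scale(2, p) == [2, 0, 4]     # 2 + 4x^2
```

The same coefficient-list idea shows that #P_n# corresponds to #\mathbb{R}^{n+1}# under coordinatization with respect to the basis #\basis{1,x,\ldots,x^n}#.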

We also learned about the correspondence between *affine subspaces* and systems of linear equations:

- Every affine subspace of #\mathbb{R}^n# is the solution set of a system of linear equations in #n# unknowns.
- The solution set of each system of linear equations in #n# unknowns is an affine subspace of #\mathbb{R}^n#.
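A minimal numerical illustration of the second point, with an equation of our own choosing: the solution set of #x+y=1# in #\mathbb{R}^2# is the affine line consisting of a particular solution plus multiples of a direction vector.

```python
# The solution set of x + y = 1 in R^2 is the affine subspace
# p + t*d, with particular solution p = (1, 0) and direction d = (1, -1).
p = (1.0, 0.0)
d = (1.0, -1.0)

def on_line(t):
    """Return the point p + t*d of the affine subspace."""
    return (p[0] + t * d[0], p[1] + t * d[1])

# Every point of the affine subspace satisfies the equation x + y = 1.
assert all(abs(sum(on_line(t)) - 1.0) < 1e-12
           for t in (-2.0, 0.0, 0.5, 3.0))
```

The direction vector #d# spans the solution set of the associated homogeneous equation #x+y=0#, which is exactly the linear subspace the affine subspace is parallel to.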

The process of identifying an #n#-dimensional real vector space with #\mathbb{R}^n# is called coordinatization. The coordinates of a vector depend on the choice of a basis. The transformation from one basis to another is called a coordinate transformation. This notion can be discussed properly only after the formalism for comparing vector spaces has been introduced. This subject is dealt with in the natural sequel to the current chapter: *Linear maps*. Both matrices and the rank of a matrix will help to understand linear maps.
