[WIP] A concise introduction to linear algebra by Geza Schay

Linear algebra is concerned with the study of vector spaces. It investigates and isolates mathematical structures and methods involving the concept of linearity. Instances of linearity arise in many different contexts of mathematics and its applications, and linear algebra provides a uniform framework for their treatment.

As is typical in mathematics, the extraction of common key features that are observed in various seemingly unrelated areas gives rise to an abstraction and simplification which allows the study of these crucial features in isolation. The results of this investigation can then be carried back into all those areas where the underlying common feature arises, with the benefit of a unifying perspective. -- Martin Otto

Chapter 1: Analytic geometry of Euclidean spaces

The fundamental objects of Linear Algebra are vectors ($v$) and matrices ($A$).

Vectors

A vector is a list of numbers. The numbers are called the components of the vector. e.g.

$$v = \begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \begin{bmatrix} 2 \\ 4 \end{bmatrix}$$

$v$ is called a 2-dimensional vector because it has two components.
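As a quick sketch (using NumPy, which is an assumption of these notes, not part of the book), the vector above can be represented as an array whose length is its dimension:

```python
import numpy as np

# The 2-dimensional vector from the example above.
v = np.array([2, 4])

print(v.shape)      # (2,) -- two components, so v is 2-dimensional
print(v[0], v[1])   # components v_1 = 2 and v_2 = 4
```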

Properties of Vectors

  1. Dimension: The number of components determines the dimension of the vector. e.g. $v$ below is a vector in 3-dimensional space ($\mathbb{R}^3$): $v = \begin{bmatrix} v_1 \\ v_2 \\ v_3 \end{bmatrix}$
  2. Operations: Vectors can be added component-wise and multiplied by scalars. These two operations are combined in linear combinations.

Linear combinations

The linear combination of vectors is a sum of scalar multiples of the vectors. e.g. the linear combinations of $u$ and $v$ are the vectors $cu + dv$, where $c$ and $d$ are any numbers:

$$c\begin{bmatrix} u_1 \\ u_2 \end{bmatrix} + d\begin{bmatrix} v_1 \\ v_2 \end{bmatrix} = \begin{bmatrix} c \cdot u_1 + d \cdot v_1 \\ c \cdot u_2 + d \cdot v_2 \end{bmatrix}$$
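The component-wise formula above can be sketched in NumPy (the vectors and scalars here are made-up values for illustration):

```python
import numpy as np

u = np.array([1, 2])
v = np.array([3, 1])
c, d = 2, -1

# Component-wise: [c*u1 + d*v1, c*u2 + d*v2]
combo = c * u + d * v
print(combo)  # [-1  3]
```

Scalar multiplication and addition broadcast over the components, so the single expression `c * u + d * v` computes exactly the bracketed formula.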

Using linear combinations to solve systems of equations

A system of linear equations is a collection of equations that involve the same set of variables. Solving such systems often involves determining whether the equations have no solution, exactly one solution, or infinitely many solutions.

Consider a system of equations:

$$\begin{align*} a_{11}x_1 + a_{12}x_2 + \ldots + a_{1n}x_n &= b_1 \\ a_{21}x_1 + a_{22}x_2 + \ldots + a_{2n}x_n &= b_2 \end{align*}$$

Here the $a_{ij}$ are the coefficients, the $x_j$ are the unknowns, and the $b_i$ are the constants on the right-hand side.

This system can be rewritten in vector form as:

$$x_1\begin{bmatrix} a_{11} \\ a_{21} \end{bmatrix} + x_2\begin{bmatrix} a_{12} \\ a_{22} \end{bmatrix} + \ldots + x_n\begin{bmatrix} a_{1n} \\ a_{2n} \end{bmatrix} = \begin{bmatrix} b_1 \\ b_2 \end{bmatrix}$$

The goal is to find the scalars $x_1, x_2, \ldots, x_n$ such that the linear combination produces the vector $b$.

The elimination method solves the system by transforming it into an equivalent system that is easier to solve. It is based on the same principles used in scalar algebra.
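A minimal sketch of elimination on a 2x2 system, with the hand steps in comments (the particular system is made up for illustration); `np.linalg.solve` does the equivalent work via LU factorization:

```python
import numpy as np

# System:   x1 + 2*x2 = 5
#         3*x1 + 4*x2 = 11
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
b = np.array([5.0, 11.0])

# Elimination: subtract 3 * (row 1) from row 2 to zero out the x1 term.
# Row 2 becomes: (4 - 3*2)*x2 = 11 - 3*5  ->  -2*x2 = -4  ->  x2 = 2
# Back-substitute into row 1: x1 = 5 - 2*2 = 1
x = np.linalg.solve(A, b)
print(x)  # [1. 2.]
```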

Basis Vectors

A basis for a vector space is a set of vectors that are linearly independent and span the vector space. e.g. the standard basis of $\mathbb{R}^2$ consists of $\begin{bmatrix} 1 \\ 0 \end{bmatrix}$ and $\begin{bmatrix} 0 \\ 1 \end{bmatrix}$.

A vector space can have many different bases. To determine whether a set of vectors $v_1, v_2, \ldots, v_n$ forms a basis, verify:

  1. Linear independence: No vector in the set is a linear combination of the others. This can be verified by solving the equation $c_1v_1 + c_2v_2 + \ldots + c_nv_n = 0$, where $c_1, c_2, \ldots, c_n$ are scalars. If the only solution is $c_1 = c_2 = \ldots = c_n = 0$, then the vectors are linearly independent.
  2. Spanning: The vectors span the vector space, meaning that any vector in the space can be expressed as a linear combination of them. To verify, compute the rank of the matrix whose columns are the vectors. If the rank equals the dimension of the space, the vectors span it.
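Both checks above can be sketched with a single rank computation (the candidate vectors here are an assumption chosen for illustration): for $n$ vectors in an $n$-dimensional space, rank equal to $n$ means they are linearly independent and span the space, i.e. they form a basis.

```python
import numpy as np

# Candidate basis for R^2 (made-up vectors for illustration).
v1 = np.array([1, 1])
v2 = np.array([1, -1])

# Stack the vectors as columns of a matrix.
M = np.column_stack([v1, v2])

# rank == dimension of the space  =>  the columns are linearly
# independent and span R^2, so they form a basis.
rank = np.linalg.matrix_rank(M)
print(rank == 2)  # True
```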