A **linear equation** in the variables $x_1, x_2, ... , x_n$ is an equation that can be written in the form $a_1x_1 + a_2x_2 + \cdots + a_nx_n = b$, where $b$ and the coefficients $a_1, ... , a_n$ are real numbers.

A **system of linear equations** is a collection of linear equations involving the same variables.

A **solution** of the system is a list of numbers that makes each equation true. If the system has at least one solution, it is **consistent**; if not, it is **inconsistent**.

The set of all solutions is the **solution set**.

Two systems are **equivalent** if they have the same solution set.

In an augmented matrix, a vertical dashed line marks where the other side of the equations begins; it distinguishes an augmented matrix from a plain coefficient matrix. While this is not standard notation in math, just Dr. Villalpando's idea, it should be the standard.

#### Linear Combination

Let $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p$ be vectors in $\mathbb{R}^n$.

Let $c_1, c_2, ..., c_p$ be scalars.

Then

$$c_1\bar{v}_1 + c_2\bar{v}_2 + \cdots + c_p\bar{v}_p$$

is a linear combination of $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p$ with weights $c_1, c_2, ..., c_p$
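A minimal sketch of computing a linear combination componentwise, using hypothetical vectors and weights (none of these particular numbers come from the notes):

```python
# Hypothetical vectors in R^3 and scalar weights
v1 = [1, 0, 2]
v2 = [3, -1, 0]
c1, c2 = 2, -1

# c1*v1 + c2*v2, computed entry by entry
combo = [c1 * a + c2 * b for a, b in zip(v1, v2)]
print(combo)  # [-1, 1, 4]
```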

#### Span

Let $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p \in \mathbb{R}^n$. The **span** of $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p$ is the set of all linear combinations of $\bar{v}_1 , \bar{v}_2, ... , \bar{v}_p$.
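One way to sketch a span-membership test in $\mathbb{R}^3$ (with hypothetical vectors, not ones from the notes): if $\bar{v}_1$ and $\bar{v}_2$ are independent, a vector $b$ is in their span exactly when the $3 \times 3$ matrix with columns $\bar{v}_1, \bar{v}_2, b$ has determinant zero:

```python
def det3(M):
    # cofactor expansion along the first row of a 3x3 matrix
    return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
          - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
          + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))

v1, v2 = [1, 0, 2], [3, -1, 0]
b_in  = [5, -1, 4]   # equals 2*v1 + 1*v2, so it is in the span
b_out = [0, 0, 1]    # not a combination of v1 and v2

# Build matrices whose columns are v1, v2, and the candidate vector
M_in  = [[v1[i], v2[i], b_in[i]]  for i in range(3)]
M_out = [[v1[i], v2[i], b_out[i]] for i in range(3)]
print(det3(M_in), det3(M_out))  # 0 means "in the span" here
```

This shortcut only works because the two spanning vectors are independent and live in $\mathbb{R}^3$; the general test is to row reduce the augmented matrix $[\bar{v}_1 \ \bar{v}_2 \mid b]$ and check consistency.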

#### Echelon Forms

- All nonzero rows are above any rows of all zeros
- Each leading entry of a row is in a column to the right of the leading entry of the row above it
- All entries in a column **below** a leading entry are zero

**REDUCED ECHELON**

The above three plus

- Leading entry in each row is 1
- Each leading 1 is the only non-zero entry in its column

#### Row Reductions

There are three elementary row operations, each of which produces a row-equivalent matrix:

- **Interchange**: swap two rows
- **Scale**: multiply a row by a nonzero constant
- **Replace**: add a multiple of one row to another row
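A minimal sketch of row reduction to reduced echelon form using the three elementary row operations (interchange, scale, replace), with exact fractions to avoid floating-point error. The example system is hypothetical:

```python
from fractions import Fraction

def rref(matrix):
    """Row-reduce a matrix to reduced echelon form."""
    m = [[Fraction(x) for x in row] for row in matrix]
    rows, cols = len(m), len(m[0])
    pivot_row = 0
    for col in range(cols):
        if pivot_row >= rows:
            break
        # Interchange: find a row with a nonzero entry in this column
        pivot = next((r for r in range(pivot_row, rows) if m[r][col] != 0), None)
        if pivot is None:
            continue
        m[pivot_row], m[pivot] = m[pivot], m[pivot_row]
        # Scale: make the leading entry 1
        lead = m[pivot_row][col]
        m[pivot_row] = [x / lead for x in m[pivot_row]]
        # Replace: zero out every other entry in the pivot column
        for r in range(rows):
            if r != pivot_row:
                factor = m[r][col]
                m[r] = [a - factor * b for a, b in zip(m[r], m[pivot_row])]
        pivot_row += 1
    return m

# Augmented matrix for x + 2y = 5, 3x + 4y = 11
print(rref([[1, 2, 5], [3, 4, 11]]))  # [[1, 0, 1], [0, 1, 2]], i.e. x = 1, y = 2
```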

#### Vector

A quantity that has both direction and magnitude (oh yeah!), represented as a vertical matrix of dimensions $n \times 1$ where $\vec{u} = \begin{bmatrix} u_1 \\ u_2 \\ \vdots \\ u_n \end{bmatrix}$

**Properties**

$\vec{u} + \vec{v} = \vec{v} + \vec{u}$ Commutative Property

$(\vec{u} + \vec{v}) + \vec{w} = \vec{u} + (\vec{v} + \vec{w})$ Associative Property

$\vec{u} + \vec{0} = \vec{u}$ Identity Property

$\vec{u} + (-\vec{u}) = \vec{0}$ Inverse Property

$c(\vec{u} + \vec{v}) = c\vec{u} + c\vec{v}$ Distributive Property

Linear combinations of any two vectors are in the span of those vectors; therefore, every vector of the form $c\vec{u} + d\vec{v}$ lies in the span of $\vec{u}$ and $\vec{v}$.

## Homogeneous Equations

A **homogeneous** equation is one of the form $A\vec{x} = \vec{0}$; a solution is any $\vec{x}$ that satisfies it.

For example, if row reduction leaves $x_3$ free with $x_1 = 4x_3 \\ x_2 = -2x_3 \\ x_3 = x_3$ then in **Parametric Form**, $\vec{x} = x_3 \begin{bmatrix} 4\\-2\\1 \end{bmatrix}$

This is the homogeneous solution.
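The matrix behind this example isn't shown in the notes, but one $A$ consistent with the parametric solution is the reduced system $x_1 - 4x_3 = 0$, $x_2 + 2x_3 = 0$ (an assumption for illustration). A quick check that every choice of the free variable solves $A\vec{x} = \vec{0}$:

```python
from fractions import Fraction

# Hypothetical A reconstructed from x1 = 4*x3, x2 = -2*x3
A = [[1, 0, -4],
     [0, 1, 2]]

def matvec(A, x):
    # Matrix-vector product, one dot product per row
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# Any value of the free variable x3 gives a solution of A x = 0
for x3 in [Fraction(1), Fraction(-7), Fraction(5, 2)]:
    x = [4 * x3, -2 * x3, x3]
    print(matvec(A, x))  # [0, 0] every time
```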

#### Non-Homogeneous

A solution that satisfies the equation $A\vec{x} = \vec{b}$, where $A$ is a matrix of $n \times m$ dimensions and $\vec{b} \neq \vec{0}$ is a matrix of $n \times 1$ dimensions (is it unnecessary to say that?)

To solve for the non-homogeneous $\vec{x}$, simply augment $A$ with $\vec{b}$ and solve. We get $\vec{x} = x_3 \begin{bmatrix} 4\\-2\\1 \end{bmatrix} + \begin{bmatrix} 9\\-3\\0 \end{bmatrix}$ in Parametric form. Notice that this looks very much like the homogeneous solution (which is what it becomes if $\vec{b} = \vec{0}$), except there is now a point $(9,-3,0)$ that the line is forced through. That's the difference: the homogeneous solution is the set of all solutions of $A\vec{x} = \vec{0}$, while $\begin{bmatrix} 9\\-3\\0 \end{bmatrix}$ is a particular solution of $A\vec{x} = \vec{b}$. Therefore, for every $\vec{b}$ that makes $A\vec{x} = \vec{b}$ consistent, the solution line has the same direction part $x_3 \begin{bmatrix} 4\\-2\\1 \end{bmatrix}$; changing $\vec{b}$ only changes the point the line is forced through.
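Using the same hypothetical $A$ reconstructed from the parametric solution (an assumption, since the notes omit the original matrix), with $\vec{b} = \begin{bmatrix} 9\\-3 \end{bmatrix}$ chosen so that $(9,-3,0)$ is a particular solution, every value of the free variable lands on the same right-hand side:

```python
from fractions import Fraction

# Hypothetical A and b consistent with the parametric solution above
A = [[1, 0, -4],
     [0, 1, 2]]
b = [9, -3]  # chosen so that [9, -3, 0] is a particular solution

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# particular solution + any multiple of the homogeneous direction
for t in [Fraction(0), Fraction(3), Fraction(-1, 2)]:
    x = [9 + 4 * t, -3 - 2 * t, t]
    print(matvec(A, x))  # [9, -3] for every t
```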

## Linear Independence

The vectors $\vec{v_1}, \vec{v_2}, ... ,\vec{v_n}$ in $\Re^n$ are **linearly independent** if the vector equation

$$x_1\vec{v_1} + x_2\vec{v_2} + \cdots + x_n\vec{v_n} = \vec{0}$$

has only the trivial ($\vec{x} = \vec{0}$) solution.

In summary, for a linearly independent set this vector equation has exactly one solution, and each vector supplies independent information.

Two vectors that are not scalar multiples of each other must be linearly independent.

If the vectors are linearly **dependent**, there must exist scalars $c_1, c_2, c_3, ... , c_p$, at least one of which is nonzero, such that

$$c_1\vec{v_1} + c_2\vec{v_2} + \cdots + c_p\vec{v_p} = \vec{0}$$

In other words, there is more than just the zero solution. Ergo, if there are infinitely many solutions, the vectors are linearly dependent.

If, however, there are only two vectors, they are linearly dependent if and only if one is a scalar multiple of the other.
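The two-vector test above can be sketched directly (with hypothetical vectors): find the ratio on one nonzero entry, then check that the same ratio works everywhere.

```python
from fractions import Fraction

def dependent_pair(u, v):
    """Two vectors are linearly dependent iff one is a scalar multiple of the other."""
    if all(x == 0 for x in u) or all(x == 0 for x in v):
        return True  # the zero vector makes any pair dependent
    # ratio from the first nonzero entry of u, then verified on every entry
    i = next(i for i, x in enumerate(u) if x != 0)
    c = Fraction(v[i], u[i])
    return all(Fraction(vi) == c * ui for ui, vi in zip(u, v))

print(dependent_pair([1, 2, 3], [2, 4, 6]))  # True: v = 2u
print(dependent_pair([1, 2, 3], [2, 4, 7]))  # False: not scalar multiples
```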

A function is a rule with exactly one output for every input.

$A\vec{x}$ maps a vector $\vec{x}$ in $\Re^n$ to a vector in $\Re^m$

A = $\begin{bmatrix} 1&3&-1&0\\0&1&0&1 \end{bmatrix}$

$T: \Re^4 \rightarrow \Re^2$, $\text{ } T(\vec{x}) = A\vec{x}$

The image of $\vec{x}$ under $T$ is $T(\vec{x}) = A\vec{x}$

Do note that $T(\vec{x_4}) = T(\vec{x_6})$. This means that the function $T$ is not one-to-one: there can be more than one input with the same output, much as a parabola has the same $y$ value for two different $x$ values.
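The specific vectors $\vec{x_4}$ and $\vec{x_6}$ aren't shown in the notes, but the same collision is easy to reproduce with the $A$ given above: $[1, 0, 1, 0]$ is in the null space of $A$, so any two hypothetical inputs differing by it have the same image.

```python
A = [[1, 3, -1, 0],
     [0, 1, 0, 1]]

def T(x):
    # T(x) = A x as a matrix-vector product
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

# [1, 0, 1, 0] is a null vector of A, so adding it never changes the output
u = [2, 5, -1, 3]   # hypothetical input
w = [3, 5, 0, 3]    # u + [1, 0, 1, 0]
print(T(u), T(w))   # the same vector twice, so T is not one-to-one
```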

## Linear Transformations

A transformation $T$ with domain $D$ is **linear** if, for all $\vec{u}, \vec{v}$ in $D$ and all scalars $c$,

- $T(\vec{u} + \vec{v}) = T(\vec{u}) + T(\vec{v})$
- $T(c\vec{u}) = cT(\vec{u})$
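Both properties can be spot-checked for the matrix transformation above (the test vectors and scalar are hypothetical; a spot check is not a proof, since the properties must hold for *all* inputs):

```python
from fractions import Fraction

A = [[1, 3, -1, 0],
     [0, 1, 0, 1]]

def T(x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

u = [1, 0, 2, -1]
v = [0, 3, 1, 1]
c = Fraction(5, 2)

# T(u + v) == T(u) + T(v)
lhs = T([a + b for a, b in zip(u, v)])
rhs = [a + b for a, b in zip(T(u), T(v))]
print(lhs == rhs)  # True

# T(c u) == c T(u)
print(T([c * a for a in u]) == [c * b for b in T(u)])  # True
```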