A linear transformation is a function $T: V \to W$ where $V$ and $W$ are vector spaces (in most cases $V = W$, and we then call $T$ a linear operator).
$T$ must have the properties that
$$T(\mathbf{u} + \mathbf{v}) = T(\mathbf{u}) + T(\mathbf{v}) \qquad \text{and} \qquad T(c\mathbf{v}) = cT(\mathbf{v})$$
for all vectors $\mathbf{u}, \mathbf{v}$ and scalars $c$.
This means that all straight lines remain straight lines after the transformation, and the origin maps to itself. Grid lines remain parallel and evenly spaced. This is a VERY important property: it is really the defining property of a linear transform and why they are so special.
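As a quick sanity check, here is a minimal numpy sketch verifying the two properties for a transform given by a matrix (the matrix, vectors, and scalar are arbitrary examples chosen just for illustration):

```python
import numpy as np

# An arbitrary 2x2 matrix standing in for a linear transform T(v) = A @ v.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])
c = 4.0

# Additivity: T(u + v) == T(u) + T(v)
print(np.allclose(A @ (u + v), A @ u + A @ v))  # True

# Homogeneity: T(c * v) == c * T(v)
print(np.allclose(A @ (c * v), c * (A @ v)))    # True
```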
Proof
Given any straight line, take the straight line vector equation $\mathbf{r} = \mathbf{a} + \lambda\mathbf{d}$. Applying a linear transform $T$ gives
$$T(\mathbf{r}) = T(\mathbf{a} + \lambda\mathbf{d}) = T(\mathbf{a}) + \lambda T(\mathbf{d}).$$
Since $T(\mathbf{a})$ and $T(\mathbf{d})$ are just vectors themselves, we simply have another straight line equation, showing that straight lines remain straight after a transform.
Also notice that two parallel lines will have the same direction vector $\mathbf{d}$, so taking the transform of a parallel line $\mathbf{r}' = \mathbf{b} + \lambda\mathbf{d}$ gives $T(\mathbf{b}) + \lambda T(\mathbf{d})$, which is parallel to $T(\mathbf{a}) + \lambda T(\mathbf{d})$ as they have the same direction vector $T(\mathbf{d})$.
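A small numpy check of this proof, with an arbitrary matrix and line chosen just for illustration: the images of points on the line $\mathbf{a} + t\mathbf{d}$ all land on the line $T(\mathbf{a}) + tT(\mathbf{d})$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Points on the line r = a + t*d for a few values of t.
a = np.array([1.0, 0.0])
d = np.array([1.0, 2.0])
ts = (0.0, 1.0, 2.0)
points = np.array([a + t * d for t in ts])

# Each image equals T(a) + t*T(d), i.e. the images stay on one straight line.
images = points @ A.T
expected = np.array([A @ a + t * (A @ d) for t in ts])
print(np.allclose(images, expected))  # True
```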
Using this we can show that a linear transform is defined solely by what happens to the basis vectors after the transform. Given a vector $\mathbf{v} = v_1\mathbf{e}_1 + v_2\mathbf{e}_2 + \dots + v_n\mathbf{e}_n$ where $\mathbf{e}_1, \dots, \mathbf{e}_n$ are the basis vectors, let $\mathbf{w} = T(\mathbf{v})$; then
$$\mathbf{w} = T(v_1\mathbf{e}_1 + \dots + v_n\mathbf{e}_n) = v_1 T(\mathbf{e}_1) + \dots + v_n T(\mathbf{e}_n).$$
As we can see, $\mathbf{w}$ has the same format as $\mathbf{v}$ but with the basis vectors changed to the new ones after the transform.
Using this property that a linear transform is based upon where the basis vectors land, we can write a transform as a square matrix whose column vectors are the new basis vectors.
For 2D this is
$$A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$$
where the basis vectors (typically $\hat{\imath} = \begin{pmatrix}1\\0\end{pmatrix}$ and $\hat{\jmath} = \begin{pmatrix}0\\1\end{pmatrix}$) map to $\begin{pmatrix}a\\c\end{pmatrix}$ and $\begin{pmatrix}b\\d\end{pmatrix}$ respectively.
Then to find $T(\mathbf{v})$ for $\mathbf{v} = \begin{pmatrix}x\\y\end{pmatrix}$ we can write it out as a matrix-vector multiplication; using what our basis vectors change to, we get
$$A\mathbf{v} = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\begin{pmatrix}x\\y\end{pmatrix} = x\begin{pmatrix}a\\c\end{pmatrix} + y\begin{pmatrix}b\\d\end{pmatrix} = \begin{pmatrix}ax + by\\cx + dy\end{pmatrix}.$$
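The same computation as a minimal numpy sketch (the matrix and vector are arbitrary examples): the output of a matrix-vector product is exactly the input coordinates weighting the columns.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
v = np.array([5.0, -2.0])  # v = 5*i_hat + (-2)*j_hat

# A @ v is exactly x * (first column) + y * (second column):
by_columns = v[0] * A[:, 0] + v[1] * A[:, 1]
print(A @ v, by_columns)               # same vector: [ 8. -6.]
print(np.allclose(A @ v, by_columns))  # True
```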
Matrix multiplication is simply finding a single matrix that has the exact same effect as applying the matrices one after another. Matrix multiplication is read from right to left, e.g. $AB$ is a transformation by $B$ then $A$. This comes from function notation: $(AB)\mathbf{v} = A(B\mathbf{v})$.
For 2D, if we apply two linear transforms in a row, say first
$$B = \begin{pmatrix} e & f \\ g & h \end{pmatrix} \quad \text{and then} \quad A = \begin{pmatrix} a & b \\ c & d \end{pmatrix},$$
we get $A(B\mathbf{v})$.
We want to find a single matrix $AB$ that has the exact same effect as applying the matrices one after another.
To find this we can consider where the basis vectors end up. By definition the basis vector $\hat{\imath}$ ends up at $\begin{pmatrix}e\\g\end{pmatrix}$ after $B$, and then carrying out matrix-vector multiplication with $A$ gives $A\begin{pmatrix}e\\g\end{pmatrix} = \begin{pmatrix}ae+bg\\ce+dg\end{pmatrix}$. Doing the same with $\hat{\jmath}$ yields that
$$AB = \begin{pmatrix} a & b \\ c & d \end{pmatrix}\begin{pmatrix} e & f \\ g & h \end{pmatrix} = \begin{pmatrix} ae+bg & af+bh \\ ce+dg & cf+dh \end{pmatrix}.$$
This process of tracking the basis vectors can be generalised for any dimension.
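A quick numpy check of both facts, using an arbitrary rotation and scaling as the two example transforms: composing the transforms agrees with the single product matrix, and each column of $AB$ is $A$ applied to the matching column of $B$.

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])  # 90-degree rotation (example)
B = np.array([[2.0, 0.0],
              [0.0, 2.0]])   # scaling by 2 (example)

v = np.array([1.0, 1.0])

# Applying B then A agrees with the single matrix A @ B:
print(np.allclose(A @ (B @ v), (A @ B) @ v))    # True

# Each column of A @ B is A applied to the matching column of B:
print(np.allclose((A @ B)[:, 0], A @ B[:, 0]))  # True
print(np.allclose((A @ B)[:, 1], A @ B[:, 1]))  # True
```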
Column Space and Rank
For an $n \times n$ matrix $A$, the rank of $A$ is the number of dimensions in the output after a linear transform under $A$. If $\det(A) \neq 0$, that means that the rank of $A$ is equal to the dimension of the space, i.e. $\operatorname{rank}(A) = n$. If $\det(A) = 0$, that means the space has collapsed/shrunk into a lower dimension and the rank of $A$ is less than $n$.
The column space of $A$, denoted as $C(A)$, is similar and is simply the span of the column vectors in $A$. Equivalently, it is defined as the set of all possible outputs $A\mathbf{v}$.
If $A$ is an $n \times n$ real number matrix, then $C(A) \subseteq \mathbb{R}^n$, and if $\operatorname{rank}(A) = n$ the column space of $A$ is all of $\mathbb{R}^n$. So the rank of a matrix is simply the number of dimensions of the column space: $\operatorname{rank}(A) = \dim C(A)$.
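A minimal numpy sketch of rank, comparing a full-rank matrix with a rank-deficient one (both matrices are arbitrary examples):

```python
import numpy as np

# Full rank: the columns span all of R^2.
A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
print(np.linalg.matrix_rank(A))  # 2

# Rank deficient: the second column is a multiple of the first,
# so every output B @ v lies on a single line through the origin.
B = np.array([[1.0, 2.0],
              [2.0, 4.0]])
print(np.linalg.matrix_rank(B))  # 1
```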
Null Space
For a matrix $A$ the null space (also called the kernel) is the set of all vectors $\mathbf{v}$ such that $A\mathbf{v} = \mathbf{0}$, denoted as $N(A)$. This is the space of all vectors that collapse onto the origin under the linear transformation of $A$. The bigger the null space, the more dimensions are lost. The dimension of the null space is called the nullity of $A$.
The null space is connected to rank: for an $m \times n$ matrix $A$ the rank-nullity theorem gives
$$\operatorname{rank}(A) + \operatorname{nullity}(A) = n.$$
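A small numpy illustration of the rank-nullity theorem for a singular $2 \times 2$ matrix (the matrix and the null-space vector are chosen by hand for this example):

```python
import numpy as np

# Singular matrix: it collapses the plane onto a line.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

n = A.shape[1]
rank = np.linalg.matrix_rank(A)
nullity = n - rank        # rank-nullity theorem: rank + nullity = n
print(rank, nullity)      # 1 1

# The vector (2, -1) is in the null space: it maps to the origin.
v = np.array([2.0, -1.0])
print(A @ v)              # [0. 0.]
```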
Specific Transformations
Rotations
An anticlockwise rotation by angle $\theta$ sends $\hat{\imath}$ to $\begin{pmatrix}\cos\theta\\\sin\theta\end{pmatrix}$ and $\hat{\jmath}$ to $\begin{pmatrix}-\sin\theta\\\cos\theta\end{pmatrix}$, so it is represented by the matrix
$$R(\theta) = \begin{pmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{pmatrix}.$$
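A minimal numpy sketch of the rotation matrix (the helper name `rotation` is just for this example):

```python
import numpy as np

def rotation(theta):
    """2D anticlockwise rotation: columns are the images of i_hat and j_hat."""
    return np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])

# Rotating i_hat by 90 degrees should land on j_hat.
R = rotation(np.pi / 2)
print(np.round(R @ np.array([1.0, 0.0]), 10))  # [0. 1.]
```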
Enlargement and Stretches
You can represent a stretch with the matrix
$$\begin{pmatrix} a & 0 \\ 0 & b \end{pmatrix}.$$
It has stretch factor $a$ parallel to the $x$-axis and stretch factor $b$ parallel to the $y$-axis.
For stretches only along the $x$-axis (i.e. $b = 1$), points on the $y$-axis are invariant and any line $y = k$ is an invariant line, and vice versa.
For stretches in both directions, the only invariant point is the origin.
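A short numpy illustration of the invariant points and invariant lines of a stretch (the stretch factor is an arbitrary example):

```python
import numpy as np

# Stretch factor 3 parallel to the x-axis, factor 1 parallel to the y-axis.
S = np.array([[3.0, 0.0],
              [0.0, 1.0]])

# A point on the y-axis is unchanged (invariant point):
print(S @ np.array([0.0, 5.0]))  # [0. 5.]

# A point off the y-axis moves, but stays on its line y = 5 (invariant line):
print(S @ np.array([2.0, 5.0]))  # [6. 5.]
```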
For a linear transform by matrix $A$, $\det(A)$ is the scale factor of area (if it's negative, the shape has been reflected).
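A quick numpy check of the determinant as an area scale factor (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# The unit square (i_hat, j_hat) has area 1; its image is the parallelogram
# spanned by the columns of A, with area |det(A)|.
print(np.linalg.det(A))  # ~6.0: areas are scaled by a factor of 6
```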