Hey everyone, I've got another quick footnote for you between chapters today. When I've talked about linear transformations so far, I've only really talked about transformations from 2D vectors to other 2D vectors, represented with 2x2 matrices, or from 3D vectors to other 3D vectors, represented with 3x3 matrices. But several commenters have asked about non-square matrices, so I thought I'd take a moment to just show what those mean geometrically.
By now in the series, you actually have most of the background you need to start pondering a question like this on your own, but I'll start talking through it just to give a little mental momentum. It's perfectly reasonable to talk about transformations between dimensions, such as one that takes 2D vectors to 3D vectors. Again, what makes one of these linear is that gridlines remain parallel and evenly spaced, and that the origin maps to the origin.
What I have pictured here is the input space on the left, which is just 2D space, and the output of the transformation shown on the right. The reason I'm not showing the inputs move over to the outputs like I usually do is not just animation laziness; it's worth emphasizing that 2D vector inputs are very different animals from these 3D vector outputs, living in a completely separate, unconnected space. Encoding one of these transformations with a matrix is really just the same thing as what we've done before.
You look at where each basis vector lands, and write the coordinates of the landing spots as the columns of a matrix. For example, what you're looking at here is an output of a transformation that takes i-hat to the coordinates 2, negative 1, negative 2, and j-hat to the coordinates 0, 1, 1. Notice, this means the matrix encoding our transformation has three rows and two columns, which, to use standard terminology, makes it a 3x2 matrix.
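If you want to check that arithmetic numerically, here's a minimal numpy sketch of this exact example; the names A and v are just mine for illustration:

```python
import numpy as np

# Columns record where each basis vector lands:
# i-hat -> (2, -1, -2), j-hat -> (0, 1, 1)
A = np.array([[ 2, 0],
              [-1, 1],
              [-2, 1]])   # 3 rows, 2 columns: a 3x2 matrix

v = np.array([3, 4])      # any 2D input vector

# The output is 3 * (where i-hat lands) + 4 * (where j-hat lands)
print(A @ v)              # [ 6  1 -2]
```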
In the language of the last video, the column space of this matrix, the place where all the vectors land, is a 2D plane slicing through the origin of 3D space. But the matrix is still full rank, since the number of dimensions in this column space is the same as the number of dimensions of the input space. So if you see a 3x2 matrix out in the wild, you can know that it has the geometric interpretation of mapping two dimensions to three dimensions, since the two columns indicate that the input space has two basis vectors, and the three rows indicate that the landing spots for each of those basis vectors is described with three separate coordinates.
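As a quick sanity check of that full-rank claim, numpy can report the dimension of the column space directly:

```python
import numpy as np

A = np.array([[ 2, 0],
              [-1, 1],
              [-2, 1]])

# The rank is the dimension of the column space. Here it's 2, matching
# the 2D input space, so this 3x2 matrix is indeed full rank.
print(np.linalg.matrix_rank(A))  # 2
```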
Likewise, if you see a 2x3 matrix with two rows and three columns, what do you think that means? Well, the three columns indicate that you're starting in a space that has three basis vectors, so we're starting in three dimensions, and the two rows indicate that the landing spot for each of those three basis vectors is described with only two coordinates, so they must be landing in two dimensions. So it's a transformation from 3D space onto the 2D plane, a transformation that should feel very uncomfortable if you imagine going through it.
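Here's the same kind of numpy sketch going the other way; the particular matrix B is a made-up example of mine, not one from the animation:

```python
import numpy as np

# Each of the three columns says where one 3D basis vector lands in 2D.
B = np.array([[1, 0, 2],
              [0, 1, 1]])  # 2 rows, 3 columns: a 2x3 matrix

w = np.array([1, 2, 3])    # a 3D input vector
print(B @ w)               # [7 5] -- a 2D output
```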
You could also have a transformation from two dimensions to one dimension. One-dimensional space is really just the number line, so a transformation like this takes in 2D vectors and spits out numbers. Thinking about gridlines remaining parallel and evenly spaced is a little bit messy due to all of the squishification happening here, so in this case, the visual understanding for what linearity means is that a line of evenly spaced dots would remain evenly spaced once those dots are mapped onto the number line.
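To see that evenly-spaced-dots picture concretely, here's a small sketch; the 1x2 matrix L and the particular line of dots are made-up examples:

```python
import numpy as np

L = np.array([[1, 2]])   # a linear map from 2D to the number line

# Evenly spaced dots along a line in 2D, each a step of (1, 3) apart...
dots = np.array([[0, 0], [1, 3], [2, 6], [3, 9]])

# ...land as evenly spaced numbers (steps of 7) on the number line.
print((L @ dots.T).ravel())  # [ 0  7 14 21]
```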
One of these transformations is encoded with a 1x2 matrix, each of whose two columns has just a single entry. The two columns represent where the basis vectors land, and each of those columns requires just one number: the number that basis vector lands on. This is actually a surprisingly meaningful type of transformation with close ties to the dot product, and I'll be talking about that in the next video.
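You can already glimpse that tie in the arithmetic: applying a 1x2 matrix to a 2D vector is the same computation as a dot product. A tiny sketch, with made-up entries:

```python
import numpy as np

L = np.array([[1, 2]])    # a 1x2 matrix
v = np.array([4, 3])

# The 1x2 matrix-vector product...
print(L @ v)              # [10]

# ...is the same arithmetic as dotting v with the vector (1, 2).
print(np.dot([1, 2], v))  # 10
```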
Until then, I encourage you to play around with this idea on your own, contemplating the meanings of things like matrix multiplication and linear systems of equations in the context of transformations between different dimensions; one small starting point is sketched below.
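For instance, composing the two sketch matrices from earlier already tells a shape story: a 2D-to-3D map followed by a 3D-to-2D map multiplies out to a 2x2 matrix, a transformation from 2D back to 2D. (Again, A is the example from this video, while B is my made-up matrix from above.)

```python
import numpy as np

A = np.array([[ 2, 0],
              [-1, 1],
              [-2, 1]])    # 3x2: takes 2D vectors to 3D
B = np.array([[1, 0, 2],
              [0, 1, 1]])  # 2x3: takes 3D vectors to 2D

# Apply A first, then B: the composition is the product B @ A,
# a 2x2 matrix, i.e. a transformation from 2D back to 2D.
print(B @ A)           # [[-2  2]
                       #  [-3  2]]
print((B @ A).shape)   # (2, 2)
```

Have fun!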