Week 272 – Nov. 11th to Nov. 17th

It wasn’t a spectacular week for me working through KA, but I did manage to get a decent amount of work done and made some progress. My goal was to get through the first section of this new unit, Matrix Transformations, which I didn’t manage to do, but I made it through the two exercises and six of the eight remaining videos. For the most part, everything that I read about and watched was pretty straightforward and easy to wrap my head around, although there were a few things that I still found confusing. (Like always.) Overall though, I feel pretty good about the effort I put in and what I was able to learn and get through. As I said, it was nothing spectacular but it was progress nonetheless!

The first exercise I worked through, Visualizing Linear Transformations, began by talking about what a linear transformation is and showed a bunch of videos of transformations that were and weren’t linear transformations. It was helpful to drive home the point that for a transformation to be linear, 1) the origin must remain fixed and 2) the basis vectors can rotate around the origin and be stretched, but the “grid lines” – I’m not sure if that’s what they’re called in dimensions greater than 3D – cannot be curved; they have to stay straight, parallel, and evenly spaced.
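To make sure I had the “origin stays fixed” part straight, I threw together a tiny numpy sketch (the matrix and the shift amount are just numbers I made up):

```python
import numpy as np

# A made-up matrix transformation: it can rotate and stretch things,
# but the origin stays put, which is consistent with being linear.
A = np.array([[2.0, -1.0],
              [1.0,  1.0]])
origin = np.array([0.0, 0.0])
print(A @ origin)        # [0. 0.] -> the origin doesn't move

# A translation (shift everything by [3, 1]) moves the origin,
# so it can't be a linear transformation.
shift = np.array([3.0, 1.0])
print(origin + shift)    # [3. 1.] -> the origin moved
```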

The rest of the exercise got into the idea that if you have a point on a 2D grid, say [3, 2], you can get to that point by scaling the basis vectors [1, 0] and [0, 1] by 3 and 2, respectively, and adding them together. After a linear transformation, you can still get to wherever the point [3, 2] ended up post-transformation by multiplying wherever the basis vectors ended up by 3 and 2:
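Here’s a quick numpy sketch of that idea (the transformation matrix is one I made up; its columns are just wherever I decided the basis vectors should land):

```python
import numpy as np

# Made-up transformation matrix; its columns are where [1, 0] and [0, 1] land.
A = np.array([[1.0, -1.0],
              [2.0,  1.0]])

e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
point = 3 * e1 + 2 * e2                # the point [3, 2], built from the basis vectors

direct = A @ point                     # transform the point directly...
rebuilt = 3 * (A @ e1) + 2 * (A @ e2)  # ...or transform the basis vectors, then take 3 and 2 of them

print(direct, rebuilt)                 # [1. 8.] [1. 8.] -> same spot either way
```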

The second exercise, Matrix from Visual Representation of Transformation, was just a bunch of videos of example LT’s, each with multiple-choice options below it asking which matrix produced the transformation shown in the video. Here’s an example:
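The way to answer these, as far as I can tell, is to remember that the columns of the matrix are just wherever the basis vectors end up in the video. A quick numpy sketch of that pattern (the landing spots here are made up, not from an actual KA video):

```python
import numpy as np

# Pretend the video shows e1 = [1, 0] landing on [2, 1] and e2 = [0, 1] landing on [-1, 1].
# (These landing spots are made up, just to show the pattern.)
T_e1 = np.array([2.0, 1.0])
T_e2 = np.array([-1.0, 1.0])

# The matrix of the transformation is those landing spots stacked side by side as columns.
A = np.column_stack([T_e1, T_e2])
print(A)
# [[ 2. -1.]
#  [ 1.  1.]]
```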

Here are screenshots and notes from the six videos I watched this week:

Video 1 – Matrix Vector Products as Linear Transformations

At the start of this video, Sal was going through the general and somewhat abstract idea of what’s happening with a linear transformation. It was helpful to get a broader grasp of what’s going on, and also to get a better idea of the notation.

The following three screenshots were from the remainder of the video, where Sal worked through the algebraic proof that T(a + b) = T(a) + T(b) and T(ca) = cT(a), which I’m happy to say I actually fully understood and could visualize throughout his explanation. 😊
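Just to double-check myself, I also verified the two properties numerically with numpy (the matrix, vectors, and scalar are all arbitrary numbers I picked):

```python
import numpy as np

# Arbitrary, made-up matrix, vectors, and scalar.
A = np.array([[1.0, 2.0],
              [0.0, 3.0]])
a = np.array([1.0, -2.0])
b = np.array([4.0,  0.5])
c = 2.5

# T(a + b) == T(a) + T(b)
print(np.allclose(A @ (a + b), A @ a + A @ b))   # True
# T(c*a) == c*T(a)
print(np.allclose(A @ (c * a), c * (A @ a)))     # True
```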

Video 2 – Linear Transformations as Matrix Vector Products

I don’t have the words to properly explain this, but in this video Sal showed how to use the identity matrix to break apart (?) a vector that is made up of functions (?) and turn it into a 3×2 matrix where there are only scalar values without variables.

(That could be totally wrong but, like I said, I don’t know how to explain it. 😔 So, in other words, I don’t understand what’s going on…)
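That said, trying to write it out as code afterwards helped a bit. If I’m following the video at all, the idea is that you feed the columns of the identity matrix into the transformation one at a time, and the outputs become the columns of the matrix. Here’s a sketch with a made-up transformation from R2 to R3 (not the exact one Sal used):

```python
import numpy as np

# A made-up transformation from R^2 to R^3, written as formulas with variables
# (not necessarily the one from the video, but the same idea).
def T(v):
    x1, x2 = v
    return np.array([x1 + 3 * x2,
                     5 * x2 - x1,
                     4 * x1 + x2])

# Feed the columns of the 2x2 identity matrix through T...
e1, e2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# ...and the outputs, stacked as columns, are the 3x2 matrix of plain scalar values.
A = np.column_stack([T(e1), T(e2)])
print(A)

# Sanity check: multiplying by A gives the same answer as the formula version of T.
v = np.array([2.0, -1.0])
print(np.allclose(A @ v, T(v)))   # True
```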

Video 3 – Image of a Subset Under a Transformation

This was a cool video and showed how linear transformations can “move” shapes by taking the vectors that make up a shape and running them through LT’s. I was able to follow along with the math pretty well and had a decent grasp on how everything worked and what the notation was referring to. It still seems a bit hieroglyphic to me, but I can more or less make sense of the notation, so I’ve got that going for me, which is nice. I’m also getting pretty good at doing matrix multiplication.
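Here’s a small numpy sketch of the “moving shapes” idea (the triangle’s corners and the shear matrix are made up):

```python
import numpy as np

# Corners of a made-up triangle, one corner per column.
triangle = np.array([[0.0, 2.0, 1.0],    # x-coordinates
                     [0.0, 0.0, 2.0]])   # y-coordinates

# A made-up LT (a shear). Multiplying moves every corner at once,
# and the moved corners outline the image of the shape.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])
print(A @ triangle)
# [[0. 2. 3.]
#  [0. 0. 2.]]
```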

Video 4 – im(T): Image of a Transformation

I may have missed the point of this video (what else is new?), but I believe Sal was just saying that if you run a subspace of Rn through a LT, it gets “mapped” into Rm. Up to this point, I always thought that the dimension of the output space, Rm, had to be smaller than the dimension of the input space, Rn (i.e. “m” had to be smaller than “n”), so Sal’s diagram was confusing to me. I couldn’t understand how Rn would be “smaller” and fit inside of Rm given that I thought Rm had to have fewer dimensions and so, in my mind, Rm had to be “smaller”. I think 1) I may just be thinking about this completely wrong, and 2) I realized that m does not need to be smaller than n, as long as the number of columns in the matrix and the number of rows in the vector it’s multiplying are equal, i.e. they’re both n.
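A concrete example that helped me with point 2): a made-up 3×2 matrix, where m = 3 is actually bigger than n = 2, and the multiplication still works fine because the input vector has n = 2 entries:

```python
import numpy as np

# A made-up 3x2 matrix: it takes vectors from R^2 (n = 2 columns) and
# produces vectors in R^3 (m = 3 rows). Here m > n and that's fine;
# the only requirement is that the input vector has n entries.
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [2.0, 3.0]])

x = np.array([3.0, 2.0])   # lives in R^2
print(A @ x)               # lives in R^3: [ 3.  2. 12.]
```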

Video 5 – Pre-image of a Set

This was a very short video where Sal talked about the idea of going in the inverse direction of a transformation and talked about the notation and terminology used when doing so. After a LT, the output set is called the “image”, so going in the inverse direction, the OG set before the LT is known as the “pre-image”.

Video 6 – Preimage and a Kernel Example

In this video Sal worked through an example of taking two output vectors (or image vectors? 🤔) from a LT, [0, 0] and [1, 0], and finding their pre-image vectors. He did this by augmenting the LT matrix with each output vector, using EROs to turn the augmented matrix into REF, then putting the REF matrix into vector notation, which ends up giving the vectors that collapse down to the output vectors [0, 0] (a.k.a. the null space) and [1, 0].

Probably none of what I just wrote made any sense. Nonetheless, even though I can’t explain it, I do think I have the gist of what’s going on, but I don’t understand why or how augmenting the LT matrix and turning it into REF works to find the pre-image of the output vectors. 🤷🏻‍♂️
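If I’m understanding it right, though, the reason it works is that the pre-image of an output vector b is just every input x with Ax = b, so augmenting A with b and row reducing is literally just solving that system. Here’s a little sympy sketch of the same augment-and-row-reduce process (the matrix is one I made up, not necessarily the one from the video):

```python
from sympy import Matrix

# A made-up LT matrix (not necessarily the one Sal used in the video).
A = Matrix([[1, -1],
            [0,  0]])

# The pre-image of an output vector b is every x with A*x = b, so augment A with b
# and row reduce, just like in the video.
for b in (Matrix([0, 0]), Matrix([1, 0])):
    augmented = A.row_join(b)
    reduced, pivots = augmented.rref()
    print(list(b), "->", reduced.tolist())

# For b = [0, 0] the solution set is the null space (the kernel),
# which sympy can hand back directly:
print(A.nullspace())   # [Matrix([[1], [1]])] -> every multiple of [1, 1] maps to [0, 0]
```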

And that was it for this past week. Not too bad, but I wish I had gotten more done.

I have two more videos left in this section. After that, there are 43 videos remaining in the unit, spread out across six more sections. I’m hoping to get through eight videos by the end of this week, which would finish off the section I’m currently in AND the next section. Getting through eight videos seems like it should be easily manageable, but my goal was also to get through eight videos this past week, which I didn’t do, so we’ll see. I do feel like I’ve got some momentum back though, which is nice! I’m making progress and am getting closer and closer to finishing off the MATH section of KA. I definitely want to have a good week to keep my momentum going so I can finish off the MATH section before I turn 90 years old. 🙏🏼