Week 286 – Feb. 17th to Feb. 23rd

It was another subpar week for me on KA, BUT I at least finished off Linear Algebra. That’s not saying much, though, since I only had two videos left to get through… To make things worse, I didn’t really understand much of what those last two videos talked about. 😒 To better understand eigenvectors, eigenvalues, etc., I watched two other videos from a YT channel called Visual Kernel (which is a pretty weird name), and those videos confirmed that I at least understand what eigenvectors and eigenvalues are. I’d already watched all four of the videos from that YT channel before and talked about them in a previous post, but they’re so well done that I figured I’d go back and rewatch them. The silver lining of this week is that 1) I finished LA, 2) I understand what eigenvectors and eigenvalues are (even though I don’t have a strong grasp on how or why they work), and 3) I watched the first video of Differential Equations (which I thought was named Differential Calculus? 🤔). So, all in all, not a great week, but I’m finally moving on to Differential Equations, so I’ve at least got that going for me, which is nice.

Video 1 – Eigenvectors and Eigenspaces for a 3×3 Matrix

Reading back my notes and looking at the screenshots from his vid, I’m so confused. 😩 I can kind of follow along and I sort of understand the gist of what’s happening, but the truth is I’m pretty lost with all of it. The point of this vid is to show how to solve for a given transformation matrix’s eigenvectors, in this case for the eigenvalue λ = 3. One positive is that, as I was working through this question, I tried solving it before watching Sal solve it and actually managed to get the column vector notation correct. 😮‍💨 Plus, I understand that the solutions mean there’s a plane spanned by the vectors [½, 1, 0] and [½, 0, 1] such that when any vector on that plane gets multiplied by the matrix A, it gets scaled by λ = 3. Boom. 🧨
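Since I had the solution written down, I figured I could at least sanity-check it numerically. Here’s a quick sketch in Python (assuming I copied matrix A down from Sal’s video correctly; the two vectors and λ = 3 are straight from the solution, so if A is right, this should all check out):

```python
import numpy as np

# The 3x3 matrix from Sal's video (if I copied it down correctly):
A = np.array([[-1.0,  2.0,  2.0],
              [ 2.0,  2.0, -1.0],
              [ 2.0, -1.0,  2.0]])

# The two vectors that span the eigenspace for lambda = 3:
v1 = np.array([0.5, 1.0, 0.0])
v2 = np.array([0.5, 0.0, 1.0])

print(A @ v1)  # [1.5, 3.0, 0.0], i.e. 3 * v1
print(A @ v2)  # [1.5, 0.0, 3.0], i.e. 3 * v2

# Any vector on the plane spanned by v1 and v2 also just gets scaled by 3:
w = 2.0 * v1 - 7.0 * v2
print(np.allclose(A @ w, 3.0 * w))  # True
```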

Video 2 – Showing That an Eigen-basis Makes for Good Coordinate Systems

The point of this video is that if you have a vector in the standard basis of, for example, R3 and you’re performing a transformation on the vector, T(x), IF the transformation matrix has as many linearly independent eigenvectors as the dimension of the space (which in R3 would be three L.I. eigenvectors), you can change the basis from the standard basis to the “eigen-basis”, THEN do the transformation, THEN take the transformed vector back to the standard basis (i.e. go from [x]S.B. → [x]Eig.B. → [T(x)]Eig.B. → [T(x)]S.B.). This is easier because the transformation matrix in the eigen-basis is a diagonal matrix with the eigenvalues (the lambdas) along its diagonal. (You can see where Sal drew this matrix in the second screenshot above on the right, labelled matrix D.)

(That was really confusing, but that’s essentially what I had written in my notes.)
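In case it helps future me, here’s a small numerical sketch of the same idea in Python. (The matrix A is just one I made up for the demo; it’s not from the video.)

```python
import numpy as np

# A made-up transformation matrix with two linearly independent
# eigenvectors, so it's diagonalizable:
A = np.array([[3.0, 1.0],
              [0.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix C whose columns
# are the eigenvectors, i.e. the change-of-basis matrix to the eigen-basis.
eigenvalues, C = np.linalg.eig(A)

# In the eigen-basis, the transformation matrix D = C^-1 A C is diagonal,
# with the eigenvalues (the lambdas) along the diagonal:
D = np.linalg.inv(C) @ A @ C
print(np.round(D, 10))

# Transform a vector the long way around:
x = np.array([5.0, 4.0])
x_eig  = np.linalg.inv(C) @ x  # [x]S.B. -> [x]Eig.B.
Tx_eig = D @ x_eig             # apply the (easy, diagonal) matrix D
Tx     = C @ Tx_eig            # [T(x)]Eig.B. -> [T(x)]S.B.

print(np.allclose(Tx, A @ x))  # True, same answer as applying A directly
```

The payoff is that applying D is just scaling each coordinate by its lambda, which is why the detour through the eigen-basis makes the transformation easier.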

Video 3 – Visualize Spectral Decomposition | SEE Matrix, Chapter 2

Below is the first video I watched from Visual Kernel:

I didn’t understand everything from this video, but halfway through (from ~5:00 to 8:00) he talks about eigenvectors and really makes it clear what they are. In my notes I wrote:

“As the video goes on, I was right in thinking that eigenvectors SCALE along a line and don’t rotate which is important because, as the video goes on to explain, certain transformations can be made easier by 1) rotating the eigenvectors onto the standard basis, 2) scaling them, and then 3) rotating the scaled shape/vectors back to their OG position (apart from being scaled).”

I think this is what Sal was explaining in the previous video. Here are a bunch of screenshots from Visual Kernel’s vid with a brief description below each one of what’s going on (and a quick numerical check after them):

Shows a random rectangle about to be scaled by the highlighted matrix.

Shows the rectangle transformed by the matrix.

Shows the blue eigenvectors of the transformation before being scaled.

Shows the blue eigenvectors rotated onto the standard basis.

Shows the length of the eigenvectors before being scaled.

Shows the x-axis eigenvector being scaled by λ = 6.

Shows the y-axis eigenvector being scaled by λ = 2.

Shows the scaled shape rotated back to where it was in the second screenshot of this series.
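From what I understand, that rotate-scale-rotate-back pattern is the spectral decomposition, A = QΛQᵀ, which works for symmetric matrices. Here’s a tiny numerical version of it; the matrix is one I made up so that its eigenvalues come out to 6 and 2 like in the screenshots (it’s not the actual matrix from the video):

```python
import numpy as np

# A made-up symmetric matrix whose eigenvalues happen to be 6 and 2:
A = np.array([[4.0, 2.0],
              [2.0, 4.0]])

# eigh is numpy's eigen-solver for symmetric matrices; the columns of Q
# are the (orthonormal) eigenvectors.
eigenvalues, Q = np.linalg.eigh(A)
print(eigenvalues)  # [2. 6.]

# Spectral decomposition: Q.T rotates the eigenvectors onto the standard
# basis, the diagonal matrix scales by the lambdas, and Q rotates back.
Lambda = np.diag(eigenvalues)
print(np.allclose(A, Q @ Lambda @ Q.T))  # True
```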

Video 4 – SVD Visualized, Singular Value Decomposition Explained | SEE Matrix, Chapter 3

Below is the other video I watched from Visual Kernel. It was very tough to follow the majority of what he was talking about:

One key takeaway I got from this video was that you can take a rectangular matrix and make it square and symmetrical by multiplying the matrix by its transpose. I believe this is important because it allows you to find eigenvectors for matrices that aren’t square, which can make certain transformations easier to calculate. (I could be wrong about that though.) Here are a bunch of screenshots from the video of what I’m talking about, plus a quick sanity check after them:

A rectangular matrix’s transpose multiplied by itself equals a square, symmetrical matrix.

This works in either order.

The previous two screenshots explain that it’s important to make matrices square and symmetrical because the resulting matrix will have eigenvectors that are orthogonal to each other. If you normalize the eigenvectors and put them into a matrix, then… (See next screenshot.)

The transpose rotates the eigenvectors to align with the standard basis.

(I don’t understand this final screenshot, BUT…) If you don’t take the transpose of the eigenvectors, the eigenvector matrix, in the creator’s words, “rotates the standard basis to the eigenvectors.” (However that works…)
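Here’s a quick check of that takeaway in Python, using a rectangular matrix I made up (not the one from the video): multiplying by the transpose in either order gives a square, symmetrical matrix, and the normalized eigenvectors of a symmetrical matrix are orthogonal, so stacking them into a matrix Q gives QᵀQ = I. (I think that last fact is what lets Qᵀ rotate the eigenvectors onto the standard basis.)

```python
import numpy as np

# A made-up 2x3 rectangular matrix:
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])

AtA = A.T @ A  # 3x3, square and symmetrical
AAt = A @ A.T  # 2x2, also square and symmetrical (works in either order)
print(np.allclose(AtA, AtA.T), np.allclose(AAt, AAt.T))  # True True

# The normalized eigenvectors of a symmetrical matrix are orthogonal,
# so the matrix Q holding them as columns satisfies Q.T @ Q = I:
eigenvalues, Q = np.linalg.eigh(AtA)
print(np.allclose(Q.T @ Q, np.eye(3)))  # True
```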

So, ya. I don’t at all understand how or why any of this works, but I think I understand the gist of why eigenvectors are helpful, and how to create square, symmetrical matrices and why they’re also useful.

And now, FINALLY, I’m moving on to Differential Equations. Like I said in the intro, I watched the first video from the course, which simply introduced what differential equations are. I didn’t watch it until Sunday, however, and figured I’d wait until the start of this coming week to make notes on it. That said, a more accurate way of saying it would be that I RE-watched the video, as I’d already watched it before. It turns out that I’ve already watched about 75% of the videos and done seven of the eight exercises from the first unit. I’m happy about this, but my plan is still to rewatch all the videos and redo all the exercises. It will take longer, but I’m hoping what’s covered will stick a bit more going through it this second time than it did the first time. Plus, it’ll be good to get as much practice in as possible before returning to the Multivariable Calculus Course Challenge, which is the final thing I have left to do… I’M SO CLOSE TO THE END! 😤

(I will literally be the Frodo meme when I finish this.)
