I had a good week on KA, although it wasn’t exceptional in any way. I achieved my goal of getting through six videos and likely studied for a bit more than five hours in total, which is what I always aim for. Most of what I worked through covered what’s called the Gram-Schmidt process, which allows you to make a basis orthonormal. I was able to grasp the general idea of what this process does, but understanding how it works (i.e. understanding the linear algebra behind it) was difficult. I worked through a couple of examples which helped it sink in a bit, but I can tell I’m far from intuitively understanding how it works. My week ended on the final section of this unit, Eigen-Everything. I only made it through one video in this section, and it was pretty straightforward. I remember learning about Eigenvectors from Dr. TB’s linear algebra playlist, and I’m pretty sure I’ve seen a few random videos on YouTube covering Eigenvectors. So even though most of the week was a bit rough, it ended well and left me optimistic that I’ll be able to get through the final six videos of the course relatively easily. 😁
Video 1 – Example Using Orthogonal Change-of-Basis Matrix to Find Transformation Matrix








I’ll let my written notes speak for themselves on this one.
Video 2 – Orthogonal Matrices Preserve Angles and Lengths


The point of this video was that if you multiply a vector by an orthogonal change-of-basis matrix, the vector’s length and its angle relative to other vectors remain the same. I believe this means it’s the ‘grid’ behind the vectors (i.e. the basis) that changes, not the vectors themselves. Although I thought this was true for ALL change-of-basis matrices, so now I’m not sure… 🤔
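To convince myself this is true, I threw together a quick Python check (my own sketch, not from the video). A rotation matrix is a simple example of an orthogonal matrix, and multiplying by it leaves lengths and dot products (and therefore angles) unchanged:

```python
import math

# A rotation matrix is a simple orthogonal matrix (its columns are orthonormal).
theta = 0.5
C = [[math.cos(theta), -math.sin(theta)],
     [math.sin(theta),  math.cos(theta)]]

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def length(v):
    return math.sqrt(dot(v, v))

u = [3.0, 4.0]
w = [1.0, 2.0]
Cu, Cw = mat_vec(C, u), mat_vec(C, w)

# Lengths are preserved...
print(abs(length(Cu) - length(u)) < 1e-12)   # True
# ...and so are dot products, hence the angles between vectors.
print(abs(dot(Cu, Cw) - dot(u, w)) < 1e-12)  # True
```

(If you swap in a non-orthogonal matrix like [[2, 0], [0, 1]], both checks fail, which is what the video seems to be getting at.)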
Video 3 – The Gram-Schmidt Process



In this video, Sal introduced the Gram-Schmidt process. Like I said in the intro, I found the linear algebra pretty confusing, but the gist is that you take non-orthonormal vectors, find the projection of one onto another, then subtract that projection to get the component of the projected vector that’s orthogonal to the other one, what I think of as the “opposite” side of a right triangle. (That probably made no sense, but I don’t know how to put it into words any better… 😒)
The point is, the Gram-Schmidt process is a way to take a basis that’s NOT orthonormal and make it orthonormal.
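To make the process more concrete for myself, here’s a rough Python sketch of the idea (my own code, not Sal’s): for each vector, subtract its projections onto the orthonormal vectors found so far, then normalize whatever is left over.

```python
import math

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthonormal basis."""
    basis = []
    for v in vectors:
        # Subtract v's projection onto each basis vector already found;
        # what's left is the component of v orthogonal to all of them.
        w = list(v)
        for u in basis:
            coeff = dot(w, u)  # u is already unit length, so this is the projection coefficient
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        norm = math.sqrt(dot(w, w))
        basis.append([wi / norm for wi in w])
    return basis

basis = gram_schmidt([[3.0, 1.0], [2.0, 2.0]])
# The results should be unit length and mutually orthogonal:
print(round(dot(basis[0], basis[0]), 10))  # 1.0
print(round(dot(basis[0], basis[1]), 10))  # 0.0
```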
Video 4 – Gram-Schmidt Process Example






In this video, Sal worked through an example of using the Gram-Schmidt process. I don’t fully understand why the linear algebra works, but actually working through the math along with him wasn’t too difficult. As for what’s going on, the screenshot below shows the two vectors Sal was working with (v1 = [-1, 1, 0] in blue and v2 = [-1, 0, 1] in red), and the yellow vector represents the component of v2 that’s orthogonal to v1, i.e. what’s left over when you subtract v2’s projection onto v1, which lines up with the tip of v2. I don’t know how to explain it much further, but that’s essentially what you’re doing with the G.S. process: finding orthogonal vectors by subtracting projections of one vector onto another. (Something like that…)
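Here’s that one step of Sal’s example done in plain Python (my own sketch): project v2 onto v1, then subtract the projection, and the result is orthogonal to v1.

```python
v1 = [-1.0, 1.0, 0.0]
v2 = [-1.0, 0.0, 1.0]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

# Projection of v2 onto v1: (v2·v1 / v1·v1) * v1
c = dot(v2, v1) / dot(v1, v1)
proj = [c * x for x in v1]

# Subtracting the projection leaves the component of v2 orthogonal to v1.
y2 = [a - b for a, b in zip(v2, proj)]
print(y2)            # [-0.5, -0.5, 1.0]
print(dot(y2, v1))   # 0.0  (orthogonal, as expected)
```

(To finish the process you’d normalize v1 and y2 to unit length, which is what Sal does next.)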

Video 5 – Gram-Schmidt Example with 4 Basis Vectors









This is just another example of the G.S. process, this time finding the O.N. basis for three vectors in R4.
Video 6 – Introduction to Eigenvalues and Eigenvectors

This final video I watched was only eight minutes long and simply introduced Eigenvalues and Eigenvectors. In the video Sal said:
“Any vector that satisfies this right here [T(v) = λ(v)] is called an Eigenvector for the transformation ‘T’, and the Lambda – the multiple it becomes – is the Eigenvalue associated with that Eigenvector.”
As far as I can tell, Eigenvectors are the vectors that a transformation may scale but DOESN’T rotate (they stay on their own line), which is the key point.
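Here’s a quick Python sanity check I wrote for myself (the matrix is just one I picked, not from the video): applying a matrix to one of its Eigenvectors only scales it, while other vectors get knocked off their original direction.

```python
A = [[2.0, 1.0],
     [0.0, 3.0]]

def mat_vec(M, v):
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

# [1, 1] is an Eigenvector of A: applying A just scales it by λ = 3.
v = [1.0, 1.0]
print(mat_vec(A, v))  # [3.0, 3.0], i.e. 3 * v, same direction, just scaled

# A vector off the eigen-directions gets rotated as well as scaled:
w = [0.0, 1.0]
print(mat_vec(A, w))  # [1.0, 3.0], not a multiple of w
```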
And that’s going to do it for this week’s post. I’m feeling pretty confident that I’ll be able to get through this final section by the end of the upcoming week, meaning I’ll FINALLY be done with this course. A while ago I said that if I didn’t feel like I had a solid grasp on linear algebra by the time I finished the course, I’d spend some time searching for and watching other LA videos online. Although I still feel like I have a long way to go before I completely understand LA, I feel WAY better about my overall grasp on it now than I did a few weeks ago. So, my point is, as soon as I get through this course I’m going to head straight into Differential Calculus and will look up LA videos as necessary. I’m assuming linear algebra will come up in Differential Calculus (it did in Multivariable Calculus), so when that happens I’ll have a good excuse to go back and review LA vids. It always seems to be the case that when I go back to review a subject, it’s WAY easier to understand, so hopefully that will happen with LA. ANYWAYS… Six more vids to go in LA, one more course to get through with DC, then the course challenge to redo in MC, then… done. 😭
(Well, actually then Physics. But the goal I set for myself six years ago of getting through the Math section of KA will officially be done. 😮💨)