Okay, so this is the first lecture on eigenvalues and eigenvectors, and that's a big subject that will take up most of the rest of the course. Again, matrices are square, and we're looking now for some special numbers, the eigenvalues, and some special vectors, the eigenvectors. So this lecture is mostly about what these numbers are, and then the other lectures are about how we use them and why we want them. Okay, so what's an eigenvector? Maybe I'll start with the eigenvector. So I have a matrix A. What does a matrix do? It acts on vectors; it multiplies vectors x. So the way the matrix A works is: in goes a vector x, and out comes a vector Ax. It's like a function in calculus: in goes a number x, out comes f(x). Here in linear algebra we're up in more dimensions: in goes a vector x, out comes a vector Ax. And the vectors I'm specially interested in are the ones that come out in the same direction that they went in. That won't be typical; most vectors Ax point
in some different direction. But there are certain vectors where Ax comes out parallel to x, and those are the eigenvectors. So Ax parallel to x: those are the eigenvectors. And what do I mean by parallel? Oh, it's much easier to just state it in an equation: Ax is some multiple, and everybody calls that multiple lambda, of x. Ax = lambda x, that's our big equation. We look for special vectors, and remember most vectors won't be eigenvectors, for which Ax is in the same direction as x. And by same direction I allow the very opposite direction too; I allow lambda to be negative or zero. Well, I guess we've met the eigenvectors that have eigenvalue zero; those are in the same direction, but in a kind of very special way. So this is the eigenvector x, and lambda, whatever this multiplying factor is, whether it's six or minus six or zero or even some imaginary number, that's the eigenvalue. So there's the eigenvalue and there's the eigenvector. Let's just take a second on eigenvalue zero. From the point of view of eigenvalues, that's no special deal: we have an eigenvector, and if
the eigenvalue happened to be zero, that would mean that Ax was 0 times x, in other words zero. So where would we look for those x's? What are the eigenvectors with eigenvalue zero? They're the guys in the null space, Ax = 0. So if our matrix is singular, let me write this down: if A is singular (and what does singular mean? it means it takes some nonzero vector x into zero, and that x will be the eigenvector), then lambda = 0 is an eigenvalue. But we're interested in all eigenvalues now, so lambda = 0 is not so special anymore. Okay, so the question is how we find these x's and lambdas, and notice we don't have an equation Ax = b anymore, so I can't use elimination. I've got two unknowns, and in fact they're multiplied together; lambda and x are both unknowns here. So we need a good idea of how to find them. But before I do that, and that's where determinants come in, can I just give you some matrices
like, here you go, take a projection matrix. Okay, so suppose we have a plane, and our matrix P is what I've called A; I'm going to call it P for the moment because I'm thinking of a projection matrix. What are the eigenvalues of a projection matrix? So that's my question: what are the x's, the eigenvectors, and the lambdas, the eigenvalues, for a projection matrix? My point is that before we get into determinants and formulas and all that stuff, let's take some matrices where we know what they do. We know that if we take a vector b, what this matrix does is project it down to Pb. So is b an eigenvector in that picture? No. b is not an eigenvector, because Pb, the projection, is in a different direction. So now tell me, what vectors are eigenvectors of P? What vectors do get projected in the same direction that they start? So tell me some x's in this picture: where could I start with a vector, take its projection, and end up in the same direction? Well, that would happen if the vector was right in that plane already. So any x in the plane will be an eigenvector. And what will happen when I multiply by P, when I project the vector x? I called it b here because this is our familiar picture, but now I'm going to say that b was no good for our purposes; I'm interested in a vector x that's actually
in the plane, and I project it, and what do I get back? x, of course; it doesn't move. So any x in the plane is unchanged by P. And what's that telling me? It's telling me that x is an eigenvector, and it's also telling me the eigenvalue: just compare with Ax = lambda x, and the eigenvalue, the multiplier, is just one. Good, so we actually have a whole plane of eigenvectors. Now I ask, are there any other eigenvectors? And I expect the answer to be yes, because I would like to get three: if I'm in three dimensions, I would like to hope for three independent eigenvectors, two of them in the plane and one not in the plane. Okay, so this guy b that I drew there was no good. What's the right eigenvector that's not in the plane? The good one is the one that's perpendicular to the plane; there's another good x. Because what's the projection of this guy perpendicular to the plane? It's zero, of course. So those guys are in the null space: Px = 0, or 0 times x if we like, and the eigenvalue is 0. So my answer to the question, what are the eigenvalues of a projection matrix, is: there they are, 1 and 0. Okay, we know projection matrices; we can write them down as that A (A transpose A) inverse A transpose thing, but without doing that, from the picture we can see what the eigenvectors are.
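As a quick check of that picture, here is a minimal numpy sketch; the plane is a hypothetical one, spanned by the two columns of the tall matrix A below:

```python
import numpy as np

# Hypothetical plane in R^3: the column space of this tall matrix A
A = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])

# Projection onto that plane: P = A (A^T A)^{-1} A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T

# P is symmetric, so eigvalsh applies: two eigenvalues are 1 (eigenvectors in the plane),
# one is 0 (the eigenvector perpendicular to the plane)
print(np.round(np.linalg.eigvalsh(P), 10))
```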
Okay, are there other matrices? Let me take a second example: how about a permutation matrix? What about the matrix, I'll call it A now, [0 1; 1 0]? Can you tell me a vector x? We'll have a systematic method soon enough, so I would like to just do these couple of examples to see the picture before we let it all go into a system, where that matrix isn't anything special, because it is special. So what vector could I multiply by and end up in the same direction? Can you spot an eigenvector for this guy? That's a matrix that permutes x1 and x2, right? It switches the two components of x. How could the vector with x2 and x1 permuted turn out to be a multiple of (x1, x2), the vector we start with? Can you tell me an eigenvector here, x equal what? Actually, can you tell me one vector that has eigenvalue 1, so that if I permute it, it doesn't change? That could be (1, 1). Thanks, (1, 1). Okay, take that vector (1, 1): that will be an eigenvector, because if I do Ax I get (1, 1) back, so the eigenvalue is 1. Great, that's one eigenvalue. But I have here a 2 by 2 matrix, and I figure there's going to be a second eigenvalue and eigenvector. Now what about that? Maybe we can just guess it. The other one I'm thinking of is going to be a vector that has eigenvalue minus one. That's going to be
my other eigenvalue for this matrix. Notice, it's a nice non-negative matrix, but an eigenvalue is going to come out negative. Can you spot the x that will work for that? I want a vector such that when I multiply by A, which reverses the two components, the thing comes out minus the original. So what shall I send in in that case? If I send in (-1, 1), then when I apply A, I do that multiplication and I get (1, -1), so it reversed sign. So Ax is minus x: lambda is minus 1. So Ax was x there, and Ax is minus x here. Can I just jump ahead and point out a special little fact about eigenvalues? n by n matrices will have n eigenvalues, and if n is 3 or 4 or more, it's not so easy to find them; we'd have a third degree or a fourth degree or an nth degree equation. But here's one pleasant fact: the sum of the eigenvalues equals the sum down the diagonal. That's
called the trace, and I put that in the lecture content specifically. So this is a neat fact: the sum of the lambdas equals, shall I write it down, what I want to say in words is the sum down the diagonal of A, a11 + a22 + ... + ann. Add up the diagonal entries. In this example it's 0; in other words, once I found this eigenvalue of 1, I knew the other one had to be minus 1 in this 2 by 2 case, because in the 2 by 2 case, which is a good one to play with, the trace tells you right away what the other eigenvalue is. So if I tell you one eigenvalue, you can tell me the other one. We'll see that again.
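Here is a small numpy sketch checking both the permutation example and the trace fact (an illustration, not part of the board work):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])                      # the permutation matrix from the example

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                                   # 1 and -1
print(eigvecs)                                   # columns proportional to (1, 1) and (-1, 1)
print(np.isclose(eigvals.sum(), np.trace(A)))    # True: the eigenvalues add up to the trace, 0
```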
Okay, I could give more examples, but maybe it's time to face the equation Ax = lambda x and figure out how we're going to find x and lambda. So the question now is how to find eigenvalues and eigenvectors, how to solve Ax = lambda x when we've got two unknowns both in the equation. Okay, here's the trick, a simple idea: bring this onto the same side. Rewrite it as (A - lambda I) x = 0. Right, I have Ax minus lambda x, so I brought that over and I've got 0 left on the right-hand side. Okay, I don't know lambda and I don't know x, but I do know something here. What I know is, if I'm going to be able to solve this for some x that's not the zero vector (that's a useless eigenvector, it doesn't count), then this matrix must be singular. If there is an x, and I don't know what it is right now, I'm going to find lambda first actually, but if there is an x, it tells me that this matrix, this special combination, which is like the matrix A shifted by lambda I, has to be singular. Otherwise the only x would be the zero x, and that's of zero interest. Okay, so this is singular, and what do I now know about singular matrices? Their determinant is zero. So from the fact that this has to be singular, I know that the determinant of A - lambda I has to be zero. And now I've got x out of it: I've got an equation for lambda. That's the key equation; it's called the characteristic equation, or the eigenvalue equation.
In other words, I'm now in a position to find lambda first. So the idea will be to find lambda first, and actually I won't find one lambda, I'll find n lambdas, well, maybe not n different ones; a lambda could be repeated. A repeated lambda is the source of all trouble in 18.06, so let's hope for the moment that they're not repeated. They were different, right, one and minus one, for that permutation. Okay, and after I've found this lambda, can I just look ahead: how am I going to find x? After I have found this lambda, the lambda being one of the numbers that makes this matrix singular, then of course finding x is just by elimination, right? Now I've got a singular matrix and I'm looking for the null space, and we're experts at finding the null space: you do elimination, you identify the pivot columns and so on, you give values to the free variables, probably there'll only be one free variable, we'll give it the value one, and we find the
other variable. Okay, so finding x second will be a doable job; let's look at the first job of finding lambda. Okay, can I take another example and work that one out? Let me make it easy: threes on the diagonal and ones off it, so A = [3 1; 1 3]. I've made it easy: I made it 2 by 2, I made it symmetric, and I even made it constant down the diagonal. The more special properties I stick into the matrix, the more special the outcome I get for the eigenvalues. For example, for this symmetric matrix I know that it'll come out with real eigenvalues; the eigenvalues will turn out to be nice real numbers. And our previous example, while we're at it, was a symmetric matrix too; its eigenvalues were nice real numbers, 1 and minus 1. Do you notice anything about its eigenvectors, anything particular about those two vectors (1, 1) and (-1, 1)? They just happen to be, no, I can't say they just happened to be, because the whole point is that they had to be, what? What are they? They're perpendicular. If I see a vector (1, 1) and a (-1, 1), my mind immediately takes the dot product: it's 0, those vectors are perpendicular. That'll happen here too. Well, let's find the eigenvalues. Actually my example is too easy; let me tell you in advance what's going to happen, or rather, shall I do the determinant of A - lambda I and then point out at the end, will you remind me, after I've found
the eigenvalues, to say why they were easy from the example we did. Okay, let's do the job here: let's compute the determinant of A - lambda I. So that's a determinant, and what is this thing? It's the matrix A with lambda removed from the diagonal, so the diagonal is shifted, and then I'm taking the determinant. Notice I didn't take lambda away from all the entries; it's lambda I, so it's lambda along the diagonal. So I get (3 - lambda) squared and then minus 1, and I want that to be 0. Well, I'm going to simplify it, and what will I get? If I multiply this out, I get lambda squared minus 6 lambda plus, what, plus 8, and that I'm going to set to 0 and solve. It's a quadratic equation; I can use factorization or the quadratic formula, and I'll get two lambdas. Before I do it, tell me, what's that number six that's showing up in this equation? It's the trace: that number six is three plus three. And while we're at it, what's the number eight that's showing up in this equation? It's the determinant; our matrix has determinant eight. So in the 2 by 2 case it's really nice: it's lambda squared minus the trace times lambda (the trace is the linear coefficient) plus the determinant, the constant term. Okay, so can we find the roots? I guess the easy way is to factor that as something times something; if we couldn't factor it, then we'd have to use the old b squared minus 4ac formula. But I think we can factor it into lambda minus what times lambda minus what? Can you do that factorization? (lambda - 4) times (lambda - 2). So the eigenvalues are four and two: one eigenvalue, lambda 1 let's say, is four, and lambda 2, the other eigenvalue, is two. And then I can go for the eigenvectors. You see, I got the eigenvalues first, 4 and 2; now for the eigenvectors. So what are the eigenvectors? They're these guys in the null space
when I make the matrix singular by taking away 4I or 2I. So we've got to do those separately. Let me find the eigenvector for 4 first. I'll subtract 4, so A - 4I is, taking 4 away, we'll put minus ones on the diagonal: [-1 1; 1 -1]. And what's the point about that matrix? If 4 is an eigenvalue, then A - 4I had better be what kind of matrix? Singular. If that matrix isn't singular, the 4 wasn't correct. But we're okay: that matrix is singular. And what's the x now? The x is in the null space. So what's the x1 that goes with lambda 1? Now I'm doing Ax1 = lambda 1 x1, so I took A - lambda 1 I, that's this matrix, and now I'm looking for the x1 in its null space, and who is it? What's the vector x in the null space? Of course it's (1, 1). So that's the eigenvector that goes with that eigenvalue. Now how about the eigenvector that goes with the other eigenvalue? Can I do that by erasing? I take A - 2I, so now I take 2 away from the diagonal, and that leaves me with ones everywhere: [1 1; 1 1]. So A - 2I has again produced a singular matrix, as it had to, and I'm looking for the null space of that guy. What vector is in its null space? Well, of course, a whole line of vectors, so when I say the eigenvector I'm not speaking correctly; there's a whole line of eigenvectors, and I just want a basis, and for a line I just want one vector, but there's some freedom in choosing that one. Choose a reasonable one. What's a vector in the null space of that? Well, the natural vector to pick as the eigenvector with lambda 2 is (-1, 1): if I did elimination and set the free variable to be 1, I would get minus 1, and get that eigenvector. So do you see then that I've got eigenvalue, eigenvector; eigenvalue, eigenvector for this matrix?
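A short numpy sketch of that worked example, following the same two steps (find lambda from det(A - lambda I) = 0, then find x in the null space); this is just an illustrative check:

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0]])

# Step 1: characteristic polynomial  lambda^2 - (trace) lambda + (det) = lambda^2 - 6 lambda + 8
coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
print(np.roots(coeffs))                # 4 and 2

# Step 2: an eigenvector for lambda = 4 lives in the null space of A - 4I
M = A - 4.0 * np.eye(2)
print(M @ np.array([1.0, 1.0]))        # the zero vector, so (1, 1) is that eigenvector

# numpy finds both pairs at once; the columns are proportional to (1, 1) and (-1, 1)
print(np.linalg.eig(A))
```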
And now comes that thing that I wanted to be reminded of: what's the relation between this problem and, let me write just above what we found here, A = [0 1; 1 0]? That has eigenvalues 1 and minus 1 and eigenvectors (1, 1) and (-1, 1). And what do you notice: how is this matrix related to that matrix? How are those two matrices related? Well, one is just 3I more than the other one, right? I took this matrix and I added 3I. So my question is, what happened to the eigenvalues and what happened to the eigenvectors? That's like the question we keep asking now in this chapter: if I do something to the matrix, and I already know something about the matrix, what's the conclusion for its eigenvectors and eigenvalues? Because those eigenvalues and eigenvectors are going to tell us important information about the matrix. And here, what are we seeing? What's happening to these eigenvalues 1 and minus 1 when I add 3I? It just added 3 to the eigenvalues: I got 4 and 2, which are 3 more than 1 and minus 1. What happened to the eigenvectors? Nothing at all: (1, 1) and (-1, 1) are still the eigenvectors. In other words, a simple but useful observation: if I add 3I to a matrix, its eigenvectors don't change and its eigenvalues are 3 bigger. Let's just see why; let me keep all this on the same board. Suppose I have a matrix A and Ax = lambda x, and now I add 3I to that matrix. Do you see how the eigenvalues and eigenvectors are going to come out? If Ax = lambda x, then for this other new matrix, (A + 3I) x is Ax, which is lambda x, plus the 3x from the 3I, so it's just sitting there: (lambda + 3) x. So if A had eigenvalue lambda, A + 3I has eigenvalue lambda + 3, and the eigenvector is the same x for both matrices. Okay, so that's great. Of course it's special: we got the new matrix by adding 3I. Suppose I had added another matrix. Suppose I know the eigenvalues and eigenvectors of A.
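A tiny numpy sketch of that shift property (illustration only):

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [1.0, 0.0]])
B = A + 3.0 * np.eye(2)            # this is exactly the matrix [3 1; 1 3] from before

print(np.linalg.eigvals(A))        # 1 and -1
print(np.linalg.eigvals(B))        # 4 and 2: each eigenvalue shifted up by 3
# the eigenvectors (1, 1) and (-1, 1) are shared, since (A + 3I)x = Ax + 3x = (lambda + 3)x
```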
This little board here is going to be not so great. Suppose I have a matrix A and it has an eigenvector x with an eigenvalue lambda, and now I add on some other matrix. So what I'm asking is: if you know the eigenvalues of A and you know the eigenvalues of B, let me put an if here: if Ax = lambda x, fine, and B has eigenvalues, what should we call them, alpha 1, alpha 2, and so on; I'll use alpha for the eigenvalues of B for no good reason. Well, you see what I'm going to ask: how about A + B? Let me give you what you might think first. If Ax = lambda x and if B has an eigenvalue alpha, then am I allowed to say, and what's the matter with this argument, because what I'm going to write up is wrong: I'm going to say Bx = alpha x, add those up, and you get (A + B) x = (lambda + alpha) x. So you would think that if you knew the eigenvalues of A and you knew the eigenvalues of B, then by adding you would know the eigenvalues of A + B. But that's false. When B was 3I that worked great, but this is not so great. And what's the matter with that argument? We have no reason to believe that x is also an eigenvector of B. B has some eigenvalues, but normally it's got different eigenvectors; it's a different matrix, and I don't know anything special. If I don't know anything special, then as far as I know B has some different eigenvector y, and when I add I just get rubbish; I mean, I can add, but I don't learn anything. So not so great is A + B, or A times B. Normally the eigenvalues of A + B or A times B are not eigenvalues of A plus eigenvalues of B. Eigenvalues are not linear, and they don't multiply, because the eigenvectors are usually different, and there's no way to find out what A + B does to a vector.
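A small numpy counterexample; the matrices here are a hypothetical choice, picked only to illustrate the caution:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 2.0]])          # eigenvalues 1 and 2
B = np.array([[0.0, 1.0],
              [1.0, 0.0]])          # eigenvalues 1 and -1, but different eigenvectors

print(np.linalg.eigvals(A + B))     # about 2.62 and 0.38: none of 1+1, 1-1, 2+1, 2-1
```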
Okay, so that's like a caution: if B is a multiple of the identity, great, but if B is some general matrix, then for A + B you've got to solve the eigenvalue problem. Okay, now I want to do another example that brings out another point about eigenvalues. Let me make this example a rotation matrix. So here's another example; I'd better call it Q. I often use Q for rotations, because those are the very special, very important examples of orthogonal matrices. Let me make it a 90-degree rotation, so my matrix is going to be the one that rotates every vector by 90 degrees. Do you remember that matrix? It's the cosine of 90 degrees, which is zero, the sine of 90, which is one, minus the sine of 90, the cosine of 90: Q = [0 -1; 1 0]. That matrix deserves the letter Q; it's an orthogonal matrix, a very orthogonal matrix. Now I'm interested in its eigenvalues and eigenvectors. Two by two, it can't be that tough. Actually we know something already here: what's the sum of the two eigenvalues? Just tell me what I just said: zero, right, from that trace business. The sum of the eigenvalues is going to come out zero. And the product of the eigenvalues: did I tell you about the determinant being the product of the eigenvalues? No, but that's a good thing to know; we pointed out how that eight appeared in the quadratic equation. So let me just say this: the trace is zero plus zero, obviously, and that's the sum, lambda 1 plus lambda 2. Now the other neat fact
is that the determinant, which is 1 for that matrix, is lambda 1 times lambda 2. In our example, the one we worked out, the eigenvalues came out four and two; their product was eight, and it had to be eight, because we factored into (lambda - 4)(lambda - 2), which gave us the constant term eight, and that was the determinant. Okay, what I'm leading up to with this example is that something's going to go wrong. Something goes wrong for a rotation, because what vector can come out parallel to itself after a rotation? If this matrix rotates every vector by 90 degrees, what could be an eigenvector? You see, we're going to have trouble. Our picture of eigenvectors is of vectors coming out in the same direction that they went in, and there won't be any. And with the eigenvalues we're going to have trouble from these equations. Let's see, why am I expecting trouble? The first equation says that the eigenvalues add to 0, so there's a plus and a minus, but then the second equation says that the product is plus 1, and we're in trouble. But there's a way out. So let's do the usual stuff: look at the determinant of Q - lambda I. I'll just follow the rules: subtract lambda from the diagonal where I had zeros, the rest of Q is just copied, and compute that determinant. Okay, so what does that determinant equal? Lambda squared minus (-1), so plus 1. My equation for the eigenvalues is lambda squared + 1 = 0. What are the eigenvalues? Lambda 1 and lambda 2 are i, whatever that is, and minus i. Right, those are the right numbers: they add to 0 as the trace requires, and they multiply to 1 as the determinant requires. But they don't happen to be real numbers, even though the matrix was perfectly real. So this can happen: complex numbers are going to have to enter 18.06 at this moment. Boo, right. All right, if I just choose good matrices that have real eigenvalues, we can postpone that evil day, and I'll try to do that, but it's out there that a perfectly real matrix
could give a perfectly innocent-looking quadratic, but the roots of that quadratic can be complex numbers. And of course everybody knows something about complex numbers, so let's just spend one more minute on this bad possibility. We do know a little information about the two complex numbers: they're complex conjugates of each other. If lambda is an eigenvalue, then, you remember what complex conjugates are, you switch the sign of the imaginary part; well, this one was purely imaginary, it had no real part, so we just switched its sign. So the eigenvalues come in pairs like that, a complex conjugate pair, and that can happen with a perfectly real matrix. And as a matter of fact, that was my point earlier: if the matrix is symmetric, it won't happen. So if we stick to matrices that are symmetric, or close to symmetric, then the eigenvalues will stay real. But here we moved far away from symmetric, and that's as far as you can move, because for that matrix, how is Q transpose related to Q? That matrix is antisymmetric: Q transpose is minus Q. That's the very opposite of symmetry; when I flip across the diagonal, I reverse all the signs. Those are the guys that have pure imaginary eigenvalues, so they're the extreme case, and in between are matrices that are not symmetric or antisymmetric but have partly a symmetric part and an antisymmetric part. Okay, so I'm doing a bunch of examples here to show the possibilities, the good possibilities being perpendicular eigenvectors and real eigenvalues.
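A quick numpy sketch of the rotation example (illustration only):

```python
import numpy as np

Q = np.array([[0.0, -1.0],
              [1.0,  0.0]])         # rotation by 90 degrees; note Q.T equals -Q (antisymmetric)

eigvals = np.linalg.eigvals(Q)
print(eigvals)                      # [0.+1.j, 0.-1.j]: the complex conjugate pair i and -i
print(eigvals.sum())                # 0j: the sum equals the trace, 0
print(np.linalg.det(Q))             # 1: the determinant equals the product of the eigenvalues
```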
The bad possibility is complex eigenvalues; we could say that's bad. There's another thing, even worse. I'm getting through the bad things here today, and then the next lecture can be pure happiness. Okay, here's one more bad thing that could happen. Again I'll do it with an example. Suppose I take that matrix with threes on the diagonal, and I change the lower one to zero, so A = [3 1; 0 3]. What are the eigenvalues of that matrix? What are the eigenvectors? This is always our question; of course the next sections are going to show why we care, but for the moment this lecture is introducing them, so let's just find them. Okay, what are the eigenvalues of that matrix? Let me tell you, at a glance we could answer that question, because the matrix is triangular. It's really useful to know: if you've got a triangular matrix, you can read the eigenvalues off right on the diagonal. So the eigenvalue is 3, and also 3; 3 is a repeated eigenvalue. But let's see that happen; let me do it right. The determinant of A - lambda I, what I always have to do, is this determinant: I take away lambda from the diagonal, I leave the rest, and I compute the determinant. So I get (3 - lambda) times (3 - lambda) and nothing else; that's where the triangular part came in. The one thing we know about triangular matrices is that the determinant is just the product down the diagonal, and in this case it's this factor repeated. So lambda 1 is 3 and lambda 2 is 3. That was easy. I mean, why should I be pessimistic about
a matrix whose eigenvalues can be read off right away? The problem with this matrix is in the eigenvectors. So let's go to the eigenvectors. How do I find the eigenvectors? I'm looking for a couple of eigenvectors, so I take the eigenvalue, and what do I do now? Remember, I solve (A - lambda I) x = 0. And what is A - lambda I? Take three away and I get this matrix, [0 1; 0 0], which times x is supposed to give me zero. That's my big equation for x; now I'm looking for x, the eigenvector. So I took A - lambda I, and what kind of matrix am I supposed to have here? Singular, right, it's supposed to be singular, which it is, so it's got some vector x in the null space. What's going to give me a basis for the null space of that guy? Tell me a vector x in the null space; that'll be the eigenvector that goes with lambda 1 = 3. So what's in the null space? (1, 0), is it? Great. Now what's the other eigenvector, the eigenvector that goes with lambda 2? Well, lambda 2 is 3 again, so I've got the same thing again. Give me another vector. I want it to be independent: if I'm going to write down an x2, I'm never going to let it be dependent on x1; I'm looking for independent eigenvectors. And what's the conclusion? There isn't one. This is a degenerate matrix: it's only got one line of eigenvectors instead of two. This possibility of a repeated eigenvalue opens the further possibility of a shortage of eigenvectors, so there's no second independent eigenvector x2.
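One last numpy sketch, for this degenerate example (illustrative; numpy still returns two eigenvector columns, but they point along the same line):

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [0.0, 3.0]])          # triangular: the eigenvalues sit on the diagonal

eigvals, eigvecs = np.linalg.eig(A)
print(eigvals)                      # [3., 3.]: a repeated eigenvalue
print(eigvecs)                      # both columns are (numerically) multiples of (1, 0)
```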
So it's a two-by-two matrix, but with only one independent eigenvector. Those are the matrices where the eigenvectors don't give the complete story. Okay, my lecture on Monday will give the complete story for all the other matrices. Thanks, have a good weekend, a real New England weekend.