An interactive, step-by-step guide with live visualizations, a built-in calculator, and a self-check quiz.
Based on MIT 18.06 Linear Algebra.

A matrix A acts on vectors — it multiplies them and usually changes their direction. But certain special vectors come out pointing in the same (or opposite) direction as they went in. These are eigenvectors.
A non-zero vector that does not change direction when multiplied by A. It may be stretched, shrunk, or flipped — but it stays on the same line.
The scalar multiplier. It tells you how much the eigenvector is stretched (λ > 1), shrunk (0 < λ < 1), flipped (λ < 0), or zeroed out (λ = 0).
If λ = 0, then A·x = 0. This means x is in the null space of A, and A is a singular matrix (no inverse).
Start from A·x = λ·x and rearrange: A·x − λ·x = 0, which factors as (A − λI)·x = 0.
For a non-zero x to exist, the matrix (A − λI) must be singular. A matrix is singular when its determinant is zero:
(A − λᵢI)·x = 0. Use elimination to find the null space. The result is the eigenvector(s).
λ² − (trace)·λ + (det) = 0. Use this to skip the determinant expansion!
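The shortcut is easy to turn into code. Here is a minimal sketch (the function name `eig2x2` is just for illustration) that solves λ² − (trace)·λ + (det) = 0 with the quadratic formula:

```python
import math

def eig2x2(a, b, c, d):
    """Eigenvalues of [[a, b], [c, d]] via the trace/determinant shortcut:
    lambda^2 - (trace)*lambda + det = 0, solved by the quadratic formula."""
    tr = a + d           # trace = a + d
    det = a * d - b * c  # determinant = ad - bc
    disc = tr * tr - 4 * det
    if disc < 0:
        raise ValueError("complex eigenvalues: discriminant < 0")
    root = math.sqrt(disc)
    return (tr + root) / 2, (tr - root) / 2

print(eig2x2(3, 1, 1, 3))  # -> (4.0, 2.0)
```

Try it on the symmetric example below: trace 6 and determinant 8 give λ = 4 and λ = 2.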
Matrix: A = [[3,1],[1,3]] (Symmetric matrix)
trace = 6, det = 8 → λ² − 6λ + 8 = 0 → (λ − 4)(λ − 2) = 0
For λ₁ = 4: (A − 4I) = [[−1, 1], [1, −1]] is singular; elimination gives −x + y = 0, so its null space is spanned by x₁ = [1, 1]ᵀ.
For λ₂ = 2: (A − 2I) = [[1, 1], [1, 1]] is singular; elimination gives x + y = 0, so its null space is spanned by x₂ = [−1, 1]ᵀ.
| Eigenvalue | Eigenvector | Meaning |
|---|---|---|
| λ₁ = 4 | x₁ = [1, 1]ᵀ | Stretched by factor 4 |
| λ₂ = 2 | x₂ = [−1, 1]ᵀ | Stretched by factor 2 |
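A quick NumPy check of the table above (a sketch, assuming NumPy is available — `eigh` is NumPy's routine for symmetric matrices and returns eigenvalues in ascending order):

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 3.0]])  # the symmetric example above
vals, vecs = np.linalg.eigh(A)          # eigh: for symmetric matrices
print(vals)  # ascending: [2. 4.]
# A bonus guarantee for symmetric matrices: the eigenvectors are orthogonal.
print(vecs[:, 0] @ vecs[:, 1])  # essentially 0
```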
Matrix: A = [[0,1],[1,0]] (Swaps the two components of any vector)
trace = 0, det = −1 → λ² − 1 = 0 → λ = ±1
| Eigenvalue | Eigenvector | Intuition |
|---|---|---|
| λ₁ = 1 | x₁ = [1, 1]ᵀ | Swapping two equal entries → same vector |
| λ₂ = −1 | x₂ = [−1, 1]ᵀ | Swapping opposite entries → flips sign |
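You can see the swap intuition directly by multiplying each eigenvector through A (a minimal NumPy sketch):

```python
import numpy as np

A = np.array([[0.0, 1.0], [1.0, 0.0]])  # swaps the two components
x1 = np.array([1.0, 1.0])
x2 = np.array([-1.0, 1.0])
print(A @ x1)  # [1. 1.]  -> unchanged, so lambda = 1
print(A @ x2)  # [ 1. -1.] -> sign flipped, so lambda = -1
```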
Projection Matrix P onto a plane (3D) — no computation needed, just geometry!
Case 1 — Vector in the plane:
Projection of a vector already in the plane = itself. Unchanged.
Case 2 — Vector ⊥ to the plane:
Projection of a perpendicular vector = zero. Collapses to origin.
| Eigenvalue | Which eigenvectors? |
|---|---|
| λ = 1 | All vectors lying inside the projection plane |
| λ = 0 | All vectors perpendicular to the plane (null space of P) |
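The geometry can be confirmed numerically. Below is a sketch that builds a projection matrix onto a plane from a normal vector n (the formula P = I − nnᵀ/(nᵀn) projects onto the plane n·x = 0; the particular n is an arbitrary choice for illustration):

```python
import numpy as np

n = np.array([1.0, 2.0, 2.0])             # plane normal (any nonzero vector works)
P = np.eye(3) - np.outer(n, n) / (n @ n)  # projection onto the plane n . x = 0
vals = np.sort(np.linalg.eigvalsh(P))     # P is symmetric, so eigvalsh applies
print(vals)  # [0. 1. 1.] -> one perpendicular direction, a whole plane of lambda = 1
```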
Rotation Matrix Q — rotates every vector by 90°. No real vector stays on the same line after 90° rotation!
trace = 0, det = +1 → λ² + 1 = 0 → λ = ±i. The eigenvalues are complex!
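NumPy confirms the purely imaginary pair (a sketch using the standard 90° rotation matrix):

```python
import numpy as np

Q = np.array([[0.0, -1.0], [1.0, 0.0]])  # rotation by 90 degrees
vals = np.linalg.eigvals(Q)
print(vals)  # the complex pair +i and -i, in some order
```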
| Matrix Type | Eigenvalue Type |
|---|---|
| Symmetric (Aᵀ = A) | Always real |
| Anti-symmetric (Aᵀ = −A) | Pure imaginary |
| Rotation matrix | Complex (a ± bi) |
Upper Triangular Matrix — a case with a repeated eigenvalue.
Key fact: For triangular matrices, eigenvalues = diagonal entries!
Now find the eigenvectors for λ = 3: compute the null space of A − 3I. With a repeated diagonal entry, that null space can turn out to be only one-dimensional — a single line of eigenvectors for a double eigenvalue.
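The guide's triangular matrix isn't reproduced in this text, so the sketch below assumes the classic 18.06 example [[3, 1], [0, 3]] — a triangular matrix with 3 repeated on the diagonal:

```python
import numpy as np

# Assumed example: the classic degenerate triangular matrix [[3, 1], [0, 3]].
A = np.array([[3.0, 1.0], [0.0, 3.0]])
vals, vecs = np.linalg.eig(A)
print(vals)  # [3. 3.] -> the repeated diagonal entry, as the key fact predicts
# (A - 3I) = [[0, 1], [0, 0]] has a one-dimensional null space:
# only multiples of [1, 0] are eigenvectors -- a genuine shortage.
print((A - 3 * np.eye(2)) @ np.array([1.0, 0.0]))  # [0. 0.]
```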
The sum of the diagonal entries of A always equals the sum of all eigenvalues.
Similarly, the determinant of A equals the product of all eigenvalues. This means if any eigenvalue is 0, the determinant is 0 → A is singular.
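Both identities are easy to verify on the symmetric example from earlier (a minimal NumPy sketch):

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 3.0]])  # eigenvalues 2 and 4
vals = np.linalg.eigvals(A)
print(np.trace(A), vals.sum())        # trace = sum of eigenvalues (both 6)
print(np.linalg.det(A), vals.prod())  # det = product of eigenvalues (both 8)
```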
If you shift A to A + cI, the eigenvectors stay exactly the same — only the eigenvalues shift by c, since (A + cI)·x = λ·x + c·x = (λ + c)·x.
This only works when B = c·I. In general, A and B have different eigenvectors, so you cannot simply add their eigenvalues. You must solve the eigenvalue problem from scratch for A+B.
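Here is a sketch contrasting the two situations: shifting by cI moves every eigenvalue by c, while adding a general B gives eigenvalues that naive addition does not predict (the matrices are arbitrary illustrations):

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 3.0]])  # eigenvalues 2 and 4

# Shifting by 5I shifts every eigenvalue by 5:
print(np.sort(np.linalg.eigvals(A + 5 * np.eye(2)).real))  # [7. 9.]

# But for a general B, eig(A + B) is NOT eig(A) + eig(B):
B = np.array([[1.0, 0.0], [0.0, -1.0]])  # eigenvalues -1 and 1
naive = np.sort(np.linalg.eigvals(A).real) + np.sort(np.linalg.eigvals(B).real)
actual = np.sort(np.linalg.eigvals(A + B).real)
print(naive)   # naive sums: 1 and 5
print(actual)  # true eigenvalues: 3 - sqrt(2) and 3 + sqrt(2), found by re-solving
```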
| Property | Formula / Fact |
|---|---|
| Eigenvalue equation | A·x = λ·x |
| Characteristic equation | det(A − λI) = 0 |
| 2×2 shortcut | λ² − (trace)λ + (det) = 0 |
| Triangular matrix | Eigenvalues = diagonal entries |
| Symmetric matrix | Real λ, perpendicular eigenvectors |
| Singular matrix | λ = 0 is always an eigenvalue |
| Count | n×n matrix → exactly n eigenvalues |
| Identity shift | (A + cI) has same eigenvectors, λ shifts by c |
Rotation and anti-symmetric matrices can have complex eigenvalues even though the matrix entries are real. Always check if the discriminant (trace² − 4·det) is negative.
A repeated eigenvalue doesn't automatically mean a shortage of eigenvectors — but it can. Triangular matrices with repeated diagonal entries are the classic example.
Eigenvalues don't add or multiply across matrix sums/products (unless one matrix is a multiple of I). Always re-solve for compound matrices.
Any scalar multiple of an eigenvector is also an eigenvector. When we "find the eigenvector," we're really finding a basis for the eigenspace (a whole line of vectors).
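A one-line check of the eigenspace idea (a sketch reusing the symmetric example): every scalar multiple of an eigenvector satisfies the same eigenvalue equation.

```python
import numpy as np

A = np.array([[3.0, 1.0], [1.0, 3.0]])
x = np.array([1.0, 1.0])  # eigenvector with lambda = 4
for c in (2.0, -7.0, 0.5):
    assert np.allclose(A @ (c * x), 4 * (c * x))  # any multiple of x still works
print("every multiple of x is an eigenvector for lambda = 4")
```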