From ad499755b99b79f51b665112cb0d96952f42ebaa Mon Sep 17 00:00:00 2001
From: nameEqualsJared <32991996+nameEqualsJared@users.noreply.github.com>
Date: Fri, 22 Mar 2019 19:01:01 -0400
Subject: [PATCH] (possibly?) fixed a fact about eigenvectors

Was this meant to say what I've proposed? Because as stated, I am not sure
the proposition is correct. As a counterexample, let A = [0 1; 1 0] and
consider the column vectors a = [1; 1] and 2a = [2; 2]. For the eigenvalue
y = 1, both a and 2a are eigenvectors (because A*v = 1*v for both), correct?
(The vector b = [-1; 1] is also an eigenvector of A, but for the eigenvalue
y = -1.) So I think one eigenvalue may have many associated eigenvectors,
but one eigenvector (I believe) always has a unique eigenvalue.
---
 eigenvaluesAndEigenvectors/definition.tex | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/eigenvaluesAndEigenvectors/definition.tex b/eigenvaluesAndEigenvectors/definition.tex
index 6644548..1277684 100644
--- a/eigenvaluesAndEigenvectors/definition.tex
+++ b/eigenvaluesAndEigenvectors/definition.tex
@@ -21,7 +21,7 @@
 Before proceeding with examples, we note that
 
-\begin{proposition} If $\bf v$ is an eigenvalue of a matrix $A$, the eigenvector associated with it is unique.
+\begin{proposition} If $\bf v$ is an eigenvector of a matrix $A$, the eigenvalue associated with it is unique.
 \end{proposition}
 
 \begin{proof}
 Suppose $\lambda_1{\bf v} = A*{\bf v} = \lambda_2{\bf v}$. Then $\lambda_1{\bf v} - \lambda_2{\bf v} = (\lambda_1 - \lambda_2){\bf v} = {\bf 0}$. But since ${\bf v}\ne {\bf 0}$, the only way this could happen is if the coefficient $(\lambda_1 - \lambda_2)$ is equal to zero, or equivalently, if $\lambda_1 = \lambda_2$.
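As a quick numerical sanity check of the counterexample (a sketch, not part of the patch; assumes NumPy is available):

```python
# Verify: one eigenvalue can have many eigenvectors, while each
# eigenvector determines a unique eigenvalue.
import numpy as np

A = np.array([[0, 1],
              [1, 0]])

a = np.array([1, 1])    # eigenvector for eigenvalue 1
a2 = np.array([2, 2])   # another eigenvector for the same eigenvalue 1
b = np.array([-1, 1])   # eigenvector for eigenvalue -1 (not 1)

assert np.array_equal(A @ a, 1 * a)     # A*a  = 1*a
assert np.array_equal(A @ a2, 1 * a2)   # A*2a = 1*(2a): eigenvalue 1 is shared
assert np.array_equal(A @ b, -1 * b)    # A*b  = -1*b: b's unique eigenvalue is -1
```

This confirms the direction of the proposed fix: the eigenvalue 1 has (at least) two distinct associated eigenvectors, so uniqueness only holds in the eigenvector-to-eigenvalue direction.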