Let $A$ be an $n \times n$ matrix. Show that if $A^2 = 0$, then $I - A$ is non-singular and $(I-A)^{-1} = I+A$.
The second part is easy for me, but how can I show that if $A^2 = 0$, then $I - A$ is non-singular? I found on Wolfram Alpha that "a matrix is singular iff its determinant is 0", but how can I relate this to the given $A^2 = 0$? Or is there an easier way? Thank you in advance for your help.
$\endgroup$ 15 Answers
$\begingroup$$(I-A)(I+A) = I^2 - A^2 = I$. So $\det(I-A)$ cannot be zero since $\det(I-A)\det(I+A) = \det I = 1$.
In fact the first line shows $I + A = (I - A)^{-1}$.
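This identity is easy to sanity-check numerically; here is a minimal sketch using NumPy, with one concrete choice of nilpotent $A$ (the matrix itself is just an illustration):

```python
import numpy as np

# One concrete matrix with A^2 = 0: strictly upper-triangular 2x2
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
I = np.eye(2)

assert np.allclose(A @ A, 0)                     # A^2 = 0
assert np.allclose((I - A) @ (I + A), I)         # (I - A)(I + A) = I
assert np.allclose(np.linalg.inv(I - A), I + A)  # hence (I - A)^{-1} = I + A
assert not np.isclose(np.linalg.det(I - A), 0)   # in particular det(I - A) != 0
```

The last assertion is exactly the determinant argument above: since $\det(I-A)\det(I+A) = 1$, the determinant of $I-A$ cannot vanish.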
$\endgroup$ $\begingroup$Are you familiar with eigenvalues? In particular $I - A$ is non-singular iff $1$ is not an eigenvalue of $A$.
So, for contradiction, assume that $1$ is an eigenvalue, i.e. there is an $x\neq 0$ so that $Ax = 1x = x$.
But then, on one hand $$ A(Ax) = A(1x) = Ax = x \neq 0 $$
but on the other hand
$$ A(Ax) = AAx = A^2 x = 0x = 0 $$
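The same fact can be observed numerically: the eigenvalues of a nilpotent matrix are all $0$, so $1$ is never an eigenvalue and $I - A$ is invertible. A sketch with NumPy (the $3\times 3$ matrix is just one example with $A^2 = 0$):

```python
import numpy as np

A = np.array([[0.0, 0.0, 1.0],
              [0.0, 0.0, 0.0],
              [0.0, 0.0, 0.0]])
assert np.allclose(A @ A, 0)    # A is nilpotent: A^2 = 0

eigvals = np.linalg.eigvals(A)
assert np.allclose(eigvals, 0)  # every eigenvalue of A is 0, so 1 is not one
assert not np.isclose(np.linalg.det(np.eye(3) - A), 0)  # I - A is non-singular
```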
$\endgroup$ $\begingroup$If you can prove the second part, that is a proof of the first part. A matrix is defined to be singular if it has no inverse. You've proven $I-A$ has an inverse, namely $I+A$. Therefore, $I-A$ is not singular.
$\endgroup$ $\begingroup$Comparing $|I-A|$ to the characteristic polynomial of $A$:
$|I-A| = 0 \implies \lambda=1$ is an eigenvalue of $A$. But since $A$ is nilpotent ($A^2 = 0$), its only eigenvalue is $0$, a contradiction.
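In NumPy terms this reads as follows (a sketch; `np.poly` computes the coefficients of the characteristic polynomial from the eigenvalues): for a nilpotent $A$ the characteristic polynomial is $\lambda^n$, and $|I - A|$ is its value at $\lambda = 1$, namely $1 \neq 0$.

```python
import numpy as np

A = np.array([[0.0, 1.0],
              [0.0, 0.0]])           # A^2 = 0, so A is nilpotent
p = np.poly(A)                       # characteristic polynomial, highest degree first
assert np.allclose(p, [1.0, 0.0, 0.0])  # p(lambda) = lambda^2

# det(I - A) = p(1) = 1, so lambda = 1 is not an eigenvalue of A
assert np.isclose(np.polyval(p, 1.0), np.linalg.det(np.eye(2) - A))
assert np.isclose(np.linalg.det(np.eye(2) - A), 1.0)
```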
$\endgroup$ $\begingroup$You have a matrix $A$ such that $A^2=0$.
So $(I-A)(I+A)=I^2+IA-AI-A^2=I+A-A-0=I$. This means that the matrix $I-A$ is invertible (non-singular) and its inverse is $I+A$.
Take an $n\times n$ matrix $S$. If there is another matrix $T$ such that $ST=I$ (where $I$ is the identity matrix), then $S$ is invertible and $T$ is its inverse.

In fact, let $L_{S}$ and $L_{T}$ be the linear maps associated with the matrices $S$ and $T$ respectively (for example, $L_{S}:\Bbb F^n\to \Bbb F^n$ is given by $L_S(x)=Sx$ for all $x\in\Bbb F^n$). A map has a right inverse if and only if it is surjective: since $L_S\circ L_T=\mathrm{id}$, it follows that $L_S$ is surjective. But a surjective linear map between two spaces of the same dimension is necessarily injective. Therefore $L_S$ has a left inverse (a map is injective if and only if it has a left inverse), say $G$. Because composition of maps is associative, we then have $G=L_T$: indeed $$G = G\circ (L_S\circ L_T)=(G\circ L_S)\circ L_T=L_T.$$ So, by definition of inverse, $L_S$ is invertible and its inverse is $L_T$ ($=G$).

Finally, the isomorphism between matrices and linear maps (which associates to every matrix $M$ the map $x\mapsto Mx$) gives that the matrix $S$ is also invertible, and its inverse is $T$.
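Applied to this question with $S = I - A$ and $T = I + A$: checking $ST = I$ already suffices, and by the argument above $TS = I$ holds automatically. A small NumPy check (the matrix is just an illustration):

```python
import numpy as np

A = np.array([[0.0, 2.0],
              [0.0, 0.0]])        # A^2 = 0
S = np.eye(2) - A                 # S = I - A
T = np.eye(2) + A                 # T = I + A

assert np.allclose(S @ T, np.eye(2))  # T is a right inverse of S ...
assert np.allclose(T @ S, np.eye(2))  # ... and automatically a left inverse too
```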
$\endgroup$