I'm having some trouble understanding the process of actually finding what $[T]_\beta ^\gamma$ is, given two bases $\beta$ and $\gamma$. Here's an example:
Let $T: P_3(\mathbb{R}) \rightarrow P_2(\mathbb{R})$ be the linear transformation defined by $T(f(x)) = f'(x)$. Let $\beta$ and $\gamma$ be the standard ordered bases for $P_3$ and $P_2$ (over $\mathbb{R}$), respectively. Find $[T]_\beta ^\gamma$.
What exactly is the step-by-step process of finding $[T]_\beta ^\gamma$? I know it's a very simple concept, but it's just not clicking with me.
3 Answers
Let $\beta=(\beta_1,\beta_2,\beta_3)$ and $\gamma=(\gamma_1,\gamma_2)$, where $\beta_1,\beta_2,\beta_3\in P_3(\Bbb{R})$ and $\gamma_1,\gamma_2\in P_2(\Bbb{R})$. The columns of the matrix $[T]_{\beta}^{\gamma}$ are the images of the basis vectors of $\beta$ under $T$, expressed in the basis $\gamma$. These are $$T(\beta_1)=\beta_1',\qquad T(\beta_2)=\beta_2',\qquad T(\beta_3)=\beta_3',$$ but to express them in terms of $\gamma_1$ and $\gamma_2$ you need to know (more about) what the bases $\beta$ and $\gamma$ are exactly. For example, with the standard bases $E_3=(1,x,x^2)$ and $E_2=(1,x)$ for $P_3(\Bbb{R})$ and $P_2(\Bbb{R})$ respectively, you have \begin{align*} T(1)=&\ 0&=0\cdot1+0\cdot x\\ T(x)=&\ 1&=1\cdot1+0\cdot x\\ T(x^2)=&\ 2x&=0\cdot1+2\cdot x \end{align*} so the matrix $[T]_{E_3}^{E_2}$ is given by $$[T]_{E_3}^{E_2}=\begin{pmatrix} 0&1&0\\0&0&2\end{pmatrix}$$
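As a quick sanity check, the computation above can be reproduced in a few lines of plain Python, representing each polynomial by its coefficient vector in the standard monomial basis (a sketch, using the same dimension conventions as above, i.e. $E_3=(1,x,x^2)$ and $E_2=(1,x)$; the function names are my own):

```python
# Represent a polynomial by its coordinate vector in the standard
# basis (1, x, ..., x^(n-1)), i.e. a list of coefficients.

def derivative(coeffs):
    """Coefficients of p'(x) given the coefficients of p(x)."""
    return [i * c for i, c in enumerate(coeffs)][1:]

def matrix_of_T(n_in, n_out):
    """Matrix of T(f) = f' w.r.t. the standard bases, as a list of rows.

    Column j is the coefficient vector of T(x^j) = (x^j)' = j*x^(j-1),
    exactly the recipe "columns = images of basis vectors".
    """
    cols = []
    for j in range(n_in):
        e_j = [0] * n_in
        e_j[j] = 1                       # the basis vector x^j
        img = derivative(e_j)            # T(x^j)
        img += [0] * (n_out - len(img))  # pad to the codomain dimension
        cols.append(img)
    # assemble the matrix row by row from the list of columns
    return [list(row) for row in zip(*cols)]

# The example above: bases E_3 = (1, x, x^2) and E_2 = (1, x)
print(matrix_of_T(3, 2))   # [[0, 1, 0], [0, 0, 2]]
```

With the other common textbook convention, where $P_3$ means degree at most $3$ (dimension $4$), the same function gives the $3\times4$ matrix `matrix_of_T(4, 3)`.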
Very simple: the column vectors of $[T]_\beta^\gamma$ are the coordinates (in the basis $\gamma$) of the derivatives of the vectors in the basis $\beta$.
Here is the more general setting.
Let $V$ (resp. $W$) be an $n$ (resp. $m$) dimensional vector space over $\mathbb{R}$. Let $\alpha=(v_1,\cdots,v_n)$ be an ordered basis in $V$ and $\beta=(w_1,\cdots,w_m)$ an ordered basis in $W$.
For any vector $x\in V$, denote its coordinate vector w.r.t. the basis $\alpha$ by $ [x]_\alpha=(x_1,\cdots,x_n)^T, $ and for any vector $y\in W$, denote its coordinate vector w.r.t. the basis $\beta$ by $ [y]_\beta=(y_1,\cdots,y_m)^T. $
Let $T:V\to W$ be a linear transformation, and let $[T]_\alpha^\beta$ denote the matrix for $T$ w.r.t. the bases $\alpha$ and $\beta$, i.e., $$ [T]_\alpha^\beta=\bigl[[Tv_1]_\beta,\cdots,[Tv_n]_\beta\bigr]. $$ Note in particular that $[T]_\alpha^\beta$ is an $m\times n$ matrix.
So what are the steps to find $[T]_\alpha^\beta$?
- Note that you are given $\alpha$, $\beta$, and $T$. Identify what $m$ and $n$ are for your problem;
- Find $Tv_j$ for each $j$;
- and then find $[Tv_j]_\beta$.
For the very last step, you need to know how to find $[z]_{\beta}$ given $z\in W$. Suppose $[z]_\beta=(z_1,\cdots,z_m)^T$. Then, by the definition of coordinates, $$ z=\sum_{i=1}^m z_iw_i, $$ which gives you a system of linear equations for the $z_i$'s.
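To make that last step concrete with a small hypothetical example: take $W=\mathbb{R}^2$ with the (non-standard) basis $w_1=(1,1)$, $w_2=(1,-1)$, and find $[z]_\beta$ for $z=(3,1)$ by solving $z_1w_1+z_2w_2=z$. The columns of the coefficient matrix of that linear system are exactly $w_1$ and $w_2$; for a $2\times2$ system it can be solved directly by Cramer's rule (a sketch; the basis and the function name are my own choices):

```python
def solve_2x2(w1, w2, z):
    """Solve z1*w1 + z2*w2 = z for (z1, z2) via Cramer's rule.

    w1, w2 are the basis vectors of W (as pairs); they form the
    columns of the coefficient matrix of the linear system.
    """
    det = w1[0] * w2[1] - w2[0] * w1[1]
    z1 = (z[0] * w2[1] - w2[0] * z[1]) / det
    z2 = (w1[0] * z[1] - z[0] * w1[1]) / det
    return (z1, z2)

# Hypothetical basis w1 = (1, 1), w2 = (1, -1) and target z = (3, 1):
# z = 2*w1 + 1*w2, so [z]_beta = (2, 1).
print(solve_2x2((1, 1), (1, -1), (3, 1)))   # (2.0, 1.0)
```

For larger $m$, the same idea is a general $m\times m$ linear solve (e.g. Gaussian elimination) with the $w_i$'s as columns.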