MATH-5112/6012-001
Course: Applied Linear Algebra
MWF 2:30-3:25, Room 119 WCharles
Assignment 6 is due Monday, October 28. Turn in solutions to one exercise on page 8 and one exercise on page 11 of the Notes. Here is a somewhat better rewrite of these questions:
- Choose one of the following (compare page 8 of the notes):
- Consider the vector space $V=\mathbb{R}^4$ with the standard dot product. Find a formula for the orthogonal projection $P_W(\vec v)$, where $\vec v=\begin{bmatrix}x \\y\\z\\w\end{bmatrix}\in\mathbb{R}^4$ and $W\subset \mathbb{R}^4$ is the plane given by the equations below (a numerical check is sketched after this list):
$$\begin{aligned}
x+y+z+w &= 0 \\
x-y+z-w &= 0
\end{aligned}
$$
- Consider the vector space $V=C[-1,1]$ of continuous functions on the interval $[-1,1]$ with the inner product $\langle f,g\rangle=\int_{-1}^1 f(x)g(x)\,dx$. Find the orthogonal projection (best $L_2[-1,1]$ approximation) of the function $f(x)=x$ onto the two-dimensional subspace $W=\operatorname{span}\{\sin(\pi x), \cos(\pi x)\}$.
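For the first of these two options, here is a minimal numerical sanity check, assuming NumPy is available (the names `A`, `Q`, `P`, and `v` are illustrative, not from the notes). It builds the orthogonal projector onto $W$ from an orthonormal basis of the null space of the coefficient matrix, so you can compare its output with your hand-derived formula for $P_W(\vec v)$.

```python
import numpy as np

# W is the null space of the coefficient matrix of the two defining equations.
A = np.array([[1.0,  1.0, 1.0,  1.0],
              [1.0, -1.0, 1.0, -1.0]])

# The rows of Vt beyond rank(A) = 2 form an orthonormal basis of the null space W.
_, _, Vt = np.linalg.svd(A)
Q = Vt[2:].T                         # 4x2 matrix whose orthonormal columns span W

P = Q @ Q.T                          # orthogonal projector onto W

v = np.array([1.0, 2.0, 3.0, 4.0])   # any test vector (x, y, z, w)
print(P @ v)                         # compare with your closed-form P_W(v)
print(A @ (P @ v))                   # P @ v must satisfy both defining equations
```

The last printed vector should be numerically zero, since $P\vec v$ has to lie in $W$.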
- Choose one of the following (compare page 11 of the notes):
- Consider the subspace $W=\{ \begin{bmatrix}x \\y\\z\\w\end{bmatrix}\in\mathbb{R}^4: x+y+z+w=0\}$ from page 9 of the notes, with
basis
$$\mathcal{B}=\left\{
\begin{bmatrix}1\\-1\\0\\0\end{bmatrix}, \begin{bmatrix}1\\0\\-1\\0 \end{bmatrix}, \begin{bmatrix}1\\0\\0\\-1 \end{bmatrix}
\right\}
$$
What does one get from Gram-Schmidt orthogonalization of this basis? (Apply it in the order in which the vectors are written!) A short numerical sketch appears after this list.
- Find an orthogonal basis for the vector space $V=\mathcal{P}_2$ of quadratic polynomials with the inner product $\langle p,q\rangle=\int_0^1 p(x)q(x)\,dx$.
- Find an orthogonal basis for the vector space $V=\mathcal{P}_2$ of quadratic polynomials with the inner product $\langle p,q\rangle=p(0)q(0)+p(1)q(1)+p(-1)q(-1)$.
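For the Gram-Schmidt option above, here is a short sketch of the procedure in NumPy, offered only as a way to check the hand computation the problem asks for (the names `vecs` and `ortho` are illustrative). It processes the vectors of $\mathcal{B}$ in exactly the order they are written and leaves the result unnormalized; normalize at the end if you want an orthonormal basis.

```python
import numpy as np

# The basis B, in the order written in the problem.
vecs = [np.array([1.0, -1.0, 0.0, 0.0]),
        np.array([1.0, 0.0, -1.0, 0.0]),
        np.array([1.0, 0.0, 0.0, -1.0])]

ortho = []                            # Gram-Schmidt output (unnormalized)
for v in vecs:
    u = v.copy()
    for w in ortho:
        u -= (v @ w) / (w @ w) * w    # subtract the projection of v onto each earlier vector
    ortho.append(u)

for u in ortho:
    print(u)                          # the orthogonal basis produced by Gram-Schmidt
```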
If you happen to be interested in fitting quadratic polynomials to data, you may consider the more general situation of $V=\mathcal{P}_2$ with the inner product $\langle p,q\rangle=\sum_{k=1}^n p(x_k)q(x_k)w(x_k)$, where $x_1,\dots,x_n$ are the given data points, $n$ is the sample size, and $w(x)>0$ is a "weight function" that de-emphasizes some of the data. In the problem above, the data points are $x_1=0$, $x_2=1$, $x_3=-1$, and $w(x)\equiv 1$.
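As an optional illustration of this weighted setup, the following sketch runs Gram-Schmidt on the monomials $1, x, x^2$ with respect to the discrete inner product above, assuming NumPy (the names `xs`, `ws`, `inner`, and `Poly` are introduced here just for the sketch). With `xs = [0, 1, -1]` and $w\equiv 1$ it uses the same inner product as the last problem; change `xs` and `ws` to fit your own data.

```python
import numpy as np
from numpy.polynomial import Polynomial as Poly

xs = np.array([0.0, 1.0, -1.0])   # data points x_1, ..., x_n
ws = np.ones_like(xs)             # weights w(x_k); here w(x) == 1

def inner(p, q):
    """Discrete weighted inner product <p, q> = sum_k p(x_k) q(x_k) w(x_k)."""
    return float(np.sum(p(xs) * q(xs) * ws))

# Orthogonalize the monomial basis 1, x, x^2, in that order.
monomials = [Poly([1.0]), Poly([0.0, 1.0]), Poly([0.0, 0.0, 1.0])]

ortho = []
for p in monomials:
    q = p
    for r in ortho:
        q = q - (inner(p, r) / inner(r, r)) * r   # subtract projection onto earlier polynomials
    ortho.append(q)

for q in ortho:
    print(q.coef)   # coefficients of each orthogonal polynomial, constant term first
```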