Problem 1. For simplicity let's denote $A=|a_0|^2+|a_1|^2+\cdots +|a_{n-1}|^2$, let $z_0$ be such that $P(z_0)=0$, we are going to show that $|z_0|^2\leq 1+A$.
Since $z_0^n=-a_{n-1}z_0^{n-1}-\cdots -a_1z_0-a_0$, by triangle inequality and then Cauchy-Schwarz inequality, we have \begin{align*}
|z_0|^n&\leq |a_{n-1}||z_0^{n-1}|+\cdots+ |a_2||z_0|^2+|a_1||z_0|+|a_0|\cdot 1\\
&\leq \sqrt{(|a_{n-1}|^2+\cdots +|a_2|^2+|a_1|^2+|a_0|^2)(|z_0^{n-1}|^2+\cdots+|z_0^2|^2+ |z_0|^2+1^2)}.
\end{align*} Here the Cauchy-Schwarz inequality is very helpful in separating the two sets of numbers. Setting $x=|z_0|$ and squaring both sides, we have \begin{equation}\label{ineq}
x^{2n}\leq A\sum_{k=0}^{n-1}(x^2)^k=A\frac{x^{2n}-1}{x^2-1}.
\end{equation} The closed form on the right is valid only for $x\neq 1$, which suggests dividing the problem into two cases. When $x\leq 1$, then $x\leq \sqrt{1+A}$ holds trivially, so we are done.
Suppose that $x>1$; then the closed form in (\ref{ineq}) makes sense, and the inequality is equivalent to \[
\frac{x^{2n}(x^2-1)}{x^{2n}-1}\leq A.
\] We observe that if $x^2-1\leq \frac{x^{2n}(x^2-1)}{x^{2n}-1}$, then we are done. Indeed, since \[
x^2-1\leq \frac{x^{2n}(x^2-1)}{x^{2n}-1}\iff (x^2-1)(x^{2n}-1)\leq x^{2n+2}-x^{2n}\iff 1\leq x^2,
\] and the last inequality holds because we are in the case $x>1$. Therefore \[
x^2-1\leq A\iff x\leq \sqrt{1+A},
\] as desired.
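The bound $|z_0|\leq \sqrt{1+A}$ can be sanity-checked numerically. Below is a minimal sketch in plain Python (standard library only): it builds a monic polynomial from arbitrarily chosen roots, reads off $A$ from the coefficients, and checks that every root satisfies the bound. The helper `poly_from_roots` is our own construction, not part of the problem.

```python
from math import sqrt

def poly_from_roots(roots):
    # Multiply monic linear factors (x - r); returns coefficient list
    # coeffs[k] = coefficient of x^k, with leading coefficient 1.
    coeffs = [1.0 + 0j]
    for r in roots:
        coeffs = [0j] + coeffs          # multiply current polynomial by x
        for i in range(len(coeffs) - 1):
            coeffs[i] -= r * coeffs[i + 1]   # subtract r * (old polynomial)
    return coeffs

# arbitrarily chosen roots of a monic polynomial
roots = [2 + 1j, -3, 0.5j, 1.5 - 2j]
coeffs = poly_from_roots(roots)

# A = |a_0|^2 + |a_1|^2 + ... + |a_{n-1}|^2 (all but the leading coefficient)
A = sum(abs(c) ** 2 for c in coeffs[:-1])

# every root z_0 satisfies |z_0| <= sqrt(1 + A)
assert all(abs(r) <= sqrt(1 + A) for r in roots)
```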
Definition. A function $\inner{\cdot,\cdot}:V\times V\to \R$ defined on a vector space $V$ is said to be an inner product if
(i) Symmetric: $\inner{u,v}=\inner{v,u}$.
(ii) Bilinear: $\inner{u+aw,v}=\inner{u,v}+a\inner{w,v}$ and $\inner{u,v+aw}=\inner{u,v}+a\inner{u,w}$.
(iii) Positive Definite: $\inner{v,v}\ge 0$ and $\inner{v,v}=0\implies v=0$.
Remark. When $\inner{\cdot,\cdot}$ is symmetric, it is bilinear iff it is linear in one of the two variables. For example, if $\inner{\cdot,\cdot}$ is symmetric and linear in the first variable: \[
\inner{u+aw,v}=\inner{u,v}+a\inner{w,v},
\] then \[
\inner{u,v+aw}=\inner{v+aw,u}=\inner{v,u}+a\inner{w,u}=\inner{u,v}+a\inner{u,w},
\] so $\inner{\cdot,\cdot}$ is also linear in the second variable, i.e., it is bilinear.
Problem 2. (a) We use the property that $\tr C=\tr C^T$ for every square matrix $C$:
$\boxed{1}$: $\inner{B,A}=\tr (A^TB) = \tr( (A^TB)^T)=\tr(B^TA)=\inner{A,B}$.
$\boxed{2}$: We prove linearity in the first variable:\begin{align*}
\inner{A+\alpha C,B}&=\tr(B^T(A+\alpha C))\\
&=\tr(B^TA + \alpha B^TC)\\
&=\tr (B^TA)+\alpha \tr (B^TC)\\
&=\inner{A,B} +\alpha \inner{C,B}.
\end{align*} Thus by symmetry, $\inner{\cdot,\cdot}$ is bilinear (as explained in the remark above).
$\boxed{3}$: To prove positive definiteness, we observe that for $A=\matrixx{a_1&a_2&\cdots &a_n}$ and $B=\matrixx{b_1&b_2&\cdots &b_n}$ (written in columns), \begin{align*}
\inner{A,B}&=\tr(B^TA)\\
&=\tr\brac{\matrixx{\text{--- }b_1^T\text{ ---}\\ \text{--- }b_2^T \text{ ---}\\ \vdots \\ \text{--- }b_n^T\text{ ---}}\matrixx{a_1&a_2&\cdots &a_n}}\\
&=\tr\brac{\matrixx{
b_1\cdot a_1&b_1\cdot a_2&\cdots & b_1\cdot a_n\\
b_2\cdot a_1&b_2\cdot a_2&\cdots & b_2\cdot a_n\\
\vdots &\vdots&\ddots &\vdots\\
b_n\cdot a_1&b_n\cdot a_2&\cdots & b_n\cdot a_n }
}\\
&=a_1\cdot b_1 + a_2\cdot b_2 +\cdots +a_n\cdot b_n.
\end{align*} Here for $x,y\in \R^n$, the notation $x\cdot y$ means the dot product. So our inner product on $M_{n\times n}(\R)$ is nothing but the sum of dot products of the corresponding column vectors. Therefore, for a matrix $C=\matrixx{c_1&\cdots &c_n}$, \[
\inner{C,C}=\|c_1\|^2+\|c_2\|^2+\cdots +\|c_n\|^2\ge 0
\]and also \[
\inner{C,C}=0\implies \|c_1\|^2+\|c_2\|^2+\cdots +\|c_n\|^2=0\implies c_1=c_2=\cdots=c_n=0,
\] in other words, $C=0$, the zero matrix. Thus $\inner{\cdot,\cdot}$ is positive definite.
Altogether, $\inner{\cdot,\cdot}$ is an inner product.
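As a quick numerical sanity check (a plain-Python sketch with a small arbitrary example, not part of the solution), one can verify that $\tr(B^TA)$ equals the entrywise sum $\sum_{i,j}A_{ij}B_{ij}$, i.e., the sum of dot products of corresponding columns:

```python
def transpose(M):
    return [list(row) for row in zip(*M)]

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def trace(M):
    return sum(M[i][i] for i in range(len(M)))

# arbitrary 2x2 example
A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]

inner_AB = trace(matmul(transpose(B), A))
entrywise = sum(A[i][j] * B[i][j] for i in range(2) for j in range(2))

assert inner_AB == entrywise          # tr(B^T A) = sum of columnwise dot products
assert trace(matmul(transpose(A), A)) > 0   # <A, A> > 0 for A != 0
```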
(b) Let $S=\{A,B,C\}$ denote the given matrices; a direct computation gives \[
\inner{A,B}=\inner{A,C}=\inner{B,C}=0,
\] so $S$ is orthogonal. In fact they are orthonormal since $\inner{A,A}=\inner{B,B}=\inner{C,C}=1$.
Problem 3. For $\inner{f,g}$ defined by $\int_a^bfg\,dx$, symmetry and bilinearity of $\inner{\cdot,\cdot}$ are clear. Next we check that $\inner{\cdot,\cdot}$ is positive definite: let $f\in C[a,b]$, then \[
\inner{f,f}=\int_a^b\underbrace{f^2(x)}_{\ge 0}\,dx \ge 0.
\] Moreover, if $\inner{f,f}=\int_a^bf^2(x)\,dx=0$, then since $f$ is continuous and $f^2(x)\ge 0$, the vanishing of the integral forces $f^2(x)=0$ on $[a,b]$, and it follows that $f(x)=0$ on $[a,b]$. So $\inner{\cdot,\cdot}$ is positive definite.
Problem 4. (a) Choose $u_1=1$, $u_2=x$, $u_3=x^2$, which form a basis of $\mathcal P'$, and orthogonalize it by the Gram-Schmidt process as follows: \begin{align*}
v_1&=1\\
v_2&=u_2 -\sum_{i<2}\frac{\inner{u_2,v_i}}{\|v_i\|^2}v_i\\
v_3&=u_3 -\sum_{i<3}\frac{\inner{u_3,v_i}}{\|v_i\|^2}v_i.
\end{align*} So to obtain $v_2,v_3$, we must first work out the quantities on the RHS.
For $v_2$, since $\dis \inner{u_2,v_1} = \int_0^1 x\cdot 1\,dx =\frac{x^2}{2}\bigg|_0^1=\frac{1}{2}$, we have \[
v_2=x-\frac{\frac{1}{2} \cdot 1}{1}=x-\frac{1}{2}.
\] For $v_3$, we compute:
$\inner{u_3,v_1}=\inner{x^2,1}= \int_0^1 x^2\,dx = \frac{1}{3}$, $\|v_1\|^2=1$
$\inner{u_3,v_2}=\inner{x^2,x-\frac{1}{2}} =\int_0^1 x^2\brac{x-\frac{1}{2}}\,dx = \frac{1}{12}$, $\|v_2\|^2=\frac{1}{12}$.
Combining these calculations, we get \[
v_3=x^2-\frac{\frac{1}{3}\cdot 1}{1} - \frac{\frac{1}{12} (x-\frac{1}{2})}{\frac{1}{12}}=x^2-\frac{1}{3}-x+\frac{1}{2} =x^2-x+\frac{1}{6}.
\]
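The orthogonality of $v_1,v_2,v_3$ on $[0,1]$ can be verified exactly with rational arithmetic. The following sketch (plain Python, polynomials stored as coefficient lists, all helpers our own) confirms that the pairwise inner products vanish and that $\|v_2\|^2=\frac{1}{12}$:

```python
from fractions import Fraction as F

def integrate01(p):
    # exact integral over [0,1] of polynomial with coefficients p[k] of x^k
    return sum(F(c) / (k + 1) for k, c in enumerate(p))

def mul(p, q):
    # polynomial multiplication via coefficient convolution
    out = [F(0)] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += F(a) * F(b)
    return out

v1 = [F(1)]                       # 1
v2 = [F(-1, 2), F(1)]             # x - 1/2
v3 = [F(1, 6), F(-1), F(1)]       # x^2 - x + 1/6

assert integrate01(mul(v1, v2)) == 0
assert integrate01(mul(v1, v3)) == 0
assert integrate01(mul(v2, v3)) == 0
assert integrate01(mul(v2, v2)) == F(1, 12)   # ||v_2||^2 = 1/12
```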
(b) We reuse the $v_i$'s obtained above: in fact, for $i\neq j$,\begin{align*}
0&=\int_0^1 v_i(x)v_j(x)\,dx\\
&=\int_{-1/2}^{1/2} v_i(x+1/2)v_j(x+1/2)\,dx\\
&=\frac{1}{2\pi}\int_{-\pi}^{\pi} v_i\brac{\frac{x}{2\pi} +\frac{1}{2} }v_j\brac{\frac{x}{2\pi} +\frac{1}{2}}\,dx.
\end{align*} Hence $\dis v_i\brac{\frac{x}{2\pi} +\frac{1}{2}}$, $i=1,2,3$, are orthogonal on $[-\pi,\pi]$, so they will do.
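One can also check numerically that the shifted polynomials are orthogonal on $[-\pi,\pi]$. Here is a midpoint-rule sketch in Python (the step count `n=20000` is an arbitrary choice; the polynomials are those from part (a)):

```python
from math import pi

def v(i, t):
    # v_1, v_2, v_3 from part (a), evaluated at t
    return [1.0, t - 0.5, t * t - t + 1.0 / 6][i]

def inner(i, j, n=20000):
    # midpoint rule for (1/2pi) * integral over [-pi, pi]
    h = 2 * pi / n
    s = 0.0
    for k in range(n):
        x = -pi + (k + 0.5) * h
        t = x / (2 * pi) + 0.5      # the substitution from part (b)
        s += v(i, t) * v(j, t)
    return s * h / (2 * pi)

# the three shifted polynomials are pairwise orthogonal on [-pi, pi]
assert abs(inner(0, 1)) < 1e-6
assert abs(inner(0, 2)) < 1e-6
assert abs(inner(1, 2)) < 1e-6
```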
Problem 5. ($\Rightarrow$) Suppose $\inner{u,v}=0$, then \[
\|u+av\|^2=\|u\|^2+\|av\|^2\ge \|u\|^2,
\] since the cross term $2\inner{u,av}=2a\inner{u,v}$ vanishes; hence $\|u+av\|^2\ge \|u\|^2$ for every $a\in \R$.
($\Leftarrow$) Suppose $\|u\|\leq \|u+av\|$ for every $a\in \R$, then for every $a\in \R$, \begin{align*}
\|u\|^2&\leq \|u+av\|^2\\
\|u\|^2&\leq \|u\|^2+a^2\|v\|^2 +2a\inner{u,v}\\
-2a\inner{u,v}&\leq a^2\|v\|^2.
\end{align*} Choose $\alpha\in\{1,-1\}$ such that $\alpha \inner{u,v}=|\inner{u,v}|$ and replace $a$ by $-t\alpha$, $t\in \R$; then \[
2t|\inner{u,v}|\leq t^2\|v\|^2.
\] For $t>0$, divide both sides by $t$ and let $t\to 0^+$ to conclude $|\inner{u,v}|=0$, i.e., $\inner{u,v}=0$, as desired.
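The forward direction can be illustrated numerically (a Python sketch with an arbitrarily chosen orthogonal pair $u\perp v$): when $\inner{u,v}=0$, the quantity $\|u+av\|^2$ is minimized at $a=0$.

```python
def dot(u, v):
    return sum(x * y for x, y in zip(u, v))

u = [3.0, 4.0, 0.0]
v = [0.0, 0.0, 2.0]          # chosen so that <u, v> = 0
assert dot(u, v) == 0

for a in [-5, -1, -0.1, 0, 0.1, 1, 5]:
    w = [x + a * y for x, y in zip(u, v)]
    assert dot(w, w) >= dot(u, u)    # ||u + a v||^2 >= ||u||^2
```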
Problem 6. (a) (Note: there is a typo in the statement; $S$ should at least contain $0$.) Let $x\in S\cap S^\perp$; then $x\in S$ and $\inner{x,s}=0$ for every $s\in S$. In particular, taking $s=x$ gives $\inner{x,x}=0$. Since $\inner{\cdot,\cdot}$ is an inner product, we have $x=0$. Hence \[
S\cap S^\perp \subseteq\{0\},
\] and as both $S$ and $S^\perp$ contain $0$, we have $S\cap S^\perp =\{0\}$.
For the second part of (a), let $x\in (W_1+W_2)^\perp$; then $\inner{x,v}=0$ for all $v\in W_1+W_2$. In particular, $\inner{x,v}=0$ for all $v\in W_1$ and also for all $v\in W_2$, so $x\in W_1^\perp \cap W_2^\perp$. Thus \[(W_1+W_2)^\perp\subseteq W_1^\perp \cap W_2^\perp.\] Conversely, let $x\in W_1^\perp \cap W_2^\perp$; then for every $w_1\in W_1$ and every $w_2\in W_2$, \[
\inner{x,w_1}=\inner{x,w_2}=0\implies \inner{x,w_1+w_2}=0,
\] so $\inner{x,w}=0$ for all $w\in W_1+W_2$, thus $x\in (W_1+W_2)^\perp$, so \[
W_1^\perp \cap W_2^\perp\subseteq (W_1+W_2)^\perp.
\] (b) Let $P$ denote the collection of ALL polynomials on $[0,1]$. $P$ is viewed as a subspace of $C[0,1]$, we define an inner product on $C[0,1]$ by, for $f,g\in C[0,1]$, \[
\inner{f,g}=\int_0^1f(x)g(x)\,dx.
\] We will show that $(P^\perp)^\perp =C[0,1]$, hence $(P^\perp)^\perp\neq P$. Here $A^\perp$ means a subspace of elements in $C[0,1]$ which are orthogonal to all elements in $A$, i.e., $\cdot^\perp$ denotes the orthogonal complement in $C[0,1]$.
What is $P^\perp$? A function $g\in P^\perp$ iff \[
\inner{g,p}=\int_0^1 gp\,dx=0,\quad \forall p\in P.
\] As $g\in C[0,1]$, there is, by the Weierstrass approximation theorem, a sequence of polynomials $(p_n)$ such that for each $n$, \[
\max_{x\in [0,1]} |g-p_n|<\frac{1}{n},
that is, $p_n\to g$ uniformly on $[0,1]$. Since uniform convergence justifies the interchange of limit and integral, it follows that \[
0=\limn \int_0^1gp_n\,dx = \int_0^1g\limn p_n\,dx=\int_0^1g^2\,dx,
\] so $g=0$ in $C[0,1]$. In conclusion, we have shown that $g\in P^\perp\implies g=0$, so $P^\perp\subseteq \{0\}$, but $0\in P^\perp$ of course, so $P^\perp =\{0\}$.
Therefore we have \[
(P^\perp)^\perp=\{0\}^\perp = C[0,1]\neq P.
\]
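The Weierstrass approximation step can be illustrated with Bernstein polynomials, which converge uniformly to any $g\in C[0,1]$. The sketch below (Python, with $g(x)=\cos(\pi x)$ as an arbitrary test function) checks that the uniform error on a grid shrinks as the degree grows:

```python
from math import comb, cos, pi

def bernstein(g, n, x):
    # degree-n Bernstein polynomial of g evaluated at x
    return sum(g(k / n) * comb(n, k) * x**k * (1 - x)**(n - k)
               for k in range(n + 1))

def uniform_error(g, n, grid=200):
    # max |g - B_n g| sampled on an equispaced grid in [0,1]
    return max(abs(g(i / grid) - bernstein(g, n, i / grid))
               for i in range(grid + 1))

g = lambda x: cos(pi * x)

# higher degree gives a smaller uniform error
assert uniform_error(g, 64) < uniform_error(g, 8)
```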