Understanding these basic definitions and statements:
- Let $\left(a_{k}\right)$ be a sequence of real numbers. We use the notation $s_{n}=\sum_{k=0}^{n} a_{k}$ to denote the $n$th partial sum of the infinite series $s_{\infty}=\sum_{k=0}^{\infty} a_{k}$. If the sequence of partial sums $\left(s_{n}\right)$ converges to a real number $s$, we say that the series $\sum_{k} a_{k}$ is convergent and we write $s=\sum_{k=0}^{\infty} a_{k}$. A series that is not convergent is called divergent.
- An infinite series $\sum_{k=0}^{\infty} a_{k}$ is said to converge absolutely if $\sum_{k=0}^{\infty}\left|a_{k}\right|$ converges. If a series converges absolutely, then it converges. Furthermore, an absolutely convergent series converges to the same sum in whatever order the terms are taken.
If $\sum_{k=0}^{\infty} a_{k}$ converges but $\sum_{k=0}^{\infty}\left|a_{k}\right|$ diverges, then we say $\sum_{k=0}^{\infty} a_{k}$ converges conditionally. Any conditionally convergent series can be rearranged to obtain a series that converges to any given sum or diverges to $\infty$ or $-\infty$.
- Ratio Test: Given a series $\sum_{k=1}^{\infty} a_{k}$ with $a_{k} \neq 0$, if $a_{k}$ satisfies
$$
\lim _{k \rightarrow \infty}\left|\frac{a_{k+1}}{a_{k}}\right|=r<1
$$
then the series converges absolutely.
- Root Test: Let $\sum_{k=1}^{\infty} a_{k}$ be a series and
$$
\alpha:=\lim _{k \rightarrow \infty} \sqrt[k]{\left|a_{k}\right|}
$$
Then the following hold.
a) $\sum_{k=1}^{\infty} a_{k}$ converges absolutely if $\alpha<1$.
b) $\sum_{k=1}^{\infty} a_{k}$ diverges if $\alpha>1$.
For $\alpha=1$ both convergence and divergence of $\sum_{k=1}^{\infty} a_{k}$ are possible.
- Cauchy Criterion for Series: The series $\sum_{k=1}^{\infty} a_{k}$ converges if and only if, given $\varepsilon>0$, there exists $N \in \mathbb{N}$ such that whenever $n>m \geq N$ it follows that
$$
\left|a_{m+1}+a_{m+2}+\cdots+a_{n}\right|<\varepsilon
$$
- Comparison Test: Assume $\left(a_{k}\right)$ and $\left(b_{k}\right)$ are sequences satisfying $0 \leq a_{k} \leq b_{k}$ for all $k \in \mathbb{N}$.
a) If $\sum_{k=1}^{\infty} b_{k}$ converges, then $\sum_{k=1}^{\infty} a_{k}$ converges.
b) If $\sum_{k=1}^{\infty} a_{k}$ diverges, then $\sum_{k=1}^{\infty} b_{k}$ diverges.
- Geometric Series: A series is called geometric if it is of the form
$$
\sum_{k=0}^{\infty} a r^{k}=a+a r+a r^{2}+a r^{3}+\cdots
$$
For $a \neq 0$, this series converges if and only if $|r|<1$, in which case
$$
\sum_{k=0}^{\infty} a r^{k}=\frac{a}{1-r} .
$$
If $|r| \geq 1$ and $a \neq 0$, the series diverges. (A brief numerical illustration of these tests appears after this list.)
- Alternating Series Test: Let $\left(a_{n}\right)$ be a sequence satisfying
a) $a_{1} \geq a_{2} \geq a_{3} \geq \cdots \geq a_{n} \geq a_{n+1} \geq \cdots$ and
b) $\left(a_{n}\right) \rightarrow 0$.
Then the alternating series $\sum_{n=1}^{\infty}(-1)^{n+1} a_{n}$ converges.
- Let $\sum_{k=1}^{\infty} a_{k}$ be a series. A rearrangement of it is a series $\sum_{k=1}^{\infty} a_{\sigma(k)}$, where $\sigma$ is a permutation of $\{1,2,3, \ldots\}$. The summands of the rearrangement $\sum_{k=1}^{\infty} a_{\sigma(k)}$ are the same as those of the original series, but they occur in a different order. If $\sigma$ is a permutation of $\mathbb{N}$ with $\sigma(k)=k$ for all but finitely many $k \in \mathbb{N}$, then $\sum_{k=1}^{\infty} a_{k}$ and $\sum_{k=1}^{\infty} a_{\sigma(k)}$ have the same convergence behavior, and their values are equal if the series converge. For a permutation with $\sigma(k) \neq k$ for infinitely many $k \in \mathbb{N}$, this may fail.
- If $\sum_{k=1}^{\infty} a_{k}$ converges absolutely, then any rearrangement of this series converges to the same limit.
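To see these criteria in action numerically, here is a minimal Python sketch (my own addition; the example series are chosen only for illustration). It compares the partial sums of a geometric series with $a/(1-r)$, and looks at the ratio-test quantity for a convergent series and for a borderline one.

```python
import math

# Partial sums of a geometric series sum_{k>=0} a r^k, compared with a/(1-r).
a, r = 3.0, 0.5
partial = sum(a * r**k for k in range(50))
print(partial, a / (1 - r))                        # both close to 6.0

# Ratio-test quantity |a_{k+1}/a_k| for a_k = 1/k!  (limit 0 < 1, so it converges)
ak = lambda k: 1.0 / math.factorial(k)
print([ak(k + 1) / ak(k) for k in (5, 10, 20)])    # tends to 0

# For a_k = 1/k^2 the ratio tends to 1, so the ratio test is inconclusive,
# yet the partial sums still converge (increasing and bounded).
s = [sum(1.0 / j**2 for j in range(1, n + 1)) for n in (10, 100, 1000)]
print(s)                                           # approaches pi^2/6 ~ 1.6449
```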
As a brief aside, here is a useful symmetry trick for definite integrals. Consider $\int_{0}^{\pi} x e^{\sin (x)} d x$ and use the substitution $u=\pi-x$. Then we have
$$
\int_{0}^{\pi} x e^{\sin (x)} d x=\int_{\pi}^{0}-(\pi-u) e^{\sin (\pi-u)} d u=\int_{0}^{\pi}(\pi-u) e^{\sin (u)} d u,
$$
which implies
$$
\int_{0}^{\pi} x e^{\sin (x)} d x=\pi \int_{0}^{\pi} e^{\sin (u)} d u-\int_{0}^{\pi} u e^{\sin (u)} d u .
$$
Since
$$
\int_{0}^{\pi} x e^{\sin (x)} d x=\int_{0}^{\pi} u e^{\sin (u)} d u,
$$
we get
$$
\int_{0}^{\pi} x e^{\sin (x)} d x=\frac{\pi}{2} \int_{0}^{\pi} e^{\sin (x)} d x .
$$
Try the following problem to test yourself!
Show that Euler's series $\sum_{k=1}^{\infty} \frac{1}{k^{2}}$ converges, and find the sum of the series.
Because the terms in the sum are all positive, the sequence of partial sums given by
$$
s_{n}=1+\frac{1}{4}+\frac{1}{9}+\cdots+\frac{1}{n^{2}}
$$
is increasing. To find an upper bound for $s_{n}$, observe
$$
\begin{aligned}
s_{n} &=1+\frac{1}{2 \cdot 2}+\frac{1}{3 \cdot 3}+\cdots+\frac{1}{n \cdot n} \\
&<1+\frac{1}{2 \cdot 1}+\frac{1}{3 \cdot 2}+\cdots+\frac{1}{n \cdot(n-1)} \\
&=1+\left(1-\frac{1}{2}\right)+\left(\frac{1}{2}-\frac{1}{3}\right)+\cdots+\left(\frac{1}{n-1}-\frac{1}{n}\right) \\
&=1+1-\frac{1}{n} \\
&<2 .
\end{aligned}
$$
Thus 2 is an upper bound for the sequence of partial sums, so by the Monotone Convergence Theorem, $\sum_{k=1}^{\infty} \frac{1}{k^{2}}$ converges to a limit at most 2. Next, we claim that $\sum_{k=1}^{\infty} \frac{1}{k^{2}}=\frac{\pi^{2}}{6}$. This can be shown by the well-known trick of evaluating the double integral
$$
I=\int_{0}^{1} \int_{0}^{1} \frac{1}{1-x y} d x d y
$$
in two different ways. First, notice that
$$
\frac{1}{1-x y}=\sum_{k=0}^{\infty}(x y)^{k},
$$
therefore
$$
\begin{aligned}
I &=\int_{0}^{1} \int_{0}^{1} \sum_{k=0}^{\infty}(x y)^{k} d x d y \\
&=\sum_{k=0}^{\infty} \int_{0}^{1} \int_{0}^{1}(x y)^{k} d x d y \\
&=\sum_{k=0}^{\infty}\left(\int_{0}^{1} x^{k} d x\right)\left(\int_{0}^{1} y^{k} d y\right) \\
&=\sum_{k=0}^{\infty} \frac{1}{(k+1)^{2}} \\
&=\sum_{k=1}^{\infty} \frac{1}{k^{2}} .
\end{aligned}
$$
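The term-by-term integration above is legitimate because every term is nonnegative. As a rough numerical sanity check (my own sketch, not part of the argument), a midpoint-rule estimate of the double integral should land near the partial sums of $\sum 1/k^{2}$ and near $\pi^{2}/6$.

```python
import numpy as np

# Midpoint-rule estimate of I = int_0^1 int_0^1 1/(1 - xy) dx dy.
# Midpoints never touch the boundary, so the singular corner (1,1) is avoided.
n = 2000
m = (np.arange(n) + 0.5) / n            # midpoints of n equal subintervals
X, Y = np.meshgrid(m, m)
I_mid = np.sum(1.0 / (1.0 - X * Y)) / n**2

# Partial sum of sum_{k>=1} 1/k^2 for comparison.
partial = np.sum(1.0 / np.arange(1, 5000) ** 2)

print(I_mid, partial, np.pi**2 / 6)     # all roughly 1.645, up to discretization error
```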
The second way to evaluate $I$ comes from a change of variables. Let
$$
u=\frac{x+y}{2} \quad \text { and } \quad v=\frac{y-x}{2}
$$
or equivalently
$$
x=u-v \quad \text { and } \quad y=u+v
$$
Given this transformation,
$$
\frac{1}{1-x y}=\frac{1}{1-\left(u^{2}-v^{2}\right)}
$$
and using the change of variables formula we obtain
$$
I=\iint f(x, y) d x d y=\iint f(x(u, v), y(u, v))\left|\frac{d(x, y)}{d(u, v)}\right| d u d v
$$
where
$$
\left|\frac{d(x, y)}{d(u, v)}\right|=2
$$
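As a quick symbolic check of this substitution (my own addition, using SymPy), the Jacobian determinant is indeed $2$ and $1-xy$ becomes $1-\left(u^{2}-v^{2}\right)$:

```python
import sympy as sp

u, v = sp.symbols('u v', real=True)
x, y = u - v, u + v                        # the substitution x = u - v, y = u + v

# Jacobian determinant d(x, y)/d(u, v)
J = sp.Matrix([[sp.diff(x, u), sp.diff(x, v)],
               [sp.diff(y, u), sp.diff(y, v)]]).det()
print(J)                                   # 2

# The denominator 1 - xy in the new variables
print(sp.expand(1 - x * y))                # -u**2 + v**2 + 1, i.e. 1 - (u^2 - v^2)
```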
Since the integrand and the domain in the $u v$-plane are both symmetric with respect to the $u$-axis, we can split the integral into two parts as follows:
$$
\begin{aligned}
I &=4 \int_{0}^{1 / 2}\left(\int_{0}^{u} \frac{d v}{1-u^{2}+v^{2}}\right) d u+4 \int_{1 / 2}^{1}\left(\int_{0}^{1-u} \frac{d v}{1-u^{2}+v^{2}}\right) d u \\
&=4 \int_{0}^{1 / 2} \frac{1}{\sqrt{1-u^{2}}} \arctan \left(\frac{u}{\sqrt{1-u^{2}}}\right) d u+4 \int_{1 / 2}^{1} \frac{1}{\sqrt{1-u^{2}}} \arctan \left(\frac{1-u}{\sqrt{1-u^{2}}}\right) d u .
\end{aligned}
$$
Now, observe that if we set
$$
k(u)=\arctan \left(\frac{u}{\sqrt{1-u^{2}}}\right) \text { and } h(u)=\arctan \left(\frac{1-u}{\sqrt{1-u^{2}}}\right),
$$
then, since $k(u)=\arcsin (u)$ and $h(u)=\frac{1}{2} \arccos (u)$, we obtain the derivatives
$$
k^{\prime}(u)=\frac{1}{\sqrt{1-u^{2}}} \text { and } h^{\prime}(u)=-\frac{1}{2} \cdot \frac{1}{\sqrt{1-u^{2}}} \text {. }
$$
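These derivative formulas are easy to sanity-check with finite differences; the sketch below is my own addition and the sample points are arbitrary.

```python
import numpy as np

k = lambda u: np.arctan(u / np.sqrt(1 - u**2))
h = lambda u: np.arctan((1 - u) / np.sqrt(1 - u**2))

eps = 1e-6
for u in (0.1, 0.3, 0.45, 0.7):
    dk = (k(u + eps) - k(u - eps)) / (2 * eps)    # central difference
    dh = (h(u + eps) - h(u - eps)) / (2 * eps)
    print(dk, 1 / np.sqrt(1 - u**2))              # agree up to finite-difference error
    print(dh, -0.5 / np.sqrt(1 - u**2))           # likewise
```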
Substituting these derivatives into the two integrals yields
$$
\begin{aligned}
I &=4 \int_{0}^{1 / 2} k^{\prime}(u) k(u) d u+4 \int_{1 / 2}^{1}-2 h^{\prime}(u) h(u) d u \\
&=\left.2(k(u))^{2}\right|_{0} ^{1 / 2}-\left.4(h(u))^{2}\right|_{1 / 2} ^{1} \\
&=2(k(1 / 2))^{2}-2(k(0))^{2}-4(h(1))^{2}+4(h(1 / 2))^{2} \\
&=2\left(\frac{\pi}{6}\right)^{2}-0-0+4\left(\frac{\pi}{6}\right)^{2} \\
&=\frac{\pi^{2}}{6} .
\end{aligned}
$$
Combining the two evaluations of $I$ gives $\sum_{k=1}^{\infty} \frac{1}{k^{2}}=\frac{\pi^{2}}{6}$, as claimed.
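As a final numerical cross-check (again my own sketch), approximating the two integrals from the last display with a simple midpoint rule reproduces $\pi^{2}/6$, matching the partial sums of the series.

```python
import numpy as np

def midpoint(f, a, b, n=200000):
    """Midpoint-rule approximation of the integral of f over [a, b]."""
    u = a + (b - a) * (np.arange(n) + 0.5) / n
    return (b - a) * np.mean(f(u))

f1 = lambda u: np.arctan(u / np.sqrt(1 - u**2)) / np.sqrt(1 - u**2)
f2 = lambda u: np.arctan((1 - u) / np.sqrt(1 - u**2)) / np.sqrt(1 - u**2)

I = 4 * midpoint(f1, 0.0, 0.5) + 4 * midpoint(f2, 0.5, 1.0)
partial = np.sum(1.0 / np.arange(1, 10**6) ** 2)

print(I, partial, np.pi**2 / 6)   # all approximately 1.6449
```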
Let $T:(C[0,1],\|\cdot\|) \rightarrow(C[0,1],\|\cdot\|)$ be the map defined by
$$
T f(x)=\int_{0}^{x} f(t) d t,
$$
where by $C[0,1]$ we mean the vector space of all continuous real-valued functions defined on $[0,1]$ and $\|f\|=\sup _{x \in[0,1]}|f(x)|$.
Discuss the convergence or divergence of $\sum x_{n}$ where
$$
x_{n}=\frac{1 \cdot 3 \cdots(2 n-1)}{2 \cdot 4 \cdots(2 n)}
$$
Note that
$$
x_{n}=\frac{(2 n) !}{\left(2^{n} n !\right)^{2}}=\frac{(2 n) !}{2^{2 n}(n !)^{2}}
$$
Let us first remark that $\lim _{n \rightarrow \infty} x_{n}=0$. Indeed, notice that $0<x_{n}<1$ for any $n \geq 1$. Since $x_{n+1}=\frac{2 n+1}{2 n+2} x_{n}$, we have $x_{n+1}<x_{n}$ for any $n \geq 1$, which implies that $\left\{x_{n}\right\}$ is decreasing. So $\lim _{n \rightarrow \infty} x_{n}=l$ exists. Define
$$
y_{n}=\frac{2 \cdot 4 \cdots(2 n)}{3 \cdot 5 \cdots(2 n+1)}
$$
Since $4 n^{2}-1<4 n^{2}$ we get $\frac{2 n-1}{2 n}<\frac{2 n}{2 n+1}$ for any $n \geq 1$. This obviously implies $x_{n}<y_{n}$ for any $n \geq 1$. Since $x_{n} y_{n}=\frac{1}{2 n+1}$, we deduce that $x_{n}^{2}<x_{n} y_{n}=\frac{1}{2 n+1}$ for any $n \geq 1$. Hence $\lim _{n \rightarrow \infty} x_{n}^{2}=0$,
which implies $\lim _{n \rightarrow \infty} x_{n}=0$. This conclusion may suggest that the series $\sum x_{n}$ is convergent. But since
$$
\frac{x_{n+1}}{x_{n}}=\frac{2 n+1}{2 n+2}
$$
we have
$$
n\left(1-\frac{x_{n+1}}{x_{n}}\right)=\frac{n}{2 n+2}
$$
Hence $\lim _{n \rightarrow \infty} n\left(1-\frac{x_{n+1}}{x_{n}}\right)=\frac{1}{2}<1$, so $\sum x_{n}$ is divergent by the Raabe-type criterion established in the previous problem. Note that the ratio test does not help here, since $\lim _{n \rightarrow \infty} \frac{x_{n+1}}{x_{n}}=1$; the root test is likewise inconclusive.
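A small numerical illustration of this discussion (my own sketch): generating $x_{n}$ from the recursion $x_{n+1}=\frac{2 n+1}{2 n+2} x_{n}$ confirms the closed form, shows the Raabe-type quantity approaching $\frac{1}{2}$, and shows the partial sums growing without leveling off.

```python
import math

# x_n = (1*3*...*(2n-1)) / (2*4*...*(2n)), generated by x_{n+1} = (2n+1)/(2n+2) * x_n
x, s = 0.5, 0.5                  # x_1 and the partial sum s_1
for n in range(1, 100001):
    if n in (10, 100, 1000, 10000, 100000):
        # closed form (2n)! / (2^{2n} (n!)^2), via log-gamma to avoid huge integers
        closed = math.exp(math.lgamma(2*n + 1) - 2*math.lgamma(n + 1) - 2*n*math.log(2))
        raabe = n * (1 - (2*n + 1) / (2*n + 2))
        print(n, x, closed, raabe, s)   # x_n -> 0, Raabe quantity -> 1/2, s keeps growing
    x *= (2*n + 1) / (2*n + 2)
    s += x
```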
Given two series $\sum x_{n}$ and $\sum y_{n}$, define $z_{n}=\sum_{k=0}^{n} x_{k} y_{n-k} .$ Suppose that $\sum x_{n}$ and $\sum y_{n}$ are absolutely convergent. Show that $\sum z_{n}$ is absolutely convergent, and
$$
\sum z_{n}=\sum x_{n} \cdot \sum y_{n} .
$$
We show that the series
$$
x_{0} y_{0}+x_{0} y_{1}+x_{1} y_{1}+x_{1} y_{0}+x_{0} y_{2}+x_{1} y_{2}+x_{2} y_{2}+x_{2} y_{1}+x_{2} y_{0}+\cdots \tag{8.3}
$$
converges absolutely. In fact, the sum of the first $(n+1)^{2}$ terms of the series
$$
\left|x_{0} y_{0}\right|+\left|x_{0} y_{1}\right|+\left|x_{1} y_{1}\right|+\left|x_{1} y_{0}\right|+\left|x_{0} y_{2}\right|+\left|x_{1} y_{2}\right|+\left|x_{2} y_{2}\right|+\left|x_{2} y_{1}\right|+\left|x_{2} y_{0}\right|+\cdots \tag{8.4}
$$
is $\sum_{k=0}^{n}\left|x_{k}\right| \cdot \sum_{k=0}^{n}\left|y_{k}\right|$, which converges to $\sum_{k=0}^{\infty}\left|x_{k}\right| \cdot \sum_{k=0}^{\infty}\left|y_{k}\right|$.
Hence the sequence of partial sums of $(8.4)$ has a convergent subsequence. But all the terms of the series $(8.4)$ are nonnegative, so the sequence of all partial sums is increasing and bounded above by $\sum_{k=0}^{\infty}\left|x_{k}\right| \cdot \sum_{k=0}^{\infty}\left|y_{k}\right|$, so it also converges, to the same limit. Thus $(8.3)$ converges absolutely. The same argument as above, considering sums of the first $(n+1)^{2}$ terms of the series $(8.3)$, shows that the sum of this series is $\sum x_{n} \cdot \sum y_{n}$. But
$$
\sum z_{n}=x_{0} y_{0}+\left(x_{0} y_{1}+x_{1} y_{0}\right)+\left(x_{0} y_{2}+x_{1} y_{1}+x_{2} y_{0}\right)+\cdots
$$
is a rearrangement of $(8.3)$ with its terms grouped, so by the Rearrangement Theorem (and the fact that grouping terms does not change the sum of a convergent series) it converges to the same limit. Note: You might like to think about the above proof by considering an "infinite matrix" in which the $(i, j)$ entry is $x_{i} y_{j}$:
$$
\begin{array}{ccccc}
x_{0} y_{0} & x_{0} y_{1} & x_{0} y_{2} & x_{0} y_{3} & \cdots \\
x_{1} y_{0} & x_{1} y_{1} & x_{1} y_{2} & x_{1} y_{3} & \cdots \\
x_{2} y_{0} & x_{2} y_{1} & x_{2} y_{2} & x_{2} y_{3} & \cdots \\
\vdots & \vdots & \vdots & \vdots & \ddots
\end{array}
$$
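To illustrate the theorem numerically (my own sketch; the two geometric series below are an arbitrary choice of absolutely convergent series):

```python
import numpy as np

N = 60
x = 0.5 ** np.arange(N)            # x_n = (1/2)^n,  sum = 2
y = (-1.0 / 3.0) ** np.arange(N)   # y_n = (-1/3)^n, sum = 3/4 (absolutely convergent)

# Cauchy product terms z_n = sum_{k=0}^{n} x_k * y_{n-k}
z = np.array([np.dot(x[:n + 1], y[n::-1]) for n in range(N)])

print(z.sum(), x.sum() * y.sum())  # both approximately 1.5 = 2 * (3/4)
```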