Advanced Probability Theory

UNIVERSITY OF WARWICK
Third Year EXAMINATIONS: SUMMER 2021

Time allowed: 2 hours
Approved calculators may be used.
Full marks may be obtained by correctly answering 3 complete questions. Candidates may attempt all questions. Marks will be awarded for the best 3 answers only.

Problem 1.

(a) State and prove the first and second Borel-Cantelli lemmas.
For the rest of this question you may use, without proof, Kolmogorov’s zero-one law.
(b) Let $(X_n)_{n \geq 1}$ be a sequence of independent random variables on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$ taking values in $\{1, 2, 3, \ldots\}$. Suppose that $\mathbb{P}(X_n \geq i) = 1/i$ for each $n$ and each $i \in \{1, 2, 3, \ldots\}$.
(i) Calculate
$$
\mathbb{P}\left(X_{n} \geq n^{\alpha} \text { i.o. }\right)
$$
for each fixed $\alpha>0$.
(ii) Show that the random variable
$$
\limsup_{n \rightarrow \infty} \frac{\log (X_n)}{\log (n)}
$$
is almost surely constant, and find the value of this constant.
(c) Suppose $(X_n)_{n \geq 1}$ is a sequence of independent identically distributed random variables such that
$$
\mathbb{P}(X_n = 1) = \mathbb{P}(X_n = -1) = \frac{1}{2} \quad \text{for each } n \in \mathbb{N}.
$$
Let $S_{n}=\sum_{k=1}^{n} X_{k}$, and define
$$
B_{-} = \left\{ \liminf_{n \rightarrow \infty} S_n = -\infty \right\} \quad \text{and} \quad B_{+} = \left\{ \limsup_{n \rightarrow \infty} S_n = \infty \right\}.
$$
(i) Show that both $B_{-}$ and $B_{+}$ belong to the tail $\sigma$-algebra of $(X_n)_{n \geq 1}$, and that $\mathbb{P}(B_{+}) = \mathbb{P}(B_{-}) \in \{0, 1\}$.
(ii) Using the Borel-Cantelli lemmas, show that for each $k \geq 1$
$$
\limsup_{n \rightarrow \infty} \left( S_{n+k} - S_n \right) = k \quad \text{a.s.}
$$
[Hint: Consider $A_n = \{ S_{n+k} - S_n = k \}$.]
(iii) Deduce that $\mathbb{P}\left(B_{+}^{c} \cap B_{-}^{c}\right)=0$ and hence $\mathbb{P}\left(B_{+}\right)=\mathbb{P}\left(B_{-}\right)=1$.

Proof.

(Applied bookwork: Borel-Cantelli lemmas.) Let $(A_n)_{n \geq 1}$ be a sequence of events on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$. Then:

(BC1) If $\sum_n \mathbb{P}(A_n) < \infty$ then $\mathbb{P}(\limsup_n A_n) = \mathbb{P}(A_n \text{ i.o.}) = 0$.

(BC2) If the $(A_n)_{n \geq 1}$ are also independent, then $\sum_n \mathbb{P}(A_n) = \infty$ implies $\mathbb{P}(\limsup_n A_n) = \mathbb{P}(A_n \text{ i.o.}) = 1$ (i.e. the converse of (BC1) holds under independence). (1 mark)

For (BC1) we assume that $\sum_n \mathbb{P}(A_n) < \infty$. Let $G_n = \bigcup_{m \geq n} A_m$, which is a decreasing sequence of sets with $G_n \searrow G$ where $G = \limsup_{n \rightarrow \infty} A_n$. Fix $k \in \mathbb{N}$ and observe
$$
\mathbb{P}\left(\limsup_{n \rightarrow \infty} A_n\right) = \mathbb{P}\left(\bigcap_{n \geq 1} G_n\right) \leq \mathbb{P}(G_k) \leq \sum_{n \geq k} \mathbb{P}(A_n),
$$
where the final inequality holds by $\sigma$-subadditivity. By assumption the right-hand side converges to zero as $k \rightarrow \infty$, which concludes the proof of (BC1). (2 marks)

For (BC2) we assume that $(A_n)_{n \geq 1}$ are independent and $\sum_n \mathbb{P}(A_n) = \infty$. The main idea of the proof is to take complements and use the standard bound $1 - x \leq e^{-x}$ for $x \in \mathbb{R}$. Fix $m, r \in \mathbb{N}$; then by independence
$$
\mathbb{P}\left(\bigcap_{n \geq m} A_n^c\right) \leq \mathbb{P}\left(\bigcap_{m \leq n \leq r} A_n^c\right) = \prod_{m \leq n \leq r} \mathbb{P}(A_n^c) = \prod_{m \leq n \leq r} \left(1 - \mathbb{P}(A_n)\right).
$$
Applying the bound $1-x \leq e^{-x}$, we find
$$
\mathbb{P}\left(\bigcap_{n \geq m} A_n^c\right) \leq e^{-\sum_{m \leq n \leq r} \mathbb{P}(A_n)} \rightarrow 0 \quad \text{as } r \rightarrow \infty,
$$
where the convergence follows from the assumption on $\sum_n \mathbb{P}(A_n)$. Since $\{A_n^c \text{ ev.}\} = \bigcup_m \bigcap_{n \geq m} A_n^c$ is a countable union of null sets, it follows that $\mathbb{P}(A_n \text{ i.o.}) = 1 - \mathbb{P}(A_n^c \text{ ev.}) = 1$, as required. (3 marks)
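(Not part of the marked answer.) A minimal numerical illustration of (BC2), assuming NumPy is available; the horizon $N$ and the seed are arbitrary choices. With independent events of probability $1/n$ the probability sum diverges like $\log N$, so occurrences should keep appearing at arbitrarily late indices.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10**6  # illustrative horizon

# Independent events A_n with P(A_n) = 1/n for n = 1, ..., N.
n = np.arange(1, N + 1)
occurred = rng.random(N) < 1.0 / n

# sum_n P(A_n) is the harmonic sum ~ log N, which diverges as N grows;
# (BC2) then says A_n happens infinitely often almost surely.
print("number of events occurred:", occurred.sum())    # typically close to log N
print("latest occurrence index  :", n[occurred].max()) # typically of order N
```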

(i) Apply (BC1) and (BC2) above with $A_n = \{X_n \geq n^{\alpha}\}$. Observe
$$
\sum_{n} \mathbb{P}(A_n) = \sum_{n} \frac{1}{\lceil n^{\alpha} \rceil}
\begin{cases}
= \infty & \text{if } \alpha \leq 1, \\
< \infty & \text{if } \alpha > 1.
\end{cases}
$$

So (since the $X_{n}$ are independent) the Borel-Cantelli lemmas imply
$$
\mathbb{P}\left(X_n \geq n^{\alpha} \text{ i.o.}\right) =
\begin{cases}
1 & \text{if } \alpha \leq 1, \\
0 & \text{if } \alpha > 1.
\end{cases}
$$
(3 marks)
(ii) Let $\mathcal{F}_n = \sigma(X_n)$; then $(\mathcal{F}_n)_{n \geq 1}$ is an independent sequence of $\sigma$-algebras. Let $L_n = \sup_{m \geq n} \frac{\log X_m}{\log m}$ and $L = \lim_{n \rightarrow \infty} L_n$. Since $L_k$ is $\sigma(\bigcup_{m \geq n} \mathcal{F}_m)$-measurable for every $k \geq n$, the limit $L$ is $\sigma(\bigcup_{m \geq n} \mathcal{F}_m)$-measurable for every $n$, i.e. $L$ is measurable with respect to the tail $\sigma$-algebra $\mathcal{T} = \bigcap_n \sigma(\bigcup_{m \geq n} \mathcal{F}_m)$; in particular $L$ is almost surely constant by Kolmogorov’s zero-one law. Observe $\{L \geq 1\} \supseteq \{L_n \geq 1 \text{ i.o.}\} \supseteq \{X_n \geq n \text{ i.o.}\}$, and the last event has probability one by part (i) with $\alpha = 1$. It follows that $\mathbb{P}(L \geq 1) = 1$. Now fix $k \in \mathbb{N}$,
$$
\left\{L > 1 + \tfrac{2}{k}\right\} \subseteq \left\{L \geq 1 + \tfrac{2}{k}\right\} \subseteq \bigcap_{m} \left\{L_n > 1 + \tfrac{2}{k} - \tfrac{1}{m} \text{ i.o.}\right\} \subseteq \left\{L_n > 1 + \tfrac{1}{k} \text{ i.o.}\right\}.
$$
By definition $\{L_n > 1 + \frac{1}{k} \text{ i.o.}\} \subseteq \{X_n \geq n^{1+1/k} \text{ i.o.}\}$, and by part (i) $\mathbb{P}(X_n \geq n^{1+1/k} \text{ i.o.}) = 0$. The conclusion follows, since
$$
\{L > 1\} = \bigcup_{k} \left\{L > 1 + \frac{1}{k}\right\}
$$
and a countable union of null sets is null. Hence $\mathbb{P}(L > 1) = 0$, which combined with $\mathbb{P}(L \geq 1) = 1$ gives $L = 1$ almost surely: the constant is $1$. (5 marks)
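(Illustrative only.) A short simulation sketch of part (ii): if $U$ is Uniform$(0,1)$ then $X = \lfloor 1/U \rfloor$ satisfies $\mathbb{P}(X \geq i) = \mathbb{P}(U \leq 1/i) = 1/i$, so the tail supremum of $\log X_n / \log n$ should sit near the constant $1$ identified above. NumPy and the seed are assumptions of the demo.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10**6

# Inverse-transform sampling: X = floor(1/U) has P(X >= i) = P(U <= 1/i) = 1/i.
U = rng.random(N)
X = np.floor(1.0 / U)

n = np.arange(2, N + 1)             # start at n = 2 so that log n > 0
ratio = np.log(X[1:]) / np.log(n)

# Proxy for the limsup: the supremum over the tail n > N/2.
# By Problem 1(b)(ii) this should be close to 1 for large N.
print("tail sup of log(X_n)/log(n):", ratio[N // 2 :].max())
```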

Problem 2.

(a) Suppose that $(X_n)_{n \geq 1}$ is a sequence of random variables with $|X_n| \leq Y$ for each $n$, where $Y \in \mathcal{L}^1$ (i.e. $\mathbb{E}[|Y|] < \infty$). Show that $\{X_n\}_{n \geq 1}$ is uniformly integrable.
(b) Suppose that $(X_n)_{n \geq 1}$ is a sequence of random variables converging to zero in probability. Further, suppose that there exists a constant $K \in (0, \infty)$ such that $\mathbb{E}[|X_n|^3] \leq K$ for each $n$. Show that $X_n \rightarrow 0$ in $\mathcal{L}^2$.
(You may use results about uniform integrability so long as you state them clearly)
(c) Suppose that $(X_n)_{n \geq 1}$ is a sequence of independent normally distributed random variables, with common mean $\mu$ and common variance 1. Let $Z_n = \exp(X_1 + \ldots + X_n)$.
(i) Show that $Z_{n} \rightarrow 0$ in $\mathcal{L}^{2}$ if and only if $\mu<-1$. (Hint: If $W$ is a standard normal random variable then $\mathbb{E}\left(e^{\theta W}\right)=e^{\theta^{2} / 2}$.)
(ii) Show that $Z_{n} \rightarrow 0$ in probability if $\mu<0$.
(d) Let $(X_n)_{n \geq 2}$ be a sequence of independent random variables such that for each $n \in \{2, 3, \ldots\}$
$$
\mathbb{P}(X_n = n) = \mathbb{P}(X_n = -n) = \frac{1}{2 n \log n}, \quad \text{and} \quad \mathbb{P}(X_n = 0) = 1 - \frac{1}{n \log n}.
$$
Let $S_n = X_2 + \ldots + X_n$. Show that $\frac{S_n}{n} \rightarrow 0$ in probability but not almost surely. (Hint: You may use without proof that $\sum_n \frac{1}{n \log n} = \infty$.) [TOTAL: 20]

Proof .

Answer: (Unseen example) (c)(i) Consider
$$
\mathbb{E}\left(Z_{n}^{2}\right)=\mathbb{E}\left[\exp \left(2\left(X_{1}+\ldots+X_{n}\right)\right)\right]=\left(\mathbb{E}\left[\exp \left(2\left(X_{1}-\mu\right)\right)\right]\right)^{n} e^{2 n \mu}
$$
and apply the hint with $W = X_1 - \mu$ and $\theta = 2$: $\mathbb{E}[e^{2(X_1 - \mu)}] = e^2$, so $\mathbb{E}(Z_n^2) = e^{n(2 + 2\mu)}$, which converges to $0$ if and only if $2 + 2\mu < 0$, i.e. $\mu < -1$.
(ii) Either by considering $\mathbb{E}\left[\left(\frac{1}{n}(X_1 + \ldots + X_n) - \mu\right)^2\right] = 1/n \rightarrow 0$ as $n \rightarrow \infty$ (convergence in $\mathcal{L}^2$ implies convergence in probability), or by the Weak Law of Large Numbers, $\frac{1}{n}(X_1 + \ldots + X_n)$ converges to $\mu$ in probability as $n \rightarrow \infty$. Hence $\mathbb{P}[X_1 + \ldots + X_n \geq n(\mu + \varepsilon)] \rightarrow 0$ for each $\varepsilon > 0$. Fix $\varepsilon > 0$ such that $\mu + \varepsilon < 0$ (possible since $\mu < 0$). Now $\mathbb{P}[Z_n \geq e^{n(\mu + \varepsilon)}] \rightarrow 0$ and $e^{n(\mu + \varepsilon)} \rightarrow 0$ as $n \rightarrow \infty$, which implies convergence in probability. (5 marks)

(d) Let $(X_n)_{n \geq 2}$ be a sequence of independent random variables such that for each $n \in \{2, 3, \ldots\}$
$$
\mathbb{P}(X_n = n) = \mathbb{P}(X_n = -n) = \frac{1}{2 n \log n}, \quad \text{and} \quad \mathbb{P}(X_n = 0) = 1 - \frac{1}{n \log n}.
$$
Let $S_n = X_2 + \ldots + X_n$. Show that $\frac{S_n}{n} \rightarrow 0$ in probability but not almost surely. (Hint: You may use without proof that $\sum_n \frac{1}{n \log n} = \infty$.)

Answer: (Unseen example) Fix $\varepsilon > 0$ and consider $\mathbb{P}(|S_n / n| > \varepsilon)$; since $S_n$ has mean zero we can apply Chebyshev’s inequality. Hence consider the variance of $S_n$,
$$
\begin{aligned}
\operatorname{Var}(S_n) &= \sum_{k=2}^{n} \operatorname{Var}(X_k) = \sum_{k=2}^{n} \mathbb{E}(X_k^2) = \sum_{k=2}^{n} \frac{k}{\log k} \\
&\leq \frac{(n-2) n}{\log n} + \frac{2}{\log 2},
\end{aligned}
$$
where in the final line we used that $x \mapsto x / \log x$ is increasing for $x \geq 3 > e$. So by Chebyshev’s inequality
$$
\mathbb{P}\left(\left|S_{n} / n\right|>\varepsilon\right) \leq \frac{\mathbb{E}\left(S_{n}^{2}\right)}{\varepsilon^{2} n^{2}} \leq \frac{1}{\varepsilon^{2} \log n}+\frac{2}{\varepsilon^{2} n^{2} \log 2} \rightarrow 0
$$

as $n \rightarrow \infty$, as required. For (lack of) almost sure convergence, consider $A_n = \{X_n = n\}$ and $B_n = \{X_n = -n\}$; then
$$
\sum_{n \geq 2} \mathbb{P}\left(A_{n}\right)=\sum_{n \geq 2} \mathbb{P}\left(B_{n}\right)=\sum_{n \geq 2} \frac{1}{n \log n}=\infty
$$
so by (BC2), $\mathbb{P}(A_n \text{ i.o.}) = \mathbb{P}(B_n \text{ i.o.}) = 1$. In particular $|X_n|/n = 1$ infinitely often almost surely; but if $S_n/n \rightarrow 0$ almost surely then $X_n/n = S_n/n - \frac{n-1}{n} \cdot \frac{S_{n-1}}{n-1} \rightarrow 0$ almost surely, a contradiction. Hence $S_n/n$ does not converge to zero almost surely. (7 marks)
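(Illustrative only.) A simulation sketch of part (d), assuming NumPy; the names and the seed are arbitrary. $S_N/N$ is typically small (Chebyshev gives a spread of order $1/\sqrt{\log N}$), while macroscopic jumps $|X_n| = n$ recur, which is what blocks almost sure convergence.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 10**6

# X_n = +n or -n with probability 1/(2 n log n) each, and 0 otherwise, n >= 2.
n = np.arange(2, N + 1)
p = 1.0 / (n * np.log(n))
jump = rng.random(N - 1) < p
sign = rng.choice([-1, 1], size=N - 1)
X = np.where(jump, sign * n, 0)

S = np.cumsum(X)
print("S_N / N     :", S[-1] / N)   # typically small; spread ~ 1/sqrt(log N)
print("jump indices:", n[jump])     # expected count ~ log log N: few, but unbounded
```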

Problem 3.

(a) Suppose $(X_n)_{n \geq 1}$ is a sequence of random variables and $X$ is a random variable, all on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$. Let $\mu_n, F_n$ be the law and distribution function of $X_n$ respectively, and $\mu, F$ be the law and distribution function of $X$ respectively.
(i) Suppose that $\mu_{n} \rightarrow \mu$ weakly; show that this implies $X_{n} \rightarrow X$ in distribution.
(ii) Suppose that $X_n \rightarrow X$ in distribution, and let $(a_n)_{n \geq 1}$ be a sequence of real numbers such that $a_n \rightarrow 0$ as $n \rightarrow \infty$. Show that $a_n X_n$ converges in distribution to zero.
(b) Let $\mu_{X}$ be the law of a random variable $X$. Show that the distribution of $X$ is symmetric, i.e. $\mu_{X}(-\infty, x]=\mu_{X}[-x, \infty)$ for all $x \in \mathbb{R}$, if and only if the characteristic function of $X$ is real. $[4]$
(c) In the following, identify whether the sequence of random variables $(Y_n)_{n \geq 1}$ converges weakly and if so identify the limit. (You should explain your reasoning, and state clearly any results from lectures that you use.)
(i) $Y_n = \max\{U_1, \ldots, U_n\}$ where $U_1, U_2, \ldots$ are independent Uniform$(-1,1)$ random variables.
(ii) $Y_n = n\left(1 - \max\{U_1, \ldots, U_n\}\right)$ where $U_1, U_2, \ldots$ are independent Uniform$(-1,1)$ random variables.
(iii) $Y_{n}=\sqrt{\frac{3}{n}}\left(U_{1}+\ldots+U_{n}\right)$ where $U_{1}, U_{2}, \ldots$ are independent Uniform $(-1,1)$ random variables.
(iv) $Y_n = n \min\{U_1, \ldots, U_n\}$ where $U_1, U_2, \ldots$ are independent Uniform$[0,1]$ random variables.
(v) $Y_{n}$ is an exponential random variable with mean $\lambda_{n}>0$ for each $n$, i.e. $\mathbb{P}\left(Y_{n}>y\right)=e^{-y / \lambda_{n}}$, where $\lambda_{n} \rightarrow 0$ as $n \rightarrow \infty$.

Proof.

(i) (Applied bookwork) Fix $x \in \mathbb{R}$ a continuity point of $F$, and $\delta > 0$. Define $h_x \in C_b(\mathbb{R})$ by
$$
h_x(y) =
\begin{cases}
1 & \text{if } y \leq x, \\
1 - \frac{y - x}{\delta} & \text{if } y \in (x, x + \delta), \\
0 & \text{if } y \geq x + \delta.
\end{cases}
$$
Then by assumption $\mu_{n}\left(h_{x}\right) \rightarrow \mu\left(h_{x}\right)$ as $n \rightarrow \infty$, and by construction of $h_{x}$ we have
$$
F_{n}(x) \leq \mu_{n}\left(h_{x}\right) \quad \text { and } \quad \mu\left(h_{x}\right) \leq F(x+\delta)
$$
which implies that
$$
\limsup_{n \rightarrow \infty} F_n(x) \leq \limsup_{n \rightarrow \infty} \mu_n(h_x) = \mu(h_x) \leq F(x + \delta).
$$
Now take the limit $\delta \rightarrow 0$ and use continuity of $F$ at $x$ to get $\limsup_{n \rightarrow \infty} F_n(x) \leq F(x)$. We use the same trick to get the matching lower bound on $\liminf_{n \rightarrow \infty} F_n(x)$. That is, define $g_x \in C_b(\mathbb{R})$ by
$$
g_x(y) =
\begin{cases}
1 & \text{if } y \leq x - \delta, \\
1 - \frac{y - (x - \delta)}{\delta} & \text{if } y \in (x - \delta, x), \\
0 & \text{if } y \geq x.
\end{cases}
$$
Then by the same argument,
$$
\liminf_{n \rightarrow \infty} F_n(x) \geq \liminf_{n \rightarrow \infty} \mu_n(g_x) = \mu(g_x) \geq F(x - \delta).
$$
Now take the limit $\delta \rightarrow 0$ and use continuity of $F$ at $x$ to get $\liminf_{n \rightarrow \infty} F_n(x) \geq F(x)$. It follows that $F_n(x) \rightarrow F(x)$ at every continuity point $x$ of $F$, i.e. $X_n \rightarrow X$ in distribution. (4 marks)
(ii) (Unseen) Let $Y_n = a_n X_n$, and suppose first that $a_n > 0$ for all $n$ (the remaining cases are handled similarly). Fix $\varepsilon > 0$ and a continuity point $u$ of $F$ such that $F(u) > 1 - \varepsilon$. If $x > 0$ then for all $n$ sufficiently large $x / a_n > u$ and $|F_n(u) - F(u)| < \varepsilon$. It follows that
$$
F_{Y_n}(x) = \mathbb{P}(a_n X_n \leq x) = \mathbb{P}(X_n \leq x / a_n) \geq F_n(u) > 1 - 2\varepsilon.
$$
Thus $\lim_{n \rightarrow \infty} F_{Y_n}(x) = 1$. By the same argument, for $x < 0$ we get $\lim_{n \rightarrow \infty} F_{Y_n}(x) = 0$, so $Y_n$ converges in distribution to the constant $0$.
(4 marks)
(b) Let $\mu_X$ be the law of a random variable $X$. Show that the distribution of $X$ is symmetric, i.e. $\mu_X(-\infty, x] = \mu_X[-x, \infty)$ for all $x \in \mathbb{R}$, if and only if the characteristic function of $X$ is real.

Answer: (Similar to exercise) $\varphi_X(t) = \mathbb{E}[e^{itX}] = \mathbb{E}[e^{i(-t)(-X)}] = \overline{\mathbb{E}[e^{it(-X)}]} = \overline{\varphi_{-X}(t)}$. The distribution of $X$ is symmetric if and only if $X$ and $-X$ have the same law, and by Lévy’s inversion formula (i.e. the one-to-one correspondence between probability measures on $(\mathbb{R}, \mathcal{B})$ and characteristic functions) this holds if and only if $\varphi_X = \varphi_{-X} = \overline{\varphi_X}$, i.e. if and only if $\varphi_X(t)$ is real for all $t$. (4 marks)
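(Illustrative only.) A Monte Carlo sanity check of part (b), assuming NumPy: the empirical characteristic function of a symmetric law has imaginary part near zero, while an asymmetric law (here Exp(1), whose characteristic function $1/(1 - it)$ has imaginary part $t/(1 + t^2)$) does not.

```python
import numpy as np

rng = np.random.default_rng(3)
t, M = 1.5, 10**6   # test point and sample size (arbitrary choices)

sym = rng.uniform(-1.0, 1.0, M)   # symmetric law: Uniform(-1, 1)
asym = rng.exponential(1.0, M)    # asymmetric law: Exp(1)

phi_sym = np.mean(np.exp(1j * t * sym))
phi_asym = np.mean(np.exp(1j * t * asym))

print("Im phi_U(t):", phi_sym.imag)                          # ~ 0 up to Monte Carlo error
print("Im phi_E(t):", phi_asym.imag, "vs", t / (1 + t**2))   # clearly nonzero
```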
(c) In the following, identify whether the sequence of random variables $(Y_n)_{n \geq 1}$ converges weakly and if so identify the limit. (You should explain your reasoning, and state clearly any results from lectures that you use.)

(i) $Y_n = \max\{U_1, \ldots, U_n\}$ where $U_1, U_2, \ldots$ are independent Uniform$(-1,1)$ random variables.

(i) Weak convergence to the constant 1. Check (equivalently) convergence in distribution: for $u \geq 1$, $\mathbb{P}(Y_n \leq u) = 1$; if $u \leq -1$ then $\mathbb{P}(Y_n \leq u) = 0$; and for $u \in (-1, 1)$, $\mathbb{P}(Y_n \leq u) = ((1+u)/2)^n \rightarrow 0$.
(ii) Weak convergence to an $\operatorname{Exp}(1/2)$ (rate $1/2$) random variable. Check (equivalently) convergence in distribution: for $u > 0$,
$$
\mathbb{P}(Y_n \leq u) = 1 - \mathbb{P}\left(\max\{U_1, \ldots, U_n\} < 1 - u/n\right) = 1 - \mathbb{P}(U_1 < 1 - u/n)^n = 1 - \left(\frac{2 - u/n}{2}\right)^n \rightarrow 1 - e^{-u/2}.
$$

(iii) Weak convergence to a standard normal. This follows from the Central Limit Theorem, which states that if $X_1, X_2, \ldots$ is a sequence of independent identically distributed random variables with mean $\mu$ and variance $\sigma^2 \in (0, \infty)$, then
$$
\frac{1}{\sqrt{n \sigma^2}}\left(X_1 + \ldots + X_n - n\mu\right) \stackrel{\text{weakly}}{\rightarrow} \mathcal{N}(0, 1).
$$
Here $\mu = 0$ and $\sigma^2 = \mathbb{E}[U_1^2] = \frac{1}{3}$, so $Y_n = \sqrt{3/n}\,(U_1 + \ldots + U_n) \stackrel{\text{weakly}}{\rightarrow} \mathcal{N}(0,1)$.

(iv) For $x \geq 0$,
$$
\mathbb{P}\left(n \min\{U_1, \ldots, U_n\} \leq x\right) = 1 - \mathbb{P}\left(U_1 > \frac{x}{n}, \ldots, U_n > \frac{x}{n}\right) = 1 - \left(1 - \frac{x}{n}\right)^n \rightarrow 1 - e^{-x}
$$
as $n \rightarrow \infty$. So $Y_n$ converges in distribution to an $\operatorname{Exp}(1)$ random variable.
(v) Fix $f \in C_{b}(\mathbb{R})$ then
$$
\mathbb{E} f(Y_n) = \int_0^{\infty} f(x) \frac{1}{\lambda_n} e^{-x/\lambda_n} \, dx = \int_0^{\infty} f(\lambda_n y) e^{-y} \, dy \rightarrow f(0)
$$
as $n \rightarrow \infty$, where the final limit follows from the dominated convergence theorem, since the integrand is dominated by $|f(\lambda_n y) e^{-y}| \leq (\sup |f|) e^{-y}$ and converges pointwise to $f(0) e^{-y}$ by continuity of $f$. Hence $Y_n$ converges weakly to the constant $0$.
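(Illustrative only.) A simulation sketch comparing parts (ii) and (iv) with their claimed exponential limits, assuming NumPy; the per-replicate sample size $n$ and the number of replicates $M$ are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(4)
n, M = 500, 10**4   # per-replicate sample size and number of replicates

U = rng.uniform(-1.0, 1.0, (M, n))
Y2 = n * (1.0 - U.max(axis=1))   # part (ii): limit Exp with rate 1/2
V = rng.uniform(0.0, 1.0, (M, n))
Y4 = n * V.min(axis=1)           # part (iv): limit Exp with rate 1

# Compare empirical P(Y <= 1) with the limiting CDFs evaluated at 1.
print("(ii):", (Y2 <= 1).mean(), "vs", 1 - np.exp(-0.5))
print("(iv):", (Y4 <= 1).mean(), "vs", 1 - np.exp(-1.0))
```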

Problem 4.

(a) Let $X$, $Y$ and $Z$ be integrable random variables on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, and let $\mathcal{G} \subseteq \mathcal{F}$ be a sub-$\sigma$-algebra. Show that if $Y$ and $Z$ are both versions of $\mathbb{E}[X \mid \mathcal{G}]$ then $Y = Z$ almost surely.
(b) Suppose that $(X_n)_{n \geq 1}$ is a random process adapted to a filtration $(\mathcal{F}_n)_{n \geq 1}$. Define the hitting time $\tau_A$ of $(X_n)_{n \geq 1}$ on a Borel set $A$. Show that $\tau_A$ is a stopping time for this filtration.
(c) Suppose $(X_n)_{n \geq 1}$ is a symmetric simple random walk on $\mathbb{Z}$ started from 0. Use the Optional Stopping Theorem for Bounded Stopping Times to evaluate the probability that $(X_n)_{n \geq 1}$ hits $b > 0$ before $-a < 0$.

(d) Let $(X_n)_{n \geq 1}$ be a sequence of independent random variables on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, with $\mathbb{E}[X_n] = 1$ for each $n$. Let $\mathcal{F}_0 = \{\emptyset, \Omega\}$, $\mathcal{F}_n = \sigma(X_1, \ldots, X_n)$ for $n \in \mathbb{N}$, and
$$
M_0 = 1, \quad \text{and} \quad M_n = \prod_{k=1}^{n} X_k \quad \text{for } n = 1, 2, \ldots
$$
(i) Show that $(M_n)_{n \geq 0}$ is a martingale with respect to $(\mathcal{F}_n)_{n \geq 0}$.
(ii) Now suppose $\varphi(t) = \mathbb{E}[e^{t X_1}] < \infty$ for all $t \in \mathbb{R}$. Let $S_0 = 0$, $S_n = \sum_{k=1}^{n} X_k$, and
$$
Y_n = \frac{e^{t S_n}}{\varphi(t)^n}, \quad \text{for } n = 0, 1, 2, \ldots
$$
Show, using part (i), that $(Y_n)_{n \geq 0}$ is a martingale with respect to $(\mathcal{F}_n)_{n \geq 0}$.

(e) Let $(\varepsilon_n)_{n \geq 1}$ be independent random variables with
$$
\mathbb{P}(\varepsilon_n = +1) = p, \quad \mathbb{P}(\varepsilon_n = -1) = q,
$$
where $1/2 < p = 1 - q < 1$, and let $\mathcal{F}_n = \sigma(\varepsilon_1, \ldots, \varepsilon_n)$. A gambler with fortune $X_n > 0$ at time $n$ stakes an $\mathcal{F}_n$-measurable amount $V_{n+1} \in (0, X_n)$ on the $(n+1)$th game, so that $X_{n+1} = X_n + \varepsilon_{n+1} V_{n+1}$.
(i) Show that $\mathbb{E}[\log(X_{n+1}/X_n) \mid \mathcal{F}_n] = f(V_{n+1}/X_n)$, where
$$
f(x)=p \log (1+x)+q \log (1-x)
$$
(ii) Deduce that $\left(\log \left(X_{n}\right)-n \alpha\right)_{n \geq 0}$ is a supermartingale, where
$$
\alpha=p \log p+q \log q+\log 2 \text { . }
$$

Proof.

(a) Let $X$, $Y$ and $Z$ be integrable random variables on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, and let $\mathcal{G} \subseteq \mathcal{F}$ be a sub-$\sigma$-algebra. Show that if $Y$ and $Z$ are both versions of $\mathbb{E}[X \mid \mathcal{G}]$ then $Y = Z$ almost surely.

Answer: (Applied bookwork:) Suppose $Z_1$ and $Z_2$ are both versions of $\mathbb{E}[X \mid \mathcal{G}]$, and let $A_\varepsilon = \{Z_2 - Z_1 > \varepsilon\} \in \mathcal{G}$ (because $Z_1$ and $Z_2$ are both $\mathcal{G}$-measurable, so $Z_2 - Z_1$ is). By the defining property of conditional expectation, $\mathbb{E}[X; A_\varepsilon] = \mathbb{E}[Z_1; A_\varepsilon] = \mathbb{E}[Z_2; A_\varepsilon]$. So by linearity of expectation, and construction of $A_\varepsilon$, we have $0 = \mathbb{E}[Z_2 - Z_1; A_\varepsilon] \geq \varepsilon \mathbb{P}(A_\varepsilon)$, hence $\mathbb{P}(Z_2 - Z_1 > \varepsilon) = 0$. By the same argument (symmetry), $\mathbb{P}(Z_1 - Z_2 > \varepsilon) = 0$, and since a union of null events is null, $\mathbb{P}(|Z_1 - Z_2| > \varepsilon) = 0$. Finally, by taking complements and applying monotone convergence of measures,
$$
\mathbb{P}(Z_1 = Z_2) = \mathbb{P}\left(\bigcap_{n \geq 1} \left\{|Z_1 - Z_2| \leq \frac{1}{n}\right\}\right) = \lim_{n \rightarrow \infty} \mathbb{P}\left(|Z_1 - Z_2| \leq 1/n\right) = 1.
$$
(3 marks)
(b) Suppose that $(X_n)_{n \geq 1}$ is a random process adapted to a filtration $(\mathcal{F}_n)_{n \geq 1}$. Define the hitting time $\tau_A$ of $(X_n)_{n \geq 1}$ on a Borel set $A$. Show that $\tau_A$ is a stopping time for this filtration.
$[3]$
Answer: (Applied bookwork:) $(\mathcal{F}_n)_{n \geq 1}$ is a filtration on $(\Omega, \mathcal{F}, \mathbb{P})$ if $\mathcal{F}_n \subseteq \mathcal{F}_{n+1} \subseteq \mathcal{F}$ for all $n \in \mathbb{N}$. The hitting time is defined by $\tau_A = \inf\{n \in \mathbb{N} : X_n \in A\}$. Now, $\tau_A$ is a stopping time if and only if $\{\tau_A \leq n\} \in \mathcal{F}_n$ for each $n \in \mathbb{N}$. This holds because
$$
\{\tau_A \leq n\} = \bigcup_{k \leq n} \{X_k \in A\}
$$
and since $(X_n)_{n \geq 1}$ is adapted to $(\mathcal{F}_n)_{n \geq 1}$ we have $\{X_k \in A\} \in \mathcal{F}_k \subseteq \mathcal{F}_n$. Hence $\{\tau_A \leq n\} \in \mathcal{F}_n$. (3 marks)

(c) Suppose $(X_n)_{n \geq 1}$ is a symmetric simple random walk on $\mathbb{Z}$ started from 0. Use the Optional Stopping Theorem for Bounded Stopping Times to evaluate the probability that $(X_n)_{n \geq 1}$ hits $b > 0$ before $-a < 0$.

Answer: (Seen in exercises:) Let $T = \tau_{\{-a, b\}}$. Then $T$ is not bounded, but it is almost surely finite by a Borel-Cantelli argument, and $T_k = T \wedge k$ is a bounded stopping time for each $k \in \mathbb{N}$. Since $|X_{T_k}| \leq \max(a, b)$, dominated convergence gives $\mathbb{E}[X_T] = \lim_{k \rightarrow \infty} \mathbb{E}[X_{T_k}]$, and $\mathbb{E}[X_{T_k}] = 0$ by the Optional Stopping Theorem. Let $p = \mathbb{P}(X_T = b) = 1 - \mathbb{P}(X_T = -a)$, so that $\mathbb{E}[X_T] = pb - a(1 - p) = 0$, and solving gives $p = a/(a + b)$.
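(Illustrative only.) A quick simulation sketch of part (c), assuming NumPy: estimate the probability that the walk hits $b$ before $-a$ and compare with $a/(a+b)$; the barriers and replicate count are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(5)
a, b, M = 3, 7, 10**4   # barriers and number of replicates

hits_b = 0
for _ in range(M):
    x = 0
    while -a < x < b:                    # run until the walk exits (-a, b)
        x += 2 * rng.integers(0, 2) - 1  # +-1 step with probability 1/2 each
    hits_b += (x == b)

print("estimate:", hits_b / M, "vs theory a/(a+b):", a / (a + b))
```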

(d) Let $(X_n)_{n \geq 1}$ be a sequence of independent random variables on a probability space $(\Omega, \mathcal{F}, \mathbb{P})$, with $\mathbb{E}[X_n] = 1$ for each $n$. Let $\mathcal{F}_0 = \{\emptyset, \Omega\}$, $\mathcal{F}_n = \sigma(X_1, \ldots, X_n)$ for $n \in \mathbb{N}$, and
$$
M_{0}=1, \quad \text { and } \quad M_{n}=\prod_{k=1}^{n} X_{k} \text { for } n=1,2, \ldots
$$
(i) Show that $(M_n)_{n \geq 0}$ is a martingale with respect to $(\mathcal{F}_n)_{n \geq 0}$.
(ii) Now suppose $\varphi(t) = \mathbb{E}[e^{t X_1}] < \infty$ for all $t \in \mathbb{R}$. Let $S_0 = 0$, $S_n = \sum_{k=1}^{n} X_k$, and
$$
Y_{n}=\frac{e^{t S_{n}}}{\varphi(t)^{n}}, \quad \text { for } n=0,1,2, \ldots
$$
Show, using part (i), that $(Y_n)_{n \geq 0}$ is a martingale with respect to $(\mathcal{F}_n)_{n \geq 0}$.

Answer: (Unseen example (seen similar):) (i) Fix $n \in \mathbb{N}$. Firstly, by independence and integrability of the $(X_n)_{n \geq 1}$,
$$
\mathbb{E}[|M_n|] = \prod_{k=1}^{n} \mathbb{E}[|X_k|] < \infty,
$$

and hence $M_n$ is integrable. Since $M_n$ depends only on $X_1, \ldots, X_n$, it is $\mathcal{F}_n$-measurable. Since $M_{n+1} = X_{n+1} M_n$ and $M_n$ is $\mathcal{F}_n$-measurable, we have
$$
\mathbb{E}[M_{n+1} \mid \mathcal{F}_n] = \mathbb{E}[X_{n+1} M_n \mid \mathcal{F}_n] = M_n \mathbb{E}[X_{n+1} \mid \mathcal{F}_n] = M_n \mathbb{E}[X_{n+1}] = M_n \quad \text{a.s.},
$$
where we used ‘taking out what is known’ in the second equality and independence in the third.
(ii) Using part $(i)$, since $e^{t S_{n+1}}=e^{t X_{n+1}} e^{t S_{n}}$, we have
$$
Y_0 = 1, \quad \text{and} \quad Y_n = \prod_{k=1}^{n} \frac{e^{t X_k}}{\varphi(t)} \quad \text{for } n = 1, 2, \ldots
$$
Also $\mathbb{E}[e^{t X_k} / \varphi(t)] = 1$, and $(e^{t X_n} / \varphi(t))_{n \geq 1}$ is an independent sequence. Hence the result follows from part (i) with $e^{t X_n} / \varphi(t)$ in place of $X_n$.
(4 marks)
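(Illustrative only.) A simulation sketch of part (d)(ii) under an assumed distribution (here $X_k \sim \mathcal{N}(1,1)$, so $\mathbb{E}[X_k] = 1$ and $\varphi(t) = e^{t + t^2/2}$): the martingale $Y_n = e^{t S_n}/\varphi(t)^n$ should have mean $1$ at every $n$. NumPy, the seed and the parameter choices are assumptions of the demo.

```python
import numpy as np

rng = np.random.default_rng(6)
t, n, M = 0.2, 50, 10**5   # small t keeps the Monte Carlo variance manageable

X = rng.normal(1.0, 1.0, (M, n))   # E[X] = 1; mgf phi(t) = exp(t + t^2/2)
S = X.cumsum(axis=1)
phi = np.exp(t + t**2 / 2.0)
Y = np.exp(t * S) / phi ** np.arange(1, n + 1)

# E[Y_n] = 1 for every n if (Y_n) is a martingale started at Y_0 = 1.
print("mean of Y_n at n = 1, 10, 50:", Y.mean(axis=0)[[0, 9, 49]])
```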
(e) Let $(\varepsilon_n)_{n \geq 1}$ be independent random variables with
$$
\mathbb{P}(\varepsilon_n = +1) = p, \quad \mathbb{P}(\varepsilon_n = -1) = q, \quad \text{where } 1/2 < p = 1 - q < 1,
$$
and let $X_{n+1} = X_n + \varepsilon_{n+1} V_{n+1}$ with $0 < V_{n+1} < X_n$ as in the question.
(i) Show that $\mathbb{E}[\log(X_{n+1}/X_n) \mid \mathcal{F}_n] = f(V_{n+1}/X_n)$, where
$$
f(x)=p \log (1+x)+q \log (1-x)
$$
(ii) Deduce that $(\log(X_n) - n\alpha)_{n \geq 0}$ is a supermartingale, where
$$
\alpha = p \log p + q \log q + \log 2.
$$

Answer: (Unseen example:) (i) Observe
$$
\log\left(\frac{X_{n+1}}{X_n}\right) = \log\left(\frac{X_n + V_{n+1}}{X_n}\right) \mathbb{1}_{\{\varepsilon_{n+1} = +1\}} + \log\left(\frac{X_n - V_{n+1}}{X_n}\right) \mathbb{1}_{\{\varepsilon_{n+1} = -1\}}.
$$
Also, by independence, $\mathbb{E}[\mathbb{1}_{\{\varepsilon_{n+1} = +1\}} \mid \mathcal{F}_n] = \mathbb{E}[\mathbb{1}_{\{\varepsilon_{n+1} = +1\}}] = p$ and likewise $\mathbb{E}[\mathbb{1}_{\{\varepsilon_{n+1} = -1\}} \mid \mathcal{F}_n] = q$. Taking expectation conditional on $\mathcal{F}_n$, and ‘taking out what is known’ (as $V_{n+1}/X_n$ is $\mathcal{F}_n$-measurable), we get
$$
\mathbb{E}[\log(X_{n+1} / X_n) \mid \mathcal{F}_n] = p \log\left(1 + \frac{V_{n+1}}{X_n}\right) + q \log\left(1 - \frac{V_{n+1}}{X_n}\right) = f\left(\frac{V_{n+1}}{X_n}\right).
$$
(3 marks)
(ii) By part (i),
$$
\mathbb{E}[\log(X_{n+1}) - (n+1)\alpha \mid \mathcal{F}_n] = \log(X_n) - n\alpha + f\left(\frac{V_{n+1}}{X_n}\right) - \alpha,
$$
so we have to show $f(V_{n+1}/X_n) - \alpha \leq 0$. Notice $f$ is differentiable on $(0,1)$, so to find the maximum of $f$ on $[0,1)$ consider the derivative: for each $x \in (0,1)$,
$$
f^{\prime}(x)=\frac{p}{x+1}-\frac{q}{1-x},
$$
so $f$ is increasing on $(0, p-q]$ and decreasing on $[p-q, 1)$, and hence attains its maximum at $x = p - q$, where
$$
f(p - q) = p \log(2p) + q \log(2q) = p \log p + q \log q + \log 2 = \alpha.
$$
It follows that $\left(\log \left(X_{n}\right)-n \alpha\right)_{n \geq 0}$ is a supermartingale.
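(Illustrative only.) A simulation sketch of part (e), assuming NumPy and a constant-fraction staking rule $V_{n+1} = c X_n$ (an assumption for the demo): the one-step drift of $\log X_n$ is then $f(c)$, maximised at the Kelly fraction $c = p - q$ where it equals $\alpha$, consistent with the supermartingale bound just proved.

```python
import numpy as np

rng = np.random.default_rng(7)
p = 0.6
q = 1.0 - p
alpha = p * np.log(p) + q * np.log(q) + np.log(2)

def drift(c, N=10**6):
    """Monte Carlo estimate of E[log(X_{n+1}/X_n)] for the stake V_{n+1} = c * X_n."""
    eps = np.where(rng.random(N) < p, 1.0, -1.0)  # the coin epsilon_{n+1}
    return np.log(1.0 + c * eps).mean()           # log(X_{n+1}/X_n) = log(1 + c*eps)

for c in (0.1, p - q, 0.5):
    print(f"c = {c:.2f}: drift ~ {drift(c):+.5f}  (alpha = {alpha:.5f})")
```

The drift estimates for $c \neq p - q$ should fall strictly below $\alpha$, with near-equality at $c = p - q$.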
