In what follows, $\left(B_t\right), t \geq 0$, denotes the standard Brownian motion process started at zero.

1. Let $X_t=\mu(t)+\sigma(t) B_t$, with some differentiable functions $\mu(t)$ and $\sigma(t)$, and the stock price be given by $S_t=e^{X_t}$.
(a) Derive the stochastic differential equation for $S_t$.
(b) Give the mean and the variance of $S_t$. Hint: use the moment generating function of the normal distribution.
(c) Find the conditional expectation $E\left(S_t \mid S_s\right)$ for $s<t$.
(d) Give the relation between functions $\mu$ and $\sigma$ so that the process $S_t$ is a martingale, and show why it is a martingale.
(e) Calculate the mean and the variance of the stochastic integral $\int_0^T e^{t^2+B_t} d B_t$.
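As a numerical sanity check for part (b), the sketch below (assuming, for concreteness, the hypothetical choices $\mu(t)=0.1t$ and constant $\sigma(t)=0.2$) compares Monte Carlo estimates of the mean and variance of $S_t$ against the lognormal moments obtained from the normal MGF:

```python
import numpy as np

rng = np.random.default_rng(0)
t = 1.0
mu, sig = 0.1 * t, 0.2                 # hypothetical mu(t) = 0.1 t, sigma(t) = 0.2
B = np.sqrt(t) * rng.standard_normal(1_000_000)   # B_t ~ N(0, t)
S = np.exp(mu + sig * B)                          # S_t = e^{X_t}

# X_t ~ N(mu, sig^2 t); MGF of N(m, v) at u=1 gives E S_t = exp(m + v/2)
mean_theory = np.exp(mu + sig**2 * t / 2)
var_theory = np.exp(2 * mu + sig**2 * t) * (np.exp(sig**2 * t) - 1)
print(S.mean(), mean_theory)           # Monte Carlo vs. closed form
print(S.var(), var_theory)
```

With $10^6$ samples the Monte Carlo estimates agree with the closed-form moments to about three decimal places.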
2. Let the process $\left(X_t\right), t \geq 0$, solve the SDE
$$d X_t=-X_t d t+d B_t, \quad X_0 \in \mathbb{R}$$
(a) Show that the process $Y_t=e^t X_t$ has independent Gaussian increments, and give the distribution of the increments over the time interval $[s, t]$.
(b) Show that the process $\left(X_t\right)$ has Gaussian increments, but they are not independent.
(c) State with reason whether the process $\left(X_t\right)$ is a Gaussian process and give its mean and covariance functions.
(d) Derive the conditional expectation $E\left(X_t \mid X_s\right)$ for $s<t$.
(e) Show that if $X_0$ has distribution $N(0,1 / 2)$ and is independent of the process $\left(B_t\right)$, then for any time $t, X_t$ has $N(0,1 / 2)$ distribution.
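Part (e) says that $N(0,1/2)$ is the stationary distribution of this Ornstein-Uhlenbeck process. A quick Euler-Maruyama simulation illustrates this (the step size, horizon, and path count below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, dt, n_steps = 200_000, 0.01, 500        # simulate up to t = 5
X = np.sqrt(0.5) * rng.standard_normal(n_paths)  # X_0 ~ N(0, 1/2)

# Euler-Maruyama for dX = -X dt + dB
for _ in range(n_steps):
    X += -X * dt + np.sqrt(dt) * rng.standard_normal(n_paths)

print(X.mean(), X.var())   # should stay near 0 and 1/2
```

Starting the paths in $N(0,1/2)$, the empirical mean and variance remain close to $0$ and $1/2$ at every later time, up to discretization and Monte Carlo error.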
3. Let $\sigma>0$ and $\alpha=\frac{1}{2}\left(\sigma^2-3\right)$. Define $\left(X_t\right)_{t \geq 0}$ by
$$X_t=\sigma t^{-(\alpha+1)} \int_0^t s^{\alpha+1} d B_s, \quad X_0=0 .$$
(a) Show that the Itô integral $\int_0^t s^{\alpha+1} d B_s$ is well defined.
(b) Show that the process $X_t$ is a Gaussian process and that for any $t, X_t$ has $N(0, t)$ distribution.
(c) Show that, for $t>0$, $X_t$ satisfies the SDE
$$d X_t=\frac{1-\sigma^2}{2 t} X_t d t+\sigma d B_t$$
(d) Show that the process $X$ is a Brownian motion only for $\sigma=1$ (or $\alpha=-1$ ).
(e) Give the covariance function, $\operatorname{Cov}\left(X_u, X_t\right)$.
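Parts (b) and (e) can be checked numerically by discretizing the Itô integral on a grid (the particular value $\sigma=2$, the grid, and the sample sizes below are arbitrary; the covariance is compared against $\operatorname{Cov}(X_u, X_t)=u\,(u/t)^{\alpha+1}$ for $u<t$, which follows from the Itô isometry):

```python
import numpy as np

rng = np.random.default_rng(2)
sigma = 2.0
alpha = 0.5 * (sigma**2 - 3)                 # alpha = 1/2 for this sigma
t, u = 1.0, 0.5
n_steps, n_paths = 400, 40_000
ds = t / n_steps
s = (np.arange(n_steps) + 0.5) * ds          # midpoint grid on [0, t]
dB = np.sqrt(ds) * rng.standard_normal((n_paths, n_steps))

# Discretized Ito integral: sum of s^{alpha+1} dB_s
Xt = sigma * t**(-(alpha + 1)) * (s**(alpha + 1) * dB).sum(axis=1)
k = n_steps // 2                             # increments with s < u = t/2
Xu = sigma * u**(-(alpha + 1)) * (s[:k]**(alpha + 1) * dB[:, :k]).sum(axis=1)

cov_theory = u * (u / t)**(alpha + 1)
print(Xt.var(), t)                           # Var(X_t) = t
print(np.mean(Xt * Xu), cov_theory)
```

Both the marginal variance and the cross-covariance match the closed-form expressions up to Monte Carlo error.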

4. Suppose $X_1, \ldots, X_n$ are independent and identically distributed samples from a negative binomial distribution $X \sim \operatorname{NBin}(2, p)$.
(a) Show that $T=\sum_{i=1}^n X_i$ is a sufficient statistic for $p$.
(b) Show that the MLE for $\theta=(1-p)^2$ is $\left(\frac{2 n}{T+2 n}\right)^2$.
(c) Show that the MLE for $\theta$ is biased.
(d) Show that the estimator
$$S= \begin{cases}1 & \text { if } X_1=0 \\ 0 & \text { otherwise }\end{cases}$$
is an unbiased estimator for $\theta$.
(e) Use Rao-Blackwellization to show that
$$\hat{S}=\left(\frac{2 n-1}{T+2 n-1}\right)\left(\frac{2 n-2}{T+2 n-2}\right)$$
is an improved estimator for $\theta$. Is it unbiased?
(f) Show that the negative binomial distributions for a fixed number of failures $r$ form an exponential family with respect to the sufficient statistic $T$. Use this to determine whether $\hat{S}$ is sufficient for $p$.
(g) Compute the Cramér-Rao lower bound on the variance of the estimator you constructed in part (e). Is $\hat{S}$ an efficient estimator of $\theta$ ?
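The claims in parts (b)-(e) can be probed by simulation. The sketch below assumes the parameterization in which $X$ counts failures before the second success and $p$ is the failure probability, so that $\theta=(1-p)^2=P(X_1=0)$, consistent with the stated MLE; note that NumPy's `negative_binomial` takes the *success* probability:

```python
import numpy as np

rng = np.random.default_rng(3)
p = 0.4                        # failure probability (assumption: X counts failures)
theta = (1 - p)**2             # = P(X_1 = 0), the target parameter
n, n_reps = 10, 200_000

# NumPy's convention: negative_binomial(r, q) with q = success probability
X = rng.negative_binomial(2, 1 - p, size=(n_reps, n))
T = X.sum(axis=1)                                     # sufficient statistic
S = (X[:, 0] == 0).astype(float)                      # crude unbiased estimator
S_hat = ((2*n - 1) / (T + 2*n - 1)) * ((2*n - 2) / (T + 2*n - 2))

print(S.mean(), S_hat.mean(), theta)   # both means should be close to theta
print(S.var(), S_hat.var())            # Rao-Blackwellization shrinks the variance
```

Both estimators average to $\theta$ up to Monte Carlo error, while the variance of $\hat{S}$ is substantially smaller than that of $S$, as Rao-Blackwellization guarantees.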
