Stochastic Calculus assignment help by Uprivateta
Problem 1. Feynman–Kac Formula for Multi-Dimensional Diffusion Process

Feynman-Kac Formula for Multi-Dimensional Diffusion Process. Let $(\Omega, \mathscr{F}, \mathbb{P})$ be a probability space and let $\mathbf{S}_{t}=\left(S_{t}^{(1)}, S_{t}^{(2)}, \ldots, S_{t}^{(n)}\right)$. We consider the following PDE problem:
$$
\begin{array}{l}
\frac{\partial V}{\partial t}\left(\mathbf{S}_{t}, t\right)+\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \rho_{i j} \sigma\left(S_{t}^{(i)}, t\right) \sigma\left(S_{t}^{(j)}, t\right) \frac{\partial^{2} V}{\partial S_{t}^{(i)} \partial S_{t}^{(j)}}\left(\mathbf{S}_{t}, t\right) \\
\quad+\sum_{i=1}^{n} \mu\left(S_{t}^{(i)}, t\right) \frac{\partial V}{\partial S_{t}^{(i)}}\left(\mathbf{S}_{t}, t\right)-r(t) V\left(\mathbf{S}_{t}, t\right)=0
\end{array}
$$
with boundary condition $V\left(\mathbf{S}_{T}, T\right)=\Psi\left(\mathbf{S}_{T}\right)$, where $\mu, \sigma$ are known functions of $S_{t}^{(i)}$ and $t$, and $r$ and $\Psi$ are functions of $t$ and $\mathbf{S}_{T}$, respectively. By considering $Z_{u}=e^{-\int_{t}^{u} r(v) d v} V\left(\mathbf{S}_{u}, u\right)$ for $t \leq u \leq T$, where $S_{t}^{(i)}$ satisfies the generalised SDE
$$
d S_{t}^{(i)}=\mu\left(S_{t}^{(i)}, t\right) d t+\sigma\left(S_{t}^{(i)}, t\right) d W_{t}^{(i)}
$$


such that $\left\{W_{t}^{(i)}: t \geq 0\right\}$ is a standard Wiener process and $d W_{t}^{(i)} \cdot d W_{t}^{(j)}=\rho_{i j} d t$, with $\rho_{i j} \in(-1,1)$ for $i \neq j$ and $\rho_{i j}=1$ for $i=j$, where $i, j=1,2, \ldots, n$, show that under the filtration $\mathscr{F}_{t}$ the solution of the PDE is given by
$$
V\left(\mathbf{S}_{t}, t\right)=\mathbb{E}\left[e^{-\int_{t}^{T} r(v) d v} \Psi\left(\mathbf{S}_{T}\right) \mid \mathscr{F}_{t}\right]
$$
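
Before the proof, a brief numerical aside: the correlated increments $d W_{t}^{(i)} \cdot d W_{t}^{(j)}=\rho_{i j} d t$ can be generated from independent standard normals via a Cholesky factor of the correlation matrix. The following is a minimal sketch only; the correlation matrix, step size, and sample count are illustrative assumptions, not part of the problem.

```python
import numpy as np

# Minimal sketch: generate increments dW^(i) with dW^(i) dW^(j) = rho_ij * dt
# from independent N(0, 1) draws via a Cholesky factor of the correlation matrix.
# The matrix, dt and sample count below are illustrative assumptions.
rng = np.random.default_rng(2)

dt = 1e-3
rho = np.array([[1.0, 0.5, 0.2],
                [0.5, 1.0, 0.3],
                [0.2, 0.3, 1.0]])   # symmetric positive-definite correlation matrix
L = np.linalg.cholesky(rho)         # rho = L @ L.T

Z = rng.standard_normal((1_000_000, 3))   # independent standard normals
dW = (Z @ L.T) * np.sqrt(dt)              # correlated Wiener increments

# Empirical check: the sample covariance of dW, divided by dt, should be close to rho.
print(np.cov(dW, rowvar=False) / dt)
```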

Proof.

We let $g(u)=e^{-\int_{t}^{u} r(v) d v}$ and set
$$
Z_{u}=g(u) V\left(\mathbf{S}_{u}, u\right).
$$
By applying Taylor's expansion and subsequently Itō's formula to $Z_{u}$, we have
$$
d Z_{u}=\frac{\partial Z_{u}}{\partial u} d u+\sum_{i=1}^{n} \frac{\partial Z_{u}}{\partial S_{u}^{(i)}} d S_{u}^{(i)}+\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \frac{\partial^{2} Z_{u}}{\partial S_{u}^{(i)} \partial S_{u}^{(j)}} d S_{u}^{(i)} d S_{u}^{(j)}+\ldots
$$

\begin{equation}
\begin{aligned}
=&\left(g(u) \frac{\partial V}{\partial u}+V\left(\mathbf{S}_{u}, u\right) \frac{\partial g}{\partial u}\right) d u+\sum_{i=1}^{n}\left(g(u) \frac{\partial V}{\partial S_{u}^{(i)}}\right) d S_{u}^{(i)} \\
&+\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}\left(g(u) \frac{\partial^{2} V}{\partial S_{u}^{(i)} \partial S_{u}^{(j)}}\right) d S_{u}^{(i)} d S_{u}^{(j)} \\
=&\left(g(u) \frac{\partial V}{\partial u}-r(u) g(u) V\left(\mathbf{S}_{u}, u\right)\right) d u \\
&+\sum_{i=1}^{n}\left(g(u) \frac{\partial V}{\partial S_{u}^{(i)}}\right)\left(\mu\left(S_{u}^{(i)}, u\right) d u+\sigma\left(S_{u}^{(i)}, u\right) d W_{u}^{(i)}\right) \\
&+\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n}\left(g(u) \frac{\partial^{2} V}{\partial S_{u}^{(i)} \partial S_{u}^{(j)}}\right) \rho_{i j} \sigma\left(S_{u}^{(i)}, u\right) \sigma\left(S_{u}^{(j)}, u\right) d u \\
=&\, g(u)\left(\frac{\partial V}{\partial u}\left(\mathbf{S}_{u}, u\right)+\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \rho_{i j} \sigma\left(S_{u}^{(i)}, u\right) \sigma\left(S_{u}^{(j)}, u\right) \frac{\partial^{2} V}{\partial S_{u}^{(i)} \partial S_{u}^{(j)}}\left(\mathbf{S}_{u}, u\right)\right. \\
&\left.\quad+\sum_{i=1}^{n} \mu\left(S_{u}^{(i)}, u\right) \frac{\partial V}{\partial S_{u}^{(i)}}\left(\mathbf{S}_{u}, u\right)-r(u) V\left(\mathbf{S}_{u}, u\right)\right) d u \\
&+g(u) \sum_{i=1}^{n} \sigma\left(S_{u}^{(i)}, u\right) \frac{\partial V}{\partial S_{u}^{(i)}} d W_{u}^{(i)} \\
=&\, g(u) \sum_{i=1}^{n} \sigma\left(S_{u}^{(i)}, u\right) \frac{\partial V}{\partial S_{u}^{(i)}} d W_{u}^{(i)}
\end{aligned}
\end{equation}

since
$$
\begin{array}{l}
\frac{\partial V}{\partial u}\left(\mathbf{S}_{u}, u\right)+\frac{1}{2} \sum_{i=1}^{n} \sum_{j=1}^{n} \rho_{i j} \sigma\left(S_{u}^{(i)}, u\right) \sigma\left(S_{u}^{(j)}, u\right) \frac{\partial^{2} V}{\partial S_{u}^{(i)} \partial S_{u}^{(j)}}\left(\mathbf{S}_{u}, u\right) \\
\quad+\sum_{i=1}^{n} \mu\left(S_{u}^{(i)}, u\right) \frac{\partial V}{\partial S_{u}^{(i)}}\left(\mathbf{S}_{u}, u\right)-r(u) V\left(\mathbf{S}_{u}, u\right)=0
\end{array}
$$
By integrating both sides of $d Z_{u}$ from $t$ to $T$, we have
$$
\begin{aligned}
\int_{t}^{T} d Z_{u} &=\sum_{i=1}^{n}\left\{\int_{t}^{T} g(u) \sigma\left(S_{u}^{(i)}, u\right) \frac{\partial V}{\partial S_{u}^{(i)}} d W_{u}^{(i)}\right\} \\
Z_{T}-Z_{t} &=\sum_{i=1}^{n}\left\{\int_{t}^{T} e^{-\int_{t}^{u} r(v) d v} \sigma\left(S_{u}^{(i)}, u\right) \frac{\partial V}{\partial S_{u}^{(i)}} d W_{u}^{(i)}\right\}
\end{aligned}
$$
Taking conditional expectations with respect to $\mathscr{F}_{t}$ and using the fact that the Itō integral has zero conditional expectation,
$$
\mathbb{E}\left(Z_{T}-Z_{t} \mid \mathscr{F}_{t}\right)=0 \quad \text { or } \quad Z_{t}=\mathbb{E}\left(Z_{T} \mid \mathscr{F}_{t}\right).
$$
Finally, since $Z_{t}=V\left(\mathbf{S}_{t}, t\right)$ and $Z_{T}=e^{-\int_{t}^{T} r(v) d v} V\left(\mathbf{S}_{T}, T\right)=e^{-\int_{t}^{T} r(v) d v} \Psi\left(\mathbf{S}_{T}\right)$, we obtain
$$
V\left(\mathbf{S}_{t}, t\right)=\mathbb{E}\left[e^{-\int_{t}^{T} r(v) d v} \Psi\left(\mathbf{S}_{T}\right) \mid \mathscr{F}_{t}\right].
$$

Problem 2.

Novikov's Condition II. Let $\left\{W_{t}: t \geq 0\right\}$ be a $\mathbb{P}$-standard Wiener process on the probability space $(\Omega, \mathscr{F}, \mathbb{P})$ and let $\theta_{t}$ be an adapted process, $0 \leq t \leq T$. By considering
$$
Z_{t}=e^{-\int_{0}^{t} \theta_{s} d W_{s}-\frac{1}{2} \int_{0}^{t} \theta_{s}^{2} d s}
$$
and if
$$
\mathbb{E}^{\mathbb{P}}\left(e^{\frac{1}{2} \int_{0}^{T} \theta_{t}^{2} d t}\right)<\infty
$$
then, using Itō's formula, show that $Z_{t}$ is a positive $\mathbb{P}$-martingale for $0 \leq t \leq T$.

Proof.

Let $Z_{t}=f\left(X_{t}\right)=e^{X_{t}}$ where $X_{t}=-\int_{0}^{t} \theta_{s} d W_{s}-\frac{1}{2} \int_{0}^{t} \theta_{s}^{2} d s$ and by applying
Taylor’s expansion and subsequently Itō’s formula,
$$
\begin{aligned}
d Z_{t} &=\frac{\partial f}{\partial X_{t}} d X_{t}+\frac{1}{2} \frac{\partial^{2} f}{\partial X_{t}^{2}}\left(d X_{t}\right)^{2}+\ldots \\
&=e^{X_{t}}\left(-\theta_{t} d W_{t}-\frac{1}{2} \theta_{t}^{2} d t\right)+\frac{1}{2} e^{X_{t}} \theta_{t}^{2} d t \\
&=-\theta_{t} Z_{t} d W_{t} .
\end{aligned}
$$
Integrating both sides of the equation from $u$ to $t$, where $u<t$, we have
$$
Z_{t}-Z_{u}=-\int_{u}^{t} \theta_{s} Z_{s} d W_{s}.
$$
Taking conditional expectations with respect to $\mathscr{F}_{u}$, and because the stochastic integral satisfies $\mathbb{E}^{\mathbb{P}}\left(\int_{u}^{t} \theta_{s} Z_{s} d W_{s} \mid \mathscr{F}_{u}\right)=0$ under the given condition, we have
$$
\mathbb{E}^{\mathbb{P}}\left(Z_{t} \mid \mathscr{F}_{u}\right)=Z_{u}.
$$
Using the same steps as described in Problem 4.2.2.1 (page 194), we can also show that $\mathbb{E}^{\mathbb{P}}\left(\left|Z_{t}\right|\right)<\infty$. Since $Z_{t}$ is $\mathscr{F}_{t}$-adapted and $Z_{t}>0$, we have shown that $\left\{Z_{t}: 0 \leq t \leq T\right\}$ is a positive $\mathbb{P}$-martingale.
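
As a quick sanity check of the martingale property, the sketch below simulates $Z_{T}$ on a time grid and verifies that its sample mean stays close to $Z_{0}=1$. The integrand $\theta_{t}=\sin \left(W_{t}\right)$ is an illustrative bounded adapted choice (so Novikov's condition clearly holds), not part of the problem.

```python
import numpy as np

# Minimal sketch checking E[Z_T] = Z_0 = 1 numerically, with an illustrative
# bounded adapted integrand theta_t = sin(W_t) (an assumption; any bounded
# theta satisfies Novikov's condition).
rng = np.random.default_rng(1)

T, n_steps, n_paths = 1.0, 500, 200_000
dt = T / n_steps

W = np.zeros(n_paths)
log_Z = np.zeros(n_paths)  # accumulates -int theta dW - 0.5 int theta^2 dt

for _ in range(n_steps):
    theta = np.sin(W)  # adapted: uses only the path of W up to the current time
    dW = rng.standard_normal(n_paths) * np.sqrt(dt)
    log_Z += -theta * dW - 0.5 * theta**2 * dt
    W += dW

Z_T = np.exp(log_Z)  # positive by construction
print(f"sample mean of Z_T: {Z_T.mean():.4f} (should be close to 1)")
```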

Problem 3.

Pure Birth Process. Let $(\Omega, \mathscr{F}, \mathbb{P})$ be a probability space. If $\left\{N_{t}: t \geq 0\right\}$ is a Poisson process with intensity $\lambda>0$ then for small $h>0$ and $k \in \mathbb{N}$, show that it satisfies the following property:
$$
\mathbb{P}\left(N_{t+h}=k+j \mid N_{t}=k\right)=\left\{\begin{array}{ll}
1-\lambda h+o(h) & j=0 \\
\lambda h+o(h) & j=1 \\
o(h) & j>1
\end{array}\right.
$$

Proof.

We first consider $j=0$ where
$$
\begin{aligned}
\mathbb{P}\left(N_{t+h}=k \mid N_{t}=k\right) &=\frac{\mathbb{P}\left(N_{t+h}=k, N_{t}=k\right)}{\mathbb{P}\left(N_{t}=k\right)} \\
&=\frac{\mathbb{P}\left(\text { zero arrival between }(t, t+h], N_{t}=k\right)}{\mathbb{P}\left(N_{t}=k\right)} .
\end{aligned}
$$
Since the event $\left\{N_{t}=k\right\}$ depends only on arrivals during the time interval $[0, t]$ and the event $\{$ zero arrival between $(t, t+h]\}$ depends only on arrivals after time $t$, the two events are independent by the independent-increments property, and because $N_{t}$ also has stationary increments,
$$
\begin{aligned}
\mathbb{P}\left(N_{t+h}=k \mid N_{t}=k\right) &=\frac{\mathbb{P}(\text { zero arrival between }(t, t+h]) \mathbb{P}\left(N_{t}=k\right)}{\mathbb{P}\left(N_{t}=k\right)} \\
&=\mathbb{P}(\text { zero arrival between }(t, t+h]) \\
&=\mathbb{P}\left(N_{t+h}-N_{t}=0\right) \\
&=e^{-\lambda h} \\
&=1-\lambda h+\frac{(\lambda h)^{2}}{2 !}-\frac{(\lambda h)^{3}}{3 !}+\ldots \\
&=1-\lambda h+o(h) .
\end{aligned}
$$
Similarly,
$$
\begin{aligned}
\mathbb{P}\left(N_{t+h}=k+1 \mid N_{t}=k\right) &=\mathbb{P}(\text { one arrival between }(t, t+h]) \\
&=\mathbb{P}\left(N_{t+h}-N_{t}=1\right) \\
&=\lambda h e^{-\lambda h} \\
&=\lambda h\left(1-\lambda h+\frac{(\lambda h)^{2}}{2 !}-\frac{(\lambda h)^{3}}{3 !}+\ldots\right) \\
&=\lambda h+o(h)
\end{aligned}
$$
and finally
$$
\mathbb{P}\left(N_{t+h}>k+1 \mid N_{t}=k\right)=1-\mathbb{P}\left(N_{t+h}=k \mid N_{t}=k\right)-\mathbb{P}\left(N_{t+h}=k+1 \mid N_{t}=k\right)
$$
$$
\begin{array}{l}
=1-(1-\lambda h+o(h))-(\lambda h+o(h)) \\
=o(h) .
\end{array}
$$
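
These estimates can also be checked numerically from the increment distribution $\mathbb{P}\left(N_{t+h}-N_{t}=j\right)=e^{-\lambda h}(\lambda h)^{j} / j !$. In the sketch below the intensity is an illustrative assumption; dividing each remainder by $h$ and letting $h \rightarrow 0$ makes the $o(h)$ behaviour visible.

```python
import math

# Numerical check of the small-h transition probabilities of a Poisson process,
# using P(N_{t+h} - N_t = j) = exp(-lam*h) * (lam*h)**j / j!.
lam = 2.0  # illustrative intensity (an assumption)

for h in [0.1, 0.01, 0.001]:
    p0 = math.exp(-lam * h)              # j = 0
    p1 = lam * h * math.exp(-lam * h)    # j = 1
    p_more = 1.0 - p0 - p1               # j > 1
    # Each remainder divided by h should shrink as h -> 0, i.e. the remainder is o(h).
    print(f"h={h}: (p0-(1-lam*h))/h={(p0 - (1 - lam*h))/h:+.4f}, "
          f"(p1-lam*h)/h={(p1 - lam*h)/h:+.4f}, p_more/h={p_more/h:.4f}")
```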


Stochastic Calculus

2020-2021
Lecturer(s): Prof. Hanqing Jin
Course Term: Michaelmas
Course Overview:

Stochastic differential equations are used to model the behaviour of financial assets, and stochastic calculus is the fundamental tool for understanding and manipulating these models. This course will give an introduction to the main ideas in stochastic calculus that will be used throughout the MSc programme.

Course Syllabus:

Introduction to Brownian motion, continuous martingales and their properties, distribution of first hitting times, maximum and minimum, for Brownian motion. Ito’s calculus: quadratic variation, stochastic
integrals, Ito’s formula, exponential martingales, Girsanov’s theorem, the martingale representation theorem. Stochastic differential equations: weak and strong solutions, existence and uniqueness of solutions.

18.676. Stochastic Calculus.

Spring 2021, MW 11:00-12.30 (virtual).

All announcements and course materials will be posted on the 18.676 Canvas page. To attend lectures, go to the Zoom section on the Canvas page, and click Join. Some general course information is below.

Please see also the spring 2020 course page which contains a summary of lectures.

Prerequisite: 18.675.

ADMIN.
Instructor: Nike Sun (nsun at ##).
TA: Matthew Nicoletti (mnicolet at ##).
## = mit dot edu. Please include “18.676” in the subject line of all emails.
All office hours will be announced on Canvas (see the calendar).

TEXTBOOKS. References marked * are available for free electronically through libraries.mit.edu.
Main reference for this class:
*[online] J.-F. Le Gall, Brownian Motion, Martingales, and Stochastic Calculus. Springer, 2016.
Additional references for stochastic calculus:
*[online] I. Karatzas and S. E. Shreve, Brownian Motion and Stochastic Calculus, Springer, 1998.
*[online] D. Revuz and M. Yor, Continuous Martingales and Brownian Motion, Springer, 1999.
*[online] J. M. Steele, Stochastic Calculus and Financial Applications. Springer, 2001.
*[online] D. Stroock, Elements of Stochastic Calculus and Analysis. Springer, 2018.
[online] G. Lawler, Stochastic Calculus: An Introduction with Applications (book draft).
[online] N. Berestycki, lecture notes for stochastic calculus.
[online] D. Stroock, lecture notes for 18.676, compiled by Sinho Chewi.
[online] J. Pitman and M. Yor, “A guide to Brownian motion and related stochastic processes.”
Additional references for general probability and analysis:
*[online] E. H. Lieb and M. Loss, Analysis, AMS, 2001.
*[online] R. Durrett, Probability: Theory and Examples, Cambridge UP, 2019.
[online] A. Dembo, lecture notes for Stanford Math 230 / Stat 310.
[online] D. Aldous, lecture notes for Berkeley Math 218A / Stat 205A, compiled by Sinho Chewi.
*[online] O. Kallenberg, Foundations of Modern Probability, 2nd ed., Springer, 2002.

GRADING. Homework (60%), exam 1 (18%), exam 2 (18%), class participation (4%).
To find a study group, see psetpartners.mit.edu.

KEY DATES.
04/07 (Wednesday) exam 1 will be held online, 6:00pm to 8:30pm Boston time.
05/19 (Wednesday) exam 2 will be held online, 6:00pm to 8:30pm Boston time.


Course Syllabus: Stochastic Processes – AMCS 241

Division: Computer, Electrical and Mathematical Sciences & Engineering
Course Number: AMCS 241
Course Title: Stochastic Processes
Academic Semester: Spring
Academic Year: 2021
Semester Start Date: 01/24/2021
Semester End Date: 05/11/2021
Class Schedule (Days & Time): 08:30 AM – 11:30 AM | Sun
Instructor: Mohamed-Slim Alouini, [email protected], +966128080283, Office 4301, Building 1. Office hours: by appointment only.
Teaching Assistant: Jia Ye, [email protected]
Course Information
This course presents the fundamentals of probability theory and stochastic processes. Contents
of this course are relevant to several disciplines including statistics, communications and
information systems, computer engineering, signal processing, machine learning,
bioinformatics, econometrics, and mathematical finance.
Comprehensive Course Description
Contents:
I- Review of Probability theory
Introduction and basic probability; Discrete Random Variables; Continuous Random
Variables; Multivariate Distributions; Moment-generating functions and characteristic
functions; Inequalities and bounds for random variables.
II- Introduction to Random Processes
Introduction and basic concepts; stationarity and ergodicity; covariance functions; Poisson
processes; Gaussian processes; Spectral representations; Branching processes; Linear filters;
Integration and differentiation of stochastic processes; ARMA models; Markov chains; Queuing
theory.
Course Description from Program Guide:
Prerequisites: Advanced and multivariate calculus. Introduction to probability and random
processes. Topics include probability axioms, sigma algebras, random vectors, expectation,
probability distributions and densities, Poisson and Wiener processes, stationary processes,
autocorrelation, spectral density, effects of filtering, linear least-squares estimation and
convergence of random sequences.
Goals and Objectives:
AMCS 241 is an introductory graduate course. Students will learn the fundamentals of
probability theory and stochastic processes. The main goal is for the students to thoroughly
understand the covered topics and be able to apply them. The course prepares the students for
more advanced and specialized courses.
Required Knowledge:
Prerequisites: Adequate background in basic probability (including random variables and
distributions), linear algebra, multivariate calculus, Fourier transform, z-transform and Laplace
transform. Students should be competent in writing rigorous proofs and should understand terms
such as ‘if and only if’, sufficient conditions, necessary conditions, etc.
Important note: The course may be time-demanding, especially for those without the required
background and mathematical fluency. A student who has deficiencies in his or her background
can still take the course. However, the price will be more time spent on the course, or less
acquired knowledge, or both.
This course involves tutorial sessions and significant self-study and reading from handouts and
references.
Textbooks:
Probability, Random Processes, and Statistical Analysis by H. Kobayashi, B. L. Mark and W.
Turin.
Stationary stochastic processes for scientists and engineers by G. Lindgren, H. Rootzén and M.
Sandsten.
References:
Geoffrey Grimmett and David Stirzaker, Probability and Random Processes, 3rd Edition,
Oxford University press, 2001. (The exercises in the book are solved in another book by the
authors titled One Thousand Exercises in Probability. This book has numerous solved exercises
covering all the topics we do in AMCS 241.)
Reference Texts:
S. Kay, Intuitive Probability and Random Processes using Matlab, Springer, 2006.
Gallager, Stochastic Processes: Theory for Applications, Cambridge University Press, 2014.
Sheldon Ross, Stochastic Processes, 2nd Edition, John Wiley & Sons, 1996.
Papoulis, Probability, Random Variables, and Stochastic Processes, 4th Edition, McGraw-Hill,
2002.
Leon-Garcia, Probability, Statistics, and Random Processes for Electrical Engineering, 3rd
Edition, Prentice-Hall, 2008.
Advanced References:
Patrick Billingsley, Probability and Measure
http://www.colorado.edu/amath/sites/default/files/attached-files/billingsley.pdf
Rick Durrett, Probability: Theory and Examples
https://www.cambridge.org/core/books/probability/81949AABAA8B3A8411CB88402F0F05C9
David Williams, Probability with Martingales
https://www.cambridge.org/core/books/probability-with-martingales/B4CFCE0D08930FB46C6E93E775503926
René L. Schilling, Measures, Integrals and Martingales
https://www.cambridge.org/core/books/measures-integrals-and-martingales/7BEE19069C88A1376AEB988487D4131C
Method of Evaluation:
15.00% – Homework /Assignments
35.00% – Midterm exam
50.00% – Final exam
Nature of the Assignments:
The midterm exam and the final exam are closed-book and closed-notes.
Homework sets will be assigned weekly. Some homework assignments may require use of
mathematical software such as Matlab or R for calculations and/or plots.
Course Policies:
• All exams are required. Students who do not show up for an exam should expect a grade of zero on that exam.

• If you dispute your grade on any homework or exam, you may request a re-grade (from the TA for the homework or from the instructor for the exams) only within 48 hours of receiving the graded work.
• An Incomplete (I) grade for the course will only be given under extraordinary circumstances such as sickness, and these extraordinary circumstances must be verifiable. The assignment of an (I) requires first an approval of the dean and then a written agreement between the instructor and student specifying the time and manner in which the student will complete the course requirements.

Additional Information
Tentative Course Schedule (time, topic/emphasis & resources)

Week  Lectures        Topic
1     Sun 01/24/2021  Introduction and basic probability theory
2     Sun 01/31/2021  Discrete random variables
3     Sun 02/07/2021  Continuous and mixed random variables
4     Sun 02/14/2021  Simulation and generation of random variables + inequalities
5     Sun 02/21/2021  Two random variables
6     Sun 02/28/2021  Multiple random variables and multiple Gaussian random variables
7     Sun 03/07/2021  Sum of random variables
8     Sun 03/14/2021  Midterm exam
9     Sun 03/21/2021  Introduction to stochastic processes
10    Sun 03/28/2021  Stationarity and ergodicity of stochastic processes
11    Sun 04/04/2021  Integration and differentiation of stochastic processes
12    Sun 04/11/2021  Spectral representation + filtering
13    Sun 04/18/2021  Poisson processes, birth and death processes, and renewal processes
14    Sun 04/25/2021  Gaussian processes
15    Sun 05/02/2021  Queuing theory and branching processes
16    Sun 05/09/2021  Final exam

Note: The instructor reserves the right to make changes to this syllabus as necessary.