Sept. 3

  • A stochastic process is a collection of random variables $X_t$ indexed by time, where $X_t$ represents the state of a system at time $t$
  • Conditional expectation is the best guess for a random variable given some (but maybe not all) information
    • Define $A_1, A_2, …, A_n$ as a partition of the probability space; the conditional expectation can be written as $E(Y \vert A_j) = \frac{E(Y \cdot 1_{A_j})}{P(A_j)}$
    • Write $\mathcal{F}_n$ for the information generated by the partition $(A_1, A_2, \ldots, A_n)$; a random variable $Y$ is $\mathcal{F}_n$-measurable if its value is determined by which $A_j$ occurs
  • Definition: If $\mathbb{E}[\vert Y \vert] < \infty$, then $E[Y \vert \mathcal{F}_n]$ is the random variable satisfying:
    • $E[Y \vert \mathcal{F}_n]$ is $\mathcal{F}_n$-measurable; that is, it is determined by the values of $X_1, \ldots, X_n$
    • If $A$ is an $\mathcal{F}_n$-measurable event, then $\mathbb{E}[Y \cdot 1_A] = \mathbb{E}[E[Y \vert \mathcal{F}_n]\cdot 1_A]$
  • Properties of conditional expectation
    • Linearity: $E[aY + bZ \vert \mathcal{F}_n] = aE[Y \vert \mathcal{F}_n] + bE[Z \vert \mathcal{F}_n]$
    • Law of Total Expectation: $\mathbb{E}[E[Y \vert \mathcal{F}_n]] = \mathbb{E}[Y]$
    • If $Y$ is $\mathcal{F}_n$-measurable, then $E[Y \vert \mathcal{F}_n] = Y$
    • If $Y$ is independent of $X_1, \ldots, X_n$, then $E[Y \vert \mathcal{F}_n] = E[Y]$
    • Tower or Projection Property: If $m<n$, ${E[E(Y \vert \mathcal{F}_n) \vert \mathcal{F}_m] = E[Y \vert \mathcal{F}_m]}$
    • If $Z$ is $\mathcal{F}_n$-measurable, then $E[YZ \vert \mathcal{F}_n] = ZE[Y\vert \mathcal{F}_n]$
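These properties are easy to check by simulation on a finite partition. Below is a minimal Python sketch (the setup, variable names, and parameters are illustrative, not from the notes): it estimates $E[Y \vert A_j] = E[Y\cdot 1_{A_j}]/P(A_j)$ on the two-set partition induced by a die roll and confirms the law of total expectation numerically.

```python
import random

# A minimal sketch: conditional expectation on a finite partition.
# Partition by a die roll X: A1 = {X <= 3}, A2 = {X >= 4};
# Y = X + independent noise, so E[Y | A_j] = E[X | A_j].
random.seed(0)
N = 100_000
samples = [(x, x + random.gauss(0, 1))
           for x in (random.randint(1, 6) for _ in range(N))]

def cond_exp(event):
    """Estimate E[Y | A] = E[Y * 1_A] / P(A) from the samples."""
    hits = [y for x, y in samples if event(x)]
    return sum(hits) / len(hits)

e1 = cond_exp(lambda x: x <= 3)                 # ~ E[X | X <= 3] = 2
e2 = cond_exp(lambda x: x >= 4)                 # ~ E[X | X >= 4] = 5
p1 = sum(1 for x, _ in samples if x <= 3) / N

# Law of total expectation: E[E[Y | F]] = E[Y] ~ 3.5
print(e1, e2, p1 * e1 + (1 - p1) * e2)
```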

Examples

  • We can use the properties to show that $S_n^2 - n\sigma^2$ is a martingale
    • $X_1, \ldots, X_n$ are i.i.d. and $E[X_j] = 0$
    • $Var(X_j) = E(X_j^2) = \sigma^2$
    • Let $S_n = X_1 + \ldots + X_n$ and $m\leq n$ \begin{align*} E[S^2_n \vert \mathcal{F}_m] &= E[(S_m + S_n - S_m)^2 \vert \mathcal{F}_m] \\ &= E[S_m^2 \vert \mathcal{F}_m] + 2E[S_m(S_n - S_m)\vert \mathcal{F}_m] + E[(S_n - S_m)^2 \vert \mathcal{F}_m]\\ &= S_m^2 + 2S_m\mathbb{E}[S_n - S_m] +\mathbb{E}[(S_n - S_m)^2]\\ &= S_m^2 + Var(S_n-S_m) = S_m^2 + (n-m)\sigma^2 \end{align*}
    • Hence $E[S_n^2 - n\sigma^2 \vert \mathcal{F}_m] = S_m^2 - m\sigma^2$, so $S_n^2 - n\sigma^2$ is a martingale
  • If instead $E[X_j] = \mu$ (dropping the mean-zero assumption), we still have $E[X_1 \vert S_n] = \frac{S_n}{n}$
    • Reasoning: Since $X_1, \ldots, X_n$ are i.i.d., $E[X_1 \vert S_n] = E[X_2 \vert S_n] = \ldots = E[X_n \vert S_n]$
    • $E[X_1 \vert S_n] + E[X_2 \vert S_n] + \ldots + E[X_n \vert S_n] = E[S_n \vert S_n] = S_n$, so each term must equal $\frac{S_n}{n}$
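The symmetry argument can be checked with a short simulation (Python; the step distribution and parameters are my own illustrative choices): group samples by the observed value of $S_n$ and average $X_1$ within each group.

```python
import random
from collections import defaultdict

# Check E[X_1 | S_n] = S_n / n for i.i.d. steps X_j in {-1, +1}.
random.seed(1)
n, trials = 10, 200_000
by_sum = defaultdict(list)              # X_1 values, grouped by S_n
for _ in range(trials):
    xs = [random.choice((-1, 1)) for _ in range(n)]
    by_sum[sum(xs)].append(xs[0])

for s in sorted(by_sum):
    est = sum(by_sum[s]) / len(by_sum[s])
    print(f"S_n = {s:+d}: average X_1 = {est:+.3f}, predicted S_n/n = {s/n:+.3f}")
```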

Sept. 4

  • Simple (symmetric) random walk: $S_n = X_1 + \ldots + X_n$ is the position after $n$ steps, where $P\{X_j = 1\} = P\{X_j = -1\} = \frac{1}{2}$ and $S_0 = 0$

Basic Facts

  • $E[S_n] = E[X_1] + \ldots + E[X_n] = 0$
  • $Var[X_j] = E[X_j^2] = 1$
  • $Var[S_n] = Var[X_1] + \ldots + Var[X_n] = n$
  • $E[S_n^2] = n$
  • $\frac{S_n}{\sqrt{n}}$ is approximately $N(0,1)$ for large $n$ (central limit theorem)
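A quick simulation (Python; the parameters are arbitrary choices of mine) confirms these basic facts: the empirical mean of $S_n$ is near $0$, its variance is near $n$, and $S_n/\sqrt{n}$ has variance near $1$.

```python
import random
import statistics

# Empirical check of E[S_n] = 0 and Var[S_n] = n for the symmetric walk.
random.seed(2)
n, trials = 400, 20_000
endpoints = [sum(random.choice((-1, 1)) for _ in range(n))
             for _ in range(trials)]

print("E[S_n]           ~", statistics.fmean(endpoints))     # ~ 0
print("Var[S_n]         ~", statistics.variance(endpoints))  # ~ n = 400
scaled = [s / n ** 0.5 for s in endpoints]
print("Var[S_n/sqrt(n)] ~", statistics.variance(scaled))     # ~ 1
```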

Questions

  • After $n$ steps, where is the walker?
  • What is the probability of being at $0$ at time $n$?
    • If $n$ is odd, then $S_n$ cannot be equal to $0$
    • Let the number of steps be represented by $2n$
    • $P\{S_{2n}=0\} = {2n\choose n} \left(\frac{1}{2}\right)^n\left(\frac{1}{2}\right)^n = \frac{(2n)!}{n!\,n!}\left(\frac{1}{2}\right)^{2n}$
    • Stirling’s Formula: As $n\rightarrow\infty$, $n! \approx \sqrt{2\pi}n^{n+\frac{1}{2}}e^{-n}$
    • Applying Stirling's formula, $P\{S_{2n}=0\} \approx \frac{1}{\sqrt{\pi n}}$ (checked numerically in the first sketch after this list)
  • If $a,b > 0$, what is the probability to reach $b$ before $-a$?
    • Let $f(x)$ be the probability to reach $b$ before $-a$ starting at $x$
    • Base cases: $f(b) = 1$, $f(-a) = 0$
    • $f(x) = \frac{1}{2}f(x+1) + \frac{1}{2}f(x-1), -a<x<b$
    • Answer: $\frac{a}{a+b}$; the recurrence forces $f$ to be linear, so $f(x) = \frac{x+a}{a+b}$ and $f(0) = \frac{a}{a+b}$ (see the second sketch after this list)
    • Also known as gambler’s ruin where the probability of starting with $x$ and reaching $N$ dollars before losing it all is $\frac{x}{N}$
  • Does the random walk return to $0$ infinitely often?
    • Let $V$ be the number of visits to the origin; $V=\sum^\infty_{n=0}\mathbb{I}\{S_{2n}=0\}$
    • $E[V] = \sum^\infty_{n=0}E[\mathbb{I}\{S_{2n}=0\}] = \sum^\infty_{n=0}P\{S_{2n}=0\} = \infty$, since $P\{S_{2n}=0\}\approx\frac{1}{\sqrt{\pi n}}$ and $\sum_n \frac{1}{\sqrt{\pi n}}$ diverges
    • Let $q$ be the probability that the walk never returns to 0
    • $P\{V=1\} = q$, $P\{V=2\} = (1-q)q$, and in general $P\{V=n\} = (1-q)^{n-1}q$
      • $E(V) = \frac{1}{q}$ by the Geometric distribution
      • If $q = 0$, then $V = \infty$; if $q > 0$, then $E(V) = \frac{1}{q} < \infty$, contradicting $E[V] = \infty$; therefore $q=0$ and the walk returns to $0$ infinitely often
  • What happens if walker moves in more than one dimension?
    • Choose one of four directions with probability $\frac{1}{4}$
    • In two dimensions, the walker returns to $(0,0)$ infinitely often (the walk is recurrent), but in three or more dimensions it returns to the origin only finitely often (transient)
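Two sketches for the questions above. First, the return probability: comparing the exact $P\{S_{2n}=0\}$ with the Stirling approximation $\frac{1}{\sqrt{\pi n}}$ (plain Python; the chosen values of $n$ are arbitrary).

```python
from math import comb, pi, sqrt

# Exact P{S_{2n} = 0} = C(2n, n) / 4^n versus 1/sqrt(pi * n).
for n in (1, 5, 25, 125, 625):
    exact = comb(2 * n, n) / 4 ** n
    approx = 1 / sqrt(pi * n)
    print(f"n = {n:>3}: exact = {exact:.5f}, 1/sqrt(pi n) = {approx:.5f}")
```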
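Second, a Monte Carlo estimate of the gambler's-ruin probability (the values of $a$ and $b$ are illustrative): starting from $0$, the chance of hitting $b$ before $-a$ should be close to $\frac{a}{a+b}$.

```python
import random

# Estimate P(hit b before -a | start at 0) for the symmetric walk.
def hits_b_first(a: int, b: int) -> bool:
    s = 0
    while -a < s < b:
        s += random.choice((-1, 1))
    return s == b

random.seed(3)
a, b, trials = 3, 7, 50_000
est = sum(hits_b_first(a, b) for _ in range(trials)) / trials
print(f"simulated: {est:.4f}, a/(a+b) = {a / (a + b):.4f}")
```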

Sept. 16

  • A random variable $T$ that takes values in $\{0, 1, 2, \ldots\}\cup\{\infty\}$ is called a stopping time with respect to $\{\mathcal{F}_n\}$ if for all $n$, the event $\{T=n\}$ is $\mathcal{F}_n$-measurable
    • If $T_1$ and $T_2$ are stopping times, so are $T_1\wedge T_2 = \min\{T_1, T_2\}$ and $T_1\vee T_2 = \max\{T_1, T_2\}$
    • Intuition: You know when to stop using the history of the process; you don’t need to see the future
  • If $X_n$ is a process adapted to $\mathcal{F}_n$ and $V\subseteq \mathbb{R}$, then $T=\min\{n : X_n \in V\}$, the first time the process enters $V$, is a stopping time
  • Define the stopped process $M_{n\wedge T}$, which equals $M_T$ if $T \leq n$ and $M_n$ if $T > n$
    • If $M$ is a martingale, then $M_{n\wedge T}$ is a martingale
  • Optional Sampling (Stopping) Theorem: Suppose $M_0, M_1, \ldots$ is a martingale and $T$ is a stopping time with respect to $\{\mathcal{F}_n\}$.
    • If there exists $K<\infty$ such that $P\{T\leq K\}=1$, then $E[M_T] = E[M_0]$; you cannot beat a fair game in a bounded amount of time
      • Fixing some $n$, $E[M_T]=E[M_{T\wedge n}] + E[M_T - M_{T\wedge n}]$; the first term equals $E[M_0]$ since $M_{T\wedge n}$ is a martingale, and the second vanishes as $n\rightarrow\infty$
        • Fact: If $E[\vert M_T \vert] < \infty$, then $\lim_{n\rightarrow\infty}E[M_T \cdot\mathbb{I}\{T>n\}] = 0$
    • If $E[\vert M_T \vert]< \infty$ and $\lim_{n\rightarrow\infty} E[\vert M_n\vert \cdot \mathbb{I}\{T>n\}] = 0$, then $E[M_T] = E[M_0]$
    • Suppose there exists $C<\infty$ such that for all $n$, $E[M_{n\wedge T}^2]\leq C$; then, $E[M_T] = E[M_0]$
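As a sanity check on the theorem, here is a short simulation (Python; the parameters are illustrative) of the walk stopped at $T = \min\{n : S_n \in \{-a, b\}\}$. The stopped process is bounded by $\max(a,b)$, so the third condition applies and $E[M_{T\wedge n}]$ should stay near $E[M_0] = 0$ for every $n$; combined with $E[M_T] = -a\,P\{\text{hit } -a\} + b\,P\{\text{hit } b\} = 0$, this recovers the gambler's-ruin answer $\frac{a}{a+b}$.

```python
import random

# E[M_{T ∧ n}] for the symmetric walk stopped on exiting (-a, b).
# |M_{T ∧ n}| <= max(a, b), so optional stopping applies.
random.seed(4)
a, b, n, trials = 3, 7, 200, 50_000
total = 0
for _ in range(trials):
    s = 0
    for _ in range(n):
        if not (-a < s < b):        # T has occurred: the walk is frozen
            break
        s += random.choice((-1, 1))
    total += s                       # s = M_{T ∧ n}
print("E[M_{T∧n}] ~", total / trials)    # should be ~ E[M_0] = 0
```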

Sept. 17

  • Martingale Convergence Theorem: For a martingale $M_n$, if there exists $C<\infty$ such that for all $n$, $E[\vert M_n \vert]\leq C$, then there exists a random variable $M_\infty$ such that, with probability one, $M_\infty = \lim_{n\rightarrow\infty}M_n$
    • If $M_n\geq 0$ with probability one for all $n$, then $E[\vert M_n\vert]= E[M_n]=E[M_0]$, so nonnegative martingales always converge
  • A martingale is square integrable if $E[M_n^2]<\infty$ for every $n$
    • Two random variables $X,Y$ are orthogonal if $E[XY]=0$
    • For a martingale, each increment has mean zero: $E[M_n-M_{n-1}]=0$
      • Moreover, increments over disjoint time intervals are orthogonal: if $m<n$, then $E[(M_n-M_{n-1})(M_m-M_{m-1})]=0$, because conditioning on $\mathcal{F}_{n-1}$ gives $E[(M_m-M_{m-1})\,E[M_n-M_{n-1}\vert \mathcal{F}_{n-1}]] = 0$
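The orthogonality of increments can be seen numerically with the martingale $M_n = S_n^2 - n$ from the Sept. 3 example ($\sigma = 1$): its increments $M_n - M_{n-1} = 2S_{n-1}X_n$ depend on the past, yet products of increments over disjoint intervals average to zero. A small sketch (Python; $m$, $n$, and the trial count are arbitrary choices):

```python
import random

# Check E[(M_n - M_{n-1})(M_m - M_{m-1})] = 0 for M_k = S_k^2 - k.
# The increment at step k is (S_{k-1}+X_k)^2 - S_{k-1}^2 - 1 = 2*S_{k-1}*X_k.
random.seed(5)
m, n, trials = 5, 12, 200_000
acc = 0.0
for _ in range(trials):
    s, d_m, d_n = 0, 0, 0
    for k in range(1, n + 1):
        x = random.choice((-1, 1))
        d = (s + x) ** 2 - s ** 2 - 1    # increment of S_k^2 - k
        s += x
        if k == m:
            d_m = d
        if k == n:
            d_n = d
    acc += d_m * d_n
print("E[D_n * D_m] ~", acc / trials)    # should be ~ 0
```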