Abstract
As mentioned in the introduction, stochastic processes can be classified into two main families, namely Markov processes on the one hand, and martingales on the other hand. Markov processes have been our main focus of attention so far, and in this chapter we turn to the notion of martingale. In particular we will give a precise mathematical meaning to the description of martingales stated in the introduction, which says that when \((X_n)_{n\in {\mathord {\mathbb {N}}}}\) is a martingale, the best possible estimate at time \(n\in {\mathord {\mathbb {N}}}\) of the future value \(X_m\) at time \(m>n\) is \(X_n\) itself. The main application of martingales will be to recover, in an elegant way, the previous results on gambling processes of Chap. 2. Before that, let us note that many recent applications of stochastic modeling rely on the notion of martingale. In financial mathematics, for example, the notion of martingale is used to characterize the fairness and equilibrium of a market model.
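As a quick numerical illustration of this "best estimate" property (a sketch of ours, not part of the text), the following Python snippet simulates a symmetric \(\pm 1\) random walk, which is a martingale, and checks that the empirical mean of \(X_m\) over the paths sharing a given value of \(X_n\) stays close to that value. All variable names and parameter values are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

n, m, n_paths = 5, 20, 200_000          # observation time n < horizon m
steps = rng.choice([-1, 1], size=(n_paths, m))
X = np.cumsum(steps, axis=1)            # X[:, k-1] = X_k, symmetric random walk

# For each value x taken by X_n, compare x with the empirical mean of X_m
# over the paths where X_n = x: for a martingale these should agree.
for x in np.unique(X[:, n - 1]):
    sel = X[:, n - 1] == x
    print(f"X_n = {x:+d}   E[X_m | X_n = x] ≈ {X[sel, m - 1].mean():+.3f}")
```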
Notes
1. Integrable means \(\mathrm{I}\! \mathrm{E}[|Z_n|]<\infty \) for all \(n\in {\mathord {\mathbb {N}}}\).
2. A random variable \(X_n\) is said to be centered if \(\mathrm{I}\! \mathrm{E}[X_n ]=0\).
3. By application of the dominated convergence theorem.
4. By application of the monotone convergence theorem.
5. “This obviously inappropriate nomenclature was chosen under the malign influence of the noise level of radio’s SUPERman program, a favorite supper-time program of Doob’s son during the writing of [Doo53]”, cf. [Doo84], historical notes, p. 808.
Exercises
Exercise 10.1
Consider a sequence \((X_n)_{n \ge 1}\) of independent Bernoulli random variables with
$$ \mathbb {P}( X_n = 1 ) = \mathbb {P}( X_n = -1 ) = \frac{1}{2}, \qquad n \ge 1, $$
and the process \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) defined by \(M_0 := 0\) and
$$ M_n := \sum _{k=1}^n 2^{k-1} X_k, \qquad n \ge 1, $$
see Fig. 10.2. Note that when \(X_1=X_2=\cdots =X_{n-1}=-1\) and \(X_n=1\), we have \(M_n = 1\).
(a) Show that the process \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) is a martingale.
(b) Is the random time
$$ \tau : = \inf \{ n \ge 1 \ : \ M_n = 1 \} $$
a stopping time?
(c) Consider the stopped process
$$ M_{\tau \wedge n} := M_n \mathbbm {1}_{ \{ n< \tau \} } + \mathbbm {1}_{ \{ \tau \le n\} } = \left\{ \begin{array}{ll} M_n = 1-2^n & \text{ if } n < \tau , \\ M_\tau = 1 & \text{ if } n \ge \tau , \end{array} \right. \qquad n \in {\mathord {\mathbb {N}}}, $$
see Fig. 10.3. Give an interpretation of \((M_{\tau \wedge n})_{n\in {\mathord {\mathbb {N}}}}\) in terms of a betting strategy for a gambler starting a game at \(M_0=0\).
(d) Determine the two possible values of \(M_{\tau \wedge n}\) and the probability distribution of \(M_{\tau \wedge n}\) at any time \(n\ge 1\).
(e) Show, using the result of Question (d), that we have
$$\begin{aligned} \mathrm{I}\! \mathrm{E}[ M_{\tau \wedge n} ] = 0, \qquad n \in {\mathord {\mathbb {N}}}. \end{aligned}$$
(f) Show that the result of Question (e) can be recovered using the stopping time theorem.
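For readers who like to experiment, here is a small simulation sketch of the stopped doubling strategy of this exercise, assuming (as filled in above) that \(\mathbb {P}(X_n = 1) = \mathbb {P}(X_n = -1) = 1/2\) and that the stake at step \(k\) is \(2^{k-1}\). It illustrates Questions (d) and (e) numerically but does not replace the proofs; all names are ours.

```python
import numpy as np

rng = np.random.default_rng(0)

def stopped_doubling(n, rng):
    """One sample of M_{tau ^ n}: bet 2^(k-1) at step k, stop at the first win."""
    M = 0
    for k in range(1, n + 1):
        X = -1 if rng.random() < 0.5 else 1
        M += 2 ** (k - 1) * X
        if X == 1:            # first win: tau = k and M_tau = 1
            return M
    return M                  # no win up to time n: M_n = 1 - 2**n

n, n_paths = 10, 100_000
samples = np.array([stopped_doubling(n, rng) for _ in range(n_paths)])
print("values taken:", np.unique(samples))                     # {1 - 2**n, 1}
print("P(M = 1 - 2**n) ≈", np.mean(samples == 1 - 2 ** n))     # ≈ 2**(-n)
print("E[M_{tau ^ n}] ≈", samples.mean())                      # ≈ 0, as in Question (e)
```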
Exercise 10.2
Let \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) be a discrete-time submartingale with respect to a filtration \((\mathcal{F}_n)_{n\in {\mathord {\mathbb {N}}}}\), with \(\mathcal{F}_0 = \{ \emptyset , \varOmega \}\), i.e. we have
$$ \mathrm{I}\! \mathrm{E}[ M_{n+1} \mid \mathcal{F}_n ] \ge M_n, \qquad n \in {\mathord {\mathbb {N}}}. $$
(a) Show that we have \(\mathrm{I}\! \mathrm{E}[ M_{n+1} ] \ge \mathrm{I}\! \mathrm{E}[ M_n]\), \(n \ge 0\), i.e. a submartingale has an increasing expectation.
(b) Show that independent increment processes whose increments have nonnegative expectation are examples of submartingales.
(c) (Doob-Meyer decomposition) Show that there exist two processes \((N_n)_{n\in {\mathord {\mathbb {N}}}}\) and \((A_n)_{n\in {\mathord {\mathbb {N}}}}\) such that
(i) \((N_n)_{n\in {\mathord {\mathbb {N}}}}\) is a martingale with respect to \((\mathcal{F}_n)_{n\in {\mathord {\mathbb {N}}}}\),
(ii) \((A_n)_{n\in {\mathord {\mathbb {N}}}}\) is non-decreasing, i.e. \(A_n\le A_{n+1}\), a.s., \(n\in {\mathord {\mathbb {N}}}\),
(iii) \((A_n)_{n\in {\mathord {\mathbb {N}}}}\) is predictable in the sense that \(A_n\) is \(\mathcal{F}_{n-1}\)-measurable, \(n\in {\mathord {\mathbb {N}}}\), and
(iv) \(M_n = N_n + A_n\), \(n\in {\mathord {\mathbb {N}}}\).
Hint: Let \(A_0=0\),
$$ A_{n+1} := A_n + \mathrm{I}\! \mathrm{E}[ M_{n+1} - M_n \mid \mathcal{F}_n ], \qquad n \in {\mathord {\mathbb {N}}}, $$
and define \((N_n)_{n\in {\mathord {\mathbb {N}}}}\) in such a way that it satisfies the four required properties.
(d) Show that for all bounded stopping times \(\sigma \) and \(\tau \) such that \(\sigma \le \tau \) a.s., we have
$$ \mathrm{I}\! \mathrm{E}[M_\sigma ] \le \mathrm{I}\! \mathrm{E}[M_\tau ]. $$
Hint: Use the Doob stopping time Theorem 10.6 for martingales and (10.3.3).
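As a concrete illustration of the Doob-Meyer decomposition of Question (c) (our example, not the book's): for the submartingale \(M_n = S_n^2\) built on a symmetric \(\pm 1\) random walk, the hint gives \(\mathrm{I}\! \mathrm{E}[ M_{n+1} - M_n \mid \mathcal{F}_n ] = 1\), hence \(A_n = n\) and \(N_n = S_n^2 - n\). The sketch below checks numerically that \(\mathrm{I}\! \mathrm{E}[M_k]\) grows while \(\mathrm{I}\! \mathrm{E}[N_k]\) stays at \(0\), as expected of a martingale part.

```python
import numpy as np

rng = np.random.default_rng(0)

n, n_paths = 30, 200_000
steps = rng.choice([-1, 1], size=(n_paths, n))
S = np.cumsum(steps, axis=1)          # symmetric random walk, S[:, k-1] = S_k
M = S ** 2                            # submartingale M_k = S_k^2
A = np.arange(1, n + 1)               # compensator from the hint: A_k = k
N = M - A                             # candidate martingale part N_k = S_k^2 - k

print("E[M_k] for k=1..5:", np.round(M.mean(axis=0)[:5], 3))   # grows like k
print("E[N_k] for k=1..5:", np.round(N.mean(axis=0)[:5], 3))   # stays ≈ 0
```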
Exercise 10.3
Consider an asset price \((S_n)_{n=0,1,\ldots , N}\) which is a martingale under the risk-neutral measure \(\mathbb {P}^*\), with respect to the filtration \((\mathcal{F}_n)_{n\in {\mathord {\mathbb {N}}}}\). Given the (convex) function \(\phi (x) := (x-K)^+\), show that the price of an Asian option with payoff
$$ \phi \left( \frac{S_1 + S_2 + \cdots + S_N}{N} \right) $$
is upper bounded by the price of the European call option with maturity \(N\), i.e. show that
$$ \mathrm{I}\! \mathrm{E}^* \left[ \phi \left( \frac{S_1 + S_2 + \cdots + S_N}{N} \right) \right] \le \mathrm{I}\! \mathrm{E}^* [ \phi ( S_N ) ]. $$
Hint: Use in the following order:
(i) the convexity inequality
$$ \phi \left( \frac{x_1 + x_2 + \cdots + x_n}{n} \right) \le \frac{\phi ( x_1)+\phi ( x_2)+\cdots + \phi (x_n)}{n}, $$
(ii) the martingale property of \((S_k)_{k\in {\mathord {\mathbb {N}}}}\),
(iii) the conditional Jensen inequality \(\phi ( \mathrm{I}\! \mathrm{E}[ F \mid \mathcal{G} ] ) \le \mathrm{I}\! \mathrm{E}[ \phi ( F ) \mid \mathcal{G} ]\),
(iv) the tower property of conditional expectations.
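A Monte Carlo sanity check of this inequality can be run on any martingale model for \((S_n)\). The toy model below (a multiplicative walk with factors \(0.9\) and \(1.1\), equally likely, so that each factor has mean \(1\)) and the parameter values are our own choices, not those of the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy risk-neutral model: S_{k+1} = S_k * Y_{k+1}, Y in {0.9, 1.1} with equal
# probability, so E*[Y] = 1 and (S_k) is a martingale under P*.
N, K, S0, n_paths = 20, 1.0, 1.0, 200_000
Y = rng.choice([0.9, 1.1], size=(n_paths, N))
S = S0 * np.cumprod(Y, axis=1)                          # S[:, k-1] = S_k

asian    = np.maximum(S.mean(axis=1) - K, 0.0).mean()   # E*[((S_1+...+S_N)/N - K)^+]
european = np.maximum(S[:, -1] - K, 0.0).mean()         # E*[(S_N - K)^+]
print(f"Asian ≈ {asian:.4f}  <=  European ≈ {european:.4f}")
```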
Exercise 10.4
A process \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) is a submartingale if it satisfies
$$ \mathrm{I}\! \mathrm{E}[ M_{n+1} \mid \mathcal{F}_n ] \ge M_n, \qquad n \in {\mathord {\mathbb {N}}}. $$
(a) Show that the expectation \(\mathrm{I}\! \mathrm{E}[M_n]\) of a submartingale increases with time \(n\in {\mathord {\mathbb {N}}}\).
(b) Consider the random walk given by \(S_0 := 0\) and
$$ S_n := \sum _{k=1}^n X_k = X_1+ X_2+ \cdots + X_n, \qquad n \ge 1, $$
where \((X_n)_{n\ge 1}\) is an i.i.d. Bernoulli sequence of \(\{0,1\}\)-valued random variables with \(\mathbb {P}(X_n = 1 ) =p\), \(n \ge 1\). Under which condition on \(\alpha \in {\mathord {\mathbb {R}}}\) is the process \((S_n - \alpha n )_{n\in {\mathord {\mathbb {N}}}}\) a submartingale?
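Before proving the condition on \(\alpha \) in Question (b), one can experiment numerically. The sketch below (with parameter values of our own choosing) estimates \(\mathrm{I}\! \mathrm{E}[S_k - \alpha k]\) for a few candidate values of \(\alpha \) and reports whether the empirical expectation is nondecreasing in \(k\), a necessary feature of a submartingale by Question (a).

```python
import numpy as np

rng = np.random.default_rng(0)

p, n, n_paths = 0.4, 30, 100_000
X = (rng.random((n_paths, n)) < p).astype(float)   # i.i.d. {0,1} with P(X=1)=p
S = np.cumsum(X, axis=1)                           # S[:, k-1] = S_k
k = np.arange(1, n + 1)

for alpha in (0.3, 0.5):
    means = (S - alpha * k).mean(axis=0)           # empirical E[S_k - alpha*k]
    trend = "nondecreasing" if np.all(np.diff(means) >= 0) else "decreasing somewhere"
    print(f"alpha = {alpha}: empirical expectation is {trend}")
```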
Exercise 10.5
Recall that a process \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) is a submartingale if it satisfies
$$ \mathrm{I}\! \mathrm{E}[ M_{n+1} \mid \mathcal{F}_n ] \ge M_n, \qquad n \in {\mathord {\mathbb {N}}}. $$
(a) Show that any convex function \((\phi (M_n))_{n \in {\mathord {\mathbb {N}}}}\) of a martingale \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) is itself a submartingale. Hint: Use Jensen’s inequality.
(b) Show that any convex nondecreasing function \(\phi (M_n)\) of a submartingale \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) remains a submartingale.
Problem 10.6
(a) Consider \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) a nonnegative martingale. For any \(x >0\), let
$$ \tau _x := \inf \{ n \ge 0 \ : \ M_n \ge x \}. $$
Show that the random time \(\tau _x\) is a stopping time.
(b) Show that for all \(n\ge 0\) we have
$$\begin{aligned} \mathbb {P}\left( \max _{k=0,1,\ldots , n} M_k \ge x \right) \le \frac{\mathrm{I}\! \mathrm{E}[M_n]}{x}. \end{aligned}$$ (10.5.1)
Hint: Use the Markov inequality and the Doob stopping time Theorem 10.6 for the stopping time \(\tau _x\).
(c) Show that (10.5.1) remains valid when \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) is a nonnegative submartingale.
Hint: Use the Doob stopping time theorem for submartingales as in Exercise 10.2-(d).
(d) Show that for any \(n\ge 0\) we have
$$\begin{aligned} \mathbb {P}\left( \max _{k=0,1,\ldots , n} M_k \ge x \right) \le \frac{\mathrm{I}\! \mathrm{E}[(M_n)^2]}{x^2}, \qquad x>0. \end{aligned}$$
(e)
Show that more generally we have
$$\begin{aligned} \mathbb {P}\left( \max _{k=0,1,\ldots , n} M_k \ge x \right) \le \frac{\mathrm{I}\! \mathrm{E}[ | M_n |^p]}{x^p}, \qquad x>0, \end{aligned}$$for all \(n\ge 0\) and \(p \ge 1\).
(f) Given \((Y_n)_{n\ge 1}\) a sequence of independent centered random variables with \(\mathrm{I}\! \mathrm{E}[Y_n] =0\) and common variance \(\sigma ^2 = {\mathrm {Var}}[ Y_n]\), \(n \ge 1\), consider the random walk \(S_n = Y_1+Y_2+\cdots +Y_n\), \(n\ge 1\), with \(S_0=0\). Show that for all \(n\ge 0\) we have
$$\begin{aligned} \mathbb {P}\left( \max _{k=0,1,\ldots , n} | S_k | \ge x \right) \le \frac{n\sigma ^2}{x^2}, \qquad x>0. \end{aligned}$$
(g)
Show that for any (not necessarily nonnegative) submartingale we have
$$\begin{aligned} \mathbb {P}\left( \max _{k=0,1,\ldots , n} M_k \ge x \right) \le \frac{\mathrm{I}\! \mathrm{E}[ M_n^+ ]}{x}, \qquad x>0, \end{aligned}$$where \(z^+ = \max ( z , 0)\), \(z\in {\mathord {\mathbb {R}}}\).
(h) A process \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) is a supermartingale (see Note 5) if it satisfies
$$ \mathrm{I}\! \mathrm{E}[ M_n \mid \mathcal{F}_k ] \le M_k, \qquad k=0,1,\ldots , n. $$
Show that for any nonnegative supermartingale we have
$$\begin{aligned} \mathbb {P}\left( \max _{k=0,1,\ldots , n} M_k \ge x \right) \le \frac{\mathrm{I}\! \mathrm{E}[ M_0 ]}{x}, \qquad x>0. \end{aligned}$$
(i)
Show that for any nonnegative submartingale \((M_n)_{n\in {\mathord {\mathbb {N}}}}\) and any convex nondecreasing nonnegative function \(\phi \) we have
$$\begin{aligned} \mathbb {P}\left( \max _{k=0,1,\ldots , n} \phi ( M_k ) \ge x \right) \le \frac{\mathrm{I}\! \mathrm{E}[\phi ( M_n )]}{x}, \qquad x>0. \end{aligned}$$Hint: Consider the stopping time
$$ \tau ^\phi _x := \inf \{ n \ge 0 \ : \ M_n \ge x \}. $$ -
(j)
Give an example of a nonnegative supermartingale which is not a martingale.
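To see the maximal inequality (10.5.1) of Question (b) in action, the sketch below runs a Monte Carlo check on a toy nonnegative martingale (a product of i.i.d. mean-one factors); the model and parameter values are our own choices, not those of the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy nonnegative martingale: M_0 = 1 and M_k = product of k i.i.d. factors
# in {0.5, 1.5} with equal probabilities, each of mean 1, so E[M_n] = 1.
n, n_paths, x = 25, 200_000, 2.0
factors = rng.choice([0.5, 1.5], size=(n_paths, n))
M = np.concatenate([np.ones((n_paths, 1)), np.cumprod(factors, axis=1)], axis=1)

lhs = np.mean(M.max(axis=1) >= x)     # P(max_{k<=n} M_k >= x)
rhs = M[:, -1].mean() / x             # E[M_n] / x   (≈ 1/x here)
print(f"P(max M_k >= {x}) ≈ {lhs:.4f}  <=  E[M_n]/x ≈ {rhs:.4f}")
```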