
Probability Theory: Convergence in Distribution Problem

Let's solve a theoretical problem in probability, specifically on convergence in distribution. The problem below is Exercise 5.42 of Casella and Berger (2001), and I want to share my solution to it. If you spot an incorrect argument below, I would be happy if you pointed it out to me.

Problem

Let X_1, X_2,\cdots be iid (independent and identically distributed) random variables and let X_{(n)}=\max_{1\leq i\leq n}X_i.
  1. If X_i\sim \mathrm{beta}(1,\beta), find a value of \nu so that n^{\nu}(1-X_{(n)}) converges in distribution;
  2. If X_i\sim \mathrm{exponential}(1), find a sequence a_n so that X_{(n)}-a_n converges in distribution.

Solution

  1. Let Y_n=n^{\nu}(1-X_{(n)}). We say that Y_n\rightarrow Y in distribution if \lim_{n\rightarrow \infty}F_{Y_n}(y)=F_Y(y) at every point y where F_Y is continuous.
    Then, \begin{aligned} \lim_{n\rightarrow\infty}F_{Y_n}(y)&=\lim_{n\rightarrow\infty}P(Y_n\leq y)=\lim_{n\rightarrow\infty}P(n^{\nu}(1-X_{(n)})\leq y)\\ &=\lim_{n\rightarrow\infty}P\left(1-X_{(n)}\leq \frac{y}{n^{\nu}}\right)\\ &=\lim_{n\rightarrow\infty}P\left(-X_{(n)}\leq \frac{y}{n^{\nu}}-1\right)=\lim_{n\rightarrow\infty}\left[1-P\left(-X_{(n)}> \frac{y}{n^{\nu}}-1\right)\right]\\ &=\lim_{n\rightarrow\infty}\left[1-P\left(\max\{X_1,X_2,\cdots,X_n\}< 1-\frac{y}{n^{\nu}}\right)\right]\\ &=\lim_{n\rightarrow\infty}\left[1-P\left(X_1< 1-\frac{y}{n^{\nu}},X_2< 1-\frac{y}{n^{\nu}},\cdots,X_n< 1-\frac{y}{n^{\nu}}\right)\right]\\ &=\lim_{n\rightarrow\infty}\left[1-P\left(X_1< 1-\frac{y}{n^{\nu}}\right)^n\right],\;\text{since the}\;X_i\;\text{are iid.} \end{aligned} (Strict and non-strict inequalities can be interchanged here because the X_i are continuous.)
    And because X_i\sim \mathrm{beta}(1,\beta), the density is f_{X_1}(x)=\begin{cases} \beta(1-x)^{\beta - 1},&0\leq x\leq 1,\;\beta>0\\ 0,&\text{otherwise} \end{cases}
    This implies \begin{aligned} \lim_{n\to \infty}P(Y_n\leq y)&=\lim_{n\to \infty}\left\{1-\left[\int_0^{1-\frac{y}{n^{\nu}}}\beta(1-t)^{\beta-1}\,\mathrm{d}t\right]^n\right\}\\ &=\lim_{n\to \infty}\left\{1-\left[-\int_1^{\frac{y}{n^{\nu}}}\beta u^{\beta-1}\,\mathrm{d}u\right]^{n}\right\},\;\text{substituting}\;u=1-t\\ &=\lim_{n\to \infty}\left\{1-\left[-u^{\beta}\bigg|_{u=1}^{u=\frac{y}{n^{\nu}}}\right]^{n}\right\}\\ &=1-\lim_{n\to \infty}\left[1-\left(\frac{y}{n^{\nu}}\right)^{\beta}\right]^{n} \end{aligned}
    We can simplify the limit by choosing \nu=\frac{1}{\beta}, that is, \lim_{n\to\infty}P(Y_n\leq y)=1-\lim_{n\to\infty}\left[1-\frac{y^{\beta}}{n}\right]^{n}=1-e^{-y^{\beta}},\quad y\geq 0
    To confirm this in Python, run the following code using the sympy module:

    import sympy as sy
    from sympy import oo

    b, y, n = sy.symbols("b y n")
    # limit of (1 - y**b/n)**n as n -> oo is exp(-y**b), so this prints 1 - exp(-y**b)
    print(1 - sy.limit((1 - (y ** b) / n) ** n, n, oo))
    Therefore, since 1-e^{-y^{\beta}},\;y\geq 0, is a distribution function of a random variable Y (namely a Weibull distribution), Y_n=n^{\nu}(1-X_{(n)}) converges in distribution to Y for \nu=\frac{1}{\beta}.
    \hspace{12.5cm}\blacksquare
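    As an extra numeric sanity check, here is a minimal Monte Carlo sketch using numpy (numpy is not used in the original post, and the values of beta, n, and reps below are arbitrary choices), comparing the empirical distribution of Y_n=n^{1/\beta}(1-X_{(n)}) with the limiting CDF 1-e^{-y^{\beta}}:

    import numpy as np

    # Monte Carlo sketch: simulate Y_n = n**(1/beta) * (1 - X_(n)) for
    # beta(1, beta) samples and compare its empirical CDF with 1 - exp(-y**beta).
    rng = np.random.default_rng(42)
    beta, n, reps = 2.0, 500, 10_000

    x = rng.beta(1.0, beta, size=(reps, n))  # each row is one iid sample of size n
    y_n = n ** (1.0 / beta) * (1.0 - x.max(axis=1))

    for y in [0.25, 0.5, 1.0, 2.0]:
        empirical = (y_n <= y).mean()
        limit_cdf = 1.0 - np.exp(-y ** beta)
        print(f"y={y}: empirical={empirical:.4f}, limit CDF={limit_cdf:.4f}")

    The empirical and limiting values should agree to roughly two decimal places for n this large.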
  2. \begin{aligned} P(X_{(n)}-a_{n}\leq y) &= P(X_{(n)}\leq y + a_n)=P(\max\{X_1,X_2,\cdots,X_n\}\leq y+a_n)\\ &=P(X_1\leq y+a_n,X_2\leq y+a_n,\cdots,X_n\leq y+a_n)\\ &=P(X_1\leq y+a_n)^n,\;\text{since the}\;X_i\;\text{are iid}\\ &=\left[\int_{-\infty}^{y+a_n}f_{X_1}(t)\,\mathrm{d}t\right]^n \end{aligned}
    Since X_i\sim \mathrm{exponential}(1), the density is f_{X_1}(x)=\begin{cases} e^{-x},&0\leq x<\infty\\ 0,&\text{otherwise} \end{cases}
    So that, for n large enough that y+a_n>0, \begin{aligned} P(X_{(n)}-a_{n}\leq y)&=\left[\int_{0}^{y+a_n}e^{-t}\,\mathrm{d}t\right]^n=\left\{-\left[e^{-(y+a_n)}-1\right]\right\}^n\\ &=\left[1-e^{-(y+a_n)}\right]^n \end{aligned}
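    As a quick symbolic check of this integral (a small sympy sketch; the symbol a stands in for a_n):

    import sympy as sy

    # integral of exp(-t) from 0 to y + a; expect 1 - exp(-(y + a))
    t, y, a = sy.symbols("t y a", positive=True)
    print(sy.integrate(sy.exp(-t), (t, 0, y + a)))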
    Letting Y_n=X_{(n)}-a_n, we say that Y_n\rightarrow Y in distribution if \lim_{n\to\infty}P(Y_n\leq y)=P(Y\leq y)
    Therefore, \begin{aligned} \lim_{n\to\infty}P(Y_n\leq y) &= \lim_{n\to\infty}P(X_{(n)}-a_n\leq y)=\lim_{n\to \infty}\left[1-e^{-y-a_n}\right]^n\\ &=\lim_{n\to\infty}\left[1-\frac{e^{-y}}{e^{a_n}}\right]^n \end{aligned}
    We can simplify the limit by choosing a_n=\log n, that is, \lim_{n\to\infty}\left[1-\frac{e^{-y}}{e^{\log n}}\right]^n=\lim_{n\to\infty}\left[1-\frac{e^{-y}}{n}\right]^n=e^{-e^{-y}}
    Check this in Python by running the following code,

    import sympy as sy
    from sympy import oo

    y, n = sy.symbols("y n")
    # limit of (1 - exp(-y)/exp(log n))**n as n -> oo is exp(-exp(-y))
    print(sy.limit((1 - sy.exp(-y) / sy.exp(sy.log(n))) ** n, n, oo))
    In conclusion, since e^{-e^{-y}} is a distribution function of a random variable Y (the standard Gumbel distribution), Y_n=X_{(n)}-a_n converges in distribution to Y for the sequence a_n=\log n.
    \hspace{12.5cm}\blacksquare
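    Similarly, here is a minimal Monte Carlo sketch using numpy (again not part of the original post; n and reps are arbitrary choices), comparing the empirical distribution of Y_n=X_{(n)}-\log n with the limiting CDF e^{-e^{-y}}:

    import numpy as np

    # Monte Carlo sketch: simulate Y_n = X_(n) - log(n) for exponential(1)
    # samples and compare its empirical CDF with exp(-exp(-y)).
    rng = np.random.default_rng(42)
    n, reps = 500, 10_000

    x = rng.exponential(1.0, size=(reps, n))  # each row is one iid sample of size n
    y_n = x.max(axis=1) - np.log(n)

    for y in [-1.0, 0.0, 1.0, 2.0]:
        empirical = (y_n <= y).mean()
        limit_cdf = np.exp(-np.exp(-y))
        print(f"y={y}: empirical={empirical:.4f}, limit CDF={limit_cdf:.4f}")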

Reference

  1. Casella, G. and Berger, R.L. (2001). Statistical Inference. Thomson Learning, Inc.