# Quantile function

In probability and statistics, the quantile function, associated with a probability distribution of a random variable, specifies the value of the random variable such that the probability of the variable being less than or equal to that value equals the given probability. It is also called the percent-point function or inverse cumulative distribution function.

## Definition

With reference to a continuous and strictly monotonic distribution function, for example the cumulative distribution function ${\displaystyle F_{X}\colon \mathbb {R} \to [0,1]}$ of a random variable X, the quantile function Q returns a threshold value x below which random draws from the given c.d.f. would fall with probability p.

In terms of the distribution function F, the quantile function Q returns the value x such that

${\displaystyle F_{X}(x):=\Pr(X\leq x)=p.\,}$

Another way to express the quantile function is

${\displaystyle Q(p)\,=\,\inf \left\{x\in \mathbb {R} :p\leq F(x)\right\}}$

for a probability 0 < p < 1. Here we capture the fact that the quantile function returns the minimum value of x from amongst all those values whose c.d.f. value equals or exceeds p, which is equivalent to the previous probability statement in the special case that the distribution is continuous. Note that the infimum can be replaced by the minimum, since the distribution function is right-continuous and weakly monotonically increasing.

The quantile function is the unique function satisfying the Galois inequalities

${\displaystyle Q(p)\leq x}$ if and only if ${\displaystyle p\leq F(x).}$

If the function F is continuous and strictly monotonically increasing, then the inequalities can be replaced by equalities, and we have:

${\displaystyle Q=F^{-1}}$

In general, even though the distribution function F may fail to possess a left or right inverse, the quantile function Q behaves as an "almost sure left inverse" for the distribution function, in the sense that

${\displaystyle Q(F(X))=X}$ almost surely.
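The infimum definition above can be sketched numerically. The following is a minimal Python illustration (not a production algorithm): a bisection search for the generalized inverse of an arbitrary non-decreasing, right-continuous c.d.f., here checked against the standard exponential distribution, whose median is ln 2.

```python
import math

def generalized_inverse(cdf, p, lo, hi, tol=1e-9):
    """Approximate Q(p) = inf{x : p <= cdf(x)} by bisection.

    Assumes cdf is non-decreasing and right-continuous, and that
    [lo, hi] brackets the quantile.
    """
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if cdf(mid) >= p:
            hi = mid  # mid lies in the set {x : p <= F(x)}; shrink toward its infimum
        else:
            lo = mid
    return hi

# Standard exponential CDF, F(x) = 1 - exp(-x): Q(1/2) should be ln 2.
def exp_cdf(x):
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

median = generalized_inverse(exp_cdf, 0.5, 0.0, 50.0)
```

Because bisection keeps the upper endpoint inside the set {x : p ≤ F(x)}, it converges to that set's infimum even where the c.d.f. has flat stretches.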

## Simple example

For example, the cumulative distribution function of Exponential(λ) (i.e. intensity λ and expected value (mean) 1/λ) is

${\displaystyle F(x;\lambda )={\begin{cases}1-e^{-\lambda x}&x\geq 0,\\0&x<0.\end{cases}}}$

The quantile function for Exponential(λ) is derived by finding the value of Q for which ${\displaystyle 1-e^{-\lambda Q}=p}$:

${\displaystyle Q(p;\lambda )={\frac {-\ln(1-p)}{\lambda }},\!}$

for 0 ≤ p < 1. The quartiles are therefore:

- first quartile (p = 1/4): ${\displaystyle -\ln(3/4)/\lambda \,}$
- median (p = 2/4): ${\displaystyle -\ln(1/2)/\lambda \,}$
- third quartile (p = 3/4): ${\displaystyle -\ln(1/4)/\lambda .\,}$
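The closed form above translates directly into code. A short Python sketch computing the three quartiles for an illustrative rate λ = 2:

```python
import math

def exp_quantile(p, lam):
    """Quantile function of Exponential(lam): Q(p) = -ln(1 - p) / lam."""
    return -math.log(1.0 - p) / lam

lam = 2.0  # illustrative rate parameter
quartiles = {name: exp_quantile(p, lam)
             for name, p in [("first", 0.25), ("median", 0.5), ("third", 0.75)]}
# The median -ln(1/2)/lam equals ln(2)/lam; for lam = 2 that is about 0.346574.
```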

## Applications

Quantile functions are used in both statistical applications and Monte Carlo methods.

The quantile function is one way of prescribing a probability distribution, and it is an alternative to the probability density function (pdf) or probability mass function, the cumulative distribution function (cdf) and the characteristic function. The quantile function, Q, of a probability distribution is the inverse of its cumulative distribution function F. The derivative of the quantile function, namely the quantile density function, is yet another way of prescribing a probability distribution. It is the reciprocal of the pdf composed with the quantile function.

For statistical applications, users need to know key percentage points of a given distribution. For example, they require the median and 25% and 75% quartiles as in the example above or 5%, 95%, 2.5%, 97.5% levels for other applications such as assessing the statistical significance of an observation whose distribution is known; see the quantile entry. Before the popularization of computers, it was not uncommon for books to have appendices with statistical tables sampling the quantile function.[1] Statistical applications of quantile functions are discussed extensively by Gilchrist.[2]

Monte-Carlo simulations employ quantile functions to produce non-uniform random or pseudorandom numbers for use in diverse types of simulation calculations. A sample from a given distribution may be obtained in principle by applying its quantile function to a sample from a uniform distribution. The demands, for example, of simulation methods in modern computational finance are focusing increasing attention on methods based on quantile functions, as they work well with multivariate techniques based on either copula or quasi-Monte-Carlo methods[3] and Monte Carlo methods in finance.
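The sampling recipe in the paragraph above, usually called inverse transform sampling, is just "apply Q to uniform draws". A minimal Python sketch for the exponential distribution (seed and sample size chosen for illustration only):

```python
import math
import random

def sample_exponential(lam, n, seed=42):
    """Inverse transform sampling: push Uniform(0,1) draws through the
    exponential quantile function Q(p) = -ln(1 - p) / lam."""
    rng = random.Random(seed)
    return [-math.log(1.0 - rng.random()) / lam for _ in range(n)]

lam = 1.5
draws = sample_exponential(lam, 100_000)
sample_mean = sum(draws) / len(draws)
# sample_mean should be close to the true mean 1/lam ≈ 0.6667
```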

## Properties

${\displaystyle \int _{0}^{1}Q(p)\,dp=\mu ,}$ the mean of the distribution (see Integral of inverse functions, Inverse transform sampling).
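This property can be checked numerically. A Python sketch using a simple midpoint rule for the exponential distribution, whose mean is 1/λ (grid size chosen for illustration):

```python
import math

def exp_quantile(p, lam):
    """Exponential quantile function Q(p) = -ln(1 - p) / lam."""
    return -math.log(1.0 - p) / lam

# Midpoint rule on (0, 1); the integrand has an integrable logarithmic
# singularity at p = 1, so a fine uniform grid is adequate here.
lam, n = 2.0, 200_000
approx_mean = sum(exp_quantile((i + 0.5) / n, lam) for i in range(n)) / n
# approx_mean should be close to the mean 1/lam = 0.5
```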

## Calculation

The evaluation of quantile functions often involves numerical methods; the exponential distribution above is one of the few distributions for which a closed-form expression exists (others include the uniform, the Weibull, the Tukey lambda (which includes the logistic) and the log-logistic). When the cdf itself has a closed-form expression, one can always use a numerical root-finding algorithm such as the bisection method to invert the cdf. Other algorithms to evaluate quantile functions are given in the Numerical Recipes series of books. Algorithms for common distributions are built into many statistical software packages.
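As a sketch of the root-finding approach, the standard normal c.d.f. can be written in closed form via the error function, and then inverted by bisection (bracketing interval and tolerance are illustrative choices):

```python
import math

def normal_cdf(x):
    """Standard normal CDF expressed via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def normal_quantile(p, lo=-10.0, hi=10.0, tol=1e-10):
    """Invert the cdf by bisection; applicable whenever the cdf is computable."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if normal_cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Φ⁻¹(0.975) ≈ 1.959964
```

Bisection is slow but robust; dedicated library routines use rational approximations instead for speed and accuracy.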

Quantile functions may also be characterized as solutions of non-linear ordinary and partial differential equations. The ordinary differential equations for the cases of the normal, Student, beta and gamma distributions have been given and solved.[4]

### Normal distribution

The normal distribution is perhaps the most important case. Because the normal distribution is a location-scale family, its quantile function for arbitrary parameters can be derived from a simple transformation of the quantile function of the standard normal distribution, known as the probit function. Unfortunately, this function has no closed-form representation using basic algebraic functions; as a result, approximate representations are usually used. Thorough composite rational and polynomial approximations have been given by Wichura[5] and Acklam.[6] Non-composite rational approximations have been developed by Shaw.[7]
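The location-scale reduction mentioned above is simply Q_{μ,σ}(p) = μ + σ·Φ⁻¹(p). A Python sketch using the standard library's `statistics.NormalDist`, whose `inv_cdf` method implements a rational approximation of the probit (the parameter values are illustrative):

```python
from statistics import NormalDist

# Quantile of N(mu, sigma^2) from the standard probit:
#   Q_{mu,sigma}(p) = mu + sigma * Phi^{-1}(p)
mu, sigma, p = 100.0, 15.0, 0.975
std_probit = NormalDist().inv_cdf(p)        # standard normal quantile
shifted = mu + sigma * std_probit           # location-scale transform
direct = NormalDist(mu, sigma).inv_cdf(p)   # library computes the same value
```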

#### Ordinary differential equation for the normal quantile

A non-linear ordinary differential equation for the normal quantile, w(p), may be given. It is

${\displaystyle {\frac {d^{2}w}{dp^{2}}}=w\left({\frac {dw}{dp}}\right)^{2}}$

with the centre (initial) conditions

${\displaystyle w\left(1/2\right)=0,\,}$
${\displaystyle w'\left(1/2\right)={\sqrt {2\pi }}.\,}$

This equation may be solved by several methods, including the classical power series approach. From this, solutions of arbitrarily high accuracy may be developed (see Steinbrecher and Shaw, 2008).
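Besides power series, the initial value problem above can be integrated numerically. The following Python sketch applies a classical fourth-order Runge–Kutta step to the equivalent first-order system (w, w′) and compares against the library probit (step count is an illustrative choice):

```python
import math
from statistics import NormalDist

def probit_via_ode(p_target, steps=10_000):
    """Solve w'' = w (w')^2 from p = 1/2 with w(1/2) = 0, w'(1/2) = sqrt(2*pi),
    using classical RK4 on the first-order system (w, v), v = dw/dp."""
    def f(w, v):
        return v, w * v * v
    w, v = 0.0, math.sqrt(2.0 * math.pi)
    h = (p_target - 0.5) / steps
    for _ in range(steps):
        k1w, k1v = f(w, v)
        k2w, k2v = f(w + 0.5 * h * k1w, v + 0.5 * h * k1v)
        k3w, k3v = f(w + 0.5 * h * k2w, v + 0.5 * h * k2v)
        k4w, k4v = f(w + h * k3w, v + h * k3v)
        w += h * (k1w + 2 * k2w + 2 * k3w + k4w) / 6.0
        v += h * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
    return w

# Should agree with the library probit, e.g. Φ⁻¹(0.9) ≈ 1.281552
```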

### Student's t-distribution

This has historically been one of the more intractable cases, as the presence of a parameter, ν, the degrees of freedom, makes the use of rational and other approximations awkward. Simple formulas exist when ν = 1, 2, or 4, and the problem may be reduced to the solution of a polynomial when ν is even. In other cases the quantile functions may be developed as power series.[8] The simple cases are as follows:

- ν = 1 (Cauchy distribution): ${\displaystyle Q(p)=\tan(\pi (p-1/2))\!}$
- ν = 2: ${\displaystyle Q(p)=2(p-1/2){\sqrt {\frac {2}{\alpha }}}\!}$
- ν = 4: ${\displaystyle Q(p)=\operatorname {sign} (p-1/2)\,2\,{\sqrt {q-1}}\!}$

where

${\displaystyle q={\frac {\cos \left({\frac {1}{3}}\arccos \left({\sqrt {\alpha }}\,\right)\right)}{\sqrt {\alpha }}}\!}$

and

${\displaystyle \alpha =4p(1-p).\!}$

In the above the "sign" function is +1 for positive arguments, -1 for negative arguments and zero at zero. It should not be confused with the trigonometric sine function.
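The three closed forms above translate into a few lines of Python (checked here against the well-known 75th percentiles of t₁, t₂ and t₄, approximately 1.0, 0.8165 and 0.7407):

```python
import math

def student_t_quantile(p, nu):
    """Closed-form Student t quantiles for nu = 1, 2, 4 (formulas above)."""
    if nu == 1:  # Cauchy distribution
        return math.tan(math.pi * (p - 0.5))
    alpha = 4.0 * p * (1.0 - p)
    if nu == 2:
        return 2.0 * (p - 0.5) * math.sqrt(2.0 / alpha)
    if nu == 4:
        q = math.cos(math.acos(math.sqrt(alpha)) / 3.0) / math.sqrt(alpha)
        # copysign implements the sign(p - 1/2) factor
        return math.copysign(2.0 * math.sqrt(q - 1.0), p - 0.5)
    raise ValueError("closed form implemented only for nu in {1, 2, 4}")
```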

## Quantile mixtures

Analogously to mixtures of densities, distributions can be defined as quantile mixtures

${\displaystyle Q(p)=\sum _{i=1}^{m}a_{i}Q_{i}(p)}$,

where ${\displaystyle Q_{i}(p)}$, ${\displaystyle i=1,\ldots ,m}$ are quantile functions and ${\displaystyle a_{i}}$, ${\displaystyle i=1,\ldots ,m}$ are the model parameters. The parameters ${\displaystyle a_{i}}$ must be selected so that ${\displaystyle Q(p)}$ is a quantile function. Two four-parameter quantile mixtures, the normal-polynomial quantile mixture and the Cauchy-polynomial quantile mixture, are presented by Karvanen.[9]
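As a toy illustration (not one of Karvanen's parameterizations; the components and coefficients here are hypothetical), the sketch below mixes the normal quantile with a cubic polynomial term and verifies on a grid the key validity requirement, that the mixture be non-decreasing in p:

```python
from statistics import NormalDist

# Hypothetical two-component quantile mixture: normal quantile plus a
# cubic polynomial term; a1, a2 are illustrative model parameters.
a1, a2 = 1.0, 0.3

def mixture_quantile(p):
    return a1 * NormalDist().inv_cdf(p) + a2 * (2.0 * p - 1.0) ** 3

# A valid quantile function must be non-decreasing; check on a grid.
grid = [i / 1000 for i in range(1, 1000)]
values = [mixture_quantile(p) for p in grid]
is_monotone = all(x <= y for x, y in zip(values, values[1:]))
```

Here both components are increasing and both coefficients are positive, so monotonicity holds; negative coefficients would require a more careful check.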

## Non-linear differential equations for quantile functions

The non-linear ordinary differential equation given above for the normal distribution is a special case of one available for any quantile function whose second derivative exists. In general, the equation for a quantile function Q(p) is

${\displaystyle {\frac {d^{2}Q}{dp^{2}}}=H(Q)\left({\frac {dQ}{dp}}\right)^{2}}$

augmented by suitable boundary conditions, where

${\displaystyle H(x)=-{\frac {f'(x)}{f(x)}}}$

and ƒ(x) is the probability density function. The forms of this equation, and its classical analysis by series and asymptotic solutions, for the cases of the normal, Student, gamma and beta distributions have been elucidated by Steinbrecher and Shaw (2008). Such solutions provide accurate benchmarks and, in the case of the Student, suitable series for live Monte Carlo use.
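As a worked check, consider the exponential distribution from the simple example above. Its density is ${\displaystyle f(x;\lambda )=\lambda e^{-\lambda x}}$ for ${\displaystyle x\geq 0}$, so ${\displaystyle H(x)=-f'(x)/f(x)=\lambda }$ and the equation reduces to ${\displaystyle Q''=\lambda (Q')^{2}}$. The closed-form quantile ${\displaystyle Q(p)=-\ln(1-p)/\lambda }$ satisfies it, since

${\displaystyle Q'(p)={\frac {1}{\lambda (1-p)}},\qquad Q''(p)={\frac {1}{\lambda (1-p)^{2}}}=\lambda \left(Q'(p)\right)^{2}.}$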

## References

1. "Archived copy" (PDF). Archived from the original (PDF) on March 24, 2012. Retrieved March 25, 2012.
2. Gilchrist, W. (2000). Statistical Modelling with Quantile Functions. ISBN 1-58488-174-7.
3. Jaeckel, P. (2002). Monte Carlo methods in finance.
4. Steinbrecher, G.; Shaw, W.T. (2008). "Quantile mechanics". European Journal of Applied Mathematics. 19 (2): 87–112. doi:10.1017/S0956792508007341.
5. Wichura, M.J. (1988). "Algorithm AS241: The Percentage Points of the Normal Distribution". Applied Statistics. Blackwell Publishing. 37 (3): 477–484. doi:10.2307/2347330. JSTOR 2347330.
6. Computational Finance: Differential Equations for Monte Carlo Recycling
7. Shaw, W.T. (2006). "Sampling Student's T distribution – Use of the inverse cumulative distribution function". Journal of Computational Finance. 9 (4): 37–73.
8. Karvanen, J. (2006). "Estimation of quantile mixtures via L-moments and trimmed L-moments". Computational Statistics & Data Analysis. 51 (2): 947–956. doi:10.1016/j.csda.2005.09.014.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.