# Maximum (random variable)

This page is a stub, meaning it contains little or minimal information and is on a to-do list for expansion. The message provided is:
This page is currently little more than notes, beware. Alec (talk) 18:56, 26 November 2017 (UTC)

## Definition notes

Let [ilmath]X_1,\ \ldots,\ X_n[/ilmath] be i.i.d. random variables sampled from a distribution, [ilmath]X[/ilmath], and additionally let [ilmath]M:\eq\Max(X_1,\ldots,X_n)[/ilmath] for short.

• $\P{\Max(X_1,\ldots,X_n)\le x}\eq\P{X_1\le x}\Pcond{X_2\le x}{X_1\le x}\cdots\Pcond{X_n\le x}{X_1\le x\cap\cdots\cap X_{n-1}\le x}$
$\eq\prod^n_{i\eq 1}\P{X_i\le x}$ - provided the [ilmath]X_i[/ilmath] are independent random variables
$\eq\prod^n_{i\eq 1}\P{X\le x}$ - as the [ilmath]X_i[/ilmath] are identically distributed
$\eq \big(\P{X\le x}\big)^n$
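The product formula can be sanity-checked numerically. A minimal Monte Carlo sketch, assuming for illustration the standard uniform distribution (so [ilmath]F(x)\eq x[/ilmath] on [ilmath][0,1][/ilmath]; the parameters [ilmath]n\eq 5[/ilmath] and [ilmath]x\eq 0.8[/ilmath] are arbitrary choices):

```python
import random

# Illustrative check: for n i.i.d. draws from the standard uniform
# distribution, P(Max <= x) should equal (P(X <= x))^n = x^n.
random.seed(0)
n, trials, x = 5, 200_000, 0.8

hits = sum(
    max(random.random() for _ in range(n)) <= x
    for _ in range(trials)
)
empirical = hits / trials
theoretical = x ** n  # (P[X <= x])^n with F(x) = x on [0, 1]

print(empirical, theoretical)
```

The two printed values should agree to within Monte Carlo error (roughly [ilmath]0.001[/ilmath] at this sample size).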

We shall call this [ilmath]F'(x):\eq \big(\P{X\le x}\big)^n[/ilmath] (and use [ilmath]F(x):\eq \P{X\le x} [/ilmath], as is usual for cumulative distribution functions). Caveat: do not confuse the [ilmath]'[/ilmath]s for derivatives. Then:

• the probability density function, $f'(x):\eq \frac{\mathrm{d} }{\mathrm{d} x}\big[F'(x)\big]\Big\vert_{x}$[Note 1] is:
• $f'(x)\eq\ddx{\big(\P{X\le x}\big)^n}{x}$
$\eq\ddx{\big(F(x)\big)^n}{x}$
$\eq n\big(F(x)\big)^{n-1}f(x)$ by the chain rule, herein written [ilmath]nF(x)^{n-1}f(x)[/ilmath] for simplicity.
• so $f'(x)\eq nF(x)^{n-1}f(x)$
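The chain-rule step can be verified with a finite-difference approximation. A small sketch, again assuming the standard uniform distribution (so [ilmath]F(x)\eq x[/ilmath], [ilmath]f(x)\eq 1[/ilmath], and the density of the maximum reduces to [ilmath]nx^{n-1}[/ilmath]):

```python
# Illustrative check of f'(x) = n F(x)^(n-1) f(x) for Rect([0, 1]),
# where F(x) = x and f(x) = 1.
n, x, h = 5, 0.7, 1e-6

def cdf_of_max(t):
    return t ** n  # F'(t) = F(t)^n with F(t) = t on [0, 1]

# Central difference approximating d/dx F'(x)
numeric = (cdf_of_max(x + h) - cdf_of_max(x - h)) / (2 * h)
closed_form = n * x ** (n - 1)  # chain-rule result: n F(x)^(n-1) f(x)

print(numeric, closed_form)
```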

## Expectation of the maximum

### Special cases

• For [ilmath]X\sim\text{Rect}([a,b])[/ilmath]:
• $\E{\Max(X_1,\ldots,X_n)}\eq \frac{nb+a}{n+1}$
• This is actually simplified from the perhaps more useful $a+\frac{n}{n+1}(b-a)$; recognising [ilmath](b-a)[/ilmath] as the width of the uniform distribution, we see that the maximum slightly "underestimates" [ilmath]a+(b-a)\eq b[/ilmath]. From this we can actually obtain a very useful unbiased estimator.
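This expectation can be checked by simulation. A sketch with hypothetical parameters [ilmath]a\eq 2[/ilmath], [ilmath]b\eq 10[/ilmath], [ilmath]n\eq 4[/ilmath] (any values would do):

```python
import random

# Monte Carlo estimate of E[Max(X_1, ..., X_n)] for X ~ Rect([a, b]),
# compared against the closed form (n*b + a)/(n + 1).
random.seed(1)
a, b, n, trials = 2.0, 10.0, 4, 200_000

total = sum(
    max(random.uniform(a, b) for _ in range(n))
    for _ in range(trials)
)
empirical = total / trials
theoretical = (n * b + a) / (n + 1)  # equivalently a + (n/(n+1))*(b - a)

print(empirical, theoretical)
```

Here the closed form gives [ilmath]\frac{4\cdot 10+2}{5}\eq 8.4[/ilmath], and the empirical mean should land very close to it.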

#### [ilmath]a\eq 0[/ilmath] case

Suppose that [ilmath]a\eq 0[/ilmath]; then to find [ilmath]b[/ilmath] we could observe that $\E{X}\eq\frac{b}{2}$, so twice the average of our sample would have expectation [ilmath]\eq b[/ilmath] - this is indeed true.

However, note in this case that the maximum has expectation $\E{M}\eq \frac{n}{n+1}b$
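This expectation follows directly by integrating against the density found above: with [ilmath]a\eq 0[/ilmath] we have [ilmath]F(x)\eq\frac{x}{b}[/ilmath] and [ilmath]f(x)\eq\frac{1}{b}[/ilmath] on [ilmath][0,b][/ilmath], so:

```latex
\mathbb{E}[M] = \int_0^b x\,f'(x)\,\mathrm{d}x
             = \int_0^b x\cdot n\left(\frac{x}{b}\right)^{n-1}\frac{1}{b}\,\mathrm{d}x
             = \frac{n}{b^n}\int_0^b x^n\,\mathrm{d}x
             = \frac{n}{b^n}\cdot\frac{b^{n+1}}{n+1}
             = \frac{n}{n+1}\,b
```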

• Thus: $\frac{n+1}{n}\E{M}\eq b$ and so [ilmath]\E{\frac{n+1}{n}M}\eq b[/ilmath]

So, defining [ilmath]M'\eq \frac{n+1}{n}M[/ilmath], we do obtain an unbiased estimator for [ilmath]b[/ilmath] from our biased one.

It can be shown that for [ilmath]n\ge 8[/ilmath], [ilmath]M'[/ilmath] has lower variance (and thus is better) than the twice-the-average estimator; they agree for [ilmath]n\eq 1[/ilmath]. For [ilmath]2\le n\le 7[/ilmath], twice the average has the lower variance and is thus objectively better (or the same, in the [ilmath]n\eq 1[/ilmath] case) as an estimator for [ilmath]b[/ilmath]. Warning: only the following is known. Alec (talk) 18:56, 26 November 2017 (UTC)

• This is only true when comparing the variance of [ilmath]M[/ilmath] (not [ilmath]M'[/ilmath]) to that of [ilmath]\frac{1}{n}\sum^n_{i\eq 1}X_i[/ilmath]; since we double the average, its variance goes up by a factor of 4, making the difference even worse.
• To obtain [ilmath]M'[/ilmath] we multiply [ilmath]M[/ilmath] by a constant, specifically [ilmath]\frac{n+1}{n} [/ilmath], which is only slightly bigger than 1 once [ilmath]n[/ilmath] stops being tiny. This will increase the variance by a factor of this value squared, which will still be "slightly bigger than 1", so [ilmath]M'[/ilmath] is a better estimator; it may, however, move the specific bound of "better for [ilmath]n\ge 8[/ilmath]" further down - this needs to be calculated.

This comparison is independent of [ilmath]b[/ilmath].

Specifically:

• $\text{Var}(M)\eq \left(\frac{n}{n+2}-\frac{n^2}{(n+1)^2}\right)b^2$ - note this is for [ilmath]M[/ilmath] not [ilmath]M'[/ilmath] and
• $\text{Var}\left(\frac{1}{n}\sum^n_{i\eq 1}X_i\right)\eq\frac{b^2}{12n}$
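The variance comparison left open above can be explored by simulation. A sketch (with assumed parameters [ilmath]a\eq 0[/ilmath], [ilmath]b\eq 1[/ilmath], [ilmath]n\eq 4[/ilmath]) comparing the variance of [ilmath]M'[/ilmath] against that of twice the sample mean, both of which are unbiased for [ilmath]b[/ilmath]:

```python
import random
from statistics import fmean, pvariance

# Empirical variances of two unbiased estimators of b for X ~ Rect([0, b]):
#   M' = (n+1)/n * Max(X_1, ..., X_n)
#   twice the sample mean (its variance is 4 times that of the mean itself)
random.seed(2)
b, n, trials = 1.0, 4, 100_000

m_prime, twice_mean = [], []
for _ in range(trials):
    xs = [random.uniform(0, b) for _ in range(n)]
    m_prime.append((n + 1) / n * max(xs))
    twice_mean.append(2 * fmean(xs))

print(pvariance(m_prime))    # scaling Var(M) by ((n+1)/n)^2 gives b^2/(n(n+2))
print(pvariance(twice_mean)) # 4 * b^2/(12n) = b^2/(3n)
```

With these parameters the theoretical values are [ilmath]\frac{1}{24}\approx 0.0417[/ilmath] for [ilmath]M'[/ilmath] and [ilmath]\frac{1}{12}\approx 0.0833[/ilmath] for twice the mean.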

## Notes

1. The [ilmath]x[/ilmath] inside the square brackets is bound to the [ilmath]x[/ilmath] at the base of the [ilmath]\vert[/ilmath]