Either I haven’t seen this before or I’ve forgotten it, but it’s surprising because I use the sum of independent uniform variables every once in a while — the sum of two is a tent function, the sum of three is a smooth piecewise-quadratic lump, and the sum of many tends toward a normal distribution. And the distribution is easily calculated as the convolution of the input box functions (the uniform densities). Looking it up just now, I learned the sum of uniform variables is called the Irwin–Hall distribution (aka the uniform sum distribution).<p>The min of two random variables has the opposite effect to the max in this video. And now I’m curious: if we use the function definition of min/max (the nth root of the sum of the nth powers of the arguments), there is a continuum from min to sum to max, right? Are there useful applications of this generalized distribution? Does it already have a name?
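A quick sketch of that continuum, assuming the "nth root of the sum of nth powers" combiner described above (for positive arguments, the exponent p = 1 recovers the sum, large positive p approaches the max, and large negative p approaches the min):

```python
# p-norm-style combiner: (x^p + y^p)^(1/p), for positive x and y.
# p = 1 gives the plain sum; p -> +inf tends to max(x, y);
# p -> -inf tends to min(x, y).
def power_combine(x, y, p):
    return (x**p + y**p) ** (1.0 / p)

x, y = 0.3, 0.8
print(power_combine(x, y, 1))    # 1.1 (the sum)
print(power_combine(x, y, 50))   # ~0.8 (close to the max)
print(power_combine(x, y, -50))  # ~0.3 (close to the min)
```

This only interpolates the combining function itself; what the resulting distribution of the combined random variables looks like for intermediate p is exactly the open question above.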
If X1...Xn are independently uniformly distributed between 0 and 1 then:<p>P(max(X1 ... Xn) < x) =<p>P(X1 < x and X2 < x ... and Xn < x) =<p>P(X1 < x) P(X2 < x) ... P(Xn < x) =<p>x^n<p>Also,<p>P(X^{1/n} < x) = P(X < x^n) = x^n<p>I guess I am just an old man yelling at clouds, but it seems <i>so</i> strange to me that one would bother checking this with a numerical simulation. Is this a common way to think about, or teach, mathematics to computer scientists?
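For what it’s worth, the identity above takes only a few lines to check empirically anyway — a sketch (assuming n = 3 and plain standard-library Python) comparing the empirical CDFs of max(X1, ..., Xn) and X^(1/n) against x^n:

```python
import random

# Both max(X1, ..., Xn) and X^(1/n), for i.i.d. uniform(0, 1) variables,
# have CDF P(. <= x) = x^n; estimate each by Monte Carlo.
def cdf_max(n, x, trials=100_000):
    hits = sum(max(random.random() for _ in range(n)) <= x
               for _ in range(trials))
    return hits / trials

def cdf_root(n, x, trials=100_000):
    hits = sum(random.random() ** (1.0 / n) <= x for _ in range(trials))
    return hits / trials

n, x = 3, 0.7
print(cdf_max(n, x), cdf_root(n, x), x**n)  # all close to 0.343
```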
Front page material? P(max{X_1, X_2} <= x) = P(X_1 <= x, X_2 <= x) = P(X_1 <= x) P(X_2 <= x) = x·x = x^2. P(sqrt(X_3) <= x) = P(X_3 <= x^2) = x^2.
It is late in the day when midgets cast long shadows.
Just a side comment on what a great little video.<p>Short, to the point, and the illustrations/animations actually helped convey the message.<p>Would be super cool if someone could recommend some social media account/channel with collections of similar quality videos (for any field).
Matt Parker's video on Square Roots and Maxima: <a href="https://www.youtube.com/watch?v=ga9Qk38FaHM" rel="nofollow">https://www.youtube.com/watch?v=ga9Qk38FaHM</a>