Here's the solution I thought of, which is pretty similar.

Start with the case definition of min. Notice that min(a,b) <= a, min(a,b) <= b, and min(a,b) in {a,b}.

Let d = min(a, min(b,c)) and e = min(min(a,b), c). Then d <= a, d <= b, d <= c, and d in {a,b,c}. The same goes for e.

Since e is in {a,b,c} and d <= a, d <= b, d <= c, it must follow that d <= e. And by symmetry e <= d. So d = e.
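If you want that machine-checked, here is roughly the same antisymmetry argument as a Lean 4 sketch (assuming Mathlib; it uses le_min in place of the explicit "d is in {a,b,c}" step, and the name min_assoc_sketch is just made up to avoid clashing with the library lemma):

    import Mathlib

    -- Sketch: d ≤ a, d ≤ b, d ≤ c forces d ≤ e, and symmetrically e ≤ d,
    -- so d = e by antisymmetry of ≤.
    theorem min_assoc_sketch {α : Type*} [LinearOrder α] (a b c : α) :
        min a (min b c) = min (min a b) c :=
      le_antisymm
        (le_min
          (le_min (min_le_left a _) ((min_le_right a _).trans (min_le_left b c)))
          ((min_le_right a _).trans (min_le_right b c)))
        (le_min
          ((min_le_left _ c).trans (min_le_left a b))
          (le_min ((min_le_left _ c).trans (min_le_right a b)) (min_le_right _ c)))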
The Dedekind cut formulation of the real numbers is that a real number is defined to be a nonempty "downwards closed" subset of the rational numbers. Then the min operation corresponds to taking intersections, and taking intersections is associative.

It's cool they figured something like this out from first principles.

edit: oops, somehow didn't notice arnarbi already mentioned this!
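Spelled out a little (a sketch, writing A_x for the cut of x):

    A_x = \{\, q \in \mathbb{Q} : q < x \,\}, \qquad A_{\min(x,y)} = A_x \cap A_y,
    A_{\min(x,\min(y,z))} = A_x \cap (A_y \cap A_z) = (A_x \cap A_y) \cap A_z = A_{\min(\min(x,y),z)}

and since x -> A_x is injective, min(x, min(y,z)) = min(min(x,y), z).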
Similarly you can show that gcd and lcm are associative, once you know their universal properties (for gcd, being the one common divisor that every common divisor divides; for lcm, the corresponding dual property). The same goes for the associativity (up to a unique diagram-fitting isomorphism) of products (or even fibered products) in any category, when they exist.

The really high horse here is called "Yoneda embedding", but there is not much gained from going that high.
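To make the gcd case concrete (a sketch using only the universal property, for nonnegative integers): for every d,

    d \mid \gcd(a,\gcd(b,c))
      \iff d \mid a \ \text{and}\ d \mid \gcd(b,c)
      \iff d \mid a,\ d \mid b,\ d \mid c
      \iff d \mid \gcd(a,b) \ \text{and}\ d \mid c
      \iff d \mid \gcd(\gcd(a,b),c)

Taking d to be each side in turn shows the two sides divide each other, so they are equal.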
I think a shorter way to write that is to first note the downset representation of reals. Then your definition of min is just set intersection and there's nothing more to show.
> This is the definition that leads to the ugly proof by cases.

Right there is the problem. There's nothing wrong with proof by cases. People suck at it, but machines do it just fine. I ran into this years ago in the early days of program proving. The kind of proofs that mathematicians like are not the kind you really want to use to get work done by machine.
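(As a throwaway illustration of the "machines do it just fine" point, and not anyone's actual method: a hypothetical brute-force check in Python, which is of course a finite sanity check rather than a proof.)

    # Exhaustively check associativity of min over a small range of integers;
    # the machine grinds through every case without complaint.
    R = range(-5, 6)
    assert all(min(a, min(b, c)) == min(min(a, b), c)
               for a in R for b in R for c in R)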
> The book expected them to do a proof by cases, with some sort of case split on the order of a, b, and c. What they turned in was mostly pretty good, actually, but while grading it I became disgusted with the whole thing and thought there has to be a better way.

This is how it's done in the Idris standard library[1]:

    total minimumAssociative : (l,c,r : Nat) -> minimum l (minimum c r) = minimum (minimum l c) r
    minimumAssociative Z c r = Refl
    minimumAssociative (S k) Z r = Refl
    minimumAssociative (S k) (S j) Z = Refl
    minimumAssociative (S k) (S j) (S i) = rewrite minimumAssociative k j i in Refl

[1] Or prelude, or whatever it's called. https://github.com/idris-lang/Idris-dev/blob/master/libs/prelude/Prelude/Nat.idr#L772
I find proofs like this weird.

I can never quite understand what it means to say anything beyond: "whichever of {a,b,c} is the smallest is the answer to both sides."

*Who is this audience* for which all this extra ink carries extra meaning? *What additional insight* could you possibly have communicated to them?