Von Neumann's solution is clever, but extremely inefficient in terms of how many flips you need per random bit produced.

More interesting methods produce a stream of unbiased random bits out of a stream of flips, and you can compare their asymptotic efficiency.

A simple method that does better than von Neumann's:

Produce 1,000 coin flips. Say you get k heads and 1,000-k tails.

Produce a table of all possible sequences of length 1,000 with k heads and 1,000-k tails.

Look up the position of your actual realized sequence in that table. That position's index is a uniform random integer between 1 and the size of the table, which is the binomial coefficient C(1,000, k).

Use your favourite method to extract unbiased bits from that uniform random integer.

The only requirement for the above method to work is that the coin flips are i.i.d., i.e. independent and identically distributed.

(In practice you probably don't want to actually construct the table, but instead compute the index directly, as if you had constructed the table.

You might also want to work with an arbitrary number of flips instead of a fixed 1,000, or even adapt the method to an arbitrary-length stream of coin flips whose length you don't know in advance.

Also in practice, you don't need to store the whole sequence. What you do is keep a count of how many heads and tails you've seen so far, while feeding the individual coin flips into an 'entropy pool'. Your head/tail count informs your estimate of how many bits of entropy you have in your pool. (It basically feeds into the formula for how many possible arrangements of the flips there are, similar to the fixed-size method suggested above.)

You generate your unbiased bits by draining the entropy pool.

The method of entropy pools described here is pretty much how /dev/random works on Linux.)
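
Here's a rough Python sketch of the table-free version, assuming i.i.d. flips. The function names are just illustrative, and the bit-extraction step is only one choice of 'favourite method':

```python
from math import comb

def rank_among_rearrangements(flips):
    """Lexicographic rank of `flips` (sequence of 0s and 1s, 1 = heads)
    among all sequences of the same length with the same number of heads.
    Returns (rank, total) with 0 <= rank < total = C(n, k)."""
    n, k = len(flips), sum(flips)
    rank, ones_left = 0, k
    for i, bit in enumerate(flips):
        if bit == 1:
            # Every sequence that matches the prefix so far but has a
            # tail in this position comes earlier in lexicographic order.
            rank += comb(n - i - 1, ones_left)
            ones_left -= 1
    return rank, comb(n, k)

def unbiased_bits(x, m):
    """Extract fair bits from x, uniform on {0, ..., m-1}. One simple
    'favourite method': split the range at the largest power of two,
    and either emit those binary digits or recurse on the remainder."""
    bits = []
    while m >= 2:
        t = 1 << (m.bit_length() - 1)   # largest power of two <= m
        if x < t:
            # Conditioned on landing here, x is uniform on [0, t),
            # so its log2(t) binary digits are fair and independent.
            bits.extend((x >> i) & 1 for i in range(t.bit_length() - 1))
            break
        # Otherwise x - t is uniform on [0, m - t); keep going.
        x, m = x - t, m - t
    return bits

# Example: 10 flips of a (possibly biased) coin.
flips = (1, 0, 1, 1, 0, 0, 0, 1, 0, 1)
rank, total = rank_among_rearrangements(flips)
print(unbiased_bits(rank, total))
```

For n flips with k heads this yields roughly log2 C(n, k) bits, which for large n approaches n times the entropy of the coin; that's why the asymptotic efficiency beats von Neumann's scheme.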
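
And a toy version of the head/tail bookkeeping for the streaming variant. This is purely illustrative: it only does the counting and the entropy estimate, not the mixing of raw flips into a pool or the extraction of output bits that a real pool like the one behind /dev/random performs:

```python
from math import comb, log2

class FlipEntropyAccount:
    """Toy bookkeeping only: credit log2(C(n, k)) bits of entropy for
    n i.i.d. flips containing k heads, i.e. the log of the number of
    possible arrangements of the flips seen so far."""
    def __init__(self):
        self.heads = 0
        self.tails = 0

    def observe(self, is_head):
        if is_head:
            self.heads += 1
        else:
            self.tails += 1

    def credited_bits(self):
        n = self.heads + self.tails
        return log2(comb(n, self.heads)) if n else 0.0
```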