As I read, it slowly dawned on me that N-bit division would require (in the worst case) an (N + 1)-bit magic number. How in the world could this be solvable in the general case? My assumption was that we just did division for those numbers the "old-fashioned" way.<p>But, at the end of the day, the <i>actual</i> solution, ((n − q) ≫ 1) + q, is ingenious. Fantastic article!
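For the curious, here is a minimal C sketch of how the trick plays out for d = 7, one of the divisors whose 32-bit magic number spills into 33 bits (ceil(2^35 / 7) = 2^32 + 0x24924925). The constant and shift counts follow the standard Hacker's Delight table, so treat this as my reading of the technique, not the article's exact code:

    #include <assert.h>
    #include <stdint.h>
    #include <stdio.h>

    /* Division by 7: the exact magic number is
       ceil(2^35 / 7) = 2^32 + 0x24924925, i.e. 33 bits wide.
       Multiply by the low 32 bits, then recover the implicit
       high bit with the overflow-free average
       ((n - q) >> 1) + q == (n + q) >> 1. */
    static uint32_t divu7(uint32_t n) {
        uint32_t q = (uint32_t)(((uint64_t)n * 0x24924925u) >> 32);
        uint32_t t = ((n - q) >> 1) + q; /* (n + q) >> 1, no 33-bit carry */
        return t >> 2;                   /* remaining shift: 35 - 32 - 1 */
    }

    int main(void) {
        for (uint32_t n = 0; n < 10000000u; n++)
            assert(divu7(n) == n / 7);
        assert(divu7(UINT32_MAX) == UINT32_MAX / 7);
        puts("all checks passed");
        return 0;
    }

The subtraction never underflows (q ≤ n always holds), so the intermediate sum that would need N + 1 bits is never materialized, which is exactly the problem the trick solves.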
For more algorithmic fun like this, check out the classic Hacker's Delight book (<a href="https://en.wikipedia.org/wiki/Hacker%27s_Delight" rel="nofollow">https://en.wikipedia.org/wiki/Hacker%27s_Delight</a>)