This is a very interesting article.<p>A minor quibble - it is very difficult to find a regime in which the running time of the proposed algorithm differs from simply choosing the better of the two obvious algorithms.<p>Line-at-a-time takes O(max(w,h)). Binary searching each row (or column) of the shorter dimension along the longer one takes O(min(w,h) log(max(w,h))). Therefore, implementing both basic algorithms and using the faster one at run-time will run in time O(min(max(w,h), min(w,h) log(max(w,h)))).<p>Which is very close to the proposed run-time - so close, unfortunately, that it is very hard to find a sequence of (w,h) for which the proposed algorithm works asymptotically better than simply doing "the obvious thing".<p>But it is possible! And the lower bounds are nice!
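For concreteness, a rough Python sketch of that run-time dispatch, assuming the classic setting of searching a matrix whose rows and columns are each sorted ascending (the function names and the bit_length-based cost estimate are my own, purely illustrative):

    import bisect

    def saddleback(matrix, target):
        # Line-at-a-time search from the top-right corner: each comparison
        # rules out a whole row or a whole column, so this is O(w + h).
        h, w = len(matrix), len(matrix[0])
        r, c = 0, w - 1
        while r < h and c >= 0:
            v = matrix[r][c]
            if v == target:
                return (r, c)
            if v > target:
                c -= 1  # everything below matrix[r][c] in column c is larger
            else:
                r += 1  # everything left of matrix[r][c] in row r is smaller
        return None

    def binary_search_lines(matrix, target):
        # Binary-search each line of the shorter dimension along the
        # longer one: O(min(w,h) log(max(w,h))).
        h, w = len(matrix), len(matrix[0])
        if h <= w:
            for r in range(h):
                c = bisect.bisect_left(matrix[r], target)
                if c < w and matrix[r][c] == target:
                    return (r, c)
        else:
            for c in range(w):
                lo, hi = 0, h  # manual binary search down column c
                while lo < hi:
                    mid = (lo + hi) // 2
                    if matrix[mid][c] < target:
                        lo = mid + 1
                    else:
                        hi = mid
                if lo < h and matrix[lo][c] == target:
                    return (lo, c)
        return None

    def best_of_both(matrix, target):
        # Pick whichever basic algorithm has the smaller predicted cost,
        # for O(min(max(w,h), min(w,h) log(max(w,h)))) overall.
        if not matrix or not matrix[0]:
            return None
        h, w = len(matrix), len(matrix[0])
        log_max = max(w, h).bit_length()  # rough stand-in for log2(max(w,h))
        if w + h <= min(w, h) * log_max:
            return saddleback(matrix, target)
        return binary_search_lines(matrix, target)

E.g. best_of_both([[1, 4, 7], [2, 5, 8], [3, 6, 9]], 5) returns (1, 1).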
I wrote a version in Python: <a href="https://gist.github.com/agfor/6225437" rel="nofollow">https://gist.github.com/agfor/6225437</a><p>The original code can be used under CC BY-SA 3.0 since it was posted on Stack Overflow: <a href="http://stackoverflow.com/a/2468729/500584" rel="nofollow">http://stackoverflow.com/a/2468729/500584</a>
Great article. I think considering lower bounds and building intuition about a problem before attempting to solve it is a major step in algorithm design that lots of people miss. If you don't really know your problem well, how can you ever be certain that your solution is good, let alone optimal?
This is a common technique: switching between two algorithms. IIRC, I never saw it covered in an undergrad algorithms course.<p>I wrote about another problem that uses this technique: mixing ternary search and linear search to find the minimum of an array that first decreases and then increases.<p><a href="http://www.chaoxuprime.com/posts/2013-07-27-find-the-minimum-of-an-array.html" rel="nofollow">http://www.chaoxuprime.com/posts/2013-07-27-find-the-minimum...</a>
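For readers who don't want to click through: a minimal sketch of one way such a hybrid can look, assuming the array strictly decreases and then strictly increases (the linked post handles the general case and the actual run-time analysis; the cutoff value below is arbitrary):

    def find_min(a, cutoff=8):
        # Ternary search shrinks the interval by a third per step; once the
        # interval is small, finish with a plain linear scan.
        lo, hi = 0, len(a) - 1
        while hi - lo > cutoff:
            m1 = lo + (hi - lo) // 3
            m2 = hi - (hi - lo) // 3
            if a[m1] < a[m2]:
                hi = m2 - 1  # a[m2] can't be the minimum
            else:
                lo = m1 + 1  # a[m1] can't be the minimum
        return min(a[lo:hi + 1])

    print(find_min([9, 7, 4, 2, 1, 3, 6, 8, 12]))  # 1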