IMO, most of the explanations of Big O that I've seen miss the key feature of Big O (the "killer application", if you will): that it's relatively independent of the exact computational model you use. Sure, you have to decide on the broad strokes of your computational model, but you don't need exact, instruction-level details.<p>This is a pretty huge benefit because it means not every CS paper has to start with a section introducing an abstract machine. The downside, of course, is that you lose some ability to estimate the potential real-world performance of the algorithm, and the notation itself hides differences between algorithms of the same Big O complexity.
I'll try my hand at a "plain English" explanation of an O(n^2) algorithm.<p>It's a function which takes a list of numbers as input. You give it a list of two numbers. It executes four operations before terminating.<p>You give it a list of three numbers. It executes nine operations before terminating.<p>You give it a list of four numbers. It executes sixteen operations before terminating.<p>You see how the time required grows as n^2 with respect to the size of the input? That's an O(n^2) algorithm, and that's why such an algorithm can be bad: a list of 1,000,000 numbers would require a trillion operations.<p>The way this can happen is if you write a double-for loop:<p><pre><code> for x in input:
     for y in input:
         ...  # some constant amount of work involving the pair (x, y)
</code></pre>
So, there you go. You probably won't get a much plainer explanation without oversimplifying and losing meaning.
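<p>If you want to watch that counting happen, here's a minimal Python sketch of the same idea (my own illustration, not part of the original explanation) that tallies how many times the inner body of a double loop runs:<p><pre><code>def count_pair_operations(items):
    # counts how many times the innermost body of a double loop executes
    operations = 0
    for x in items:
        for y in items:
            operations += 1  # stands in for one unit of real work on (x, y)
    return operations

for n in (2, 3, 4, 10, 1000):
    print(n, count_pair_operations(list(range(n))))
    # prints 4, 9, 16, 100, 1000000 -- exactly n^2
</code></pre>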
I always found the mathematical definitions of asymptotic notation the clearest:<p>An algorithm with runtime r(n) is "big O" of f(n) if r(n) <= c * f(n) for some constant c and all sufficiently large n (equivalently, if the limit superior as n -> infinity of r(n) / f(n) is finite).<p>EDIT: See replies below.
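<p>As a rough sanity check of that definition (my own sketch, using a made-up runtime function), you can watch the ratio settle toward a constant for a quadratic r(n):<p><pre><code># hypothetical runtime function with a quadratic leading term
def r(n):
    return 3 * n * n + 5 * n + 7

for n in (10, 100, 1000, 10000):
    print(n, r(n) / (n * n))  # ratio approaches the constant 3, so r(n) is O(n^2)
</code></pre>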
There have not only been discussions of this before, some previous HN contributions have provided significant criticism and enhancements. If you value the advice and expertise of the HN community at all, you should consider reading some of these earlier comments:<p>Here are the previous submissions with the most discussion:<p><a href="https://news.ycombinator.com/item?id=1520552" rel="nofollow">https://news.ycombinator.com/item?id=1520552</a> (24 comments, 1242 days ago)<p>I found them by using this search:<p><a href="https://www.hnsearch.com/search#request/all&q=title%3A%28plain+english+explanation+big%29&sortby=create_ts+desc&start=0" rel="nofollow">https://www.hnsearch.com/search#request/all&q=title%3A%28pla...</a><p>Here are some other submissions:<p><a href="https://news.ycombinator.com/item?id=695988" rel="nofollow">https://news.ycombinator.com/item?id=695988</a> (4 comments)<p><a href="https://news.ycombinator.com/item?id=2344181" rel="nofollow">https://news.ycombinator.com/item?id=2344181</a><p><a href="https://news.ycombinator.com/item?id=3807175" rel="nofollow">https://news.ycombinator.com/item?id=3807175</a><p><a href="https://news.ycombinator.com/item?id=3846993" rel="nofollow">https://news.ycombinator.com/item?id=3846993</a><p><a href="https://news.ycombinator.com/item?id=5164236" rel="nofollow">https://news.ycombinator.com/item?id=5164236</a><p><a href="https://news.ycombinator.com/item?id=5636683" rel="nofollow">https://news.ycombinator.com/item?id=5636683</a><p><a href="https://news.ycombinator.com/item?id=5778469" rel="nofollow">https://news.ycombinator.com/item?id=5778469</a><p>This submission of the same item has some discussion, but mostly of the fact that this item is submitted so often, whether it's a problem, and whether we should do something about it, possibly to make HN a better source of curated discussion:<p><a href="https://news.ycombinator.com/item?id=5785523" rel="nofollow">https://news.ycombinator.com/item?id=5785523</a> (24 comments)<p>The answer seems to be "No".<p>Additionally, you might be interested in reading these:<p><a href="https://news.ycombinator.com/item?id=4655061" rel="nofollow">https://news.ycombinator.com/item?id=4655061</a> : Big-O Misconceptions<p><a href="https://news.ycombinator.com/item?id=5770232" rel="nofollow">https://news.ycombinator.com/item?id=5770232</a> : What does O(log n) mean, exactly?
This is a fine example of Stack* giving huge rankings to very bad answers. Big O is about execution time not complexity. (And describing execution time as "time complexity" as some comments do is ridiculous. A lot of something is not more complicated than a little of something. 4 isn't more complex than π -- and no, I'm not saying the reverse is true either.)<p>A plain English explanation of Big O notation shouldn't take more than a few sentences. It's not that complicated, especially if you don't get into nitty gritty.
Interesting... although his explanation of TSP complexity is unfortunately incorrect (or misstated). The complexity he is describing is specifically that of the naive brute-force search [1] algorithm for finding the shortest tour, not of TSP itself. And a 200-city TSP instance is quite solvable, often even by a mobile device like an iPhone 5 [2]! Indeed, no need for all the time in the universe -- lucky for companies like UPS, who utilize TSP variants regularly.<p>[1] <a href="http://en.wikipedia.org/wiki/Big_O_notation#Orders_of_common_functions" rel="nofollow">http://en.wikipedia.org/wiki/Big_O_notation#Orders_of_common...</a><p>[2] <a href="http://www.math.uwaterloo.ca/tsp/iphone/" rel="nofollow">http://www.math.uwaterloo.ca/tsp/iphone/</a>
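<p>To make the distinction concrete, here is a minimal sketch (mine, not code from either article) of the naive brute-force search whose running time the answer is actually describing. It tries every tour, so the work grows factorially with the number of cities, which is why real solvers rely on branch-and-bound and heuristics instead:<p><pre><code>from itertools import permutations

def brute_force_tsp(distance):
    # distance[i][j] is the cost of travelling from city i to city j.
    # Exhaustively tries every tour starting and ending at city 0,
    # so the running time is O((n-1)!) in the number of cities n.
    n = len(distance)
    best_cost, best_tour = float("inf"), None
    for perm in permutations(range(1, n)):
        tour = (0,) + perm + (0,)
        cost = sum(distance[a][b] for a, b in zip(tour, tour[1:]))
        if cost < best_cost:
            best_cost, best_tour = cost, tour
    return best_cost, best_tour

# four cities with symmetric distances
d = [[0, 2, 9, 10],
     [2, 0, 6, 4],
     [9, 6, 0, 3],
     [10, 4, 3, 0]]
print(brute_force_tsp(d))  # (18, (0, 1, 3, 2, 0))
</code></pre>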
Suppose we have a human-written program that's N bits long. It takes an input and generates an infinite output by performing some hashing of the input. We run it on an M-bit input to get some output, then stop it manually.<p>Now suppose we have an input and the output made by this program, but we don't have the program, and we want to discover by brute force what the program was.<p>The brute-force algorithm will create every possible program up to N bits long and run each one until either a difference in outputs is found, the correct output is found, or too much time has passed.<p>How do you come up with the order notation for this brute-force algorithm?
The author of this answer, cletus, is also a member of this community and has quite a high karma.<p>See his content here: <a href="https://news.ycombinator.com/user?id=cletus" rel="nofollow">https://news.ycombinator.com/user?id=cletus</a>
Aaaand it's wrong, like 99.9% of other explanations:<p>f(n)=O(n^2) actually does NOT mean that f(n) asymptotically grows as fast as n^2; it means that f grows “no faster” than n^2.<p>“As fast as” is f(n)=Θ(n^2).<p>Edit: oh, and Big-O is not “a relative representation of the complexity of an algorithm”
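<p>A concrete example of the difference (my own, not the parent's): take f(n) = n. The ratio f(n)/n^2 stays below a constant, so f(n) = O(n^2) holds, but it also tends to zero, so no positive lower-bound constant exists and f(n) = Θ(n^2) fails:<p><pre><code>def f(n):
    return n  # linear, so it is O(n^2) but not Theta(n^2)

for n in (10, 100, 1000, 10000):
    print(n, f(n) / (n * n))  # bounded above (O holds) but tends to 0 (Theta fails)
</code></pre>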
A lot of words for "the largest number of times a loop/operation might have to run, for a list of size 'n'".<p>Honestly, the language of maths too often obfuscates rather than simplifies.