One should also look at Gustafson's Law [0], which is something of a foil to Amdahl's law. The serial portion of a program may not be parallelizable, but who cares if the parallelizable part is what keeps growing in size year over year?

Take this example: if you are a video rendering house, do all of your work during the day, and render at night (for, say, 8 hours), you can get 'n' polygons rendered (or whatever other metric). Now say your next movie calls for twice the number of polygons in each scene. Gustafson's law says that because the growth is in the parallelizable part of the computation (polygons, flops, or whatever other metric), you can get '2n' done in the same 8 hours by roughly doubling the computational performance of the system (see the back-of-the-envelope sketch after the links below).

You do run into a harder problem today with memory-bound (either size or bandwidth) applications, but there are many approaches to addressing that.

[0] https://en.wikipedia.org/wiki/Gustafson%27s_law

Also: here is Gustafson's original 1988 paper... a fantastic read, and winner of the first Gordon Bell award. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.85.6348
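
For concreteness, here is a minimal sketch of the scaled-speedup formula behind this, S(N) = (1 - p) + p * N, where p is the parallel fraction of the workload and N the relative amount of hardware. The 99% parallel fraction is a made-up number for illustration, not something from Gustafson's paper:

    # Gustafson's scaled speedup: S(N) = (1 - p) + p * N
    # p: parallel fraction of the workload, N: relative processor count.
    def gustafson_speedup(p, n):
        return (1 - p) + p * n

    p = 0.99  # assumed parallel fraction; purely illustrative
    for n in (1, 2, 4, 8, 16):
        print(f"{n:3d}x hardware -> {gustafson_speedup(p, n):6.2f}x the polygons in the same 8 hours")

At p = 0.99, doubling the hardware gets you about 1.99x the work in the same wall-clock time, which is why the "render twice the polygons overnight" scaling holds almost exactly, so long as the serial fraction stays small.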