I was recently reading about Graham's Number (<a href="https://en.wikipedia.org/wiki/Graham%27s_number" rel="nofollow">https://en.wikipedia.org/wiki/Graham%27s_number</a>) and was pondering large numbers (as much as one can) and I was thinking: would it be possible to estimate the total/cumulative number of clock cycles run on every single microprocessor ever made? Would it be possible to mathematically describe how that number is changing along a time axis?
Certainly possible. Just for fun, let's do some quick math. Imagine that we had 100 billion processors running constantly at 10 GHz since ~1970. This would result in ~10^30 clock ticks (10^9 sec * 10^10 ticks/sec * 10^11 processors).<p>That number is child's play compared to numbers like Graham's number. Even the exponents in a power-tower representation of Graham's number are far too large to write down, while 30 is pretty easy to write down. For comparison, there are about 10^80 atoms in the observable universe. That's still a tiny number compared to Graham's number.
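As a sanity check on the arithmetic above (all three factors are deliberately generous round numbers, not measurements), here's the estimate spelled out:

```python
# Deliberately generous upper-bound estimate from the comment above.
seconds_since_1970 = 10**9   # ~50 years is actually ~1.6e9 s; round down to 1e9
ticks_per_second = 10**10    # 10 GHz, well above any real sustained clock rate
processor_count = 10**11     # 100 billion processors running the whole time

total_ticks = seconds_since_1970 * ticks_per_second * processor_count
print(f"~10^{len(str(total_ticks)) - 1} clock ticks")  # prints "~10^30 clock ticks"
```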
As an FYI, any number that is grounded in counting the occurrences of any event across the entire universe, from now until the end of time, won't even come close to Graham's Number or any of the other "immense numbers" in math.
You could try a Fermi approximation:<p>There are 10 billion people * 100 microcontroller (uC) chips / person * 10 years of life / chip * 10^13 microseconds / year * 1 clock cycle / microsecond = ~10^26 clock cycles
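The same back-of-envelope product, with the units as comments (the figures are the rough guesses from above, not measured data):

```python
people = 10**10            # ~10 billion people
chips_per_person = 100     # microcontrollers per person (rough guess)
years_per_chip = 10        # service life of each chip
us_per_year = 10**13       # ~3.15e13 microseconds/year, rounded down
cycles_per_us = 1          # 1 MHz average clock, a low-end guess

total_cycles = (people * chips_per_person * years_per_chip
                * us_per_year * cycles_per_us)
print(f"~10^{len(str(total_cycles)) - 1} clock cycles")  # prints "~10^26 clock cycles"
```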
The single-number answer is going to be utterly dominated by the past 5 years, so it's not too difficult to answer.<p>If you wanted a time series that shows how this number evolves over time, you would end up with a <i>very</i> interesting history-of-tech book. Something like Milton Friedman's "A Monetary History of the United States."<p>I would buy it / crowd-fund it.<p>If all you really want to know is "number of cycles" you should probably research oscillator manufacturers.<p>If you're actually interested in the "volume of compute" you should start with 10-K filings for Intel and Xilinx, and fan out to their competitors. Use market capitalization over time as a filter for inclusion in your tally, as you can't research <i>every</i> manufacturer.
If you think Graham’s number is big, Rayo’s number is a lot bigger - <a href="https://en.m.wikipedia.org/wiki/Rayo%27s_number" rel="nofollow">https://en.m.wikipedia.org/wiki/Rayo%27s_number</a><p>Rayo’s number is actually the value of a function R(n) at n = googol. You can define much bigger numbers just by using a bigger n, e.g. googolplex. For something even bigger still, you can iterate the function. Think about Rayo’s function iterated Rayo’s number times.
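Rayo's function itself is provably uncomputable, but the "iterate the function" trick is ordinary higher-order iteration and can be sketched with any stand-in function (doubling here, purely for illustration):

```python
def iterate(f, n, x):
    """Apply f to x, n times: f(f(...f(x)...))."""
    for _ in range(n):
        x = f(x)
    return x

# Doubling as a toy stand-in for a fast-growing function:
double = lambda x: 2 * x
print(iterate(double, 10, 1))  # prints 1024

# "Rayo's function iterated Rayo's number times" would be, schematically,
# iterate(R, R(10**100), 10**100) -- if R were computable, which it is not.
```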
You might find this useful - Fermi problems: <a href="https://en.m.wikipedia.org/wiki/Fermi_problem" rel="nofollow">https://en.m.wikipedia.org/wiki/Fermi_problem</a><p>You could make a rough model based on educated guesses and perhaps not even be that far off the actual number.