The human brain has roughly 20 PFLOPS of compute according to Ray Kurzweil.[0]

Currently, a 42U rack of A100s also has roughly 20 PFLOPS of compute.[1]

If Moore's Law continues, doubling every 18 months, that's about 67 doublings in 100 years, so the same 42U rack could theoretically reach 20 × 2^66.7 ≈ 2.4 × 10^21 PFLOPS of compute.

This equates to the power of roughly 1.2 × 10^20, or 120 quintillion, human brains.

A single rack could hypothetically hold the computational power of around 15 billion humanities (at ~8 billion people each) within a century.

[0] https://en.wikipedia.org/wiki/Computer_performance_by_orders_of_magnitude#Petascale_computing_(1015)

[1] https://en.wikipedia.org/wiki/Nvidia_DGX#DGX_Station_A100
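The extrapolation is easy to re-derive with a short script. All inputs are the figures assumed above (20 PFLOPS per brain and per rack, an 18-month doubling period, ~8 billion people per humanity), not measured values:

```python
# Back-of-envelope check of the Moore's Law extrapolation above.
# Every constant here is an assumption from the comment, not a measurement.
BRAIN_PFLOPS = 20          # Kurzweil's estimate for one human brain
RACK_PFLOPS = 20           # a 42U rack of A100s today
DOUBLING_YEARS = 1.5       # Moore's Law: doubling every 18 months
YEARS = 100
HUMANITY = 8e9             # rough world population

doublings = YEARS / DOUBLING_YEARS            # ~66.7 doublings
future_pflops = RACK_PFLOPS * 2 ** doublings  # ~2.4e21 PFLOPS
brains = future_pflops / BRAIN_PFLOPS         # ~1.2e20 brain-equivalents
humanities = brains / HUMANITY                # ~1.5e10 humanities

print(f"{future_pflops:.1e} PFLOPS = {brains:.1e} brains = {humanities:.1e} humanities")
```

Changing any single assumption (especially the doubling period) swings the final figure by orders of magnitude, which is the usual hazard of compounding over a century.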
Extrapolating Moore's Law out another hundred years seems highly optimistic. That would mean roughly 10^20 times more transistors in the same space than today. You're not going to be making such transistors out of atoms...
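The atom objection can be quantified with a quick sketch. The feature size and atomic spacing below are rough assumed figures (~5 nm features today, ~0.2 nm silicon atomic spacing), chosen only to show the order of magnitude:

```python
import math

# How many more density doublings fit before one feature is one atom wide?
# Both sizes are rough assumptions for illustration, not process-node specs.
FEATURE_NM = 5.0       # very rough current transistor feature size
ATOM_NM = 0.2          # approximate silicon atomic spacing
DOUBLING_YEARS = 1.5   # Moore's Law cadence assumed in the parent

# Each density doubling halves the area per transistor,
# i.e. shrinks linear dimensions by sqrt(2).
linear_shrink = FEATURE_NM / ATOM_NM       # ~25x linear headroom
doublings = math.log2(linear_shrink ** 2)  # ~9.3 density doublings left
years = doublings * DOUBLING_YEARS         # ~14 years at the assumed cadence

print(f"~{doublings:.1f} doublings (~{years:.0f} years) until features hit atomic scale")
```

Under these assumptions, shrinking alone buys on the order of a decade, not a century, so the 10^20 factor would have to come from something other than smaller transistors.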
Sure, you can add two floats 2e16 times a second, but can you do anything useful with it? I know my computing tasks have some branches in there, and I also occasionally need to wait for data. Anyway, there won't be anyone smart enough to write the AI program in 100 years, so there will never be one brain-equivalent.
It takes OpenAI millions of dollars to train ChatGPT and keep it running, they basically can't get enough GPUs at this point, and it's so far from AGI that it's not even funny. In 100 years we may have AGI, or we may not. I wouldn't hold my breath.