So we now have models with 0.5 trillion parameters, each parameter being the weight of a single connection in the network.<p>Trillion-parameter models are surely within reach in the near term, and that's only two or three orders of magnitude short of the number of synapses in the human brain, which is in the hundreds of trillions, give or take. To paraphrase the popular saying: a trillion here, a trillion there, and pretty soon you're talking really big numbers.<p>I know the figures aren't an apples-to-apples comparison, but I still find myself <i>in awe</i> at how far we've come in just the last few years, to the point that we can realistically contemplate dense neural networks with hundreds of trillions of parameters being used for real-world applications in our lifetimes.<p>We sure live in interesting times.