This is an odd framing.<p>Training has become much more accessible, thanks to a variety of factors (ASICs, offerings from public clouds, innovations on the data science side). Comparing it to Moore's Law doesn't make sense to me, though.<p>Moore's Law is an observation about the pace of increase of one tightly scoped quantity: the number of transistors on a chip.<p>The cost of training a model is not a single "thing"; it's the cumulative effect of many factors, including ones as fluid as cloud pricing.<p>It's completely possible that I'm missing something obvious, though.