Noob question: It sounds like the claim is "we learned it's cheap to copy an already-built model from its outputs," but is it still expensive to train a new (better) base model?

If so, is the concern mostly that there's now little moat left for those who pay to train the better base models?