The most interesting thing about this article is the claim that GPT-4 has 1 trillion parameters.<p>Microsoft's recent GPT-4 paper [0] hints at the "unprecedented scale of compute and data" used to train the model. What else do we know about the new model itself?<p>[0] <a href="https://www.microsoft.com/en-us/research/publication/sparks-of-artificial-general-intelligence-early-experiments-with-gpt-4/" rel="nofollow">https://www.microsoft.com/en-us/research/publication/sparks-...</a>
Sam was a great pick, and OpenAI will do well under his stewardship. Execution is important, but in this case so are humility and caution. Always interesting to hear about these internal power struggles and dynamics.