The reality is that because of the fast pace of development at other LLM providers, primarily Anthropic, Meta's Llama, Google's Gemini, and even some Chinese LLMs, the gap between OpenAI's models and its competitors' has shrunk substantially. The next crucial battle between cloud providers will be around their custom silicon development. The fourth to enter the »hyperscaler« world is Oracle. Social media is the biggest consumer of GenAI and the first to benefit.
I'm still skeptical of inference vs. training changing much, at least in the next few years. I agree that at steady state inference should be vastly greater than training, but we are in an early experimental phase - there is going to be a lot of appetite to keep hammering on new paths - we've barely seen the multimodal thing take off. We haven't seen any major player try video generation yet - just a few smaller participants like Runway.
Many of those big AGI-size clusters are coming online at the end of this year - we've heard Meta (100K) and xAI's Grok (100K), and I assume it's similar with OpenAI. That is significantly larger than the clusters used to train the largest models we see today, about 4-5x, so we should see a step change in performance from this.
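A minimal back-of-the-envelope sketch of where that 4-5x figure comes from, assuming today's largest frontier models were trained on clusters of roughly 20-25K GPUs versus the announced ~100K GPU builds (both figures are illustrative assumptions, not numbers from the article):

```python
# Rough cluster-size comparison; both GPU counts are assumptions for illustration,
# not figures taken from the article or the comments above.
current_largest_cluster = 25_000   # assumed GPU count behind today's largest trained models
announced_cluster = 100_000        # ~100K GPU clusters announced by Meta and xAI

scale_up = announced_cluster / current_largest_cluster
print(f"Raw scale-up in cluster size: ~{scale_up:.0f}x")  # ~4x, consistent with the 4-5x estimate
```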
Thanks a lot for the great writeup!
Richard, a very useful article. Nice work.
Thank you, I am glad you liked it!