Friday, April 19, 2024

Costs of Creating Machine Learning Models Are Up Sharply

With the caveat that we must be careful about making linear extrapolations into the future, training costs of state-of-the-art AI models have reached unprecedented levels, according to Stanford University’s Institute for Human-Centered Artificial Intelligence (HAI).


Where OpenAI’s GPT-4 used an estimated $78 million worth of compute to train, Google’s Gemini Ultra required an estimated $191 million, according to HAI.

Source: Stanford University Human-Centered AI report


HAI’s estimates suggest that model training costs have increased significantly. For example, in 2017, the original Transformer model, which introduced the architecture that underpins virtually every modern LLM, cost around $900 to train.


RoBERTa Large, released in 2019 and the state of the art on canonical comprehension benchmarks such as SQuAD and GLUE, cost around $160,000 to train.


Fast-forward to 2023, and training costs for OpenAI’s GPT-4 and Google’s Gemini Ultra are estimated to be around $78 million and $191 million, respectively, according to HAI. 
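
For a rough sense of scale, here is a short Python sketch, a back-of-the-envelope calculation of my own rather than anything from the HAI report, that turns the estimates above into a cumulative growth multiple and an implied compound annual growth rate. The same caution about extrapolation applies: these are point estimates with wide error bars.

# Back-of-the-envelope growth calculation using the HAI cost estimates
# quoted in this post. Read the output as order-of-magnitude only.

costs = {
    2017: 900,           # original Transformer
    2019: 160_000,       # RoBERTa Large
    2023: 191_000_000,   # Gemini Ultra (GPT-4 was ~$78 million the same year)
}

years = sorted(costs)
start, end = years[0], years[-1]

total_growth = costs[end] / costs[start]
cagr = total_growth ** (1 / (end - start)) - 1

print(f"Total growth, {start}-{end}: roughly {total_growth:,.0f}x")
print(f"Implied compound annual growth rate: roughly {cagr:.0%}")

On these numbers, training costs grew by a factor of roughly 200,000 between 2017 and 2023, an implied compound annual growth rate of several hundred percent.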

 

Source: Stanford University Human-Centered AI report


