A spokesperson for OpenAI, the developer of AI chatbot ChatGPT, tells New Scientist: “We recognize training large models can be energy-intensive and is one of the reasons we are constantly working to improve efficiencies. We give considerable thought to the best use of our computing power.”
There are signs that smaller AI models are now approximating the capabilities of larger ones, which could bring energy savings, says Thomas Wolf, co-founder of AI company Hugging Face. Mistral 7B and Meta’s Llama 2 are 10 to 100 times smaller than GPT-4, the AI behind ChatGPT, and can do many of the same things, he says. “Not everyone needs GPT-4 for everything, just like you don’t need a Ferrari to go to work.”
–New Scientist