Discussion about this post

Amos Zeeberg:
I think these graphs tell only part of the story: they show just the energy used for training. If I recall correctly, far more energy is used in the inference phase of operation. (For one thing, training happens rarely; inference happens all the time.) But I don't know how the energy demands of inference are changing over time. Maybe inference is scaling more slowly than training. To understand the future of AI energy use, I think you have to look at inference, too.

