ChatGPT may not be as power-hungry as once assumed: only about 0.3 watt-hours per query, far below earlier estimates!

ChatGPT, OpenAI’s AI-powered chatbot, may not be as energy-intensive as previously assumed. According to a new study by Epoch AI, the power consumption of ChatGPT queries is significantly lower than earlier estimates, though its energy demands are still expected to grow over time.
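To put Epoch AI's figure of roughly 0.3 watt-hours per query in perspective, a quick back-of-envelope calculation shows what that implies for an individual user. The queries-per-day figure below is a hypothetical usage level for illustration, not a number from the study:

```python
WH_PER_QUERY = 0.3    # Epoch AI's estimate for a typical ChatGPT query
QUERIES_PER_DAY = 15  # hypothetical usage level, not from the study

daily_wh = WH_PER_QUERY * QUERIES_PER_DAY
annual_kwh = daily_wh * 365 / 1000  # convert Wh to kWh over a year

print(f"{daily_wh:.1f} Wh/day, {annual_kwh:.2f} kWh/year")
```

At that usage rate, a year of queries comes to well under 2 kWh, a tiny fraction of a typical household's annual electricity consumption.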

The study highlights the efficiency of models like GPT-4o and suggests using them sparingly to reduce energy footprints. While advances in AI will likely increase energy requirements, continued improvements in efficiency offer some mitigation.

Despite this lower-than-expected consumption, AI’s overall energy impact remains a contentious issue. Last week, over 100 organisations signed an open letter urging AI companies and regulators to prevent AI data centres from depleting natural resources and increasing reliance on nonrenewable energy.

Despite the current low consumption figures, Epoch AI analyst Joshua You, who conducted the study, expects energy consumption to rise in the future. As AI technology advances, training these models may demand more energy, and future AI systems may undertake more complex tasks, thereby consuming more electricity.

OpenAI and its investment partners plan to invest billions of dollars in new AI data centre projects over the next few years. As the technology develops, the industry's focus is also shifting towards reasoning models, which are more capable at handling complex tasks but require more computational power and electricity.

For those concerned about their AI energy footprint, You suggests using ChatGPT less frequently or opting for smaller models with lower computational demands.
