Myth disproved: a ChatGPT request consumes less than 1 Wh
ChatGPT has become an integral part of many users' workflows, but with its popularity, concerns about high power consumption have also grown. However, a new study by Epoch AI refutes these concerns.
Here's What We Know
Researcher Joshua Yu calculated that a single question to ChatGPT (based on GPT-4o) consumes about 0.3 Wh (watt-hours) of energy, far less than previously assumed. For comparison, earlier estimates put each chatbot query at around 3 Wh, but that figure was based on outdated data and less efficient hardware.
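The numbers above amount to simple arithmetic. A minimal sketch of the comparison (the 1,000-queries figure below is an illustrative assumption, not from the article):

```python
# Back-of-envelope comparison of the old and new per-query energy estimates.
# Figures from the article: ~0.3 Wh per query (Epoch AI) vs the older ~3 Wh estimate.

NEW_ESTIMATE_WH = 0.3  # Epoch AI estimate per ChatGPT (GPT-4o) query
OLD_ESTIMATE_WH = 3.0  # earlier, widely cited estimate

# How much higher was the old estimate?
ratio = OLD_ESTIMATE_WH / NEW_ESTIMATE_WH
print(f"Old estimate is {ratio:.0f}x the new one")  # prints "Old estimate is 10x the new one"

# Energy for a hypothetical heavy user making 1,000 queries (assumption for illustration):
queries = 1_000
total_kwh = queries * NEW_ESTIMATE_WH / 1000  # Wh -> kWh
print(f"{queries} queries ~ {total_kwh:.2f} kWh")  # prints "1000 queries ~ 0.30 kWh"
```

Under the new estimate, even a thousand queries add up to roughly a third of a kilowatt-hour.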
Joshua notes that this is far less energy than most household appliances use. In addition, tech companies are actively working to improve the energy efficiency of artificial intelligence: new generations of AI chips are becoming not only more powerful but also more energy-efficient.
However, as AI develops, overall electricity consumption will still rise. Models aimed at more complex tasks will require more computing power, and future AI agents are expected to work at a human level across various fields, driving an increase in the number of data centres and, consequently, in energy consumption.
"AI will continue to evolve, and its training will require ever more resources. In the future, these systems will work far more intensively, performing more complex tasks than they do now," the researcher concluded.
Source: Android Headlines