The Hidden Cost of Talking to AI: How Our Chatbots Are Burning Through Energy
From polite prompts to massive models, artificial intelligence is quietly racking up a carbon footprint. Should we be worried?
Artificial intelligence is now stitched into everyday life—powering search results, summarising emails, generating images, even offering career advice. But every time you ask ChatGPT a question, it draws on data centres packed with energy-hungry processors. Just one short response uses enough electricity to power a 10-watt LED lightbulb for up to 30 seconds. Multiply that across millions of users and you get the picture.
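The lightbulb comparison can be sanity-checked with a quick back-of-envelope calculation. The inputs below are assumptions: the 10-watt/30-second figure comes from the article itself, and the 100 million responses per day is a hypothetical round number for illustration, not a reported statistic.

```python
# Back-of-envelope check of the article's per-response figure.
# All inputs are assumptions, not measured data.

BULB_WATTS = 10     # a 10-watt LED lightbulb
SECONDS_LIT = 30    # "up to 30 seconds" per short response

# Energy per response in watt-hours: power (W) x time (hours)
wh_per_response = BULB_WATTS * SECONDS_LIT / 3600
print(f"Per response: {wh_per_response:.3f} Wh")  # roughly 0.083 Wh

# Scale to a hypothetical 100 million responses per day
DAILY_RESPONSES = 100_000_000
daily_mwh = wh_per_response * DAILY_RESPONSES / 1_000_000
print(f"At {DAILY_RESPONSES:,} responses/day: {daily_mwh:.1f} MWh/day")
```

Each response is tiny on its own, but at that assumed volume the total runs to several megawatt-hours every day, which is exactly the "multiply across millions of users" point.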
The scale is striking: OpenAI CEO Sam Altman recently said that users typing “please” and “thank you” into ChatGPT has cost the company tens of millions of dollars in electricity. This isn’t just a tech issue; it’s an environmental one.
What’s really using the energy?
There are two phases to most AI systems: training and inference. Training is the energy monster. Large models like GPT-3 or GPT-4 are trained on vast datasets, using thousands of GPUs running for weeks. That process alone can consume more than 1,200 megawatt-hours—enough to power over 120 average US homes for an entire year.
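That homes-for-a-year comparison is easy to reproduce. A minimal sketch, assuming an average household consumption of roughly 10,600 kWh per year (a typical US figure; the training total is the one quoted above):

```python
# Rough check of the training-energy comparison.
# The per-home figure is an assumed average, not from the article.

TRAINING_MWH = 1200          # "more than 1,200 megawatt-hours"
HOME_KWH_PER_YEAR = 10_600   # assumed average annual household use

training_kwh = TRAINING_MWH * 1000
homes_for_a_year = training_kwh / HOME_KWH_PER_YEAR
print(f"~{homes_for_a_year:.0f} homes powered for a year")  # ~113
```

With a lower per-home figure, such as the ~2,700 kWh a typical UK household uses in a year, the same training run would cover several hundred homes instead, so the comparison depends heavily on which country's average you pick.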
Once trained, the model is deployed for inference—basically answering questions. That’s less intensive per request, but with millions of queries happening every day, the collective cost adds up fast.
Why it’s a growing issue
The use of AI is booming, and with it, the size of models. Bigger models mean more data, more computing power, and more electricity. While some companies now use greener data centres, many still depend on fossil-fuel-heavy grids.
If AI continues scaling at its current pace, its energy demands could become one of the tech industry’s biggest environmental headaches.
What’s being done to fix it?
There’s growing pressure within the industry to reduce AI’s energy use:
Lean models: Developers are finding ways to make models smaller and faster without sacrificing quality.
Efficient hardware: New processors are being designed to deliver more performance with less power.
Greener data centres: Companies are investing in renewable energy and smarter cooling systems.
Calls for transparency: Researchers are urging AI labs to publish energy and emissions data alongside model performance.
Should users care?
Most people don’t think twice before typing a question into ChatGPT or a similar tool—but they should. AI might feel virtual, but it runs on real-world power. Even your polite sign-off has a cost.
We’re not saying stop using AI. But we do need to ask: how do we keep innovating without running up an invisible environmental tab?