The true cost of AI content generation
It may seem "free", but it's not without costs - ones we should all stop to consider

Here we go again…
Giddy with excitement, the world is flocking to participate in the latest tech gimmick: this time, turning themselves into ChatGPT-generated dolls.
Trends will always come and go. Those old enough will remember the MySpace and Bebo days. Damn, that’s a nostalgia trip and a half! Embarrassingly, I remember my friends and me rushing home to log on and customise our profile layouts with music, icons and pictures - all to express our identities and connect with others. Technology, including AI, gives us a means to scratch an innate human itch: the deep-seated desire to be seen and liked. Turning ourselves into toy-sized dolls is just the latest way to do it.
Sure, it's fun, but there are concerns. Nobody stops to think: "Hey, before I do this really ‘unique’ thing everyone else is doing, where does my data go? Where do my photos go? Or the information I entered about my job, the place I live, and my pets? Who gets to see it, save it, or use it?"
If you use the free versions (such as ChatGPT’s free tier), it costs nothing…right?
Sorry to break it to you, but nope.
Even when you use a free version, AI content generation is not without costs - ones we should all stop to consider.
Cost #1: Your Privacy
I work in the cybersecurity industry, so I’ve seen the good, the bad, and the ugly when it comes to compromised data. It still shocks me how quickly people will sacrifice their information in the name of entertainment. Think about all those old Facebook trends where people wouldn’t think twice about publicly posting answers to a series of questions about themselves - what’s your favourite food? Favourite country? Favourite colour? Your pet’s name?
Social media profiles present a goldmine of information for threat actors performing reconnaissance ahead of an attack. Many people will say: “But Mollie, why would they ever want to hack me? I’m not important enough to be targeted”. You’re right in one regard - you’re not that important. Usually, you’re a small fish in a big pond of thousands of other fish. All the attacker needs is for one of you to take the bait - click a link or hand over some data - which can then be used to unlock doors to bigger targets (the company you work for, say) and greater rewards.
Remember: if the product is free, you are likely the product
Thanks to my years working to raise security awareness, I realise it’s likely too much to ask the general public to stop and consider the implications of handing over their data. I can say it until I’m blue in the face, but until something bad happens to them or someone they care about, most people won't think twice about trading a bit of privacy for entertainment.
But what about other more significant implications, such as GenAI’s environmental impact?
Cost #2: The Unspoken Environmental Impact
This cost is poorly understood by the general public, and it has only grown in recent years. As AI models become more complex, they become more power-hungry, consuming large amounts of energy and generating a substantial carbon footprint. AI developers, like OpenAI, rely on cloud computing in large data centres to train and run the algorithms underpinning their GenAI models.
Estimates in recent years have suggested that ChatGPT emits around 8.4 tonnes of carbon dioxide annually. To put this into context: a return economy flight from London to Nairobi emits around 1.11 tonnes of CO2e per passenger. Approximately eight return flights would be needed to reach 8.4 tonnes. And if this doesn’t seem like a lot already, remember that ChatGPT is just one Large Language Model (LLM) amongst many.
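For the sceptical, the arithmetic behind that comparison is simple. A rough sketch, using only the two estimates quoted above (neither figure is a precise measurement):

```python
# Rough arithmetic behind the flight comparison above.
# Both figures are the estimates quoted in this article.
chatgpt_annual_tonnes = 8.4   # estimated annual CO2 emissions of ChatGPT
flight_tonnes = 1.11          # CO2e per passenger, London-Nairobi return (economy)

flights_equivalent = chatgpt_annual_tonnes / flight_tonnes
print(f"Equivalent return flights: {flights_equivalent:.1f}")  # roughly 7.6
```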
Of course, the amount of energy a model consumes depends on several factors. A data centre’s electricity consumption depends on the hardware it runs and its utilisation rate. A model’s emissions also depend on the power source of the data centre it resides in (e.g., coal-powered data centres will generate higher emissions than those powered by wind, water or solar).
But concerns about GenAI’s environmental footprint extend further than electricity and carbon. A Queen’s University Library study found that the training, operation, and maintenance of AI models not only produces huge amounts of carbon but also consumes thousands of litres of water. Water is used directly to cool data centres, as well as to manufacture the technical components used for AI models. Another study, by Cornell University, found that training the GPT-3 language model in Microsoft’s US data centres evaporated 700,000 litres of clean freshwater - and that was just for training, not even accounting for ongoing maintenance. The study estimates:
“The global AI demand is projected to account for 4.2-6.6 billion cubic meters of water withdrawal in 2027 - more than the total annual water withdrawal of 4-6 Denmark or half of the United Kingdom”
When you stop to think about it, this is scary. I hope my readers do feel somewhat overwhelmed by this.
We need to open our eyes and think twice before using these models, not just to protect our own privacy but also to protect the environment. GenAI tools can offer fantastic opportunities for productivity and efficiency gains, but our usage must justify the resulting drain on natural resources. Call me crazy, but I don’t think turning yourself into a toy-sized collectable is a justifiable use case, all things considered.
In before the “but what about the environmental impact of other queries, such as Google searches” crowd: comparisons of energy consumption per query reveal that ChatGPT consumes between 50-90x more energy per query than a conventional Google search (60x being most likely).
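To see what that multiplier means at trend scale, here is a back-of-the-envelope sketch. The 0.3 Wh baseline for a Google search is a commonly cited public estimate, not a figure from this article, and the ten-million-prompt trend size is purely an illustrative assumption:

```python
# Back-of-the-envelope scaling of the per-query comparison above.
# ASSUMPTION: ~0.3 Wh per Google search (a commonly cited estimate).
google_wh_per_query = 0.3
multiplier = 60                                         # the "most likely" factor quoted above
chatgpt_wh_per_query = google_wh_per_query * multiplier # ~18 Wh per ChatGPT query

# ASSUMPTION: a viral trend driving ten million doll-generation prompts.
queries = 10_000_000
extra_kwh = (chatgpt_wh_per_query - google_wh_per_query) * queries / 1000
print(f"Extra energy vs. plain searches: {extra_kwh:,.0f} kWh")
```

Even under these rough assumptions, a single gimmicky trend adds up to hundreds of thousands of kilowatt-hours beyond what the equivalent searches would have used.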
You might think participating in one trend will do no harm, but it’s not just you who’s participating. Gimmicky trends, such as the ChatGPT-generated doll, result in millions of people flocking to use GenAI models. The boom in usage results in a collective, substantial environmental impact.
AI companies (and Big Tech) have a social responsibility to lead by example in addressing the environmental footprint generated by their models. But this social responsibility extends to all of us as individuals, too. Even if we can't do much to influence the unrelenting tide of emerging technology development, we can all do our bit not to unnecessarily contribute to it.
As a population, we must become more conscious of our AI (and wider technology) use and its implications.