Artificial intelligence promises to make life easier: it can help programmers write code faster, keep drivers safer, and speed up everyday tasks. But Alex de Vries, founder of Digiconomist, warns in a commentary published in the journal Joule that AI could eventually consume as much electricity as entire countries.
As demand for AI grows, de Vries argues, its energy consumption is set to climb sharply. Generative AI models, the kind that produce text, images, and other data, must be trained on enormous datasets, and that training is energy-intensive. Hugging Face, a New York-based AI company, reported that its multilingual text-generating model consumed about 433 megawatt-hours (MWh) during training, enough electricity to power roughly 40 average American homes for a year.
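The "40 homes" comparison is easy to sanity-check. As a rough sketch, assume an average U.S. household uses about 10,800 kWh of electricity per year (an assumption close to recent EIA estimates, not a figure from the article):

```python
# Back-of-envelope check: training energy vs. household consumption.
# Assumption (not from the article): an average U.S. home uses roughly
# 10,800 kWh of electricity per year.
TRAINING_MWH = 433          # reported training energy, MWh
HOME_KWH_PER_YEAR = 10_800  # assumed annual household use, kWh

training_kwh = TRAINING_MWH * 1_000
homes_powered_for_a_year = training_kwh / HOME_KWH_PER_YEAR
print(f"{homes_powered_for_a_year:.0f} homes")  # prints "40 homes"
```

The result lands almost exactly on the article's figure, which suggests a similar household baseline was used.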
The energy demand does not end once training is finished. De Vries's analysis shows that putting a model to work, generating text or images in response to user requests, also consumes substantial computing power and electricity. ChatGPT alone, he estimates, could be drawing 564 MWh of electricity every day.
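To put that daily figure on the same scale as the annual, country-level numbers discussed later, a one-line conversion helps:

```python
# Scaling the reported daily figure: what 564 MWh/day means over a year.
DAILY_MWH = 564  # reported ChatGPT electricity use per day

annual_twh = DAILY_MWH * 365 / 1_000_000  # MWh -> TWh
print(f"{annual_twh:.2f} TWh/year")  # prints "0.21 TWh/year"
```

So inference for one popular service alone would run to roughly a fifth of a terawatt-hour per year.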
Companies are working to make AI hardware and software more energy-efficient, which is welcome. But greater efficiency tends to drive greater adoption, so total resource use can end up rising rather than falling. This rebound effect is known as Jevons paradox.
Google illustrates the trend: the company already uses generative AI in its email service and is testing AI-assisted search. If every Google search ran through AI, de Vries estimates, it would require about 29.2 terawatt-hours (TWh) of electricity per year, roughly equal to Ireland's entire annual electricity consumption.
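Working backward from that estimate gives a sense of the per-query cost. As a sketch, assume Google handles on the order of 9 billion searches per day (a commonly cited figure, not one given in the article):

```python
# Implied energy per AI-assisted search, working backward from the
# 29.2 TWh/year estimate. Assumption (not from the article): Google
# handles roughly 9 billion searches per day.
ANNUAL_TWH = 29.2
SEARCHES_PER_DAY = 9e9

annual_wh = ANNUAL_TWH * 1e12            # TWh -> Wh
searches_per_year = SEARCHES_PER_DAY * 365
wh_per_search = annual_wh / searches_per_year
print(f"{wh_per_search:.1f} Wh per search")  # prints "8.9 Wh per search"
```

A few watt-hours per query sounds small, but multiplied across billions of daily searches it adds up to a small country's power demand.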
That extreme scenario is unlikely to play out soon, since it would be enormously expensive and the supply chain for AI servers faces constraints. But production of AI servers is expected to grow rapidly, and based on projected server production, de Vries estimates that global AI-related electricity consumption could increase by 85 to 134 TWh per year by 2027.
That is comparable to the annual electricity consumption of the Netherlands, Argentina, or Sweden. And if AI efficiency keeps improving, developers could repurpose additional computer processing chips for AI workloads, pushing AI's electricity consumption even higher.
De Vries's conclusion amounts to a call for restraint: because AI is so energy-intensive, it should not be bolted onto every application by default. Developers and companies, he argues, need to be deliberate about where AI is genuinely worth its energy cost.