Key Takeaways:
- AI is powering amazing tools that make our jobs easier and our lives simpler, but it's not all upside. There's a hidden cost to AI use: serious energy consumption that leaves a hefty carbon footprint. In addition to the moral and intellectual quandaries AI is forcing us to confront, there's an environmental impact we need to talk about.
- AI uses energy in two main ways: training and inference (more on that below). Training means constantly ingesting, processing, and condensing enormous amounts of data, which by itself consumes a great deal of data center energy. Then there's the actual use of AI platforms, which means energy is consumed every time someone opens an AI app, even for simple tasks.
- There aren't yet definitive numbers on just how much energy is being consumed; the tech companies that could report that information are tight-lipped about their potential impact. However, one study found that training a large language model like GPT-3 uses just under 1,300 megawatt hours (MWh) of electricity, enough to power 130 US homes for an entire year.
- The larger and smarter AI programs become, the more energy they’ll use. Ideally, tech leaders will seek new ways to reduce the carbon footprint of their platforms even as they improve the products themselves. Ultimately, it’s up to the leaders in the tech world to make eco-friendly decisions for the environment.
- That doesn’t mean individual users are off the hook, though. We need to ask questions and raise expectations of the world’s technology leaders. Don’t be afraid to be vocal and write letters, post social media content, or share research about the environmental impact of AI. It’s up to all consumers to put pressure on today’s tech moguls.
AI might write your emails and plan your day, but it's important to know that there's a cost: a carbon trail. While you're busy asking your virtual assistant to remind you about your dentist appointment, there's a whole lot of behind-the-scenes energy consumption going on.
AI tools are becoming part of everyday life, showing up in everything from chatbots to smart home devices. As we head deeper into the artificial intelligence future, a serious question is nagging at expert minds: What is the real environmental impact of AI?
Why AI Uses So Much Energy
So, why exactly does AI guzzle energy like it’s an all-you-can-eat buffet? It boils down to processing massive amounts of data repeatedly. Training AI requires a ton of trial and error; the systems are always learning. To better understand this concept, you need to know two key phases involved in AI's energy consumption:
- Training – This is where the real magic happens. Think of it as the AI program going to school: it learns how to understand and respond to various inputs, crunching numbers and analyzing data like a caffeine-fueled student cramming for finals. This is the phase where the AI programs you're familiar with actually absorb and process the information they'll later draw on.
- Running (Inference) – This is when you actually use the AI tool—like when you ask your chatbot, “What’s the weather like today?” Spoiler alert: there’s a whole range of answers the AI tool can come up with, but it takes a lot of energy to get there! Think of this as when the tool is actually pulling information to do the task at hand; when it’s literally drafting your email or taking notes in a meeting.
Here’s the key: both of these phases happen in energy-hungry data centers powered by large server farms. You could say these data centers are the AI equivalent of a bustling city, complete with constant energy needs and an impressive footprint. They also have special cooling infrastructures and power needs beyond typical buildings. The largest owner of US data centers has 132 facilities spanning 23.4 million net rentable square feet.
The Carbon Cost of AI Today
One major AI model can take weeks to train, using as much energy as several households do in a year. It's hard to pin down specific numbers for a couple of reasons. For one thing, energy use varies and isn't the same across AI platforms. For another, the people who could share the relevant numbers publicly choose not to (which is the prerogative of private companies). Still, there are credible estimates. For example, one study estimates that training a large language model like GPT-3 uses just under 1,300 megawatt hours (MWh) of electricity, about as much power as 130 US homes consume in a year.
For context, streaming an hour of Netflix requires around 0.8 kWh (0.0008 MWh) of electricity. So, you’d have to watch 1,625,000 hours to consume the same amount of power it takes to train GPT-3.
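These comparisons are just unit conversions on the figures above. A quick sketch (the 1,300 MWh training estimate and 0.8 kWh-per-hour streaming figure are the article's numbers; the ~10 MWh annual household figure is implied by the 130-homes claim):

```python
# Back-of-the-envelope comparison of GPT-3 training energy
# to US household use and Netflix streaming.

TRAINING_MWH = 1_300          # estimated energy to train GPT-3 (MWh)
HOME_MWH_PER_YEAR = 10        # rough annual electricity use of one US home (MWh)
STREAM_KWH_PER_HOUR = 0.8     # estimated energy to stream one hour of video (kWh)

training_kwh = TRAINING_MWH * 1_000           # MWh -> kWh
home_years = TRAINING_MWH / HOME_MWH_PER_YEAR
netflix_hours = training_kwh / STREAM_KWH_PER_HOUR

print(f"{home_years:.0f} home-years of electricity")      # 130
print(f"{netflix_hours:,.0f} hours of streaming")         # 1,625,000
```

Same math the article uses: 1,300,000 kWh divided by 0.8 kWh per streaming hour gives the 1,625,000-hour figure.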
Even everyday uses that may seem insignificant, like asking a chatbot a question, require server power behind the scenes. It might seem harmless, but as AI gets smarter, it often gets bigger—and more energy-intensive. It’s like that friend who starts with a small hobby and ends up with a garage full of stuff they “might need someday.” And don’t forget, training is only one source of energy consumption. After a system is created, consumers begin to use it to generate output, the “inference” process we mentioned above.
One interesting study ran AI models through small tasks 1,000 times each (like answering a question or labeling an image) and measured the energy consumed. Most of the tasks tested used a small amount of energy; classifying written samples, for instance, took about 0.002 kWh. Continuing with Netflix streaming as a comparison, the tasks measured ranged from roughly nine seconds' worth of streaming energy up to about 3.5 minutes' worth.
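The per-task comparison works the same way: divide the task's energy by the streaming rate to get an equivalent viewing time. A minimal sketch, using the article's 0.8 kWh/hour streaming estimate and the 0.002 kWh classification figure:

```python
# Convert a per-task AI energy cost into an equivalent amount of
# video streaming time, using the article's 0.8 kWh/hour estimate.

STREAM_KWH_PER_HOUR = 0.8

def streaming_seconds(task_kwh: float) -> float:
    """Seconds of streaming that use the same energy as one AI task."""
    return task_kwh / STREAM_KWH_PER_HOUR * 3600

# 0.002 kWh to classify a written sample ~= 9 seconds of streaming
print(round(streaming_seconds(0.002)))  # 9
```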
Related Post: The Carbon Footprint of the Internet: How Your Data Usage Emits CO2
Looking Ahead: 5-Year Forecast
As you’re no doubt aware, the demand for AI is skyrocketing. Whether it’s healthcare, finance, education, or marketing, everyone wants a piece of the AI pie. Some experts predict that AI-related computing could outpace current data center energy use in just a few years.
A recent report by the International Energy Agency offered further estimates, suggesting that electricity usage by data centers will increase significantly in the near future thanks to the demands of both AI and cryptocurrency. The agency estimates that data center energy usage was around 460 terawatt hours (TWh) in 2022, and is likely to increase to as much as 1,050 TWh in 2026, equivalent to the energy demands of the entire country of Germany.
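Put plainly, the IEA's high-end scenario is more than a doubling in four years. A quick check using the figures cited above:

```python
# IEA figures cited above: global data center electricity demand
# in 2022 vs. the high-end 2026 projection.

TWH_2022 = 460
TWH_2026_HIGH = 1_050

growth = TWH_2026_HIGH / TWH_2022
print(f"~{growth:.1f}x increase from 2022 to 2026")  # ~2.3x
```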
Without greener practices, this could significantly increase tech’s overall carbon footprint. Not to be dramatic, but it’s like piling too many toppings on a pizza: eventually, the crust won’t hold up and your lunch may end up in your lap.
Who Bears the Responsibility?
So, who's responsible for this carbon conundrum? Primarily, the big tech companies building and hosting these models: Google, Microsoft, OpenAI, and Amazon. They need to step up and ensure that their cloud infrastructure (like AWS or Azure) adopts cleaner energy sources. Greater transparency would help, too; it's hard to have a real conversation about improving when the most important voices at the table insist on remaining quiet. More pressure on these companies to share not only their energy consumption figures, but also the measures they're taking to limit their impacts, could make a difference.
At Shift, we believe that accountability isn’t just for the big players—it’s something we all share. That’s why we launched the Shift Impact Grant, a $25,000 USD fund supporting sustainability initiatives in tech that are working toward real, measurable change. To apply, visit shift.com/impact/.
Of course, responsibility doesn’t end there. As users, we also have a vital role to play. It’s worth asking: what’s the environmental cost of this tool? As technology becomes more embedded in our everyday lives, questions like this can help guide more conscious and responsible choices.
Related Post: The Environmental Impact of Scrolling Through Social Media
Can AI Be Sustainable?
If you’re considering throwing your devices out the window in despair, pause. There are efforts underway to make AI more sustainable. Things like:
- Designing smaller, more efficient models that don’t require a massive energy boost.
- Using renewable energy to power data centers (yay, wind and solar!).
- Improving transparency around energy usage so we know what’s really going on behind the scenes.
The real challenge is balancing innovation with responsibility. The stand-out tech heroes of the next few years will be those who can help companies to achieve that balance, so that we can all enjoy the superpowers of AI without leaving the planet in worse shape.
AI isn't going away—but the way we build and use it matters. The truth is that the generative AI revolution comes with a planetary cost we can't yet fully measure. The next five years will help determine whether AI becomes a sustainable tool or a major climate problem. Let's each do our part: ask the harder questions, stay informed on eco-friendly tech practices, and support ethical technology efforts. There's a lot we can do to ensure that our newly AI-powered lives don't come at the expense of our one and only planet, but we need to start now.
So, What Now?
You don’t need to be an engineer or climate scientist to make a difference. Staying curious and informed is a powerful place to start. Learn how your everyday tools impact the planet, explore companies taking sustainability seriously, and keep asking: how can we do this better?
If you want to stay in the loop on how tech is shaping our world—and what we can do to make it more sustainable—sign up for Re:Earth, our newsletter on technology’s climate impact.