
The Invisible Electricity Bill of Artificial Intelligence

Artificial intelligence, or AI, has become an inseparable part of our daily lives. From studying, research, and programming to creating images and videos, we increasingly rely on chatbots and generative models. Yet how often do we think about the massive energy consumption hidden behind this reliance? The electricity required for each query, image, or short video usually goes unnoticed. But the sum of these small bursts of energy use is creating a crisis that forces us to rethink not only the technology itself but also its contribution to climate change.

The AI training phase is usually the first thing discussed. Training OpenAI's GPT-4 model, for example, reportedly cost nearly a hundred million dollars and consumed about 50 gigawatt-hours of electricity, enough to power the entire city of San Francisco for three days. But experts note that the majority of energy consumption actually happens when we use the models, during the inference stage: roughly 80 to 90 percent of total power use now occurs in this phase, as AI systems answer millions of questions every day.

Data centers are at the heart of this entire process. There are now around three thousand data centers in the United States, where thousands of graphics processing units (GPUs) run continuously. Millions of gallons of water are used daily to cool them, and in most cases, they’re powered by polluting sources like natural gas or coal. Major tech companies keep secret where each request is processed, which power grid supplies their energy, and the carbon intensity of that energy. As a result, we’re left with countless blanks when trying to gauge the real impact.

Still, some figures are eye-opening. Meta's open-source model Llama, in its smallest version, uses an average of 114 joules to answer a single question, while the largest version requires 6,706 joules, roughly equivalent to running a microwave for a few seconds. Image generation is even more striking: the popular generator Stable Diffusion consumes about 2,282 joules to create a standard image. Video generation, however, demands astronomical amounts of energy. The CogVideoX model from the Chinese AI firm Zhipu needs about 3.4 million joules to create a five-second video, more than running a microwave for an hour.
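The microwave comparisons above can be checked with a quick back-of-envelope conversion. The sketch below assumes an 800-watt microwave, a typical household rating that is our assumption, not a figure from the article; the joule values are the ones quoted in the text.

```python
# Convert the per-request energy figures quoted above into equivalent
# microwave runtime. MICROWAVE_WATTS is an assumed typical household
# rating; the joule figures are the ones reported in the text.

MICROWAVE_WATTS = 800  # assumption: typical microwave power draw

energy_per_request_j = {
    "Llama (smallest), one answer": 114,
    "Llama (largest), one answer": 6_706,
    "Stable Diffusion, one image": 2_282,
    "CogVideoX, 5-second video": 3_400_000,
}

for task, joules in energy_per_request_j.items():
    # A watt is one joule per second, so joules / watts = seconds.
    seconds = joules / MICROWAVE_WATTS
    print(f"{task}: {joules:,} J ~ microwave for {seconds:,.1f} s")
```

At 800 watts, the largest Llama answer works out to about eight seconds of microwave time and the five-second video to roughly seventy minutes, consistent with the "few seconds" and "more than an hour" comparisons above.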

These numbers may seem negligible at first, but the scale of use changes the equation. OpenAI has shared that ChatGPT receives over a billion messages every day. And it’s not just text—there are about 80 million image generation requests daily as well. This amounts to AI consuming hundreds of gigawatt-hours of electricity annually in just the United States, enough to power millions of homes. Research suggests that by 2028, the electricity required for AI-related servers alone could reach 326 terawatt-hours, or about 22 percent of the power used by all U.S. households.
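Scaling the per-request figures by these daily volumes reproduces the "hundreds of gigawatt-hours" estimate. This is only a rough sketch: using the largest-Llama figure as a proxy for an average ChatGPT message is our assumption, since real per-request energy varies widely by model and query type.

```python
# Back-of-envelope annual energy from the daily request volumes quoted
# in the text. Per-message energy is assumed equal to the largest-Llama
# figure (an illustrative proxy, not a reported number).

J_PER_KWH = 3_600_000              # 1 kWh = 3.6 million joules

MESSAGES_PER_DAY = 1_000_000_000   # ChatGPT messages, per OpenAI
IMAGES_PER_DAY = 80_000_000        # daily image-generation requests
J_PER_MESSAGE = 6_706              # assumed proxy: largest Llama answer
J_PER_IMAGE = 2_282                # Stable Diffusion, one standard image

daily_joules = (MESSAGES_PER_DAY * J_PER_MESSAGE
                + IMAGES_PER_DAY * J_PER_IMAGE)
annual_gwh = daily_joules * 365 / J_PER_KWH / 1_000_000  # kWh -> GWh

print(f"roughly {annual_gwh:,.0f} GWh per year")
```

Under these assumptions the total lands in the high hundreds of gigawatt-hours per year, the same order of magnitude the article cites.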

To meet this soaring demand for electricity, tech giants are racing down different paths. Microsoft and Meta are investing in nuclear power plants; OpenAI and its partners are building data centers as part of the $500 billion Stargate project; and Google plans to spend $75 billion on AI infrastructure in 2025 alone. But building new nuclear plants takes years, if not decades, and until then AI-dependent data centers will continue to rely on fossil fuels. As a result, the electricity powering U.S. data centers is, on average, about 48 percent more carbon-intensive than the national average.

The most concerning aspect of this situation is the lack of transparency. Neither OpenAI, Google, nor Microsoft discloses exactly how much energy each request consumes. Researchers rely on alternative sources and data from open models for their estimates, but these are fragmentary. As a result, policymakers, environmentalists, and even power providers cannot plan accurately for the future. And the extra cost often falls on ordinary people: studies suggest that in some states, such as Virginia, the average resident's monthly electricity bill could rise by about $37.50.

The biggest question is: how much is this expense actually worth to society? Is it really justified to spend so much energy to generate a joke or a five-second video for social media? The technology industry may tell us this is the inevitable path to the future, but are we ready for a future in which the hidden shadow of rising carbon emissions hangs over our everyday online activity?

There’s no doubt that AI is making our lives easier. But we cannot ignore the invisible electricity bill that comes with it. If AI companies truly care about a sustainable future, their first responsibility should be to ensure full transparency—how much energy each model uses, the sources of that energy, and the impact of that use on the planet. Otherwise, the invisible costs hidden behind technological advancement may one day make the climate crisis even worse.




Biggani.org brings the life and research stories of scientists at home and abroad to the younger generation through interviews.

Contact:

biggani.org@gmail.com

Editor: Md. Manjurul Islam



Copyright 2024 biggani.org