Skeptoid - Skeptoid #994: Making AI Environmentally Friendly Aired: 2025-06-24 Duration: 20:34 === AI Powering Like Crypto Mining (09:01) === [00:00:03] The word on the street is that all the new AI engines, artificial intelligence like ChatGPT or Midjourney, are taking up a disproportionate share of the world's computing resources, not unlike cryptocurrency mining was a few years ago. [00:00:19] Can our limited infrastructure keep up with the demands AI places on our power grid? [00:00:27] Or will this lead to even more carbon emissions and more environmental impact? [00:00:33] That's coming up right now on Skeptoid. [00:00:41] Hi, I'm Alex Goldman. [00:00:43] You may know me as the host of Reply All, but I'm done with that. [00:00:47] I'm doing something else now. [00:00:49] I've started a new podcast called Hyperfixed. [00:00:51] On every episode of Hyperfixed, listeners write in with their problems and I try to solve them. [00:00:56] Some massive and life-altering, and some so minuscule it'll boggle your mind. [00:01:00] No matter the problem, no matter the size, I'm here for you. [00:01:04] That's Hyperfixed, the new podcast from Radiotopia. [00:01:07] Find it wherever you listen to podcasts or at hyperfixedpod.com. [00:01:15] You're listening to Skeptoid. [00:01:17] I'm Brian Dunning from Skeptoid.com. [00:01:21] Making AI environmentally friendly. [00:01:26] Welcome to the show that separates fact from fiction, science from pseudoscience, real history from fake history, and helps us all make better life decisions by knowing what's real and what's not. [00:01:38] In 2017, there were various financial upheavals all around the world. [00:01:43] Japan became the first country to recognize Bitcoin as a legal method of payment. [00:01:48] Australia followed soon after.
[00:01:51] The natural result was that Bitcoin's value jumped by 20 times that year, and more entrepreneurs than ever began Bitcoin mining operations in earnest, often with millions of dollars of investor backing. [00:02:06] Why did they need it? [00:02:07] Two reasons. [00:02:08] First, you have to buy a whole bunch of tiny computers that are optimized for these particular types of calculations. [00:02:15] And second, you'll quickly learn that your electric bills are going through the roof. [00:02:20] These tiny computers, often referred to as mining rigs, suck up an incredible amount of electricity to power their processors. [00:02:28] This, in turn, causes those processors to emit a surprising amount of heat, which sends your air conditioning bill into the stratosphere to keep the computers from melting down. [00:02:40] And while that was the problem a decade or so ago, its analog has reared the same ugly head again. [00:02:48] Not in cryptocurrency, but this time in AI. [00:02:53] Just listen to these recent nightmarish reports. [00:02:57] A 2024 study by KnownHost found that ChatGPT alone produces 261 tons of CO2 per month, with every page view producing 1.59 grams. [00:03:11] But some, such as Writer, produce as much as 10.1 grams of CO2 per page view. [00:03:19] Using data from Lawrence Berkeley National Lab, a 2025 article in the MIT Technology Review found that at current trends, AI in the United States alone will consume as much power as 22% of all U.S. homes by 2028. [00:03:37] In its 2024 sustainability report, Google revealed that in the past five years, AI has caused their greenhouse gas emissions to increase by 48%. [00:03:48] Moreover, they expect their energy use to triple by 2030. [00:03:53] At the 2024 World Economic Forum, OpenAI CEO Sam Altman told Bloomberg that the future power needs of AI are currently unreachable and that, quote, there's no way to get there without a breakthrough.
[00:04:11] All of this negativity surrounding AI's environmental impact is, to put it bluntly, basically true. [00:04:18] But as with all burgeoning technologies, it's necessary to go through the steps of doing things badly on the way to learning to do them well. [00:04:26] The first airplanes were terrible, but we pushed through in order to learn to make better ones. [00:04:32] 150 years ago, medical care was likely to do more harm than good, but that's how we learned to do it so well today. [00:04:41] In 2022, Ethereum, the second-largest cryptocurrency after Bitcoin, made a fundamental change to the way it works in order to address these environmental concerns and massive costs. [00:04:54] This was switching from proof-of-work validation to proof-of-stake validation. [00:04:59] And while you don't need to understand what that means, it meant a reduction in power consumption of over 99%. [00:05:07] Might there be a fundamental shift like this in the future of AI? [00:05:14] In January of 2025, we thought there might be. [00:05:18] You may have heard about DeepSeek, a Chinese AI that claimed to be 90% more efficient than other models. [00:05:25] A 90% reduction in power consumption would indeed be a game changer. [00:05:30] But it turned out this claim was not what everyone hoped. [00:05:33] The efficiency gains were only realized during DeepSeek's initial training period. [00:05:39] The hardware used for this was indeed highly optimized, but it was only trained for 10% as many GPU hours as those it was being compared against, including Meta's Llama AI. [00:05:52] So, obviously, it would have only consumed 10% as much power. [00:05:56] But now that it's trained and it's up and running, analysts have found that DeepSeek is actually less efficient at running each query given to it, consuming up to twice as much power as the same query on Llama. [00:06:09] So, once again, be skeptical of amazing science news coming from China.
[00:06:17] Let's take a quick look at how AI is sucking up all this power. [00:06:21] There are various kinds of AIs for different applications. [00:06:25] Neural networks, natural language processing, generative AI, machine learning: they all run on basically the same hardware. [00:06:33] A conventional data center that runs websites and cloud computing consists of thousands of computers, each with a CPU, memory, and storage, but little else. [00:06:44] But like cryptocurrency mining, AI computers are constantly doing complex calculations, and so they are much more reliant on GPUs. [00:06:56] Most of us know GPUs, graphics processing units, as that thing our computer needs to drive an extra monitor or to play video games with amazing real-time graphics. [00:07:06] While a CPU may have a few powerful cores, a GPU has thousands of tiny cores, allowing it to perform thousands of simpler, repetitive tasks simultaneously, in parallel. [00:07:20] This is ideal for the matrix operations that are central to AI processing. [00:07:25] So today, these simple yet highly specialized machines, loaded up with powerful GPUs, are what carry the lion's share of AI, and what eat up all that power. [00:07:39] However, growing needs drive innovation, and innovation in the land of AI hardware means ever greater efficiency. [00:07:47] Higher efficiency serves two needs at the same time. [00:07:50] It improves the speed and power of the AIs, and doing more with less also means reduced power consumption. [00:07:58] In 2015, Google developed a new kind of chip called a TPU, a tensor processing unit. [00:08:05] These are optimized for processing multi-dimensional arrays, a central computing function of AI. [00:08:12] They're designed explicitly for this and can't really do anything else. [00:08:17] Thus, they work faster and consume less energy than GPUs. [00:08:23] A TPU is one kind of ASIC, application-specific integrated circuit.
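To make that matrix-operation point concrete, here's a minimal sketch in Python with NumPy (my illustration, not anything from the episode). The naive triple loop grinds through one multiply-accumulate at a time, the way a single processor core would; NumPy's `@` operator hands the identical job to an optimized routine built for parallel-friendly execution, the same principle that a GPU's thousands of cores, or a TPU's hardwired matrix unit, scale up enormously.

```python
import time
import numpy as np

# Two small matrices standing in for a layer of an AI model.
n = 60
rng = np.random.default_rng(42)
A = rng.random((n, n))
B = rng.random((n, n))

def matmul_naive(A, B):
    """One multiply-accumulate at a time, as a single serial core would do it."""
    C = np.zeros((A.shape[0], B.shape[1]))
    for i in range(A.shape[0]):
        for j in range(B.shape[1]):
            for k in range(A.shape[1]):
                C[i, j] += A[i, k] * B[k, j]
    return C

t0 = time.perf_counter()
C_serial = matmul_naive(A, B)
t_serial = time.perf_counter() - t0

t0 = time.perf_counter()
C_parallel = A @ B  # dispatched to an optimized linear-algebra routine
t_parallel = time.perf_counter() - t0

assert np.allclose(C_serial, C_parallel)  # identical answer, vastly less time
```

Even at this toy size, the optimized path is orders of magnitude faster than the serial loop; a TPU essentially commits this one operation to silicon.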
[00:08:30] These are chips that are hardwired to perform a single specific function or algorithm. [00:08:35] Compared to running that same task in software, an ASIC does it far faster and requires much lower computing and power resources to do so. [00:08:45] TPUs are not the only ASICs that have been developed for AI, but you get the idea. [00:08:51] The more AI algorithms mature, the more the most resource-intensive part of the AI infrastructure can be made vastly more efficient. === Efficiency Fuels Massive Demand (08:40) === [00:09:05] Hey everyone, I want to remind you about a truly unique and once-in-a-lifetime adventure. [00:09:11] Join me and Mediterranean archaeologist Dr. Flint Dibble for a Skeptoid sailing adventure through the Mediterranean Sea aboard the SV Royal Clipper, the world's largest full-rigged sailing ship. [00:09:24] This is also the only opportunity you'll have to hear Flint and me talk about our experiences when we both went on Joe Rogan to represent the causes of science and reality against whatever it is that you get when you're thrown into that lion pit. [00:09:39] We set sail from Málaga, Spain on April 18th, 2026 and finish the adventure in Nice, France on April 25th. [00:09:48] You'll enjoy a fascinating skeptical mini-conference at sea. [00:09:52] You'll visit amazing ports along the Spanish and French coasts and Flint will be our exclusive onboard expert sharing the real archaeology and history about every stop. [00:10:03] We've got special side quests and extra skeptical content planned at each port. [00:10:08] This is a true sailing ship. [00:10:10] You can climb the rat lines to the crow's nest, handle the sails. [00:10:14] You can even take the helm and steer. [00:10:16] This is a real bucket list adventure you don't want to miss. [00:10:20] But cabins are selling fast and this ship does always sell out. [00:10:24] Act now or you'll miss this once-in-a-lifetime opportunity.
[00:10:28] Get the full details and book your cabin at skeptoid.com slash adventures. [00:10:34] Hope to see you on board. [00:10:36] That's skeptoid.com slash adventures. [00:10:44] Another interesting way that AI can be made more resource efficient is by methodological improvements like model pruning and quantization. [00:10:54] This is something like using heuristics or shortcuts in the way we think about things. [00:10:58] If you ask an AI whether you should bring an umbrella this afternoon, there are a million answers it could give you that you don't care about, all of which would require more computing time. [00:11:09] How many raindrops per square meter per second are falling? [00:11:13] What's the barometric pressure likely to do in the next three hours? [00:11:17] No, you only care whether it's raining a lot or hardly at all. [00:11:21] Simplifying the math where it makes sense, fewer significant digits, ignoring all but the most important inputs, can cut the size of the job tremendously. [00:11:31] Faster results, less power consumed, more useful in every way. [00:11:39] A question we might ask at this point is, which one is likely to accelerate faster, improvements in efficiency, or demand for more AI? [00:11:49] We do have a solid answer for this, and it's not the one we'd like to hear. [00:11:54] Remember those numbers at the top of the show were basically correct. [00:11:58] As of now, 2025, the best projections show that global data center power consumption will probably double by 2030. [00:12:06] And that's in spite of all the gains in efficiency. [00:12:09] But it's also due, in part, to those same gains in efficiency. [00:12:14] As we reduce their energy consumption, we're able to do more with them. [00:12:18] They become even more useful, and that drives their demand even faster. [00:12:24] It's a type of feedback loop we call the snowball effect. 
[00:12:27] The faster it rolls, the more snow it picks up and the heavier it gets, making it roll even faster, and so on. [00:12:34] This is called Jevons paradox, after the 19th-century British economist William Stanley Jevons. [00:12:41] Increased efficiency leads to increased consumption. [00:12:47] But we live in a capitalistic world. [00:12:49] Supply and demand are forever intertwined. [00:12:52] When demand becomes too great, we have to do one of two things. [00:12:56] We increase the supply or we reduce the demand. [00:13:01] In this case, we're not physically able to increase the supply. [00:13:04] So we do the other thing, reduce the demand. [00:13:08] And we do that by raising the price. [00:13:11] Expect AI to get more expensive, potentially a lot more. [00:13:16] As much as it takes to avoid melting the grid. [00:13:21] But wait, you say, a lot of AI is open source, meaning the algorithms and software, or at least analogs comparable to the commercial versions, are freely available to all. [00:13:32] That's nice. [00:13:33] The hardware and electricity are not. [00:13:37] This is a system where water is going to find its own level. [00:13:41] One thing nearly all the industry experts are projecting is that data centers are going to turn increasingly to renewable energy. [00:13:49] It's the one variable in this equation that's a one-time expenditure. [00:13:53] Invest once, now, and the energy needs will be covered for the industry's next phase of growth. [00:14:01] And that brings us to a whole other side to this issue that many people don't take into consideration. [00:14:06] Powering the AI engines may indeed have a high environmental impact, but some of what the AIs are doing is protecting the environment. [00:14:16] An AI can be trained to do just about anything, and whether your application is reducing carbon emissions or protecting old-growth forests, somebody probably has an AI at work on that problem.
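Circling back for a moment to the quantization idea from earlier, it's simple enough to sketch. Here's a minimal Python illustration (a generic 8-bit scheme invented for demonstration; production frameworks use more refined variants): storing each weight as a one-byte integer plus a single shared scale factor cuts memory to a quarter of the 32-bit float version, and the values round-trip with only a tiny rounding error.

```python
import numpy as np

# Hypothetical example: quantize a vector of "model weights" from 32-bit
# floats down to 8-bit integers, then reconstruct them for use.
rng = np.random.default_rng(0)
weights = rng.normal(0.0, 0.5, size=10_000).astype(np.float32)

# One shared scale maps the full float range onto the integer range [-127, 127].
scale = np.abs(weights).max() / 127.0
quantized = np.round(weights / scale).astype(np.int8)  # 1 byte per weight
restored = quantized.astype(np.float32) * scale        # used at query time

print(weights.nbytes // quantized.nbytes)              # 4: a 4x memory saving
print(float(np.abs(weights - restored).max()) <= scale / 2 + 1e-6)  # True
```

Fewer bits per number also means more numbers per memory fetch and cheaper arithmetic, which is where the speed and power savings come from.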
[00:14:28] Is the benefit each program produces worth the cost of generating the power to run it? [00:14:34] Well, maybe in some it is, maybe in some it isn't. [00:14:38] Let's take a look at a few examples. [00:14:42] Here's one that more than pays for itself. [00:14:46] One way we capture carbon out of smokestacks is to react the flue gas with a limestone slurry, which can absorb the carbon. [00:14:54] Pumping that slurry takes a lot of power. [00:14:57] The University of Surrey developed an AI which samples the CO2 in the flue in real time, looks at current renewable energy availability and grid energy prices, and dynamically adjusts both the slurry pump rate and the slurry pH. [00:15:13] In field trials in India, the system saved 22% in power costs while capturing 17% more CO2. [00:15:24] Another way this is done is with the use of MOFs, metal-organic frameworks, which are materials that selectively adsorb CO2 directly out of the gases. [00:15:35] A team from Argonne National Laboratory, the University of Illinois, and the University of Chicago set up an AI to invent new MOF compounds with a high predicted carbon selectivity. [00:15:48] In only 30 minutes, it came up with 120,000 of them, which were then fed to a supercomputer to run molecular dynamics simulations on them. [00:15:58] Six were as good as the top industrial adsorbents on the market. [00:16:03] That was just in the first 30 minutes, although the supercomputer simulations took considerably longer. [00:16:10] A California nonprofit called the Rainforest Connection has developed a novel system consisting of a sensitive microphone, transmitter, mini-computer, and solar panels that is mounted high in the treetops in places like the Brazilian rainforest. [00:16:25] The computer uses onboard AI to analyze the sounds being recorded, listening for the telltales of illegal logging operations and also poaching.
[00:16:35] This allows law enforcement to catch the operators red-handed, whereas before they'd had to rely on random patrols in hopelessly vast areas. [00:16:46] Now, of course, these are only three of many, many such initiatives that turn to AI to increase efficiency and cleanliness of processes throughout the world. [00:16:56] But so far, they are not nearly enough to offset the costs of running the AIs. [00:17:01] Renewable energy helps, but it too is in a losing battle. [00:17:06] And so far, nobody sees anything on the horizon comparable to what Ethereum did in 2022 to reduce its consumption by 99%. [00:17:17] The bottom line is that gains against the environmental impact of AI are likely to be evolutionary, not revolutionary. [00:17:25] In the meantime, we can probably expect economic levers to be about the only effective tool we have. [00:17:31] And that means jacking up the cost more and more to reduce the demand. [00:17:37] Hopefully, in a few years, I'll have the pleasure of updating this episode with a major new development. === Support Skeptoid for Premium Access (02:48) === [00:17:45] We continue with more on how AI is improving weather forecasting in the ad-free and extended premium feed. [00:17:52] To access it, become a supporter at skeptoid.com slash go premium. [00:18:02] A great big Skeptoid shout out to our premium supporters, including James, Ever Hopeful, Brad Fonseca, Marla, and Chris Ling. [00:18:13] Did you know you can have Skeptoid come to you? [00:18:15] I love doing live shows at meetup clubs, university groups, and conferences. [00:18:21] I can show one of our movies like Science Friction, do a live podcast, or just give one of my popular presentations. [00:18:28] For more information, come to skeptoid.com and click on live shows. [00:18:34] Follow us on your favorite social media for even more great content. [00:18:38] You'll find both Skeptoid and me, Brian Dunning, on all your favorite social media platforms.
[00:18:45] Skeptoid is a production of Skeptoid Media. [00:18:48] Director of Operations and Tinfoil Hat Counter is Kathy Reitmeyer. [00:18:52] Marketing guru and Illuminati liaison is Jake Young. [00:18:56] Production Management and All Things Audio by Will McCandless. [00:19:00] Music is by Lee Sanders. [00:19:02] Researched and written by me, Brian Dunning. [00:19:05] Listen to Skeptoid for free on Apple Podcasts, Spotify, Amazon Music, or iHeart. [00:19:14] You're listening to Skeptoid, a listener-supported program. [00:19:18] I'm Brian Dunning from Skeptoid.com. [00:19:27] Hello, everyone. [00:19:28] This is Adrienne Hill from Skookum Studios in Calgary, Canada, the land of maple syrup and moose. [00:19:36] And I'm here to ask you to consider becoming a premium member of Skeptoid for as little as $5 per month. [00:19:45] And that's only the cost of a couple of Tim Hortons double doubles. [00:19:49] And that's Canadian for coffee with double cream and sugar. [00:19:54] Why support Skeptoid? [00:19:56] If you are like me and don't like ads, but like extended versions of each episode, Premium is for you. [00:20:02] If you want to support a worthwhile nonprofit that combats pseudoscience, promotes critical thinking, and provides free access to teachers to use the podcast in the classroom via the Teacher's Toolkit, then sign up today. [00:20:16] Remember that skepticism is the best medicine. [00:20:20] Next to giggling, of course. [00:20:22] Until next time, this is Adrienne Hill. [00:20:33] From PRX.