I want to start with something small, because that’s where the whole AI energy story gets weird. Picture this: you type a prompt into an AI tool. Write a five-line poem about winter in New York. And then you hit enter.
Okay. So let’s see what we get here. Steam rises from the grates to meet the gray. As slush turns iron cold on Fifth and Broadway. The city shivers in a quiet, salt-stained sway.
Well, thank you, Gemini. That’ll be all for now. Let’s see, back to the point. We get a clean little poem.
So we move on. That moment feels weightless. It’s just text on a screen, but a real building full of computers did real work for that poem. Electricity ran through wires, chips heated up, fans spun, pumps moved coolant, and backup systems stayed ready in the background in case the grid hiccuped. So in discussing AI energy, I wanna drag the conversation out of the cloud metaphor and back into physical reality.
Servers are not magic. They’re machines. Machines live somewhere. They have utility bills. And they have cooling towers and chillers and heat exchangers.
They have neighbors. And that’s where today’s episode lives. Not in “AI is evil” or “AI will save us.” This is an infrastructure episode. You’re listening to Resilience Report: Buildings, Cities, Risks. I’m Aaron.
And today, we’re talking about energy consumption and AI, at the request of the fabulous Wen Yi Hsu, all the way around the world in Singapore. So, AI and energy, and an angle that I think gets us past the usual argument. AI didn’t create the energy problem. AI is exposing how sloppy our energy accounting already is.
Let’s be direct about how this debate usually goes. Someone posts a scary number. AI uses as much electricity as a small country. Someone else posts a more reassuring number. Well, actually, it’s just a tiny fraction of global demand.
And then we get three days of arguments where everyone is technically wrong and technically right in a different way. The problem with these comparisons is that they skip the questions at the heart of energy consumption. Where is the demand showing up? When is it showing up? And who is paying to serve it? A grid doesn’t fail because a yearly total looks big on a chart.
A grid fails, or gets expensive, when peak demand hits, when transmission is constrained, when a region needs new substations, when a utility has to keep old fossil plants online to preserve reliability, and when a community hosts an industrial-scale load it never planned for. So I want to reframe the conversation around three practical questions. First, are we measuring AI energy use in a way that lets anyone make decisions, or are we stuck with the vibes? Is this just a public relations problem? Second, does AI act like a rigid new load that makes peaks worse, or does it act like a flexible load that can shift around and sometimes make the grid run easier?
And third, when this infrastructure lands in a place, what does the deal look like for the people who live there? If we keep these three questions in view, the conversation gets clearer. So we’ll start with measurement, because that’s the root problem. In buildings, we argue about metrics all the time. People fight about which baseline is fair, and people fight about which normalization makes sense.
There’s a basic expectation. Show the numbers. Show energy use. Show intensity. Show trends.
If you’re in a city with disclosure laws, you might literally have to post something. New Yorkers are now familiar with energy grades posted on the front door of their office buildings. With AI products, that culture barely exists. Users may ask, how much energy did my request take? And the answer is almost always a shrug.
You might get a blog post. You might get a corporate sustainability report that talks about data centers in broad strokes. You might get a research paper that estimates training costs for a model from a year ago. You rarely get a simple, comparable label. And when you don’t have a measurement baseline, you get the worst kind of debate: numbers become weapons.
Here’s what I mean by a label. I mean keeping this grounded in something we all recognize. Imagine something like the nutrition facts box on your cereal box, but for AI services. Not because you need to feel guilty, the way you might about how much sugar is in your cereal, but because this is about using an AI tool. A label would be helpful because systems get better when performance is visible and measurable.
So this useful imaginary label would tell you roughly how much electricity is used per task under typical load, not some cherry-picked best case. It would tell you where the work ran, because the grid in one region can be much cleaner than in another. It would tell you something about timing, because the carbon intensity of the grid changes hour by hour. It would tell you how the facility handles heat, because cooling is part of the energy footprint and sometimes part of the water footprint too. And it would tell you whether the hardware is being used efficiently, because idle equipment still burns power, still gets cooled, and still burdens the grid. So right now, most people have to choose between two admittedly bad options. Option one is treating every AI interaction like an environmental crime. Option two is treating it like a rounding error that doesn’t really matter. Better measurement would let us stop guessing. So now let’s talk about timing, because timing is where the AI conversation becomes surprisingly hopeful or surprisingly ugly, depending on the incentives. Most public debate fixates on total annual energy use. Annual energy is real, but grids are not designed around annual averages.
Grids are designed around peak demand and reliability. So utilities have to answer questions like, what happens on the hottest weekday at six PM when everyone is home cooking, charging, and running the air conditioning? That’s the moment that drives infrastructure costs. That’s the moment that pushes dirtier plants online. That’s the moment that triggers emergency imports.
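For anyone who wants to see what peak contribution actually means, here’s a little sketch for the show notes. Every number in it is invented for illustration: the system load shape, the two data center profiles, all of it. The point is just that two loads with the same daily energy can hit the system peak very differently.

```python
# Invented example: a planner looks at a new load's peak contribution,
# not its annual total. All numbers are made up for illustration.

HOURS = list(range(24))

# Existing system demand in MW for one stressed summer day (invented shape,
# peaking in the early evening).
system_mw = [620, 600, 590, 585, 590, 610, 660, 720, 760, 790, 810, 830,
             840, 850, 860, 880, 900, 930, 950, 940, 900, 820, 730, 660]

# Two hypothetical data center profiles with the SAME daily energy (960 MWh):
flat_dc_mw = [40.0] * 24                                          # grid-blind
shifted_dc_mw = [70.0 if 9 <= h <= 16 else 25.0 for h in HOURS]   # midday-heavy

def peak_contribution(system, load):
    """MW the new load adds at the combined system peak hour."""
    combined = [s + l for s, l in zip(system, load)]
    peak_hour = combined.index(max(combined))
    return load[peak_hour], peak_hour

for name, profile in [("flat", flat_dc_mw), ("shifted", shifted_dc_mw)]:
    mw, hour = peak_contribution(system_mw, profile)
    print(f"{name}: {sum(profile):.0f} MWh/day, adds {mw:.0f} MW at peak hour {hour}")
```

In this made-up scenario, both profiles use 960 MWh a day, but the flat one adds 40 MW at the evening system peak while the midday-shifted one adds only 25 MW. Same annual total, very different infrastructure bill.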
So if you wanna know whether AI is a problem in a specific place, you don’t start with annual totals. You start with peak contribution. You start with the load shape, and you start with whether that demand shows up when the grid is already stressed. So here’s the twist. A lot of AI work is actually time flexible.
Some AI work is inference, meaning the model responds to users in real time. That kind of load has less flexibility, because people expect fast responses. Even there, you can do smart things with caching, batching, and routing.
But you can’t tell users, well, come back tomorrow when the wind is stronger. Other AI work is training, where large models learn from data. Training often takes a long time and consumes a lot of energy. But the timing can be flexible. A training run can be scheduled.
It can be throttled. It can be paused. It can be moved to a different region if the operator has capacity somewhere else. That’s not true for most parts of our economy. Your refrigerator does not wait for off peak.
The subway does not wait for off peak. And a hospital doesn’t wait for off peak. So AI is one of the rare large loads that could, in principle, behave like a controllable demand that works with a renewables heavy grid instead of fighting it. Picture a region with lots of solar in the middle of the day. Prices can drop because generation is high and demand is moderate.
Then evening comes, solar fades, demand stays high, and prices rise. So a grid-aware computing operator shifts flexible work into the midday solar window. A grid-blind operator runs flat out through the evening peak. That choice is not about physics. The choice is about contracts and incentives.
If the operator pays a flat rate for power and never feels the pain of those peaks, they have less reason to cooperate. But if the operator faces time-of-use prices and demand charges that reflect grid stress, well, in that case, load shaping suddenly becomes a real business strategy. So when you hear AI will wreck the grid, the right response is, well, which AI workloads? Under which pricing, in which region, with which constraints? Because those details actually decide the outcome.
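To make the pricing point concrete, here’s another small sketch: the same deferrable training job scheduled two ways under an invented time-of-use tariff. Nothing here is a real price or a real workload; it just shows why a company doesn’t need to care morally for the scheduling to change.

```python
# Invented $/MWh prices by hour: cheap midday solar window, expensive
# evening peak. Not a real tariff.
price = [50, 48, 45, 45, 47, 55, 70, 90, 60, 35, 25, 20,
         18, 20, 25, 40, 80, 120, 140, 130, 100, 80, 65, 55]

MW_CAP = 10  # hypothetical cluster drawing 10 MW for 8 hours (80 MWh total)

def grid_blind_cost(start=16):
    """Run flat out for 8 hours starting in the evening."""
    return sum(price[(start + i) % 24] * MW_CAP for i in range(8))

def grid_aware_cost():
    """Run the same 80 MWh in the 8 cheapest hours of the day instead."""
    cheapest_hours = sorted(price)[:8]
    return sum(p * MW_CAP for p in cheapest_hours)

print(grid_blind_cost(), grid_aware_cost())
```

Under these invented numbers, the evening run costs $7,700 and the midday-shifted run costs $2,280 for identical work. That gap, not virtue, is what moves the schedule.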
So now let’s bring this down to earth, literally, because the physical siting story is where the abstract debate becomes less abstract. Data centers are buildings. They are industrial facilities with a digital product. They have site plans. They have building permits.
They have noise. They have heat rejection systems and often backup generators. They sometimes have large water demands depending on cooling design and the climate. They are not built in the cloud. They are built on land in specific towns served by specific substations, and connected to specific transmission corridors.
So that’s why the local politics can get intense. If a big data center proposal shows up in a community, the questions are not philosophical. They are practical. How much power are you taking? How many new lines do we need?
Who pays for the substation upgrades? What does this do to our rates? And how loud is the equipment? How much water do you use on a hot day? How often will a backup generator run?
Questions like this. How many long-term jobs are there, not just construction jobs? What does the tax base look like, and how stable is that? This is a normal reaction to a large industrial facility moving in. People are not wrong to ask these questions.
Sometimes, the community will decide the trade is worth it. Sometimes, they will fight it. Either way, the conversation gets clearer when we admit what this infrastructure really is. Here’s another piece that matters. Digital infrastructure can move faster than grid infrastructure.
You can plan and build a data center relatively quickly compared to major transmission upgrades. Transmission upgrades could take years. Sometimes longer. That mismatch creates pressure. A developer wants capacity now.
The utility has a long planning cycle. A regulator wants reliability. And the citizens of the town, they want answers. So everyone is working on a different schedule here. So AI energy is not just a global climate debate. It’s a local planning debate, a utility planning debate, and a ratepayer fairness debate.
At this point, someone listening is thinking, okay, but aren’t we still missing the big climate question? Let’s go there, but let’s do it without getting sloppy. The climate impact of AI is not a single number. It depends on two different ledgers.
Ledger one is direct energy and emissions. That includes the electricity used for computing, the electricity used for cooling, the standby losses of keeping the facility ready, and the upstream effects of building and refreshing hardware. Ledger two is what AI changes downstream when it’s applied to systems that waste energy and materials.
So you can find AI use that belongs in either bucket. There’s plenty of AI use that is basically novelty: generating jokes, making images, polishing emails. That stuff can be fun and useful, but it does not obviously reduce emissions. In that zone, AI is mostly an added load.
If you care about climate, you should be honest about that. But then there’s also AI use in logistics, manufacturing, and building operations that can reduce waste. Better routing can cut miles. Better forecasting can reduce spoilage. Better control can reduce over-ventilation and unnecessary cooling.
Better fault detection can catch broken sensors and malfunctioning equipment before a building runs inefficiently for months. Does it always work? No. A lot of deployments are mediocre. Some are vapor.
Even when it works, rebound effects can eat the savings. In other words, if AI makes a process cheaper, people might do more of it. That’s the rebound trap. So when someone says AI is good for the climate, or AI is terrible for the climate, the first question is, well, where is the system boundary? Are we counting only the data center’s electricity and calling it a day?
Are we counting what gets displaced? Are we counting new demand created by new features? Are we counting the grid upgrades needed to serve the load? Are we counting the carbon intensity of the region where the work runs?
Are we averaging all of this into a single global number? Because the boundary usually decides the conclusion. People choose the boundary that supports their side. The honest way through this is to hold both ledgers at once and admit the uncertainty where it exists. That’s less satisfying, but it’s also how infrastructure decisions get made. Now I want to address a common misconception that sneaks into these conversations. People assume the AI energy problem is mostly about the user prompt, the moment that you interact with an AI model. Well, that moment does matter, but it’s not the whole story.
The bigger energy swings often sit in the background. Training, fine-tuning, testing, redundancy, keeping capacity ready for spikes, running multiple versions, copying data across regions for reliability: that whole operational stack creates energy demand that users never see. It’s similar to buildings. Tenants see the lights and the thermostat. They do not see the pumps, the fans, the controls that were never commissioned properly, the simultaneous heating and cooling that happens because no one tuned the sequences, or the equipment that runs overnight because schedules never got set correctly. In both worlds, the visible part is only part of the load. That’s another reason better disclosure matters. When you can’t see the full system, people will argue about the wrong thing. So what should change?
I wanna offer three directions, but I’m gonna talk through them like a story, not like a checklist. The first direction is standardized reporting. Voluntary sustainability reports are not enough, because they are curated. They are not built for comparison. They are built for messaging. If AI companies want credibility, they should publish consistent metrics that allow third parties to compare services and track the progress they’re making. That includes energy use, peak demand, regional carbon intensity, and water impacts where that is relevant.
It also includes context on utilization and cooling, because those factors often decide whether a facility is efficient or wasteful. This kind of disclosure changes behavior, not by shaming people, but by letting buyers, regulators, and the communities receiving these data centers ask better questions. If you can compare two services and one uses half the energy for the same task, the market starts doing its job.
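For the show notes, here’s a minimal sketch of what one record in that kind of standardized disclosure might look like. To be clear, no such standard exists today: every field name and every number below is a hypothetical, just to show how comparison becomes possible once the fields are consistent.

```python
# Hypothetical disclosure record for an AI service. Field names and
# numbers are assumptions for illustration, not any real standard.
from dataclasses import dataclass

@dataclass
class EnergyLabel:
    service: str
    wh_per_task_typical: float   # electricity per task under typical load
    grid_region: str             # where the work actually runs
    grid_gco2_per_kwh: float     # average carbon intensity of that region
    pue: float                   # facility overhead: total power / IT power
    water_liters_per_kwh: float  # cooling water use, where relevant

    def gco2_per_task(self) -> float:
        """Rough emissions per task, including facility overhead."""
        kwh = self.wh_per_task_typical / 1000 * self.pue
        return kwh * self.grid_gco2_per_kwh

# Two invented services, purely to show the comparison the episode asks for.
a = EnergyLabel("service-a", 3.0, "region-x", 400.0, 1.2, 0.5)
b = EnergyLabel("service-b", 1.5, "region-y", 100.0, 1.1, 0.2)
print(a.gco2_per_task(), b.gco2_per_task())
```

Notice that once the fields line up, the comparison is trivial: a task on the invented service-b carries a fraction of the emissions of the same task on service-a, and anyone can check the arithmetic.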
If a community can see what a proposed data center load looks like against local capacity, the planning conversation becomes more real. The second direction is aligning AI operations with grid needs. Utilities already have tools for this: time-of-use rates, demand response, curtailment agreements. The problem is not that the grid has no mechanism.
The problem is that participation is uneven, and incentives can actually be weak. If we are going to host large flexible computing loads, those loads should behave responsibly. Training workloads should shift away from the dirtiest peaks when possible. Operators should be rewarded for flexibility, and they should pay more when they stress the grid. That kind of pricing is not punishment.
This is also where the conversation can get surprisingly pragmatic. A company does not need to care morally to respond to pricing. If peak power is expensive and off-peak power is cheap, the scheduling changes. If carbon intensity data is incorporated into routing, work shifts. If grid constraints matter in contract terms, capacity planning changes. This is how systems evolve.
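The carbon-aware routing idea can also be sketched in a few lines. Again, everything here is invented: the region names, the carbon intensities, the capacity figures. It just shows the shape of the decision: send deferrable work to the cleanest grid that still has headroom.

```python
# Hypothetical sketch: route deferrable work to the region whose grid is
# cleanest right now, subject to free capacity. All numbers invented.

def pick_region(regions):
    """regions maps name -> (gCO2/kWh right now, free capacity in MW).
    Returns the cleanest region with headroom, or None to hold the job."""
    candidates = {name: v for name, v in regions.items() if v[1] > 0}
    if not candidates:
        return None  # nowhere to run right now: wait
    return min(candidates, key=lambda name: candidates[name][0])

snapshot = {
    "region-a": (450.0, 20.0),  # dirty grid, plenty of capacity
    "region-b": (90.0, 5.0),    # clean grid, some capacity
    "region-c": (60.0, 0.0),    # cleanest grid, but no headroom
}
print(pick_region(snapshot))
```

In this snapshot the job lands in region-b: region-c is cleaner but full, and region-a is dirty. Real schedulers weigh latency, data locality, and contracts too, but the core trade-off looks like this.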
The third direction is fairness at the local level, because that’s where AI energy turns into actual disputes. When a data center arrives, it can bring tax revenue and some jobs. It can also bring noise, water use, generator emissions, and grid upgrades that someone has to pay for. A fair deal makes those burdens and benefits explicit.
Grid upgrades should not be hidden costs dumped onto ratepayers who never asked for the project. Community benefits should not be vague promises. They should be tied to measurable impacts and long term commitments. Because the facility will be there for years and the equipment will refresh again and again. So if you want a blunt version, here it is.
If the infrastructure is industrial, the agreement should be industrial. No hand waving, no magic words. Show the load, show the upgrades, and show the protections. There’s also a side topic that I think deserves more attention because it changes how we think about waste. That’s heat.
Data centers produce a lot of heat. Most of that heat gets rejected to the outside because that’s the easiest engineering path. In some contexts, that heat can be recovered and used, especially in colder climates or where there’s a district heating network or nearby industrial process that needs low grade heat. Heat reuse is not a silver bullet. It depends on distance, temperature, demand patterns, economics.
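A back-of-envelope version of the heat story, for the show notes. The physical intuition is that nearly all the electricity a data center draws eventually leaves as heat; how much of that is practically recoverable depends on the facility. The overhead and capture fractions below are assumptions for illustration, not measured values.

```python
# Hypothetical back-of-envelope: nearly all electricity drawn by a data
# center leaves as heat, but only part is practically recoverable.
# Both fractions below are illustrative assumptions.

def recoverable_heat_mw(it_load_mw, pue=1.2, capture_fraction=0.7):
    """Low-grade heat (MW thermal) that might be captured for reuse."""
    total_electric = it_load_mw * pue  # IT load plus cooling and overhead
    return total_electric * capture_fraction

# A 30 MW IT facility under these assumptions:
print(recoverable_heat_mw(30.0))
```

Under those assumed fractions, a 30 MW IT facility is rejecting on the order of 25 MW of low-grade heat, which is real district-heating territory if, and only if, there’s a network close enough to take it.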
Still, it’s a useful lens, because it reminds us again: this is physical infrastructure. We can design around it or we can pretend it’s invisible. I like talking about heat because it forces everyone to stop speaking in metaphors. You can’t cloud your way out of heat. You move it, you reuse it, or you dump it. So let’s zoom out one last time and answer the question that usually sits underneath all of this.
Is AI energy actually a big deal? Yes. In the places where it clusters, in the moments where it hits peaks, and in the cases where planning lags behind growth. It also matters because electricity demand in many regions is rising again, for multiple reasons.
Electrification pushes demand up. New manufacturing can push demand up. More digital infrastructure can push demand up. AI is not the only driver, but it’s a loud driver. And it’s arriving.
So the more useful framing is not AI is the whole problem. The useful framing is AI is a stress test. It’s a stress test for measurement because we do not have common labels. It’s a stress test for grid planning because the load is big and sometimes it becomes concentrated. It’s a stress test for fairness because communities want to know who pays and who benefits.
Finally, it’s a stress test for climate claims, because boundary choices decide the conclusions. When you look at it that way, the topic becomes less about dunking on chatbots and more about updating the rules for a new type of industrial load. Let me land this with the simplest version of what I want listeners to take away. When you hear claims about AI energy use, don’t stop at big or small.
Ask where the load is happening. Ask when it’s happening. Ask what it displaces. Ask who is paying for peaks and who is paying for upgrades. Ask what the local deal looks like. And ask whether the operator is using flexibility to cooperate with the grid or just to chase cheap power.
Those questions all make the debate harder to spin. They also lead to decisions that are actually implementable. If you wanna stay on this theme, next time we can go deeper into cooling and water, because that’s the next place where the cloud gets physical and the story shows up. Thanks for listening. If this was useful, send it to someone who argues about AI with country comparisons and no transmission map.

