AI as a Grid Bully: Who Gets Throttled First?
- Johan Steyn
The hard truth about “priority loads”, rising tariffs, and why resilience is now a board-level issue.

Audio summary: https://youtu.be/DHJstnpGdlQ
There’s a story doing the rounds in recent online discussions: a veteran industrial plant manager watches local grid priorities shift as a hyperscale data centre arrives nearby. The punchline is simple and unsettling: the “cloud” is not light. It is heavy industry, with an appetite for gigawatts, cooling systems that look like refineries, and a demand profile that can reshape who gets power first when the grid is under stress.
This matters for South Africa because we already live with constrained electricity, water pressure in certain metros, and a growing dependence on digital services. As AI becomes more embedded in business, the key question is no longer just “what can the models do?” but “what physical systems must bend to keep them running?”
CONTEXT AND BACKGROUND
For years, the dominant narrative was that software scales cheaply: write once, deploy everywhere, and let “the cloud” absorb the complexity. But AI is exposing the physical bill behind that story. When even mainstream tech reporting frames AI build-outs in terms of nation-scale energy footprints, the conversation stops being purely digital and becomes a problem of energy, infrastructure, and industrial policy. A recent Tom’s Hardware analysis, for example, highlighted how some of the most ambitious AI data-centre targets under discussion could demand electricity on the scale of an entire country, underscoring just how quickly computation is turning into a power-and-cooling race.
At the same time, the grid connection process is becoming a bottleneck. In the US, utilities and regulators are grappling with “phantom” data-centre requests that clog planning processes and interconnection queues, creating uncertainty for other industries and communities that need reliable power. The implications are global: wherever electricity is scarce or slow to expand, big compute projects can distort priorities.
And then there is the nuclear conversation. Energy policy and financing are now being shaped by the same question driving AI: where do we find reliable, always-on power at scale? In the United States, the Department of Energy recently announced it had closed a $1 billion loan to support the restart of the 835 MW Three Mile Island Unit 1 plant in Pennsylvania, explicitly positioning firm nuclear capacity as part of the infrastructure needed to “win the AI race” and strengthen grid reliability. Whether or not every project reaches the finish line, the direction is becoming hard to ignore: as data centre demand accelerates, AI is pulling national energy planning, industrial policy, and long-term grid investment directly into its orbit.
INSIGHT AND ANALYSIS
The first hard limit is heat. AI chips don’t merely “use power”; they convert electricity into heat at extreme density, which is why the old model of air-cooled server rooms is colliding with physics. Liquid cooling is moving from niche to mainstream because it removes heat far more efficiently, but it also turns data centres into industrial sites with complex plumbing, higher operational risk, specialised maintenance, and heavier water and engineering demands.
South Africa is already seeing how these infrastructure needs spill into the energy system. Hyperscalers are starting to back real generation capacity locally, not just buy electrons on paper: AWS’s solar project feeding power into South Africa’s grid is a concrete example of how cloud players are becoming part of the country’s energy conversation, even as the thermal and cooling realities of AI push the infrastructure footprint into the realm of heavy industry.
The second limit is grid stability and power quality. AI workloads can be bursty: huge spikes and dips as clusters compute and then synchronise. That volatility matters to ageing grid infrastructure built for steadier industrial loads. Even without getting lost in engineering jargon, the point is practical: the more our economies rely on “always-on” AI services, the more pressure there is to prioritise those loads and to redesign grids around them.
The third, and most political, limit is sovereignty. Once energy becomes the binding constraint, the winners are those who can lock in firm 24/7 power, secure water rights, and finance long-lived infrastructure. That, in turn, nudges large AI players from being mere electricity customers towards becoming power owners or power partners, reshaping planning, pricing, and priority access. The Three Mile Island case shows how directly this dynamic can play out: a major tech company’s appetite for steady, large-scale electricity is now influencing the attempted restart of a legacy nuclear asset, blurring the line between public energy strategy and private compute strategy.
IMPLICATIONS
For business leaders, the immediate action is governance, not gadgets. If you are adopting AI, you should ask: What is the energy footprint of our AI roadmap? What happens during load-shedding or constrained supply? Are we building dependencies on vendors whose resilience plans assume abundant power and water? These are not abstract questions; they become operational and reputational risks the moment customers cannot access services.
For policymakers and regulators, the test is fairness and transparency. Data centres can bring investment and skills, but they must not quietly crowd out existing employers, hospitals, schools, or municipal resilience priorities. Grid connection rules, demand-response requirements, water-use disclosures, and community benefit expectations need to be explicit, measurable, and enforceable.
For parents and educators, there is a quieter implication: children’s futures are being shaped by infrastructural decisions. When digital learning, health records, and public services are increasingly mediated through AI-enabled systems, access and reliability become equity issues. A “smart” future that is physically fragile is not progress; it is a new kind of vulnerability.
CLOSING TAKEAWAY
We are moving from the “software is eating the world” story to a harsher reality: thermodynamics is setting the terms. AI is becoming a contest over electricity, cooling, water, and the governance of the infrastructure that makes modern life work. South Africa should not treat this as a distant Silicon Valley problem. If we want AI to strengthen society rather than concentrate power, we must plan for resilience, set clear public-interest rules, and insist on transparency about the physical costs of the digital economy. The future of intelligence will be decided as much in power stations and substations as in boardrooms and labs.
Author Bio: Johan Steyn is a prominent AI thought leader, speaker, and author with a deep understanding of artificial intelligence’s impact on business and society. He is passionate about ethical AI development and its role in shaping a better future. Find out more about Johan’s work at https://www.aiforbusiness.net


