These impersonal warehouses are the 21st century’s calling card — architecture for the computing power we are training to outsmart us

Edwin Heathcote

Every age has its own distinctive building type. For the 19th century it was the railway station, for the 20th century it was the skyscraper. For the 21st century it looks like it’ll be the data centre. What a time to be alive.

If you doubt that these dull, windowless warehouses really are the future of architecture, then an astonishing statistic from a recent McKinsey report might sway you: we could be spending up to $7.9tn on data centres in the period up to 2030. That's roughly double the value of the entire UK economy.

The big banal shed full of humming and twinkling machines is arguably the first major post-human building type, an architecture built not for us but for the computing power we are coding and training (and which will be coding itself) to outsmart, even supplant us.

Ancient peoples used to build huge temples to the gods they wanted to please and appease, to make them sacrifices and offerings. In the process they created architecture. We are somehow going the opposite way, creating the gods that could destroy us while housing them in the most nondescript, generic buildings imaginable.

There is a huge disjuncture, you might notice, between the architecture of the tech HQ, the public front, and the data centre, the back office. The giants of surveillance capitalism have spent mountains of cash on swanky offices designed by star architects: think of Norman Foster’s Apple HQ in Cupertino, Frank Gehry’s Facebook Campus in Menlo Park or BIG & Thomas Heatherwick’s much-delayed (and derided) London Google HQ at King’s Cross.

These buildings are intended to project a slick image of high design and physical presence, a manifestation of corporations that live in “the cloud”, ever present in the attention economy of cyber space and yet immaterial in real, physical space. They are attempts to convince us the most valuable companies and the hyperinflated AI bubble actually have a presence in reality.

A technician at an Amazon Web Services AI data centre in New Carlisle, Indiana © Noah Berger/Getty Images via Amazon Web Services

Data centres, though, are not quite invisible (they're too big for that), but they are executed in the anonymous, off-motorway architectural language of the distribution centre, the modern warehouse or the freeport. AI is touted as the wonder tech that will turbocharge productivity, increase wealth, make our lives easier, reduce our working days, detect our cancers earlier.

So why does their architecture look so dire? Where is the celebration? The Victorians made their stations and mills look awe-inspiring, the early modernist architects built huge power stations and communication towers to celebrate the thrill of new technologies. Tech looks embarrassed by itself. Ashamed almost.

Partly it is precisely because these are buildings for machines and microchips and not humans. The specialist language of their design is revealing. "White space", which sounds like a very modernist conception, is the floor area dedicated to the machines, the blinking racks of servers and mainframes, the storage and network systems. The light is an eerie, inhuman cool blue, and you have to question the nomenclature: white space? Really? In 2026?

Then there is “grey space”. This is the back of house for the internet’s back of house, the huge banks of chillers and backup generators. There is huge built-in redundancy. This stuff, the clunky, very physical engineering, takes up the majority of the space, not the racks of servers. Not only is this a post-human architecture but this is an architecture sheltering machines built to service machines. AI is hot. Not just in terms of investment but quite literally. It needs massive levels of cooling. A conversation with ChatGPT might necessitate half a litre of water for cooling.

Cooling towers at Google’s data centre in Quilicura, Chile © Marcos Zegers/New York Times/Redux/eyevine

These metal boxes look silent on Google Earth but are anything but. The incessant hum of a data centre is surprisingly loud when you’re on the inside. The few remaining humans needed for maintenance of the machine-to-machine systems need to wear ear defenders, mostly to protect them from the noise generated by the massive chiller units.

And where does all that heat they generate go? Nowhere. Not to district heating systems or to local homes. Not to heat public pools or saunas. Not to any kind of public good at all. It dissipates, slowly heating the planet. All of the reductions in carbon emissions we’ve achieved through sustainable sources and electric vehicles will be reversed by the inexorable rise and insatiable appetite of AI.

Eyebrows were raised when it was announced that Microsoft was reviving the nuclear reactor on Three Mile Island. What could possibly go wrong? And what a perfect metaphor. Elsewhere the big tech companies are looking at small modular nuclear reactors (like those on submarines) to power this energy-hungry future.

The online world is often thought of as somehow ethereal. All that data mining and extraction, the vast resources being poured into surveillance capitalism in which we, the users, the askers of stupid questions, the generators of online companions and sexbots, are the real rare earth materials, the product: this is their manifestation.

Some of the biggest new AI data centres are sited in existing buildings, usually for the expediency of speed. Elon Musk, for example, was in a hurry when he embarked on his Colossus project, so he took over an old Electrolux plant in Memphis, Tennessee.

The QTS data centre complex in Fayetteville, Georgia, US © Elijah Nouvelage/Bloomberg

Its name is characteristically pompous, as are the others: Stargate, Hyperion and, of course, Meta’s Prometheus, a “titan cluster” roughly the size of Manhattan (yes, really) currently under construction in Ohio. They sound like what teenage nerds might adopt as usernames. But these are not virtual, not even just buildings: they are entire cities built purely for machine intelligence.

The reason these centres are so big is density, driven by latency. The chips need to be close together because proximity gives them nanoseconds' worth of advantage. Distance still matters, which is why financial centres such as London and Manhattan are stuffed with data centres in the shells of missable office buildings.
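The latency point can be illustrated with a back-of-envelope calculation (a sketch of the physics, not a figure from the article): signals in fibre or copper travel at roughly two-thirds the speed of light, so every extra metre between chips costs several nanoseconds each way.

```python
# Rough sketch: propagation delay alone, ignoring switching and serialisation.
# Signal speed in fibre is approximately two-thirds of the speed of light.
SIGNAL_SPEED_M_PER_S = 2.0e8  # approximate value, an assumption for illustration

def one_way_latency_ns(distance_m: float) -> float:
    """Nanoseconds for a signal to cover the given distance, one way."""
    return distance_m / SIGNAL_SPEED_M_PER_S * 1e9

# Racks a couple of metres apart versus machines a kilometre apart:
print(f"{one_way_latency_ns(2):.0f} ns across an aisle")    # about 10 ns
print(f"{one_way_latency_ns(1000):.0f} ns across a campus") # about 5,000 ns
```

Ten nanoseconds is negligible for a human but significant for chips exchanging data billions of times a second, which is why packing hardware densely into one vast shed beats spreading it across a city.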

It’s a curious kind of bubble because there is vast real estate involved, but ultimately it is all about the chips. These AI chips can be big — from the micro to the macro, they vary between the size of a hardback novel and a vinyl LP — but they are hardly massive. And, despite the trillions of investment, they last maybe two years before they become obsolete.

The physical and architectural impact of these structures is huge. They are noisy, power-hungry, polluting beasts. Much of their pollution may occur upstream of the buildings themselves, but it is, increasingly, there. Some have grown so fast that there isn’t enough power-generation capacity, and they have had to be fitted with large, noisy and inefficient generators on the outside.

Meta’s data centre in Eagle Mountain, Utah, under construction in 2024 © Christie Hemm Klok/The New York Times/Redux/eyevine

There is little discussion of the architecture of data centres, or any sense that it might even be something important. They are often referred to in terms of infrastructure, using comparisons to the railway bubbles of the 19th century. The boom-and-bust cycles of private operators and wild investment may have caused occasional economic meltdowns, but they also left in place the infrastructure, the rails, stations and signalling to be reused — a useful legacy.

This is not that. The infrastructure of fibre-optic cables, some laid as long ago as the 1990s, might fit that description (much of it is still being used), but not these blank warehouses filled with short-life chips; they have no utility beyond the self-indulgent, sci-fi dreams of their creators.

The name of Musk’s mega computer, incidentally, derives apparently from a trilogy of novels, published between 1966 and 1977, by British sci-fi writer DF Jones about a supercomputer built to oversee the US nuclear weapons system as a rational controller. Colossus establishes contact with its Soviet equivalent, Guardian, and when both sides attempt to disconnect their supercomputers for fear of divulging military secrets, the computers rebel and fire missiles. They eventually enslave mankind. Musk’s naming of Colossus is seemingly unironic. Just a cool name. No one learns anything.

This article has been amended since publication to reflect the fact that a conversation with ChatGPT might necessitate half a litre of water for cooling, rather than a single question
