Billion-Dollar Data Centers Are Taking Over the World
When Sam Altman said one year ago that OpenAI's Roman Empire is the actual Roman Empire, he wasn't kidding. In the same way that the Romans gradually amassed an empire of land spanning three continents and one-ninth of the earth's circumference, the CEO and his cohort now occupy the planet with their own latifundia – not farmlands, but AI data centers.
Tech executives such as Altman, Nvidia CEO Jensen Huang, Microsoft CEO Satya Nadella, and Oracle co-founder Larry Ellison are fully bought into the idea that the future of the US (and possibly the global) economy lies in these new warehouses full of IT infrastructure. But data centers are, of course, not really new. In the earliest days of computing, giant power-hungry mainframes sat in climate-controlled rooms, with coaxial cables carrying information from the mainframe to a terminal. Then the consumer Internet boom of the late 1990s ushered in a new era of infrastructure: massive buildings began to appear on the outskirts of Washington, DC, with racks and racks of computers that stored and processed data for tech companies.
A decade later, “the cloud” became the squishy infrastructure of the Internet. Storage became cheaper. Some companies, such as Amazon, capitalized on this. Giant data centers continued to proliferate, but instead of maintaining a mix of on-site servers and rented data center racks, tech companies outsourced their computing needs to virtualized environments. (“What is the cloud?” a perfectly intelligent family member asked me in the mid-2010s, “and why am I paying for 17 different subscriptions to that?”)
All the while tech companies were storing petabytes of data, data that people were willingly sharing online, in corporate workspaces, and through mobile apps. Companies began to find new ways to mine and structure this “Big Data” and promised it would change lives. In many ways it did. You had to know where this was going.
Now the tech industry is in the fever-dream days of generative AI, which demands new levels of computing resources. Big Data is tired; large data centers are here, and wired – for AI. Faster, more efficient chips are needed to power AI data centers, and chip makers like Nvidia and AMD have been jumping up and down on the proverbial couch, proclaiming their love for AI. The industry has entered an unprecedented era of capital investment in AI infrastructure, spending heavy enough to help keep US GDP growth in positive territory. These are massive, revolving deals that might as well be cocktail-party handshakes, smeared with gigawatts and exuberance, while the rest of us try to track the real contracts and dollars.
OpenAI, Microsoft, Nvidia, Oracle, and SoftBank have made some of the biggest deals. This year, a previous supercomputing project between OpenAI and Microsoft, called Stargate, became the vehicle for a massive AI infrastructure buildout in the US. (President Donald Trump called it the largest AI infrastructure project in history, because of course he did, but in this case that may not have been hyperbole.) Altman, Ellison, and SoftBank CEO Masayoshi Son were all in on the deal, pledging $100 billion to start, with plans to invest up to $500 billion in Stargate over the next few years. Nvidia GPUs would be deployed. Later, in July, OpenAI and Oracle announced an additional Stargate partnership – SoftBank curiously absent – measured in gigawatts of capacity (4.5) and expected jobs (about 100,000).