Marc Whitten stands in front of an army of servers (Microsoft).
Watching the reveal of the Xbox One this week, one particular claim about Microsoft's new console caught my ear. Marc Whitten, the executive in charge of Xbox Live, the company's online gaming network, charted its historical progression.
"When we launched Xbox Live in 2002, it was powered by 500 servers. With the advent of the 360, that had grown to over 3,000," Whitten said. "Today, 15,000 servers power the modern Xbox Live experience."
Then Whitten said something extraordinary, "This year, we will have more than 300,000 servers for Xbox One, more than the entire world's computing power in 1999."
Now that's impressive! The statement even turned into an (unplanned?) applause line that tripped up Whitten's presentation. Because 1999 is not some distant date. It was the height of the dot-com bubble, after all. On an average home computer, you could play complex 3D games and download MP3s, edit video and mess around in Photoshop. Tens of millions of people had computers in their homes and Microsoft Office was nearly universal in business. Deep Blue had already beaten Garry Kasparov!
And now, not even 15 years later, that same amount of information processing -- all the nuclear physics and climate simulations and videogames and spreadsheets and databases -- was being dedicated to running just one entertainment network, just one videogame network.
First, because Martin Hilbert and Priscila López were dealing with historical terrain, they used a measure called MIPS, short for millions of instructions per second. They broke the world's computation resources into two broad categories: 1) general-purpose computing, which tracks mostly the CPUs in personal computers and videogame consoles (see chart), and 2) application-specific computing, which is composed of digital signal processors (say, in a DVD player), microcontrollers, and GPUs. By those measures, in 1999, the world had 180 billion MIPS of general-purpose computing power and 800 billion MIPS of application-specific computing power, for a total of 980 billion MIPS. With me? OK.
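The arithmetic behind those totals is simple enough to sketch in a few lines (the two category figures are the ones quoted above; nothing else is assumed):

```python
# Hilbert and López's estimates of the world's 1999 computing capacity,
# expressed in MIPS (millions of instructions per second).
general_purpose_mips = 180e9   # CPUs in personal computers, game consoles
app_specific_mips = 800e9      # DSPs, microcontrollers, GPUs

total_mips = general_purpose_mips + app_specific_mips
print(f"World total, 1999: {total_mips / 1e9:.0f} billion MIPS")
```

Which gives the 980 billion MIPS figure the rest of the comparison hangs on.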
Nowadays, though, most measurement of computing power is done in FLOPS, or floating-point operations per second. So Hilbert had to use a conversion of 1 FLOP to 0.0141 IPS (alternatively, about 71 FLOPS per IPS). This, Hilbert admits, is a "questionable" (his word) assumption, but it allows us to make some meaningful comparisons.
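As a quick sanity check, the two forms of the conversion factor quoted above are consistent with each other:

```python
# Hilbert's admittedly "questionable" conversion between instruction
# throughput and floating-point throughput.
IPS_PER_FLOP = 0.0141

# Inverting it recovers the "about 71 FLOPS per IPS" figure.
flops_per_ips = 1 / IPS_PER_FLOP
print(f"{flops_per_ips:.0f} FLOPS per IPS")
```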
Hilbert said, let's create an upper bound by taking the average performance of the bottom 100 supercomputers on the Top500 supercomputer list and imagining that Microsoft has 300,000 of them. Under those (improbable) conditions, they'd reach 300 billion MIPS, more than the general-purpose computing power of 1999, but not even a third of the total processing power available that year.
Of course, we know Microsoft is not deploying 300,000 top supercomputers, so its claim is very likely an exaggeration.
But here's the weird thing: It's not that big of an exaggeration, according to Hilbert. "Realistically, since they are using less powerful (but specialized) servers, and orienting ourselves on the computing powers that are common in the gaming industry," he said, "I think the reality is rather that the computing power of this cluster is equal to the world's total computing power in 1994 or the world's general-purpose computing power in 1996."
As he summed it up, "I'd say they are some 5 years off... but nevertheless very impressive!"
Because 1995 is less than 20 years ago. More than a quarter of American households already had a computer. This is not a comparison to the Apollo guidance computer or some IBM machine that used punch cards.
And now all of that power, all of it, resides in some cluster of computers served up by one company in Redmond, Washington, so that we can all play Call of Duty and watch movies together. How strange exponentiality (re)makes the world.