A new company uses big-data capabilities to decode the inner workings of the modern Internet.
What is the Internet?
It used to be simple. If you were a regular Internet user, you used a modem to commandeer your telephone line and dial into a service provider. Once connected, you'd type an address into Netscape, and the servers at that address would send you information over the network of networks.
Nowadays, things are much more complicated. Every time you load a web page, it calls out to dozens of other servers, which often dynamically send content to that original page. Meanwhile, content-delivery providers like Akamai and EdgeCast hold copies of many websites' content close to users to speed up the delivery of that information. Add it all up and you are a far cry from the relatively simple architecture of yesteryear. It's hard even for the big bandwidth providers to tell what's going on.
Today, we met a company that can map out this new world of the Internet: Deep Field can decode the tangled web.
"We're in the second era of the Internet as we're seeing this massive influx of money and new services," said Deep Field's CEO Craig Labovitz. "Most of it is happening underneath the hood, but we're literally watching the Internet be rebuilt. It looks completely different than it did four years ago."
This is not hyperbole. Labovitz has been working, thinking, and writing about the Internet backbone for 20 years.
"I started my career working on the precursor to the commercial Internet," Labovitz told us. "I was one of the first backbone engineers on what was then the NSFNet, the National Science Foundation Network, predating the launch of the commercial Internet. Even back then, we had problems. In fact, I broke the Internet on a couple of occasions. But back circa 1993, no one really noticed."
The NSFNet backbone actually grew out of a Michigan project called the Merit Network, which, incidentally, has one of those Wikipedia pages that makes you go, "Dang, how did I not know about this immensely important piece of Internet history?" It wasn't until 1995 that the University of Michigan stopped being the hub of Internet backbone engineering.
All that to say, these Michigan guys know what a network backbone looks like, so when they tell you the subterranean architecture of the Internet is changing, it's worth listening. Venture capitalists have. The lead investor on their first round of funding was Silicon Valley heavyweight DFJ, but local VC RPM also got in on the round.
So, what does Deep Field do, exactly? They help the big network players understand their traffic. An obvious example is Netflix. People watching Netflix suck up a lot of bandwidth, but Netflix servers aren't the ones serving that stuff up directly to consumers. Akamai and other companies are the ones actually sending your computer the packets that make up 30 Rock. Why does that matter? Well, if you're building out half a billion dollars' worth of network infrastructure, you should probably know how people are using what you've already got in place. Without Deep Field, you might not know how much all that Netflix viewing is really costing you.
Deep Field provides their maps of the deep Internet by crawling the web, like a kind of SuperGoogle. They say they're "applying big data to the Internet itself." The man in charge of making sense of it all is Chief Data Scientist Naim Falandino, a graduate of both Michigan State (undergrad, computer science) and University of Michigan (School of Information). He grew up near Detroit and is precisely the kind of young, talented, trained person that mayors' economic-development officers want to stick around.
Why'd he put down roots in Ann Arbor? Part of it is clearly the challenge of understanding this new era of the Internet. But there's also something we've noticed all over this state: people from Michigan have a ton of state pride. You see it on billboards and you see it on t-shirts. You see it on Falandino's face when he says, "I don't want to be one of those young people who left."