On March 21, 2006, a small team at a podcasting startup in San Francisco tested a simple product. It was a texting megaphone: Send an SMS message to the number 40404, and the service would forward it to all your friends.
After that, the historical record gets cloudier. The four men with some responsibility for the digital doodad were Jack Dorsey, Evan Williams, Biz Stone, and Noah Glass. They called it Twitter, because using it felt amusing and insubstantial, but they spelled it twttr, because vowels were still out of fashion in startup names. Also, Twitter.com was taken.
That afternoon, Jack Dorsey sent the first tweet:
just setting up my twttr — Jack (@jack), March 21, 2006
Today is the 10th birthday of that tweet. Dorsey, a onetime NYU undergrad and later the company’s first CEO, is now CEO again. Twitter is no longer a small, quirky, unprofitable startup, but a big, famous, unprofitable corporation. And, having come in for a beating recently, it is thrilled to have a self-evidently positive PR event like a double-digit birthday. It’s releasing GIFs and blog posts and data visualizations to celebrate.
Of course you can quibble with the anniversary. Twitter describes March 21 as its birthday, but the event commemorated more closely resembles its conception. Twitter didn’t even open to the public until July 15, 2006. Later that year, it made a website where it displayed all the tweets. (Yes: all the tweets.) It wasn’t until February 2007 that Twitter itself superseded the podcasting company that founded it, Odeo.
Among users, Twitter didn’t catch on in any sizable way until the South by Southwest Interactive conference in March 2007. That week, it went from publishing 20,000 tweets per day to more than 60,000. Now, it publishes about 500 million. Incidentally, I joined Twitter that month too.
I like what Chris Hayes tweeted last week: “I feel my generation’s version of ‘this rocker turning X age makes me feel old’ are when apps and websites celebrate birthdays.”
Twitter’s 10th anniversary is eminently amenable to stock-taking, but how should we take stock? After all, there’s been no shortage of chances over the last decade to note How Twitter Has Changed Things. Infinite comparisons can be drawn between Twitter and any other social network. Facebook, two years Twitter’s senior, always had the users and the power, but Twitter’s form dictated the terms of our new media reality. Certainly it created the space for the media micro-observation: In more than the obvious sense, I wouldn’t be quoting an MSNBC anchor’s random musing about Twitter if he hadn’t tweeted it.
It also doesn’t seem worth dwelling on Twitter’s shortcomings. (For that, I can wait for earnings day.) Instead, it’s worth seeing Twitter not just as a 10-year-old social network, but as a product of its time.
The winter of 2006 was, in many ways, a nadir for a kind of American progressivism. Months earlier, Hurricane Katrina, and the Bush administration’s incompetent handling of it, had killed more than a thousand Americans. March 20, 2006, was the third anniversary of the Iraq invasion, and the country was stumbling into a de facto Sunni-Shi’a civil war. (By some estimates, more Iraqi civilians died that year than in the first two years of the invasion combined.) From a purely partisan standpoint, the Democratic takeover of Congress—and the first glimpse of what an Obama coalition might look like—was eight months away.
But on the Internet, something new seemed to be happening. Teenagers were spending hours on Myspace. Almost out of nowhere, Wikipedia had appeared as a full-fledged encyclopedia, at the top of half the Google searches. And YouTube was exploding: In November 2006, Google acquired it for $1.65 billion.
These sites could only exist because of certain technological advances. In the mid-2000s, enough Americans got broadband-speed Internet that web video became suddenly feasible. Meanwhile, web developers at Google and elsewhere figured out a new way to build a web page (the cluster of techniques soon called Ajax) so that it could behave like desktop software in the browser.
But YouTube, Wikipedia, and eventually Twitter represented cultural hypotheses as much as engineering achievements, borne by that old Brandian mix of communitarianism and libertarianism, as well as by a frustrated progressivism. Common wisdom holds that the new tech boom, the whole educated-millennials-moving-to-San-Francisco thing, is a post-Great Recession phenomenon, but 2006 was the year that Time declared “You” the person of the year. Read it today, and its announcement becomes an artifact of the era:
Look at 2006 through a different lens and you’ll see another story, one that isn’t about conflict or great men. It’s a story about community and collaboration on a scale never seen before. It’s about the cosmic compendium of knowledge Wikipedia and the million-channel people’s network YouTube and the online metropolis MySpace. It’s about the many wresting power from the few and helping one another for nothing and how that will not only change the world, but also change the way the world changes.
[…] We’re ready to balance our diet of predigested news with raw feeds from Baghdad and Boston and Beijing. You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy-strewn basement rec rooms—than you could from 1,000 hours of network television.
[…] Web 2.0 is a massive social experiment, and like any experiment worth trying, it could fail. There’s no road map for how an organism that’s not a bacterium lives and works together on this planet in numbers in excess of 6 billion. But 2006 gave us some ideas. This is an opportunity to build a new kind of international understanding, not politician to politician, great man to great man, but citizen to citizen, person to person.
I remember that essay feeling faddish at the time, but now it’s hard to read it and not think of what was to come. The Obama campaign, the Arab Spring, even the solidarity of the SOPA-PIPA protests—the seeds of those movements, and especially what people wanted to see in those moments, were planted that year.
Last week, I was talking to the writer Sarah Jeong about how (and whether) the Internet has aided Donald Trump’s rise. We got to how the Internet’s design, its dream of universal publication for anyone who wants it—a concept inherent in Twitter’s design, as well—is inherently optimistic. She remembered Justice Oliver Wendell Holmes’s dissent in a 1919 Supreme Court case, Abrams v. United States.
In that opinion, Holmes rued that pamphleteers had been convicted under the Espionage Act. The theory of free democracy, he said, was that all ideas must be allowed to triumph or perish in the marketplace of ideas:
When men have realized that time has upset many fighting faiths, they may come to believe, even more than they believe the very foundations of their own conduct, that the ultimate good desired is better reached by free trade in ideas—that the best test of truth is the power of the thought to get itself accepted in the competition of the market, and that truth is the only ground upon which their wishes safely can be carried out. That, at any rate, is the theory of our Constitution. It is an experiment, as all life is an experiment. Every year, if not every day, we have to wager our salvation upon some prophecy based upon imperfect knowledge.
I don’t know if “Web 2.0 is a massive social experiment, and like any experiment worth trying, it could fail” was meant to recall Holmes’s dissent: “It is an experiment, as all life is an experiment.” But in 2016, with the Web 2.0 experiment now at society-sized scale, Holmes’s insight seems ever more relevant. Facebook may be the most popular social network globally, but Twitter is the real marketplace: the plaza where millions of people log on every day to make their own warring provocations and micro-observations, a realm as open to #BlackLivesMatter activists as to a nationalist presidential candidate.
When I think of 10 years of Twitter, I think of 10 years of watching the timeline view: the way the social network compresses everything into one endless, disorienting, friendly-anxious stream. Distant nighttime protests sit next to a friend’s baking experiment sit next to a blurry concert video; pleas for justice next to photos of opulence next to the latest discovery in genetics. This is Twitter: Hour by hour, as activists and demagogues prepare to hawk, peddle, and champion some favored idea—or just sell themselves—it sunders us from the world that once was.