Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in the Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the '90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
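The mechanism behind that little piece of metadata is just the HTTP Referer header. A minimal sketch of how an analytics pipeline might aggregate it (the log lines and helper below are hypothetical, not The Atlantic's actual system):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical (Referer header, landing page) pairs as a server log might record them.
visits = [
    ("https://www.facebook.com/", "/technology/archive/2012/10/dark-social/"),
    ("https://twitter.com/somebody/status/123", "/politics/"),
    ("", "/technology/archive/2012/10/dark-social/"),  # no Referer header sent at all
]

def referrer_domain(referrer: str) -> str:
    """Reduce a raw Referer header to its domain, or 'direct' if it is missing."""
    if not referrer:
        return "direct"
    return urlparse(referrer).netloc

# Aggregate: "a million people came here from Facebook last month," or whatever.
counts = Counter(referrer_domain(r) for r, _ in visits)
print(counts)  # e.g. Counter({'www.facebook.com': 1, 'twitter.com': 1, 'direct': 1})
```

The visits with an empty Referer string are exactly the ones the rest of this piece is about: the server sees them, but has no idea where they came from.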
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong, or at least whether what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web-analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they showed us an accounting change they'd made. They took visitors who showed up without referrer data and split them into two categories. The first was people going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
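Chartbeat's accounting change amounts to a simple classification rule. Here is a rough sketch of that logic (my reconstruction for illustration, not their actual code; the landing-page list is hypothetical):

```python
from urllib.parse import urlparse

# Pages a reader plausibly types or bookmarks (hypothetical, per-site list).
LANDING_PAGES = {"/", "/politics", "/politics/", "/technology", "/technology/"}

def classify(referrer: str, url: str) -> str:
    """Reconstruction of the 'direct social' split: a referrer-less visit to a
    deep article URL is counted as dark social, because nobody types a long
    archive path into the address bar by hand."""
    if referrer:
        return "referred"       # normal, measurable traffic
    path = urlparse(url).path
    if path in LANDING_PAGES:
        return "direct"         # plausibly typed or bookmarked
    return "dark social"        # followed an unmeasurable link (email, IM, ...)

print(classify("", "http://www.theatlantic.com/technology/archive/2012/10/263409/"))
# -> dark social
```

The heuristic can misfire in both directions (someone might bookmark an article, or type a short path), but at aggregate scale it gives a workable lower bound on link-following traffic that analytics would otherwise lump in with "direct."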
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) by the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."