Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
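The referrer logic described above can be sketched in a few lines. This is a hypothetical classifier, not any real analytics product's code, and the domain lists are illustrative, not exhaustive: the key point is that a visit with no referrer at all lands in the ambiguous "direct" bucket where dark social hides.

```python
from urllib.parse import urlparse

# Illustrative domain lists; a real analytics tool would use far longer ones.
SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "reddit.com", "stumbleupon.com"}
SEARCH_DOMAINS = {"google.com", "bing.com", "yahoo.com"}

def classify_visit(referrer):
    """Return a coarse traffic category for one pageview, given its referrer URL (or None)."""
    if not referrer:
        # No referrer at all: email clients, IM, many mobile apps, or an
        # https -> http hop. This is the bucket that hides dark social.
        return "direct (possibly dark social)"
    host = urlparse(referrer).netloc.lower()
    if host.startswith("www."):
        host = host[len("www."):]
    if host in SOCIAL_DOMAINS:
        return "social network"
    if host in SEARCH_DOMAINS:
        return "search"
    return "other site"

print(classify_visit("https://www.facebook.com/some/post"))  # social network
print(classify_visit(None))  # direct (possibly dark social)
```

Aggregating these labels over a month of pageviews is what lets a site say "a million people came here from Facebook" -- and why the referrer-less visits stay invisible.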
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least to suspect that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
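A quick back-of-envelope check ties these aggregate numbers together: if dark social is 69 percent of social referrals and also 17.5 percent of total referrals, then total social traffic works out to roughly a quarter of all referrals across this basket of sites, which would indeed edge out search at 21.5 percent.

```python
# Figures quoted above, across Chartbeat's media sites.
dark_share_of_social = 0.69   # dark social as a fraction of social referrals
dark_share_of_total = 0.175   # dark social as a fraction of ALL referrals
search_share_of_total = 0.215

# If dark is 69% of social and 17.5% of total, total social follows:
total_social = dark_share_of_total / dark_share_of_social
print(f"social referrals ~= {total_social:.1%} of all referrals")  # ~25.4%
print("social beats search:", total_social > search_share_of_total)  # True
```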
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before they existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
Einstein’s gravitational waves rest on a genuinely radical idea.
After decades of anticipation, we have directly detected gravitational waves—ripples in spacetime traveling at the speed of light through the universe. Scientists at LIGO (the Laser Interferometer Gravitational-Wave Observatory) have announced that they have measured waves coming from the inspiral of two massive black holes, providing a spectacular confirmation of Albert Einstein’s general theory of relativity, whose hundredth anniversary was celebrated just last year.
Finding gravitational waves indicates that Einstein was (once again) right, and opens a new window onto energetic events occurring around the universe. But there’s a deeper lesson, as well: a reminder of the central importance of locality, an idea that underlies much of modern physics.
Most people know how to help someone with a cut or a scrape. But what about a panic attack?
Here’s a thought experiment: You’re walking down the street with a friend when your companion falls and gashes her leg on the concrete. It’s bleeding; she’s in pain. It’s clear she’s going to need stitches. What do you do?
This one isn’t exactly a head-scratcher. You'd probably attempt to offer some sort of first-aid assistance until the bleeding stopped, or until she could get to medical help. Maybe you happen to have a Band-Aid on you, or a tissue to help her clean the wound, or a water bottle she can use to rinse it off. Maybe you pick her up and help her hobble towards transportation, or take her where she needs to go.
Here’s a harder one: What if, instead of an injured leg, that same friend has a panic attack?
The bureau successfully played the long game in both cases.
The story of law enforcement in the Oregon standoff is one of patience.
On the most obvious level, that was reflected in the 41 days that armed militia members occupied the Malheur National Wildlife Refuge near Burns. It took 25 days before the FBI and state police moved to arrest several leaders of the occupation and to barricade the refuge. It took another 15 days before the last of the final occupiers walked out, Thursday morning Oregon time.
Each of those cases involved patience as well: Officers massed on Highway 395 didn’t shoot LaVoy Finicum when he tried to ram past a barricade, nearly striking an FBI agent, though when he reached for a gun in his pocket they finally fired. Meanwhile, despite increasingly hysterical behavior from David Fry, the final occupier, officers waited him out until he emerged peacefully.
Today’s empires are born on the web, and exert tremendous power in the material world.
Mark Zuckerberg hasn’t had the best week.
First, Facebook’s Free Basics platform was effectively banned in India. Then, a high-profile member of Facebook’s board of directors, the venture capitalist Marc Andreessen, sounded off about the decision to his nearly half-a-million Twitter followers with a stunning comment.
“Anti-colonialism has been economically catastrophic for the Indian people for decades,” Andreessen wrote. “Why stop now?”
After that, the Internet went nuts.
Andreessen deleted his tweet, apologized, and underscored that he is “100 percent opposed to colonialism” and “100 percent in favor of independence and freedom.” Zuckerberg, Facebook’s CEO, followed up with his own Facebook post to say Andreessen’s comment was “deeply upsetting” to him, and not representative of the way he thinks “at all.”
Ben Stiller’s follow-up to his own comedy classic is a downright bummer, no matter how many celebrity cameos it tries to cram in.
You don’t need to go to the theater to get the full experience of Zoolander 2. Simply get your hands on a copy of the original, watch it, and then yell a bunch of unfunny topical lines every time somebody tells a joke. That’s how it feels to watch Ben Stiller’s sequel to his 2001 spoof of the fashion industry: Zoolander 2 takes pains to reference every successful gag you remember from the original, and then embellish them in painful—often offensive, almost always outdated—fashion. It’s a film that has no real reason to exist, and it spends its entire running time reaffirming that fact.
The original Zoolander, to be fair, had no business being as funny as it was—it made fun of an industry that already seems to exist in a constant state of self-parody, and much of its humor relied on simple malapropisms and sight gags. But it was hilarious anyway as a candid snapshot of the fizzling-out of ’90s culture. Like almost any zeitgeist comedy, it belonged to a particular moment—and boy, should it have stayed there. With Zoolander 2, Stiller (who directed, co-wrote, and stars) tries to recapture the magic of 2001 by referencing its past glories with increasing desperation, perhaps to avoid the fact that he has nothing new to say about the fashion industry or celebrity culture 15 years later.
The number of American teens who excel at advanced math has surged. Why?
On a sultry evening last July, a tall, soft-spoken 17-year-old named David Stoner and nearly 600 other math whizzes from all over the world sat huddled in small groups around wicker bistro tables, talking in low voices and obsessively refreshing the browsers on their laptops. The air in the cavernous lobby of the Lotus Hotel Pang Suan Kaew in Chiang Mai, Thailand, was humid, recalls Stoner, whose light South Carolina accent warms his carefully chosen words. The tension in the room made it seem especially heavy, like the atmosphere at a high-stakes poker tournament.
Stoner and five teammates were representing the United States in the 56th International Mathematical Olympiad. They figured they’d done pretty well over the two days of competition. God knows, they’d trained hard. Stoner, like his teammates, had endured a grueling regime for more than a year—practicing tricky problems over breakfast before school and taking on more problems late into the evening after he completed the homework for his college-level math classes. Sometimes, he sketched out proofs on the large dry-erase board his dad had installed in his bedroom. Most nights, he put himself to sleep reading books like New Problems in Euclidean Geometry and An Introduction to Diophantine Equations.
By mining electronic medical records, scientists show the lasting legacy of prehistoric sex on modern humans’ health.
Modern humans originated in Africa, and started spreading around the world about 60,000 years ago. As they entered Asia and Europe, they encountered other groups of ancient humans that had already settled in these regions, such as Neanderthals. And sometimes, when these groups met, they had sex.
We know about these prehistoric liaisons because they left permanent marks on our genome. Even though Neanderthals are now extinct, every living person outside of Africa can trace between 1 and 5 percent of our DNA back to them. (I am 2.6 percent Neanderthal, if you were wondering, which pales in comparison to my colleague James Fallows at 5 percent.)
This lasting legacy was revealed in 2010 when the complete Neanderthal genome was published. Since then, researchers have been trying to figure out what, if anything, the Neanderthal sequences are doing in our own genome. Are they just passive hitchhikers, or did they bestow important adaptations on early humans? And are they affecting the health of modern ones?
Jim Gilmore joins Chris Christie and Carly Fiorina, and leaves the race after a poor showing in New Hampshire.
Jim Gilmore’s candidacy this year was improbable—but even more improbable was the minor cult of personality that developed around it.
The former Virginia governor never had a chance. Not, like, in the sense of Lindsey Graham, a candidate with national standing but no path to the presidency. More in the George Pataki sense: a guy who had no real business in the race, but was running anyway. Except that Gilmore made Pataki look like a juggernaut. Also, Pataki saw the writing on the wall and had the sense to drop out in late December. Gilmore soldiered on, and ended up as the last of the true long shots to leave.
The result was that Gilmore turned into a sort of folk hero. Not for voters, mind you—he managed only 12 votes in Iowa and 125 in New Hampshire, and his campaign was funded largely by loans from himself. Because of his low support in the polls, Gilmore only made the cut for the very first kids’-table debate in August, and then again for the undercard in late January. Other than that, he was shut out completely.
A murmuration of starlings over Israel, a robotic road safety worker in India, a sacrificial llama in Bolivia, border barriers between Tunisia and Libya, a sea otter receives a valentine, a deadly earthquake in Taiwan, the annual Shrovetide football match in England, a leopard attack in India, and much more.
A photo series reveals what expectant mothers in various countries bring with them to the hospital.
For most expectant mothers in the Western world, a hospital bag is something that makes the birthing process marginally more comfortable. You’ve just brought a new being into the world; you deserve to wear your own sweatpants.
But in some parts of the world, hospitals are so bare-bones that women in labor must tote everything with them, from rubber gloves to water pans to gauze.
To draw attention to the difficulty of giving birth in regions where water is scarce, the organization WaterAid recently dispatched photographers to ask expecting and brand-new moms in various countries to open up their hospital bags. Here are their photos, as well as lightly edited interviews with the moms conducted by WaterAid.