Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in the Wikipedia entry on the 'Social Web.'
The short version of the argument I'll make below:
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69 percent of social referrals came from dark social. 20 percent came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook had created a social web out of what was previously a lonely journey in cyberspace when I knew that had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone moves from a secure site (https://mail.google.com/blahblahblah) to a non-secure site (http://www.theatlantic.com); browsers deliberately withhold the referrer on that last hop.
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually had a bookmark or typed www.theatlantic.com into your browser. But that's not what's actually happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
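To make the mechanics concrete, here's a minimal server-side sketch of what an analytics program actually sees. It uses only Python's standard library; the "dark social?" labeling is my illustration of the bookkeeping problem, not how The Atlantic's (or anyone's) real analytics stack works.

```python
# A minimal sketch of the referrer problem, using only the Python
# standard library. The labels are illustrative, not a real analytics stack.
from http.server import BaseHTTPRequestHandler, HTTPServer

class ReferrerLogger(BaseHTTPRequestHandler):
    def do_GET(self):
        # Browsers send the previous page in the Referer header --
        # unless the visit came from an email program, an IM client,
        # some mobile apps, or an https page linking to an http one.
        # In those cases the header is simply absent.
        referrer = self.headers.get("Referer")  # note HTTP's historic misspelling
        if referrer:
            print(f"{self.path} <- {referrer}")  # e.g. "<- https://www.facebook.com/"
        else:
            print(f"{self.path} <- no referrer (typed? bookmarked? dark social?)")
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"ok")

if __name__ == "__main__":
    HTTPServer(("", 8000), ReferrerLogger).serve_forever()
```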
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong -- or at least whether what I had experienced was a niche phenomenon, and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- link sharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web-analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, which is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/" into a browser. They started counting these people as what they call direct social.
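The rule is simple enough to sketch in a few lines. What follows is my paraphrase of the heuristic as described above, not Chartbeat's actual code; the function name and the set of landing pages are hypothetical:

```python
# A paraphrase of the heuristic described above -- not Chartbeat's code.
# The landing-page set and function name are hypothetical illustrations.
from typing import Optional
from urllib.parse import urlparse

LANDING_PATHS = {"/", "/politics", "/technology", "/business"}  # hypothetical

def classify_visit(url: str, referrer: Optional[str]) -> str:
    """Bucket a pageview by its referrer, or by its absence."""
    if referrer:
        return "referred"   # facebook.com, google.com, twitter.com, etc.
    path = urlparse(url).path.rstrip("/") or "/"
    if path in LANDING_PATHS:
        return "direct"     # plausibly typed or bookmarked
    # A referrer-less visit to a deep article URL almost certainly
    # followed a link from somewhere unmeasurable:
    return "dark social"    # Chartbeat's "direct social"

print(classify_visit("http://www.theatlantic.com/", None))  # -> direct
print(classify_visit(
    "http://www.theatlantic.com/technology/archive/2012/10/"
    "atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/",
    None))  # -> dark social
```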
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
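Those two figures use different denominators (share of social referrals versus share of all referrals), so it's worth tying them together. A back-of-envelope calculation, assuming both percentages describe the same traffic base: by this arithmetic, social as a whole comes out to roughly a quarter of all referrals, just ahead of search.

```python
# Back-of-envelope from the aggregate figures quoted above.
# Assumes both percentages describe the same traffic base.
dark_share_of_social = 0.69   # dark social as a fraction of social referrals
dark_share_of_total = 0.175   # direct/dark social as a fraction of ALL referrals

social_share_of_total = dark_share_of_total / dark_share_of_social
print(f"all social ~ {social_share_of_total:.1%} of referrals")        # ~25.4%
print(f"Facebook   ~ {0.20 * social_share_of_total:.1%} of referrals") # ~5.1%
```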
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing moved onto the web's own technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."