Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook had somehow created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
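The mechanics above can be sketched in a few lines. This is a hypothetical handler, not The Atlantic's actual analytics code; the bucket names and substring checks are my own illustration of how a server inspects the Referer header (and what it sees when the header is absent):

```python
# Minimal sketch of server-side referrer bucketing, assuming incoming
# request headers arrive as a plain dict. Illustrative only.

def classify_referrer(headers):
    """Bucket an incoming request by its Referer header, if any."""
    referrer = headers.get("Referer")  # note HTTP's historical misspelling
    if referrer is None:
        # Email clients, IM, some mobile apps, and HTTPS-to-HTTP
        # transitions all arrive with no Referer at all.
        return "no-referrer"
    if "facebook.com" in referrer:
        return "facebook"
    if "twitter.com" in referrer or "t.co" in referrer:
        return "twitter"
    return "other"
```

Aggregating those buckets over a month is what lets a publisher say "a million people came here from Facebook." The "no-referrer" bucket is where the trouble starts.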
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least whether what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
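Chartbeat's accounting change amounts to a simple rule: no referrer plus a deep article URL equals dark social. A rough sketch of that rule, assuming a small hand-picked set of landing-page paths (the path list and function name here are illustrative, not Chartbeat's actual implementation):

```python
from urllib.parse import urlparse

# Hypothetical set of pages a reader might plausibly type by hand;
# a real site would derive this from its own section structure.
LANDING_PATHS = {"/", "/politics", "/technology", "/business"}

def classify_direct_visit(url):
    """Split a no-referrer visit the way Chartbeat's 'direct social'
    measure does: landing pages count as true direct traffic, while a
    deep article URL was almost certainly a followed link."""
    path = urlparse(url).path.rstrip("/") or "/"
    return "direct" if path in LANDING_PATHS else "dark-social"
```

The whole insight rests on that one asymmetry: people type homepages, but nobody types a 120-character archive URL.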
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before they existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're exchanging our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
The political commentator may be more committed to the Republican nominee’s platform than he is.
Donald Trump has just betrayed Ann Coulter. Which is a dangerous thing to do.
This week, Coulter released her new book, In Trump We Trust. As the title suggests, it’s a defense of Trump. But more than that, it’s a defense of Trumpism. Most Trump surrogates contort themselves to defend whatever The Donald says, no matter its ideological content. They’re like communist party functionaries. They get word from the ideologists on high, and regurgitate it as best they can.
Coulter is different. She's an ideologist herself. She realized the potency of the immigration issue among conservatives before Trump did. On June 1 of last year, she released Adios America, which devotes six chapters to the subject of immigrants and rape. Two weeks later, Trump—having received an advance copy—famously picked up the thread in his announcement speech.
A new anatomical understanding of how movement controls the body’s stress response system
Elite tennis players have an uncanny ability to clear their heads after making errors. They constantly move on and start fresh for the next point. They can’t afford to dwell on mistakes.
Peter Strick is not a professional tennis player. He’s a distinguished professor and chair of the department of neurobiology at the University of Pittsburgh Brain Institute. He’s the sort of person to dwell on mistakes, however small.
“My kids would tell me, ‘Dad, you ought to take up Pilates. Do some yoga,’” he said. “But I’d say, as far as I’m concerned, there's no scientific evidence that this is going to help me.”
Still, the meticulous skeptic espoused more of a tennis approach to dealing with stressful situations: Just teach yourself to move on. Of course there is evidence that ties practicing yoga to good health, but not the sort that convinced Strick. Studies show correlations between the two, but he needed a physiological mechanism to explain the relationship. Vague conjecture that yoga “decreases stress” wasn’t sufficient. How? Simply by distracting the mind?
How men and women digest differently, diet changes our skin, and gluten remains mysterious: A forward-thinking gastroenterologist on eating one's way to "gutbliss"
Robynne Chutkan, MD, is an integrative gastroenterologist and founder of the Digestive Center for Women, just outside of Washington, D.C. She trained at Columbia University and is on faculty at Georgetown, but her approach to practicing medicine and understanding disease is more holistic than many specialists with academic backgrounds. She has also appeared on The Dr. Oz Show (of which I’ve been openly skeptical in the past, because of Oz’s tendency to divorce his recommendations from evidence).
Officials say they face a public-health emergency, and believe a batch of the opioid may be tainted with an elephant tranquilizer.
NEWS BRIEF Cincinnati is facing a public-health emergency, as an estimated 174 people overdosed on heroin in the last six days.
Police in the Ohio city are trying to find the source of the heroin batch. Tim Ingram, the Hamilton County health commissioner, told reporters Friday the number of hospital visits this week has been “unprecedented.”
Officials are pointing to a potential cause of the overdoses, as the Associated Press reports:
Cincinnati City Manager Harry Black said authorities suspect carfentanil, a drug used to sedate elephants and other large animals, may be mixed in with heroin and causing the overdoses. The drug is 100 times more potent than fentanyl, which is suspected in spates of overdoses in several states.
Last month, carfentanil was discovered in the Cincinnati area's heroin stream, but many hospitals don't have the equipment to test blood for the previously uncommon animal opioid.
If Hillary Clinton beats Donald Trump, her party will have set a record in American politics.
If Donald Trump can’t erase Hillary Clinton’s lead in the presidential race, the Republican Party will cross an ominous milestone—and confront some agonizing choices. Democrats have won the popular vote in five of the six presidential elections since 1992. (In 2000, Al Gore won the popular vote but lost the Electoral College and the White House to George W. Bush.) If Clinton maintains her consistent advantage in national and swing-state polls through Election Day, that means Democrats will have won the popular vote in six of the past seven presidential campaigns.
Since the 1828 election of Andrew Jackson that historians consider the birth of the modern two-party system, no party has ever won the presidential popular vote six times over seven elections. Even the nation’s most successful political figures have fallen short of that standard.
An increasing number of respondents are checking “Some Other Race” on U.S. Census forms, forcing officials to rethink current racial categories.
Something unusual has been taking place with the United States Census: A minor category that has existed for more than 100 years is elbowing its way forward. “Some Other Race,” a category that first entered the form as simply “Other” in 1910, was the third-largest category after “White” and “Black” in 2010, alarming officials, who are concerned that if nothing is done ahead of the 2020 census, this non-categorizable category of people could become the second-largest racial group in the United States.
Among those officials is Roberto Ramirez, the assistant division chief of the Census Bureau’s special population statistics branch. Ramirez is familiar with the complexities of filling out the census form: He checks “White” and “Some Other Race” to reflect his Hispanic ethnicity. Ramirez joins a growing share of respondents who are selecting “Some Other Race.” “People are increasingly not answering the race question. They are not identifying with the current categories, so we are trying to come up with a (better) question,” Ramirez told me. Ramirez and his colleague, Nicholas Jones, the director of race and ethnic research and outreach at the Census Bureau, have been working on fine-tuning the form to extract detailed race and ethnic reporting, and subsequently drive down the number of people selecting “Some Other Race.”
History repeats itself, it is often said. The strife facing modern-day Libya—strife largely born of and fueled by internal, sometimes tribal divisions—is only the latest iteration in a longstanding pattern. As the Italians discovered during their colonization of Libya, and as ISIS discovered when it conquered Sirte, and as the international community has recently discovered in a multitude of ways, Libya is a deeply divided country. Without a real approach to that reality—including, perhaps, creating a confederal model for Libya—Libyans themselves will continue to be their own worst enemies.
Libya’s tribal divisions were long a reality for the Italians, who occupied the North African country from 1912, after winning it from Turkey, to 1943, when they lost it to the British. Italy also used those divisions to its advantage in early 1928, when it defeated the rebellious tribes of Mogharba and many others who were engaged in a fight against the Italian Royal Army, but also—and above all—against each other. The Italians occupied the Corridoio Sirtico (Sirtic Corridor), an ideal break line, and conquered the oases of al-Jufrah, Zellah, Awjilah, and Gialo, isolated in the Cyrenaic desert, more than 150 miles from the Mediterranean Sea. Shortly afterwards, three gruppi mobili (mobile groups), formed by thousands of Italian soldiers, moved in from Tripolitania and Cyrenaica in a pincer movement. The target: the rebels in the Sirtic Corridor, who also fell.
The professor who teaches Classical Chinese Ethical and Political Theory claims, "This course will change your life."
Picture a world where human relationships are challenging, narcissism and self-centeredness are on the rise, and there is disagreement on the best way for people to live harmoniously together.
It sounds like 21st-century America. But the society that Michael Puett, a tall, 48-year-old bespectacled professor of Chinese history at Harvard University, is describing to more than 700 rapt undergraduates is China, 2,500 years ago.
Puett's course Classical Chinese Ethical and Political Theory has become the third most popular course at the university. The only classes with higher enrollment are Intro to Economics and Intro to Computer Science. The second time Puett offered it, in 2007, so many students crowded into the assigned room that they were sitting on the stairs and stage and spilling out into the hallway. Harvard moved the class to Sanders Theater, the biggest venue on campus.
Last night, in Time Capsule #88, I noted the deafening silence of Republican officialdom, after Hillary Clinton delivered her calmly devastating indictment of Donald Trump’s racist themes.
After this frontal attack on their own party’s chosen nominee, the rest of the GOP leadership said ... nothing. The cable-news Trump advocates were out in force, but senators? Governors? Previous candidates? Wise men and women of the party? Crickets.
A reader who is not a Trump supporter says there’s a logic to the plan:
I think you might be missing the GOP strategy here regarding Sec. Clinton’s bigotry speech, and the fact that no Republican came forward to defend Donald Trump. Republicans know that she spoke the truth—the indefensible truth about Donald Trump—and they want to squelch any discussion about it. That’s what they are doing.
Because they don’t want this speech on the airwaves, debated on panels, over several news cycles, with more and more of the dirty laundry getting aired in the mainstream news cycles, leading the Nightly News with dramatic music. Screaming headlines. And any—ANY—statement by a Republican will trigger that discussion that no GOPer wants.
The mainstream news guys are sitting there at their email boxes, waiting, waiting, for statements, so they can write a piece on it. Benjy Sarlin mentioned it on Twitter, which you probably saw. [JF: I have now] And a couple of other journos agreed.
But without some outraged statement from Ryan, Cruz, anybody, the mainstream journos have nothing to write about, there is no news cycle, no panels, no screaming headlines, no multi-news cycle. Just a “Wow! Clinton gave a rough speech!” End of story. And that’s the strategy. Bury this story. And it’s working.
That’s how the GOP handles this kind of story. And it works just fine, every time. The mainstream journos can't find a both-sides hook, and they are nervous about this alt-right stuff anyway, so the story dies. Journos fear the brutality of GOP pushback. So it goes. Every. Time.
Contrast that with the non-story about the Clinton Foundation. Every GOPer was sending out a truckload of statements to keep that story going. Chuck Todd has stated in the past that he—they—have no choice but to write about whatever the GOP is upset about because they all put their shoulder to the wheel. And the GOP always has something for journos to write about. Controversy! And no fear of brutality from the Democrats. That’s how that goes.