Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
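The mechanics of that aggregation can be sketched in a few lines. This is only an illustration of how referrer counting works in general; the helper names and sample values below are my own, not The Atlantic's actual analytics code.

```python
from collections import Counter
from urllib.parse import urlparse

def referrer_domain(referrer):
    """Extract the domain from a Referer header; None if the header is absent."""
    if not referrer or referrer == "-":   # "-" is how many log formats mark "no referrer"
        return None
    return urlparse(referrer).netloc or None

def aggregate_referrers(referrers):
    """Count visits per referring domain; the None bucket is visits with no referrer."""
    return Counter(referrer_domain(r) for r in referrers)

counts = aggregate_referrers([
    "https://www.facebook.com/",
    "https://t.co/abc123",
    "-",                                   # no referrer: email, IM, https-to-http, etc.
    "",
    "https://www.facebook.com/story.php?id=1",
])
# counts["www.facebook.com"] is the "a million people came from Facebook" number;
# counts[None] is everyone who showed up at the doorstep with no trail at all.
```

That `None` bucket is exactly the traffic the next paragraphs are about.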
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second were people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
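Chartbeat's split can be approximated with a simple rule: a visit with no referrer landing on a homepage or section page is plausibly typed or bookmarked; one landing deep on an article page is almost certainly a followed link. The function and the landing-page list below are illustrative assumptions of mine, not Chartbeat's actual code.

```python
from urllib.parse import urlparse

# Pages where no-referrer traffic plausibly was typed or bookmarked (illustrative list).
LANDING_PATHS = {"/", "/politics", "/technology", "/business"}

def classify_visit(url, referrer):
    """Rough reconstruction of the direct-social split described above."""
    if referrer:
        return "referred"          # normal, measurable referral
    path = urlparse(url).path.rstrip("/") or "/"
    if path in LANDING_PATHS:
        return "typed/bookmarked"  # plausibly navigated here directly
    return "direct social"         # deep link, no referrer: dark social

classify_visit("http://www.theatlantic.com/politics/", None)
# → "typed/bookmarked"
classify_visit("http://www.theatlantic.com/technology/archive/2012/10/"
               "atlast-the-gargantuan-telescope-designed-to-find-life-on-"
               "other-planets/263409/", None)
# → "direct social"
```

The heuristic isn't perfect (some people do bookmark individual stories), but at the scale of a whole site the error is small compared to the signal.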
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private settings and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one that we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
What would the American culture wars look like if they were less about “values” and more about Jesus?
Evangelical Christianity has long had a stranglehold on how Americans imagine public faith. Vague invocations of “religion”—whether it’s “religion vs. science” or “religious freedom”—usually really mean “conservative, Protestant, evangelical Christianity,” and this assumption inevitably frames debates about American belief. For the other three-quarters of the population—Catholics, Jews, other Protestants, Muslims, Hindus, secular Americans, Buddhists, Wiccans, etc.—this can be infuriating. For some evangelicals, it’s a sign of success, a linguistic triumph of the culture wars.
But not for Russell Moore. In 2013, the 43-year-old theologian became the head of the Ethics and Religious Liberty Commission, the political nerve center of the Southern Baptist Convention. His predecessor, Richard Land, prayed with George W. Bush, played hardball with Democrats, and helped make evangelicals a quintessentially Republican voting bloc.
Paul faced danger, Ani and Ray faced each other, and Frank faced some career decisions.
This is what happens when you devote two-thirds of a season to scene after scene after scene of Frank and Jordan’s Baby Problems, and Frank Shaking Guys Down, and Look How Fucked Up Ray and Ani Are, and Melancholy Singer in the Dive Bar Yet Again—and then you suddenly realize that with only a couple episodes left you haven’t offered even a rudimentary outline of the central plot.
Many psychiatrists believe that a new approach to diagnosing and treating depression—linking individual symptoms to their underlying mechanisms—is needed for research to move forward.
In his Aphorisms, Hippocrates defined melancholia, an early understanding of depression, as a state of “fears and despondencies, if they last a long time.” It was caused, he believed, by an excess of bile in the body (the word “melancholia” is ancient Greek for “black bile”).
Ever since then, doctors have struggled to create a more precise and accurate definition of the illness that still isn’t well understood. In the 1920s, the German psychiatrist Kurt Schneider argued that depression could be divided into two separate conditions, each requiring a different form of treatment: depression that resulted from changes in mood, which he called “endogenous depression,” and depression resulting from reactions to outside events, or “reactive depression.” His theory was challenged in 1926, when the British psychologist Edward Mapother argued in the British Medical Journal that there was no evidence for two distinct types of depression, and that the apparent differences between depression patients were just differences in the severity of the condition.
The winners of the 27th annual National Geographic Traveler Photo Contest have just been announced.
The winners of the 27th annual National Geographic Traveler Photo Contest have just been announced. Winning first prize, Anuar Patjane Floriuk of Tehuacán, Mexico, will receive an eight-day photo expedition for two to Costa Rica and the Panama Canal for a photograph of divers swimming near a humpback whale off the western coast of Mexico. Here, National Geographic has shared all of this year’s winners, gathered from four categories: Travel Portraits, Outdoor Scenes, Sense of Place, and Spontaneous Moments. Captions by the photographers.
What if Joe Biden is going to run for the Democratic nomination after all?
Most Democrats seem ready for Hillary Clinton—or at least appear content with her candidacy. But what about the ones who were bidin’ for Biden? There are new signs the vice president might consider running for president after all.
Biden has given little indication he was exploring a run: There’s no super PAC, no cultivation of a network of fundraisers or grassroots organizers, few visits to early-primary states. While his boss hasn’t endorsed Clinton—and says he won’t endorse in the primary—many members of the Obama administration have gone to work for Clinton, including some close to Biden.
But Biden also hasn’t given any clear indication that he isn’t running, and a column by Maureen Dowd in Saturday’s New York Times has set off new speculation. One reason Biden didn’t get into the race was that his son Beau was dying of cancer, and the vice president was focused on being with his son. But before he died in May, Dowd reported, Beau Biden tried to get his father to promise to run. Now Joe Biden is considering the idea.
The jobs that are least vulnerable to automation tend to be held by women.
Many economists and technologists believe the world is on the brink of a new industrial revolution, in which advances in the field of artificial intelligence will render human labor obsolete at an unforgiving pace. Two Oxford researchers recently analyzed the skills required for more than 700 different occupations to determine how many of them would be susceptible to automation in the near future, and the news was not good: They concluded that machines are likely to take over 47 percent of today’s jobs within a few decades.
This is a dire prediction, but one whose consequences will not fall upon society evenly. A close look at the data reveals a surprising pattern: The jobs performed primarily by women are relatively safe, while those typically performed by men are at risk.
In the footage, secretly recorded by an anti-abortion-rights group, an official from the organization discusses the procurement and cost of intact fetuses.
Updated on August 4, 2015, at 5:54 p.m. ET
Planned Parenthood’s handling of fetal tissue for research is the subject of a fresh video released Tuesday by an anti-abortion group.
In the latest video, the fifth released by the Irvine, California-based Center for Medical Progress, an official from Planned Parenthood discusses the procurement and cost of intact fetuses. The video, we should warn you, is graphic.
Planned Parenthood calls the videos a “smear campaign.” It says the footage is highly edited, misleading, and takes discussions out of context.
The Center for Medical Progress has faced two court orders that block the release of future videos, but those orders are limited to footage recorded at meetings of the National Abortion Federation and those dealing with a tissue procurement company. Fox News adds: “Tuesday’s release, purely reliant on video taken inside a Planned Parenthood clinic, would not seem to violate either order.”
An activist group is trying to discredit Planned Parenthood with covertly recorded videos even as contraception advocates are touting a method that sharply reduces unwanted pregnancies.
Abortion is back at the fore of U.S. politics due to an activist group’s attempt to discredit Planned Parenthood, one of the most polarizing organizations in the country. Supporters laud its substantial efforts to provide healthcare for women and children. For critics, nothing that the organization does excuses its role in performing millions of abortions––a procedure that they regard as literal murder––and its monstrous character is only confirmed, in their view, by covertly recorded video footage of staffers cavalierly discussing what to do with fetal body parts.
If nothing else, that recently released footage has galvanized Americans who oppose abortion, media outlets that share their views, and politicians who seek their votes. “Defunding Planned Parenthood is now a centerpiece of the Republican agenda going into the summer congressional recess,” The Washington Post reports, “and some hard-liners have said they are willing to force a government shutdown in October if federal support to the group is not curtailed.”
Exceptional nonfiction stories from 2014 that are still worth encountering today
Each year, I keep a running list of exceptional nonfiction that I encounter as I publish The Best of Journalism, an email newsletter that I send out once or twice a week. This is my annual attempt to bring some of those stories to a wider audience. I could not read or note every worthy article that was published last calendar year and I haven't included any paywalled articles or anything published at The Atlantic. But everything that follows is worthy of wider attention and engagement.
It’s impossible to “solve” the Iranian nuclear threat. This agreement is the next best thing.
Having carefully reviewed the lengthy and complex agreement negotiated by the United States and its international partners with Iran, I have reached the following conclusion: If I were a member of Congress, I would vote yes on the deal. Here are nine reasons why.
1. No one has identified a better feasible alternative. Before negotiations halted its nuclear advance, Iran had marched relentlessly down the field from 10 years away from a bomb to two months from that goal line. In response, the United States and its partners imposed a series of sanctions that have had a significant impact on Iran’s economy, driving it to negotiate. That strategy worked, and resulted in a deal. In the absence of this agreement, the most likely outcome would be that the parties resume doing what they were doing before the freeze began: Iran installing more centrifuges, accumulating a larger stockpile of bomb-usable material, shrinking the time required to build a bomb; the U.S. resuming an effort to impose more severe sanctions on Iran. Alternatively, Israel or the United States could conduct military strikes on Iran’s nuclear facilities, setting back the Iranian program by two years, or perhaps even three. But that option risks wider war in the Middle East, an Iran even more determined to acquire a bomb, and the collapse of consensus among American allies.