Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
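To make the mechanics concrete, here's a minimal sketch of the kind of tallying an analytics program does, written against a standard combined-format server log. The log filename and the regex are illustrative assumptions on my part, not The Atlantic's or Chartbeat's actual pipeline. The key detail: when the browser sends no Referer header, the log records a bare "-", and the visit lands in the bucket analytics tools call "direct."

```python
import re
from collections import Counter
from urllib.parse import urlparse

# Combined log format ends with: "request" status bytes "referrer" "user-agent"
LINE_RE = re.compile(r'"(?:GET|POST)[^"]*" \d+ \S+ "([^"]*)"')

def referrer_domain(log_line: str) -> str:
    """Pull the referrer domain out of a combined-format log line."""
    match = LINE_RE.search(log_line)
    if not match or match.group(1) in ("-", ""):
        # Email programs, IM, many apps, and https->http hops all look like this.
        return "(no referrer)"
    return urlparse(match.group(1)).netloc or "(no referrer)"

counts = Counter()
with open("access.log") as log:  # hypothetical log file
    for line in log:
        counts[referrer_domain(line)] += 1

for domain, n in counts.most_common(10):
    print(f"{domain:30} {n}")
```

Run something like this over a day of traffic and the "(no referrer)" line is exactly the mystery bucket the next paragraph is about.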
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon, and that most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
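The heuristic is simple enough to sketch in a few lines. This is my own illustration of the logic as it was described to us, with a made-up list of landing pages, not Chartbeat's actual rules:

```python
from urllib.parse import urlparse

# Illustrative set of homepage and section-front paths; the real list
# would be site-specific.
LANDING_PAGES = {"", "/politics", "/technology", "/business", "/entertainment"}

def classify(url: str, referrer: str) -> str:
    """Bucket a pageview the way the accounting change described above does."""
    if referrer:
        return "referred"  # normal case: the Referer header names the source
    path = urlparse(url).path.rstrip("/")
    if path in LANDING_PAGES:
        return "typed/bookmarked"  # homepage or section front: plausibly direct
    # No one hand-types a deep article URL, so treat it as a shared link.
    return "dark social"

print(classify("http://www.theatlantic.com/", ""))            # typed/bookmarked
print(classify("http://www.theatlantic.com/politics/", ""))   # typed/bookmarked
print(classify("http://www.theatlantic.com/technology/archive/2012/10/"
               "dark-social/263523/", ""))                    # dark social
```

The whole trick rests on one observation: a no-referrer visit to a deep article URL almost certainly arrived via a pasted or clicked link, not a keyboard.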
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories goes out on a very big email list. Through it all, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search, at 21.5 percent, drove more visitors to this basket of sites. If dark social is 69 percent of social referrals and 17.5 percent of all referrals, social as a whole works out to roughly a quarter of total traffic. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact, no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."