Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet newsgroups and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
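That "piece of metadata" is the HTTP Referer header, and the aggregation is a simple counting job. Here is a minimal sketch of the idea; the log entries and domain names are illustrative, not The Atlantic's actual pipeline:

```python
from collections import Counter
from urllib.parse import urlparse

# Each hit is (landing_url, referrer_or_None). The referrer is the HTTP
# "Referer" header the browser sends along -- when it sends one at all.
hits = [
    ("https://www.theatlantic.com/technology/some-story/", "https://www.facebook.com/"),
    ("https://www.theatlantic.com/politics/other-story/", "https://twitter.com/user/status/1"),
    ("https://www.theatlantic.com/technology/some-story/", None),  # email, IM, etc.
]

def referrer_domain(referrer):
    """Collapse a full referrer URL down to its domain, or 'direct' if absent."""
    if not referrer:
        return "direct"
    return urlparse(referrer).netloc

# "A million people came here from Facebook last month" is just this Counter
# run over a much bigger log.
counts = Counter(referrer_domain(ref) for _, ref in hits)
print(counts)  # e.g. Counter({'www.facebook.com': 1, 'twitter.com': 1, 'direct': 1})
```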
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least whether what I had experienced was a niche phenomenon, and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
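That heuristic is easy to state in code. A minimal sketch, assuming a hypothetical list of landing-page paths (Chartbeat's actual implementation isn't public):

```python
from urllib.parse import urlparse

# Hypothetical homepage and section landing pages; a real implementation
# would load these from the site's own section list.
LANDING_PATHS = {"/", "/politics", "/technology", "/business"}

def classify_no_referrer(url):
    """Split referrer-less visits the way described above: visits to a
    homepage or section landing page are true 'direct' traffic, while
    deep links to individual articles are presumed 'dark social', since
    nobody types a long article URL by hand."""
    path = urlparse(url).path.rstrip("/")
    # Normalize so '/politics/' and '/politics' both count as landing pages.
    landing = {p.rstrip("/") for p in LANDING_PATHS}
    return "direct" if path in landing else "dark social"

print(classify_no_referrer("https://www.theatlantic.com/"))          # direct
print(classify_no_referrer("https://www.theatlantic.com/politics/")) # direct
print(classify_no_referrer(
    "https://www.theatlantic.com/technology/archive/2012/10/some-story/263409/"
))  # dark social
```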
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories goes out on a very big email list. Dark social is nearly always our top referral source.
Perhaps, though, this was peculiar to The Atlantic. We do really well in the social world, so maybe we were outliers. So I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
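The two sets of percentages are consistent with each other, which is worth a quick back-of-envelope check. If dark social is 69 percent of social referrals and 17.5 percent of all referrals, the implied size of social traffic overall follows directly:

```python
# Figures from the aggregate Chartbeat data quoted above.
dark_share_of_social = 0.69   # dark social as a fraction of social referrals
dark_share_of_total = 0.175   # direct/dark social as a fraction of ALL referrals

# Implied share of total referrals that is social of any kind.
social_share_of_total = dark_share_of_total / dark_share_of_social
print(f"{social_share_of_total:.1%}")  # ~25.4% of all referrals are social

# Facebook's implied share of total referrals (20% of social traffic).
facebook_share_of_total = 0.20 * social_share_of_total
print(f"{facebook_share_of_total:.1%}")  # ~5.1%
```

So on these numbers, social traffic of all kinds is roughly a quarter of referrals, just behind search at 21.5 percent plus everything else.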
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact, no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People easily layered communication technologies and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."