Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing happens over dark social channels, like email and IM, that are difficult to measure.
3. According to new data from across many media sites, 69% of social referrals came from dark social; 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the '90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet newsgroups and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site ("http://www.theatlantic.com").
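To make the mechanics concrete, here's a minimal sketch in Python of what a server sees. The Referer header is standard HTTP (historic misspelling and all); the function name and example values are mine, for illustration, not anything The Atlantic or Chartbeat actually runs.

```python
# Minimal sketch: how a server-side analytics hook sees (or doesn't see)
# a referrer. Illustrative only; the header itself is standard HTTP.

from urllib.parse import urlparse

def referral_source(headers: dict) -> str:
    """Return where a hit came from, per its Referer header."""
    referrer = headers.get("Referer")  # HTTP's historic misspelling
    if not referrer:
        # Email clients, IM apps, some mobile apps, and HTTPS-to-HTTP
        # navigation all arrive like this: no referrer at all.
        return "no referrer"
    return urlparse(referrer).netloc  # e.g., "www.facebook.com"

print(referral_source({"Referer": "https://www.facebook.com/"}))  # www.facebook.com
print(referral_source({}))                                        # no referrer
```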
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong -- or at least whether what I had experienced was a niche phenomenon, and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
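The heuristic is simple enough to express in a few lines. Here's a sketch in Python, with an invented set of landing-page paths standing in for however Chartbeat actually identifies homepages and section fronts:

```python
# Sketch of the "direct social" split described above. The set of landing
# paths is hypothetical; Chartbeat's real classifier is its own.

LANDING_PATHS = {"", "politics", "technology", "business"}  # "" is the homepage

def classify_no_referrer_hit(path: str) -> str:
    """Classify a visit that arrived with no referrer data at all."""
    if path.strip("/") in LANDING_PATHS:
        return "direct"        # plausibly typed or bookmarked
    return "direct social"     # a deep article URL nobody types by hand

print(classify_no_referrer_hit("/"))          # direct
print(classify_no_referrer_hit("/politics"))  # direct
print(classify_no_referrer_hit(
    "/technology/archive/2012/10/atlast-the-gargantuan-telescope"
    "-designed-to-find-life-on-other-planets/263409/"
))                                            # direct social
```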
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories gets sent out on a very big email list. Nearly every day, though, dark social is our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
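Those figures also compose in a telling way. A quick back-of-the-envelope calculation, assuming the two percentages describe the same traffic sample: if dark social is 69 percent of social referrals and also 17.5 percent of total referrals, then social as a whole works out to roughly a quarter of all referrals, edging out search even in the aggregate data.

```python
# Back-of-the-envelope from the aggregate numbers above, assuming the
# percentages describe the same traffic sample.

dark_share_of_social = 0.69    # dark social / all social referrals
dark_share_of_total = 0.175    # dark social / all referrals
search_share_of_total = 0.215  # search / all referrals

# (dark / total) divided by (dark / social) gives (social / total):
social_share_of_total = dark_share_of_total / dark_share_of_social
print(f"implied social share of all referrals: {social_share_of_total:.1%}")  # ~25.4%
print(social_share_of_total > search_share_of_total)  # True
```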
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact, no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."