1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69 percent of social referrals came from dark social. 20 percent came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.

Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site (https://mail.google.com/blahblahblah) to a non-secure site (http://www.theatlantic.com).
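To make the mechanics concrete, here's a minimal sketch of both halves of that story: a server reading the referrer when it's present, and getting nothing when it isn't. This is illustrative Python, not The Atlantic's or Chartbeat's actual stack; the WSGI handler and the counter are my own scaffolding, but the HTTP Referer header is the real mechanism.

```python
# A toy analytics endpoint: count where visitors come from, using the
# same Referer header that real analytics tools rely on.
from collections import Counter
from urllib.parse import urlparse

referrer_counts = Counter()

def application(environ, start_response):
    # Browsers send the previous page's URL in the Referer header, which
    # WSGI exposes as HTTP_REFERER. The header is simply absent when the
    # click came from an email client, an IM window, many native apps,
    # or an https page linking out to a plain http one.
    referrer = environ.get("HTTP_REFERER")
    source = urlparse(referrer).netloc if referrer else "direct (no referrer)"
    referrer_counts[source] += 1

    body = f"sources so far: {dict(referrer_counts)}\n".encode()
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [body]

if __name__ == "__main__":
    from wsgiref.simple_server import make_server
    make_server("", 8000, application).serve_forever()
```

A visit from Facebook lands in a facebook.com bucket; a visit from Gmail or a chat window lands in the no-referrer bucket, indistinguishable from someone typing the URL by hand.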
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong, or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of, at the very least, linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
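In code, that accounting rule is simple. What follows is my reconstruction of the heuristic as Chartbeat described it, not their implementation; the list of landing pages is invented for illustration.

```python
from typing import Optional
from urllib.parse import urlparse

# Invented for illustration: a real site would enumerate its own
# homepage and section landing pages.
LANDING_PATHS = {"/", "/politics", "/technology", "/business"}

def classify_visit(url: str, referrer: Optional[str]) -> str:
    if referrer:
        return "referred"      # normal, measurable traffic (search, Facebook, etc.)
    path = urlparse(url).path.rstrip("/") or "/"
    if path in LANDING_PATHS:
        return "direct"        # plausibly typed or bookmarked
    return "dark social"       # nobody types a long article URL by hand

print(classify_visit("http://www.theatlantic.com/", None))
# -> direct
print(classify_visit("http://www.theatlantic.com/technology/archive/2012/10/263409/", None))
# -> dark social
```

The design bet is behavioral: people really do type or bookmark homepages, but a hundred-character article URL arriving with no referrer almost certainly rode in on an email, a chat message, or a text.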
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true. The individual numbers vary a lot (say, during a Reddit spike, or when one of our stories gets sent out on a very big email list), but dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
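Those two figures imply a rough size for the whole social pie. Here's a back-of-the-envelope calculation, assuming the 17.5 percent figure counts dark social alone and the 69 percent figure is dark social's share of all social referrals:

```python
dark_share_of_total  = 0.175  # dark social as a fraction of total referrals
dark_share_of_social = 0.69   # dark social as a fraction of social referrals

# If 17.5% of everything is 69% of social, then social overall is:
social_share_of_total = dark_share_of_total / dark_share_of_social
print(f"{social_share_of_total:.1%}")
# -> 25.4%; on this reading, social as a whole would edge out search's 21.5%
```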
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and it adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) by the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing tools came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people (a larger set than exists on any social network) already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."