Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
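The mechanics are easy to sketch in code. Here is a minimal, hypothetical referrer classifier — the function name and domain lists are mine for illustration, not any real analytics product's rules:

```python
from urllib.parse import urlparse

# Hypothetical sketch of how an analytics tool might bucket pageviews by
# the HTTP Referer header. The domain lists are illustrative only.
SOCIAL_DOMAINS = {"facebook.com", "twitter.com", "reddit.com", "stumbleupon.com"}
SEARCH_DOMAINS = {"google.com", "bing.com"}

def classify_referrer(referrer):
    """Return a coarse traffic bucket for one pageview."""
    if not referrer:
        # Email clients, IM, many mobile apps, and https->http hops send
        # no referrer at all -- this is where dark social hides.
        return "no-referrer"
    domain = urlparse(referrer).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    if domain in SOCIAL_DOMAINS:
        return "social"
    if domain in SEARCH_DOMAINS:
        return "search"
    return "other"
```

A visit arriving from "https://www.facebook.com/feed" lands in the "social" bucket; a visit with an empty Referer header lands in the unhelpful "no-referrer" bucket, and the analytics program can say nothing more about it.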
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
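That heuristic is simple enough to express directly. Here is a rough sketch of the rule as described above — the function name and the landing-page list are mine, not Chartbeat's:

```python
from urllib.parse import urlparse

# Sketch of the "direct social" heuristic: a visit with no referrer that
# lands on a deep article URL was almost certainly a followed link,
# because nobody types a long article path by hand. The landing-page
# list here is illustrative, not Chartbeat's actual configuration.
LANDING_PAGES = {"", "/politics", "/technology", "/business"}

def is_dark_social(referrer, url):
    if referrer:
        return False  # we know where this visitor came from
    path = urlparse(url).path.rstrip("/")
    # Homepages and section fronts are plausibly typed or bookmarked;
    # everything deeper counts as dark social.
    return path not in LANDING_PAGES
```

Under this rule, a referrer-less visit to the homepage stays "direct," while a referrer-less visit to a long article URL gets counted as dark social.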
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
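Those percentages compose in a useful way. Using only the figures quoted above, you can back out how big the whole social pie is:

```python
# Arithmetic on the figures quoted above: dark social is 17.5 percent of
# total referrals, and 69 percent of social referrals are dark. From
# those two numbers we can derive the rest of the social pie.
dark_share_of_total = 0.175
dark_share_of_social = 0.69
facebook_share_of_social = 0.20

# If dark social is 69% of all social traffic, all social traffic is:
social_share_of_total = dark_share_of_total / dark_share_of_social
# And Facebook's 20% slice of social, as a share of total traffic:
facebook_share_of_total = social_share_of_total * facebook_share_of_social

print(round(social_share_of_total * 100, 1))    # about 25.4 percent of total
print(round(facebook_share_of_total * 100, 1))  # about 5.1 percent of total
```

By this arithmetic, social traffic in aggregate (roughly 25 percent of total referrals) actually edges past search's 21.5 percent across this basket of sites, even though dark social alone does not.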
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact, no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before they existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're exchanging our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
Is a lack of meaning really worse than a lack of freedom?
A man named François is a professor in Paris. He is a scholar of Joris-Karl Huysmans, an obscure 19th-century author who, in his later years, converted to Catholicism in an epiphany. François is the hero, or rather anti-hero, of French novelist Michel Houellebecq’s Submission. François is listless—even his attitude toward sex is uninspired, as if it’s an activity like any other, perhaps like playing tennis on a Sunday, but probably with less excitement. There is too much freedom and too many choices, and sometimes he’d rather just die.
The world around him, though, is changing. It is 2022. After a charismatic Islamist wins the second round of the French presidential elections against the right-wing Marine Le Pen (after gaining the support of the Socialists), a Muslim professor, himself a convert, attempts to persuade François to make the declaration of faith. “It’s submission,” the professor tells him. “The shocking and simple idea, which had never been so forcefully expressed, that the summit of human happiness resides in the most absolute submission.”
On Tuesday, Alex Van Der Zwaan, a lawyer who helped produce a report at Manafort’s behest, pleaded guilty to lying to the FBI.
Alex Van Der Zwaan, a former attorney at an international law firm, pleaded guilty to lying to federal agents about the last time he communicated with Paul Manafort’s longtime business partner, Rick Gates. Van Der Zwaan is the latest figure swept up in Robert Mueller’s expansive probe of Russian interference in the 2016 presidential election to admit to the charges against him.
Mueller’s interest in Van Der Zwaan, who helped produce a report about a contentious trial in Ukraine at Manafort’s behest, may be a signal that the special counsel is ramping up pressure on Manafort—whose connections to Russia and high-level role on the Trump campaign could prove invaluable to Mueller’s probe.
Gates is reportedly nearing his own plea deal with Mueller, according to the Los Angeles Times, but Manafort has continued to fight the charges he faces. His uphill battle to prove his innocence, however, will get steeper with Van Der Zwaan’s guilty plea.
A scathing obituary of Richard Nixon, originally published in Rolling Stone on June 16, 1994
MEMO FROM THE NATIONAL AFFAIRS DESK
DATE: MAY 1, 1994
FROM: DR. HUNTER S. THOMPSON
SUBJECT: THE DEATH OF RICHARD NIXON: NOTES ON THE PASSING OF AN AMERICAN MONSTER.... HE WAS A LIAR AND A QUITTER, AND HE SHOULD HAVE BEEN BURIED AT SEA.... BUT HE WAS, AFTER ALL, THE PRESIDENT.
"And he cried mightily with a strong voice, saying, Babylon the great is fallen, is fallen, and is become the habitation of devils, and the hold of every foul spirit and a cage of every unclean and hateful bird."
Richard Nixon is gone now, and I am poorer for it. He was the real thing -- a political monster straight out of Grendel and a very dangerous enemy. He could shake your hand and stab you in the back at the same time. He lied to his friends and betrayed the trust of his family. Not even Gerald Ford, the unhappy ex-president who pardoned Nixon and kept him out of prison, was immune to the evil fallout. Ford, who believes strongly in Heaven and Hell, has told more than one of his celebrity golf partners that "I know I will go to hell, because I pardoned Richard Nixon."
New evidence challenges one of the most celebrated ideas in network science.
A paper posted online last month has reignited a debate about one of the oldest, most startling claims in the modern era of network science: the proposition that most complex networks in the real world—from the World Wide Web to interacting proteins in a cell—are “scale-free.” Roughly speaking, that means that a few of their nodes should have many more connections than others, following a mathematical formula called a power law, so that there’s no one scale that characterizes the network.
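To make "scale-free" concrete: a power-law degree distribution has the form P(k) ∝ k^(−γ), typically with γ between 2 and 3. A small illustrative computation — the exponent and cutoff here are arbitrary choices of mine, not values from the paper:

```python
# Illustrative numbers for a discrete power law P(k) ~ k**(-gamma) with
# gamma = 2.5, truncated at degree 10,000. The point: most nodes have
# very few links, yet the tail of enormous hubs never quite vanishes.
GAMMA = 2.5
MAX_K = 10_000

weights = {k: k ** -GAMMA for k in range(1, MAX_K + 1)}
Z = sum(weights.values())  # normalization constant

p_degree_1 = weights[1] / Z                                 # typical node
p_hub = sum(w for k, w in weights.items() if k >= 100) / Z  # degree >= 100

print(round(p_degree_1, 2))  # roughly three-quarters of nodes have degree 1
print(p_hub)                 # a small but persistent fraction are big hubs
```

That heavy tail is what "no one scale characterizes the network" means: unlike a bell curve, there is no typical degree around which most nodes cluster.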
Purely random networks do not obey power laws, so when the early proponents of the scale-free paradigm started seeing power laws in real-world networks in the late 1990s, they viewed them as evidence of a universal organizing principle underlying the formation of these diverse networks. The architecture of scale-freeness, researchers argued, could provide insight into fundamental questions such as how likely a virus is to cause an epidemic, or how easily hackers can disable a network.
Whatever their reasons, both Obama and Trump have argued against overemphasizing the effects of election interference—and they might both have a point.
At the start of the weekend, President Trump was buoyant, exulting that Robert Mueller’s latest round of indictments had not shown any evidence that the Trump campaign colluded with Russia. (Never mind that the troll-farm attacks are just one of several spheres Mueller is investigating, and that far more evidence to suggest collusion has turned up in others.)
But by mid-weekend, the president’s mood had soured, as it became clear to him that the prevailing narrative from the indictment was the “incontrovertible” proof—to use National Security Adviser H.R. McMaster’s word—of Russian interference in the 2016 election. Nothing sets Trump off quite as consistently as any suggestion of anything that might undermine the legitimacy of his victory.
Marvel has hit big with a movie that leads with the artistry of its storytelling and the diversity of its characterization. Hollywood would do well to take notice.
Over the last few years, a lot of pernicious Hollywood myths about what movies are “marketable” have been shattered. Old excuses about how blockbusters featuring actors of color don’t appeal to worldwide audiences have been swept away by the success of franchises like the Fast & Furious series and the Star Wars sequels. Time and again, American audiences have responded to films with black leads like Hidden Figures, Get Out, and Girls Trip, all of them turning huge profits on smaller budgets. Even within this context, though, the box-office success of Black Panther this past weekend was basically unprecedented, and it’s one that could dictate where studios direct their energies in the future.
Trump’s gravest responsibility is to defend the United States from foreign attack—and he’s done nothing to fulfill it.
As the rest of America mourns the victims of the Parkland, Florida, massacre, President Trump took to Twitter.
Not for him the rituals of grief. He is too consumed by rage and resentment. He interrupted his holidaying schedule at Mar-a-Lago only briefly, for a visit to a hospital where some of the shooting victims were treated. He posed afterward for a grinning thumbs-up photo op. Pain for another’s heartbreak—that emotion is for losers, apparently.
Having failed at one presidential duty, to speak for the nation at times of national tragedy, Trump resumed shirking an even more supreme task: defending the nation against foreign attack.
Last week, Special Counsel Robert Mueller indicted 13 Russian persons and three entities that conspired to violate federal election law, to the benefit of Trump and Republican congressional candidates. This is not the whole of the story by any means. This Mueller indictment references only Russian operations on Facebook. It does not deal with the weaponization of hacked information via WikiLeaks. Or the reports that the Russians funneled millions of dollars of election spending through the NRA’s political action committees. But this indictment does show enough to answer some questions about the scale and methods of the Russian intervention—and pose a new question, the most important of them all.
Tech analysts are prone to predicting utopia or dystopia. They’re worse at imagining the side effects of a firm's success.
The U.S. economy is in the midst of a wrenching technological transformation that is fundamentally changing the way people sleep, work, eat, shop, love, read, and interact.
At least, that’s one interpretation.
A second story of this age of technological transformation says that it’s mostly a facade—that the last 30 years have been a productivity bust and little has changed in everyday life, aside from the way everyone reads and watches videos. People wanted flying cars and got Netflix binges instead.
Let’s call these the Disrupt Story and the Dud Story of technology. When a new company, app, or platform emerges, it’s common for analysts to divide into camps—Disrupt vs. Dud—with some yelping that the new thing will change everything and others yawning with the expectation that traditionalism will win out.
Another sequel so awful that it needs to be described in detail to be believed
For reasons that are now obscure to me—and were by definition ill-conceived—I read Fifty Shades of Grey at that terrible moment in American history when it seemed that everyone else was reading it too. I don’t believe that I read either of the book’s sequels, though I can’t attest to that with much confidence. Suffice to say that I made either the wise decision to skip them or the only marginally less-wise decision to repress all memory of them.
But writing about movies is something I’m paid to do, and occasionally that entails a degree of professional self-sacrifice. This week, the name of that sacrifice is Fifty Shades Freed.
The third and final—let’s pause and savor that word for a moment—adaptation of the “erotic romance” novel series by Erika Mitchell (pen name: E.L. James), Fifty Shades Freed is precisely as atrocious as one might imagine. Which is to say, it is far worse than the first movie—which, though awful, in hindsight looks like Citizen Kane, only with more discussion of dildos. I’d place the new film more or less on a par with the second one, Fifty Shades Darker, which makes sense given that both were filmed concurrently, were directed by James Foley (whose principal recommendation is that he directed Glengarry Glen Ross many, many years ago), and were adapted by Niall Leonard (whose principal recommendation is that he is married to Erika Mitchell).