Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
In brief, here's the argument I'll make below:

1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the '90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet groups and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace, when I knew that had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this kind of sharing was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, in which there is no referrer data. You show up at our doorstep, and we have no idea how you got here. The main culprits are email programs, instant messages, some mobile applications,* and moves from a secure site (https://mail.google.com/blahblahblah) to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But a lot of the time, that's not what's happening. Most of the time, someone Gchatted you a link, or it came in on a big email distribution list, or your dad sent it to you.
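To make that concrete, here is a minimal sketch of the kind of referrer bucketing an analytics tool does, and why all of those paths collapse into "direct." The function and category names are my own illustration, not The Atlantic's or any vendor's actual code.

```python
# A toy referrer classifier. A link clicked on the open web usually arrives
# with an HTTP Referer header; a link opened from email, IM, many mobile
# apps, or an https page pointing at an http page arrives with none.

def classify_referrer(headers: dict) -> str:
    referrer = headers.get("Referer", "")
    if not referrer:
        # Email, IM, app, and https->http visits all land here, lumped in
        # with genuinely typed URLs and bookmarks: dark social.
        return "direct"
    if "facebook.com" in referrer:
        return "facebook"
    if "twitter.com" in referrer or "t.co" in referrer:
        return "twitter"
    return "other"

print(classify_referrer({"Referer": "https://www.facebook.com/"}))  # facebook
print(classify_referrer({}))                                        # direct
```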
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong, or at least whether what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- link-sharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they showed us an accounting change they'd made. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, which is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
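In code, the rule they applied looks something like the sketch below. The landing-page list and the names are mine, for illustration; Chartbeat's actual implementation is surely more involved.

```python
# Splitting no-referrer visits as described above: nobody hand-types a long
# article URL, so a deep link arriving with no referrer was almost certainly
# shared somewhere, by some means we cannot see.

LANDING_PAGES = {"/", "/politics", "/technology", "/business"}  # illustrative

def bucket_no_referrer(path: str) -> str:
    normalized = path.rstrip("/") or "/"
    if normalized in LANDING_PAGES:
        return "true direct"  # plausibly typed or bookmarked
    return "direct social"    # a followed link: dark social, made countable

print(bucket_no_referrer("/politics/"))  # true direct
print(bucket_no_referrer("/technology/archive/2012/10/some-article/263409/"))  # direct social
```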
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
This continues to be true day after day, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories goes out on a very big email list. Through it all, though, dark social is nearly always our top referral source.
Perhaps, though, this was true only of The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a few really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and it adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) by the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing moved onto the web's technical infrastructure in the 2000s, the behaviors we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to account for technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."