Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the '90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
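The mechanics can be sketched in a few lines. Below is a hypothetical classifier showing how a referrer, when one arrives, identifies a source, and how email, IM, apps, and https-to-http visits arrive with nothing at all. The domain list and the crude domain-parsing shortcut are my illustrative assumptions, not how any real analytics package works.

```python
from urllib.parse import urlparse

# Illustrative list of referrers we can see and name (an assumption, not
# an exhaustive or official list).
SOCIAL_NETWORKS = {"facebook.com", "twitter.com", "reddit.com", "stumbleupon.com"}

def classify_referrer(referrer):
    """Return a coarse traffic source for one page view."""
    if not referrer:
        # Email clients, IM, many apps, and https->http hops send no
        # referrer header at all -- the server sees nothing.
        return "no-referrer"
    host = urlparse(referrer).netloc.lower()
    # Crude registrable-domain guess: keep the last two labels.
    domain = ".".join(host.split(".")[-2:])
    if domain in SOCIAL_NETWORKS:
        return "social-network"
    return "other"
```

In this sketch, every "no-referrer" visit is the invisible bucket the essay is about: analytics tools can count it but cannot say where it came from.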
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have the site bookmarked or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
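The accounting change described above amounts to a simple rule over destination URLs. Here's a minimal sketch of that logic, under my own assumption that landing pages can be enumerated explicitly; Chartbeat's actual implementation is surely more robust than this.

```python
from urllib.parse import urlparse

# Hypothetical set of pages people plausibly reach by typing or bookmarking:
# the homepage and a few section landing pages (an illustrative assumption).
LANDING_PATHS = {"", "/politics", "/technology", "/business"}

def classify_no_referrer_visit(url):
    """Split referrer-less traffic: true direct vs. dark ('direct social')."""
    path = urlparse(url).path.rstrip("/")
    if path in LANDING_PATHS:
        # Homepage or landing page: plausibly typed or bookmarked.
        return "direct"
    # A deep article URL nobody types by hand: someone followed a link
    # from email, IM, or some other dark channel.
    return "direct social"
```

The whole insight rides on that one distinction: a referrer-less visit to a long article URL is almost certainly a shared link, not a typed one.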
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
The most comprehensive review of evidence on health consequences of caffeine use has just been published.
A Los Angeles news anchor reacted on air earlier this month to the announcement that “the world’s strongest coffee” is now available in the United States. The product is called Black Insomnia, a playful nod to a potentially debilitating medical condition that can be caused by the product.
The anchor’s tone took a dramatic decrescendo as she read from the teleprompter: “The site Caffeine Informer says Black Insomnia is one of the ‘most dangerous caffeinated products.’” Her smile faded. “Oh. I’ll have to have this one sparingly.”
Black Insomnia is actually in competition for the title of “world’s strongest coffee.” Another, similar purveyor sells coffee grounds called Death Wish. They come in a black sack with a skull and crossbones. On its Amazon page, Death Wish claims to be “the world’s strongest coffee” and promises its “perfect dark roast will make you the hero of the house or office.”
Democracies across the West are vulnerable to foreign influence—and some are under attack.
Mike Conaway, the Republican who replaced Devin Nunes as head of the House Intelligence Committee’s investigation into Russian meddling in the U.S. election, has described his mission simply: “I just want to find out what happened,” he’s said. The more urgent question elsewhere in the world, however, isn’t confined to the past. It concerns what is happening—not just in the United States but in European democracies as well.
In the Netherlands, Dutch authorities counted paper ballots in a recent election by hand to prevent foreign governments—and Russia in particular—from manipulating the results through cyberattacks. In Denmark, the defense minister has accused the Russian government of carrying out a two-year campaign to infiltrate email accounts at his ministry. In the United Kingdom, a parliamentary committee reports that it cannot “rule out” the possibility that “foreign interference” caused a voter-registration site to crash ahead of Britain’s referendum on EU membership. And in France, a cybersecurity firm has just discovered that suspected Russian hackers are targeting the leading presidential candidate. “We are increasingly concerned about cyber-enabled interference in democratic political processes,” representatives from the Group of Seven—Canada, France, Germany, Italy, Japan, the U.K., and the U.S.—declared after meeting in Italy earlier this month. Russia, a member of the group until it was kicked out for annexing Crimea, wasn’t mentioned in the statement. It didn’t need to be. The subtext was clear.
The president has softened some of his tough talk toward China and Mexico, transferring it to Canada and disputes over softwood lumber and dairy products.
Donald Trump is not the first U.S. president to tangle with Canada over lumber. In fact, the first U.S. president to do so was the first U.S. president. George Washington’s administration saw a dispute over ownership of valuable forests on the border between New Brunswick and present-day Maine.
So despite Trump’s recent tough talk about the trade relationship with America’s neighbor to the north, his announcement Tuesday morning of new tariffs on Canadian lumber is actually consistent with what U.S. policy has been for decades. Where Trump differs from previous presidents, though, is in very publicly sounding off about a longstanding disagreement. In so doing he has also, apparently, found a new target for his trade-related ire, even as he softens his stances toward previous targets like China and Mexico.
It’s a shame that the standard way of learning how to cook is by following recipes. To be sure, they are a wonderfully effective way to approximate a dish as it appeared in a test kitchen, at a star chef’s restaurant, or on TV. And they can be an excellent inspiration for even the least ambitious home cooks to liven up a weeknight dinner. But recipes, for all their precision and completeness, are poor teachers. They tell you what to do, but they rarely tell you why to do it.
This means that for most novice cooks, kitchen wisdom—a unified understanding of how cooking works, as distinct from the notes grandma lovingly scrawled on index-card recipes passed down through the generations—comes piecemeal. Take, for instance, the basic skill of thickening a sauce. Maybe one recipe for marinara advises reserving some of the starchy pasta water, for adding later in case the sauce is looking a little thin. Another might recommend rescuing a too-watery sauce with some flour, and still another might suggest a handful of parmesan. Any one of these recipes offers a fix under specific conditions, but after cooking through enough of them, those isolated recommendations can congeal into a realization: There are many clever ways to thicken a sauce, and picking an appropriate one depends on whether there’s some leeway for the flavor to change and how much time there is until dinner needs to be on the table.
When astronomers talk about the search for life elsewhere in our solar system, they usually talk about microbes, simple and resilient forms of life known to exist in the most extreme temperatures and conditions. Space probes have mapped enough of the sun’s planets and moons to show there are no civilizations lurking in this star system, save for the one on Earth. But what if we’re not done looking yet? What if there are indeed signs of an ancient intelligent species right here, on the worlds in our own backyard, waiting to be found?
That’s the question posed by Jason Wright, an astronomer at Penn State University, in a new paper published Monday night. Wright posits the idea that an advanced civilization—an indigenous technological species, he calls it—could have arisen in the solar system before life as we know it did. (“Indigenous,” because it originates in the solar system, and not from extraterrestrial life that may exist elsewhere in the universe.) If it left behind traces of its technology—called technosignatures—some of those technosignatures may have survived, provided they were made of material not easily degraded by erosion or time. Perhaps, Wright writes, they remain hidden under the surface of Venus and Mars.
Will you pay more for those shoes before 7 p.m.? Would the price tag be different if you lived in the suburbs? Standard prices and simple discounts are giving way to far more exotic strategies, designed to extract every last dollar from the consumer.
As Christmas approached in 2015, the price of pumpkin-pie spice went wild. It didn’t soar, as an economics textbook might suggest. Nor did it crash. It just started vibrating between two quantum states. Amazon’s price for a one-ounce jar was either $4.49 or $8.99, depending on when you looked. Nearly a year later, as Thanksgiving 2016 approached, the price again began whipsawing between two different points, this time $3.36 and $4.69.
We live in the age of the variable airfare, the surge-priced ride, the pay-what-you-want Radiohead album, and other novel price developments. But what was this? Some weird computer glitch? More like a deliberate glitch, it seems. “It’s most likely a strategy to get more data and test the right price,” Guru Hariharan explained, after I had sketched the pattern on a whiteboard.
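The test-the-right-price strategy Hariharan describes can be sketched as a toy simulation: quote one of two price points at random, record purchases, and compare revenue per exposure. Everything below is an invented stand-in, including the demand probabilities; it illustrates the idea, not Amazon's actual system.

```python
import random

def run_price_test(prices, buy_prob, trials, rng):
    """Alternate between price points; return revenue per exposure for each.

    buy_prob maps each price to a made-up probability that a shopper buys
    at that price (an assumption for illustration only).
    """
    revenue = {p: 0.0 for p in prices}
    shown = {p: 0 for p in prices}
    for _ in range(trials):
        p = rng.choice(prices)          # quote one of the candidate prices
        shown[p] += 1
        if rng.random() < buy_prob[p]:  # did this shopper buy at price p?
            revenue[p] += p
    # Revenue per exposure reveals which price point earns more overall,
    # even if the higher price converts fewer shoppers.
    return {p: revenue[p] / shown[p] for p in prices}
```

Run long enough, a test like this tells the seller whether $4.49 with more buyers beats $8.99 with fewer, which is one plausible reading of the vibration the essay describes.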
“Somewhere at Google there is a database containing 25 million books and nobody is allowed to read them.”
You were going to get one-click access to the full text of nearly every book that’s ever been published. Books still in print you’d have to pay for, but everything else—a collection slated to grow larger than the holdings at the Library of Congress, Harvard, the University of Michigan, or any of the great national libraries of Europe—would have been available for free at terminals that were going to be placed in every local library that wanted one.
At the terminal you were going to be able to search tens of millions of books and read every page of any book you found. You’d be able to highlight passages and make annotations and share them; for the first time, you’d be able to pinpoint an idea somewhere inside the vastness of the printed record, and send somebody straight to it with a link. Books would become as instantly available, searchable, copy-pasteable—as alive in the digital world—as web pages.
Film, television, and literature all tell them better. So why are games still obsessed with narrative?
A longstanding dream: Video games will evolve into interactive stories, like the ones that play out fictionally on the Star Trek Holodeck. In this hypothetical future, players could interact with computerized characters as round as those in novels or films, making choices that would influence an ever-evolving plot. It would be like living in a novel, where the player’s actions would have as much of an influence on the story as they might in the real world.
It’s an almost impossible bar to reach, for cultural reasons as much as technical ones. One shortcut is an approach called environmental storytelling. Environmental stories invite players to discover and reconstruct a fixed story from the environment itself. Think of it as the novel wresting the real-time, first-person, 3-D graphics engine from the hands of the shooter game. In Disneyland’s Peter Pan’s Flight, for example, dioramas summarize the plot and setting of the film. In the 2007 game BioShock, recorded messages in an elaborate, Art Deco environment provide context for a story of a utopia’s fall. And in What Remains of Edith Finch, a new game about a girl piecing together a family curse, narration is accomplished through artifacts discovered in an old house.
The Dems are trying to take advantage of the president’s tendency to make maximalist claims and then retreat from them.
Mock Donald Trump’s legislative ignorance if you will, but for a brief, shining stretch during the past week, he managed to bring about a rare Washington phenomenon: House and Senate Democrats saying nice things about their GOP counterparts. Publicly. With straight faces. That the president accomplished this entirely by accident makes the feat no less remarkable.
It has been like a scene straight out of a No Labels kumbaya, centrist fantasy: As Congress hammers out a deal to fund the government for the rest of this fiscal year, Democrats have been lauding Republicans for handling negotiations in a thoughtful, productive, bipartisan manner.
“Appropriators are all about getting something done,” a senior Democratic House aide noted approvingly of the process. And with the April 28 deadline looming, he told me, members of both teams “had been chugging along, making progress, doing a really good job of getting past some riders.”
The cuts-only plan President Trump is expected to unveil Wednesday follows a pattern: The risk associated with higher deficits takes a back seat when it comes with political pain.
“I am the king of debt,” Donald Trump famously boasted during last year’s campaign. On Wednesday, the president is going to set about proving it—but perhaps not in the way he originally meant.
All indications are that the tax plan the White House is slated to unveil will include what Trump has described as a “massive” cut in the rate that corporations and many small businesses pay to the government. But it will omit the more politically painful choices that Republicans would need to make to offset the corresponding loss of revenue, such as House Speaker Paul Ryan’s proposed tax on imports or the elimination of popular deductions for charitable giving and homeowners. The result is a tax plan that, like the ones Trump offered as a candidate, could add trillions of dollars to the national debt. You can call them tax cuts, but they aren’t tax reform.