Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in the Wikipedia entry on the "Social Web."
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the '90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site ("http://www.theatlantic.com").
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
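To make the mechanics concrete, here is a minimal sketch of the coarse bucketing a typical analytics tool performs on each visit. The function name and the handful of matched domains are illustrative assumptions, not any vendor's actual code; the key point is the first branch, where email, IM, and https-to-http visits all arrive looking identical.

```python
def referrer_category(referer):
    """Coarse referral bucketing, as a generic analytics tool might do it.

    A click on a web link normally carries a Referer header, e.g.
    "https://www.facebook.com/...". Email programs, instant messengers,
    and https -> http transitions send no Referer at all, so those visits
    land in the same bucket as typed-in or bookmarked traffic.
    """
    if not referer:
        return "direct"  # typed, bookmarked... or dark social
    if "facebook.com" in referer:
        return "facebook"
    if "twitter.com" in referer or "t.co" in referer:
        return "twitter"
    return "other"

print(referrer_category(None))                         # "direct"
print(referrer_category("https://www.facebook.com/"))  # "facebook"
```

The whole dark social problem lives in that first branch: from the server's side, a link pasted into Gchat and a hand-typed URL are indistinguishable.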
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second were people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
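Chartbeat's accounting change amounts to a simple rule: a visit with no referrer data and a deep article URL is almost certainly a followed link, not a typed address. A minimal sketch of that split follows; the list of landing paths is an illustrative assumption, not Chartbeat's actual implementation.

```python
from urllib.parse import urlparse

# Pages a reader might plausibly type by hand (illustrative list)
LANDING_PATHS = {"/", "/politics", "/technology", "/business"}

def no_referrer_bucket(url):
    """Split referrer-less visits along Chartbeat's 'direct social' lines:
    homepages and section fronts count as true direct traffic, while long
    article URLs that nobody types by hand are counted as shared links."""
    path = urlparse(url).path.rstrip("/") or "/"
    if path in LANDING_PATHS:
        return "direct"
    return "dark social"

print(no_referrer_bucket("http://www.theatlantic.com/politics"))  # "direct"
```

A visit to a URL like the telescope article above, arriving with no referrer, falls through to "dark social" under this rule.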
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories gets sent out on a very big email list. Nearly every day, though, dark social is our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and attaches a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before they existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."