Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that visitors actually have a bookmark or typed www.theatlantic.com into their browser. But that's not what's actually happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
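To make the mechanics concrete, here is a minimal sketch of how an analytics backend might bucket visits by the HTTP Referer header. The category names and domain lists are illustrative assumptions of mine, not the logic of any real analytics product; the point is simply that every visit arriving without the header, whatever its true origin, falls into one undifferentiated bucket.

```python
# Minimal sketch: bucketing visits by HTTP Referer header.
# Domain lists and category names are illustrative, not any real product's rules.
from urllib.parse import urlparse

SOCIAL_NETWORKS = {"facebook.com", "twitter.com", "reddit.com", "stumbleupon.com"}

def classify_referrer(referrer):
    """Bucket a visit by its Referer header (None when the header is absent)."""
    if not referrer:
        # Email clients, IM links, some mobile apps, and HTTPS -> HTTP
        # navigation all arrive with no referrer: "dark social" territory.
        return "no-referrer"
    host = urlparse(referrer).netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    if host in SOCIAL_NETWORKS:
        return "social-network"
    if host.endswith("google.com"):
        return "search"
    return "other-site"

print(classify_referrer("https://www.facebook.com/some/post"))  # prints "social-network"
print(classify_referrer(None))                                  # prints "no-referrer"
```

Note that the `no-referrer` bucket collapses wildly different behaviors, which is exactly why "direct" traffic numbers are so misleading.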
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
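The accounting change described above can be sketched in a few lines: no-referrer visits are split by landing page, with homepages and section fronts counted as true direct traffic and deep article URLs treated as shared links. The landing-page list and path heuristic below are my own guesses for illustration, not Chartbeat's actual rule.

```python
# Rough sketch of splitting no-referrer traffic by landing page.
# LANDING_PAGES is illustrative; a real site would maintain its own list.
from urllib.parse import urlparse

LANDING_PAGES = {"", "/politics", "/technology", "/business"}

def classify_no_referrer_visit(url):
    """Split a visit that arrived with no referrer: true direct vs. shared link."""
    path = urlparse(url).path.rstrip("/")
    if path in LANDING_PAGES:
        return "direct"       # plausibly typed or bookmarked
    # Nobody hand-types a long article URL, so a deep link with no
    # referrer was almost certainly shared via email, IM, etc.
    return "dark-social"      # what Chartbeat calls "direct social"

print(classify_no_referrer_visit("http://www.theatlantic.com/"))  # prints "direct"
print(classify_no_referrer_visit(
    "http://www.theatlantic.com/technology/archive/2012/10/some-article/263409/"
))  # prints "dark-social"
```

The heuristic is crude, but it only needs to separate "could have been typed" from "must have been a link," which is all the measurement requires.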
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say during a Reddit spike or when one of our stories goes out on a very big email list. Even so, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before they existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People easily layered communication technologies and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one that we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
“Here is what I would like for you to know: In America, it is traditional to destroy the black body—it is heritage.”
Last Sunday the host of a popular news show asked me what it meant to lose my body. The host was broadcasting from Washington, D.C., and I was seated in a remote studio on the Far West Side of Manhattan. A satellite closed the miles between us, but no machinery could close the gap between her world and the world for which I had been summoned to speak. When the host asked me about my body, her face faded from the screen, and was replaced by a scroll of words, written by me earlier that week.
The host read these words for the audience, and when she finished she turned to the subject of my body, although she did not mention it specifically. But by now I am accustomed to intelligent people asking about the condition of my body without realizing the nature of their request. Specifically, the host wished to know why I felt that white America’s progress, or rather the progress of those Americans who believe that they are white, was built on looting and violence. Hearing this, I felt an old and indistinct sadness well up in me. The answer to this question is the record of the believers themselves. The answer is American history.
New data shows that students whose parents make less money pursue more “useful” subjects, such as math or physics.
In 1780, John Adams wrote a letter to his wife, Abigail, in which he laid out his plans for what his children and grandchildren would devote their lives to. Having himself taken the time to master “Politicks and War,” two revolutionary necessities, Adams hoped his children would go into disciplines that promoted nation-building, such as “mathematicks,” “navigation,” and “commerce.” His plan was that in turn, those practical subjects would give his children’s children room “to study painting, poetry, musick, architecture, statuary, tapestry, and porcelaine.”
Two hundred and thirty-five years later, this progression—“from warriors to dilettantes,” in the words of the literary scholar Geoffrey Galt Harpham—plays out much as Adams hoped it would: Once financial concerns have been covered by their parents, children have more latitude to study less pragmatic things in school. Kim Weeden, a sociologist at Cornell, looked at National Center for Education Statistics data for me after I asked her about this phenomenon, and her analysis revealed that, yes, the amount of money a college student’s parents make does correlate with what that person studies. Kids from lower-income families tend toward “useful” majors, such as computer science, math, and physics. Those whose parents make more money flock to history, English, and performing arts.
Most adults can’t remember much of what happened to them before age 3 or so. What happens to the memories formed in those earliest years?
My first memory is of the day my brother was born: November 14, 1991. I can remember my father driving my grandparents and me over to the hospital in Highland Park, Illinois, that night to see my newborn brother. I can remember being taken to my mother’s hospital room, and going to gaze upon my only sibling in his bedside cot. But mostly, I remember what was on the television. It was the final two minutes of a Thomas the Tank Engine episode. I can even remember the precise story: “Percy Takes the Plunge,” which feels appropriate, given that I too was about to recklessly throw myself into the adventure of being a big brother.
In sentimental moments, I’m tempted to say my brother’s birth is my first memory because it was the first thing in my life worth remembering. There could be a sliver of truth to that: Research into the formation and retention of our earliest memories suggests that people’s memories often begin with significant personal events, and the birth of a sibling is a textbook example. But it was also good timing. Most people’s first memories date to when they were about 3.5 years old, and that was my age, almost to the day, when my brother was born.
In 1992, the neuroscientist Richard Davidson got a challenge from the Dalai Lama. By that point, he’d spent his career asking why people respond to, in his words, “life’s slings and arrows” in different ways. Why are some people more resilient than others in the face of tragedy? And is resilience something you can gain through practice?
The Dalai Lama had a different question for Davidson when he visited the Tibetan Buddhist spiritual leader at his residence in Dharamsala, India. “He said: ‘You’ve been using the tools of modern neuroscience to study depression, and anxiety, and fear. Why can’t you use those same tools to study kindness and compassion?’ … I did not have a very good answer. I said it was hard.”
Defining common cultural literacy for an increasingly diverse nation
Is the culture war over?
That seems an absurd question. This is an age when Confederate monuments still stand; when white-privilege denialism is surging on social media; when legislators and educators in Arizona and Texas propose banning ethnic studies in public schools and assign textbooks euphemizing the slave trade; when fear of Hispanic and Asian immigrants remains strong enough to prevent immigration reform in Congress; when the simple assertion that #BlackLivesMatter cannot be accepted by all but is instead contested petulantly by many non-blacks as divisive, even discriminatory.
And that’s looking only at race. Add gender, guns, gays, and God to the mix and the culture war seems to be raging along quite nicely.
The unwillingness of the former secretary of state to take questions from the press contrasts sharply with Jeb Bush’s marked affinity for public disclosure.
Howard Kurtz reported on Sunday night that the Hillary Clinton campaign has decided to open itself to more press interviews. Kurtz quoted the campaign’s communications director, Jennifer Palmieri: “By not doing national interviews until now, Palmieri concedes, ‘we’re sacrificing the coverage. We’re paying a price for it.’”
Meanwhile Jeb Bush chatted on July 2 with the conservative website The Daily Caller. The Daily Caller interview broke an unusually protracted no-interview period for Bush. It had been more than two weeks since he appeared on the Tonight Show with Jimmy Fallon. Bush spoke that same day, June 17, to Sean Hannity’s radio show and ABC News. Five days earlier, he’d spoken to Germany’s Der Spiegel—altogether, five interviews in the month of June. That brought his total, since the beginning of February, to 39, according to the Bush campaign.*
Chicago has seen a double-digit increase in the percentage of kids graduating from high school. Skeptics say educators and kids are manipulating the numbers—but does that even matter?
Desiree Cintron’s name used to come up a lot during “kid talk,” a weekly meeting at Chicago’s North-Grand High School at which teachers mull over a short list of freshmen in trouble.
No shock there, says Desiree now, nearly three years later.
“I was gangbanging and fighting a lot,” she says, describing her first few months of high school. “I didn’t care about school. No one cared, so I didn’t care.”
Had Desiree continued to fail in her freshman year, she would have dropped out. She is sure of that. It was only because of a strong program of academic and social supports put together by her teachers that she stuck it out. Desiree pulled up a failing grade and several Ds. She gave up gangbanging and later started playing softball. She connected with a school determined to connect with her.
Gentrification is pushing long-term residents out of urban neighborhoods. Can collective land ownership keep prices down permanently?
AUSTIN, Tex.—Not long ago, inner cities were riddled with crime and blight and affluent white residents high-tailed it to the suburbs, seeking better schools, safer streets, and, in some cases, fewer minority neighbors.
But today, as affluent white residents return to center cities, people who have lived there for years are finding they can’t afford to stay.
Take the case of the capital city of Texas, where parts of East Austin, right next to downtown, are in the process of becoming whiter, and hip restaurants, coffee shops, and even a bar catering to bicyclists are opening. Much of Austin’s minority population, meanwhile, is priced out, and so they’re moving to far-out suburbs such as Pflugerville and Round Rock, where rents are affordable and commutes are long.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.