Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in the Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But that history has never felt quite right to me. For one, I spent most of the '90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that had not been my experience? True, my social life on the web used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
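Concretely, that piece of metadata is the HTTP referrer header. To make it less abstract, here's a minimal sketch, in Python, of how an analytics backend might tally referrers; the pageview records and field names are hypothetical stand-ins, not The Atlantic's or Chartbeat's actual pipeline.

```python
# Tally where visitors came from, based on the (hypothetical) referrer
# field our server logged from each request's Referer header.
from collections import Counter
from urllib.parse import urlparse

pageviews = [
    {"path": "/technology/archive/2012/10/some-story/",
     "referrer": "https://www.facebook.com/"},
    {"path": "/magazine/archive/2012/11/another-story/",
     "referrer": ""},  # no Referer header arrived with this request
]

referral_counts = Counter()
for view in pageviews:
    ref = view["referrer"]
    # If a referrer hitched a ride, credit that site; otherwise the
    # visit lands in the catch-all "direct" bucket.
    referral_counts[urlparse(ref).netloc if ref else "direct"] += 1

print(referral_counts)
# Counter({'www.facebook.com': 1, 'direct': 1})
```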
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone moves from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site ("http://www.theatlantic.com").
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least whether what I had experienced was a niche phenomenon, and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- link sharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/" into a browser. They started counting these people as what they call direct social.
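The rule of thumb can be written down in a few lines. Below is a sketch of the split as they described it; the landing-page set and the function itself are my own stand-ins for illustration, not Chartbeat's actual implementation.

```python
# Bucket a no-referrer pageview the way the heuristic above describes:
# homepages and section fronts are plausibly typed or bookmarked,
# while deep article URLs were almost certainly reached via a shared link.
LANDING_PAGES = {"/", "/politics/", "/technology/", "/business/"}  # assumed set

def classify_no_referrer(path: str) -> str:
    """Classify a pageview that arrived with no referrer data."""
    if path in LANDING_PAGES:
        return "direct"        # someone really might type theatlantic.com
    return "direct social"     # nobody hand-types a long article URL

assert classify_no_referrer("/") == "direct"
assert classify_no_referrer(
    "/technology/archive/2012/10/atlast-the-gargantuan-telescope"
    "-designed-to-find-life-on-other-planets/263409/") == "direct social"
```

Summed over a day of traffic, the size of that second bucket is the dark social number in the figures that follow.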
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Whatever the daily swings, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic, for whatever reason. We do really well in the social world, so maybe we were outliers. To check, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and it adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."