Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site ("http://www.theatlantic.com").
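To make the mechanics concrete, here is a minimal sketch of how a server-side analytics tool sees (or fails to see) where a visitor came from. The Referer header is real HTTP; the function name, category labels, and the specific host checks are illustrative assumptions, not any particular analytics product's code.

```python
# Minimal sketch: classify one incoming request by its Referer header.
# When the header is absent -- email clients, IM apps, many native apps,
# and HTTPS-to-HTTP transitions -- the visit looks "direct" even if the
# visitor actually followed a shared link.
from urllib.parse import urlparse

def referrer_source(headers: dict) -> str:
    """Return a coarse traffic source for one request's headers."""
    ref = headers.get("Referer")  # note HTTP's historical misspelling
    if not ref:
        return "direct"  # no referrer data at all
    host = urlparse(ref).netloc.lower()
    if "facebook.com" in host:
        return "facebook"
    if "twitter.com" in host or host == "t.co":
        return "twitter"
    return "other-referral"

print(referrer_source({"Referer": "https://www.facebook.com/"}))  # facebook
print(referrer_source({})) # direct -- the "dark" case described above
```

Aggregating these labels over a month is how a site can say "a million people came here from Facebook" while having no idea how the "direct" visitors actually arrived.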
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least whether what I had experienced was a niche phenomenon, and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
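The split described above can be sketched in a few lines. This is my reconstruction of the idea, not Chartbeat's actual code: a referrerless visit to the homepage or a section landing page counts as truly direct, while a referrerless visit to a deep article URL is presumed to be a followed link. The list of landing pages here is hypothetical.

```python
# Sketch of the accounting split: where does a referrerless visit go?
from urllib.parse import urlparse

# Hypothetical set of section landing pages for the site.
SECTION_PAGES = {"/politics", "/technology", "/business"}

def classify_no_referrer(url: str) -> str:
    """Classify a visit that arrived with no referrer data at all."""
    path = urlparse(url).path.rstrip("/")
    if path == "" or path in SECTION_PAGES:
        return "direct"       # homepage or landing page: plausibly typed
    return "dark social"      # deep link: nobody types these by hand
```

For example, `classify_no_referrer("http://www.theatlantic.com/")` yields "direct", while a long archive URL yields "dark social". The heuristic isn't perfect (a few people do bookmark individual articles), but it turns an unmeasurable category into a countable one.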
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
In her acceptance speech, the Democratic nominee took on her Republican rival by throwing Donald Trump’s own words back at him.
The unicorn of American politics, the “real Hillary Clinton”—the Hillary Clinton I’ve known for nearly 30 years—that Hillary Clinton likes to wear low-heeled shoes to a butt-kicking.
“A man you can bait with a tweet is not a man we can trust with nuclear weapons,” she said of her Republican rival, Donald Trump, while accepting the Democratic presidential nomination, the first woman in U.S. history to head a major-party ticket.
It was a sound bite for the ages, searing and on point.
“Do you really think Donald Trump has the temperament to be commander in chief?” she continued. “Donald Trump can’t even handle the rough and tumble of a presidential campaign. He loses his cool at the slightest provocation. Imagine, if you dare, imagine him in the Oval Office facing a crisis.”
The less flashy, less spirited former First Kid managed to show her mom's softer side at the DNC on Thursday.
Yes, yes, yes. Chelsea Clinton is not the most charismatic orator—as the Twittersphere was happy to point out during her brief address on Thursday night. She is like her mother that way. There’s something not quite natural about her self-presentation. She’s not stilted, exactly. But she can come across as too cautious, too reserved, too conscious of other people’s eyes upon her.
But, let’s face it, as the lead-in to Hillary’s big nominating speech, a little bit of boring was called for. Unlike some of this convention’s high-wattage speakers, there was zero chance Chelsea was going to upstage Hillary with a barnburner or tear-jerker. Chelsea wasn’t there to pump up the crowd. Her role was to comfort, to explain, to cajole, with an eye toward giving Americans a glimpse of her mother’s softer side.
The father of a Muslim American who died in Iraq confronts Donald Trump.
Khizr Khan began his speech at the Democratic National Convention on Thursday with words I wish he didn’t have to say: “Tonight we are honored to stand here as parents of Captain Humayun Khan and as patriotic American Muslims—as patriotic American Muslims with undivided loyalty to our country.”
I wish he and his wife didn’t have to stand there as the parents of a 27-year-old Army captain who was killed by suicide bombers while serving in the Iraq War. And I wish Khizr Khan hadn’t felt the need to declare his patriotism and loyalty to the United States of America. Those truths should have been self-evident.
The state of the union is not strong when an American feels compelled to clarify such things. In better times, Khizr Khan, who was born in Pakistan and moved to America from the United Arab Emirates, might have begun his speech with what he said next: “Like many immigrants, we came to this country empty-handed. We believed in American democracy—that with hard work and [the] goodness of this country, we could share in and contribute to its blessings.”
Chris Morris’s brutal satire aired its last and most controversial episode in 2001, but its skewering of the news media feels more relevant than ever.
A sex offender is thrown in the stocks, presented with a small child, and asked if he wants to molest him. A mob of protesters is thrown a "dummy full of guts" that is stomped to pieces within seconds. A radio host insists that pedophiles have "more genes in common with crabs" than the rest of humanity, adding, "There's no real evidence for [that], but it is scientific fact."
It’s hard to pinpoint the most cringe-inducing moment on “Paedogeddon,” a special episode of the British TV satire Brass Eye. But 15 years after the episode aired, it remains a totemic, terrifying satirical vision. Few comedies since have dared to cross the boundaries of taste with such impunity.
“Paedogeddon” aired in the U.K. in the summer of 2001, a year after the murder of a young girl had sparked national hysteria over the country’s sex-offender registry. Britain’s most-read newspaper led a campaign to publish the names and locations of all 110,000 convicted sex offenders, prompting a riot in which an angry mob ransacked the home of an ex-con. Brass Eye, a parody of a 60 Minutes-like newsmagazine show, had been dormant after airing one season in the U.K. in 1997. But it returned four years later for this surprise broadcast, one that saw its furious (fictional) anchors barking from a dark studio about the plague of seemingly super-powered child molesters stalking the nation, holding a funhouse mirror up to the climate of paranoia and fear that had built up around the country. It was a bold, wildly insensitive piece of comedy, but one that captured the growing madness of the 24-hour news media and foreshadowed some uglier aspects of its future.
The Fox host’s insistence that black laborers building the White House were “well-fed and had decent lodgings” fits in a long history of insisting the “peculiar institution” wasn’t so bad.
In her widely lauded speech at the Democratic National Convention on Monday, Michelle Obama reflected on the remarkable fact of her African American family living in the executive mansion. “I wake up every morning in a house that was built by slaves. And I watch my daughters, two beautiful, intelligent, black young women, playing with their dogs on the White House lawn,” she said.
On Tuesday, Fox News host Bill O’Reilly discussed the moment in his Tip of the Day. In a moment first noticed by the liberal press-tracking group Media Matters, O’Reilly said this:
As we mentioned in the Talking Points Memo, Michelle Obama referenced slaves building the White House in referring to the evolution of America in a positive way. It was a positive comment. The history behind her remark is fascinating. George Washington selected the site in 1791, and as president laid the cornerstone in 1792. Washington was then running the country out of Philadelphia.
Slaves did participate in the construction of the White House. Records show about 400 payments made to slave masters between 1795 and 1801. In addition, free blacks, whites, and immigrants also worked on the massive building. There were no illegal immigrants at that time. If you could make it here, you could stay here.
In 1800, President John Adams took up residence in what was then called the Executive Mansion. It was only later on they named it the White House. But Adams was in there with Abigail, and they were still hammering nails, the construction was still going on.
Slaves that worked there were well-fed and had decent lodgings provided by the government, which stopped hiring slave labor in 1802. However, the feds did not forbid subcontractors from using slave labor. So, Michelle Obama is essentially correct in citing slaves as builders of the White House, but there were others working as well. Got it all? There will be a quiz.
Narcissism, disagreeableness, grandiosity—a psychologist investigates how Trump’s extraordinary personality might shape his possible presidency.
In 2006, Donald Trump made plans to purchase the Menie Estate, near Aberdeen, Scotland, aiming to convert the dunes and grassland into a luxury golf resort. He and the estate’s owner, Tom Griffin, sat down to discuss the transaction at the Cock & Bull restaurant. Griffin recalls that Trump was a hard-nosed negotiator, reluctant to give in on even the tiniest details. But, as Michael D’Antonio writes in his recent biography of Trump, Never Enough, Griffin’s most vivid recollection of the evening pertains to the theatrics. It was as if the golden-haired guest sitting across the table were an actor playing a part on the London stage.
“It was Donald Trump playing Donald Trump,” Griffin observed. There was something unreal about it.
The Democrat promised voters she’d do her job intelligently and doggedly—and help them be the heroes of their own lives.
The Democratic convention, which culminated on Thursday night with Hillary Clinton, was inverted. Usually, supporting actors cover policy specifics and flay the opposing candidate. The nominee comes on at the end and offers a vision.
Hillary Clinton doesn’t do vision well. So, wisely, her campaign turned the paradigm on its head. The emotion, the vision, the rhetorical power came from others: from Barack Obama and Joe Biden and ordinary people like disability rights activist Anastasia Somoza; Khizr Khan, whose son died in Iraq; and the families of slain police officers and victims of police violence. Clinton did what she’s good at: She talked about public policy and she proved that she’s not at all intimidated by Donald Trump.
Hillary Clinton accepted the Democratic nomination in Philadelphia, ratifying a promise made there 240 years before—that all are created equal.
PHILADELPHIA—“Daddy,” my daughter recently asked me, “Why are there no girl presidents? Is it because boys are stronger than girls? Because they’re smarter?”
It left me speechless.
On Thursday night, in the city where the Founders declared all men created equal, I found my answer. It’s because no major party has ever tried nominating one before.
“Tonight, we’ve reached a milestone in our nation’s march toward a more perfect union: the first time that a major party has nominated a woman for president,” Clinton said as she accepted the nomination. “Standing here as my mother’s daughter, and my daughter’s mother, I’m so happy this day has come.”
It wasn’t the theme of her speech. But it was the unspoken subtext that ran through it. And Clinton took pains to frame the achievement not as the triumph of some subset of Americans, but as a victory for all Americans. She proclaimed herself both “happy for grandmothers and little girls,” but also “happy for boys and men—because when any barrier falls in America, it clears the way for everyone.”
Psychologists have long debated how flexible someone’s “true” self is.
Almost everyone has something they want to change about their personality. In 2014, a study that traced people’s goals for personality change found that the vast majority of its subjects wanted to be more extraverted, agreeable, emotionally stable, and open to new experiences. A whopping 97 percent said they wished they were more conscientious.
These desires appeared to be rooted in dissatisfaction. People wanted to become more extraverted if they weren’t happy with their sex lives, hobbies, or friendships. They wanted to become more conscientious if they were displeased with their finances or schoolwork. The findings reflect the social psychologist Roy Baumeister’s notion of “crystallization of discontent”: Once people begin to recognize larger patterns of shortcomings in their lives, he contends, they may reshuffle their core values and priorities to justify improving things.
The Democratic nominee for United States president made a play for progressives, moderates, and Independents alike during her address in Philadelphia on Thursday night.
“America's strength doesn't come from lashing out,” Hillary Clinton said Thursday, delivering a harsh rebuke to Donald Trump as she accepted the Democratic nomination for U.S. president.
Clinton’s speech capped the Democratic National Convention in Philadelphia, where she made history as the first female presidential nominee of a major party. While Clinton did not skip over the historic aspect of her nomination, she spent most of her hour-long speech emphasizing two interlocking themes: the importance of community and togetherness, and the fundamental unfitness of the Republican nominee for office. It was not so dark and ominous a speech as Trump’s own acceptance speech a week ago in Cleveland, but it was a negative speech: a warning against the danger posed to America by a Trump presidency.