Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook had somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
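The mechanics above can be sketched with a small classifier. This is a hypothetical illustration, not The Atlantic's actual analytics code: it just shows how a server might read the referrer metadata a request carries, and what it sees when an email client, IM window, or HTTPS-to-HTTP hop strips that metadata away.

```python
# Hypothetical sketch of how an analytics backend might read referrer data.
# Real pipelines are far more involved; the function name is illustrative.
from urllib.parse import urlparse

def describe_referral(referer_header):
    """Classify an incoming request by its (possibly missing) Referer header."""
    if not referer_header:
        # Email programs, instant messages, some mobile apps, and
        # secure-to-non-secure navigations all arrive with no referrer.
        return "no referrer data"
    host = urlparse(referer_header).netloc
    return f"referred by {host}"

print(describe_referral("https://www.facebook.com/some-post"))
# -> referred by www.facebook.com
print(describe_referral(None))
# -> no referrer data
```

The point of the sketch is the asymmetry: when the header is present, attribution is trivial; when it is absent, the server genuinely cannot tell a bookmark from a Gchatted link.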
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that visitors actually had a bookmark or typed www.theatlantic.com into the browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong, or at least whether what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
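Chartbeat's accounting change is simple enough to sketch. This is my own illustrative reconstruction under stated assumptions, not their code: the landing-page list and function names are invented, and the heuristic is exactly the one described above, that referrer-less visits to a homepage or section front are plausibly typed or bookmarked, while referrer-less visits to article pages were almost certainly shared links.

```python
# Illustrative reconstruction of the accounting change described above.
# LANDING_PAGES and all names are assumptions, not Chartbeat's actual code.
from urllib.parse import urlparse

LANDING_PAGES = {"/", "/politics"}  # homepage and section fronts

def classify_visit(url, referer):
    """Split referrer-less traffic into plausible direct vs. dark social."""
    if referer:
        return "referred"
    path = urlparse(url).path.rstrip("/") or "/"
    if path in LANDING_PAGES:
        # People really do type theatlantic.com or keep it bookmarked.
        return "direct"
    # Nobody types a long article URL by hand; assume a shared link.
    return "dark social"

print(classify_visit("http://www.theatlantic.com/", None))
# -> direct
print(classify_visit("http://www.theatlantic.com/technology/archive/2012/10/"
                     "atlast-the-gargantuan-telescope-designed-to-find-life-"
                     "on-other-planets/263409/", None))
# -> dark social
```

The heuristic will miscount the rare reader who bookmarks an individual article, but in aggregate it turns an unmeasurable bucket into a usable estimate.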
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact, no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before they existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
The social network learns more about its users than they might realize.
Facebook, you may have noticed, turned into a rainbow-drenched spectacle following the Supreme Court’s decision Friday that same-sex marriage is a constitutional right.
By overlaying their profile photos with a rainbow filter, Facebook users began celebrating in a way we haven't seen since March 2013, when 3 million people changed their profile images to a red equals sign—the logo of the Human Rights Campaign—as a way to support marriage equality. This time, Facebook provided a simple way to turn profile photos rainbow-colored. More than 1 million people changed their profile in the first few hours, according to the Facebook spokesperson William Nevius, and the number continues to grow.
“This is probably a Facebook experiment!” joked the MIT network scientist Cesar Hidalgo on Facebook yesterday. “This is one Facebook study I want to be included in!” wrote Stacy Blasiola, a communications Ph.D. candidate at the University of Illinois, when she changed her profile.
People labeled “smart” at a young age don’t deal well with being wrong. Life grows stagnant.
ASPEN, Colo.—At whatever age smart people develop the idea that they are smart, they also tend to develop vulnerability around relinquishing that label. So the difference between telling a kid “You did a great job” and “You are smart” isn’t subtle. That is, at least, according to one growing movement in education and parenting that advocates for retirement of “the S word.”
The idea is that when we praise kids for being smart, those kids think: Oh good, I'm smart. And then later, when those kids mess up, which they will, they think: Oh no, I'm not smart after all. People will think I’m not smart after all. And that’s the worst. That’s a risk to avoid, they learn. “Smart” kids stand to become especially averse to making mistakes, which are critical to learning and succeeding.
The question is at the center of the Greek crisis.
In 1961, the economist Robert Mundell published a paper laying out, per the title, “A Theory of Optimum Currency Areas.” In it, he inquired about the appropriate geographic extent of a shared unit of money. Was it the world? A country? Part of a country? A border-spanning region of, say, the western parts of the United States and Canada, with a separate currency circulating in the eastern parts of the two countries?
“It might seem at first that the question is purely academic,” he wrote, “since it hardly seems within the realm of political feasibility that national currencies would ever be abandoned in favor of any other arrangement.” But it was worth considering anyway, in part because “certain parts of the world are undergoing processes of economic integration and disintegration,” and an idea of what an “optimum currency area” would look like could help “clarify the meaning of these experiments.”
The star has been accused of having a “large blind spot” on issues of race—but testing the boundaries of jokes is part of the process of stand-up.
There’s a fine line in comedy between subversive and offensive, and with every meteoric rise from stand-up to film and television stardom these days, there tends to be controversy over whether or not that line has ever been crossed. Amy Schumer, whose Comedy Central sketch show Inside Amy Schumer has been dominating the Internet on a weekly basis since its third season debuted in April, and who stars in the upcoming Judd Apatow comedy Trainwreck, is the latest figure to experience the pitfalls of being under such sharp scrutiny. A recent profile of Schumer in The Guardian by Monica Heisey, although largely positive, criticizes the comedian for having a “shockingly large blind spot” on race—and cites some clunky jokes she’s made about Latinos as examples.
Over the last two weeks, Republican presidential candidates have repeatedly missed opportunities to demonstrate that they care about communities outside of their traditional base.
After Mitt Romney’s defeat in 2012, the Republican National Committee published an “autopsy.” “When it comes to social issues,” the autopsy declared, “the Party must in fact and deed be inclusive and welcoming. If we are not, we will limit our ability to attract young people.” The autopsy also added that, “we need to go to communities where Republicans do not normally go to listen and make our case. We need to campaign among Hispanic, black, Asian, and gay Americans and demonstrate we care about them, too.”
The last two weeks, more than any since Romney’s defeat, illustrate how miserably the GOP has failed.
Start with June 17, when Dylann Roof, a young white man enamored of the Confederate flag, murdered nine African Americans in church. Within three days, Romney had called for the Confederate flag’s removal from South Carolina’s capitol. Four days later, the state’s Republican governor and senators called for its removal too. But during that entire week—even as it became obvious that the politics of the flag were shifting—not a single GOP presidential candidate forthrightly called for it to be taken down. Instead, they mostly called it a state decision, a transparent dodge politicians deploy when they don’t want to make a difficult call.
For centuries, experts have predicted that machines would make workers obsolete. That moment may finally be arriving. Could that be a good thing?
1. Youngstown, U.S.A.
The end of work is still just a futuristic concept for most of the United States, but it is something like a moment in history for Youngstown, Ohio, one its residents can cite with precision: September 19, 1977.
For much of the 20th century, Youngstown’s steel mills delivered such great prosperity that the city was a model of the American dream, boasting a median income and a homeownership rate that were among the nation’s highest. But as manufacturing shifted abroad after World War II, Youngstown steel suffered, and on that gray September afternoon in 1977, Youngstown Sheet and Tube announced the shuttering of its Campbell Works mill. Within five years, the city lost 50,000 jobs and $1.3 billion in manufacturing wages. The effect was so severe that a term was coined to describe the fallout: regional depression.
The second episode of the new season was a slow burner with a dramatic twist.
Let’s start at the beginning, with Frank in bed with his wife, Jordan, discussing water stains on the ceiling and childhood entombments. I don’t know about you guys, but I found this whole bit slack and familiar. Maybe there was a two-minute scene in there, but five? Maybe a more charismatic actor could have pulled off that lengthy monologue. But Vince Vaughn is no Robert Shaw, and his childhood basement is no U.S.S. Indianapolis.
Tuesday is the official deadline for the Greek government to either make a deal with debtors or face default and its consequences.
Fitch Ratings Agency has downgraded Greece one level to ‘CC’ as the country nears its midnight deadline. In a release, the agency cited the ongoing political and economic turmoil that has kept the country from making a deal to avoid defaulting on its debts as the reason for the downgrade.
With less than one hour left until midnight, it is a virtual certainty that the country will not receive an extension on loans owed to its creditors. In an interview, Eurogroup President Jeroen Dijsselbloem said, “The facts are that the program will expire tonight, and Greece will be in default tomorrow. That is something that I don’t think we can stop between now and tomorrow morning.”
New Jersey Governor Chris Christie was once seen as a frontrunner. As he starts off his campaign now, he’s near the back of the pack.
Did Chris Christie already miss his chance to be president? Back in 2012, the New Jersey governor was wildly popular at home, Republicans were clamoring for him to enter the presidential race, and donors were lined up to write checks.
When he jumped into the race Tuesday, he did so as a beleaguered insurgent. He’s among the last entrants to a crowded field, he has much ground to cover in fundraising, and his political fortunes are in tatters. Just three in 10 New Jerseyans approve of his handling of his job, and Christie’s favorability is deeply underwater among Republican primary voters.
Clearly, it’s been a rough three years for Christie. One might peg the start as Christie’s speech at the 2012 Republican National Convention, panned by party insiders as self-serving; or perhaps it was his embrace of President Obama on an airstrip after Hurricane Sandy. Then there was “Bridgegate,” the controversy over lane closures on the George Washington Bridge. While Christie himself has escaped legal trouble so far, two former top aides have been charged with crimes and a third has pled guilty. The scandal is particularly damaging for Christie, who says he was unaware of the apparently politically punitive closures, since his case for office rests on credibility and competence. While it’s gotten less national attention, Christie’s in-state struggles have a lot to do with the Garden State economy. Atlantic City is shutting down. (Maybe everything that dies someday comes back, but not soon enough for Christie’s campaign.) The state’s debt rating has been cut nine times during the Christie governorship. A judge also ruled that a Christie plan to cut pension payments was illegal.
The power in the president’s eulogy for Clementa Pinckney came not from his singing, but from the silence that preceded it.
Coverage of the memorial service held for Reverend Clementa Pinckney in Charleston last week focused largely on the surprising moment when the leader of the free world broke into song. That song, of course, was “Amazing Grace” and the president sang it distinctly in the style of the black church.
For all the attention Obama’s unexpected performance received, though, it’s worth taking another look at the “Amazing Grace” clip, this time watching for the silence. His singing seems to be a release of the collective tension that had been building for a week after the Emanuel A.M.E. shooting. But the preceding pause seems to hold its hearers captive. Though he is frequently interrupted with cheers and amens throughout his eulogy for Reverend Pinckney, the pause he takes 35 minutes into the speech is easily the longest break from the text before him.