Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in the Wikipedia entry on the 'Social Web.'
Here's the short version of the argument:

1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data covering many media sites, 69 percent of social referrals came from dark social. 20 percent came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the '90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this kind of sharing was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
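If you're curious what that aggregation looks like in practice, here's a minimal sketch in Python. The pageview log, paths, and referrer values are all hypothetical stand-ins; a real analytics pipeline is far more involved, but the core move is just counting referrer domains.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical pageview log: (requested_path, referrer_header) pairs.
# The referrer is whatever the visitor's browser chose to send -- and
# sometimes it sends nothing at all.
pageviews = [
    ("/politics/some-story/", "https://www.facebook.com/"),
    ("/technology/another-story/", "https://t.co/abc123"),
    ("/politics/some-story/", ""),  # no referrer: where did you come from?
]

def referrer_domain(referrer):
    """Reduce a raw Referer header to its host, or '' if absent."""
    return urlparse(referrer).netloc if referrer else ""

counts = Counter(referrer_domain(ref) for _path, ref in pageviews)
print(counts)  # e.g. Counter({'www.facebook.com': 1, 't.co': 1, '': 1})
```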
There are circumstances, however, in which there is no referrer data. You show up at our doorstep, and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site ("http://www.theatlantic.com").
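That last case isn't an accident; it follows from how HTTP was specified. RFC 2616 told clients not to send a Referer header when moving from a page fetched over a secure protocol to a non-secure one. Here's a toy version of that rule -- a sketch of the era's browser convention, not a description of how any particular modern browser behaves:

```python
from urllib.parse import urlparse

def browser_sends_referrer(from_url, to_url):
    """Old-school browser convention (RFC 2616, sec. 15.1.3): suppress the
    Referer header on any hop from an https: page to an http: one."""
    return not (urlparse(from_url).scheme == "https" and
                urlparse(to_url).scheme == "http")

# A Gmail reader clicking through to a non-secure news site arrives with
# no referrer at all -- indistinguishable from typed-in traffic.
print(browser_sends_referrer("https://mail.google.com/mail/", "http://www.theatlantic.com/"))  # False
print(browser_sends_referrer("http://example.com/", "http://www.theatlantic.com/"))  # True
```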
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong -- or at least if what I had experienced was a niche phenomenon, and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web-analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, which is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting the latter group as what they call direct social.
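In code, the split is almost embarrassingly simple. What follows is my rough reconstruction of the rule as described above -- emphatically not Chartbeat's actual implementation, and the landing-page list is invented for illustration:

```python
from urllib.parse import urlparse

# Pages a person plausibly types by hand or bookmarks; every other path
# is assumed to be too long to type. (Illustrative list, not Chartbeat's.)
LANDING_PATHS = {"/", "/politics", "/politics/", "/technology", "/technology/"}

def classify_visit(path, referrer):
    """Rough reconstruction of the 'direct social' accounting change."""
    if referrer:
        return "referred"      # normal, attributable traffic
    if path in LANDING_PATHS:
        return "true direct"   # plausibly typed or bookmarked
    return "dark social"       # a deep link that arrived with no referrer

print(classify_visit(
    "/technology/archive/2012/10/some-story/263409/", ""))  # dark social
```

The whole trick is in that last branch: traffic that analytics programs used to lump under "direct" gets reattributed to sharing.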
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot -- say, during a Reddit spike, or if one of our stories gets sent out on a very big email list. Whatever the day's fluctuations, dark social is nearly always our top referral source.
Perhaps, though, this was peculiar to The Atlantic. We do really well in the social world, so maybe we were outliers. I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact, no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and it adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also attach a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to account for technologies that sat outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites -- semi-privately, and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."