Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact, no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before they existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one that we're told we're making. We're not giving our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."