Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Some time around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that somehow Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site (http://www.theatlantic.com).
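To make the mechanism concrete, here is a minimal, hypothetical sketch of how a server sees (or doesn't see) where a visitor came from. The function name and return strings are illustrative assumptions, not The Atlantic's actual analytics code; the point is simply that the Referer header is either present or entirely absent.

```python
# Hypothetical sketch: how a web server sees -- or doesn't see -- referrer data.
# Names are illustrative, not any real site's analytics code.

def describe_visit(headers):
    """Describe where a visitor appears to have come from, based on the
    HTTP Referer header (note the spec's historical misspelling)."""
    referrer = headers.get("Referer")
    if referrer:
        return f"arrived from {referrer}"
    # Email clients, IM apps, many mobile apps, and https -> http
    # transitions all send no Referer header at all.
    return "arrived with no referrer -- origin unknown"

print(describe_visit({"Referer": "https://www.facebook.com/"}))
print(describe_visit({}))  # an emailed or IM'd link looks like this
```

The second case is the crux: from the server's perspective, a link pasted into an email is indistinguishable from no origin at all.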
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that visitors actually bookmarked the site or typed www.theatlantic.com into the browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
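The heuristic Chartbeat applied can be sketched in a few lines. This is a hedged reconstruction of the logic as described above, not Chartbeat's actual implementation: the list of landing-page paths and the shortened article URL are illustrative assumptions.

```python
# Sketch of the "direct social" heuristic described above: visitors with no
# referrer data are split by what kind of page they landed on. The path list
# is an illustrative assumption, not Chartbeat's real configuration.

from urllib.parse import urlparse

# Homepage and section landing pages -- plausibly typed or bookmarked.
LANDING_PATHS = {"/", "/politics", "/technology"}

def classify_no_referrer_visit(url):
    """Classify a visit that arrived with no referrer data."""
    path = urlparse(url).path.rstrip("/") or "/"
    normalized_landings = {p.rstrip("/") or "/" for p in LANDING_PATHS}
    if path in normalized_landings:
        # People really do type "theatlantic.com" into the address bar.
        return "direct"
    # Nobody types a long article URL by hand; assume a shared link.
    return "dark social"

print(classify_no_referrer_visit("http://www.theatlantic.com/"))
print(classify_no_referrer_visit(
    "http://www.theatlantic.com/technology/archive/2012/10/some-article/263409/"))
```

The design choice worth noting is that this is an inference, not a measurement: it recovers the invisible social traffic only by assuming that deep article URLs are never typed by hand.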
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or if one of our stories gets sent out on a very big email list or what have you. Day after day, though, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. So, I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible, searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before they existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
New research confirms what they say about nice guys.
Smile at the customer. Bake cookies for your colleagues. Sing your subordinates’ praises. Share credit. Listen. Empathize. Don’t drive the last dollar out of a deal. Leave the last doughnut for someone else.
Sneer at the customer. Keep your colleagues on edge. Claim credit. Speak first. Put your feet on the table. Withhold approval. Instill fear. Interrupt. Ask for more. And by all means, take that last doughnut. You deserve it.
Follow one of those paths, the success literature tells us, and you’ll go far. Follow the other, and you’ll die powerless and broke. The only question is, which is which?
Of all the issues that preoccupy the modern mind—Nature or nurture? Is there life in outer space? Why can’t America field a decent soccer team?—it’s hard to think of one that has attracted so much water-cooler philosophizing yet so little scientific inquiry. Does it pay to be nice? Or is there an advantage to being a jerk?
Some fans are complaining that Zack Snyder’s envisioning of the Man of Steel is too grim—but it’s less a departure than a return to the superhero’s roots.
Since the official teaser trailer for Batman v Superman: Dawn of Justice debuted online in April, fans and critics alike have been discussing the kind of Superman Zack Snyder is going to depict in his Man of Steel sequel. The controversy stems from Snyder’s decision to cast Superman as a brooding, Dark Knight-like character, who cares more about beating up bad guys than saving people. The new direction has proved divisive among Superman fans: Some love the new incarnation, citing him as an edgier, more realistic version of the character.
But Snyder’s is a different Superman than the one fans grew up with, and many have no problem expressing their outrage over it. Even Mark Waid, the author of Superman: Birthright (one of the comics the original film is based on), voiced his concern about Man of Steel’s turn toward bleakness when it came out in 2013:
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
19 Kids and Counting built its reputation on preaching family values, but the mass-media platforms that made the family famous might also be their undoing.
On Thursday, news broke that Josh Duggar, the oldest son of the Duggar family's 19 children, had, as a teenager, allegedly molested five underage girls. Four of them, allegedly, were his sisters.
The information came to light because, in 2006—two years before 17 Kids and Counting first aired on TLC, and thus two years before the Duggars became reality-TV celebrities—the family recorded an appearance on The Oprah Winfrey Show. Before the taping, an anonymous source sent an email to Harpo warning the production company about Josh’s alleged molestation. Harpo forwarded the email to authorities, triggering a police investigation (the Oprah appearance never aired). The news was reported this week by In Touch Weekly—after the magazine filed a Freedom of Information Act request to see the police report on the case—and then confirmed by the Duggars in a statement posted on Facebook.
In an interview, the U.S. president ties his legacy to a pact with Tehran, argues ISIS is not winning, warns Saudi Arabia not to pursue a nuclear-weapons program, and anguishes about Israel.
On Tuesday afternoon, as President Obama was bringing an occasionally contentious but often illuminating hour-long conversation about the Middle East to an end, I brought up a persistent worry. “A majority of American Jews want to support the Iran deal,” I said, “but a lot of people are anxiety-ridden about this, as am I.” Like many Jews—and also, by the way, many non-Jews—I believe that it is prudent to keep nuclear weapons out of the hands of anti-Semitic regimes. Obama, who earlier in the discussion had explicitly labeled the supreme leader of Iran, Ayatollah Ali Khamenei, an anti-Semite, responded with an argument I had not heard him make before.
“Look, 20 years from now, I’m still going to be around, God willing. If Iran has a nuclear weapon, it’s my name on this,” he said, referring to the apparently almost-finished nuclear agreement between Iran and a group of world powers led by the United States. “I think it’s fair to say that in addition to our profound national-security interests, I have a personal interest in locking this down.”
Why agriculture may someday take place in towers, not fields
A couple of Octobers ago, I found myself standing on a 5,000-acre cotton farm on the outskirts of Lubbock, Texas, shoulder-to-shoulder with a third-generation cotton farmer. He swept his arm across the flat, brown horizon of his field, which was at that moment being plowed by an industrial-sized picker—a toothy machine as tall as a house and operated by one man. The picker’s yields were being dropped into a giant pod to be delivered late that night to the local gin. And far beneath our feet, the Ogallala aquifer dwindled away at its frighteningly swift pace. When asked about this, the farmer spoke of reverse osmosis—the process of desalinating water—which he seemed to put his faith in, and which kept him unafraid of famine and permanent drought.
Advocates say that a guaranteed basic income can lead to more creative, fulfilling work. The question is how to fund it.
Scott Santens has been thinking a lot about fish lately. Specifically, he’s been reflecting on the aphorism, “If you give a man a fish, he eats for a day. If you teach a man to fish, he eats for life.” What Santens wants to know is this: “If you build a robot to fish, do all men starve, or do all men eat?”
Santens is 37 years old, and he’s a leader in the basic income movement—a worldwide network of thousands of advocates (26,000 on Reddit alone) who believe that governments should provide every citizen with a monthly stipend big enough to cover life’s basic necessities. The idea of a basic income has been around for decades, and it once drew support from leaders as different as Martin Luther King Jr. and Richard Nixon. But rather than waiting for governments to act, Santens has started crowdfunding his own basic income of $1,000 per month. He’s nearly halfway to his goal.
No police officers will serve time for the November 2012 shooting deaths of two unarmed black civilians.
On November 29, 2012, police officers and witnesses heard what appeared to be gunshots coming from a car driving near a police station in Cleveland. A high-speed car chase ensued, drawing in over 100 officers on duty, before the police managed to corner the car. Thirteen police officers then fired 137 rounds of ammunition at the vehicle, whose occupants Cleveland police suspected were armed. After the other officers stopped firing, 31-year-old Michael Brelo climbed on top of the hood of the suspect’s car and fired 15 more rounds at close range. When the shooting stopped, the car’s occupants, 43-year-old Timothy Russell and 30-year-old Malissa Williams, were dead. Both were unarmed. The “gunshot” witnesses heard turned out to be a backfiring car.
Sean Wilentz discusses his latest book, "Bob Dylan in America," which describes the singer's influence on our nation's culture.
Nearly half a century after he released his first album, Bob Dylan continues to release new albums (including, last year, a compilation of Christmas songs) and tour the country playing concerts. Sean Wilentz, an American history professor at Princeton University and "historian-in-residence" at BobDylan.com, traces Dylan's influence on American culture in his new book, Bob Dylan in America. Here, he discusses how Dylan shaped his generation—and whether there's a similar artist in today's music scene.
The book is called Bob Dylan in America. What's Dylan's place in our nation's cultural history?
He's the most important songwriter of the last 50 years, in a culture in which songwriting has always been a major force, a major component.