Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or maybe 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in the Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But that history has never felt quite right to me. For one, I spent most of the '90s as a teenager in rural Washington, and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and Usenet newsgroups and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook somehow created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
There are circumstances, however, in which there is no referrer data. You show up at our doorstep, and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone moves from a secure site (https://mail.google.com/blahblahblah) to a non-secure site (http://www.theatlantic.com).
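To make the mechanics concrete, here's a minimal sketch of what a server sees, assuming a toy Flask app; the route and log lines are illustrative, not The Atlantic's actual analytics setup.

```python
# A toy sketch of referrer visibility -- not a real analytics stack.
from flask import Flask, request

app = Flask(__name__)

@app.route("/<path:slug>")
def article(slug):
    # Browsers send an HTTP "Referer" header when a visitor follows
    # a link from another web page.
    referrer = request.headers.get("Referer")
    if referrer:
        app.logger.info("visit to /%s referred by %s", slug, referrer)
    else:
        # Email clients, IM apps, many native mobile apps, and
        # https-to-http hops send no Referer header at all.
        app.logger.info("visit to /%s with no referrer data", slug)
    return "ok"
```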
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong, or at least whether what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- link sharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to shake social media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web-analytics firm Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, which is to say, all of our articles. These people, they figured, were following some sort of link, because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
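As a rough sketch, that accounting rule might look something like this in Python; the landing-page set and the label names are my assumptions, not Chartbeat's actual implementation.

```python
# A back-of-the-envelope version of the "direct social" split.
# LANDING_PAGES and the label names are illustrative assumptions.
from typing import Optional

LANDING_PAGES = {"/", "/politics", "/technology", "/business"}

def classify_visit(referrer: Optional[str], path: str) -> str:
    if referrer:
        return "referred"       # Facebook, Twitter, search, etc.
    if path in LANDING_PAGES:
        return "direct"         # plausibly typed or bookmarked
    # No one hand-types a long article URL, so a no-referrer hit
    # on a deep page almost certainly arrived via a shared link.
    return "direct social"      # what I'd call dark social

# A no-referrer visit to a deep article page counts as dark social:
print(classify_visit(None, "/technology/archive/2012/10/some-story/263409/"))
```

Under those assumptions, every no-referrer visit to a deep page gets credited to sharing rather than to bookmarks or typed URLs.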
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories gets sent out on a very big email list. Whatever the daily swings, dark social is nearly always our top referral source.
Perhaps, though, this was true only at The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and it adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While sharing tools only arrived on the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and they persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the Web 2.0 sites, semi-privately and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one that we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."