Here's a pocket history of the web, according to many people. In the early days, the web was just pages of information linked to each other. Then along came web crawlers that helped you find what you wanted among all that information. Sometime around 2003 or 2004, the social web really kicked into gear, and thereafter the web's users began to connect with each other more and more often. Hence Web 2.0, Wikipedia, MySpace, Facebook, Twitter, etc. I'm not strawmanning here. This is the dominant history of the web as seen, for example, in this Wikipedia entry on the 'Social Web.'
1. The sharing you see on sites like Facebook and Twitter is the tip of the 'social' iceberg. We are impressed by its scale because it's easy to measure.
2. But most sharing is done via dark social means like email and IM that are difficult to measure.
3. According to new data on many media sites, 69% of social referrals came from dark social. 20% came from Facebook.
4. Facebook and Twitter do shift the paradigm from private sharing to public publishing. They structure, archive, and monetize your publications.
But it's never felt quite right to me. For one, I spent most of the 90s as a teenager in rural Washington and my web was highly, highly social. We had instant messenger and chat rooms and ICQ and USENET forums and email. My whole Internet life involved sharing links with local and Internet friends. How was I supposed to believe that Friendster and Facebook created a social web out of what was previously a lonely journey in cyberspace, when I knew that this had not been my experience? True, my web social life used tools that ran parallel to, not on, the web, but it existed nonetheless.
To be honest, this was a very difficult thing to measure. One dirty secret of web analytics is that the information we get is limited. If you want to see how someone came to your site, it's usually pretty easy. When you follow a link from Facebook to The Atlantic, a little piece of metadata hitches a ride that tells our servers, "Yo, I'm here from Facebook.com." We can then aggregate those numbers and say, "Whoa, a million people came here from Facebook last month," or whatever.
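The aggregation step is simple enough to sketch. Assuming a log of referrer URLs like the ones a web server records (the sample data and names here are mine, not The Atlantic's), counting arrivals by referring domain looks roughly like this:

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical sample of referrer URLs as a server might record them;
# an empty string means the browser sent no Referer header at all.
hits = [
    "https://www.facebook.com/",
    "https://www.facebook.com/some/post",
    "https://t.co/abc123",
    "",  # no referrer: email, IM, an https-to-http hop, etc.
    "",
]

# Aggregate hits by referring domain, the way analytics tools do.
referrals = Counter(
    urlparse(url).netloc or "(no referrer)" for url in hits
)

print(referrals.most_common())
```

The catch, as the next paragraphs explain, is that "(no referrer)" bucket: it lumps together everything that arrived without that little piece of metadata.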
There are circumstances, however, when there is no referrer data. You show up at our doorstep and we have no idea how you got here. The main situations in which this happens are email programs, instant messages, some mobile applications*, and whenever someone is moving from a secure site ("https://mail.google.com/blahblahblah") to a non-secure site ("http://www.theatlantic.com").
This means that this vast trove of social traffic is essentially invisible to most analytics programs. I call it DARK SOCIAL. It shows up variously in programs as "direct" or "typed/bookmarked" traffic, which implies to many site owners that you actually have a bookmark or typed www.theatlantic.com into your browser. But that's not actually what's happening a lot of the time. Most of the time, someone Gchatted someone a link, or it came in on a big email distribution list, or your dad sent it to you.
Nonetheless, the idea that "social networks" and "social media" sites created a social web is pervasive. Everyone behaves as if the traffic your stories receive from the social networks (Facebook, Reddit, Twitter, StumbleUpon) is the same as all of your social traffic. I began to wonder if I was wrong. Or at least that what I had experienced was a niche phenomenon and most people's web time was not filled with Gchatted and emailed links. I began to think that perhaps Facebook and Twitter had dramatically expanded the volume of -- at the very least -- linksharing that takes place.
Everyone else had data to back them up. I had my experience as a teenage nerd in the 1990s. I was not about to sway social-media marketing firms with my tales of ICQ friends and the analogy of dark social to dark energy. ("You can't see it, dude, but it's what keeps the universe expanding. No dark social, no Internet universe, man! Just a big crunch.")
And then one day, we had a meeting with the real-time web analytics firm, Chartbeat. Like many media nerds, I love Chartbeat. It lets you know exactly what's happening with your stories, most especially where your readers are coming from. Recently, they made an accounting change that they showed to us. They took visitors who showed up without referrer data and split them into two categories. The first was people who were going to a homepage (theatlantic.com) or a subject landing page (theatlantic.com/politics). The second was people going to any other page, that is to say, all of our articles. These people, they figured, were following some sort of link because no one actually types "http://www.theatlantic.com/technology/archive/2012/10/atlast-the-gargantuan-telescope-designed-to-find-life-on-other-planets/263409/." They started counting these people as what they call direct social.
The second I saw this measure, my heart actually leapt (yes, I am that much of a data nerd). This was it! They'd found a way to quantify dark social, even if they'd given it a lamer name!
On the first day I saw it, this is how big of an impact dark social was having on The Atlantic.
Just look at that graph. On the one hand, you have all the social networks that you know. They're about 43.5 percent of our social traffic. On the other, you have this previously unmeasured darknet that's delivering 56.5 percent of people to individual stories. This is not a niche phenomenon! It's more than 2.5x Facebook's impact on the site.
Day after day, this continues to be true, though the individual numbers vary a lot, say, during a Reddit spike or when one of our stories gets sent out on a very big email list. Even so, dark social is nearly always our top referral source.
Perhaps, though, it was only The Atlantic for whatever reason. We do really well in the social world, so maybe we were outliers. I went back to Chartbeat and asked them to run aggregate numbers across their media sites.
Get this. Dark social is even more important across this broader set of sites. Almost 69 percent of social referrals were dark! Facebook came in second at 20 percent. Twitter was down at 6 percent.
All in all, direct/dark social was 17.5 percent of total referrals; only search at 21.5 percent drove more visitors to this basket of sites. (FWIW, at The Atlantic, social referrers far outstrip search. I'd guess the same is true at all the more magaziney sites.)
There are a couple of really interesting ramifications of this data. First, on the operational side, if you think optimizing your Facebook page and Tweets is "optimizing for social," you're only halfway (or maybe 30 percent) correct. The only real way to optimize for social spread is in the nature of the content itself. There's no way to game email or people's instant messages. There are no power users you can contact. There are no algorithms to understand. This is pure social, uncut.
Second, the social sites that arrived in the 2000s did not create the social web, but they did structure it. This is really, really significant. In large part, they made sharing on the Internet an act of publishing (!), with all the attendant changes that come with that switch. Publishing social interactions makes them more visible and searchable, and adds a lot of metadata to your simple link or photo post. There are some great things about this, but social networks also give a novel, permanent identity to your online persona. Your taste can be monetized, by you or (much more likely) the service itself.
Third, I think there are some philosophical changes that we should consider in light of this new data. While it's true that sharing came to the web's technical infrastructure in the 2000s, the behaviors that we're now all familiar with on the large social networks were present long before those networks existed, and persist despite Facebook's eight years on the web. The history of the web, as we generally conceive it, needs to consider technologies that were outside the technical envelope of "webness." People layered communication technologies easily and built functioning social networks with most of the capabilities of the web 2.0 sites, in semi-private and without the structure of the current sites.
If what I'm saying is true, then the tradeoff we make on social networks is not the one that we're told we're making. We're not giving up our personal data in exchange for the ability to share links with friends. Massive numbers of people -- a larger set than exists on any social network -- already do that outside the social networks. Rather, we're trading our personal data for the ability to publish and archive a record of our sharing. That may be a transaction you want to make, but it might not be the one you've been told you made.
* Chartbeat datawiz Josh Schwartz said it was unlikely that the mobile referral data was throwing off our numbers here. "Only about four percent of total traffic is on mobile at all, so, at least as a percentage of total referrals, app referrals must be a tiny percentage," Schwartz wrote to me in an email. "To put some more context there, only 0.3 percent of total traffic has the Facebook mobile site as a referrer and less than 0.1 percent has the Facebook mobile app."
Walk into the offices of Memac Ogilvy Advize, an advertising firm on the third floor of a car rental building in a business district of West Amman, Jordan, and you’ll be greeted with an immense black-and-white photo of Donald Trump’s face. The red cursive text printed across it reads: “We Trumped the awards.”
The sign sits behind a reception counter boasting a large trophy won at the Dubai Lynx 2017, an annual advertising competition where Memac Ogilvy won the Grand Prix for PR (a first for any Jordanian agency) along with four other silver and gold prizes, for trolling Trump in their ads on behalf of Royal Jordanian Airlines.
Conservatives once warned that Obamacare would produce the Democratic Waterloo. Their inability to accept the principle of universal coverage has, instead, led to their own defeat.
Seven years and three days ago, the House of Representatives grumblingly voted to approve the Senate’s version of the Affordable Care Act. Democrats in the House were displeased by many of the changes introduced by Senate Democrats. But in the interval after Senate passage, the Republicans had gained a 41st seat in the Senate. Any further tinkering with the law could trigger a Republican filibuster. Rather than lose the whole thing, the House swallowed hard and accepted a bill that liberals regarded as a giveaway to insurance companies and other interest groups. The finished law proceeded to President Obama for signature on March 23, 2010.
A few minutes after the House vote, I wrote a short blog post for the website I edited in those days. The site had been founded early in 2009 to argue for a more modern and more moderate form of Republicanism. The timing could not have been worse. At precisely the moment we were urging the GOP to march in one direction, the great mass of conservatives and Republicans had turned on the double in the other direction, toward an ever more wild and even paranoid extremism. Those were the days of Glenn Beck’s 5 o’clock Fox News conspiracy rants, of Sarah Palin’s “death panels,” of Orly Taitz and her fellow Birthers, of Tea Party rallies at which men openly brandished assault rifles.
Most of management theory is inane, writes our correspondent, the founder of a consulting firm. If you want to succeed in business, don’t get an M.B.A. Study philosophy instead
During the seven years that I worked as a management consultant, I spent a lot of time trying to look older than I was. I became pretty good at furrowing my brow and putting on somber expressions. Those who saw through my disguise assumed I made up for my youth with a fabulous education in management. They were wrong about that. I don’t have an M.B.A. I have a doctoral degree in philosophy—nineteenth-century German philosophy, to be precise. Before I took a job telling managers of large corporations things that they arguably should have known already, my work experience was limited to part-time gigs tutoring surly undergraduates in the ways of Hegel and Nietzsche and to a handful of summer jobs, mostly in the less appetizing ends of the fast-food industry.
The Obama years left Republicans with excellent ratings from the Heritage Foundation, and no idea how to whip a vote.
The Republican Party’s marquee legislative initiative had just imploded in spectacular, and humiliating, fashion Friday afternoon when Paul Ryan stepped up to a podium on Capitol Hill. The beleaguered House speaker wasted no time in diagnosing the failure of his caucus. “Moving from an opposition party to a governing party comes with some growing pains,” he said. “And, well, we’re feeling those growing pains today.”
Ryan wasn’t wrong. The GOP’s inability to maneuver a health-care bill through the House this week—after seven years of promising to repeal and replace Obamacare—is, indeed, emblematic of a deeper dysfunction that grips his party. But that dysfunction may not be as easy to cure as Ryan and other GOP leaders believe.
"Where people are desperate, it is still America they count on, whether they love or scorn it, and America they blame when aid does not come."
After Donald Trump’s victory in the U.S. presidential election in November, a foreign ambassador accosted one of my deputies at the State Department, where from 2014 to early this year I served as the assistant secretary of state for democracy, human rights, and labor. “You must be so sad!” the man, a representative of a Central Asian government, said, grinning widely. “All this talk of elections being important, of democracy being important, and now look at you! Now even your new president says there were 3 million illegal votes in your election! … You must all feel so stupid these days.”
Since then, the global club of autocrats has been crowing about Trump. Sudan’s dictator Omar al Bashir praised him for focusing “on the interests of the American citizen, as opposed to those who talk about democracy, human rights, and transparency.” Iran’s Supreme Leader Ayatollah Khamenei thanked him for showing “America’s true face” by trying to ban Muslim immigration. The Cambodian government justified attacks on journalists by saying Trump, too, recognizes that “news published by [international] media institutions does not reflect the real situation.”
The College Board earns over half of all its revenues from the courses—and, in an uncertain environment, students keep being suckered.
Fraudulent schemes come in all shapes and sizes. To work, they typically wear a patina of respectability. That's the case with Advanced Placement courses, one of the great frauds currently perpetrated on American high-school students.
That's a pretty strong claim, right? You bet. But why not be straightforward when discussing a scam the scale and audacity of which would raise Bernie Madoff's eyebrows?
The miscellany of AP courses offered in U.S. high schools under the imprimatur of the College Board probably started with good intentions. The idea, going back to the 1950s, was to offer college-level courses and exams to high-school students. The courses allegedly provide students the kind of rigorous academic experience they will encounter in college as well as an opportunity to earn college credit for the work.
The House abandoned its legislation to repeal and replace the Affordable Care Act, handing President Trump and Speaker Paul Ryan a major defeat.
Updated on March 24 at 6:28 p.m. ET
To a man and woman, nearly every one of the 237 Republicans elected to the House last November made the same promise to voters: Give us control of Congress and the White House, and we will repeal and replace the Affordable Care Act.
On Friday, those lawmakers abandoned that effort, conceding that the Republican Party’s core campaign pledge of the last seven years will go unfulfilled. “I will not sugarcoat this: This is a disappointing day for us,” House Speaker Paul Ryan said at a press conference after he informed Republicans that he was ditching the American Health Care Act.
“We did not have quite the votes to replace this law,” Ryan said. “And, so yeah, we’re going to be living with Obamacare for the foreseeable future.”
Speaking after the collapse of the Republican health-care bill, the president assigned blame to plenty of parties but cast himself as a mere bystander.
Speaking in the Oval Office Friday afternoon, President Trump surveyed the wreckage of the Obamacare repeal effort and issued a crisp, definitive verdict: I didn’t do it.
The president said he didn’t blame Speaker Paul Ryan, though he had plenty of implied criticism for the speaker. “I like Speaker Ryan. He worked very hard,” Trump said, but he added: “I'm not going to speak badly about anybody within the Republican Party. Certainly there's a big history. I really think Paul worked hard.” He added ruefully that the GOP could have taken up tax reform first, instead of Obamacare—the reverse of Ryan’s desired sequence. “Now we’re going to go for tax reform, which I’ve always liked,” he said.
The philosophers he influenced set the stage for the technological revolution that remade our world.
The history of computers is often told as a history of objects, from the abacus to the Babbage engine up through the code-breaking machines of World War II. In fact, it is better understood as a history of ideas, mainly ideas that emerged from mathematical logic, an obscure and cult-like discipline that first developed in the 19th century. Mathematical logic was pioneered by philosopher-mathematicians, most notably George Boole and Gottlob Frege, who were themselves inspired by Leibniz’s dream of a universal “concept language,” and the ancient logical system of Aristotle.
Mathematical logic was initially considered a hopelessly abstract subject with no conceivable applications. As one computer scientist commented: “If, in 1901, a talented and sympathetic outsider had been called upon to survey the sciences and name the branch which would be least fruitful in [the] century ahead, his choice might well have settled upon mathematical logic.” And yet, it would provide the foundation for a field that would have more impact on the modern world than any other.
If the lobbyist’s work did indeed “greatly benefit the Putin Government,” the contract wouldn’t be especially out of the ordinary for an American lobbyist—or for Russia.
MOSCOW—The reports that former Trump campaign manager Paul Manafort had had a contract for tens of millions of dollars to “greatly benefit the Putin Government” were not exactly news here. And, in a certain sense, they didn’t have to be news in Washington, either.
Manafort, who has reportedly just volunteered to testify in the House Intelligence Committee’s investigation of Russian meddling in the U.S. election, had been a lobbyist, a notorious one, for decades. His work for less-than-democratic governments, including various African strongmen and the Marcos family of the Philippines, had been well-known in Washington and reported over the last year. It is also not uncommon for lobbyists and political operatives waiting out an administration of the opposite party to work abroad, helping foreign governments of whatever stripe sharpen their political game. Democratic operatives who had worked on the Obama and Clinton campaigns, for example, have done work advising politicians in Britain, Ukraine, and Georgia. Manafort seemed to have fewer moral qualms and filters than others—the only ticket to access his political skills, it seems, was the right amount of money—but it was all part of the swamp the Donald Trump campaign, with Manafort at the helm for about five months, promised to drain.