While marketers may want to boil down people's sharing behavior to one easy equation, that's just not how social networks function.
For many, going viral is the high point of their online life cycle. For media companies, it may soon be their primary source of subsistence.
The end of 2011 suggested as much: social media outpaced search as a top online activity last year, and Google's decision to incorporate Google+ information into search results indicates an increasing emphasis on sharing and social referrals by major Internet companies. For media outlets, this points to an increasingly disrupted future, in which websites lose their appeal as stand-alone content destinations. Felix Salmon articulates this sentiment at the Columbia Journalism Review's Audit desk: "HuffPo is built on the idea that when stories are shared on Twitter or Facebook, that will drive traffic back to huffingtonpost.com, where it can then monetize that traffic by selling it to advertisers," writes Salmon. "But in future, the most viral stories are going to have a life of their own, being shared across many different platforms and being read by people who will never visit the original site on which they were published."
But not everyone has the same viral intuition that Ben Huh of I Can Has Cheezburger or the creators of the now-famous "Old Spice Guy" ads do. So how, if at all, can mere mortals (and media companies) harness the power of virality? In reality, the key ingredient of virality isn't the number of share buttons or Twitter followers you have, but your sensitivity to culture, that body of nuances that goes beyond demographic breakdowns. Each sharing ecosystem on the web has its own unique subculture, its own rules of order and norms of behavior. The secret to going viral is seamlessly navigating these worlds.
Until now, media companies have looked at virality as a function of infrastructure: install every share tool imaginable on your website, publish an article, and let natural Facebook activity do the rest. At TechCrunch, entrepreneur Uzi Shmilovic examined eight ways Internet giants like Facebook and LinkedIn have used virality as a vehicle for success. Shmilovic emphasizes using a "Virality Coefficient" -- "how many new users on average does one user of your product 'infect'" -- to measure the virality of a piece of information. A coefficient greater than 1 indicates exponential growth, the type that describes wildly successful Internet campaigns like the Old Spice Guy:
The virality coefficient is super important, but there's one other critical number that you should pay attention to--the cycle time. The cycle time is the average time it takes from the moment that one of your users performs a viral action to the moment that a new user signs up because of this very action. It makes a huge difference if your cycle time is one day or 60 days.
David Skok of Matrix Partners gave a presentation about that recently, and actually devised a formula to calculate the amount of users you will get after a period of time based on the Virality Coefficient (K) and the Cycle Time (ct).
Having virality expressed in this way is beneficial as it boils down virality to the optimization of two variables: maximize K and minimize ct.
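To see how those two variables interact, here is a minimal Python sketch of the compounding model Skok describes. The function name and parameters are illustrative, and the model assumes K stays constant across cycles -- a simplification, not something taken from Skok's or Shmilovic's own materials:

```python
def viral_users(initial_users, k, cycle_days, elapsed_days):
    """Estimate cumulative users under a simple viral-growth model:
    every cycle_days, each existing user 'infects' k new users on
    average. Assumes k holds constant across cycles."""
    cycles = elapsed_days / cycle_days
    if k == 1:
        # k = 1: each cycle adds exactly one new cohort -- linear growth.
        return initial_users * (cycles + 1)
    # Sum of the geometric series u0 + u0*k + u0*k^2 + ... + u0*k^cycles
    return initial_users * (k ** (cycles + 1) - 1) / (k - 1)

# Same coefficient (K = 1.2), same 60 days -- only the cycle time differs.
print(viral_users(10, 1.2, cycle_days=1, elapsed_days=60))   # ~3.4 million users
print(viral_users(10, 1.2, cycle_days=60, elapsed_days=60))  # 22 users
```

Under these assumptions, ten initial users become either a few dozen or a few million over the same two months depending only on the cycle time, which is exactly the difference the passage above points to.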
The problem with Shmilovic's analysis is that it assumes virality is a structural property that can be optimized or reduced to a consistent formula. His recommendations, designed for marketers, are based on creating systems that maximize the space for sharing, distinguished by marketing buzzwords like "communication virality" ("the product is used to communicate with other people, some of which might be potential users") or "embeddable virality" ("new people who are exposed to the content embed it on their own website, promoting it even further").
The emphasis on structural factors isn't inherently a bad thing: advancements in technology (particularly in communications) have radically transformed the speed and scope of viral products. The Economist's recent exploration of how Martin Luther's Ninety-Five Theses on the Power and Efficacy of Indulgences went viral across the continent through contemporary media -- namely the printing press and multiple translations into the various dialects that permeated 16th-century Europe -- is a perfect (and fascinating) example. In the social space, the prevalence and placement of tools like the Facebook "like" button can determine whether a compelling article pushes Shmilovic's Virality Coefficient past the tipping point of 1. The Huffington Post is the ideal model here: the site amplifies its power as a clearinghouse for all things Internet-famous by deeply integrating every conceivable social network and sharing tool into its article pages. When it comes to the promulgation of ideas, infrastructure matters.
But festooning a page with strings of shiny share buttons (Digg! Mixx! Bookmerken! Dipdive!) is a wholly incomplete approach to the spread of information; it assumes that all social behavior and all social networks or online communities are essentially the same. Yet the human mind isn't a uniform filter, and sharing behavior differs across ubiquitous platforms like Google, Twitter, and Facebook. "Nobody can see what you search on Google, so popular search trends tend to reflect the more reptilian brain in people," explained Jonah Peretti, founder of viral hub BuzzFeed, in 2010. "Celebrity gossip, sex, hair transplants ... nobody tweets about this stuff." A brief glance at the most-shared stories of 2011 on Facebook, Twitter, and LinkedIn highlights their differences in focus.
Obviously, the culture of each online ecosystem is shaped by its particular structure, but structure bears mostly on the how and where of sharing; in reality, it is the why that shapes how ideas take hold. Geert Hofstede, the influential Dutch social psychologist and anthropologist and a pioneer in the field of cross-cultural studies, has a succinct take on the role of technology in shaping the spread of ideas and information in his classic work Culture's Consequences. "Electronic communication does not eliminate cultural differences, just as faster and easier travel has not reduced cultural rifts," wrote Hofstede. "The software of the machines may be globalized, but the software of the minds that use the terminals is not":
Electronic communication enormously increases the amount of information accessible for its users, but it does not increase their capacity to absorb this information or change their preexisting value systems. Users have to select what information they recognize; this has always been the case, only the selection task has become much larger. We select our information according to our values. Like our parents, we read newspapers that we expect to give our preferred points of view, and, confronted with the new bulk of electronic information, we again pick whatever reinforces our preexisting ideas. Our relatively brief experience with the Internet so far has shown that people use it to do what they were doing anyway, only maybe more and faster.
People don't engage with the unique structures of social networks as blank slates; they enter each ecosystem with a particular set of values, values that shape the nature of a community and, in turn, the type of ideas and products that take hold. As Alexis Madrigal noted, different networks fill the various social niches in our lives. This is a valuable lesson not just for marketers and media companies, but for any person or organization looking to spread a set of ideas across the vastness of the Web.
Erving Goffman's analogy of social life to the theater from The Presentation of Self in Everyday Life comes to mind. Goffman argued that the social actor has the ability to choose his stage and props, as well as the costume he would wear in front of a specific audience. On the Internet, we function on many different stages, with a wardrobe bursting with meticulously crafted costumes.