While marketers may want to boil down people's sharing behavior to one easy equation, that's just not how social networks function.
For many, going viral is the high point of their online life cycle. For media companies, it may soon be their primary source of subsistence.
The end of 2011 suggested as much: social media outpaced search as a top online activity last year, and Google's decision to incorporate Google+ information into search results indicates an increasing emphasis on sharing and social referrals by major Internet companies. For media outlets, this indicates an increasingly disrupted future, where websites lose their appeal as stand-alone content destinations. Felix Salmon articulates this sentiment at the Columbia Journalism Review's Audit desk: "HuffPo is built on the idea that when stories are shared on Twitter or Facebook, that will drive traffic back to huffingtonpost.com, where it can then monetize that traffic by selling it to advertisers," writes Salmon. "But in future, the most viral stories are going to have a life of their own, being shared across many different platforms and being read by people who will never visit the original site on which they were published."
But not everyone has the same viral intuition that Ben Huh of I Can Has Cheezburger or the creators of the now-famous "Old Spice Guy" ads do. So how, if at all, can mere mortals (and media companies) harness the power of virality? In reality, the key ingredient of virality isn't the number of share buttons or Twitter followers you have, but your sensitivity to culture, that body of nuances that goes beyond demographic breakdowns. Each sharing ecosystem on the web has its own unique subculture, its own set of rules of order and norms of behavior. The secret to going viral is seamlessly navigating these worlds.
Until now, media companies have looked at virality as a function of infrastructure: install every share tool imaginable on your website, publish an article, and let natural Facebook activity do the rest. At TechCrunch, entrepreneur Uzi Shmilovic examined eight ways Internet giants like Facebook and LinkedIn have used virality as a vehicle for success. Shmilovic emphasizes using a "Virality Coefficient" -- "how many new users on average does one user of your product 'infect'" -- to measure the virality of a piece of information. A coefficient greater than 1 indicates exponential growth, the type that describes wildly successful Internet campaigns like the Old Spice Guy:
The virality coefficient is super important, but there's one other critical number that you should pay attention to--the cycle time. The cycle time is the average time it takes from the moment that one of your users performs a viral action to the moment that a new user signs up because of this very action. It makes a huge difference if your cycle time is one day or 60 days.
David Skok of Matrix Ventures gave a presentation about that recently, and actually devised a formula to calculate the amount of users you will get after a period of time based on the Virality Coefficient (K) and the Cycle Time (ct).
Having virality expressed in this way is beneficial as it boils down virality to the optimization of two variables: maximize K and minimize ct.
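The two-variable model described above can be sketched in a few lines. This is a minimal illustration of the geometric-series growth Skok's formula implies, not his exact code; the function name, seed audience, and default values here are hypothetical:

```python
def users_after(t_days, u0=10.0, k=1.2, ct=5.0):
    """Total users after t_days under a simple viral-growth model:
    every cycle of length ct, each user 'infects' k new users, so
    users(t) = u0 * (k**(t/ct + 1) - 1) / (k - 1)."""
    n = t_days / ct   # number of viral cycles completed so far
    if k == 1.0:      # limit case: each cycle adds u0 users (linear growth)
        return u0 * (n + 1)
    return u0 * (k ** (n + 1) - 1) / (k - 1)
```

With these illustrative numbers, a coefficient above 1 compounds (`users_after(20, k=1.2, ct=5.0)` gives roughly 74 users, while halving the cycle time to `ct=2.0` over the same 20 days gives roughly 321), whereas a coefficient below 1 plateaus near `u0 / (1 - k)` no matter how long you wait, which is why the article treats maximizing K and minimizing ct as the whole optimization problem.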
The problem with Shmilovic's analysis is that it assumes virality is a structural property that can be optimized or reduced to a consistent formula. His recommendations, designed for marketers, are based on creating systems that maximize the space for sharing, differentiated by little marketing buzzwords like "communication virality" ("the product is used to communicate with other people, some of which might be potential users") or "embeddable virality" ("new people who are exposed to the content embed it on their own website, promoting it even further").
The emphasis on structural factors isn't inherently a bad thing: advancements in technology (particularly in communications) have radically transformed the speed and scope of viral products. The Economist's recent exploration of how Martin Luther's Ninety-Five Theses on the Power and Efficacy of Indulgences went viral across the continent through contemporary media -- namely the printing press and multiple translations into the various dialects that permeated 16th-century Europe -- is a perfect (and fascinating) example. In the social space, the prevalence and placement of tools like the Facebook "like" button can certainly be the determining factors of whether a compelling article reaches that tipping point in Shmilovic's Virality Coefficient. The Huffington Post is the ideal model here: the site amplifies its power as a clearinghouse for all things Internet-famous by deeply integrating every conceivable social network and sharing tool into its article pages. When it comes to the promulgation of ideas, infrastructure matters.
But festooning a page with strings of shiny share buttons (Digg! Mixx! Bookmerken! Dipdive!) is a wholly incomplete approach to the spread of information; it assumes that all social behavior and all social networks or online communities are essentially the same. Yet the human mind isn't a uniform filter, and sharing behavior differs across ubiquitous platforms like Google, Twitter, and Facebook. "Nobody can see what you search on Google, so popular search trends tend to reflect the more reptilian brain in people," explained Jonah Peretti, founder of viral hub BuzzFeed, in 2010. "Celebrity gossip, sex, hair transplants ... nobody tweets about this stuff." A brief glance at the most-shared stories of 2011 on Facebook, Twitter, and LinkedIn highlights their differences in focus.
Obviously, the culture of each online ecosystem is shaped by its particular structure, but structure has more to do with the how and where of sharing; in reality, it is the why that shapes how ideas take hold. Geert Hofstede, the influential Dutch social psychologist, anthropologist, and pioneer in the field of cross-cultural studies, has a succinct take on the role of technology in shaping the spread of ideas and information in his classic work Culture's Consequences. "Electronic communication does not eliminate cultural differences, just as faster and easier travel has not reduced cultural rifts," wrote Hofstede. "The software of the machines may be globalized, but the software of the minds that use the terminals is not":
Electronic communication enormously increases the amount of information accessible for its users, but it does not increase their capacity to absorb this information or change their preexisting value systems. Users have to select what information they recognize; this has always been the case, only the selection task has become much larger. We select our information according to our values. Like our parents, we read newspapers that we expect to give our preferred points of view, and, confronted with the new bulk of electronic information, we again pick whatever reinforces our preexisting ideas. Our relatively brief experience with the Internet so far has shown that people use it to do what they were doing anyway, only maybe more and faster.
People don't engage the unique structure of social networks as blank slates; they enter each ecosystem with a particular set of values, values that shape the nature of a community and, in turn, the type of ideas and products that take hold. As Alexis Madrigal noted, different networks fill the various social niches in our lives. This is a valuable lesson not just for marketers and media companies, but for any person or organization looking to spread a set of ideas or concepts across the vastness of the Web.
Erving Goffman's analogy of social life to the theater from The Presentation of Self in Everyday Life comes to mind. Goffman argued that the social actor has the ability to choose his stage and props, as well as the costume he would wear in front of a specific audience. On the Internet, we function on many different stages, with a wardrobe bursting with meticulously crafted costumes.