A famous Oatmeal cartoon showed the cartoonist making a good-faith effort to buy Game of Thrones. He finds that the show is not available on iTunes, Netflix, Amazon, or Hulu. He tries to buy HBO Go, but it's only available as an add-on to a cable package. Finally, the cartoonist gives up trying to pay for the show and pirates it through BitTorrent. This cartoon is probably the best expression yet of the "piracy is a customer service issue" thesis.
In a way, this doesn't make any sense for HBO, which makes its money off subscriptions and would ostensibly welcome an opportunity to sell subscriptions to another market segment. HBO claims that (a) people aren't interested in a la carte HBO Go and (b) the transaction costs are too high to do their own billing, etc. The technical term for these explanations is "bullshit." Cord cutters are a relatively small market segment but a fast-growing one, and I think it unlikely that cable subscriptions will fully rebound when the recession ends, since the issue isn't just price but convenience. Moreover, I see no reason why HBO can't handle billing and other logistics when the Metropolitan Opera and the NFL, not to mention Netflix, have no trouble running their own separately billed streaming video services. Of course there are transaction costs associated with billing, but they can't possibly come anywhere close to the cost of a basic cable package.
And here we get to the real issue. It's not that HBO would like to cut out the middleman and sell to us directly but can't; rather, requiring you to buy basic cable is the whole point. Cable is a total cash cow, and a more flexible business model would mean lower revenues. The reason is that the incumbent cable business model combines the features of bundling (basic cable) and a two-part tariff (premium cable channels) for a perfect storm of price discrimination. For much the same reason that Disneyland could only lose money by selling a la carte tickets to Splash Mountain for $20 without requiring the $80 park admission (which includes access to Main Street, the Jungle Cruise, etc.), cable companies would lose money if you could buy HBO Go for $20 without first buying basic cable (which includes access to ESPN, MTV, etc.). Basically, economic theory (and some reasonable assumptions about the structure of demand) suggests that an a la carte video market could not make as much money as a bundled video market.
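The bundling logic can be made concrete with a toy example in the spirit of the classic Adams-Yellen analysis of commodity bundling (my numbers are invented for illustration, not taken from this post): when households have negatively correlated tastes, a bundle priced near the sum of their valuations can extract more revenue than any a la carte pricing.

```python
# Toy illustration (hypothetical numbers) of why bundling beats a la carte
# pricing when tastes are negatively correlated.

# Each tuple is one household's willingness to pay for
# (a sports channel, a prestige-drama channel): the sports fan and the
# drama fan each value the half of the bundle the other doesn't.
households = [(9, 1), (1, 9)]

def best_single_price(values):
    """Revenue-maximizing uniform price for one product, chosen from the
    buyers' own valuations (a standard monopoly-pricing shortcut)."""
    return max(p * sum(1 for v in values if v >= p) for p in values)

# A la carte: price each channel separately.
ala_carte = sum(best_single_price([h[i] for h in households]) for i in range(2))

# Bundle: price the two channels as a single package.
bundled = best_single_price([sum(h) for h in households])

print(ala_carte)  # 18: price each channel at $9 and sell one copy of each
print(bundled)    # 20: price the bundle at $10 and every household buys
```

The a la carte seller must either price high and lose the low-valuation buyer or price low and leave money on the table; the bundle smooths out the heterogeneity, which is the "structure of demand" assumption the paragraph above leans on.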
So, that's why the cable companies don't want you to buy a la carte HBO Go, but why is that HBO's problem? Let's contrast it with the NFL. The NFL offers standalone access because the credible threat of a streaming business model gives it more leverage in negotiating with the MSOs. In contrast, HBO doesn't want leverage, because most of its sister companies are part of the basic cable ecosystem. (They used to have an actual MSO as a sister company, but they spun off Time Warner Cable in 2009.) Time Warner makes a lot of money from HBO subscriptions, but it makes even more money from carriage fees on CNN, Cartoon Network, and most of the cable networks starting with the letter "T." Unlike HBO (which would do well under an a la carte model), most of these other channels rely more on channel-surfing audiences than cult followings, so they couldn't sell subscriptions on their own and would have to settle for something like a Hulu Plus or Netflix business model, probably with less money per subscriber and far fewer subscribers than they currently get through basic cable. Basically, cord-cutting would help HBO but devastate the rest of the company. For what is a media conglomerate profited if it gain a few hundred thousand a la carte HBO Go subscriptions, and lose its carriage fees and ad revenue? What can a media conglomerate give in exchange for its Turner and WBTVG divisions?
Time Warner more or less acknowledges in their investor report that disruptive innovation could screw them: "Furthermore, advances in technology or changes in competitors' product and service offerings may require the Company to make additional research and development expenditures or offer products or services in a digital format without charge or at a lower price than offered in other formats." This is on the first page of the "risk factors" section of the report, whereas piracy doesn't come up until the third. This order is consistent with my own reading of the industry and with the history of the recorded music industry, the proximate problem of which is not piracy but digital singles.
So basically, we can call this the "HBO has to take one for the team" model. We can get a similar result with a slightly weaker model which doesn't require long-term corporate cross-subsidization but treats HBO as autonomous from the rest of Time Warner. In the short term, HBO itself is highly dependent on cable companies. The target market for a la carte HBO Go would be households with broadband but no cable, or about 5% of all US households. This is dwarfed by the 20% of households that have cable but no broadband. Moreover, although 70% of households have both cable and broadband, most of them aren't familiar with streaming video through set-top devices. So as a rough ballpark, let's say that half of US households have cable but either lack broadband or wouldn't know how to use it with a set-top device (even if they already own a Blu-Ray player or game console with built-in streaming support). This means that the households HBO could appeal to with a la carte HBO Go are one-tenth as numerous as the households they rely on cable companies to reach. And HBO does rely on the cable companies to reach these households through marketing promotions and the like. If HBO figures that angering the cable companies could cost it even a small fraction of these households, then it's better off alienating Matthew Inman and me rather than angering Comcast. The same logic explains Netflix's interest in creating a cable channel and the recent rumors that Hulu will switch to the HBO Go business model.
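The back-of-the-envelope arithmetic above can be laid out explicitly (using the post's own ballpark shares, which are rough estimates rather than measured figures):

```python
# Rough market-size comparison using the post's ballpark household shares.
broadband_only = 0.05  # broadband but no cable: the a la carte HBO Go target
cable_only = 0.20      # cable but no broadband
both = 0.70            # cable and broadband

# The post's rough assumption: about half of all households have cable but
# either lack broadband or wouldn't stream through a set-top device, i.e.
# all of the cable-only group plus roughly 30 points of the "both" group.
cable_dependent = cable_only + 0.30

ratio = cable_dependent / broadband_only
print(round(ratio))  # 10: the cable-reliant audience is ten times the size
                     # of the addressable cord-cutter market
```

The exact split of the "both" group is a guess, but even generous variations leave the cable-dependent audience several times larger than the cord-cutter segment, which is the whole point.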
Of course for the cable companies to punish HBO would require them to forgo their half of HBO subscription revenue. This sounds like cutting off your nose to spite your face but that's not unheard of, especially if doing so deters your face from pissing you off again by flirting with a disruptive business model. We see a similar dynamic with how theatrical exhibitors react whenever movie studios suggest closing the video release window from its current 17 weeks. (Ironically in this scenario it's the cable companies who are the innovators trying to disrupt the stodgy incumbents). For instance last year, Universal floated the idea of experimenting with tightening up the pay-per-view window for Tower Heist. The theaters were livid and threatened to boycott the test film. This despite the fact that the experiment was on ridiculously unappealing terms to the consumer: $60 to watch a mediocre film three weeks after theatrical premiere and that's only if you live in Atlanta or Portland. Ultimately Universal backed down, deciding it was better to keep their old trading partners happy than try to develop new ones.
(By the way, I'm sure you'll agree it's a total coincidence that Universal was bought by a cable company shortly before the Tower Heist incident. Similarly, a total coincidence that this same cable company has a history of playing hardball with internet companies that offer infrastructure for streaming video services that compete with cable TV).
All that is to say, I can understand why HBO Go isn't yet available to cord cutters. Still, let's say that tomorrow HBO starts offering standalone HBO Go subscriptions (as I sincerely hope it does); how would I explain that? I could see this happening if HBO decides that the transition will happen eventually and that it is better to make it while it can still do so on favorable terms. We saw a similar dynamic ten years ago with the recorded music industry, which acceded to a low-price-point digital singles market as it saw its market share eroded, though only moderately so, by piracy. In 2003, when the record labels agreed to participate in iTunes, unit sales were down about 15% from the pre-Napster peak, which wasn't fun but also wasn't catastrophic. Most people were still buying CDs when the record labels agreed to a legal digital singles market that would eventually destroy the CD market. They did so in order to transition consumers to a new model before most of us had fully committed to piracy. It's a lot easier to get someone to buy singles for $1 if they're used to buying CDs for $15 than if they're used to pirating singles for nothing. Similarly, as the number of cord cutters increases, this will become an increasingly attractive market for HBO, not just because it can get these people as customers but because it can keep them from developing the habit of pirating content that isn't promptly made available through legitimate streaming markets. We may not be at that point yet, but I wouldn't be surprised if we reach it before HBO runs out of Ice and Fire novels to adapt.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
When Kenneth Jarecke photographed an Iraqi man burned alive, he thought it would change the way Americans saw the Gulf War. But the media wouldn’t run the picture.
The Iraqi soldier died attempting to pull himself up over the dashboard of his truck. The flames engulfed his vehicle and incinerated his body, turning him to dusty ash and blackened bone. In a photograph taken soon afterward, the soldier’s hand reaches out of the shattered windshield, which frames his face and chest. The colors and textures of his hand and shoulders look like those of the scorched and rusted metal around him. Fire has destroyed most of his features, leaving behind a skeletal face, fixed in a final rictus. He stares without eyes.
On February 28, 1991, Kenneth Jarecke stood in front of the charred man, parked amid the carbonized bodies of his fellow soldiers, and photographed him. At one point, before he died this dramatic mid-retreat death, the soldier had had a name. He’d fought in Saddam Hussein’s army and had a rank and an assignment and a unit. He might have been devoted to the dictator who sent him to occupy Kuwait and fight the Americans. Or he might have been an unlucky young man with no prospects, recruited off the streets of Baghdad.
Heather Armstrong’s Dooce once drew millions of readers. Her blog’s semi-retirement speaks to the challenges of earning money as an individual blogger today.
The success story of Dooce.com was once blogger lore, told and re-told in playgroups and Meetups—anywhere hyper-verbal people with WordPress accounts gathered. “It happened for that Dooce lady,” they would say. “It could happen for your blog, too.”
Dooce has its origin in the late 1990s, when a young lapsed Mormon named Heather Armstrong taught herself HTML and moved to Los Angeles. She got a job in web design and began blogging about her life on her personal site, Dooce.com.
The site’s name evolved out of her friends’ AOL Instant Messenger slang for dude, or its more incredulous cousin, “doooood!” About a year later, Armstrong was fired for writing about her co-workers on the site—an experience that, for a good portion of the aughts, came to be known as “getting dooced.” She eloped with her now ex-husband, Jon, moved to Salt Lake City, and eventually started blogging full time again.
It’s not just Trump: With Ben Carson and Carly Fiorina on the rise, Republicans are loving outsiders and shunning politicians.
For the first time in a long time, Donald Trump isn’t the most interesting story in the 2016 presidential race. That's partly because his dominance in the Republican polls, while still surprising, is no longer novel and increasingly well explored and explained, but it’s also partly because what’s going on with the rest of the GOP field is far more interesting.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
ISIS did not merely blast apart old stones—it attacked the very foundations of pluralistic society.
If the ruins of Palmyra could speak, they would marvel at our shock. After all, they have been sacked before. In their mute and shattered eloquence, they spoke for centuries not only about the cultures that built them but also about the cultures that destroyed them—about the fragility of civilization itself, even when it is incarnated in stone. No designation of sanctity, by God or by UNESCO, suffices to protect the past. The past is helpless. Instead these ruins, all ruins, have had the effect of lifting the past out of history and into time. They carry the spectator away from facts and toward reveries.
In the 18th century, after the publication in London of The Ruins of Palmyra, a pioneering volume of etchings by Robert Wood, who had traveled to the Syrian desert with the rather colorful James Dawkins, a fellow antiquarian and politician, the desolation of Palmyra became a recurring symbol for ephemerality and the vanity of all human endeavors. “It is the natural and common fate of cities,” Wood dryly remarked in one of the essays in his book, “to have their memory longer preserved than their ruins.” Wood’s beautiful and meticulous prints served as inspirations for paintings, and it was in response to one of those paintings that Diderot wrote some famous pages in his great Salons of 1767: “The ideas ruins evoke in me are grand. Everything comes to nothing, everything perishes, everything passes, only the world remains, only time endures. ... Wherever I cast my glance, the objects surrounding me announce death and compel my resignation to what awaits me. What is my ephemeral existence in comparison with that of a rock being worn down, of a valley being formed, of a forest that’s dying, of these deteriorating masses suspended above my head? I see the marble of tombs crumble into powder and I don’t want to die!”
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015 marked a curious footnote in Harry Potter marginalia: According to the series’s elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry’s eldest son, started school at Hogwarts. It’s not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce what amounts to footnote details: that James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry’s godson, apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
Though it wasn’t pretty, Minaj was really teaching a lesson in civility.
Nicki Minaj didn’t, in the end, say much to Miley Cyrus at all. If you only read the comments that lit up the Internet at last night’s MTV Video Music Awards, you might think she was kidding, or got cut off, when she “called out” the former Disney star who was hosting: “And now, back to this bitch that had a lot to say about me the other day in the press. Miley, what’s good?”
To summarize: When Minaj’s “Anaconda” won the award for Best Hip-Hop Video, she took to the stage in a slow shuffle, shook her booty with presenter Rebel Wilson, and then gave an acceptance speech in which she switched vocal personas as amusingly as she does in her best raps—street-preacher-like when telling women “don’t you be out here depending on these little snotty-nosed boys”; sweetness and light when thanking her fans and pastor. Then a wave of nausea seemed to come over her, and she turned her gaze toward Cyrus. To me, the look on her face, not the words that she said, was the news of the night:
The past is beautiful until you’re reminded it’s ugly.
Taylor Swift’s music video for “Wildest Dreams” isn’t about the world as it exists; it’s about the world as seen through the filter of nostalgia and the magic of entertainment. In the song, Swift sings that she wants to live on in an ex’s memory as an idealized image of glamour—“standing in a nice dress, staring at the sunset.” In the video, her character, an actress, falls in love with her already-coupled costar, for whom she’ll live on as an idealized image of glamour—standing in a nice dress, staring at a giant fan that’s making the fabric swirl in the wind.
The setting for the most part is Africa, but, again, the video isn’t about Africa as it exists, but as it’s seen through the filter of nostalgia and the magic of entertainment—a very particular nostalgia and kind of entertainment. Though set in 1950, the video is in the literary and cinematic tradition of white savannah romances, the most important recent incarnation of which might be the 1985 Meryl Streep film Out of Africa, whose story begins in 1913. Its familiarity is part of its appeal, and also part of why it’s now drawing flak for being insensitive. As James Kassaga Arinaitwe and Viviane Rutabingwa write at NPR:
Demonizing processed food may be dooming many to obesity and disease. Could embracing the drive-thru make us all healthier?
Late last year, in a small health-food eatery called Cafe Sprouts in Oberlin, Ohio, I had what may well have been the most wholesome beverage of my life. The friendly server patiently guided me to an apple-blueberry-kale-carrot smoothie-juice combination, which she spent the next several minutes preparing, mostly by shepherding farm-fresh produce into machinery. The result was tasty, but at 300 calories (by my rough calculation) in a 16-ounce cup, it was more than my diet could regularly absorb without consequences, nor was I about to make a habit of $9 shakes, healthy or not.
Inspired by the experience nonetheless, I tried again two months later at L.A.’s Real Food Daily, a popular vegan restaurant near Hollywood. I was initially wary of a low-calorie juice made almost entirely from green vegetables, but the server assured me it was a popular treat. I like to brag that I can eat anything, and I scarf down all sorts of raw vegetables like candy, but I could stomach only about a third of this oddly foamy, bitter concoction. It smelled like lawn clippings and tasted like liquid celery. It goes for $7.95, and I waited 10 minutes for it.