Steve Jobs didn't change the world by playing nice
When filmmaker Stanley Kubrick died, the steely perfectionist who ground actors into submission died with him. Kubrick was a good man -- Matthew Modine once described him as "probably the most heartfelt person I ever met" -- but by all accounts, his shoots were crucibles for which the faint of heart need not apply. When he walked onto a set, Stanley Kubrick would get exactly what he wanted, and he would exact this vision without mercy. Upon his death, however, only a mythical Saint Stanley remained, a slightly taller Yoda with a slightly better complexion.
Part of this can be explained by decorum. No one wants to speak ill of the dead, and it's hard to casually reconcile the loving father and husband with the man who verbally flayed Shelley Duvall until her frail character in The Shining seemed Byronic in comparison. Still, to revise the methods of such a genius is to diminish exactly what made his genius work. A Clockwork Orange didn't happen by accident. Stanley Kubrick made it happen, and though anyone could direct a Kubrick script, only the man himself could make a Kubrick film.
Last year a former Apple employee related his favorite Steve Jobs story to me. I have no way of knowing if it is true, so take it for what it's worth. I think it nicely captures the man who changed the world four times over. When engineers working on the very first iPod completed the prototype, they presented their work to Steve Jobs for his approval. Jobs played with the device, scrutinized it, weighed it in his hands, and promptly rejected it. It was too big.
The engineers explained that they had to reinvent inventing to create the iPod, and that it was simply impossible to make it any smaller. Jobs was quiet for a moment. Finally he stood, walked over to an aquarium, and dropped the iPod in the tank. After it touched bottom, bubbles floated to the top.
"Those are air bubbles," he snapped. "That means there's space in there. Make it smaller."
Steve Jobs was a genius, and one of the most important businessmen and inventors of our time. But he was not a kindly, soft-spoken sage who might otherwise live atop a mountain in India, dispatching wisdom to pilgrims. He was a taskmaster who knew how to get things done. "Real artists ship" was an Apple battle cry from the earliest days. Everyone, by now, knows about the Steve Jobs "reality distortion field" -- the charismatic Care Bear Stare that compels otherwise reasonable people to spend weeks in line for a slightly faster telephone. In his biography of Jobs, journalist Alan Deutschman described the Apple co-founder's lesser-known hero-shithead roller coaster. "He could be Good Steve or he could be Bad Steve. When he was Bad Steve, he didn't seem to care about the severe damage he caused to egos or emotions so long as he pushed for greatness." When confronted with the full "terrifying" wrath of Bad Steve (even over the slightest of details), the brains at Apple would push themselves beyond all personal limits to find a way to meet Jobs's exacting demands, and somehow return to his good graces. And the process would repeat itself. "Steve was willing to be loved or feared, whatever worked," as Bud Tribble, Vice President of Software Technology at Apple, explained. "It let the engineers know that it wasn't OK to be sloppy in anything they did, even the 99 percent that Steve would never look at."
That attention to detail makes Apple products unique and desired. Does any other company produce ubiquitous, mass-market devices that still feel so rare and so deeply personal? Steve Jobs did that.
His life was too short, but never wasted, and his impact reaches even those who've never touched an Apple product. He ushered in the personal computing era, and rallied from pancreatic cancer to show us a glimpse of the post-PC world. That didn't just happen; it was made to happen.
When Apple announced his resignation in August, the canonization began. Barrels of ink recounted all of the carrot and none of the stick. With the announcement of his death, coverage and conversations continue along those lines. That's to be expected, and, as with Kubrick, it's set to become conventional wisdom. Steve Jobs was a good man who loved and was loved, and earned every accolade he's garnered. But he doesn't deserve a hagiography, and I doubt he would have wanted one. Apple wasn't built by a saint. It was built by an iron-fisted visionary. There are a lot of geniuses in the world, and a lot of aesthetes. But that's not enough. Sometimes it takes Bad Steve to bring products to market. Real artists ship.
Hillary Clinton’s realistic attitude is the only thing that can effect change in today’s political climate.
Bernie Sanders and Ted Cruz have something in common. Both have an electoral strategy predicated on the ability of a purist candidate to revolutionize the electorate—bringing droves of chronic non-voters to the polls because at last they have a choice, not an echo—and along the way transforming the political system. Sanders can point to his large crowds and impressive, even astonishing, success at tapping into a small-donor base that exceeds, in breadth and depth, the remarkable one built in 2008 by Barack Obama. Cruz points to his extraordinarily sophisticated voter-identification operation, one that certainly seemed to do the trick in Iowa.
But is there any real evidence that there is a hidden “sleeper cell” of potential voters who are waiting for the signal to emerge and transform the electorate? No. Small-donor contributions are meaningful and a sign of underlying enthusiasm among a slice of the electorate, but they represent a tiny sliver even of that slice; Ron Paul’s success at fundraising (and his big crowds at rallies) misled many analysts into believing that he would make a strong showing in Republican primaries when he ran for president. He flopped.
The new Daily Show host, Trevor Noah, is smooth and charming, but he hasn’t found his edge.
It’s a psychic law of the American workplace: By the time you give your notice, you’ve already left. You’ve checked out, and for the days or weeks that remain, a kind of placeholder-you, a you-cipher, will be doing your job. It’s a law that applies equally to dog walkers, accountants, and spoof TV anchormen. Jon Stewart announced that he was quitting The Daily Show in February 2015, but he stuck around until early August, and those last months had a restless, frazzled, long-lingering feel. A smell of ashes was in the air. The host himself suddenly looked quite old: beaky, pique-y, hollow-cheeky. For 16 years he had shaken his bells, jumped and jangled in his little host’s chair, the only man on TV who could caper while sitting behind a desk. Flash back to his first episode as the Daily Show host, succeeding Craig Kilborn: January 11, 1999, Stewart with floppy, luscious black hair, twitching in a new suit (“I feel like this is my bar mitzvah … I have a rash like you wouldn’t believe”) while he interviews Michael J. Fox.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
The championship game descends on a city failing to deal with questions of affordability and inclusion.
SAN FRANCISCO—The protest kicked off just a few feet from Super Bowl City, the commercial playground behind security fences on the Embarcadero, where football fans were milling about drinking beer, noshing on $18 bacon cheeseburgers, and lining up for a ride on a zip line down Market Street.
The protesters held up big green camping tents painted with slogans such as “End the Class War” and “Stop Stealing Our Homes,” and chanted phrases blaming San Francisco Mayor Ed Lee for a whole range of problems, including the catchy “Hey Hey, Mayor Lee, No Penalty for Poverty.” They blocked the sidewalk, battling with tourists, joggers, and city workers, some of whom were trying to wheel their bikes through the crowd to get to the ferries that would take them home.
The country has experienced nursing shortages for decades, but an aging population means the problem is about to get much worse.
Five years ago, my mother was rushed to the hospital for an aneurysm. For the next two weeks, my family and I sat huddled around her bed in the intensive-care unit, oscillating between panic, fear, uncertainty, and exhaustion.
It was nurses who got us through that time with our sanity intact. Nurses checked on my mother—and us—multiple times an hour. They ran tests, updated charts, and changed IVs; they made us laugh, allayed our concerns, and thought about our comfort. The doctors came in every now and then, but the calm dedication of the nurses was what kept us together. Without them, we would have fallen apart.
Which is just one reason why the prospect of a national nursing shortage is so alarming. The U.S. has been dealing with a nursing deficit of varying degrees for decades, but today—due to an aging population, the rising incidence of chronic disease, an aging nursing workforce, and the limited capacity of nursing schools—this shortage is on the cusp of becoming a crisis, one with worrying implications for patients and health-care providers alike.
What happened when 11 exiles armed themselves for a violent night in the Gambia
In the dark hours of the morning on December 30, 2014, eight men gathered in a graveyard a mile down the road from the official residence of Yahya Jammeh, the president of the Gambia. The State House overlooks the Atlantic Ocean from the capital city of Banjul, on an island at the mouth of the Gambia River. It was built in the 1820s and served as the governor’s mansion through the end of British colonialism, in 1965. Trees and high walls separate the house from the road, obscuring any light inside.
The men were dressed in boots and dark pants, and as two of them stood guard, the rest donned Kevlar helmets and leather gloves, strapped on body armor and CamelBaks, and loaded their guns. Their plan was to storm the presidential compound, win over the military, and install their own civilian leader. They hoped to gain control of the country by New Year’s Day.
U.S. presidential candidates are steering the country toward a terror trap.
For close to a decade, the trauma of the Iraq War left Americans wary of launching new wars in the Middle East. That caution is largely gone. Most of the leading presidential candidates demand that the United States escalate its air war in Iraq and Syria, send additional Special Forces, or enforce a buffer zone, which the head of Central Command, General Lloyd Austin, has said would require deploying U.S. ground troops. Most Americans now favor doing just that.
The primary justification for this new hawkishness is stopping the Islamic State, or ISIS, from striking the United States. Which is ironic, because at least in the short term, America’s intervention will likely spark more terrorism against the United States, thus fueling demands for yet greater military action. After a period of relative restraint, the United States is heading back into the terror trap.
Overly persistent pursuit is a staple of movie love stories, but a new study shows that it could normalize some troubling behaviors.
Romantic comedies are supposed to be escapist—a jaunt into a better, more colorful world where journalists can afford giant New York apartments and no obstacle to love is too great to overcome.
Except that when you think about it, some of the behavior portrayed as romantic in these movies is, objectively, creepy. The Love Actually sign guy was totally out of line, and honestly, Lloyd Dobler from Say Anything was pushing it with his famous boombox. Even the supposedly “pure” love of cute baby-faced Joseph Gordon-Levitt as Cameron in 10 Things I Hate About You involves teaching himself just enough French that he can pose as a tutor and hang out with his beloved. Oh, and hiring a guy to go out with her sister.
I coined the term—now I’ve come back to fix what I started.
O reader, hear my plea: I am the victim of semantic drift.
Four months ago, I coined the term “Berniebro” to describe a phenomenon I saw on Facebook: Men, mostly my age, mostly of my background, mostly with my political beliefs, were hectoring their friends about how great Bernie was even when their friends wanted to do something else, like talk about the NBA.
In the post, I tried to gently suggest that maybe there were other ways to advance Sanders’s beliefs, many of which I share. I hinted, too, that I was not talking about every Sanders supporter. I did this subtly, by writing: “The Berniebro is not every Sanders supporter.”
Then, 28,000 people shared the story on Facebook. The Berniebro was alive! Immediately, I started getting emails: Why did I hate progressivism? Why did I joke about politics? And how dare I generalize about every Bernie Sanders supporter?
Bernie Sanders doggedly pursued his one big idea about reforming American politics, while Hillary Clinton detailed her many proposals for change.
With the New Hampshire primaries just days away, Democrats Hillary Clinton and Bernie Sanders met on a debate stage in Durham on Thursday. In their first one-on-one matchup, the duo seemed determined to illustrate Archilochus’s classic binary between the fox, who knows many things, and the hedgehog, who knows one important thing. Sanders knows that what the country needs—the only thing it needs—is a political and economic revolution. Clinton knows the country needs progressive policies on a range of matters and a pragmatic, realistic strategy to implement them.
That divide was clear from their opening statements, with Sanders immediately jumping to his familiar mantra about a rigged economy and a corrupt campaign-finance scheme. Clinton’s answer was not so laser focused, discussing a general need for the nation to “live up to our values in the 21st century,” and checking off not just the economy, but racism, sexism, and more. This split is not new, of course, but with Martin O’Malley off the stage and out of the race, and the Democratic contest tighter than ever, the division has never been so clear. It led to an unusually interesting debate, with the two candidates frequently addressing each other directly and delving into detail.