Steve Jobs didn't change the world by playing nice
When filmmaker Stanley Kubrick died, the steely perfectionist who ground actors into submission died with him. Kubrick was a good man -- Matthew Modine once described him as "probably the most heartfelt person I ever met" -- but by all accounts, his shoots were crucibles to which the faint of heart need not apply. When he walked onto a set, Stanley Kubrick would get exactly what he wanted, and he would pursue that vision without mercy. Upon his death, however, only a mythical Saint Stanley remained, a slightly taller Yoda with a slightly better complexion.
Part of this can be explained by decorum. No one wants to speak ill of the dead, and it's hard to casually reconcile the loving father and husband with the man who verbally flayed Shelley Duvall until her frail character in The Shining seemed Byronic in comparison. Still, to revise the methods of such a genius is to diminish exactly what made his genius work. A Clockwork Orange didn't happen by accident. Stanley Kubrick made it happen, and though anyone could direct a Kubrick script, only the man himself could make a Kubrick film.
Last year a former Apple employee related his favorite Steve Jobs story to me. I have no way of knowing if it is true, so take it for what it's worth. I think it nicely captures the man who changed the world four times over. When engineers working on the very first iPod completed the prototype, they presented their work to Steve Jobs for his approval. Jobs played with the device, scrutinized it, weighed it in his hands, and promptly rejected it. It was too big.
The engineers explained that they had to reinvent inventing to create the iPod, and that it was simply impossible to make it any smaller. Jobs was quiet for a moment. Finally he stood, walked over to an aquarium, and dropped the iPod in the tank. After it touched bottom, bubbles floated to the top.
"Those are air bubbles," he snapped. "That means there's space in there. Make it smaller."
Steve Jobs was a genius, and one of the most important businessmen and inventors of our time. But he was not a kindly, soft-spoken sage who might otherwise live atop a mountain in India, dispatching wisdom to pilgrims. He was a taskmaster who knew how to get things done. "Real artists ship" was an Apple battle cry from the earliest days. Everyone, by now, knows about the Steve Jobs "reality distortion field" -- the charismatic Care Bear Stare that compels otherwise reasonable people to spend weeks in line for a slightly faster telephone. In his biography of Jobs, journalist Alan Deutschman described the Apple co-founder's lesser-known hero-shithead roller coaster. "He could be Good Steve or he could be Bad Steve. When he was Bad Steve, he didn't seem to care about the severe damage he caused to egos or emotions so long as he pushed for greatness." When confronted with the full "terrifying" wrath of Bad Steve (even over the slightest of details), the brains at Apple would push themselves beyond all personal limits to find a way to meet Jobs's exacting demands, and somehow return to his good graces. And the process would repeat itself. "Steve was willing to be loved or feared, whatever worked," explained Bud Tribble, Vice President of Software Technology at Apple. "It let the engineers know that it wasn't OK to be sloppy in anything they did, even the 99 percent that Steve would never look at."
That attention to detail is what makes Apple products unique and desirable. Does any other company produce ubiquitous, mass-market devices that still feel so rare and deeply personal? Steve Jobs did that.
His life was too short, but never wasted, and his impact reaches even those who've never touched an Apple product. He ushered in the personal computing era, and rallied from pancreatic cancer to show us a glimpse of the post-PC world. That didn't just happen; it was made to happen.
When Apple announced his resignation in August, the canonization began. Barrels of ink recounted all of the carrot and none of the stick. With the announcement of his death, coverage and conversations continue along those lines. That's to be expected, and, as with Kubrick, it's set to become conventional wisdom. Steve Jobs was a good man who loved and was loved, and he earned every accolade he garnered. But he doesn't deserve a hagiography, and I doubt he would have wanted one. Apple wasn't built by a saint. It was built by an iron-fisted visionary. There are a lot of geniuses in the world, and a lot of aesthetes. But that's not enough. Sometimes it takes Bad Steve to bring products to market. Real artists ship.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
Places like St. Louis and New York City were once similarly prosperous. Then, 30 years ago, the United States turned its back on the policies that had been encouraging parity.
Despite all the attention focused these days on the fortunes of the “1 percent,” debates over inequality still tend to ignore one of its most politically destabilizing and economically destructive forms. This is the growing, and historically unprecedented, economic divide that has emerged in recent decades among the different regions of the United States.
Until the early 1980s, a long-running feature of American history was the gradual convergence of income across regions. The trend goes back to at least the 1840s, but grew particularly strong during the middle decades of the 20th century. This was, in part, a result of the South catching up with the North in its economic development. As late as 1940, per-capita income in Mississippi, for example, was still less than one-quarter that of Connecticut. Over the next 40 years, Mississippians saw their incomes rise much faster than did residents of Connecticut, until by 1980 the gap in income had shrunk to 58 percent.
Live in anticipation, gathering stories and memories. New research builds on the vogue mantra of behavioral economics.
Forty-seven percent of the time, the average mind is wandering. It wanders about a third of the time while a person is reading, talking with other people, or taking care of children. It even wanders 10 percent of the time during sex. And that wandering, according to psychologist Matthew Killingsworth, is not good for well-being. A mind belongs in one place. During his training at Harvard, Killingsworth compiled those numbers and built a scientific case for every cliché about living in the moment. In a 2010 Science paper, he and psychology professor Daniel Gilbert wrote that "a wandering mind is an unhappy mind."
For Killingsworth, happiness is in the content of moment-to-moment experiences. Nothing material is intrinsically valuable, except in whatever promise of happiness it carries. Satisfaction in owning a thing does not have to come during the moment it's acquired, of course. It can come as anticipation or nostalgic longing. Overall, though, the achievement of the human brain to contemplate events past and future at great, tedious length has, these psychologists believe, come at the expense of happiness. Minds tend to wander to dark, not whimsical, places. Unless that mind has something exciting to anticipate or sweet to remember.
Why are so many kids with bright prospects killing themselves in Palo Alto?
The air shrieks, and life stops. First, from far away, comes a high whine like angry insects swarming, and then a trampling, like a herd moving through. The kids on their bikes who pass by the Caltrain crossing are eager to get home from school, but they know the drill. Brake. Wait for the train to pass. Five cars, double-decker, tearing past at 50 miles an hour. Too fast to see the faces of the Silicon Valley commuters on board, only a long silver thing with black teeth. A Caltrain coming into a station slows, invites you in. But a Caltrain at a crossing registers more like an ambulance, warning you fiercely out of its way.
The kids wait until the passing train forces a gust you can feel on your skin. The alarms ring and the red lights flash for a few seconds more, just in case. Then the gate lifts up, signaling that it’s safe to cross. All at once life revives: a rush of bikes, skateboards, helmets, backpacks, basketball shorts, boisterous conversation. “Ew, how old is that gum?” “The quiz is next week, dipshit.” On the road, a minivan makes a left a little too fast—nothing ominous, just a mom late for pickup. The air is again still, like it usually is in spring in Palo Alto. A woodpecker does its work nearby. A bee goes in search of jasmine, stinging no one.
As the public’s fear and loathing surge, the frontrunner’s durable candidacy has taken a dark turn.
MYRTLE BEACH, South Carolina—All politicians, if they are any good at their craft, know the truth about human nature.
Donald Trump is very good, and he knows it better than most.
Trump stands alone on a long platform, surrounded by a rapturous throng. Below and behind him—sitting on bleachers and standing on the floor—they fill this city’s cavernous, yellow-beige convention center by the thousands. As Trump will shortly point out, there are a lot of other Republican presidential candidates, but none of them get crowds anything like this.
Trump raises an orange-pink hand like a waiter holding a tray. “They are not coming in from Syria,” he says. “We’re sending them back!” The crowd surges, whistles, cheers. “So many bad things are happening—they have sections of Paris where the police are afraid to go,” he continues. “Look at Belgium, the whole place is closed down! We can’t let it happen here, folks.”
A Chicago cop now faces murder charges—but will anyone hold his colleagues, his superiors, and elected officials accountable for their failures?
Thanks to clear video evidence, Chicago police officer Jason Van Dyke was charged this week with first-degree murder for shooting 17-year-old Laquan McDonald. Nevertheless, thousands of people took to the city’s streets on Friday in protest. And that is as it should be.
The needlessness of the killing, captured in the dash-cam footage, is clear and unambiguous.
Yet that dash-cam footage was suppressed for more than a year by authorities citing an investigation. “There was no mystery, no dead-end leads to pursue, no ambiguity about who fired the shots,” Eric Zorn wrote in The Chicago Tribune. “Who was pursuing justice and the truth? What were they doing? Who were they talking to? With whom were they meeting? What were they trying to figure out for 400 days?”
The disturbing implications of a long-standing expectation
NPR reporter Shereen Marisol Meraji recently dropped in on a professional-etiquette class for teens to see what they made of traditional chivalry. “I can open my own door. I don’t see the point,” 18-year-old Chiamaka Njoku told her. “Most of these doors are automatic anyway.”
But the young woman took a less progressive stance on the topic of money: “If a man wants to pay for the whole meal, I would not stop him,” she said. Why, as other sexist institutions gradually dissolve, does this one stubbornly hang on?
A survey released yesterday morning found that about 77 percent of people in straight relationships believe men should pay the bill on a first date. The survey, put together by the financial website NerdWallet, polled roughly 1,000 people who had been dating their partners for six months or more.
Better-informed consumers are ditching the bowls of sugar that were once a triumph of 20th-century marketing.
Last year, General Mills launched a new product aimed at health-conscious customers: Cheerios Protein, a version of its popular cereal made with whole-grain oats and lentils. Early reviews were favorable. The cereal, Huffington Post reported, tasted mostly like regular Cheerios, although “it seemed like they were sweetened and flavored a little more aggressively.” Meanwhile, ads boasted that the cereal would offer “long-lasting energy” as opposed to a sugar crash.
But earlier this month, the Center for Science in the Public Interest sued General Mills, saying that there’s very little extra protein in Cheerios Protein compared to the original cereal and an awful lot more sugar—17 times as much, in fact. So why would General Mills try to market a product as containing protein when it’s really a box full of carbs and refined sugar?
American education is largely limited to lessons about the West.
When I turned 15, my parents sent me alone on a one-month trip to Ecuador, the country where my father was born. This was tradition in our family—for my parents to send their first-generation American kids to the country of their heritage, where we would meet our extended family, immerse ourselves in a different culture, and learn some lessons on gratefulness.
My family’s plan worked. That month in Ecuador did more for my character, education, and sense of identity than any other experience in my early life. And five years later, my experience in Ecuador inspired me to spend more time abroad, studying in South Africa at the University of Cape Town. These two trips not only made me a lifelong traveler, but also a person who believes traveling to developing countries should be a necessary rite of passage for every young American who has the means.
Bill Gates has committed his fortune to moving the world beyond fossil fuels and mitigating climate change.
In his offices overlooking Lake Washington, just east of Seattle, Bill Gates grabbed a legal pad recently and began covering it in his left-handed scrawl. He scribbled arrows by each margin of the pad, both pointing inward. The arrow near the left margin, he said, represented how governments worldwide could stimulate ingenuity to combat climate change by dramatically increasing spending on research and development. “The push is the R&D,” he said, before indicating the arrow on the right. “The pull is the carbon tax.” Between the arrows he sketched boxes to represent areas, such as deployment of new technology, where, he argued, private investors should foot the bill. He has pledged to commit $2 billion himself.