Steve Jobs didn't change the world by playing nice
When filmmaker Stanley Kubrick died, the steely perfectionist who ground actors into submission died with him. Kubrick was a good man -- Matthew Modine once described him as "probably the most heartfelt person I ever met" -- but by all accounts, his shoots were crucibles for which the faint of heart need not apply. When he walked onto a set, Stanley Kubrick would get exactly what he wanted, and he would exact this vision without mercy. Upon his death, however, only a mythical Saint Stanley remained, a slightly taller Yoda with a slightly better complexion.
Part of this can be explained by decorum. No one wants to speak ill of the dead, and it's hard to casually reconcile the loving father and husband with the man who verbally flayed Shelley Duvall until her frail character in The Shining seemed Byronic in comparison. Still, to revise the methods of such a genius is to diminish exactly what made his genius work. A Clockwork Orange didn't happen by accident. Stanley Kubrick made it happen, and though anyone could direct a Kubrick script, only the man himself could make a Kubrick film.
Last year a former Apple employee related his favorite Steve Jobs story to me. I have no way of knowing if it is true, so take it for what it's worth. I think it nicely captures the man who changed the world four times over. When engineers working on the very first iPod completed the prototype, they presented their work to Steve Jobs for his approval. Jobs played with the device, scrutinized it, weighed it in his hands, and promptly rejected it. It was too big.
The engineers explained that they had to reinvent inventing to create the iPod, and that it was simply impossible to make it any smaller. Jobs was quiet for a moment. Finally he stood, walked over to an aquarium, and dropped the iPod in the tank. After it touched bottom, bubbles floated to the top.
"Those are air bubbles," he snapped. "That means there's space in there. Make it smaller."
Steve Jobs was a genius, and one of the most important businessmen and inventors of our time. But he was not a kindly, soft-spoken sage who might otherwise live atop a mountain in India, dispatching wisdom to pilgrims. He was a taskmaster who knew how to get things done. "Real artists ship" was an Apple battle cry from the earliest days. Everyone, by now, knows about the Steve Jobs "reality distortion field" -- the charismatic Care Bear Stare that compels otherwise reasonable people to spend weeks in line for a slightly faster telephone. In his biography of Jobs, journalist Alan Deutschman described the Apple co-founder's lesser-known hero-shithead roller coaster. "He could be Good Steve or he could be Bad Steve. When he was Bad Steve, he didn't seem to care about the severe damage he caused to egos or emotions so long as he pushed for greatness." When confronted with the full "terrifying" wrath of Bad Steve (even over the slightest of details), the brains at Apple would push themselves beyond all personal limits to find a way to meet Jobs's exacting demands, and somehow return to his good graces. And the process would repeat itself. "Steve was willing to be loved or feared, whatever worked," explained Bud Tribble, Apple's Vice President of Software Technology. "It let the engineers know that it wasn't OK to be sloppy in anything they did, even the 99 percent that Steve would never look at."
That attention to detail makes Apple products unique and desired. Does any other company produce ubiquitous, mass-market devices that still feel so rare and so deeply personal? Steve Jobs did that.
His life was too short, but never wasted, and his impact reaches even those who've never touched an Apple product. He ushered in the personal computing era, and rallied from pancreatic cancer to show us a glimpse of the post-PC world. That didn't just happen; it was made to happen.
When Apple announced his resignation in August, the canonization began. Barrels of ink recounted all of the carrot and none of the stick. With the announcement of his death, coverage and conversations continue along those lines. That's to be expected, and, as with Kubrick, it's set to become conventional wisdom. Steve Jobs was a good man who loved and was loved, and earned every accolade he's garnered. But he doesn't deserve a hagiography, and I doubt he would have wanted one. Apple wasn't built by a saint. It was built by an iron-fisted visionary. There are a lot of geniuses in the world, and a lot of aesthetes. But that's not enough. Sometimes it takes Bad Steve to bring products to market. Real artists ship.
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
The drug modafinil was recently found to enhance cognition in healthy people. Should you take it to get a raise?
If you could take a pill that will make you better at your job, with few or no negative consequences, would you do it?
In a meta-analysis recently published in European Neuropsychopharmacology, researchers from the University of Oxford and Harvard Medical School concluded that a drug called modafinil, which is typically used to treat sleep disorders, is a cognitive enhancer. Essentially, it can help normal people think better.
Out of all cognitive processes, modafinil was found to improve decision-making and planning the most in the 24 studies the authors reviewed. Some of the studies also showed gains in flexible thinking, combining information, or coping with novelty. The drug didn’t seem to influence creativity either way.
But no tale of posthumous success is quite as spectacular as that of Howard Phillips Lovecraft, the “cosmic horror” writer who died in Providence, Rhode Island, in 1937 at the age of 46. The circumstances of Lovecraft’s final years were as bleak as anyone’s. He ate expired canned food and wrote to a friend, “I was never closer to the bread-line.” He never saw his stories collectively published in book form, and, before succumbing to intestinal cancer, he wrote, “I have no illusions concerning the precarious status of my tales, and do not expect to become a serious competitor of my favorite weird authors.” Among the last words the author uttered were, “Sometimes the pain is unbearable.” His obituary in the Providence Evening Bulletin was “full of errors large and small,” according to his biographer.
As the vice president edges toward a presidential run, is he banking on further public disclosures to discredit the frontrunner?
As Joe Biden edges closer to a presidential run, there’s no shortage of theories as to what he’s up to. Former secretary of state Hillary Clinton has built a commanding lead in the national polls, giving Biden little apparent space to gain traction. Perhaps he’s counting on the early-primary state of South Carolina to provide a critical boost. He might be banking on appearing as a stronger general-election candidate than any of his potential rivals in the primary race. Maybe after spending the past 42 years of his life running for elective office, he just can’t stop.
But there’s one intriguing theory that has so far garnered little attention: What if Biden knows something about Democratic frontrunner Hillary Clinton that the rest of us don’t?
It is not too late to strengthen the Iran deal, a prominent critic says.
It appears likely, as of this writing, that Barack Obama will be victorious in his fight to implement the Iran nuclear deal negotiated by his secretary of state, John Kerry. Republicans in Congress don’t appear to have the votes necessary to void the agreement, and Benjamin Netanyahu’s campaign to subvert Obama may be remembered as one of the more counterproductive and shortsighted acts of an Israeli prime minister since the rebirth of the Jewish state 67 years ago.
Things could change, of course, and the Iranian regime, which is populated in good part by extremists, fundamentalist theocrats, and supporters of terrorism, could do something monumentally stupid in the coming weeks that could force on-the-fence Democrats to side with their Republican adversaries (remember the Café Milano fiasco, anyone?). But, generally speaking, the Obama administration and its European allies seem to have a clearer path to implementation than they had at the beginning of the month.
A new study shows that the field suffers from a reproducibility problem, but the extent of the issue is still hard to nail down.
No one is entirely clear on how Brian Nosek pulled it off, including Nosek himself. Over the last three years, the psychologist from the University of Virginia persuaded some 270 of his peers to channel their free time into repeating 100 published psychological experiments to see if they could get the same results a second time around. There would be no glory, no empirical eurekas, no breaking of fresh ground. Instead, this initiative—the Reproducibility Project—would be the first big systematic attempt to answer a question that has been vexing psychologists for years, if not decades: What proportion of results in their field are reliable?
In 1998, Toni Morrison wrote a comment for The New Yorker arguing that “white skin notwithstanding, this is our first black President. Blacker than any actual black person who could ever be elected in our children’s lifetime.” Last week the New York Times implicitly cited Morrison’s piece and claimed the author was giving Clinton “a compliment.” This interpretation of Morrison’s claim is as common as it is erroneous.
The popular interpretation of Morrison’s point holds that, summoning all of her powers, the writer gazed into the very essence of Clinton, and found him sufficiently soulful. In fact, Morrison’s point had little to do with soul of any kind. She was not much concerned with Clinton’s knowledge of Ebonics, his style of handshake, or whether he pledged Alpha or Q. Morrison was concerned with power.
A new study finds an algorithmic word analysis is flawless at determining whether a person will have a psychotic episode.
Although the language of thinking is deliberate—let me think, I have to do some thinking—the actual experience of having thoughts is often passive. Ideas pop up like dandelions; thoughts occur suddenly and escape without warning. People swim in and out of pools of thought in a way that can feel, paradoxically, mindless.
Most of the time, people don’t actively track the way one thought flows into the next. But in psychiatry, much attention is paid to such intricacies of thinking. For instance, disorganized thought, evidenced by disjointed patterns in speech, is considered a hallmark characteristic of schizophrenia. Several studies of at-risk youths have found that doctors are able to guess with impressive accuracy—the best predictive models hover around 79 percent—whether a person will develop psychosis based on tracking that person’s speech patterns in interviews.
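The “disjointedness” that clinicians listen for can be roughly approximated in software. As a purely illustrative sketch—not the method used in any of the studies mentioned above—one common proxy is semantic coherence: how similar each sentence in a transcript is to the one before it, so that abrupt topic jumps show up as low scores. The snippet below uses a simple bag-of-words cosine similarity; research systems rely on far richer semantic models, but the shape of the idea is the same.

```python
# Illustrative sketch only: score sentence-to-sentence "coherence" in a
# transcript using bag-of-words cosine similarity. This is not the algorithm
# from the study described in the article; it only demonstrates the concept.
import math
import re
from collections import Counter


def bag_of_words(sentence: str) -> Counter:
    """Lowercase word counts for one sentence."""
    return Counter(re.findall(r"[a-z']+", sentence.lower()))


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two word-count vectors (0 = no overlap)."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


def coherence_scores(transcript: str) -> list[float]:
    """Similarity of each sentence to the previous one; low values mark topic jumps."""
    sentences = [s for s in re.split(r"[.!?]+", transcript) if s.strip()]
    vectors = [bag_of_words(s) for s in sentences]
    return [cosine(vectors[i - 1], vectors[i]) for i in range(1, len(vectors))]


if __name__ == "__main__":
    sample = (
        "I took the bus to see my sister. The bus was late again. "
        "My sister made dinner. The radio is telling me about the war inside the walls."
    )
    print(coherence_scores(sample))
```

Run on a transcript, the score for the final sentence in the sample drops sharply relative to the earlier pairs, which is the kind of signal a coherence-based analysis would flag.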