Steve Jobs didn't change the world by playing nice
When filmmaker Stanley Kubrick died, the steely perfectionist who ground actors into submission died with him. Kubrick was a good man -- Matthew Modine once described him as "probably the most heartfelt person I ever met" -- but by all accounts, his shoots were crucibles; the faint of heart need not apply. When he walked onto a set, Stanley Kubrick would get exactly what he wanted, and he would exact this vision without mercy. Upon his death, however, only a mythical Saint Stanley remained, a slightly taller Yoda with a slightly better complexion.
Part of this can be explained by decorum. No one wants to speak ill of the dead, and it's hard to casually reconcile the loving father and husband with the man who verbally flayed Shelley Duvall until her frail character in The Shining seemed Byronic in comparison. Still, to revise the methods of such a genius is to diminish exactly what made his genius work. A Clockwork Orange didn't happen by accident. Stanley Kubrick made it happen, and though anyone could direct a Kubrick script, only the man himself could make a Kubrick film.
Last year a former Apple employee related his favorite Steve Jobs story to me. I have no way of knowing if it is true, so take it for what it's worth. I think it nicely captures the man who changed the world four times over. When engineers working on the very first iPod completed the prototype, they presented their work to Steve Jobs for his approval. Jobs played with the device, scrutinized it, weighed it in his hands, and promptly rejected it. It was too big.
The engineers explained that they had to reinvent inventing to create the iPod, and that it was simply impossible to make it any smaller. Jobs was quiet for a moment. Finally he stood, walked over to an aquarium, and dropped the iPod in the tank. After it touched bottom, bubbles floated to the top.
"Those are air bubbles," he snapped. "That means there's space in there. Make it smaller."
Steve Jobs was a genius, and one of the most important businessmen and inventors of our time. But he was not a kindly, soft-spoken sage who might otherwise live atop a mountain in India, dispatching wisdom to pilgrims. He was a taskmaster who knew how to get things done. "Real artists ship" was an Apple battle cry from the earliest days. Everyone, by now, knows about the Steve Jobs "reality distortion field" -- the charismatic Care Bear Stare that compels otherwise reasonable people to spend weeks in line for a slightly faster telephone. In his biography of Jobs, journalist Alan Deutschman described the Apple co-founder's lesser-known hero-shithead roller coaster. "He could be Good Steve or he could be Bad Steve. When he was Bad Steve, he didn't seem to care about the severe damage he caused to egos or emotions so long as he pushed for greatness." When confronted with the full "terrifying" wrath of Bad Steve (even over the slightest of details), the brains at Apple would push themselves beyond all personal limits to find a way to meet Jobs's exacting demands, and somehow return to his good graces. And the process would repeat itself. "Steve was willing to be loved or feared, whatever worked," explained Bud Tribble, Vice President of Software Technology at Apple. "It let the engineers know that it wasn't OK to be sloppy in anything they did, even the 99 percent that Steve would never look at."
That attention to detail makes Apple products unique and desired. Does any other company produce ubiquitous, mass-market devices that still feel so rare and so deeply personal? Steve Jobs did that.
His life was too short, but never wasted, and his impact reaches even those who've never touched an Apple product. He ushered in the personal computing era, and rallied from pancreatic cancer to show us a glimpse of the post-PC world. That didn't just happen; it was made to happen.
When Apple announced his resignation in August, the canonization began. Barrels of ink recounted all of the carrot and none of the stick. With the announcement of his death, coverage and conversations continue along those lines. That's to be expected, and, as with Kubrick, it's set to become conventional wisdom. Steve Jobs was a good man who loved and was loved, and earned every accolade he's garnered. But he doesn't deserve a hagiography, and I doubt he would have wanted one. Apple wasn't built by a saint. It was built by an iron-fisted visionary. There are a lot of geniuses in the world, and a lot of aesthetes. But that's not enough. Sometimes it takes Bad Steve to bring products to market. Real artists ship.
The man who made computers personal was a genius and a jerk. A new documentary wonders whether his legacy can accommodate both realities.
An iPhone is a machine much like any other: motherboard, modem, microphone, microchip, battery, wires of gold and silver and copper twisting and snaking, the whole assembly arranged under a piece of glass whose surface—coated with an oxide of indium and tin to make it electrically conductive—sparks to life at the touch of a warm-blooded finger. But an iPhone, too, is much more than a machine. The neat ecosystem that hums under its heat-activated glass holds grocery lists and photos and games and jokes and news and books and music and secrets and the voices of loved ones and, quite possibly, every text you’ve ever exchanged with your best friend. Thought, memory, empathy, the stuff we sometimes shorthand as “the soul”: There it all is, zapping through metal whose curves and coils were designed to be held in a human hand.
Fractured by internal conflict and foreign intervention for centuries, Afghanistan made several tentative steps toward modernization in the mid-20th century. In the 1950s and 1960s, some of the biggest strides were made toward a more liberal and westernized lifestyle, while trying to maintain a respect for more conservative factions. Though officially a neutral nation, Afghanistan was courted and influenced by the U.S. and Soviet Union during the Cold War, accepting Soviet machinery and weapons, and U.S. financial aid. This was a brief, relatively peaceful era, when modern buildings were constructed in Kabul alongside older traditional mud structures, when burqas became optional for a time, and the country appeared to be on a path toward a more open, prosperous society. Progress was halted in the 1970s, as a series of bloody coups, invasions, and civil wars began, continuing to this day, reversing almost all of the steps toward modernization taken in the 50s and 60s. Keep in mind, when looking at these images, that the average life expectancy for Afghans born in 1960 was 31, so the vast majority of those pictured have likely passed on since.
The Islamic State is no mere collection of psychopaths. It is a religious group with carefully considered beliefs, among them that it is a key agent of the coming apocalypse. Here’s what that means for its strategy—and for how to stop it.
What is the Islamic State?
Where did it come from, and what are its intentions? The simplicity of these questions can be deceiving, and few Western leaders seem to know the answers. In December, The New York Times published confidential comments by Major General Michael K. Nagata, the Special Operations commander for the United States in the Middle East, admitting that he had hardly begun figuring out the Islamic State’s appeal. “We have not defeated the idea,” he said. “We do not even understand the idea.” In the past year, President Obama has referred to the Islamic State, variously, as “not Islamic” and as al-Qaeda’s “jayvee team,” statements that reflected confusion about the group, and may have contributed to significant strategic errors.
In continuing to tinker with the universe she built eight years after it ended, J.K. Rowling might be falling into the same trap as Star Wars’s George Lucas.
September 1st, 2015 marked a curious footnote in Harry Potter marginalia: According to the series's elaborate timeline, rarely referenced in the books themselves, it was the day James S. Potter, Harry's eldest son, started school at Hogwarts. It's not an event directly written about in the books, nor one of particular importance, but their creator, J.K. Rowling, dutifully took to Twitter to announce the particulars: James was sorted into House Gryffindor, just like his father, to the disappointment of Teddy Lupin, Harry's godson and apparently a Hufflepuff.
It’s not earth-shattering information that Harry’s kid would end up in the same house his father was in, and the Harry Potter series’s insistence on sorting all of its characters into four broad personality quadrants largely based on their family names has always struggled to stand up to scrutiny. Still, Rowling’s tweet prompted much garment-rending among the books’ devoted fans. Can a tweet really amount to a piece of canonical information for a book? There isn’t much harm in Rowling providing these little embellishments years after her books were published, but even idle tinkering can be a dangerous path to take, with the obvious example being the insistent tweaks wrought by George Lucas on his Star Wars series.
I traveled to every country on earth. In some cases, the adventure started before I could get there.
Last summer, my Royal Air Maroc flight from Casablanca landed at Malabo International Airport in Equatorial Guinea, and I completed a 50-year mission: I had officially, and legally, visited every recognized country on earth.
This means 196 countries: the 193 members of the United Nations, plus Taiwan, Vatican City, and Kosovo, which are not members but are, to varying degrees, recognized as independent countries by other international actors.
In five decades of traveling, I’ve crossed countries by rickshaw, pedicab, bus, car, minivan, and bush taxi; a handful by train (Italy, Switzerland, Moldova, Belarus, Ukraine, Romania, and Greece); two by riverboat (Gabon and Germany); Norway by coastal steamer; Gambia and the Amazonian parts of Peru and Ecuador by motorized canoe; and half of Burma by motor scooter. I rode completely around Jamaica on a motorcycle and Nauru on a bicycle. I’ve also crossed three small countries on foot (Vatican City, San Marino, and Liechtenstein), and parts of others by horse, camel, elephant, llama, and donkey. I confess that I have not visited every one of the 7,107 islands in the Philippine archipelago or most of the more than 17,000 islands constituting Indonesia, but I’ve made my share of risky voyages on the rickety inter-island rustbuckets you read about in the back pages of the Times under headlines like “Ship Sinks in Sulu Sea, 400 Presumed Lost.”
In the name of emotional well-being, college students are increasingly demanding protection from words and ideas they don’t like. Here’s why that’s disastrous for education—and mental health.
Something strange is happening at America’s colleges and universities. A movement is arising, undirected and driven largely by students, to scrub campuses clean of words, ideas, and subjects that might cause discomfort or give offense. Last December, Jeannie Suk wrote in an online article for The New Yorker about law students asking her fellow professors at Harvard not to teach rape law—or, in one case, even use the word violate (as in “that violates the law”) lest it cause students distress. In February, Laura Kipnis, a professor at Northwestern University, wrote an essay in The Chronicle of Higher Education describing a new campus politics of sexual paranoia—and was then subjected to a long investigation after students who were offended by the article and by a tweet she’d sent filed Title IX complaints against her. In June, a professor protecting himself with a pseudonym wrote an essay for Vox describing how gingerly he now has to teach. “I’m a Liberal Professor, and My Liberal Students Terrify Me,” the headline said. A number of popular comedians, including Chris Rock, have stopped performing on college campuses (see Caitlin Flanagan’s article in this month’s issue). Jerry Seinfeld and Bill Maher have publicly condemned the oversensitivity of college students, saying too many of them can’t take a joke.
In Beijing, China marked the 70th anniversary of the end of World War II, and its role in defeating Japan, by holding an enormous military parade and declaring a new national holiday. The spectacle involved more than 12,000 troops, 500 pieces of military hardware, and 200 aircraft of various types, representing what military officials said were the Chinese military's most cutting-edge technology. While the entire event was a show of strength, Chinese officials insisted the message was about peace, with the logo displayed on posters featuring an image of a dove.
How the Islamic State uses economic persecution as a recruitment tactic
Before Islamic State militants overran her hometown of Mosul in June 2014, Fahima Omar ran a hairdressing salon. But ISIS gunmen made Omar close her business—and lose her only source of income. Salons like hers encouraged “debauchery,” the militants said.
Omar is one of many business owners—male and female—who say ISIS has forced them to shut up shop and lose their livelihoods in the process. The extremist group has also prevented those who refuse to join it from finding jobs, and has imposed heavy taxes on civilians.
“ISIS controls every detail of the economy,” says Abu Mujahed, who fled with his family from ISIS-controlled Deir al-Zor in eastern Syria. “Only their people or those who swear allegiance to them have a good life.” When they took over Deir al-Zor, ISIS gunmen systematically took control of the local economy, looting factories and confiscating properties, says Mujahed. Then they moved in, taking over local business networks.
What do Google's trippy neural network-generated images tell us about the human mind?
When a collection of artificial brains at Google began generating psychedelic images from otherwise ordinary photos, engineers compared what they saw to dreamscapes. They named their image-generation technique Inceptionism and called the code used to power it Deep Dream.
But many of the people who saw the images reacted the same way: These things didn’t come from a dream world. They came from an acid trip.
The computer-made images feature scrolls of color, swirling lines, stretched faces, floating eyeballs, and uneasy waves of shadow and light. The machines seemed to be hallucinating, and in a way that appeared uncannily human.
The idea behind the project was to test the extent to which a neural network had learned to recognize various animals and landscapes by asking the computer to describe what it saw. So, instead of just showing a computer a picture of a tree and saying, "tell me what this is," engineers would show the computer an image and say, "enhance whatever it is you see."
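For readers curious how "enhance whatever it is you see" translates into code, here is a minimal sketch of that gradient-ascent idea, written in PyTorch rather than Google's original implementation. The pretrained network, the choice of layer (inception4c), the input file name tree.jpg, and the step size and iteration count are all illustrative assumptions, not details reported in the article or taken from Google's released Deep Dream code.

```python
# Minimal Deep Dream-style sketch: amplify whatever an intermediate layer
# of a pretrained network already responds to in an input image.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model = models.googlenet(weights=models.GoogLeNet_Weights.DEFAULT).to(device).eval()

# Capture activations from one intermediate layer via a forward hook.
activations = {}
def hook(module, inputs, output):
    activations["feat"] = output
model.inception4c.register_forward_hook(hook)  # layer chosen arbitrarily for illustration

# Load and normalize an input image with ImageNet statistics.
preprocess = T.Compose([
    T.Resize(512),
    T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])
img = preprocess(Image.open("tree.jpg")).unsqueeze(0).to(device)  # hypothetical input file
img.requires_grad_(True)

# Gradient ascent on the pixels: instead of asking "what is this?",
# nudge the image so the chosen layer's response grows stronger.
for _ in range(30):
    model(img)
    loss = activations["feat"].norm()  # strength of the layer's response
    loss.backward()
    with torch.no_grad():
        img += 0.05 * img.grad / (img.grad.abs().mean() + 1e-8)
        img.grad.zero_()

# Undo the normalization and save the "dreamed" result.
out = img.detach().squeeze(0).cpu()
mean = torch.tensor([0.485, 0.456, 0.406]).view(3, 1, 1)
std = torch.tensor([0.229, 0.224, 0.225]).view(3, 1, 1)
T.ToPILImage()((out * std + mean).clamp(0, 1)).save("dream.jpg")
```

Run repeatedly, and at different layers, this kind of loop is what produces the swirling, eye-studded textures described above: earlier layers exaggerate edges and patterns, while deeper layers pull whole dog faces and buildings out of clouds and trees.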
Some people see threats even when none are present. Strangely, it can make them more creative.
For much of his life, Isaac Newton seemed like he was on the verge of a nervous breakdown. In 1693, the collapse finally arrived: After not sleeping for five days straight, Newton sent letters accusing his friends of conspiring against him. He was refraining from publishing books, he said at one point that year, “for fear that disputes and controversies may be raised against me by ignoramuses.”
Newton was, by many accounts, highly neurotic. Brilliant, but neurotic nonetheless. He was prone to depressive jags, mistrust, and angry outbursts.
Yet his genius might have been rooted in his maladjustments. His mental state led him to brood over past mistakes until, eventually, a breakthrough would dawn. "I keep the subject constantly before me," he once said, "and wait till the first dawnings open slowly, by little and little, into a full and clear light."