Steve Jobs didn't change the world by playing nice
When filmmaker Stanley Kubrick died, the steely perfectionist who ground actors into submission died with him. Kubrick was a good man -- Matthew Modine once described him as "probably the most heartfelt person I ever met" -- but by all accounts, his shoots were crucibles; the faint of heart need not apply. When he walked onto a set, Stanley Kubrick would get exactly what he wanted, and he pursued that vision without mercy. Upon his death, however, only a mythical Saint Stanley remained, a slightly taller Yoda with a slightly better complexion.
Part of this can be explained by decorum. No one wants to speak ill of the dead, and it's hard to casually reconcile the loving father and husband with the man who verbally flayed Shelley Duvall until her frail character in The Shining seemed Byronic in comparison. Still, to revise the methods of such a genius is to diminish exactly what made his genius work. A Clockwork Orange didn't happen by accident. Stanley Kubrick made it happen, and though anyone could direct a Kubrick script, only the man himself could make a Kubrick film.
Last year a former Apple employee related his favorite Steve Jobs story to me. I have no way of knowing if it is true, so take it for what it's worth. I think it nicely captures the man who changed the world four times over. When engineers working on the very first iPod completed the prototype, they presented their work to Steve Jobs for his approval. Jobs played with the device, scrutinized it, weighed it in his hands, and promptly rejected it. It was too big.
The engineers explained that they had to reinvent inventing to create the iPod, and that it was simply impossible to make it any smaller. Jobs was quiet for a moment. Finally he stood, walked over to an aquarium, and dropped the iPod in the tank. After it touched bottom, bubbles floated to the top.
"Those are air bubbles," he snapped. "That means there's space in there. Make it smaller."
Steve Jobs was a genius, and one of the most important businessmen and inventors of our time. But he was not a kindly, soft-spoken sage who might otherwise live atop a mountain in India, dispatching wisdom to pilgrims. He was a taskmaster who knew how to get things done. "Real artists ship" was an Apple battle cry from the earliest days. Everyone, by now, knows about the Steve Jobs "reality distortion field" -- the charismatic Care Bear Stare that compels otherwise reasonable people to spend weeks in line for a slightly faster telephone. In his biography of Jobs, journalist Alan Deutschman described the Apple co-founder's lesser-known hero-shithead roller coaster. "He could be Good Steve or he could be Bad Steve. When he was Bad Steve, he didn't seem to care about the severe damage he caused to egos or emotions so long as he pushed for greatness." When confronted with the full "terrifying" wrath of Bad Steve (even over the slightest of details), the brains at Apple would push themselves beyond all personal limits to find a way to meet Jobs's exacting demands, and somehow return to his good graces. And the process would repeat itself. "Steve was willing to be loved or feared, whatever worked," explained Bud Tribble, Vice President of Software Technology at Apple. "It let the engineers know that it wasn't OK to be sloppy in anything they did, even the 99 percent that Steve would never look at."
That attention to detail is what makes Apple products unique and desirable. Does any other company produce ubiquitous, mass-market devices that still feel so rare and so deeply personal? Steve Jobs did that.
His life was too short, but never wasted, and his impact reaches even those who've never touched an Apple product. He ushered in the personal computing era, and rallied from pancreatic cancer to show us a glimpse of the post-PC world. That didn't just happen; it was made to happen.
When Apple announced his resignation in August, the canonization began. Barrels of ink recounted all of the carrot and none of the stick. With the announcement of his death, coverage and conversations continue along those lines. That's to be expected, and, as with Kubrick, it is set to become conventional wisdom. Steve Jobs was a good man who loved and was loved, and he earned every accolade he received. But he doesn't deserve a hagiography, and I doubt he would have wanted one. Apple wasn't built by a saint. It was built by an iron-fisted visionary. There are a lot of geniuses in the world, and a lot of aesthetes. But that's not enough. Sometimes it takes Bad Steve to bring products to market. Real artists ship.
The number of American teens who excel at advanced math has surged. Why?
On a sultry evening last July, a tall, soft-spoken 17-year-old named David Stoner and nearly 600 other math whizzes from all over the world sat huddled in small groups around wicker bistro tables, talking in low voices and obsessively refreshing the browsers on their laptops. The air in the cavernous lobby of the Lotus Hotel Pang Suan Kaew in Chiang Mai, Thailand, was humid, recalls Stoner, whose light South Carolina accent warms his carefully chosen words. The tension in the room made it seem especially heavy, like the atmosphere at a high-stakes poker tournament.
Stoner and five teammates were representing the United States in the 56th International Mathematical Olympiad. They figured they’d done pretty well over the two days of competition. God knows, they’d trained hard. Stoner, like his teammates, had endured a grueling regimen for more than a year—practicing tricky problems over breakfast before school and taking on more problems late into the evening after he completed the homework for his college-level math classes. Sometimes, he sketched out proofs on the large dry-erase board his dad had installed in his bedroom. Most nights, he put himself to sleep reading books like New Problems in Euclidean Geometry and An Introduction to Diophantine Equations.
For decades, some psychologists have claimed that bilinguals have better mental control. Their work is now being called into question.
In one of his sketches, comedian Eddie Izzard talks about how English speakers see bilingualism: “Two languages in one head? No one can live at that speed! Good lord, man. You’re asking the impossible,” he says. This satirical view used to be a serious one. People believed that if children grew up with two languages rattling around their heads, they would become so confused that their “intellectual and spiritual growth would not thereby be doubled, but halved,” wrote one professor in 1890. “The use of a foreign language in the home is one of the chief factors in producing mental retardation,” said another in 1926.
A century on, things are very different. Since the 1960s, several studies have shown that bilingualism leads to many advantages, beyond the obvious social benefits of being able to speak to more people. It also supposedly improves executive function—a catch-all term for advanced mental abilities that allow us to control our thoughts and behavior, such as focusing on a goal, ignoring distractions, switching attention, and planning for the future.
Attorney General Loretta Lynch announced the Justice Department is suing the Missouri municipality after an agreement on reform broke down.
The Justice Department filed a wide-ranging lawsuit against Ferguson, Missouri, in federal court Wednesday, accusing the municipality of “a pattern or practice of law enforcement conduct that violates the Constitution and federal civil rights laws,” Attorney General Loretta Lynch announced.
“Residents of Ferguson have suffered the deprivation of their constitutional rights—the rights guaranteed to all Americans—for decades,” Lynch said. “They have waited decades for justice. They should not be forced to wait any longer.”
The lawsuit’s allegations mirror those in the Justice Department’s landmark Ferguson Report, which was released last March on the same day as a separate report clearing Officer Darren Wilson of civil-rights violations for the shooting death of Michael Brown in August 2014. Brown’s death, alongside the high-profile shootings of unarmed black men and women in other cities, led to violent protests in Ferguson and ignited a national debate over race and policing in the U.S.
After a pair of poor showings in New Hampshire, Chris Christie and Carly Fiorina drop out of the race.
The Republican race is headed to South Carolina with two fewer candidates. The day after finishing sixth and seventh in the New Hampshire primaries, New Jersey Governor Chris Christie and former Hewlett-Packard CEO Carly Fiorina announced on Wednesday that they were suspending their campaigns.
Fiorina was always a long shot—she was practically a political newcomer, having run only one unsuccessful Senate campaign. And while her record at HP was vulnerable to attack, Republican figures saw in her both a private-sector success story and a woman who could counter Hillary Clinton’s monopoly on a “historic” woman’s candidacy. While many political professionals sniffed at Fiorina’s candidacy, remembering that 2010 Senate race, she broke out after a commanding performance in the undercard to the first Republican debate. That earned her a promotion to the main stage at the next debate, where she scored another victory. But it was all downhill from there. Dogged by questions about her honesty and unable to win media attention, she faded quickly.
Issued last summer, the rules are the centerpiece of the White House’s climate-change-fighting agenda, and they play a big part in the recent, tepid optimism about global warming. Without the proposal of the plan, the United States couldn’t have secured the Paris Agreement, the first international treaty to mitigate greenhouse-gas emissions, last December. And without the adoption of the plan, the United States almost certainly won’t be able to comply with that document. If the world were to lose the Paris Agreement—which was not a total solution to the climate crisis, but was meant to be a first, provisional step—years could be lost in the diplomatic fight to reduce the dangers of climate change.
Most people in the U.S. believe their country is going to hell. But they’re wrong. What a three-year journey by single-engine plane reveals about reinvention and renewal.
When news broke late last year of a mass shooting in San Bernardino, California, most people in the rest of the country, and even the state, probably had to search a map to figure out where the city was. I knew exactly, having grown up in the next-door town of Redlands (where the two killers lived) and having, by chance, spent a long period earlier in the year meeting and interviewing people in the unglamorous “Inland Empire” of Southern California as part of an ongoing project of reporting across America.
Some of what my wife, Deb, and I heard in San Bernardino before the shootings closely matched the picture that the nonstop news coverage presented afterward: San Bernardino as a poor, troubled town that sadly managed to combine nearly every destructive economic, political, and social trend of the country as a whole. San Bernardino went into bankruptcy in 2012 and was only beginning to emerge at the time of the shootings. Crime is high, household income is low, the downtown is nearly abandoned in the daytime and dangerous at night, and unemployment and welfare rates are persistently the worst in the state.
The ancient civilization may have tracked Jupiter using sophisticated methods, but its reasons for stargazing were very different from ours.
We’ve never escaped the influence of the Babylonians. The 60 seconds in a minute, the 60 minutes in an hour, and the 360 degrees in a full circle are all echoes of the Babylonian preference for counting in base 60. An affinity for base 12 (inches in a foot, pence in an old British shilling) is also an offshoot, 12 being a factor of 60.
All this suggests that the Babylonians had a mathematics worth copying, which was why the Greeks did copy it and thereby rooted these number systems in Western tradition. The latest indication of Babylonian mathematical sophistication is the discovery that their astronomers knew that, in effect, the distance traveled by a moving object is equal to the area under the graph of velocity plotted against time. Previously it had been thought that this relationship wasn’t recognized until the fourteenth century in Europe. But historian Mathieu Ossendrijver of the Humboldt University in Berlin has found the calculation described in a series of clay tablets, inscribed with cuneiform writing in Babylonia between the fourth and first centuries B.C.E., where it was used to figure out the distance traveled across the sky by the planet Jupiter.
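In modern notation (which the tablets themselves never use), the relationship amounts to a simple area calculation. The following is only an illustrative gloss, assuming the trapezoid reading of the tablets that Ossendrijver describes:

$$ s \;=\; \int_{t_1}^{t_2} v(t)\,dt \;\approx\; \frac{v(t_1) + v(t_2)}{2}\,(t_2 - t_1) $$

Here $v$ stands for Jupiter’s apparent daily motion along the ecliptic and $s$ for the total arc it covers between times $t_1$ and $t_2$. When the velocity changes linearly, the trapezoid formula on the right is exact: it is precisely the area under the velocity-time graph, the quantity the Babylonian procedure computes.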
After getting shut down late last year, a website that allows free access to paywalled academic papers has sprung back up in a shadowy corner of the Internet.
There’s a battle raging over whether academic research should be free, and it’s overflowing into the dark web.
Most modern scholarly work remains locked behind paywalls, and unless your computer is on the network of a university with an expensive subscription, you have to pay a fee, often around 30 dollars, to access each paper.
Many scholars say this system makes publishers rich—Elsevier, a company that controls access to more than 2,000 journals, has a market capitalization about equal to that of Delta Air Lines—but does not benefit the academics who conducted the research, or the public at large. Others worry that free academic journals would have a hard time upholding the rigorous standards and peer review that the most prestigious paid journals are famous for.
Why Donald Trump's anti-immigration rhetoric was enough for movement conservatives to forgive his history of liberalism.
Last summer, Donald Trump described Mexican immigrants as “bringing drugs, they’re bringing crime. They’re rapists.” In December, he called for “a total and complete shutdown of Muslims entering the United States.” Many commentators claim that this wild rhetoric helps Trump suck up media oxygen or appear like a straight-talking political outsider. But the most important benefit of the anti-immigrant language is that it inoculates Trump against the charge of being a closet liberal.
Trump has a seemingly fatal vulnerability in the Republican primary: his past support for a host of moderate and liberal positions. In recent years, Trump said he would “press for universal health care,” claimed that he was “pro-choice in every respect,” remarked that “I hate the concept of guns,” stated that Hillary Clinton would “do a good job” in negotiating with Iran, asserted that the GOP was “just too crazy right,” and even said, “In many cases, I probably identify more as a Democrat.”
This morning I went on Democracy Now to discuss my critique of “class-first” policy as a way of ameliorating the effects of racism. In the midst of that discussion I made the point that one can maintain a critique of a candidate—in this case Bernie Sanders—and still feel that that candidate deserves one’s vote. Amy Goodman, being an excellent journalist, did exactly what she should have done—she asked whether I was going to vote for Senator Sanders.
I, with some trepidation, answered in the affirmative. I did so because I’ve spent my career trying to get people to answer uncomfortable questions. Indeed, the entire reason I was on the show was to try to push liberals into directly addressing an uncomfortable issue that threatens their coalition. It seemed wrong, somehow, to ask others to step into their uncomfortable space and not do so myself. So I answered.