A lot of conventional wisdom about software is mistaken. It's probably a mistake to tackle these misconceptions in too much detail in a blog post, but my time here is limited, and a short catalog of common mistakes might help some of you think more critically about the programs you use every day.
Results are what matter
We all know that small computers have transformed the workplace. The world of The Apartment and Mad Men has vanished. Companies know that they wouldn't be more profitable if they discarded their PCs and hired lots of secretaries and typists. Yet the productivity gains from using computers have been remarkably hard to identify.
It turns out that a lot of the work we do with business computers involves dressing up our ideas to impress managers and clients. Where a typed page was once sufficient, we now dispatch an elegantly typeset document and a deck of presentation slides. This might not help the company serve customers, but it helps individuals impress their managers.
Much of the real contribution that software makes to your thinking happens in the course of the work. What may matter most in the long run are the ideas you discover while preparing a management report or a client presentation. Process matters.
Software should be polished
We spend too much time perfecting the way our programs look, just as in the previous century we spent far too much time perfecting our books. We are accustomed to a very high standard of editing and typesetting in publishing, a standard that originally was possible only because a vast number of educated women were for the first time entering the work force and were, for a time, willing to accept very low wages.
All this polish comes with substantial costs. Some costs are evident because they appear in the price. Others are hidden. How do you measure the cost of terrific software that never gets written, or that remains locked in a laboratory?
Software developers have long struggled to reduce the riskiness of development, its delays and failures, by working to build a software factory that would make software construction more systematic. This hasn't worked well. "We software creators woke up one day," I wrote in 2007, "to find ourselves living in the software factory. The floor is hard, from time to time it gets very cold at night, and they say the factory is going to close and move somewhere else. We are unhappy with our modern computing and alienated from our work, we experience constant, inexorable guilt."
We've been here before. In 1853, John Ruskin inserted a long aside in The Stones of Venice to advise the Victorian consumer and art buyer. What sort of things should one buy? Ruskin suggests the following:
1. Never encourage the manufacture of any article not absolutely necessary, in the production of which Invention has no share.
2. Never demand an exact finish for its own sake, but only for some practical or noble end.
3. Never encourage imitation or copying of any kind, except for the sake of preserving records of great works.
Brush marks are not signs of sloth, and pixel misalignments are not indicators of moral laxity. The software creator should make intention clear, but excessive polish is slave's work unredeem'd.
Software should be friendly
The program is not your friend. It does not understand you, or care about you.
Computers should be intuitive
We are often told that computers should be information appliances, that you don't need to know about anything under the hood. Many things we want to do, however, are far from simple; the real work of real people is surprisingly complex. Learning to use tools well sometimes takes time, but you are only a beginner once, and you may use your tools every day.
Programs should never crash, hang, or do surprising things
Homer nods, and most of us aren't Homer. Human collaborators sometimes make mistakes, lose things, or drop them on the floor. With computers as well as people, take sensible precautions and hope for the best.
On her first trip to New Mexico, Linda was astonished to find that National Park trails frequently ran close beside spectacular cliffs, with no guard rails in sight. Back east, you'd put up a guard rail and spoil the view -- or close the trail entirely as too dangerous. If we do not trust users, we deprive people of abilities they need.
No one wants to read on screens
People still say this, even though we spend our days reading and writing on the screen. It is now clear that the future of serious reading and writing lies on screens and on the displays that will replace them.
Hypertext is distracting; the Internet is ruining kids today
Life is distracting. Ideas are complicated and densely interconnected. There is too much to do and we have too little time. Kids know this, too, and make choices accordingly.
Computers don't wear out
Computers you depend on last three years, laptops a bit less. A three-year-old computer, even if in pristine condition, is sufficiently obsolete that replacing it is nearly mandatory. If you don't use your computer much, or you want to use an old computer for an occasional chore, you can keep it for a few years more.
Web pages should (or can) say one thing, and should mean what they say
Dreams of the semantic Web often rest on the assumption that we can (and will) express the meaning of a Web page in a simple and concise format. Everything we know about writing, everything we know about meaning, suggests this is a fantasy.
In despair over their perception of the intellectual dishonesty of the Bush administration and the epistemic closure of the American Right, Jed Buchwald and Diane Greco Josefowicz wrote The Zodiac of Paris. It describes once-famous controversies in the early 19th century over an Egyptian inscription that suggested the world was older than Genesis allows.
The book is, in a very real sense, about the lies Curveball told Colin Powell, but that meaning is not on the page.
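The gap is easy to see in miniature. Below is a minimal sketch, written in Python rather than in an actual semantic-Web notation like RDF, of the sort of machine-readable record the semantic Web imagines for a page about this book. The field names follow the Dublin Core vocabulary; the date and subject terms are my own illustrative additions, not drawn from any real catalog record.

```python
# A hypothetical Dublin Core-style record for a page about the book.
# Every field below can be perfectly accurate.
metadata = {
    "dc:title": "The Zodiac of Paris",
    "dc:creator": ["Jed Z. Buchwald", "Diane Greco Josefowicz"],
    "dc:subject": ["Egyptology", "Dendera zodiac", "history of science"],
    "dc:date": "2010",  # illustrative
}

def describe(record: dict) -> str:
    """Render the one-line summary a semantic-Web agent could derive."""
    authors = " and ".join(record["dc:creator"])
    topics = ", ".join(record["dc:subject"])
    return f"{record['dc:title']} by {authors}: {topics} ({record['dc:date']})"

print(describe(metadata))
# -> The Zodiac of Paris by Jed Z. Buchwald and Diane Greco Josefowicz:
#    Egyptology, Dendera zodiac, history of science (2010)
```

The record is true throughout, and it still says nothing about Curveball or Colin Powell. The meaning is not in the fields, because it was never on the page to begin with.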
Steve Jobs matters
The American business press is obsessed with CEOs. If a stock rises, the firm's leaders are brilliant fellows. If shares plummet, the CEO must be a buffoon. Steve Jobs, once regarded as a fool, is now hailed as the one true software visionary, the indispensable force.
Jobs is, in fact, a good software critic and an executive who is willing to trust his judgment and endure the consequences.
The customer, the usability lab, or the marketplace will tell you what is good; crowds are wise
From the user-generated content of Wikipedia to mass recommendation systems and user-written product reviews, my colleagues assume that crowds are wise and that, on average, sensible opinions prevail. That this is often true is fortunate, but crowds can be wildly wrong.
Almost all software designers believe that customers, clinical studies, or the marketplace will reveal what works and what doesn't, but everything we know about art (and software is an art form) argues this cannot be right. Best-seller lists sometimes contain good books, but they list bad books aplenty. Popular movies are not always great.
We know that an intelligent critic can sometimes recognize a great work when she sees it. No individual's taste or judgment is infallible, but the marketplace is often wrong, too.