A lot of conventional wisdom about software is mistaken. It's probably a mistake to try to tackle these misconceptions in too much detail in a blog post, but my time here is limited, and perhaps a short catalog of common mistakes might help some of you think more critically about the programs you use every day.
Results are what matter
We all know that small computers have transformed the workplace. The world of The Apartment and Mad Men has vanished. Companies know that they wouldn't be more profitable if they discarded their PCs and hired lots of secretaries and typists. Yet the productivity gains from using computers have been remarkably hard to identify.
It turns out that a lot of the work we do with business computers involves dressing up our ideas to impress managers and clients. Where a typed page was once sufficient, we now dispatch an elegantly typeset document and a deck of presentation slides. This might not help the company serve customers, but it helps individuals impress their managers.
Much of the real contribution that software makes to your thinking happens in the course of the work. What may matter most in the long run are the ideas you discover while preparing a management report or a client presentation. Process matters.
Software should be polished
We spend too much time perfecting the way our programs look, just as in the previous century we spent far too much time perfecting our books. We are accustomed to a very high standard of editing and typesetting in publishing, a standard that originally was possible only because a vast number of educated women were for the first time entering the work force and were, for a time, willing to accept very low wages. Today, we look for the same sort of surface polish in our software.
All this polish comes with substantial costs. Some costs are evident because they appear in the price. Others are hidden. How do you measure the cost of terrific software that never gets written, or that remains locked in a laboratory?
Software developers have long struggled to reduce the riskiness of development, its delays and failures, by working to build a software factory that would make software construction more systematic. This hasn't worked well. "We software creators woke up one day," I wrote in 2007, "to find ourselves living in the software factory. The floor is hard, from time to time it gets very cold at night, and they say the factory is going to close and move somewhere else. We are unhappy with our modern computing and alienated from our work, we experience constant, inexorable guilt."
We've been here before. In 1853, John Ruskin inserted a long aside in The Stones of Venice to advise the Victorian consumer and art buyer. What sort of things should one buy? Ruskin suggests the following:
1. Never encourage the manufacture of any article not absolutely necessary, in the production of which Invention has no share.
2. Never demand an exact finish for its own sake, but only for some practical or noble end.
3. Never encourage imitation or copying of any kind, except for the sake of preserving records of great works.
Brush marks are not signs of sloth, and pixel misalignments are not an indicator of moral laxity. The software creator should make intention clear, but excessive polish is slave's work unredeem'd.
Software should be friendly
The program is not your friend. It does not understand you, or care about you.
Computers should be intuitive
We are often told that computers should be information appliances, that you don't need to know about anything under the hood. Many things we want to do, however, are far from simple; the real work of real people is surprisingly complex. Learning to use tools well sometimes takes time, but you are only a beginner once, and you may use your tools every day.
Programs should never crash, hang, or do surprising things
Homer nods, and most of us aren't Homer. Human collaborators sometimes make mistakes, lose things, or drop them on the floor. With computers as well as people, take sensible precautions and hope for the best.
On her first trip to New Mexico, Linda was astonished to find that National Park trails frequently ran close beside spectacular cliffs, with no guard rails in sight. Back east, you'd put up a guard rail and spoil the view, or close the trail entirely because it might be dangerous. If we do not trust users, we deprive people of abilities they need.
No one wants to read on screens
People still say this, even though we spend our days reading and writing on the screen. It is now clear that the future of serious reading and writing lies on screens and on the displays that will replace them.
Hypertext is distracting; the Internet is ruining kids today
Life is distracting. Ideas are complicated and densely interconnected. There is too much to do and we have too little time. Kids know this, too, and make choices accordingly.
Computers don't wear out
Computers you depend on last three years, laptops a bit less. A three-year-old computer, even if in pristine condition, is sufficiently obsolete that replacing it is nearly mandatory. If you don't use your computer much, or you want to use an old computer for an occasional chore, you can keep it for a few years more.
Web pages should (or can) say one thing, and should mean what they say
Dreams of the semantic Web often rest on the assumption that we can (and will) express the meaning of a Web page in a simple and concise format. Everything we know about writing, everything we know about meaning, suggests this is a fantasy.
In despair over their perception of the intellectual dishonesty of the Bush administration and the epistemic closure of the American Right, Jed Buchwald and Diane Greco Josefowicz wrote The Zodiac of Paris. It describes once-famous controversies in the early 19th century over an Egyptian inscription that suggested the world was older than Genesis allows.
The book is, in a very real sense, about the lies Curveball told Colin Powell, but that meaning is not on the page.
Steve Jobs matters
The American business press is obsessed with CEOs. If a stock increases, the firm's leaders are brilliant fellows. If shares plummet, the CEO must be a buffoon. Steve Jobs, once regarded as a fool, is now hailed as the one true software visionary, the indispensable force.
Jobs is, in fact, a good software critic and an executive who is willing to trust his judgment and endure the consequences.
The customer, the usability lab, or the marketplace will tell you what is good; crowds are wise
From the user-generated content of Wikipedia to mass recommendation systems and user-written product reviews, my colleagues assume that crowds are wise and that, on average, sensible opinions prevail. That this is often true is fortunate, but crowds can be wildly wrong.
Almost all software designers believe that customers, clinical studies, or the marketplace will reveal what works and what doesn't, but everything we know about art (and software is an art form) argues that this cannot be right. Best-seller lists sometimes contain good books, but they list bad books aplenty. Popular movies are not always great movies.
We know that an intelligent critic can sometimes recognize a great work when she sees it. No individual's taste or judgment is infallible, but the marketplace is often wrong, too.
James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.