A lot of conventional wisdom about software is mistaken. It's probably a mistake to try to tackle these misconceptions in too much detail in a blog post, but my time here is limited, and perhaps a short catalog of common mistakes might help some of you think more critically about the programs you use every day.
Results are what matter
We all know that small computers have transformed the workplace. The world of The Apartment and Mad Men has vanished. Companies know that they wouldn't be more profitable if they discarded their PCs and hired lots of secretaries and typists. Yet the productivity gains from using computers have been remarkably hard to identify.
It turns out that a lot of the work we do with business computers involves dressing up our ideas to impress managers and clients. Where a typed page was once sufficient, we now dispatch an elegantly typeset document and a deck of presentation slides. This might not help the company serve customers, but it helps individuals impress their managers.
Much of the real contribution that software makes to your thinking happens in the course of the work. What may matter most in the long run are the ideas you discover while preparing a management report or a client presentation. Process matters.
Software should be polished
We spend too much time perfecting the way our programs look, just as in the previous century we spent far too much time perfecting our books. We are accustomed to a very high standard of editing and typesetting in publishing, a standard that originally was possible only because a vast number of educated women were for the first time entering the work force and were, for a time, willing to accept very low wages. Today, we look for the same sort of surface polish in our software.
All this polish comes with substantial costs. Some costs are evident because they appear in the price. Others are hidden. How do you measure the cost of terrific software that never gets written, or that remains locked in a laboratory?
Software developers have long struggled to reduce the riskiness of development, its delays and failures, by working to build a software factory that would make software construction more systematic. This hasn't worked well. "We software creators woke up one day," I wrote in 2007, "to find ourselves living in the software factory. The floor is hard, from time to time it gets very cold at night, and they say the factory is going to close and move somewhere else. We are unhappy with our modern computing and alienated from our work, we experience constant, inexorable guilt."
We've been here before. In 1853, John Ruskin inserted a long aside in The Stones of Venice to advise the Victorian consumer and art buyer. What sort of things should one buy? Ruskin suggests the following:
1. Never encourage the manufacture of any article not absolutely necessary, in the production of which Invention has no share.
2. Never demand an exact finish for its own sake, but only for some practical or noble end.
3. Never encourage imitation or copying of any kind, except for the sake of preserving record of great works.
Brush marks are not signs of sloth, and pixel misalignments are not indicators of moral laxity. The software creator should make intention clear, but excessive polish is slave's work unredeem'd.
Software should be friendly
The program is not your friend. It does not understand you, or care about you.
Computers should be intuitive
We are often told that computers should be information appliances, that you don't need to know anything about what's under the hood. Many things we want to do, however, are far from simple; the real work of real people is surprisingly complex. Learning to use tools well sometimes takes time, but you are only a beginner once, and you may use your tools every day.
Programs should never crash, hang, or do surprising things
Homer nods, and most of us aren't Homer. Human collaborators sometimes make mistakes, lose things, or drop them on the floor. With computers as well as people, take sensible precautions and hope for the best.
On her first trip to New Mexico, Linda was astonished to find that National Park trails frequently ran close beside spectacular cliffs, with no guard rails in sight. Back east, you'd put up a guard rail and spoil the view -- or close the trail entirely because it might be dangerous. If we do not trust users, we deprive people of abilities they need.
No one wants to read on screens
People still say this, even though we spend our days reading and writing on the screen. It is now clear that the future of serious reading and writing lies on screens and on the displays that will replace them.
Hypertext is distracting; the Internet is ruining kids today
Life is distracting. Ideas are complicated and densely interconnected. There is too much to do and we have too little time. Kids know this, too, and make choices accordingly.
Computers don't wear out
Computers you depend on last three years, laptops a bit less. A three-year-old computer, even if in pristine condition, is sufficiently obsolete that replacing it is nearly mandatory. If you don't use your computer much, or you want to use an old computer for an occasional chore, you can keep it for a few years more.
Web pages should (or can) say one thing, and should mean what they say
Dreams of the semantic Web often rest on the assumption that we can (and will) express the meaning of a Web page in a simple and concise format. Everything we know about writing, everything we know about meaning, suggests this is a fantasy.
In despair over what they saw as the intellectual dishonesty of the Bush administration and the epistemic closure of the American Right, Jed Buchwald and Diane Greco Josefowicz wrote The Zodiac of Paris. It describes once-famous controversies in the early 19th century over an Egyptian inscription that suggested the world was older than Genesis allows.
The book is, in a very real sense, about the lies Curveball told Colin Powell, but that meaning is not on the page.
Steve Jobs matters
The American business press is obsessed with CEOs. If a stock increases, the firm's leaders are brilliant fellows. If shares plummet, the CEO must be a buffoon. Steve Jobs, once regarded as a fool, is now hailed as the one true software visionary, the indispensable force.
Jobs is, in fact, a good software critic and an executive who is willing to trust his judgment and endure the consequences.
The customer, the usability lab, or the marketplace will tell you what is good; crowds are wise
From the user-generated content of Wikipedia to mass recommendation systems and user-written product reviews, my colleagues assume that crowds are wise and that, on average, sensible opinions prevail. That this is often true is fortunate, but crowds can be wildly wrong.
Almost all software designers believe that customers, usability studies, or the marketplace will reveal what works and what doesn't, but everything we know about art (and software is an art form) argues this cannot be right. Best-seller lists sometimes contain good books, but they list bad books aplenty. Popular movies are not always great.
We know that an intelligent critic can sometimes recognize a great work when she sees it. No individual's taste or judgment is infallible, but the marketplace is often wrong, too.