A lot of conventional wisdom about software is mistaken. It's probably a mistake to try to tackle these misconceptions in too much detail in a blog post, but my time here is limited, and perhaps a short catalog of common mistakes might help some of you think more critically about the programs you use every day.
Results are what matter
We all know that small computers have transformed the workplace. The world of The Apartment and Mad Men has vanished. Companies know that they wouldn't be more profitable if they discarded their PCs and hired lots of secretaries and typists. Yet the productivity gains from using computers have been remarkably hard to identify.
It turns out that a lot of the work we do with business computers involves dressing up our ideas to impress managers and clients. Where a typed page was once sufficient, we now dispatch an elegantly typeset document and a deck of presentation slides. This might not help the company serve customers, but it helps individuals impress their managers.
Much of the real contribution that software makes to your thinking happens in the course of the work. What may matter most in the long run are the ideas you discover while preparing a management report or a client presentation. Process matters.
Software should be polished
We spend too much time perfecting the way our programs look, just as in the previous century we spent far too much time perfecting our books. We are accustomed to a very high standard of editing and typesetting in publishing, a standard that originally was possible only because a vast number of educated women were for the first time entering the work force and were, for a time, willing to accept very low wages. Today, we look for the same sort of surface polish in our software.
All this polish comes with substantial costs. Some costs are evident because they appear in the price. Others are hidden. How do you measure the cost of terrific software that never gets written, or that remains locked in a laboratory?
Software developers have long struggled to reduce the riskiness of development, its delays and failures, by working to build a software factory that would make software construction more systematic. This hasn't worked well. "We software creators woke up one day," I wrote in 2007, "to find ourselves living in the software factory. The floor is hard, from time to time it gets very cold at night, and they say the factory is going to close and move somewhere else. We are unhappy with our modern computing and alienated from our work, we experience constant, inexorable guilt."
We've been here before. In 1853, John Ruskin inserted a long aside in The Stones of Venice to advise the Victorian consumer and art buyer. What sort of things should one buy? Ruskin suggests the following:
1. Never encourage the manufacture of any article not absolutely necessary, in the production of which Invention has no share.
2. Never demand an exact finish for its own sake, but only for some practical or noble end.
3. Never encourage imitation or copying of any kind, except for the sake of preserving records of great works.
Brush marks are not signs of sloth, and pixel misalignments are not an indicator of moral laxity. The software creator should make intention clear, but excessive polish is slave's work unredeem'd.
Software should be friendly
The program is not your friend. It does not understand you, or care about you.
Computers should be intuitive
We are often told that computers should be information appliances, that you don't need to know about anything under the hood. Many things we want to do, however, are far from simple; the real work of real people is surprisingly complex. Learning to use tools well sometimes takes time, but you are only a beginner once, and you may use your tools every day.
Programs should never crash, hang, or do surprising things
Homer nods, and most of us aren't Homer. Human collaborators sometimes make mistakes, lose things, or drop them on the floor. With computers as well as people, take sensible precautions and hope for the best.
On her first trip to New Mexico, Linda was astonished to find that National Park trails frequently ran close beside spectacular cliffs, with no guard rails in sight. Back east, you'd put up a guard rail and spoil the view -- or close the trail altogether because it might be dangerous. If we do not trust users, we deprive people of abilities they need.
No one wants to read on screens
People still say this, even though we spend our days reading and writing on the screen. It is now clear that the future of serious reading and writing lies on screens and on the displays that will replace them.
Hypertext is distracting; the Internet is ruining kids today
Life is distracting. Ideas are complicated and densely interconnected. There is too much to do and we have too little time. Kids know this, too, and make choices accordingly.
Computers don't wear out
Computers you depend on last three years, laptops a bit less. A three-year-old computer, even if in pristine condition, is sufficiently obsolete that replacing it is nearly mandatory. If you don't use your computer much, or you want to use an old computer for an occasional chore, you can keep it for a few years more.
Web pages should (or can) say one thing, and should mean what they say
Dreams of the semantic Web often rest on the assumption that we can (and will) express the meaning of a Web page in a simple and concise format. Everything we know about writing, everything we know about meaning, suggests this is a fantasy.
In despair over their perception of the intellectual dishonesty of the Bush administration and the epistemic closure of the American Right, Jed Buchwald and Diane Greco Josefowicz wrote The Zodiac of Paris. It describes once-famous controversies in the early 19th century over an Egyptian inscription that suggested the world was older than Genesis allows.
The book is, in a very real sense, about the lies Curveball told Colin Powell, but that meaning is not on the page.
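The semantic-Web dream is that a page's meaning can be reduced to structured metadata. A minimal sketch of what that reduction looks like in practice, using the schema.org vocabulary (the particular fields chosen here are illustrative, not canonical):

```python
# A sketch of the semantic-Web assumption: the "meaning" of a page
# expressed as a small, machine-readable record (JSON-LD style).
import json

page_meaning = {
    "@context": "https://schema.org",
    "@type": "Book",
    "name": "The Zodiac of Paris",
    "author": ["Jed Buchwald", "Diane Greco Josefowicz"],
    "about": "19th-century controversies over an Egyptian zodiac inscription",
}

print(json.dumps(page_meaning, indent=2))
```

Every field here is accurate, and yet the record captures nothing of what the book is actually about -- which is the point: the concise format records facts, not meaning.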
Steve Jobs matters
The American business press is obsessed with CEOs. If a stock increases, the firm's leaders are brilliant fellows. If shares plummet, the CEO must be a buffoon. Steve Jobs, once regarded as a fool, is now hailed as the one true software visionary, the indispensable force.
Jobs is, in fact, a good software critic and an executive who is willing to trust his judgment and endure the consequences.
The customer, the usability lab, or the marketplace will tell you what is good; crowds are wise
From the user-generated content of Wikipedia to mass recommendation systems and user-written product reviews, my colleagues assume that crowds are wise and that, on average, sensible opinions prevail. That this is often true is fortunate, but crowds can be wildly wrong.
Almost all software designers believe that customers, clinical studies, or the marketplace will reveal what works and what doesn't, but everything we know about art (and software is an art form) argues this cannot be right. Best-seller lists sometimes contain good books, but they list bad books aplenty. Popular movies are not always great.
We know that an intelligent critic can sometimes recognize a great work when she sees it. No individual's taste or judgment is infallible, but the marketplace is often wrong, too.