A lot of conventional wisdom about software is mistaken. It's probably a mistake to tackle these misconceptions in much detail in a blog post, but my time here is limited, and perhaps a short catalog of common mistakes will help some of you think more critically about the programs you use every day.
Results are what matter
We all know that small computers have transformed the workplace. The world of The Apartment and Mad Men has vanished. Companies know that they wouldn't be more profitable if they discarded their PCs and hired lots of secretaries and typists. Yet the productivity gains from using computers have been remarkably hard to identify.
It turns out that a lot of the work we do with business computers involves dressing up our ideas to impress managers and clients. Where a typed page was once sufficient, we now dispatch an elegantly typeset document and a deck of presentation slides. This might not help the company serve customers, but it helps individuals impress their managers.
Much of the real contribution that software makes to your thinking happens in the course of the work. What may matter most in the long run are the ideas you discover while preparing a management report or a client presentation. Process matters.
Software should be polished
We spend too much time perfecting the way our programs look, just as in the previous century we spent far too much time perfecting our books. We are accustomed to a very high standard of editing and typesetting in publishing, a standard that originally was possible only because a vast number of educated women were for the first time entering the work force and were, for a time, willing to accept very low wages. Today, we look for the same sort of surface polish in our software.
All this polish comes with substantial costs. Some costs are evident because they appear in the price. Others are hidden. How do you measure the cost of terrific software that never gets written, or that remains locked in a laboratory?
Software developers have long struggled to reduce the riskiness of development, its delays and failures, by working to build a software factory that would make software construction more systematic. This hasn't worked well. "We software creators woke up one day," I wrote in 2007, "to find ourselves living in the software factory. The floor is hard, from time to time it gets very cold at night, and they say the factory is going to close and move somewhere else. We are unhappy with our modern computing and alienated from our work, we experience constant, inexorable guilt."
We've been here before. In 1853, John Ruskin inserted a long aside in The Stones of Venice to advise the Victorian consumer and art buyer. What sort of things should one buy? Ruskin suggests the following:
1. Never encourage the manufacture of any article not absolutely necessary, in the production of which Invention has no share.
2. Never demand an exact finish for its own sake, but only for some practical or noble end.
3. Never encourage imitation or copying of any kind, except for the sake of preserving records of great works.
Brush marks are not signs of sloth, and pixel misalignments are not an indicator of moral laxity. The software creator should make intention clear, but excessive polish is slave's work unredeem'd.
Software should be friendly
The program is not your friend. It does not understand you, or care about you.
Computers should be intuitive
We are often told that computers should be information appliances, that you don't need to know about anything under the hood. Many things we want to do, however, are far from simple; the real work of real people is surprisingly complex. Learning to use tools well sometimes takes time, but you are only a beginner once, and you may use your tools every day.
Programs should never crash, hang, or do surprising things
Homer nods, and most of us aren't Homer. Human collaborators sometimes make mistakes, lose things, or drop them on the floor. With computers as well as people, take sensible precautions and hope for the best.
On her first trip to New Mexico, Linda was astonished to find that National Park trails frequently ran close beside spectacular cliffs, with no guard rails in sight. Back east, you'd put up a guard rail and spoil the view -- or close the trail entirely because it might be dangerous. If we do not trust users, we deprive people of abilities they need.
No one wants to read on screens
People still say this, even though we spend our days reading and writing on the screen. It is now clear that the future of serious reading and writing lies on screens and on the displays that will replace them.
Hypertext is distracting; the Internet is ruining kids today
Life is distracting. Ideas are complicated and densely interconnected. There is too much to do and we have too little time. Kids know this, too, and make choices accordingly.
Computers don't wear out
Computers you depend on last three years, laptops a bit less. A three-year-old computer, even if in pristine condition, is sufficiently obsolete that replacing it is nearly mandatory. If you don't use your computer much, or you want to use an old computer for an occasional chore, you can keep it for a few years more.
Web pages should (or can) say one thing, and should mean what they say
Dreams of the semantic Web often rest on the assumption that we can (and will) express the meaning of a Web page in a simple and concise format. Everything we know about writing, everything we know about meaning, suggests this is a fantasy.
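To see why this is a fantasy, it helps to look at what the "simple and concise format" usually amounts to: a set of subject-predicate-object triples, as in RDF. A minimal sketch (the URL and vocabulary here are invented for illustration, not drawn from any real schema) shows how little of a page's meaning such triples can carry:

```python
# A Web page's "meaning," as the semantic Web imagines it: a bag of
# subject-predicate-object triples. (Subject URL and predicate names
# are hypothetical, for illustration only.)
triples = {
    ("http://example.com/zodiac-of-paris", "type", "Book"),
    ("http://example.com/zodiac-of-paris", "author", "Jed Buchwald"),
    ("http://example.com/zodiac-of-paris", "topic", "Egyptian inscriptions"),
}

def describe(subject, triples):
    """Collect everything the triple store asserts about one subject."""
    return {pred: obj for (subj, pred, obj) in triples if subj == subject}

facts = describe("http://example.com/zodiac-of-paris", triples)
print(facts["author"])
# The triples capture catalog facts -- but nothing in them can express
# what the book is actually about, the meaning that is not on the page.
```

Catalog metadata of this kind is genuinely useful; the fantasy is the leap from "we can state a book's author" to "we can express what a page means."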
In despair over their perception of the intellectual dishonesty of the Bush administration and the epistemic closure of the American Right, Jed Buchwald and Diane Greco Josefowicz wrote The Zodiac of Paris. It describes once-famous controversies in the early 19th century over an Egyptian inscription that suggested the world was older than Genesis allows.
The book is, in a very real sense, about the lies Curveball told Colin Powell, but that meaning is not on the page.
Steve Jobs matters
The American business press is obsessed with CEOs. If a stock increases, the firm's leaders are brilliant fellows. If shares plummet, the CEO must be a buffoon. Steve Jobs, once regarded as a fool, is now hailed as the one true software visionary, the indispensable force.
Jobs is, in fact, a good software critic and an executive who is willing to trust his judgment and endure the consequences.
The customer, the usability lab, or the marketplace will tell you what is good; crowds are wise
From the user-generated content of Wikipedia to mass recommendation systems and user-written product reviews, my colleagues assume that crowds are wise and that, on average, sensible opinions prevail. That this is often true is fortunate, but crowds can be wildly wrong.
Almost all software designers believe that customers, clinical studies, or the marketplace will reveal what works and what doesn't, but everything we know about art (and software is an art form) argues this cannot be right. Best-seller lists sometimes contain good books, but they list bad books aplenty. Popular movies are not always great.
We know that an intelligent critic can sometimes recognize a great work when she sees it. No individual's taste or judgment is infallible, but the marketplace is often wrong, too.
James Fallows is a national correspondent for The Atlantic and has written for the magazine since the late 1970s. He has reported extensively from outside the United States and once worked as President Carter's chief speechwriter. His latest book is China Airborne.