We survived the Millennium Bug. But just because we're no longer threatened by bad software from the 1960s doesn't mean we're safe from all the bad software of the 1990s. A look at why the world's fastest-growing industry routinely gets away with selling poorly made products
On February 15, two days before the launch of Windows 2000, the Internet was briefly roiled by the revelation that Microsoft's newest operating system was shipping with more than 63,000 "potential issues" -- bugs, in other words. Of these potential issues, as many as 28,000 would become "real" problems, according to a Microsoft in-house memo obtained by the online news outlet ZDNet. "Our customers do not want us to sell them products with over 63,000 potential known defects," Windows development leader Marc Lucovsky exhorted company programmers in the memo. "How many of you," he asked, "would spend $500 on a piece of software with over 63,000 potential known defects?"
Microsoft dismissed the ZDNet story -- though not for the reasons one might suppose. Although the company pointed out that many of the "defects" were not actual bugs so much as failures to achieve hoped-for improvements, its main defense was that software problems are routine, and Windows 2000 has no more than its share. "Bugs are inherent in computer science," a Microsoft representative explained to ZDNet. "All software ships with issues."
Oddly, Microsoft is right. Commercial software companies send their products to customers knowing that they are full of defects -- that's the reason people install programs and then immediately check the Internet for bug-fixing updates. And that is also the reason, or part of the reason, that software offers a rare example of a technology-based industry whose standard products are routinely derided by experts. Indeed, academic computer-science researchers quite commonly charge that on the whole commercial software is getting worse, not better.
Aeronautics engineers don't dismiss Boeing, pharmacologists don't knock Merck, but computer programmers regularly scoff at Microsoft -- and, in fact, at almost every other big software company. Commercial programs, software engineers say, are badly designed, overly complex, poorly functional, and marketed with amazing disregard for the consumer.
At first glance it seems implausible that most commercial software could be dreadful stuff. The notion that people will keep buying junky products appears to violate the principles of Economics 101. If software is bad, some companies should try to increase profits by producing better programs, and customers should be smart enough to buy them. The result should be continuous improvement -- the way that contemporary car paint is vastly better than the paint on cars twenty years ago. Instead, in the view of David E. Weekly, a programming instructor at Stanford, "software quality standards have been plummeting over the last decade or so."
Well-known principles of software design are increasingly being ignored, according to Steve McConnell, editor of IEEE Software and author of After the Gold Rush: Creating a True Profession of Software Engineering. In the crazy hurry typical of today's software firms, workers increasingly begin to write code without having worked out a final design. The result, McConnell argues, is that programmers spend most of their time fixing their own errors. When they get the number of bugs down to a (barely) tolerable level, they heave the software onto the market.
Why aren't dissatisfied consumers forcing companies to write better programs? Part of the reason, critics say, is that the continuing decline in software quality is masked by the continuing increase in hardware quality. Ever-faster microprocessors conceal the presence of ever-slower software. More important, the critics believe, many consumers simply don't know what good code is like -- they're akin to the middle-class folks in the late 1970s who bought bloated, inefficient Buicks and Thunderbirds because they had never driven superior Japanese or German cars. "Potential [software] buyers tend to believe that publishers ship buggy products, then provide only grudging assistance over perpetually busy phone lines," Jeffrey Tarter, executive director of the Association of Support Professionals, admitted in a recent paper. Nonetheless, consumers keep buying, because they believe there's no alternative.
Microsoft is far from being the only offender. The installation files of WordPerfect 5.1, issued in 1992, occupy about 8 megabytes; the equivalent files on version 8.0, which Corel released two years ago, are ten times bigger, but the program is no faster or more reliable. But because Windows is central to the computing world, its foibles have a disproportionate impact -- and attract special ire from programming experts. Windows 2000, which supposedly has more than 30 million lines of code, is regularly derided as "bloatware." Bloatware, as defined by the online "jargon file," is "software that provides minimal functionality while requiring a disproportionate amount of disk space and memory.... The term is very common in the Windows/NT world. So is its cause."
Programs like Windows keep expanding partly because each new version does more than its predecessor. And, of course, few object to increased functionality in and of itself. GNU Emacs -- an open-source program which the writer Neal Stephenson has described as a "thermonuclear word processor ... that outshines all other editing software in approximately the same way that the noonday sun does the stars" -- is even bigger than WordPerfect 8.0. (Emacs is widely used by programmers, as are a rival version called XEmacs and a leaner program called vi; all three are available on the Net for free.) Few users object to Emacs's size because the bulk represents muscle, not flab. As Stephenson put it, "The engineer-hours that, in the case of Microsoft Word, were devoted to features like mail merge, and the ability to embed feature-length motion pictures in corporate memoranda, were, in the case of Emacs, focused with maniacal intensity on the deceptively simple-seeming problem of editing text."
The current version of GNU Emacs, version 20.5, comes with many, many additional features -- you can more or less run your computer with it. But -- this is a key point -- you can remove the bells and whistles without breaking the program. By contrast, Microsoft's purpose in expanding Windows is to clamp the new pieces and the old into a seamless whole. Indeed, an important issue in the Microsoft antitrust trial is whether Internet Explorer can be extracted from Windows without crippling the rest of the operating system. Microsoft has sound business reasons for binding them together with hoops of steel. But to coders, the practice almost guarantees bad software -- an operating system that is needlessly slow and prone to crash.
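The contrast between removable features and welded-in ones can be sketched abstractly. The short Python fragment below (all names are invented for illustration; it is not taken from Emacs, Windows, or any real program) shows the Emacs-style architecture: a core that consults a registry of optional add-ons and keeps working when one is absent. A core that called its extras directly, Windows-style, would break the moment one was removed.

```python
# Hypothetical sketch of a loosely coupled "core plus add-ons" design.
# Removing a feature from the registry never breaks the core.

class Editor:
    """A core program whose optional features are looked up, not hard-wired."""

    def __init__(self):
        self.features = {}  # name -> callable; add-ons live here, not in the core

    def register(self, name, func):
        """Install an optional feature."""
        self.features[name] = func

    def run(self, name, *args):
        """Invoke a feature if present; degrade gracefully if not."""
        func = self.features.get(name)
        if func is None:
            return "feature not installed"  # the core survives the absence
        return func(*args)

ed = Editor()
ed.register("spell", lambda text: text.replace("teh", "the"))

fixed = ed.run("spell", "teh bug")  # installed feature works
missing = ed.run("mail-merge")      # never installed; no crash, just a shrug
```

The design choice is the whole point: because the core only ever asks "is this feature here?", any bell or whistle can be deleted without touching the rest of the program.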
When I tell my word processor to print or save a file, the program doesn't perform these tasks itself. Instead it asks the operating system to do the work. The interface between an application and the operating system is known as a "system call." Because system calls by their nature provide ways for applications to interact with the operating system, programmers believe that their number should be limited. The larger the number of system calls, the larger the number of potential unwanted interactions, and the larger the number of potential security problems, many of which are created by manipulating system calls. Version 2.0 of the Linux operating system appeared in 1998, at about the time Microsoft released Windows 98. Linux had 229 system calls, Diomidis Spinellis, a computer scientist at the University of the Aegean, calculated last year. By contrast, Windows had more than three thousand -- the result of constantly stuffing new things into the operating system. Computer scientists generally believe that Windows is less stable than Linux; the difference in the number of system calls is one reason why.
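The division of labor described above -- application asks, operating system does -- can be made concrete in a few lines of Python (chosen here purely for illustration). Python's `os.open`, `os.write`, and `os.close` are thin wrappers around the Unix open(2), write(2), and close(2) system calls, so a "save file" command reduces to roughly this:

```python
import os
import tempfile

# "Saving a file," written out as the system calls it reduces to.
# The application never touches the disk itself; each line below is a
# request to the operating system to do the work on its behalf.
path = os.path.join(tempfile.mkdtemp(), "demo.txt")

fd = os.open(path, os.O_WRONLY | os.O_CREAT, 0o600)  # open(2): ask the OS for a file
written = os.write(fd, b"hello, kernel")             # write(2): ask the OS to store bytes
os.close(fd)                                         # close(2): hand the file back

# Reading the file back confirms the kernel actually performed the writes.
with open(path, "rb") as f:
    contents = f.read()
```

Every one of those calls is a doorway between the program and the operating system -- which is why each additional system call an OS exposes is one more doorway to guard.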
An old principle of industrial design is that well-designed objects do not need instruction booklets. One glance at the computer-book section of a bookstore, replete with 800-page paperbacks, should suggest that computer software does not follow this principle. How many first-time Windows users have been baffled to learn that they turn off their computers by clicking a button marked "Start"? How many Mac novices have objected, when told to eject a floppy disk by dragging it to the "trash" icon, "But I don't want to erase the files, I just want to take my disk out"? (This bit of bad design has remained for sixteen years.) How many Netscape users have been startled to learn that "Go" on the menu bar refers to a list of recently visited Web sites? And why, for heaven's sake, is the first site on the Go list numbered 0, not 1?
According to Isys Information Architects, a software-design company in Greensboro, North Carolina, the "find" function in Windows 95/98 deserves a special place in the "interface hall of shame." When the Find window comes onscreen, it contains a space to type in the name of the file to look for. Unfortunately, the window also contains a menu bar -- File, Edit, View, Options, Help -- that is an almost complete waste of zeroes and ones. The Edit menu lets people Undo Copy and Select All, which would be more useful if there were anything to Undo or Select. The View menu permits changing the size of the icons -- or would if any existed. Annoyingly, the results window, which displays a list of the files discovered by Find, is just big enough to show four files. If Find turns up more files, the window must be resized. Unlike the rest of the operating system, the results window always returns to its original, four-file dimensions after it is closed, forcing the user to resize every time. "Interface designers refer to all this management as excise," Isys notes in a critique on its Web site. Excise is "all the piddly stuff you have to do just to get to the real purpose" of an application. Windows Find has so much excise, the critique says, that it may be the premier example "of how applications should NOT be designed."
Sherlock 2, the current Macintosh version of the "find" function, is not much better, as the complaints in Mac fora around the Internet make clear. It can do many things but it is difficult to learn how -- the interface is unlike the standard Apple interface, and the relevant help files are of Microsoftian uselessness. (The main Linux graphical interfaces, Gnome and the K Desktop Environment, are not developed enough to be worth critiquing.) Worse than Sherlock was Apple's 1999 version of QuickTime, its multimedia software. So loud were the criticisms of QuickTime 4.0 that one had to wonder: Why did Apple do this? For that matter, why did Microsoft bollix up Find?
In theory, consumers don't buy software. Instead, they purchase a license to use it. The license is the long legal document inside the shrink-wrapped box or the boilerplate that the user clicks through to download a program. Some of the terms in the licenses are ridiculous; the copy of Netscape 4.7 on my machine, for instance, is licensed to me on the understanding that I will not test the software and publish the results. But the most important term, present in almost all software licenses, explicitly disclaims any responsibility for the program working as advertised.
It seems fair to say that most people ignore the license. They're entirely correct to do so, because it has little legal import. The reasoning is complicated, but it can be summed up by saying that licenses are a kind of contract, and contracts whose terms cannot be negotiated -- contracts of adhesion, in the jargon -- are generally unenforceable.
Since 1995 the software industry has been pushing to alter the relevant law to make its licenses legal. The current version of its proposed legislation, the Uniform Computer Information Transactions Act (UCITA), was passed in Virginia in February and will soon come before the other forty-nine state legislatures. If a majority of the states pass the law in its current form, argues the lawyer and consumer advocate Cem Kaner, software companies may not even be liable for distributing programs with viruses.
UCITA also has implications for free speech. An anti-censorship group called Peacefire has been examining the lists of Web sites blocked by filtering programs, with the intent of demonstrating that they block access to many harmless sites. One such program is I-Gear, sold by Symantec, whose best-known product is Norton Utilities. When Peacefire revealed this month that the program erroneously blocked three-quarters of the .edu sites in its "Sex/Acts" category, Symantec threatened to sue. UCITA would give it a potent legal weapon.
UCITA should be stopped, Kaner believes. But the more important issue is reforming the industry that's pushing the law. Ultimately the only way to do that is for consumers to stop buying its products -- the industry will catch on. Today users often believe they have no choice but to accept what they're given, a posture reinforced by many corporate information-services departments. But it's not true. In fact, the information-services guys themselves frequently don't use the applications they inflict on others. A small step toward improving software might be to go into the back offices -- or search the Net, for that matter -- to see which software the experts use.
Charles C. Mann is a correspondent for The Atlantic Monthly and a frequent contributor to Atlantic Unbound.
Copyright © 2000 by The Atlantic Monthly Company. All rights reserved.