A FEW years ago I worked on a small publishing project in northern California. As often happens now, the project was entirely edited and laid out on a computer network -- four Apple Macintoshes in this case. Because the project had many pictures, which computers treat as very large files, we added a powerful new-model Mac to the network and a fancy laser printer that could produce photo-quality images. To our dismay, the improvements broke the system horribly. We tried to fix it by every method we could think of: fiddling with configuration screens, deleting and reinstalling software, combing through the Internet for advice, calling technical-support numbers, cursing at the machines while turning them on and off repeatedly. A hired computer consultant swaggered into our offices. Hours later he staggered out in defeat. Although we were not far from Silicon Valley, we couldn't find anyone who could fix our boxes.
By luck we eventually resuscitated our machines. But we still had no idea why they had so many problems, other than our incautious decision to revamp the network on the eve of a major deadline. Recently I have come to believe that the fundamental cause was one that hardly anyone would have named, even those few years ago: our software was proprietary. In other words, the manufacturers -- Apple, in the main, but also Xanté, which made the printer -- controlled the underlying instructions that made their products work, and kept them secret. Much as no technician could repair a computer if the case were sealed shut, neither we nor our consultant could lift the lid of the software and peer inside to see why our network was down.
This is rapidly changing. Pushed by the growth of the Internet and a new operating system known as Linux, many software companies are considering whether to allow unrestricted access to the underlying instructions -- the "source code," in the jargon -- of their programs. For the computer industry this is a turnabout. Until as recently as last year software companies almost invariably viewed source code as their single most valuable asset. Yet throughout Silicon Valley executives are discussing whether they will be forced, for the sake of corporate survival, to give away something they have always thought worth millions of dollars.
The notion is especially startling for operating systems. An operating system is the software that runs the hardware on a computer. When users of, say, Microsoft Word click on the PRINT button or type sentences, the program does not print the document or put letters on the monitor. Instead it passes the instructions to the operating system, which shuttles data from the hard drive to the printer or translates the keystrokes into onscreen characters. The two most familiar operating systems are Microsoft Windows and Mac OS, the Macintosh operating system. Microsoft and Apple are famed for their differences, but in one respect they are exactly the same: they have both used control of the source code to their operating systems as a competitive weapon. Indeed, Microsoft's competitors widely attribute its dominance to its lock on Windows. Nonetheless, Apple has already given programmers a small glimpse of its source code, and Microsoft has announced that it might do the same.
Opening source code is more than the latest news release in the software industry. It is the center of a small but increasingly influential movement -- one that aspires to transform the world. Enthusiasts dismiss the grudging moves by Apple and Microsoft as too little too late, and confidently anticipate that Linux will topple Bill Gates's empire. Many believe that its widespread adoption will greatly increase human freedom. Still others hope that Linux will reduce the gap between rich and poor nations.
No one can say whether any of this will come to pass, although the information age is sweeping society so rapidly and unpredictably that not even the unlikeliest outcomes can be dismissed. But it is clear that living with Linux, which I have been doing for a while now, is not at all like spending time with Windows or Mac OS. Whereas Windows and Mac OS are intended in part to shield users from their machines, Linux forces people to grapple with their relationship to technology -- an experience that was for me both salutary and disquieting.
I'M writing this article with software called XEmacs. The program (its name derives, circuitously, from an abbreviation for "Editor Macros") is unlike any other word processor I've ever encountered. In addition to cutting and pasting text, XEmacs can run other programs; send electronic mail; browse the World Wide Web; retrieve, edit, and send files across the Internet; and keep track of appointments. It's like a digital Swiss army knife. Many hackers (a term used by computer cognoscenti to refer not to teenage vandals but to expert programmers and network administrators) open XEmacs when they come to work, use it through the day for every kind of task, and close it when they go home. Macintosh and Windows adaptations of XEmacs exist, but the program is used mainly on other operating systems, especially Linux. XEmacs is a descendant of Emacs, most of which was written in the mid-1970s by Richard Stallman, a gifted hacker who has spent most of the past two decades working as a volunteer in the halls of the Laboratory for Computer Science at the Massachusetts Institute of Technology. When Stallman began coding, proprietary software hardly existed. Computer makers like IBM and Digital threw in the programs needed to operate their expensive machines -- the profits to be made were on hardware. In those long-ago days programmers freely passed their code around for colleagues to use, critique, and improve. Stallman thought of this as eminently sensible.
To many hackers, the rise of the commercial software industry came as an unpleasant surprise. Companies sell software in the form of zeros and ones on computer disks. Like a translated poem that cannot be retranslated accurately into its original language, these zeros and ones cannot reliably be translated back into the original source code. Programmers were thus shut out when software firms began treating source code as a trade secret. To Stallman, this was a moral outrage -- corporations were preventing people from learning about and improving their software. He decided to do something about it.
In 1984, Stallman founded the GNU Project, which worked out of donated offices at the MIT Laboratory for Computer Science. Its intent was to create a powerful and sophisticated operating system that was wholly free of proprietary restrictions -- the source code could be copied and changed at will. To ensure that the software could never be kept secret, Stallman invented the GNU General Public License, which copyrighted the source code and released it to the world on the sole condition that any modifications would be covered by the same license -- that is, could be freely copied and changed. This special form of copyright, Stallman said, should be called "copyleft."
GNU intended to duplicate the look and feel of Unix, which was then the most common operating system on big computer networks and the Internet. Invented in 1969 at Bell Labs (the precursor of Lucent Technologies), Unix has mutated into a dozen different proprietary versions, which today are available from IBM, Compaq, Sun Microsystems, and other companies. Stallman called his version GNU, for "GNU's Not Unix," an acronym that includes itself -- the sort of trick hackers love.
The goal was almost ludicrously ambitious. The heart of an operating system is its "kernel," the switchboardlike software that shuttles zeros and ones among the keyboard, the mouse, the monitor, the hard drive, and the microprocessors. But the kernel does not run the computer by itself: it works in concert with hundreds of supplementary programs, including drivers, utilities, programming tools, and window managers. To create a complete operating system the GNU Project would have to write not only the kernel but also the surrounding software -- hundreds of megabytes of code. It was like a group of friends deciding to build their own space shuttle in a basement.
Amazingly, the GNU Project did it -- or, rather, has almost done it. By the early 1990s hackers around the world were using GNU software, including "gawk," a programming language, and "groff," a document-formatting and printing system. But the project had not created a working kernel, partly because Stallman had decided not to pattern the GNU operating system on the Unix kernel but to build upon a stripped-down, ultraflexible kernel developed at Carnegie Mellon University and the University of Utah. Unfortunately, producing a kernel that was not only small but also adaptable enough to do many things turned out to be exceedingly difficult. The GNU kernel is still under development; the latest release, version 0.2, appeared in June of 1997.
In 1991 Linus Torvalds decided to write his own kernel. Torvalds, then an undergraduate at the University of Helsinki, had just bought a personal computer: a 386 with four megabytes of memory. Like many hackers, he disliked MS-DOS -- "MS-DOG," as GNU Web pages sometimes call it. But his computer was not powerful enough to run Unix. Scrambling together code from his instructors and merging it with his own work, he fashioned something like a Unix kernel. Because the GNU Project had put together the necessary subsidiary programs, he calibrated his kernel to work with them. The preliminary results appeared on the Internet in the fall of 1991. "Just a hobby," Torvalds explained in an accompanying note. "Won't be big and professional like GNU."
A few hackers downloaded his kernel, fiddled with it, and passed on the improvements, which Torvalds incorporated into the next version, which was downloaded by more hackers, who added still more improvements. The informal online collaboration snowballed Andy Hardy-style, until it included hundreds of computer jocks on five continents, all volunteering their time, all chipping in bug fixes and test results and new chunks of code. Torvalds had backed into creating a full open-source operating system similar to the one that Stallman had envisioned, incorporating all the available GNU software except the kernel. He called the system Freax, but a friend thought the name was silly and to Torvalds's embarrassment rechristened it Linux (Linn-uks, after "Linus"). Version 1.0 was officially released in March of 1994. It was copylefted under the GNU General Public License.
Today Linux has millions of users, the kernel has reached version 2.2, and Torvalds has become the most famous computer programmer on earth. When I went to hear him speak in Anaheim, in 1997, the audience of hackers reacted to his appearance on the dais with mystical fervor: "This is better than the Beatles coming to Shea Stadium," one told me. According to a study issued last March by the International Data Corporation, a market-research firm in Framingham, Massachusetts, that specializes in information technology, the use of Linux will grow by at least 25 percent a year for at least the next five years; Torvalds's not "big and professional" project, the study says, will be breathing down the neck of Microsoft Windows by 2003.
A simple cost comparison suggests the plausibility of this scenario. The basic version of Windows NT, Microsoft's operating system for computer networks, has a list price of $319 per user and requires a Pentium chip. A small start-up company -- not to mention libraries, schools, and other nonprofits -- must pay thousands of dollars to equip its workers. Linux, which is equally powerful, may be downloaded gratis, no matter how many people use it, and can run on older, 486-chip personal computers -- boxes that used-computer stores sell for a hundred dollars.
Linux may have its greatest impact on poor countries, which can use free software on cheap machines to create computer systems just as good as or better than the systems in rich countries. The executive director of Linux International, a nonprofit group devoted to promoting the operating system, is Jon Hall, a manager at Compaq, which has provided Linux on powerful high-end machines since 1995 and which recently announced a new line of Linux-loaded machines for small businesses. In his view, Linux has already broken the rich nations' monopoly on access to computing -- an economic mainstay of the information age.
Because open-source software doesn't have to be registered and can be shot across the Net at will, precise information on its use in poor nations is impossible to obtain. Nonetheless, glimpses of its spread suggest that Hall may be right. Third World hackers have made major contributions to Linux -- the first important computer-industry development in which they have been able to participate. Though Chad, Niger, Liberia, Equatorial Guinea, and the Central African Republic have few Internet hubs, every one of those hubs runs on Linux, according to a survey conducted in April using data from the European Internet Protocol Network (one of the three regional registries of Internet addresses). Last October the Mexican government announced plans to put more than a million Linux computers in its schools. And so on. "The next Albert Einstein of computer science might come from Korea, or China, or Venezuela, or even Helsinki, Finland," Jon Hall has written, "and the world cannot afford to miss the opportunity of finding them."
MICROSOFT Windows in its limited number of variants is sold by one company. Linux, in contrast, can be obtained from a dozen or more vendors. Because each vendor has adapted the operating system in its own way, and because the software itself is so labile, each Linux user will have a different experience. That's part of the point.
I began my own experimentation by downloading Linux free from a Web site: Debian.org. Debian, a loose collective of programmers, is positioned near one end of the arc of open-source software -- the end with the most geek credibility, because it is militantly devoted to the hacker ethos. Through no fault of Debian's, I didn't download all 800 megabytes of Linux from its Web site: I stopped after a few hours, because my noisy phone line kept losing the connection.
Debian's version of Linux is offered by online retailers; CheapBytes.com sells the current release, Debian 2.1, on CD-ROM for $2.49 plus shipping and handling. But after several download failures I was too grumpy to wait for it to arrive in the mail. I drove to CompUSA, the computer superstore nearest my house, where to my surprise I found at the end of one aisle a six-foot rack of Red Hat Linux products. Red Hat is on the opposite end of the arc from Debian. Although it follows the open-source rule book, the company aggressively markets itself, and Compaq, IBM, Intel, and Novell are investors. Red Hat Linux on CD-ROM sells for $39.95 (a deluxe version costs $79.95): less than Windows, but an order of magnitude more than Debian.
AUTHOR'S NOTE (Web-only): Actually, they weren't exactly Red Hat Linux products. The software that I bought was labeled as "The Complete Red Hat Linux Operating System 5.2 Deluxe." It turned out, however, to be produced not by Red Hat itself but by MacMillan Digital Publishing, a branch of Viacom. Red Hat is known for providing excellent technical support to novices -- one reason it charges more than, say, Debian. Unfortunately, the MacMillan package did not entitle me to support from Red Hat; customer hand-holding had been outsourced to RomNet, a Boston customer-service firm. RomNet's "corporate background" Web page touts its expertise in Microsoft products but makes no direct mention of Linux -- an omission that after two fruitless attempts to seek help I came to regard as an instance of corporate honesty.
Most people who buy a Linux CD-ROM plan to install the software on a machine that is already running Windows or Mac OS, and want to keep working with their old programs; Red Hat Linux includes a program called "LILO" -- for "Linux Loader" -- that lets such users switch between operating systems. I planned to install Linux on an old 486-50 that my family uses to store aged but potentially useful material.
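In practice LILO is driven by a small configuration file that tells the machine which systems it can start. The sketch below is a hypothetical minimal /etc/lilo.conf of the era; the device names and partition layout are assumptions for illustration, not a description of any particular machine.

```
# /etc/lilo.conf -- a minimal dual-boot sketch (device names are assumptions)
boot=/dev/hda            # install the loader in the master boot record
prompt                   # ask the user which system to start
timeout=50               # after five seconds, boot the first entry

image=/boot/vmlinuz      # the Linux kernel
    label=linux
    root=/dev/hda2       # the partition holding the Linux system
    read-only

other=/dev/hda1          # the existing Windows partition
    label=windows
```

At the boot prompt, typing "linux" or "windows" picks the corresponding entry -- the switching that the paragraph above describes.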
Because I had souped up the computer with new components over the years, I expected trouble. It arrived within seconds of inserting the start-up floppy disk. The installation program looked for the Linux CD-ROM and couldn't find it, because it didn't recognize the Trantor adapter in my NEC CD-ROM drive. A search on the Internet informed me that Trantor had been bought by Adaptec, which no longer supported my adapter. I also found several futile pleas by Linux users for help with their Trantor adapters. So I went to a used-computer store and bought a ...
There's little point in continuing. Installing Linux in an old Windows or Macintosh box is like putting a new clutch in a car. Some people like this sort of thing and some don't. I like it more than many people I know, though I'm not particularly good at it. People who don't like to tinker should either pay someone else to install Linux or, better, obtain a brand-new computer with Linux already on its hard drive, which is what I did next.
I borrowed a computer from VA Linux Systems, of Sunnyvale, California, the biggest Linux-only computer maker. As a graduate student in engineering at Stanford, Larry Augustin, the company president, discovered that by loading Linux on a 486 PC he could make a powerful workstation for about a quarter of the prevailing price. One thing led to another, and in 1993 VA Linux Systems was born (the name comes from the initials of Augustin and James Vera, a former partner). Augustin helped his fellow students David Filo and Jerry Yang write the initial business plan for Yahoo! -- an Internet-search company with a current stock valuation of more than $30 billion. Augustin is not yet a billionaire, but he assured me that Linux will grow so much that VA "will be bigger [than Yahoo!] in the end." Indeed, he said, its monthly sales tripled from November to March.
When Donna Sokolsky, the public-relations person for VA, asked why I wanted the computer, I explained that I hoped to see what Linux would be like if bought in the same way that most people buy Windows or Mac OS -- pre-loaded on a machine. Sighing, Sokolsky said something like "Well, you have to understand that Linux isn't really ready for the casual user yet." This turned out to be not completely true. In certain situations Linux is better suited to non-experts than are other operating systems -- something about which, upon reflection, I could not feel entirely happy.
LINUX, Torvalds warned early on, "is a program for hackers by a hacker." Indeed, Unix, the inspiration for GNU and Linux, is exactly the kind of software that makes the uninitiated feel stupid. Militantly unintuitive, it requires mastering dozens of telegraphic commands such as "egrep," "ping," and "gzip," each of which can be modified by terse options to act in any number of ways -- an approach familiar to anyone who remembers MS-DOS. Novices can easily find it daunting. One of the help files distributed with Linux unblushingly gives, as a "realistic example," the following command to decompress all the compressed files in the current directory without erasing any of them:
find . -name '*.gz' -print | sed 's/^\(.*\)\.gz$/gunzip -c "\1.gz" > "\1"/' | sh
How could anything that requires people to learn this gibberish be gaining in popularity?
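Part of the answer is that the gibberish is denser than it looks: each piece is an ordinary command modified by options, and the options stack. A minimal sketch of the same style, using an invented throwaway file (the file name and its contents are made up for illustration):

```shell
# Create a small test file, then count (-c) the lines that match "error"
# without regard to case (-i). Each added flag changes the command's behavior.
printf 'Error: disk full\nall well\nERROR again\n' > /tmp/demo.log
grep -ci error /tmp/demo.log    # prints 2
```

The same pattern -- a terse verb plus modifying flags -- underlies the decompression one-liner above.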
Hackers prefer typed commands to point-and-click menus, because these usually permit better control of the machine; they like open-source software because they can add options as needed. The purpose of Windows and Mac OS is to create a seamless experience for the user, which is usually accomplished by gluing together separate pieces of software. Examples of this agglomerating tendency abound, including Microsoft's attempt to meld Internet Explorer, its Web browser, into Windows -- which led to the antitrust battle that began last fall. Conceptually more important was the decision, first by Apple and then, imitating it, by Microsoft, to fuse the operating system with the graphical user interface -- the software that lets people run the operating system by clicking the mouse on windows on a computer "desktop." With Macintoshes there are no typed commands. Grudgingly, almost covertly, Windows does permit typed-in commands, but Microsoft supposedly has been trying to eliminate the DOS command line in the next versions.
This is abhorrent to Linux partisans. From their point of view, the best practice is to separate the operating system from the user interface and applications, so that if one application freezes up, the system won't crash. (IBM's challenger to Windows, the lamented but for-hackers-only OS/2, rigorously maintained this separation, and its safety made it the object of a cult.) I have terrible Netscape karma, for instance -- Netscape crashes when I'm anywhere near it. With Windows, Netscape usually takes down the whole computer; when I try to exit the frozen program, I end up having to reboot. With Linux the program is just as badly behaved, but I can kill it without bringing everything else down with it. Indeed, I have not yet managed to lock up Linux, despite giving it, in my incompetence, plenty of excuses to go belly-up. My Windows box dies every day or so; Mac OS, when I use it, rarely lasts a day without needing a reboot.
For all its obvious advantages, Linux strikes most people as hard to use, because they don't want to learn typed commands. To bring Linux to the masses, programmers around the world have spent years developing a point-and-click interface -- an effort that is now paying off. The attempt to make Linux (and Unix) easier to use began with a project called, variously, X Window, X11, and simply X. Launched in the 1980s as a joint project between MIT and Digital, X is free software that provides a way for Linux computers to draw graphics on the monitor -- but, in the spirit of the operating system, without requiring any particular method of putting up windows. That is left up to the "window manager" -- software that lays out how the onscreen windows look and act.
More than half a dozen Linux window managers are available, all of them with features that are so obviously good ideas that the failure of Microsoft and Apple to imitate them is baffling. In every one I know, users can have multiple desktops, so that instead of piling windows higgledy-piggledy atop one another, users can distribute them among desktops. With a single click of the mouse, users can instantly switch from one desktop to another.
The latest step in the campaign to domesticate Linux has been to create "desktop environments," software that adds features to the window managers to let users control and configure their computers. Again, modular construction: the environment rides on top of the window manager, which rides on top of the graphical user interface, which rides on top of the operating system. As with Windows and Mac OS, Linux desktops have icons that users click on to run programs, "drag-and-drop" features that permit people to move text and files with the mouse, and the equivalent of taskbars, called "panels" or "pagers," which list running programs and desktops. Unlike the Windows desktop, Linux desktops do not force people who want to turn off their computers to click a button labeled START. Unlike the Mac OS desktop, Linux desktops do not ask users to eject floppy disks by dragging them to the trash -- the same action used to erase files.
In the past year alone two competing groups of programmers have released desktop environments (KDE and GNOME, both available in Red Hat 6.0) that are graphically sophisticated and far easier to customize than their proprietary cousins. Although neither eliminates the need to type in commands and hand-configure Linux, each makes Linux ideal for schools and small businesses -- places that are likely to have both computer-administration staff and tight budgets. Companies that need feature-rich software should beware: the two Linux "office suites" I have tried, though adequate for most home users, are still poor relations to those put out by Microsoft and Corel. But soon -- in months, not years -- computer makers will be selling "turnkey" Linux systems with fully configured desktops and pre-installed office applications.
When turnkey Linux systems become widely available, Stallman will have achieved much of his preposterous goal. No single person, company, or institution will own the software through which people interact with computers. The accomplishment will be remarkable. And yet turnkey systems -- machines that require no choices or understanding from the user -- fit uncomfortably with the ideals of freedom that began the GNU Project and the free-software movement.
WHEN Linux boots up, it sprays dozens of cryptic messages on the monitor, informing the user precisely what is happening inside the computer. The messages exemplify the transparency hackers like about Linux: it permits complete control of the machine. In our technological era control of computers is an increasingly valuable kind of freedom. Sensing this, people readily distrust the corporations that make technological choices for them -- one reason that Microsoft will never be loved, despite the many economic benefits it has conferred on the nation. At the same time, the line of boot-up messages stuttering down the monitor makes clear that this freedom comes at a price -- a learning curve that can be long and frustrating. VA Linux Systems lent me a computer that far surpassed my modest needs, not that I complained. Intended for use as a central computer in a network, it came with nothing so primitive as a modem. Wanting to connect to the Internet, I once again drove to CompUSA and bought a modem. When I got home, I was confronted with the task of figuring out how to hook it up -- the maker provided no instructions.
I spent a day or two puttering through the help files, newsgroup archives, and books. But just when I was ready to get down to work connecting the modem, I lost all interest. I was suffering from what the writer Neal Stephenson calls "geek fatigue" -- the desire to stop learning about, and fiddling with, a computer. (Last spring, in conjunction with the publication of his novel, Cryptonomicon, Stephenson made available a long, shrewd, funny online essay about operating systems, "In the Beginning was the Command Line.")
I wanted somebody else to solve my problem. I called the nearby university and asked the computer center if I could pay someone to assist me. Within ten minutes I received an E-mail offering help. After his calculus test Dan came over. It was a humiliating cliché: an adult getting computer help from an eighteen-year-old. I loved it. Dan quickly found the dumb mistake I had made, fixed it, and amiably showed me a few other things. I paid him fifty dollars.
My ideas about liberty were unexpectedly challenged when Dan asked whether I wanted him to make an account for himself on my machine; if I had further problems, he could dial in and resolve them from his home. VA technical support made a similar offer: my machine came with a "back door" -- a pre-installed route by which company technicians could take over my computer and inspect the contents. Even if I asked Dan or VA to maintain my computer, it would be effectively impossible for anyone to enter it without my permission. But I would still be surrendering control.
Every contemporary operating system allows outside supervisors to control a user's machine, but with Linux the control is especially complete. Supervisors aren't limited by the choices left to them by Microsoft or Apple: they can do exactly what they want. If bosses don't want employees to download pictures from the Internet, that can be arranged. If they want alarms to sound when employees show signs of inefficiency, that, too, is possible. Every keystroke can be recorded and analyzed.
As is always the case with technology, the only way to control its impact is to understand it. That can't be done for every innovation -- look at the "12:00" blinking on videocassette recorders across the nation. Computers have become too important to be unquestioningly turned over to others to run. But they are a tremendous burden to maintain without help. Living with Linux has increased my awareness of the control exerted by computers even as it has stripped away my easy certainty about how much value I attach to liberty. Richard Stallman and Linus Torvalds may have launched a successful campaign to unseal every computer, but it is a pending question how many people will want to look inside.
Charles C. Mann is a correspondent for The Atlantic. His most recent book is (1997), written with David Freedman. Mann's article in this issue is the first in a series of technology pieces that he will write for The Atlantic in the coming months.
The Atlantic Monthly; August 1999; Living With Linux - 99.08 (Part Two); Volume 284, No. 2; page 80-86.