The Computers of Tomorrow
In the past two decades, thousands of computers have been applied successfully in various industries. How much more widespread will their use become? MARTIN GREENBERGER, who is associate professor at the School of Industrial Management of M.I.T., has been working with computers for fourteen years.
NINETEEN years ago, in the July, 1945, issue of the Atlantic, Vannevar Bush predicted that the “advanced arithmetical machines of the future” would be (a) electrical in nature, (b) far more versatile than accounting machines, (c) readily adapted for a wide variety of operations, (d) controlled by instructions, (e) exceedingly fast in complex computation, and (f) capable of recording results in reusable form.
Tens of thousands of computers have been perfected and successfully applied in the past two decades, and each one attests to the remarkable clarity of Dr. Bush’s vision. Few of his readers in 1945 could have imagined the major strides that were about to be made in computer technology. Dr. Bush himself was only extrapolating from the technology of the time in these particular predictions. He did not assume the concept of internally stored programming, described by John von Neumann the following year; nor did he bank on the perfection of electronic logic, magnetic cores, and transistors. Yet, in a functional sense, his predictions scored a virtual bull’s-eye.
Only a decade ago, in 1954, a UNIVAC was delivered to the General Electric Company in Louisville for business use. Up to that point, computers had been applied almost exclusively to scientific calculation. Quickly, payroll, inventory, and customer accounting became fair game. Today there are probably more than twenty thousand computers in use within the United States, and correspondingly large numbers are installed in many other countries around the world. Computers run at speeds of up to millions of operations per second, and do so with negligible rates of error. Their linguistic abilities have been broadened impressively through development of elaborate programming systems, and their memories can be virtually unlimited in size over a range of times of recall.
By achieving reliability along with capability, computers have won broad commercial acceptance. But what of the future? What can we expect as computers enter their third decade? Some conservatives have been predicting a deceleration of computer growth for at least five years now. Is there a plateau just over the horizon?
Not if a recent turn in computer research is as significant as many of us believe it to be. General economic and political conditions permitting, this work will nourish a new wave of computer expansion. Computing services and establishments will begin to spread throughout every sector of American life, reaching into homes, offices, classrooms, laboratories, factories, and businesses of all kinds.
ANALOGY WITH ELECTRICITY
The computing machine is fundamentally an extremely useful device. The service it provides has a kind of universality and generality not unlike that afforded by electric power. Electricity can be harnessed for any of a wide variety of jobs: running machinery, exercising control, transmitting information, producing sound, heat, and light. Symbolic computation can be applied to an equally broad range of tasks: routine numerical calculations, manipulation of textual data, automatic control of instrumentation, simulation of dynamic processes, statistical analyses, problem solving, game playing, information storage, retrieval, and display.
Within reasonable limits the user is assured that electrical energy will always be available to the extent required. Power failures and overloading are relatively infrequent. Ten years ago an analogous statement for computation would have been a misrepresentation. Error rates in the computer were precariously high, and service was uncertain by any standards. Today, however, improved components have all but eliminated reliability as a consideration in the use of computers. Overloading is still a problem, but this is mostly a consequence of burgeoning demand.
Where, then, does the analogy with electrical energy break down? Why has automatic computation not pervaded industry as electricity has done? Is it simply a matter of time, or do the differences between the two, by their nature, enforce a permanent disparity?
The first difference that comes to mind is cost. Three pennies keep a large electric light bulb burning all night, and they buy about thirty thousand additions or subtractions or other elementary computations at current large-computer rates (omitting overhead, communication, and programming expense). This is enough computation to balance a large number of monthly bank statements, and at face value seems to compare very favorably with the equivalent amount of electricity. Furthermore, the cost of computation has been decreasing steadily, whereas electric rates have been stable for over twenty years now.
But a complication arises when we try to distribute small chunks of computation widely on a regular basis. The electric utility finds it easy to accommodate numerous customers consuming as little as 1 kilowatt-hour or 1 watt-hour at a time. It does not even have to charge a premium for the privilege of using small chunks if the total monthly consumption of a customer is large enough.
Not so for computation, as indicated by present experiments with computer systems that share their time among a number of concurrent demands. These experiments, while demonstrating the feasibility of making a conventional computer accessible to many small remote users simultaneously, also demonstrate the sizable hidden cost of such service. Overhead in supervising user programs, as well as in shuffling them around memory, can increase actual costs to several times the figure implied by a naive analysis based on more conventional computer techniques. But today’s computers were not built to be time-shared. With a new generation of computers, overhead of the kind mentioned may shrink to relative insignificance.
Electrical power is immediately available as soon as it is requested, no matter how much power (up to predefined limits) is being drawn. In the timesharing experiments, on the other hand, some of the longer requests for computation are delayed excessively during periods of heavy demand. Certain classes of use can tolerate delay more than others, so it is not mandatory to eliminate it completely. Since the delay is caused largely by the heavy (free) loading on present time-shared systems, it is reasonable to expect alleviation of the problem, at least in the business world, not only from better computer systems but also from the institution of price schedules based on amount and type of use.
The analogy of automatic computation with electrical power is subject to three major qualifications. First, to get electricity, we simply reach over and flip on a switch or insert a plug into an outlet; computers, by contrast, seem complex, forbidding, and at a distance from most potential users, both in space and time. This condition has been improving, but much work remains to be done.
Second, a wide variety of appliances, bulbs, machinery, and miscellaneous electrical equipment has been invented and perfected to harness electrical power for its various uses; each piece of equipment has its function built right into it, and each couples to its power supply in more or less the same way. But the general-purpose computer performs almost its entire repertoire all by itself, once it has been programmed appropriately, and employs its terminal equipment primarily for the entrance, exit, or temporary storage of information, and for little else. The difference will diminish as more special-purpose terminals are designed for use in conjunction with large memories and fast processors. Whether it will ever disappear entirely is doubtful, but it is worth noting that the development of most electrical appliances came well after the realization of electrical distribution equipment.
Third, electricity is a relatively homogeneous product, produced centrally and transmitted without interruption and without intelligent guidance by the consumer. Computation, on the other hand, is dynamic in form, and its course is typically guided by action of the user. The two-way dialogue and information feedback characteristic of on-line computation is totally absent from the electrical side of the analogy.
These three qualifications by no means kill the dream of large utilities built around the service of computing systems, but they do raise interesting uncertainty about how this dream will materialize.
THE INFORMATION UTILITY
The concept of an information-processing utility poses many questions. Will the role of information utilities be sufficiently extensive and cohesive to create a whole new industry? If so, will this industry consist of a single integrated utility, like American Telephone and Telegraph, or will there be numerous individual utilities, like Consolidated Edison and the Boston Gas Company? Will the design and manufacture of computing components, terminal equipment, and programming systems be accomplished by subsidiaries of the information utility, as in the telephone industry, or will there be a separate industry of independent private manufacturers, like General Electric and Westinghouse in today’s electrical equipment industry?
Perhaps the most important question of all concerns the legal matter of government regulation. Will the information utility be a public utility, or will it be privately owned and operated? Will some large companies have their own information utilities, just as some companies today have their own generating plants?
Central to all these questions is the matter of cost. Computation, like electricity and unlike oil, is not stored. Since its production is concurrent with its consumption, production capacity must provide for peak loads, and the cost of equipment per dollar of revenue can soar.
The high cost of capital equipment is a major reason why producers of electricity are public utilities instead of unregulated companies. A second reason is the extensive distribution network they require to make their product generally available. This network, once established, is geographically fixed and immovable. Wasteful duplication and proliferation of lines could easily result if there were no public regulation.
Given the advanced state of development of present communications lines, it is unlikely that information utilities will wish to invest in their own communication networks. This may be taken as an argument against the necessity for stifling free competition and placing information utilities under public regulation; yet, there is another massive investment that the information utilities will not be able to sidestep as easily, if at all — namely, investment in the large programming systems required to supervise the operation of the information utility and provide its services. The information utility should be able to shift part of this burden to the shoulders of its customers, but it will have to bear responsibility itself for the design, maintenance, and modification of the core of the programming system. The vast potential magnitude of this system, plus the fact that its usefulness may not extend beyond the physical machinery for which it was constructed, plus the possibility of programming waste from having too many entries in the field, may tip the balance in favor of a regulated monopoly.
In summary, a very substantial amount of capital is needed in the development of information utilities, capital to furnish both equipment and programming. Thus, even if no new communication lines of a proprietary nature are required, the public-utility format may still prove to be the best answer. On the other hand, one very persuasive reason for the private-company format is the stimulating effect of free enterprise and competition on imagination and hard work — vital prerequisites for realization of the information utility.
Whichever way the balance tips, it is clear that information utilities will be enterprises of considerable size. If they form an industry of private companies, then the industry probably will be dominated by one or two firms of giant proportions. Logical candidates among existing companies include not only the large communication and computer enterprises, but also the big computer users.
BETTER THAN MONEY
The organizational impact of the information utility will extend well beyond the one or two industries directly concerned. User industries, such as banking and retailing, may also be greatly affected. Suppose, for example, that businesses of all sizes have simple terminals linking them electronically to a central information exchange. Then each business can make instantaneous credit checks and offer its customers the convenience of universal credit cards. These cards, referred to by some as “money keys,” together with the simple terminals and information exchange, can all but eliminate the need for currency, checks, cash registers, sales slips, and making change. When the card is inserted in the terminal and the amount of the purchase keyed in, a record of the transaction is produced centrally and the customer’s balance is updated. A signal is transmitted to the terminal from the central exchange if the customer’s balance is not adequate for the sale. Positive credits to the customer’s account, such as payroll payments, benefits, dividends, and gifts are entered in a similar way. Periodic account statements are figured automatically and delivered to customers, perhaps directly to a private terminal for some, or by postal service for others.
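The transaction flow just described — debit on purchase, refusal signal on insufficient balance, credits entered the same way — can be sketched as a few lines of modern code. All names here (`CentralExchange`, `purchase`, `credit`) are illustrative inventions, not part of the article's proposal.

```python
# A minimal sketch of the central credit exchange described above.
# Class and method names are hypothetical, chosen for illustration.

class CentralExchange:
    def __init__(self):
        self.balances = {}      # account number -> current balance
        self.transactions = []  # central record of every transaction

    def purchase(self, account, amount):
        """Debit a purchase keyed in at a merchant terminal.

        Returns True on success; False is the signal sent back to the
        terminal when the customer's balance is not adequate for the sale.
        """
        if self.balances.get(account, 0) < amount:
            return False
        self.balances[account] -= amount
        self.transactions.append((account, -amount))
        return True

    def credit(self, account, amount):
        """Enter a positive credit such as a payroll payment or dividend."""
        self.balances[account] = self.balances.get(account, 0) + amount
        self.transactions.append((account, amount))
```

Because every debit and credit passes through one central record, the periodic account statements the article mentions reduce to reading back `transactions` for a given account.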
Any number of variations on this theme are conceivable, up to and including the virtual disappearance of our traditional media for commerce. The savings resulting from eliminating the physical handling and flow of money, as well as the clearing and transfer of checks, would justify a considerable expenditure for electronic equipment.
Secondary benefits might include the semiautomatic preparation of income tax returns and the automation of most bill collection. Incidentally, we can look forward in the process to displacing another class of manual labor: miscellaneous thieves who prey on money. The increased possibilities for embezzlement through fraudulent accounting may attract some of the resulting unemployed, but there are ways that the computer can be deputized to police its own operation, quietly and without danger of corruption.
Insurance is another staid industry whose way of doing business could change more than some may realize. Insurance policies are sold by agents at present from a relatively fixed, relatively small number of plans formulated by the actuarial department of the insurance company. Suppose all the actuarial figures on which these plans are based, together with other relevant statistics, are brought together in the store of a central computing system, and on-line terminals are placed at the company’s field offices. Then there is no reason why policies cannot be custom-tailored to each prospect’s needs and characteristics as a regular service. Personalized insurance would have considerable marketing appeal, and offers several subtle advantages. At least one of the very large insurance companies is already taking steps in this direction. Equitable Life is reputed to be planning a telephone link of 114 typewriter terminals, located at field offices and operating departments, with a central computing system at the home office. The magnitude of the project is estimated at $12 million and 5 years’ duration.
With personalized insurance, the rates of premiums can be made to vary with the company’s changing inventory of policies and insureds. Thus, a continual control over aggregate risk can be maintained. Since premiums are based on a much more complete description of a prospect than at present, there is less need for grouping of essentially different risk categories into the same premium class. Approximately 50 percent of the insureds (the less risky half) would receive better rates from personalized insurance than from insurance offered by competing companies that operate with fixed plans. As a result, there would be a gradual drift of more profitable (less risky) customers over to personalized insurance. Thus, the rates could be made still more favorable, and the competitive margin would grow.
A final advantage of personalized insurance is the ease with which a customer can trade up or down. As the customer’s family expands, as his children approach college age, as they become self-supporting, as he approaches retirement, and so on, his insurance requirements change. At any time he can go to the nearest personalized terminal and key in information on his current insurance portfolio and on the adjustments he wishes to make. Within minutes he receives an indication of the differential premium due or saved, and this permits him to decide whether to trade. An agent can act as intermediary if self-service turns out to be unprofitable; or the computer may be able to sell its own insurance policies via persuasive discourse with the customer.
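The "differential premium" reply the terminal returns is, at bottom, a difference of two rate computations. The sketch below assumes a toy rate function standing in for the company's actuarial tables; both function names and the rate coefficients are hypothetical.

```python
# Illustrative sketch of the differential-premium computation: the
# customer keys in a proposed change in coverage, and the terminal
# replies with the premium due or saved. The rate function is a
# hypothetical stand-in for the company's actuarial tables.

def personalized_rate(age, coverage):
    """Hypothetical annual premium: a base rate plus an age loading,
    both per dollar of coverage."""
    return coverage * (0.005 + 0.0002 * age)

def differential_premium(age, current_coverage, proposed_coverage):
    """Premium due (positive) or saved (negative) for the adjustment."""
    return (personalized_rate(age, proposed_coverage)
            - personalized_rate(age, current_coverage))
```

A real system would of course draw on the full actuarial description of the insured, but the shape of the answer — price the proposed portfolio, price the current one, report the difference — is the same.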
Certain people who are intimately familiar with the workings of the New York Stock Exchange see no reason why its entire operation cannot be automated. Their thoughts go well beyond the mechanization of quotations and reporting procedures that is currently in progress. These persons find no real need for the floor specialists, for example. They believe that the computer could be programmed to maintain at least as stable and fluid a market as the specialists maintain, and serve at least as well in the public interest. Readers of the recent SEC staff study on the security markets will appreciate immediately some of the potential benefits of eliminating specialists, over and above the tangible savings in commissions and paper flow.
Every investor has a “seat” on the computerized exchange, and even brokers become dispensable (although they, like insurance agents, may remain as the most deep-rooted of present institutions). Transactions are handled by an information utility which feeds customer orders directly to the computer system, keeps book, makes a market, and collects commissions on each transaction. Similar arrangements are possible for the other security and commodity markets, regardless of size, as well as for bond trading, mutual-fund sales, and so on.
A St. Louis broker has suggested the formation of a National Trading Corporation to automate the quoting and trading of securities in the over-the-counter market. His proposal could provide a first step. Operation of the computerized security exchange ties in naturally with operation of the central credit exchange. Transactions on the security exchange can be preceded by checks on the appropriate accounts of the credit exchange and result in adjustments to these accounts. Margin allowances made as part of the normal operation of the credit exchange permit a tighter watch over excessive borrowing and other violations than is now possible.
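The coupling of the two exchanges amounts to a pre-trade check against the buyer's credit-exchange account. A minimal sketch, assuming a single flat margin requirement (the 70 percent figure is illustrative only, not from the article):

```python
# Sketch of the pre-trade margin check tying the security exchange
# to the credit exchange: a purchase is executed only if the buyer's
# credit-exchange balance covers the required fraction of the price.
# The function name and the 70% requirement are illustrative.

MARGIN_REQUIREMENT = 0.70  # fraction of the purchase the buyer must cover

def margin_check(cash_balance, purchase_price):
    """Allow the trade only if the buyer can cover the required margin."""
    return cash_balance >= MARGIN_REQUIREMENT * purchase_price
```

Because every order passes through this gate before execution, excessive borrowing is caught at the moment of the transaction rather than discovered afterward, which is the "tighter watch" the paragraph above describes.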
Computer-managed markets working together with computer-regulated credit may sound more than a bit Orwellian, but the potential for good from this merger is enormous. Unregulated credit in the purchase of securities was one of the chief factors that contributed to the severe decline in stock prices of May, 1962, just as heavy margin positions in the twenties sealed the lid on the 1929 debacle. With the information utility keeping a vastly expanded and mechanized Federal Reserve type of scrutiny and control over the flow of credit and the operation of markets, the United States could be within an arm’s length of stabilizing the behavior of its economy, an elusive goal that is almost as old as the economy itself.
The range of application of the information utility extends well beyond the few possibilities that have been sketched. It includes medical-information systems for hospitals and clinics, centralized traffic control for cities and highways, catalogue shopping from a convenience terminal at home, automatic libraries linked to home and office, integrated management-control systems for companies and factories, teaching consoles in the classroom, research consoles in the laboratory, design consoles in the engineering firm, editing consoles in the publishing office, computerized communities. Different subscribers to the same information utility will be able to use one another’s programs and facilities through intersubscriber arrangements worked out with the utility on a fee basis.
As more and more of these services are perfected, an increasing percentage of the day-to-day functioning of man, the economy, and society will become documented and mechanically recorded in easily accessible form. It will no longer be necessary to conduct costly surveys and door-to-door interviews to acquire data on consumer tastes or investment behavior, at times only to find that the data are inappropriate or anachronistic for the needs of research. Research investigators will specify their precise data requirements and will requisition custom studies from the files of the information utility. The studies will be timely and current, and a great boon to analysts and simulators. As their use develops, these data studies will be invaluable for corporate decision-making and government planning, to the point where they may be woven into the very fabric of these processes. It is not a mere flight of fancy to anticipate the day when information automatically acquired during the operation of the information utility feeds directly into decision mechanisms that regulate the economy and the activity of companies.
The information service may be conducted by the information utility itself, by a subsidiary, or by one or more of the subscribers. The information service represents a profitable and natural fulfillment of the utility’s role and function. Revenue is created by the utility on both ends of the data line — for example, in the production of sales data, when the utility can charge for making a money transaction unnecessary; and again in the marketing of this same data, when the utility can charge for providing detailed information that would be costly and difficult to obtain any other way.
Among the chief potential users of custom information are persons engaged in simulation studies and dynamic modeling. Simulation is about the most promising approach known for the general analysis of complex systems and stochastic processes. On the operating level, it affords the user a way of asking the question, what if. The use of simulation by staff specialists, systems analysts, decision makers, social scientists, and others will markedly expand as the information utility makes powerful computers and programming systems easily accessible.
Most users of simulation will not have the knowledge or desire to build their own models, especially as simulation starts being applied by line managers and operating personnel. Assistance in the formulation, adjustment, and validation of models will be provided by an on-line simulation center, joined by the information utility to both the users and the relevant information sources. Simulation service, like information, will be obtained by a procedure as simple as dialing a telephone number.
A simulation service could be of great value as a proving ground for development of an early form of information utility, and could provide a bootstrap for further refinement of the utility. Each contemplated service could be designed by successive approximations, simulated, and revised before it is instituted. This is especially important for a service such as the automated stock exchange, where design errors can cost millions of dollars and experiments on the real system are impractical. In addition, a working prototype of the exchange, displayed by the simulation service, could persuade the doubtful and the wary.
Barring unforeseen obstacles, an on-line interactive computer service, provided commercially by an information utility, may be as commonplace by 2000 A.D. as telephone service is today. By 2000 A.D. man should have a much better comprehension of himself and his system, not because he will be innately any smarter than he is today, but because he will have learned to use imaginatively the most powerful amplifier of intelligence yet devised.