It's Man v. Machine on Jeopardy this week as IBM super-robot Watson
takes on former champions Ken Jennings and Brad Rutter. At The
Atlantic, we're using Watson as an occasion to think about what smart
robots mean for the American worker. This is Part Three of a three-part
series on the exciting and sometimes scary capabilities of artificial
intelligence. Read Part One --Anything You Can Do, Robots Can Do Better -- and Part Two -- Can a Computer Do a Lawyer's Job?
Since the beginnings of the personal computer industry, computer hardware sales have often been driven by a particular software application so compelling that it has motivated customers to purchase the machine required to run it. When the Apple II was introduced in 1977, it was initially a success within a relatively small group of computer hobbyists. It wasn't until the first electronic spreadsheet, VisiCalc, was developed that the Apple II began to generate wider interest. VisiCalc was the catalyst that helped transform the Apple II from an interesting toy into a true business machine. Likewise, when the IBM PC was introduced, Lotus 1-2-3 fulfilled the "killer app" role. Later, it was graphic design and desktop publishing software that drove the Apple Macintosh to success.
In recent years, the highest sales growth for the computer industry has not been in high-end desktop computers but instead in laptops and, lately, the newer netbook machines that provide a simple and inexpensive way to browse the web. This is probably due, at least in part, to the fact that the acceleration of computer hardware capability has largely outpaced what is required to run most of the software applications of interest to the average user. If you are primarily interested in word processing, spreadsheets and web browsing, it may be difficult to justify the cost of a high-end computer when a lower cost or more portable machine offers more than enough power to run the software. Likewise, it seems to be increasingly difficult for Microsoft and other software vendors to continually add new features to desktop productivity applications and operating systems that are compelling enough to justify expensive upgrades.
Yet the business models of both Intel and Microsoft depend on continuing to sell ever more powerful processors and new or updated software applications to take advantage of that power. If customers were to permanently turn away from the idea of faster processors, the business would quickly become commoditized, and Intel would lose its competitive advantage. For that reason, we can be sure that Intel, Microsoft and hundreds of other software companies are actively seeking the next killer app--something that will fully leverage the vastly increased computer power that will be available in the coming years and decades.
I think that there are good reasons to believe that this next killer app is going to turn out to be artificial intelligence (AI). AI applications are highly compute-intensive and will take full advantage of all the computational power that new processors can offer. New standalone AI applications will appear, but more importantly, artificial intelligence is likely to be built directly into existing productivity applications and operating systems, as well as the enterprise software and database systems used by large businesses.
The market for AI software is likely to extend far beyond the computer industry. Increasingly sophisticated robots will demand the most advanced hardware and software available. High-end microprocessors and AI software will also surely be used to build intelligence into appliances, consumer devices and industrial equipment of all kinds. Ultimately, robots and other non-computer applications may well eclipse the personal computer market as the primary growth engine for leading-edge hardware and software.
Products that give some insight into what the future may hold are already on display. Microsoft recently demonstrated a "virtual personal assistant" that appears as a computer-generated person on the screen. The assistant is capable of tasks such as making airline reservations or scheduling meetings and requires the most advanced hardware available. According to the New York Times, Microsoft's virtual assistant can "make sophisticated decisions about the people in front of her, judging things like their attire, whether they seem impatient, their importance and their preferred times for appointments." The Times article also quotes a Microsoft executive who speculates that future applications might include a "medical doctor in a box" that could help with basic medical issues.
An artificial intelligence application that could dispense basic medical advice is certainly a compelling idea, especially in light of the continuing problem with accelerating health care costs. However, it raises an important point. What education and training would we require of a person who dispensed such information? Would this person need to be a doctor? Perhaps not, but clearly this would not be one of the low-skill, low-wage jobs that we often associate with vulnerability to automation. The reality is that there is simply little or no relationship between the level of education and training required for a person to do a job and whether or not that job can be automated. While doctors are probably not in danger of losing their jobs in the foreseeable future, the same cannot be said for many thousands of knowledge workers and middle managers in the private sector.
It's important to note that, while humanoid interfaces like Microsoft's virtual assistant make for great demonstrations, the AI applications that will likely displace knowledge workers will not need such elaborate interfaces. They will simply be workhorse programs that make the routine decisions and perform the tasks and analysis that are currently the responsibility of highly paid workers sitting in cubicles all over the world. AI capability may start out by being built into the productivity applications used by workers, but over time, it will evolve to the point that these applications can perform much of the work autonomously: AI will become a tool for managers rather than workers. The result is likely to be substantial job losses for knowledge workers and a flattening of organizational charts that will eliminate large numbers of middle managers. (The impact of automation will, of course, be in addition to that of offshoring.) Many of these people will be highly educated professionals who had previously assumed that their skills and advanced education made them beneficiaries of the trend toward an increasingly technological and globalized world.