In early August, The New York Times ran a front-page story reporting that statisticians--far from being "dronish number nerds"--are increasingly in demand, "even cool." With reams of data generated in the computer age, and new realms to explore for purposes as broad as protecting national security or creating financial products, statisticians, says the Times, are only a small part of an army of "data sleuths...from backgrounds like economics, computer science and mathematics."
An important question raised by the story--and, of course, by the broader, deeper trend of using mathematics and systems analysis to "understand" complex human behavior--is whether emerging theories, products and ideas can be advanced with a strong measure of humility, and placed in the context of a complex human society in which some key factors simply cannot be quantified, before they have deleterious effects. Will the already potent but still-emerging "numbers" class have the broad education and training to understand not just the benefits but also the limits of all this number crunching?
I raise this question of the potential effects of rigid application of mathematical and systems techniques because two of the most serious problems to beset this country--the Vietnam War and the financial meltdown--stemmed, in important ways, from overconfidence, indeed even cult-like behavior. These two problems are at the front of my mind due to two books that received attention in June and July that dealt with how the false idolatry of numbers and systems can lead people, institutions and nations far astray--with catastrophic results.
The first is former Defense Secretary Robert S. McNamara's In Retrospect: The Tragedy and Lessons of Vietnam. It was published in 1995, nearly 30 years after he left the Defense Department in 1968, but received renewed attention when McNamara died, at 93, in early July. As is well known, McNamara served during World War II on an Army Air Forces statistical-control team led by Tex Thornton; after the war he joined Ford Motor as one of its "whiz kids," rose to the top of the company, and, as a civilian technocrat, brought a powerful systems orientation to the Pentagon in the early '60s. While this approach certainly had relevant application in rationalizing the Pentagon's corpulent competition among the Army, Navy and Air Force, it became infamous in the Vietnam War, when numbers like body counts, targets hit, enemy forces captured, weapons seized, tons of bombs dropped, and hamlets protected were used to argue that the war was being prosecuted successfully. The origins of the war did not lie in systems theories (one strand, rather, was the belief that a monolithic Russian-Chinese Communism would overrun Southeast Asia). But those theories played an important part in convincing McNamara and President Johnson that the war could be won--and, therefore, in deepening our involvement, with tragic results both for the U.S. and for Vietnam.
Thirty years later, McNamara declared in his book (positions he repeated in the award-winning documentary The Fog of War) that, although his aspirations were idealistic, "we were wrong. We were terribly wrong." Among his "lessons" about the causes of the Vietnam mistake: "our profound ignorance of the history, culture and politics of the people...[the failure] to recognize the limitations of modern high technology military equipment." After McNamara's death, the historian Michael Beschloss wrote: "McNamara knew little about Southeast Asia, and made no conspicuous effort to learn more. His penchant for numbers left him ill-equipped to understand that some human motivations cannot be quantified or predicted. As a result, he drastically underestimated the Viet Cong's determination."
The second book, Gillian Tett's Fool's Gold, is a detailed account of how a small group of bankers at J.P. Morgan in the mid-'90s (many with degrees in math and computer science) developed product models--credit derivatives--and perfected methods of breaking those derivative products into small pieces, bundling the pieces and selling them off: securitization. Tett is a well-regarded reporter at the Financial Times, covering global markets. And while there have already been many important books written on the financial crash (one of the first being Charles R. Morris's The Trillion Dollar Meltdown), hers is a very fine-grained account of what actually happened inside major institutions.
As she explains in detail, the purpose behind credit derivatives was to let Bank A sell some of its loan risk to Bank B. Then, when the underlying loans and the credit derivatives were bundled together and sold off to many other investors, the creators, relying on mathematical models, believed they had eliminated financial risk. A cult following developed, as other financial institutions--Bear Stearns, Lehman Brothers, Merrill Lynch, AIG--adopted the Morgan technique and added the combustible element of subprime loans to the derivative-and-securitization approach.
But, of course, when the supposedly endless rise in real estate prices turned south in 2006 and 2007, the computer models were confounded by the change in their basic assumption. More important, so were the markets: as derivatives and securitized products lost value, the supposed safety of risk-free paper turned into the horror show of panic and the seizing up of the credit markets. Members of the cult ran--or were chased--as the pillars of the temple collapsed.
These two dramatic and far-reaching examples, however different in context, demonstrate the power of mathematical and systems analysis--and how hard it is to force their proponents to probe for weak points, to lay bare the critical, often hidden, assumptions that underlie all models, and to take into account the elements of "history, culture and politics" that so often determine outcomes (like the history, culture and politics of a fiercely independent North Vietnam, or the history and culture of an overheated housing market). In both cases there were critics at the time (George Ball inside the Johnson Administration, and a number of economists and analysts mentioned by Tett).
It is fine to have McNamara's way-after-the-fact analysis added to the perspectives on what went wrong in Vietnam. It is fine to have Tett analyze, after the fact, the seeds of financial destruction with precision. But the profound question is what processes can make Cassandras heard before the catastrophe, especially because mathematical models and systems analysis will--and should--continue to be important analytic tools (but not holistic answers) for all manner of problems. (Recall that no one listened to Cassandra's warning about Greeks bearing gifts--a line Warren Buffett turned, in his annual letter, into the aphorism "beware of geeks bearing formulas.")
Whether in the public or private sector, leaders must, at a minimum, have the intellectual courage and strength to form red teams and blue teams that fight over the fundamentals of an analysis, isolating and challenging the assumptions which, when they erode, erode in turn the apparent precision of mathematical or systems models. So, too, we need tribunes of the people--in Congress, the media, or among independent critics--with voices strong enough to be heard, who can explain not just that a policy is unwise but why the models and systems beneath it are unrealistic, and who can communicate that in a powerful, understandable way to the public, or at least to opinion leaders.
Models are just that. It was the complexity of human endeavor and the messy reality of Vietnam and the financial markets that blew them up.