By Sanjay Saigal
On Monday, I pointed to the visibility dilemma bedeviling the profession of operations research (OR) thus:
... mentions of OR in the popular press are few and far between. Worse yet, popular media articles covering OR manage to avoid mentioning OR entirely!
On Tuesday, Steve Lohr's New York Times blog post, titled "Software Progress Beats Moore's Law," appeared. It mentioned "Martin Grotschel, a German scientist and mathematician." Over the 20-odd years that I've known Martin Grotschel, we have not discussed his primary professional affiliation. Nevertheless, even a cursory survey of his research interests suggests that Martin is at least as much an operations researcher as he is a mathematician or an omnibus "scientist." Martin's OR expertise is entirely missing from Lohr's story. (The bias that "pure" mathematics is better, harder, or more meaningful than applied mathematics isn't universal, but it is pervasive. OR is application-focused. Lest I be slotted as a wannabe boiled-frog crusader, this will be my final mention of OR's branding dilemma.)
Lohr argues that improvements in the performance of software, such as optimization algorithms, have outpaced the much-celebrated hardware speedups:
The rapid improvement in chips [...] has its own "law" -- Moore's Law, named after the Intel co-founder Gordon Moore, who in 1965 predicted that the density of transistors on integrated circuits would double every 18 months or so...
... a study of progress over a 15-year span on a benchmark production-planning task. Over that time, the speed of completing the calculations improved by a factor of 43 million. Of the total, a factor of roughly 1,000 was attributable to faster processor speeds, according to the research by Martin Grotschel, a German scientist and mathematician. Yet a factor of 43,000 was due to improvements in the efficiency of software algorithms...
"But the ingenuity that computer scientists have put into algorithms have yielded performance improvements that make even the exponential gains of Moore's Law look trivial," said Edward Lazowska, a professor at the University of Washington.
My first reaction was: These don't look like Martin's results.
I have since verified my hunch via e-mail: Martin was reporting the work of his longtime associate Bob Bixby, who has advanced a more calibrated version of Lohr's argument for more than 10 years. Indeed, the performance improvement for linear optimization software has been nothing short of spectacular. Lohr's is the first recognition of this revolution that I have read in the popular press.
(To put my views in context, I did my graduate work with Bob. Years later, we also worked for the same company. However, I don't write numerical software, and I played no direct role in the results being discussed.)
Lohr mentions production planning. Such problems, technically known as mixed-integer programs, are complex variants of the steak grilling example I previously described. To get a flavor of this complexity, instead of a backyard grill, imagine an airline food-service operation that needs to turn out thousands of steak and vegetable entrees. Naturally, for such a large-scale commercial operation, lots of different grills with different capabilities are used. One grill may cook up to five steaks or 20 servings of vegetables, or any smaller combination that fits, at a time. Another may allow 12 steaks or 42 servings of vegetables, or any smaller combination. And so on. Multiple cooks staff the kitchen. Our challenge is to find a best possible sequence of grilling operations.
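The grill capacities above can be read as constraints. As a minimal sketch -- the linear trade-off between steaks and vegetable servings is my assumption, not something the airline example specifies -- suppose each steak occupies a fixed fraction of a grill's surface, so five steaks use up the same space as 20 servings. Enumerating the feasible loads shows how quickly the combinations multiply:

```python
# Sketch of the "any smaller combination that fits" constraint.
# Assumption (mine, not the post's): grill space trades off linearly,
# so a steak on grill 1 occupies as much surface as four vegetable
# servings (5 steaks == 20 servings), and a steak on grill 2 as much
# as 3.5 servings (12 steaks == 42 servings).

def feasible_loads(max_steaks, max_veg):
    """All (steaks, servings) pairs that fit on one grill,
    assuming a linear steak/vegetable capacity trade-off."""
    loads = []
    for s in range(max_steaks + 1):
        # Grill surface left after s steaks, in whole servings
        # (integer arithmetic to avoid rounding surprises).
        veg_room = max_veg * (max_steaks - s) // max_steaks
        for v in range(veg_room + 1):
            loads.append((s, v))
    return loads

grill1 = feasible_loads(5, 20)    # up to 5 steaks or 20 servings
grill2 = feasible_loads(12, 42)   # up to 12 steaks or 42 servings
print(len(grill1), len(grill2))   # → 66 283
```

Even two grills already admit tens of thousands of joint load combinations (66 × 283 here), before sequencing, timing, and cooks enter the picture -- which is why such problems are handed to mixed-integer programming software rather than worked out by hand.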
Recall that for our example we sequenced six operations -- three steaks with two sides per steak. For instance, our best possible sequence was [Steak1.side1, Steak2.side1, Steak1.side2, Steak3.side1, Steak2.side2, Steak3.side2]. That yields -- you can work it out manually -- 11 meaningful cooking sequences from which to pick the fastest.
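Under one plausible reading of that earlier example -- a grill that holds two items at a time, five minutes per side, both assumptions mine since the original post's details aren't restated here -- a brute-force search illustrates what "pick the fastest" means:

```python
# Brute-force the three-steak example. Assumptions (mine): the grill
# holds two items at once and each side takes five minutes. We try
# every ordering of the six operations that cooks each steak's first
# side before its second, schedule greedily, and keep the fastest.
from itertools import permutations

SIDE_TIME = 5   # minutes per side (assumed)
BURNERS = 2     # items the grill holds at once (assumed)
OPS = [(steak, side) for steak in (1, 2, 3) for side in (1, 2)]

def makespan(order):
    """Greedily schedule operations in the given order; a steak's
    second side cannot start until its first side is done."""
    burner_free = [0] * BURNERS   # time each burner becomes free
    side1_done = {}               # steak -> finish time of side 1
    for steak, side in order:
        ready = 0 if side == 1 else side1_done[steak]
        b = burner_free.index(min(burner_free))
        start = max(burner_free[b], ready)
        burner_free[b] = start + SIDE_TIME
        if side == 1:
            side1_done[steak] = start + SIDE_TIME
    return max(burner_free)

def valid(order):
    """Side 1 of each steak must appear before its side 2."""
    return all(order.index((s, 1)) < order.index((s, 2)) for s in (1, 2, 3))

best = min(makespan(p) for p in permutations(OPS) if valid(p))
print(best)  # → 15: three rounds of five minutes, grill always full
```

The sequence quoted above achieves this 15-minute optimum under these assumptions. The mixed-integer programs Lohr describes are the same idea with thousands of operations, many grills, and many cooks -- a scale at which brute force is hopeless and the algorithmic advances Bixby measured are exactly what makes the problems solvable.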