The country's financial regulators have delivered their final report on the mysterious May 6 "Flash Crash," in which the Dow plunged 10 percent in just minutes -- and it turns out that a dumb algorithm is partly to blame.
A single large sell order executed by a rather crude software program sent the already-stressed market into a downward spiral.
Computerized trading systems clearly played an important role in the Flash Crash, but until now the Securities and Exchange Commission and Commodity Futures Trading Commission had not gone beyond assigning general blame.
The new report details precisely what happened. A large firm, reportedly Waddell & Reed, sold more than $4 billion of S&P 500 futures known as E-Minis. Its sell algorithm took only market volume into account when making trades, not price or time. The algo could only "know" what it was told to know, and because of the unusual market conditions, it sold all of the contracts in 20 minutes.
As the report describes it: "This large fundamental trader chose to execute this sell program via an automated execution algorithm ('Sell Algorithm') that was programmed to feed orders into the June 2010 E-Mini market to target an execution rate set to 9% of the trading volume calculated over the previous minute, but without regard to price or time."
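The rule the report describes is a simple volume-participation ("percentage of volume") pegging. A minimal sketch, with illustrative names and numbers, shows why it has no brake: the order rate depends only on recent volume, so busier markets make it sell faster.

```python
# Sketch of a volume-participation execution rule like the one the report
# describes: target 9% of the previous minute's volume, with no price or
# time constraint. All names and figures here are illustrative.

TARGET_PARTICIPATION = 0.09  # 9% of trailing one-minute volume

def child_order_size(prev_minute_volume: int) -> int:
    """Contracts to sell this minute, pegged only to recent volume."""
    return int(prev_minute_volume * TARGET_PARTICIPATION)

# The faster the market trades, the faster the algorithm sells:
print(child_order_size(10_000))  # quiet market -> 900 contracts
print(child_order_size(80_000))  # frantic market -> 7,200 contracts
```

Note that price never appears in the function: a 9% participation target is hit just as readily while the market is collapsing as while it is calm.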
As the Sell Algorithm went to work, traders bought up the contracts normally, but other algorithms started to pile on, increasing the volume of trades. Because the Sell Algorithm's sell rate was pegged to the volume of the market, it started going faster and faster.
The report continues: "The Sell Algorithm used by the large trader responded to the increased volume by increasing the rate at which it was feeding the orders into the market, even though orders that it already sent to the market were arguably not yet fully absorbed by fundamental buyers or cross-market arbitrageurs."
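That dynamic is a feedback loop: contracts the seller unloads get passed back and forth by other algorithms, that churn inflates measured volume, and the volume-pegged seller speeds up in response. A toy simulation (not the report's model; the total of 75,000 contracts matches the report, but the other parameters are invented) illustrates how churn compresses the execution time:

```python
# Toy model of the feedback loop: each contract sold is re-traded `churn`
# times by intermediaries, so it counts toward the next minute's volume,
# which raises the seller's participation-pegged rate. Parameters other
# than the 75,000-contract total and 9% rate are invented for illustration.

def minutes_to_finish(total=75_000, base_volume=10_000,
                      participation=0.09, churn=2.0):
    """Minutes needed to complete the sale under a given churn level."""
    remaining, volume, minutes = total, base_volume, 0
    while remaining > 0 and minutes < 10_000:  # safety cap on the loop
        sold = min(remaining, int(volume * participation))
        remaining -= sold
        # Next minute's volume: background trading plus churned resales.
        volume = base_volume + int(sold * churn)
        minutes += 1
    return minutes

# Heavy hot-potato churn finishes the order far faster than modest churn:
print(minutes_to_finish(churn=2.0))   # modest churn: over an hour
print(minutes_to_finish(churn=15.0))  # heavy churn: minutes
```

This is the regulators' point in miniature: the extra volume is the seller's own inventory being recycled, not fresh buying interest, so the algorithm reads a liquidity signal that isn't there.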
The net effect was that the entire order was complete within 20 minutes, and that sent the markets into a panic.
"The interaction between automated execution programs and algorithmic trading strategies can quickly erode liquidity and result in disorderly markets," the regulators wrote. "As the events of May 6 demonstrate, especially in times of significant volatility, high trading volume is not necessarily a reliable indicator of market liquidity."
As John Bates, a member of the technical advisory committee for the CFTC and the CTO of Progress Software, explained, "Liquidity was choked off as traders tried to make sense of the situation, which was nearly impossible."
The Flash Crash drew attention to the speed and automation of the markets. The speed of the decline (and the subsequent recovery) shocked just about everyone but the high-frequency specialists. Many wondered whether the crash resulted from bad actors, prompting calls for increased surveillance of high-frequency traders.
That's still a good idea, but in this case, the problem looks more like simple incompetence, which is a harder thing to regulate against. One key strategy could be implementing pre-trade risk monitoring of algorithmic strategies, Bates said.
"The report underscores a powerful need for better pre-trade back-testing of a strategy to analyze potential impacts. Testing under realistic negative conditions might have stopped the firm from using the algorithmic approach used. It also highlights the need for better real-time monitoring of the trading process, using technology that is available but underused for pre-trade risk monitoring, market monitoring and market surveillance."
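A pre-trade risk gate of the kind Bates describes is mechanically simple: before each child order goes out, check it against limits the pure volume-pegged algorithm lacked. The sketch below is hypothetical, and every threshold in it is invented for illustration.

```python
# Hypothetical pre-trade risk checks: a price floor, an absolute size cap,
# and a participation cap measured against a longer trailing window.
# All limit values are invented for illustration.

from dataclasses import dataclass

@dataclass
class RiskLimits:
    price_floor: float        # stop selling below this price
    max_order_size: int       # absolute per-order cap
    max_participation: float  # cap vs. trailing-window volume

def pre_trade_check(order_size, last_price, trailing_volume, limits):
    """Return (approved, reason); reject any order that breaches a limit."""
    if last_price < limits.price_floor:
        return False, "price below floor; pause and alert a human"
    if order_size > limits.max_order_size:
        return False, "order exceeds absolute size cap"
    if trailing_volume > 0 and order_size / trailing_volume > limits.max_participation:
        return False, "order exceeds participation cap"
    return True, "ok"

limits = RiskLimits(price_floor=1100.0, max_order_size=5_000,
                    max_participation=0.05)
print(pre_trade_check(4_000, 1150.0, 100_000, limits))  # approved
print(pre_trade_check(4_000, 1090.0, 100_000, limits))  # rejected: price floor
```

The key design choice is the price floor: unlike a participation target, it forces the strategy to stop and escalate to a human when the market moves against it, which is exactly the brake the May 6 sell program was missing.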
MIT's Andrew Lo told us this summer that we need to "resynchronize the regulatory infrastructure with the technology of our time." Traders aren't waiting to implement new technologies, and their speed really matters.
"We're seeing innovations that dramatically increase the speed and throughput of the market, and that works great until it doesn't," Lo concluded.