Explaining Bizarre Robot Stock Trader Behavior

[Image: BKF_080310.png (Nanex)]


Odd patterns in the stock market, like those we reported on this week, are open to a wide variety of interpretations. They are clearly generated by robot traders, but it's not clear what those algorithms are doing.

Nanex, the data services company that discovered and visualized the very high-speed bursts of curious orders, has theorized the bots could provide some millisecond advantage to their operators by confusing their competitors. High-frequency trading experts Michael Kearns of the University of Pennsylvania and MIT's Andrew Lo disagreed with that assessment.

Kearns offered two reasons why Nanex's "quote stuffing" thesis seemed unlikely to him. First, it's not technically easy to gain such an advantage; second, the data suggest that there aren't actually competitors to beat under the specific circumstances in which the bots are running.

"The quote stuffing theory is that this behavior is kind of like a denial of service attack. You flood some exchange with these orders. Your competitors have to process those orders in their data feeds, but since you placed them, you could ignore them," Kearns explained. "The reason this is unlikely is that we cannot think of any easy way for somebody to ignore their own orders without taking on some risk."


Technically speaking, there just isn't a simple "ignore my own phony orders" button that a trading firm could press.

"What a firm has is nine real-time data feeds from the exchanges [e.g. NASDAQ] that are telling them what the quotes are from those exchanges in real time. Let's say I'm flooding some exchange: how do I know which orders to ignore?" Kearns asked. "I at least have to have my code pick up each incoming order to inspect it just enough to know it's my order, but then I haven't ignored it at all. These orders are very simple. You can look up the raw data. Each one is like a line of text. The expensive thing is not doing something complicated to that line of text, it's inspecting it in the first place."
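Kearns's point can be made concrete with a toy feed handler. This is a minimal sketch, not real exchange code: the comma-separated message format, the field names, and the order-ID scheme are all invented here. The key observation is that the "ignore" branch still has to pay the parsing cost first.

```python
# Hypothetical quote messages: "order_id,symbol,side,price".
# Real ITCH-style feeds are binary, but the logic is the same.
MY_ORDER_IDS = {"ORD-1001", "ORD-1002"}  # orders this firm placed itself

def handle_feed_line(line):
    """Parse one quote message; return the quote if it belongs to
    someone else, or None if it is one of our own."""
    # Even to discard our own order, we must split and inspect the line --
    # which is exactly the work "ignoring" was supposed to save.
    order_id, symbol, side, price = line.split(",")
    if order_id in MY_ORDER_IDS:
        return None                      # "ignored", but only after parsing
    return (symbol, side, float(price))  # a competitor's quote, worth reacting to

feed = [
    "ORD-1001,BKF,B,23.05",  # our own quote
    "ORD-5555,BKF,S,23.10",  # a competitor's quote
]
results = [handle_feed_line(msg) for msg in feed]
# results[0] is None; results[1] is ("BKF", "S", 23.1)
```

The branch that skips our own order executes after `split` has already run, which is the whole of Kearns's objection: there is no cheap way to reach the "skip" decision.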

The second reason that quote-stuffing is unlikely is slightly more difficult to understand. The basic idea is that we can only see the algorithms working in stocks on exchanges that are illiquid. There aren't a lot of buyers and sellers around. In fact, there aren't any. If there were, we wouldn't be able to see the patterns with such clarity because other people's bids and asks would mess them up. "That creates a problem with the argument that it's being done to slow down competitors," Kearns concluded. Essentially, on these specific stocks on these specific exchanges at these specific times, there aren't competitors to slow down.

So, if it's not quote-stuffing, why would a firm engage in this behavior? Lo and Kearns offered a few theories of their own about what could be happening.

[Image: MNTA_080510.png (Nanex)]

"To be honest, we can't come up with a good reason," Kearns said. What's particularly difficult to explain is how diverse and prevalent the patterns are. If algorithmic traders are simply testing new bots out -- which isn't a bad explanation -- it doesn't seem plausible that they'd do it so often. Alternatively, one could imagine the patterns are generated by some set of systemic information processing mistakes, but then it might be difficult to explain the variety of the patterns.

Kearns does have a leading explanation, though, which he emailed to me after we spoke.

"It's possible that the observed patterns are not malicious, in error, or for testing, but for information-gathering," Kearns observed. "One could easily imagine a HFT shop wanting to regularly examine (e.g.) the latency they experienced from the different exchanges under different conditions, including conditions involving high order volume, rapid changes in prices and volumes, etc. And one might want such information not just when getting started, but on a regular basis, since latency and other exchange properties might well be expected to change over time, exhibit seasonality of various kinds, etc. The super-HFT groups might even make co-location decisions based on such benchmarks."
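The kind of latency benchmarking Kearns describes might look like the following sketch. Everything here is an assumption for illustration: the `send_probe_order` function is a stand-in that simulates an exchange's acknowledgment delay with random jitter rather than touching a real gateway.

```python
import random
import statistics
import time

def send_probe_order(exchange):
    """Stand-in for submitting a small order and timing the ack.
    A real implementation would hit the exchange gateway; here the
    round trip is simulated as 0.5-2 ms of random delay."""
    t0 = time.perf_counter()
    time.sleep(random.uniform(0.0005, 0.002))   # simulated exchange latency
    return (time.perf_counter() - t0) * 1000.0  # elapsed time in milliseconds

def benchmark_exchange(exchange, n=50):
    """Fire n probe orders and summarize the observed latencies."""
    samples = sorted(send_probe_order(exchange) for _ in range(n))
    return {
        "exchange": exchange,
        "median_ms": statistics.median(samples),
        "worst_ms": samples[-1],
    }
```

Run periodically, per exchange and under varying load, a table of such summaries is exactly the sort of input a firm might use for the co-location decisions Kearns mentions.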

MIT's Andrew Lo, who directs the school's Laboratory for Financial Engineering, offered a variation on that thesis. He contends that the algorithms are being used not to test latency but to probe the actual market conditions.

"What I think is going on is that there are algorithms that have been designed to monitor the markets and essentially create a kind of trolling function to try to identify orders that might be executed and to do that in a regular and relatively systematic way," he said.

He likened the algorithms to "financial radar."

"I think this is not random nor is it hard to understand what the motive is," Lo contended. "If you think about the way modern radar works, if you didn't know anything about radar, what you would see is a pattern of electromagnetic radiation shot out at regular intervals, and then you'd see patterns of the reflections of the objects out there. This is financial radar that we're seeing."

Traders want to put out tens of thousands of orders in a really short period of time precisely because they are probing for the split second when a buyer or seller shows up.

"Suppose that you would like to identify to the nearest millisecond when an order is placed and at what price. If you want to detect the trade to the nearest millisecond, you are going to have to submit orders faster than that," Lo said. "The pattern gives you a sense of the fineness of the mesh that's being constructed to try to capture the first trade that occurs."
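Lo's "fineness of the mesh" idea has a simple quantitative form: if probes go out every Δ milliseconds, a trade's time can only be pinned down to within Δ. A minimal sketch of that bracketing, with invented function names:

```python
import math

def trade_time_bracket(trade_ms, probe_interval_ms):
    """Probes fired every probe_interval_ms bracket a trade at trade_ms
    between the last probe before it and the first probe after it.
    The width of the bracket is the resolution of the 'radar'."""
    lo = math.floor(trade_ms / probe_interval_ms) * probe_interval_ms
    return (lo, lo + probe_interval_ms)

# A trade at 7.3 ms is pinned to (7.0, 8.0) with 1 ms probes,
# but only to (5.0, 10.0) with 5 ms probes.
```

Halving the probe interval doubles the detection resolution, which is why, on Lo's account, the firms fire tens of thousands of orders: the mesh has to be finer than the event they want to time.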

That first trade acts as a forecast for where the price of the stock is going. "If you see an order that's being placed at $1.05 at time T, and $1.06 at time T+1 then you start betting on that for the next few milliseconds," Lo explained. "The faster you can detect the trend, the more likely you are to make money."
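The betting rule Lo describes reduces to a very small piece of logic: compare the last two observed prices and lean in the direction of the move. A minimal sketch (the function name and return labels are invented for illustration):

```python
def trend_signal(prices):
    """Bet on continuation of the move between the last two observed
    trades; with fewer than two trades, there is nothing to act on."""
    if len(prices) < 2:
        return "wait"
    if prices[-1] > prices[-2]:
        return "buy"   # price ticked up: bet on further upticks
    if prices[-1] < prices[-2]:
        return "sell"  # price ticked down: bet on further downticks
    return "wait"

# Lo's example: $1.05 at time T, then $1.06 at T+1, signals a buy.
```

The edge, as Lo says, is entirely in speed: the signal itself is trivial, so the profit goes to whoever detects the tick first.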

[Image: AON2_072910.png (Nanex)]

Lo even offered a way of testing the algorithms to see if he's right about what they're doing. He figures that if you could step into the patterned trading and take the algorithms up on one of their offers, they might switch into a different phase of operation.

"What would be interesting, but potentially expensive, to do when you detect patterns like this would be to trigger an order to hit the bid or the offer on one of these regular sweeps and see what happens to the pattern," Lo said.

Kearns argued, though, that the kinds of wild ordering strategies that we see aren't necessary to probe the market. "What's weird about these patterns, the sawtooth patterns, say," he said, "where you're alternating [prices up and down] with no hope of a trade, is that if I were going to explore a large number of patterns to see what works, there's just no need to place orders that far from the market, especially given how quickly they are being removed."

The only people who know for sure what's going on in the market are the traders themselves and the exchanges on which they work.

At the highest level, though, the robot traders provide a unique lens on exactly how fast and complex our financial system has become. Even the brightest minds in the field cannot immediately explain this newly discovered and apparently pervasive behavior.

Our regulators have tools built for assessing a market measured in seconds, but technology has pushed the markets down to the millisecond level.

"The observation to make is that this isn't as innocuous as it might seem, not so much because there is anything wrong with high-frequency trolling but rather because the regulatory infrastructure that monitors these markets is not designed to deal with this kind of latency and high-frequency activity," Lo pointed out. "That can create significant problems, not the least of which is the Flash Crash. There are fairness issues. There are transparency issues. There are stability issues. We need to resynchronize the regulatory infrastructure with the technology of our time."

"We're seeing innovations that dramatically increase the speed and throughput of the market, and that works great until it doesn't," Lo concluded. "And when you have some problem, like the flash crash, then you'll have version 2.0 and people will fix it. We're still in version one-dot-something and there are certainly improvements that have to be made in the regulatory infrastructure."


Images: All images courtesy of Nanex. Full explanations of the patterns at their site.
