Ultra-high-speed trading at saturation point: Study

NEW YORK & LONDON, June 9, 2011 – Now that ultra-high-speed trading in the US has reached a saturation point, the financial markets are entering a new era of quantitative research, strategy development and other targets of trade workflow automation. In a new research report published today, TABB Group says this will usher in a period of trading with less focus on speed and greater focus on hunting for patterns in a much larger, more diverse sea of financial instruments, market-related data and use cases.

“We’ve reached the ‘tip of the spear’ on matters related to latency,” says E. Paul Rowady, Jr., senior analyst at TABB and author of “Quantitative Research: The World after High-Speed Saturation.” “We’ve reached just about as much speed as we’re ever likely to know, at least within individual matching engines and between New York and Chicago. Everything else at the ultrafast end of the spectrum is essentially a ‘me-too’ clone of the original opportunities.” In the Far East, this tipping point will be reached at a faster pace, “shortly after the last remaining cables are lit and matching engines are upgraded.”

While speed is and will continue to be essential, the biggest challenges facing quantitative researchers are data management and the need for a single, unified storage solution capable of meeting future requirements. According to nearly 50% of the quantitative research directors TABB interviewed for this report, simply dealing with the sheer scale of market, fundamental, reference, internal, broker and client data, not to mention research commentary, is the leading challenge they face in managing a quant research platform.

As old boundaries are tested and more quant-focused firms turn their research efforts toward new types of strategies that include but also go beyond speed, the industry will see a greater emphasis on cross-asset, cross-regional, multi-temporal and asymmetric-versus-symmetric trades, even enhanced front-to-back automation, a combination TABB calls “multi-dimensional arbitrage.”

In addition to algorithmic trade execution, Rowady found that strategy development is where quant research is focused today and will remain focused going forward. “Up to 70% of quant projects involve trade execution. At nearly 50%, strategy development will catch up with, if not surpass, trade execution in the spectrum of quant projects.”

Rowady writes that firms lacking the infrastructure, expertise, budget or patience to “get down and dirty in the data” are missing the most vital component of the entire quantitative research process. “The only way to tame the data beast and ultimately have a shot at enjoying success from quantitative R&D,” he says, “is through the quant toolbox – the combination of the immediacy, concurrency and multiplicity of data needs that cause so much complexity in trading infrastructures and impede improved value extraction from data.”

“There’s no single storage solution today able to handle what’s coming,” says Rowady. “Until then, quant teams must rely on multiple data stores that specialize in various datasets and crunch through exceedingly large volumes, in multiple physical locations, to access the data they need to perform.”

Drawing on interviews with 25 buy-side and sell-side directors of quantitative research, the research note explains why an era focused on ultra-low-latency trading is coming to a close with the emergence of a new post-high-speed era. It examines how crowding out in the high-frequency space, coupled with contemplated regulatory maneuvers to slow trading traffic, is forcing quantitative research teams to search for slower forms of alpha. The report also addresses how the search for slower-turnover trading strategies, along with quant applications to other areas of trading workflow, will result in more complex data management challenges and, therefore, a period of solution innovation to combat these challenges.
