High-Frequency Trading in Limit Order Markets: Equilibrium Impact and Regulation

Author(s):
Jakub Rojcek, Alexandre Ziegler

This article examines every NASDAQ ITCH feed message for S&P 500 Index stocks during 2012 and identifies clusters of extremely high and extremely low limit-order cancellation activity. The authors find results consistent with the idea that cancel clusters arise from high-frequency traders jockeying for queue position and reacting to information to establish a new price level. Moreover, few trades are executed during cancel clusters or even immediately after them. Periods of low cancellation activity are markedly different, with many price-level changes, all caused by executions. The results are consistent with high-frequency trading firms acting as agents who bring efficiency to the market without requiring executions at intermediate prices. Given these results, the authors also discuss the misconception that investors and low-frequency traders are synonymous, and its implications for policy.
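The clustering exercise the article describes can be sketched in a few lines: bucket cancellation messages into fixed time windows and flag windows in the extreme tails of the cancel-count distribution. This is an illustrative sketch, not the authors' actual methodology; the message schema (a datetime `timestamp` column and a `type` column in which 'D' marks a cancellation) and the percentile cutoffs are assumptions, not the raw ITCH layout.

```python
import numpy as np
import pandas as pd

def flag_cancel_clusters(messages: pd.DataFrame,
                         window: str = "1s",
                         hi_pct: float = 99.0,
                         lo_pct: float = 1.0) -> pd.DataFrame:
    """Count cancellations per time window and flag extreme windows.

    `messages` is assumed to have a datetime 'timestamp' column and a
    'type' column where 'D' marks a cancellation (hypothetical schema).
    """
    cancels = (messages.loc[messages["type"] == "D"]
               .set_index("timestamp")
               .resample(window)
               .size()
               .rename("n_cancels"))
    hi, lo = np.percentile(cancels, [hi_pct, lo_pct])
    out = cancels.to_frame()
    # Tag each window as a high-cancellation cluster, a quiet period,
    # or ordinary activity.
    out["regime"] = np.where(out["n_cancels"] >= hi, "high",
                    np.where(out["n_cancels"] <= lo, "low", "normal"))
    return out
```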


2015, Vol. 130 (4), pp. 1547–1621
Author(s):
Eric Budish, Peter Cramton, John Shim

The high-frequency trading arms race is a symptom of flawed market design. Instead of the continuous limit order book market design that is currently predominant, we argue that financial exchanges should use frequent batch auctions: uniform-price double auctions conducted, for example, every tenth of a second. That is, time should be treated as discrete instead of continuous, and orders should be processed in a batch auction instead of serially. Our argument has three parts. First, we use millisecond-level direct-feed data from exchanges to document a series of stylized facts about how the continuous market works at high-frequency time horizons: (i) correlations completely break down, which (ii) leads to obvious mechanical arbitrage opportunities, and (iii) competition has not affected the size or frequency of the arbitrage opportunities; it has only raised the bar for how fast one must be to capture them. Second, we introduce a simple theory model that is motivated by and helps explain the empirical facts. The key insight is that obvious mechanical arbitrage opportunities, like those observed in the data, are built into the market design: continuous-time serial processing implies that even symmetrically observed public information creates arbitrage rents. These rents harm liquidity provision and induce a never-ending, socially wasteful arms race for speed. Last, we show that frequent batch auctions directly address the flaws of the continuous limit order book. Discrete time reduces the value of tiny speed advantages, and the auction transforms competition on speed into competition on price. Consequently, frequent batch auctions eliminate the mechanical arbitrage rents, enhance liquidity for investors, and stop the high-frequency trading arms race.
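The proposed mechanism, a uniform-price double auction run at each batch interval, is easy to illustrate. The sketch below clears one batch by matching unit slices of demand against supply and picking a price in the market-clearing interval; names such as `Order` and `clear_batch` are illustrative, and the tie-breaking and rationing rules of the actual proposal are omitted.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class Order:
    price: float  # limit price
    qty: int      # number of units

def clear_batch(bids: List[Order],
                asks: List[Order]) -> Optional[Tuple[float, int]]:
    """Clear one batch as a uniform-price double auction.

    Returns (clearing_price, cleared_quantity), or None if demand and
    supply do not cross. Rationing of marginal orders is ignored.
    """
    # Expand orders into unit-sized slices so marginal units line up.
    demand = sorted((b.price for b in bids for _ in range(b.qty)),
                    reverse=True)
    supply = sorted(a.price for a in asks for _ in range(a.qty))
    qty = 0
    while (qty < min(len(demand), len(supply))
           and demand[qty] >= supply[qty]):
        qty += 1
    if qty == 0:
        return None
    # Any price in [lo, hi] clears the market: all traded bids sit at
    # or above it, all traded asks at or below it, and no excluded
    # order would want to trade at it. Take the midpoint as one
    # simple convention.
    lo, hi = supply[qty - 1], demand[qty - 1]
    if qty < len(demand):
        lo = max(lo, demand[qty])
    if qty < len(supply):
        hi = min(hi, supply[qty])
    return 0.5 * (lo + hi), qty
```

For example, `clear_batch([Order(10.01, 2)], [Order(10.00, 1), Order(10.02, 1)])` matches one unit at 10.01: all orders submitted within the batch interval compete on price alone, regardless of the order in which they arrived.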


2008, Vol. 8 (3), pp. 217–224
Author(s):
Marco Avellaneda, Sasha Stoikov

In this paper, we take a retrospective look at our article "Phantom Liquidity and High-Frequency Quoting" and discuss the context of the research in light of our broader inquiry into the nature of the high-frequency trading industry. The data presented in that paper appear to show that the limit-order cancellations of high-frequency traders are associated with price discovery and liquidity provision rather than with any systematic exploitation of other market participants. These firms act as rational, profit-seeking businesses, and we believe time has shown this view to be correct. In the years since publication, HFT has matured and consolidated into fewer, lower-cost providers of efficiency and liquidity services, much as we would expect in any other industry.


Author(s):
Matteo Aquilina, Eric Budish, Peter O’Neill

We use stock exchange message data to quantify the negative aspect of high-frequency trading known as "latency arbitrage." The key difference between message data and the widely familiar limit order book data is that message data contain attempts to trade or cancel that fail. This allows the researcher to observe both the winners and the losers in a race, whereas in limit order book data the losers are invisible, so the races themselves cannot be observed directly. We find that latency-arbitrage races are very frequent (about one per minute per symbol for FTSE 100 stocks), extremely fast (the modal race lasts 5–10 millionths of a second), and account for a remarkably large share of overall trading volume (about 20%). Race participation is concentrated, with the top six firms accounting for over 80% of all race wins and losses. The average race is worth only a small amount (about half a price tick), but because of the large volumes the stakes add up. Our main estimates suggest that races account for roughly one-third of price impact and of the effective spread (key microstructure measures of the cost of liquidity); that latency arbitrage imposes a tax on trading of roughly 0.5 basis points; that market designs eliminating latency arbitrage would reduce the market's cost of liquidity by 17%; and that the total sums at stake are on the order of $5 billion per year in global equity markets alone.
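The notion of a race can be made concrete with a toy detector: group near-simultaneous attempts against the same price level and keep groups that contain both a success and a failure by different firms. The paper's actual definition involves several additional criteria; the `Attempt` schema and the 500-microsecond horizon below are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Attempt:
    ts_us: int      # timestamp in microseconds
    firm: str       # anonymized participant id
    price: float    # price level being raced for
    success: bool   # True if the take/cancel was accepted

def find_races(attempts: List[Attempt],
               horizon_us: int = 500) -> List[List[Attempt]]:
    """Group attempts on the same price level within `horizon_us` of the
    group's first attempt; a group counts as a race if it involves at
    least two firms and contains both a success and a failure."""
    attempts = sorted(attempts, key=lambda a: a.ts_us)
    races, used = [], set()
    for i, base in enumerate(attempts):
        if i in used:
            continue
        idx = [j for j in range(i, len(attempts))
               if j not in used
               and attempts[j].price == base.price
               and attempts[j].ts_us - base.ts_us <= horizon_us]
        group = [attempts[j] for j in idx]
        firms = {a.firm for a in group}
        if (len(firms) >= 2
                and any(a.success for a in group)
                and any(not a.success for a in group)):
            races.append(group)
            used.update(idx)
    return races
```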


Author(s):
Yacine Aït-Sahalia, Jean Jacod

High-frequency trading is an algorithm-based, computerized trading practice that allows firms to trade stocks in milliseconds. Over the last fifteen years, the use of statistical and econometric methods for analyzing high-frequency financial data has grown exponentially. This growth has been driven by the increasing availability of such data, the technological advances that make high-frequency trading strategies possible, and practitioners' need to analyze these data. This comprehensive book introduces readers to these emerging methods and tools of analysis. It covers the mathematical foundations of stochastic processes, describes the primary characteristics of high-frequency financial data, and presents the asymptotic concepts on which their analysis relies. It also deals with estimation of the volatility portion of the model, including methods that are robust to market microstructure noise, and addresses estimation and testing questions involving the jump part of the model. As the book demonstrates, the practical importance and relevance of jumps in financial data are universally recognized, but only recently have econometric methods become available to analyze jump processes rigorously. The book approaches high-frequency econometrics with a distinct focus on the financial side of matters while maintaining technical rigor, which makes it invaluable to researchers and practitioners alike.
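A central piece of the volatility machinery the book covers is separating the continuous and jump parts of price variation. As a flavor of this, the sketch below contrasts realized variance (which picks up both components) with jump-robust bipower variation in the spirit of Barndorff-Nielsen and Shephard; this is a generic illustration, not code from the book, and it ignores microstructure noise.

```python
import numpy as np

def realized_variance(prices: np.ndarray) -> float:
    """Sum of squared log returns: converges to integrated variance
    plus the sum of squared jumps as the sampling interval shrinks."""
    r = np.diff(np.log(prices))
    return float(np.sum(r ** 2))

def bipower_variation(prices: np.ndarray) -> float:
    """(pi/2) times the sum of products of adjacent absolute log
    returns: robust to jumps, converging to integrated variance only."""
    r = np.abs(np.diff(np.log(prices)))
    return float((np.pi / 2) * np.sum(r[1:] * r[:-1]))

def jump_contribution(prices: np.ndarray) -> float:
    """The gap between the two estimators measures the jump
    contribution to total variation (floored at zero in finite
    samples, where the difference can be slightly negative)."""
    return max(realized_variance(prices) - bipower_variation(prices), 0.0)
```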

