High-Frequency Trading in Nanoseconds: Analysis, Modeling, and Policy Implications
Mao Ye, University of Illinois at Urbana-Champaign
Before trading became automated, markets could only operate as fast as humans could walk, talk, and gesture. In those days, it seemed that financial economists could always rely on intuition to identify the key mechanisms underlying financial markets. However, markets no longer operate on a human time scale. Modern markets are big, fast, and complex, and the mechanisms at work in them are not necessarily accessible to unaided intuition.
Computing resources have become a serious constraint for financial economists. Computerized algorithmic and high-frequency trading generates enormous amounts of data at ever-growing rates; for example, the historical market data from just the CME Group's exchanges exceed 450 terabytes. Furthermore, industry practitioners routinely use techniques from machine learning and high-dimensional statistics to guide their trading. The advent of big data has reshaped not only the methodological challenges and opportunities facing financial economics, but also the very phenomena that the field studies.
In this proposal, we will construct the limit order book for each stock at each point in time from the NASDAQ ITCH data. This will give us snapshots of the financial market with nanosecond precision, allowing us to recover the state of the order book at the moments when trades and crashes occur. These data will be the foundation for our analysis of high-frequency trading and related policy.
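To illustrate the reconstruction step, the minimal sketch below replays a stream of simplified, ITCH-like add, execute, and cancel messages and maintains aggregate depth at each price level. The message layouts, field names, and prices are illustrative assumptions for exposition, not the full NASDAQ TotalView-ITCH specification.

```python
# Minimal sketch of limit-order-book reconstruction from simplified,
# ITCH-like messages. Fields and names are illustrative assumptions,
# not the actual NASDAQ TotalView-ITCH message formats.
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class AddOrder:
    timestamp_ns: int   # nanoseconds since midnight
    order_id: int
    side: str           # 'B' (buy) or 'S' (sell)
    shares: int
    price: int          # price in fixed-point units (e.g., 1/100 dollar)

@dataclass
class Execute:
    timestamp_ns: int
    order_id: int
    shares: int

@dataclass
class Cancel:
    timestamp_ns: int
    order_id: int
    shares: int

class OrderBook:
    """Tracks resting limit orders and aggregates depth by price level."""
    def __init__(self):
        self.orders = {}                      # order_id -> [side, price, shares]
        self.depth = {'B': defaultdict(int),  # price -> total shares at that level
                      'S': defaultdict(int)}

    def apply(self, msg):
        if isinstance(msg, AddOrder):
            self.orders[msg.order_id] = [msg.side, msg.price, msg.shares]
            self.depth[msg.side][msg.price] += msg.shares
        elif isinstance(msg, (Execute, Cancel)):
            side, price, shares = self.orders[msg.order_id]
            removed = min(msg.shares, shares)
            self.orders[msg.order_id][2] -= removed
            self.depth[side][price] -= removed
            if self.orders[msg.order_id][2] == 0:
                del self.orders[msg.order_id]
            if self.depth[side][price] == 0:
                del self.depth[side][price]

    def best_bid_ask(self):
        bid = max(self.depth['B']) if self.depth['B'] else None
        ask = min(self.depth['S']) if self.depth['S'] else None
        return bid, ask

# Example: replay a short (hypothetical) message stream and snapshot the book.
book = OrderBook()
messages = [
    AddOrder(34_200_000_000_123, order_id=1, side='B', shares=100, price=10025),
    AddOrder(34_200_000_000_456, order_id=2, side='S', shares=200, price=10050),
    Execute(34_200_000_001_000, order_id=2, shares=50),
]
for msg in messages:
    book.apply(msg)
print(book.best_bid_ask())   # (10025, 10050)
```

Keeping per-order state alongside per-price-level depth is what allows the book to be snapshotted at any timestamp in the message stream, which is the basis for recovering the order-book state around trades and crashes.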