High Frequency Trading
High Frequency Trading (HFT) is a sophisticated trading practice that utilizes advanced computer algorithms to execute transactions at extremely high speeds, often measured in millionths of a second. This method allows traders to analyze vast amounts of financial data instantaneously, capturing small price discrepancies to generate profits. Historically, trading relied on human brokers who provided insights based on market analysis; however, HFT has largely automated this process, reducing the human element and increasing transaction volumes dramatically.
The rise of HFT has transformed the structure of financial markets, leading to concerns about market stability and fairness. Critics argue that the lack of human judgment in decision-making can lead to chaotic trading scenarios, especially during market volatility, as evidenced by events like the Flash Crash of 2010. Furthermore, there are worries that HFT primarily benefits large firms with advanced technology, potentially sidelining smaller investors.
Despite these challenges, HFT has become an integral part of modern trading, often credited with improving market liquidity and lowering trading costs. As the industry matures, it faces ongoing scrutiny and adaptation, with regulatory discussions continuing around the implications of increasingly automated trading practices.
Abstract
High frequency trading refers to the popular practice of buying and selling stock through the use of sophisticated computer programs and increasingly high-volume trading platforms. In a volatile global financial market, high frequency trading introduces the possibility of computer programs analyzing gigabytes of up-to-the-second financial data, including critical market information, at very high speeds. However, given that the process virtually eliminates the human factor in stock purchasing, and given the vulnerabilities of even the most sophisticated computer systems, high frequency trading has also sparked concerns over the risk of catastrophic financial panic as computerized trades become faster and faster.
Overview
For more than two centuries, the operating principle behind financial markets has been that the faster shares can be bought and sold, the more liquid the financial marketplace, and the better for both investors and companies. Information networks that link the principal stock trading centers (New York to London, New York to Tokyo, New York to Chicago, London to Hong Kong, for instance) have long sought to obtain accurate information quickly in order to maintain a healthy trading environment. Information has thus always been crucial: information about a company's health and earnings reports, financial data gathered by government agencies, movement among both small investors and large investment firms, personnel changes within individual companies, and broader information about resource scarcity, weather interruptions of market activity, and unexpected political and military events all factor into the measurement of a market's health and, in turn, the viability of stock trading. Information, then, is at the heart of the trading transaction.
Securing such information quickly has always been the basis for shrewd market maneuvers that have routinely earned investors sizeable returns. The general premise, of course, has been for investors to buy below the trend price and then sell when the market price for a share climbs above it. Determining that moment is what defines expert and successful trading.
For centuries, new communication networks have reshaped the trading process. In a business that originally relied on a network of spies on horseback carrying secret information about new mineral reserves or potential trading routes opening on land or sea, the invention of the wireless telegraph, the subsequent introduction of telephone communication, and then transatlantic communication through underwater cables created a frenzy for the increasingly rapid dissemination of crucial financial information and redefined trading as an industry unto itself. Speed and volume became the defining characteristics of the trading market.
The introduction of automation into the trading markets in the mid-1980s and the rapid evolution into digital information platforms (accomplished in less than twenty years) revolutionized the traditional wisdom about the relationship between an investor and a broker and, in turn, the relationship between the trading market and the larger economic environment. Stockbrokers, traditionally the middlemen who gathered financial information and then advised their investors, suddenly seemed to embody antiquated, time-consuming, and even backward thinking. Traditionally, investors had paid brokers for their insights, their market savvy and gut feelings, their ability to gauge and predict market activity, and their advice on when to buy and when to sell, often in huge volumes of stock.
What if computers could do the thinking? "[High-frequency traders] are a natural outcome in a world in which trading is automated, and in which there is competition between lots of different exchanges and a need for someone speedily to knit together the prices they offer" ("Fast" 2012). By feeding massive amounts of raw financial data into a software program and by devising algorithms that direct the computer to make the appropriate logical market moves, would it not be possible for an investor to sidestep the expense of the stockbroker and engage in market activity directly? Response time could be minimized to the point at which it is measured in millionths of a second, a far cry from the often time-consuming efforts of traditional stockbrokers to pore over financial information before deciding on purchase actions. By automating away the middleman, of course, what would be lost would be the human factor, the uncanny insight and perception that stockbrokers can bring to their clients. But what would be gained would be higher sales volumes, with stock shares bought and sold so rapidly that by the time a stockbroker received the information, it would already be obsolete and virtually useless. Machine talking to machine: that is the central vision of high frequency trading.
Applications
The evolution of high frequency trading redefined the stock market. If the conventional model centered on big-name firms with hundreds of high-priced financial analysts headquartered in multi-floor office buildings in the financial heart of major international cities, the new model stressed smaller facilities in more remote areas and a smaller corps of younger financial advisers with computer science and mathematical backgrounds (sometimes known as quants), most of them digital natives born after 1980. Their job is to facilitate the movement of information, designing not only the information network systems whose effectiveness is measured by how many millionths of a second the routing system can shave off information transfer but also cutting-edge applications and new algorithms that, in turn, create computer systems more attuned to the meaning of fluctuations in the global market.
Of course, the rapid rise of these new firms that manage high frequency trading raised oversight concerns. Unlike the established firms long present in the financial market, these firms were not required to register with any government oversight or regulatory board, not even the Securities and Exchange Commission. In short, the volume of financial information and sales handled by these high frequency trading outfits is never scrutinized, their books are never made public, and their financial maneuvers are not subject to the broader oversight meant to maintain a healthy economic environment. More problematic, there is no way to override the computer activities should the sales (and the volume is measured in millions of transactions per second) start to spin into some catastrophic meltdown.
Undoubtedly, high frequency trading proved cheaper and quicker, to the point that investors, particularly those born entirely within the age of the Internet and computer accessibility, began to rely on the system (market analysts, many skeptical about the emergence of an entirely automated market, dismissed that reliance as an addiction to the easy way, a way to avoid the hard work of market analysis and the human factor of assessing market volatility and viability). Even conservative, high-priced brokerage houses invested not merely in financial personnel to provide experienced trading intermediation but in trading infrastructure itself, spending billions on across-the-board revamping of computer systems, upgrading to high-speed servers, coding increasingly sophisticated programs to assess and even manipulate the market, and, most importantly, securing processing routes that provide efficient global links. In 2014, Matthew O'Brien, contributing editor for The Atlantic, described this massive escalation of technology within the industry as a kind of "intellectual arms race."
Clearly, in a short period of time high frequency trading became a defining and central market reality. In 2013, for instance, about half of the billions of stock transactions were facilitated and executed using automation. That share was expected to increase as the efficiency and security of machine-to-machine trading improved and as the speed of relayed information, sent along thousands of miles of fiber optic wiring, approached the speed of light itself (see Adler, 2012).
It is perhaps difficult, without a background in finance, to appreciate the magnitude of the impact of high frequency trading. Without the appropriate scale, the process and its potential advantage can seem more like picking up pennies in some infinitely large parking lot. If a stock in a shoe company trades at $2.00 a share and an investor purchases a volume of the stock through an automated purchase vendor and then, within seconds of the purchase, the price ticks up to $2.01, the same automated stock vendor will sell it, as it is programmed to sell at a margin of profit. For a stockbroker to peruse data and to horse-trade and quibble to that extent would be tedious, unrewarding, and time consuming, but a computer can do that evaluating effortlessly all day and all night. For a computer, pursuing a fraction of one thousandth of a cent is standard operating procedure.
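The rule in the shoe-company example can be sketched in a few lines of Python. This is a minimal illustration rather than an actual trading system; the function name, the prices, and the one-cent margin are all hypothetical.

    # Minimal sketch of the threshold rule from the example above (illustrative
    # only): buy at the quoted price, then sell as soon as the price has ticked
    # up by a programmed margin.

    def should_sell(buy_price, current_price, margin=0.01):
        """Return True once the price exceeds the purchase price by at least `margin` dollars."""
        return current_price >= buy_price + margin

    # Bought at $2.00; the quote ticks up to $2.01, so the program sells immediately.
    print(should_sell(2.00, 2.01))   # True
    print(should_sell(2.00, 2.005))  # False: margin not yet reached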
Simultaneously, the program can process data regarding what to do with that stock at that price, whether to move it or hold it. Those decisions will be uncomplicated by the necessary caution of the human factor; the program will be fed a constant stream of international financial information, including market moves, international stock trading data, government data on world market health, even measures of cell phone upticks that might indicate hot spots in trading. It is more information than a stockbroker could sift through in weeks. In milliseconds, the automated stock vendor measures patterns, assesses trends, tests prices, and logically makes the appropriate buy or sell move. To return to the example of the shoe company stock, if that apparently minuscule increase in stock value is captured across hundreds of shares traded thousands of times an hour, the impact of high frequency trading starts to become evident. Multiply that by the thousands of small and large companies that trade stock on the public market, and the sheer volume of automated trading becomes clearer. These automated investors are certainly not like old-school stockbrokers who were primarily committed to making their clients rich, hired to ensure their clients banked a profit at the end of the day through savvy observation of market movements. Rather, in high frequency trading the goal for the program is to act logically, methodically, cleanly, and efficiently.
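The scale argument can be made concrete with back-of-the-envelope arithmetic. The figures below are hypothetical, chosen only to show how one-cent gains aggregate across shares, trades, and companies.

    # Back-of-the-envelope arithmetic for the scaling argument above.
    # All figures are hypothetical.
    gain_per_share = 0.01      # one cent captured per share
    shares_per_trade = 100     # hypothetical order size
    trades_per_hour = 1_000    # hypothetical trade frequency for one stock
    stocks_followed = 500      # hypothetical number of companies tracked

    hourly_total = gain_per_share * shares_per_trade * trades_per_hour * stocks_followed
    print(f"${hourly_total:,.0f} per hour")  # $500,000 per hour under these assumptions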
Given the industry-wide interest in developing cutting-edge software applications that will facilitate and even manipulate stock market trading, and given the deep secrecy under which many investment firms cloak their software development divisions, cataloging the variety of software programs designed to work with trading is a challenge. As the industry moves toward being driven entirely by algorithms, perhaps the two best-known are the momentum algorithm and the mean-reversion algorithm. The momentum algorithm buys stock expected to rise; the mean-reversion algorithm sells stock expected to drop. Those expectations are shaped entirely by the constant stream of financial data. There is also an algorithm designed to weigh stocks in pairs, comparing activity in industries that are historically linked, for instance the fuel industry and the airline industry or the commodities market and the restaurant industry; these algorithms make financial moves by first weighing and comparing conditions in the different industries. Another, the market-maker algorithm, looks for quick profit according to the old-school conventional wisdom of buying low and then selling high, but it does so ever more rapidly and efficiently. "Stock exchanges can now execute trades in less than a half a millionth of a second, more than a million times faster than the human mind can make a decision" (Baumann, 2013).
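A toy sketch can clarify how the two signal families named above differ in their logic. The short window of recent prices, the use of a simple average, and the function names are assumptions made for illustration; real implementations are far more elaborate.

    # Toy sketch of momentum versus mean-reversion signals over a short window
    # of recent prices (illustrative assumptions only).
    from statistics import mean

    def momentum_signal(prices):
        """Buy when the latest price is above the recent average: the trend is expected to continue."""
        return "buy" if prices[-1] > mean(prices[:-1]) else "sell"

    def mean_reversion_signal(prices):
        """Sell when the latest price is above the recent average: the price is expected to fall back."""
        return "sell" if prices[-1] > mean(prices[:-1]) else "buy"

    recent = [2.00, 2.01, 2.02, 2.05]
    print(momentum_signal(recent), mean_reversion_signal(recent))  # buy sell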
Viewpoints
The larger question, of course, is whether high frequency trading will render the rest of the stock market obsolete. In high frequency trading, computers operate without perceptions greater than the logic of the current negotiation and without regard for the health of the companies involved, much less the well-being of the greater economy. The traditionally accepted value of centralized financial institutions was to guarantee a smoothly operating economy, reassuring investors that stock transactions are conducted in stable and transparent ways. Traders "want the benefits of high-frequency traders' technology, but [they] need to make sure the technology is being deployed safely and responsibly because everyone has a stake in the outcome" (Chilton, 2014).
As the volume of high frequency trading swells and more stock brokerages commit resources to the strategies of automated trading, the risk of machine-to-machine error grows. Opponents of the trend toward high frequency trading point out that, given the volume of trading done in milliseconds, any financial perturbation or glitch in the information systems could render the market chaotic and vulnerable to precipitous collapse, and that, given the vastness of the automated systems, any kill switch might be difficult to locate, much like running behind tipping dominoes. Some researchers have suggested batching trading orders into sequential auctions held at discrete intervals, thus replacing continuous trading and minimizing the possibility of snowballing (Budish et al., 2012).
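A minimal sketch of one such batch round, under simplified assumptions (single-unit orders, a midpoint clearing price), shows how orders collected during an interval could be cleared together rather than matched continuously as they arrive. The order format and clearing rule here are illustrative and are not the specific design proposed by Budish and colleagues.

    # Minimal sketch of clearing one batch interval: all bids and asks collected
    # during the interval are matched together at a single price (simplified,
    # single-unit orders; the midpoint serves as the uniform clearing price).

    def clear_batch(bids, asks):
        """Return a single clearing price if any bid meets or exceeds any ask, else None."""
        bids = sorted(bids, reverse=True)   # highest bids first
        asks = sorted(asks)                 # lowest asks first
        matched = [(b, a) for b, a in zip(bids, asks) if b >= a]
        if not matched:
            return None                     # no trade this interval
        b, a = matched[-1]                  # marginal matched pair
        return round((b + a) / 2, 4)

    print(clear_batch([2.02, 2.01, 1.99], [2.00, 2.01, 2.03]))  # 2.01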
Indeed, the so-called Flash Crash of May 6, 2010, in which the Dow Jones Industrial Average plummeted more than 600 points in just under five minutes and triggered a massive global panic, was initially blamed by conservative financial experts on untrackable high frequency trading. The initial explanation centered on computer systems acting on false information, most likely leaked into the global network at some portal point, which led programs to sell and buy their own unwanted stocks, bouncing the volume back and forth in a kind of feedback loop. However, much later, financial forensics indicated that the programs actually prevented the sales loop from escalating into a major global meltdown; the system corrected itself, patched across its own misinformation, and halted the accelerating sales spiral. The algorithm shut itself off from compounding its own error (Goldfarb, 2010).
Another major concern about high frequency trading from its earliest incarnations was that it provided an advantage only to large, technologically enabled firms, with smaller traders and average investors losing out on the potential benefits. Michael Lewis's 2014 book Flash Boys did much to draw public attention to the phenomenon and stoked concerns that stock exchanges had become an unfair playing field; Congress even held hearings on the implications of high frequency trading. While these concerns remained to some degree for many analysts, the rapid adoption of high frequency trading across the industry allayed the worst fears. By the late 2010s, even average investors were typically interacting with complex algorithms when buying or selling stocks, bonds, futures, or exchange-traded funds through various interfaces (Meyer, Bullock, & Rennison, 2018).
At any rate, old-school investment advisers remained reluctant to concede the implications of the computer-driven market. "High-frequency traders tend to narrow the bid-ask spread by protecting the market makers from bad news while they help their positions. Thus . . . trading costs get lower" (Conerly, 2014). The potential profits from high-volume, high-speed transactions position high frequency trading as a kind of inevitability for the stock market and for the global economy.
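The cost claim in the Conerly quotation can be illustrated with simple arithmetic; the spread figures below are hypothetical, chosen only to show why a narrower bid-ask spread lowers the cost of trading.

    # Hypothetical illustration of the trading-cost claim above: buying at the
    # ask and immediately selling at the bid costs roughly the spread per share.
    shares = 1_000
    wide_spread = 0.02     # hypothetical spread before high frequency market making, in dollars
    narrow_spread = 0.01   # hypothetical spread after

    print(f"round-trip cost before: ${shares * wide_spread:.2f}")   # $20.00
    print(f"round-trip cost after: ${shares * narrow_spread:.2f}")  # $10.00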
By the late 2010s, high frequency trading was firmly established and regarded as an industry staple rather than a new, disruptive technology. Indeed, many financial observers began to consider it a mature industry, one that, because of cost increases (such as fees for exchange data) and decreased volatility, was no longer capable of generating the kind of eye-opening profits that had won the method so much attention over the previous decade and a half. The speed of information transmission had reached such levels that most competitors operated on essentially equal footing. In one sign of the resulting slowdown, the Tabb Group consultancy estimated that the 2017 aggregated revenues of companies trading at high frequency in US stocks fell under $1 billion for the first time in almost a decade, after hitting $7.2 billion in 2009 (Meyer et al., 2018). Firms practicing high frequency trading underwent a wave of consolidation, with analysts keeping an eye on potential effects on liquidity. Several of the major high frequency trading firms cooperated in establishing a consortium to build an ultra-high-speed communications link, known as Go West, between Chicago and Tokyo (Meyer et al., 2018).
Terms & Concepts
Algorithm: A procedure or process designed to direct a computer system to execute a specific problem-solving protocol in a finite series of steps.
Latency: The measurable delay in a computer system between the time data is input and the time the desired result takes effect.
Liquidity: The ease with which a stock can be bought or sold in the market without negatively impacting the stock's value; a measure of the trading environment and of investors' confidence in the viability of the market.
Market volatility: The fluctuations in market activity caused by internal and external forces.
Quants: Nickname for computer software designers and analysts with the mathematical expertise to perform quantitative analysis on data.
Spread: The difference between the bid price and the asking price for a stock at any specific time.
Bibliography
Adler, J. (2012, August 3). Raging bulls: How Wall Street got addicted to light-speed trading. Retrieved April 1, 2015 from http://www.wired.com/2012/08/ff%5Fwallstreet%5Ftrading/
Stenfors, A., & Susai, M. (2018). High-frequency trading, liquidity withdrawal, and the breakdown of conventions in foreign exchange markets. Journal of Economic Issues, 52(2), 387–395. https://doi.org/10.1080/00213624.2018.1469883
Baumann, N. (2013, January/February). Too fast to fail: Is high-speed trading the next Wall Street disaster? Retrieved April 1, 2015 from http://www.motherjones.com/politics/2013/02/high-frequency-trading-danger-risk-wall-street
Budish, E., Cramton, P., & Shim, J. (2012). The high-frequency trading arms race: Frequent batch auctions as a market design response. University of Chicago Booth School of Business publications. Retrieved April 1, 2015 from http://faculty.chicagobooth.edu/eric.budish/research/HFT-FrequentBatchAuctions.pdf
Chilton, B. (2014, July 7). No need to demonize high-frequency trading. Retrieved April 1, 2015 from http://dealbook.nytimes.com/2014/07/07/no-need-to-demonize-high-frequency-trading/
Conerly, B. (2014, April 14). High frequency trading explained simply. Retrieved April 1, 2015 from http://www.forbes.com/sites/billconerly/2014/04/14/high-frequency-trading-explained-simply/
The fast and furious. (2012, February 25). Retrieved April 1, 2015 from http://www.economist.com/node/21547988
Goldfarb, Z. (2010, October 1). Report examines May's "flash crash," expresses concern over high-speed trading. Retrieved April 1, 2015 from http://www.washingtonpost.com/wp-dyn/content/article/2010/10/01/AR2010100103969.html
Menkveld, A. J. (2018). High-frequency trading as viewed through an electron microscope. Financial Analysts Journal, 74(2), 24–31. Retrieved from EBSCO Online Database Business Source Ultimate. http://search.ebscohost.com/login.aspx?direct=true&db=bsu&AN=129776624&site=ehost-live&scope=site
Meyer, G., Bullock, N., & Rennison, J. (2018, January 1). How high-frequency trading hit a speed bump. Financial Times. Retrieved from https://www.ft.com/content/d81f96ea-d43c-11e7-a303-9060cb1e5f44
O'Brien, M. (2014, April 11). Everything you need to know about high-frequency trading. Retrieved April 1, 2015 from http://www.theatlantic.com/business/archive/2014/04/everything-you-need-to-know-about-high-frequency-trading/360411/
Suggested Reading
Brogaard, J., Hendershott, T., & Riordan, R. (2014). High-frequency trading and price discovery. Review of Financial Studies, 27, 2267-2306. Retrieved March 22, 2015 from EBSCO Online Database Education Research Complete. http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=97239230&site=ehost-live
Clarke, T. (2014). High-frequency trading and dark pools: sharks never sleep. Law & Financial Markets Review, 8, 342-351. Retrieved March 22, 2015 from EBSCO Online Database Education Research Complete. http://search.ebscohost.com/login.aspx?direct=true&db=a9h&AN=100732472&site=ehost-live
Guo, X. (2017). Quantitative trading: Algorithms, analytics, data, models, optimization. Boca Raton, FL: CRC Press.
Lewis, M. (2014). Flash boys. New York, NY: Norton.
Manahov, V., & Hudson, R. (2014). The implications of high-frequency trading on market efficiency and price discovery. Applied Economics Letters, 21, 1148-1151. Retrieved March 22, 2015 from EBSCO Online Database Education Research Complete. http://search.ebscohost.com/login.aspx?direct=true&db=bth&AN=98645767&site=ehost-live
Patterson, S. (2013). Dark pools: The rise of the machine traders and the rigging of the U.S. stock market. New York, NY: Crown.