Who benefits when the computer takes over the stock market?
The question has been debated since at least the first "program trades" of the 1980s, which were blamed for contributing to the stock market crash of 1987, and it has never been fully resolved. Sketched most broadly, defenders claim that the increased liquidity resulting from a proliferation of split-second computer-driven electronic trades helps keep markets moving and lowers transaction costs for everyone involved. But critics argue that automated trading increases volatility, exacerbating wild swings up and down in the market, and gives those entities with access to the most computerized firepower a clear advantage over regular old retail investors.
The arrest in early July of former Goldman Sachs programmer Sergey Aleynikov, accused of stealing proprietary stock-trading code from his employer, has brought this long-simmering argument to a full boil. And some rather amazing facts have come to light. Notably, since roughly the beginning of the bear market kicked off by the financial crisis, the amount of stock trading accounted for by non-humans has surged. As reported by the Financial Times, Tabb Group, a consultancy, says that "high-frequency trading" firms are driving "almost three quarters of all U.S. equities trading volume." As recently as five years ago, such trading accounted for less than one-quarter of all volume.
No humans or value investors involved. As defined by the FT:
[High frequency trading firms] typically employ trading strategies that are based not on company earnings prospects and other fundamentals, but on arbitraging minute differences in share prices and trading speeds -- known as latency -- between exchanges and other trading venues.
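To make that definition concrete, here is a minimal sketch (in Python) of the cross-venue price comparison the FT is describing. The venue names, quotes and one-cent threshold are hypothetical, and a real system would also have to reckon with fees, order-book depth and the race to execute both legs before the prices converge -- which is where latency comes in.

```python
# Toy sketch of cross-venue arbitrage: spot when one venue's bid
# exceeds another venue's ask for the same stock. Venue names, quotes
# and the threshold are hypothetical, purely for illustration.

# Hypothetical snapshot of the same stock quoted on two venues,
# with prices in whole cents to avoid floating-point comparisons.
quotes = {
    "VENUE_A": {"bid": 2501, "ask": 2502},
    "VENUE_B": {"bid": 2503, "ask": 2504},
}

def find_arbitrage(quotes, min_edge_cents=1):
    """Yield (buy_venue, sell_venue, edge_in_cents) whenever one venue's
    bid exceeds another venue's ask by at least min_edge_cents."""
    for buy_venue, buy_quote in quotes.items():
        for sell_venue, sell_quote in quotes.items():
            if buy_venue == sell_venue:
                continue
            edge = sell_quote["bid"] - buy_quote["ask"]
            if edge >= min_edge_cents:
                yield buy_venue, sell_venue, edge

for buy, sell, edge in find_arbitrage(quotes):
    print(f"buy on {buy} at the ask, sell on {sell} at the bid: "
          f"{edge} cent(s) per share")
```

The edge here is a single cent per share; it only pays if you can act on it, at scale, before everyone else sees the same discrepancy.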
In this business, it's all about having the best hardware and the smartest software, and exploiting every possible technical advantage -- such as physically locating your computer servers as close as possible to the exchanges' electronic trading systems so as to save every possible millisecond.
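The milliseconds in question are governed by physics as much as by software. A rough back-of-the-envelope calculation -- the distances are illustrative, and the two-thirds-of-light-speed figure for signals in optical fiber is an approximation -- shows why a server in the exchange's own data center beats one across town, let alone one in another city:

```python
# Back-of-the-envelope propagation delays: why colocation matters.
# Signals in optical fiber travel at roughly two-thirds the speed of
# light in a vacuum; the distances below are illustrative, not exact.

SPEED_OF_LIGHT_KM_PER_S = 299_792          # vacuum
FIBER_SPEED_KM_PER_S = SPEED_OF_LIGHT_KM_PER_S * 2 / 3

routes_km = {
    "server in the exchange's data center (colocated)": 0.1,
    "server across town": 20,
    "Chicago to New York (straight line)": 1150,
}

for route, km in routes_km.items():
    one_way_ms = km / FIBER_SPEED_KM_PER_S * 1000
    print(f"{route}: ~{one_way_ms:.3f} ms one way")
```

Real fiber routes are longer than straight lines, and switches and software add their own delays, but the orders of magnitude are the point: at this scale, latency is physical distance.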
Sergey Aleynikov worked on exactly the kind of software employed for high-frequency trading, a fact that has generated some rather overheated claims about Goldman Sachs' ability to manipulate the stock market. But while Goldman is arguably the most ferociously effective competitor in this arena, the company is far from the only player. As the numbers reported by the FT suggest, high-frequency trading is the most popular game in town. Everybody's doing it, including a lot of players you've never heard of before, with names -- TradeBot?! -- stolen right out of the cyberpunk classics.
Back in April, Rick Bookstaber, author of "A Demon of Our Own Design: Markets, Hedge Funds, and the Perils of Financial Innovation," considered the rising computer tide in a post titled "The Arms Race in High Frequency Trading." He concluded that the trend was without social benefit.
... [A]rms races are negative sum games. The arms in this case are not tanks and jets, but computer chips and throughput. But like any arms race, the result is a cycle of spending which leaves everyone in the same relative position, only poorer. Put another way, like any arms race, what is happening with high frequency trading is a net drain on social welfare...
...Does anyone really get a benefit in having the latency of their trade cut by milliseconds -- except for the fact that their competitor is also spending the money to cut his latency? Should anyone care if a news event hits market prices in twenty-nine milliseconds rather than thirty milliseconds? Does it do anything to make the markets more efficient? Does it add any value to society?
Many of the respondents to Bookstaber's post pushed back against his thesis -- arguing that relying on the latest technology to eke out profits is no different now than it was when the semaphore, the telegraph or the telephone collapsed distances and gave the technologically savvy an informational leg up on their competitors. But as the Wall Street Journal's MarketBeat noted in June:
With the rise of these automated funds, the stock market is more prone than ever to large intraday moves with little or no fundamental catalyst. Computers don't analyze the news (although some strategies use headlines as triggers) or seek to justify their buying and selling. Even in the relative quiet of the last three months, investors have often watched individual stocks or sectors move by 10 percent or more without explanation.
How does an efficient markets theorist justify such a rapid upsurge in near-random noise and unpredictable volatility? Can this really be seen as a useful process of price discovery? More to the point, isn't it the exact opposite of transparency? Markets are supposed to work best, in theory, the more information we all have about what's going on. But we seem to know less and less -- at a rapidly increasing rate.