Since then, more and more futures and options markets have moved away from floor trading to electronic trading, creating greater opportunities for traders and investors to develop and deploy algorithms. Algorithmic options trading is one example where there has been a marked uptick, especially in the U.S.
High-frequency trading firms have developed multi-asset strategies to trade equities – both derivatives and the underlying cash equities – and FX in order to exploit arbitrage opportunities or delta hedge their positions. Banks and vendors have also developed multi-asset algorithms for their clients. One of the hottest areas of algorithm development has been the energy market, where, for example, technology investments by exchanges such as ICE have opened energy futures up to HFT.
What areas of trading will algorithms move into next?
The fixed income market is primed for development as some debt products become more electronic and exchange traded. Vega-Chi (no relation to Chi-X) was launched at the start of this year as Europe's first MTF for convertible bonds. It's accessible to buy-side algorithms that want to tap into its liquidity, and it provides a mechanism to hedge a convertible bond position by trading the underlying, via smart automated order types, in the markets it connects to. Credit Suisse also launched a fixed income algorithmic trading platform in June 2010 to trade pair strategies in U.S. fixed income futures and cash U.S. Treasuries.
New opportunities to algorithmically trade other fixed income products will continue to emerge, especially on the back of potential regulatory changes that may force more OTC products, such as interest rate swaps and credit default swaps, to be standardized and traded on exchange.
Exchanges are trying to get in on the game as well. But although some offer trading in multiple assets, it is done across different trading platforms, and at best there is a connectivity layer linking them.
I am not aware of any exchange-based, single high-speed, high-throughput, multi-asset class trading system that would allow smart order types to trade, for example, an equity derivative and delta hedge the cash position in the underlying on a single platform with low latency.
As a firm we are exploring this opportunity by developing a matching engine that could facilitate this kind of trade, and the latency debate will have to move on now that we have achieved a matching engine with 16 microseconds of round-trip latency. Of course, the speed of light is the ultimate constraint.
Developments of this nature could result in smart, multi-legged order types on a single exchange platform in which a futures trade automatically triggers a delta hedge in the underlying stocks. Today, this sort of strategy is typically handled in systems outside the exchange. But exchanges/MTFs are beginning to deploy smart order routing systems in Europe, as they did in the U.S. – take BATS Europe, for example. As a result, some exchanges are no longer just closed systems where buyers and sellers meet, but more open ecosystems with a diverse network of users, both inbound and outbound.
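To make the delta hedge leg concrete, here is a minimal sketch of how a multi-legged order of this kind might size the cash-equity hedge for an index futures fill. The names and structure (Constituent, hedge_legs, the two-stock index) are illustrative assumptions, not any exchange's actual order type.

```python
from dataclasses import dataclass

@dataclass
class Constituent:
    symbol: str
    weight: float   # stock's weight in the index (weights sum to 1.0)
    price: float    # current cash price

def hedge_legs(futures_qty: int, multiplier: float, index_level: float,
               constituents: list[Constituent]) -> dict[str, int]:
    """Share quantities that offset the delta of an index futures fill.

    A long future carries notional futures_qty * multiplier * index_level;
    selling that notional across the constituents by weight hedges the
    cash side (a futures delta of ~1.0 is assumed for simplicity).
    """
    notional = futures_qty * multiplier * index_level
    legs = {}
    for c in constituents:
        # Negative quantity = sell shares to hedge a long futures position.
        legs[c.symbol] = -round(notional * c.weight / c.price)
    return legs

# Example: buy 10 futures on a hypothetical 2-stock index, then hedge.
stocks = [Constituent("AAA", 0.6, 50.0), Constituent("BBB", 0.4, 25.0)]
print(hedge_legs(10, 10.0, 1000.0, stocks))  # {'AAA': -1200, 'BBB': -1600}
```

On a single low-latency platform, the futures fill and these offsetting stock orders could in principle be handled atomically by one matching engine rather than stitched together across systems.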
Is it a case of maintaining the status quo or continually adapting?
Generally, firms are always re-evaluating their algorithmic trading strategies and recalibrating them, both on performance grounds and to prevent competitors from getting a handle on them. In addition, reviews are often triggered by major shifts in quantitative data – trading patterns that deviate markedly from the norm.
Trends emerge that are not necessarily reflected in historical data, so trading scenarios arise that are very different from anything seen previously. Such new or non-standard trends make it more difficult for an algorithm to interpret the data it reacts to, particularly in event-driven trading.
When many new or very different use cases emerge, algorithms often fail to perform as intended: there are too many exception conditions that buck the statistical trends they were calibrated on, leading to adverse performance. At that point it makes sense to take a step back, analyze the data to better understand the behavior and identify the new or modified trends, and come back into the market with enhanced algorithms that reflect the new information and can therefore perform better.
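As a rough illustration of that "step back and recalibrate" trigger, the sketch below flags observations of some trading metric (daily volume, say) that deviate markedly from their recent history. The window and z-score threshold are arbitrary assumptions chosen for illustration.

```python
from collections import deque
from statistics import mean, stdev

class RegimeMonitor:
    """Flags observations that deviate markedly from recent history."""

    def __init__(self, window: int = 60, z_threshold: float = 3.0):
        self.history = deque(maxlen=window)
        self.z_threshold = z_threshold

    def update(self, value: float) -> bool:
        """Record a new observation; True means it bucks the trend."""
        flagged = False
        if len(self.history) >= 2:
            mu, sigma = mean(self.history), stdev(self.history)
            flagged = sigma > 0 and abs(value - mu) / sigma > self.z_threshold
        self.history.append(value)
        return flagged

# The last value deviates sharply: time to pull back and recalibrate.
monitor = RegimeMonitor(window=5)
for v in [100, 101, 99, 100, 102, 160]:
    if monitor.update(v):
        print(f"deviation at {v}: step back, analyze, recalibrate")
```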
Can we move from software-based trading techniques to hardware-based algorithms and matching engines on a computer chip?
A field-programmable gate array (FPGA) is effectively a hardware chip that can be programmed and can reduce latency. This works well for low-change environments such as simple simulation and real-time market data processing, and as a result some trading firms and vendors have employed such techniques.
Additionally, accurate latency measurement is valuable, so firms like Corvil use FPGA time stamping to measure latency more accurately than can be done at the software level. For high-change distributed systems, however, software-based development is not going away anytime soon. FPGAs are not in widespread use for trading logic today because the development time, maintenance effort and diagnostic effort required far outweigh any potential speed advantage.
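For contrast, here is a rough sketch of what latency measurement looks like at the software level: timestamps taken in user space include OS and network-stack jitter that wire-level FPGA time stamping avoids. The send and receive callables are placeholders, not a real connectivity API.

```python
import time

def measure_round_trip(send, receive, samples: int = 1000) -> float:
    """Median round-trip latency in microseconds, seen from user space."""
    results = []
    for _ in range(samples):
        t0 = time.perf_counter_ns()
        send(b"ping")   # placeholder: submit an order / message
        receive()       # placeholder: block until the ack comes back
        t1 = time.perf_counter_ns()
        results.append((t1 - t0) / 1_000)  # nanoseconds -> microseconds
    results.sort()
    return results[len(results) // 2]

# Demo with no-op placeholders; real use would wrap exchange connectivity.
print(measure_round_trip(lambda msg: None, lambda: None, samples=100))
```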
What's more, the cost-benefit analysis does not support the idea of having a matching engine on a chip. As an exchange matching engine technology vendor, we expect 16-, 32-, and 64-core commodity chips with hundreds of gigabits of external bandwidth within the next one to three years, and these will support all matching needs, even if you were to put the entire U.S. options trading volume (more than 500,000 updates a second) on a single chip.
The problem is scaling out to face hundreds of users while avoiding the deployment of dozens of gateway computers to support their connections. Each installation has only a few CPUs (central processing units) performing matching, with many more CPUs on the periphery. The periphery is the problem, and in fact is where much of the innovation lies. Hardware assistance will increasingly target power consumption rather than speed: the exchange of the future needs to be green and power efficient.
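To illustrate why the matching step itself is cheap enough for commodity hardware, here is a minimal sketch of a price-time priority limit order book, the core of a matching engine. It is a toy: real engines add risk checks, persistence and market data fan-out, which is exactly the periphery discussed above.

```python
import heapq
import itertools

class OrderBook:
    """Price-time priority limit order book (matching core only)."""

    def __init__(self):
        self._seq = itertools.count()  # arrival order -> time priority
        self.bids: list[list] = []     # heap of [-price, seq, qty]
        self.asks: list[list] = []     # heap of [price, seq, qty]

    def submit(self, side: str, price: float, qty: int) -> list[tuple]:
        """Match an incoming limit order; rest any remainder. Returns fills."""
        fills = []
        opposite = self.asks if side == "buy" else self.bids
        while qty and opposite:
            top = opposite[0]
            top_price = top[0] if side == "buy" else -top[0]
            crosses = top_price <= price if side == "buy" else top_price >= price
            if not crosses:
                break
            traded = min(qty, top[2])
            fills.append((top_price, traded))
            qty -= traded
            top[2] -= traded
            if top[2] == 0:
                heapq.heappop(opposite)     # resting order fully filled
        if qty:  # rest the unmatched remainder on the book
            key = -price if side == "buy" else price
            book = self.bids if side == "buy" else self.asks
            heapq.heappush(book, [key, next(self._seq), qty])
        return fills

# Example: a resting sell at 100 is partially lifted by an incoming buy.
book = OrderBook()
book.submit("sell", 100.0, 500)
print(book.submit("buy", 100.0, 200))   # -> [(100.0, 200)]
```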
Beyond this what does the future of algorithms hold?
Some vendors are looking to develop news-driven algorithms rather than typical quantitative event-driven algorithms. There is also talk of neural trading algorithms that learn by doing, although in quantitative high-frequency trading, despite a high level of automation, a human being constantly observes the results and recalibrates the algorithms.
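As a toy illustration of "learning by doing," the sketch below updates a linear predictor after every observed outcome – the simplest form of the online adaptation that neural trading algorithms would take much further. The features, learning rate and signal here are assumptions for illustration only, not a production strategy.

```python
def online_update(weights, features, outcome, lr=0.01):
    """One stochastic-gradient step for a linear predictor of the next move."""
    prediction = sum(w * x for w, x in zip(weights, features))
    error = outcome - prediction
    return [w + lr * error * x for w, x in zip(weights, features)]

# Each tick: predict, observe the realized move, adapt -- while, as noted
# above, a human still watches the aggregate results and recalibrates.
weights = [0.0, 0.0]
for features, realized_move in [([1.0, 0.5], 0.2), ([0.8, -0.1], -0.1)]:
    weights = online_update(weights, features, realized_move)
print(weights)
```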
Some may disagree, but for me such things mostly remain at the R&D stage, in the realm of academia, until consistent commercial use is proven. Nonetheless, the rapid advance of technology means we may not be as far from proven, commercially deployed neural trading applications as we think.
It reminds me of the film The Terminator: maybe the machines really will take over.