17 February 2014

Trading: an Arms Race

Estimated reading time: 3 minutes

Just like other industries, trading is being automated at an increasing rate. As a result, mastery of statistics, mathematics, microeconomics and risk management no longer appears sufficient for a trader to operate efficiently in the market: computer science and machine learning must now be mastered as well. The age of robots has arrived in finance in a phase that has been called the Arms Race. In fact, as happened during the worst period of the Cold War, the activity has been surrounded by:

  1. secrecy: there is barely any literature published on the subject;
  2. espionage: there have been numerous court judgments against programmers who stole code from their companies and gave it to their new employers; and
  3. propaganda: it is no longer important what you know but what you appear to know, so that you can confuse your competitors (an obviously more subtle version of something that occurs in conversations outside work, interviews, events, etc.).

And no one wants to lose position within this wave of progress, because arriving late may mean reaching a point of no return and being expelled from the market. It is with good reason that there is talk of a paradigm shift in trading.

The problem most financial agents face in this field is always the same: a lack of expertise. Very few professionals master the fields mentioned above and, while human capital is being trained (there are notable cases, such as the British government organizing a doctorate in Computational Finance specifically for this purpose at the UK PhD Centre for Financial Computing and Analytics), decisions are taken by people with good intentions but insufficient expertise. Overspending on technology (oversized, at times developed before it is really usable, and partially obsolete by the day it is deployed), friction between teams due to individualistic incentives and undefined responsibilities, imbalances in project management… These are typical consequences of the chaos in which many financial agents find themselves on their path toward systematic trading. Meanwhile, the competition is becoming increasingly ferocious: some products have already seen their margins consistently reduced by more than 50% (Menkveld, 2013), meaning that the new trading paradigm is here to stay.

What few operators are clear about is that behind these developments are two types of agents: those who focus on maximizing technology and those who focus on maximizing strategies. While the former are dedicated to market arbitrage (exploiting its inefficiencies) on timescales of milliseconds, the latter are geared toward statistical arbitrage (exploiting inefficiencies that “will probably be revealed” according to a series of indicators). Of the two, the second group appears more interesting for most of the industry, not only because it does not depend on continuous spending on technology, unlike pure arbitrage, but also because it has a broader universe of possibilities, allowing it to compete for certain patterns against few or even no other agents; unlike arbitrage, where most of the universe is basic and known to all.
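As a rough illustration of the statistical-arbitrage logic described above, here is a minimal Python sketch of a mean-reversion signal on a price spread. The rolling window, the entry/exit thresholds and the z-score rule are all illustrative assumptions, not a description of any firm's actual strategy.

```python
def zscore_signal(spread, window=20, entry=2.0, exit=0.5):
    """Return a position (+1 long spread, -1 short spread, 0 flat) per
    period, based on a rolling z-score of the price spread.
    Window and thresholds are illustrative, not calibrated values."""
    signals = []
    position = 0
    for t in range(len(spread)):
        hist = spread[max(0, t - window):t]
        if len(hist) < window:
            signals.append(0)          # not enough history yet
            continue
        mean = sum(hist) / len(hist)
        var = sum((x - mean) ** 2 for x in hist) / len(hist)
        std = var ** 0.5 or 1e-12      # guard against zero variance
        z = (spread[t] - mean) / std
        if position == 0 and abs(z) > entry:
            position = -1 if z > 0 else 1   # bet on mean reversion
        elif position != 0 and abs(z) < exit:
            position = 0                    # spread normalized: close
        signals.append(position)
    return signals
```

The point of the sketch is the shape of the logic: the signal fires when an indicator says an inefficiency “will probably be revealed”, rather than when a riskless price discrepancy is already on the screen.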

The challenge of statistical arbitrage when it is intraday (the norm for market makers) is that it requires not only knowledge of the fields mentioned above but also their combination with one that has boomed in recent years: big data. The data in which intraday patterns are sought comply with the paradigm of the three Vs: high volume, variety of formats and structures, and velocity of processing. This makes things even more difficult in terms of human capital and costs. In fact, there is an added difficulty: one of the major tools of big data, MapReduce, is of little use here. The regression and machine learning algorithms used in the strategies are in general difficult to parallelize, which rules out techniques such as MapReduce for achieving the above-mentioned velocity.
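The difficulty with MapReduce mentioned above can be sketched in a few lines of Python. An aggregation such as a mean decomposes into independent per-shard maps followed by a single reduce, whereas an iterative learner like gradient descent is a recurrence in which each step depends on the previous one; the example objective and learning rate below are illustrative assumptions.

```python
from functools import reduce

# A mean is embarrassingly parallel: each data shard can be map'ped to a
# partial (sum, count) independently, then combined in one reduce step.
shards = [[1.0, 2.0], [3.0, 4.0], [5.0]]            # data split across nodes
partials = map(lambda s: (sum(s), len(s)), shards)   # map: per-shard totals
total, n = reduce(lambda a, b: (a[0] + b[0], a[1] + b[1]), partials)
mean = total / n

# Gradient descent, by contrast, is a sequential recurrence:
# w_{t+1} = w_t - lr * grad(w_t). Each step needs the previous weights,
# so the iterations themselves cannot be distributed as independent maps
# (only the per-iteration gradient over the data can be).
def gradient_descent(grad, w0, lr=0.1, steps=100):
    w = w0
    for _ in range(steps):   # inherently sequential loop
        w = w - lr * grad(w)
    return w

# Toy example: minimise f(w) = (w - 2)^2, whose gradient is 2 * (w - 2).
w_star = gradient_descent(lambda w: 2 * (w - 2), w0=0.0)
```

This is the sense in which the text says MapReduce “is of little use”: the map/reduce pattern accelerates the data-parallel inner work, but the outer training loop of most learning algorithms remains a chain of dependent steps.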

Overall, adaptation to the new market involves giving a greater role to technological and algorithmic synergies. This flexibility calls for new hierarchical structures with global roles, ensuring they can be fully exploited by e-commerce and trading, both across regions and across assets, often with legacies ranging from equity to foreign exchange and fixed income, in what has been termed the “equitization” of the rest of the assets. This is precisely where the advantage of the smaller agents lies (we could name some, ranging from hedge funds such as Renaissance Technologies, Citadel or D.E. Shaw to liquidity providers such as Jane Street and Susquehanna) and the big challenge of the large agents, generally more reluctant to adopt organizational changes.

Talent acquisition, organizational flexibility and the idea of a global team will thus determine the future of the current market agents.


  • Menkveld, A. J. (2013). High Frequency Trading and the New Market Makers. Journal of Financial Markets, 16.


Sergio Álvarez-Teleña

Strategies & Data Science, BBVA (Madrid)
