Originally published on Automated Trader
by Ian Salmon, Market Consultant – Finance, Accedian
Many of the high-profile items in MiFID II have been identified – unbundling, best execution, trade reporting and transaction data capture – and market commentary has already flagged many areas of concern and challenge.
But one key part is being overlooked – one that requires significant effort and investment ahead of the 3 January 2017 deadline and has the potential to derail all other efforts to comply.
Under MiFID II, Article 50 and its corresponding RTS 25 define the accuracy and granularity of the timestamps that must be captured on various records and reports.
The impact of this will be widespread, judging from the cross-references to it from other articles defining transaction and trade reporting, audit trail and even voice and RFQ record requirements. Many asset classes not covered by MiFID I will be caught in this horological net, which will ensnare venue operators and their members, participants and clients.
The expected landing point of the RTS requires these participants to timestamp every ‘reportable event’ in a transaction’s lifecycle to millisecond accuracy and granularity. High frequency traders must record to an even more stringent 1 microsecond granularity with 100 microsecond accuracy. However, there is some pragmatism on display – RFQ and voice activity need only be reported to 1 second granularity, a nod to the manual nature of those markets.
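A simple compliance check on recorded timestamps can make these tiers concrete. The sketch below (illustrative only – the category names and the string-based granularity test are our own, not drawn from the RTS text) verifies that a timestamp string carries at least the fractional-second digits each tier described above demands:

```python
# Illustrative mapping of the RTS 25 tiers described above to the minimum
# number of fractional-second digits a recorded timestamp must carry.
# Category names are our own shorthand, not regulatory terms.
REQUIRED_DIGITS = {
    "hft": 6,         # 1 microsecond granularity
    "electronic": 3,  # 1 millisecond granularity
    "voice_rfq": 0,   # 1 second granularity
}

def fractional_digits(ts: str) -> int:
    """Count fractional-second digits in an ISO-style UTC timestamp string."""
    if "." not in ts:
        return 0
    return len(ts.split(".")[1].rstrip("Z"))

def meets_granularity(ts: str, category: str) -> bool:
    """True if the timestamp is recorded finely enough for this tier."""
    return fractional_digits(ts) >= REQUIRED_DIGITS[category]

print(meets_granularity("2017-01-03T09:00:00.123456Z", "hft"))  # fine for HFT
print(meets_granularity("2017-01-03T09:00:00.123Z", "hft"))     # too coarse
```

Granularity is of course only half the story – a timestamp can carry six decimal places and still be wrong by milliseconds if the underlying clock has drifted from UTC.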
MiFID II objectives
Before we baulk at the breadth of data points and the accuracy demanded, we should look to the objective of this part of the regulation. Reliable transaction data enables regulators and participants alike to reconstitute historical transaction activity and compare it across venues and against prevailing market prices – whether for detecting cross-venue market abuse or for execution quality analysis.
The two underlying principles of Article 50 must be appreciated at this point: in measuring and reporting such minute intervals of time, accuracy and synchronisation are key. As electronic trading technology has grown more sophisticated, reconstituting the orders and trades completed every second on even a single venue requires microsecond-accurate data – do systems have the ability to record time at such high levels of granularity?
Without it, many transactions would be recorded with exactly the same time, making time alone, even on a single venue, a useless means of determining the sequence of events. Comparing between venues introduces a further challenge: those timestamps must be synchronised to an equivalent level of accuracy.
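The collision problem is easy to demonstrate. The sketch below (a toy simulation, not venue data) spreads ten thousand events – a plausible one-second burst on a busy venue – across a single second, then records them at millisecond granularity:

```python
import random

random.seed(1)

# Simulate 10,000 event times (in microseconds) within one second of trading.
event_times_us = sorted(random.randrange(1_000_000) for _ in range(10_000))

# Record each event at millisecond granularity, i.e. truncate to the bucket.
ms_stamps = [t // 1000 for t in event_times_us]

# With only 1,000 distinct millisecond values available, most events collide.
collisions = len(ms_stamps) - len(set(ms_stamps))
print(f"{collisions} of {len(ms_stamps)} events share a millisecond stamp")
```

With at most 1,000 distinct millisecond values in a second, at least 9,000 of the 10,000 events must share a stamp with another – the recorded time alone cannot sequence them.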
So in this ‘Formula One’ world, isn’t the existing technology already capable of capturing this data? Whilst many institutions have deployed timestamping to some degree, the mandated level of granularity and breadth of data is a seismic shift from current levels. Indeed, many legacy platforms are simply incapable of synchronised, absolute timestamping within acceptable tolerance levels.
Spectre haunting HFT
For HFT requirements, the path from hardware to software introduces inherent inaccuracies, making application-level timestamping a questionable approach to achieving 100 microsecond synchronisation to Coordinated Universal Time (UTC).
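One way to see why is to measure the software clock path itself. The sketch below (pure Python and purely illustrative – a production assessment would use hardware timestamping or PTP offset telemetry) samples the gap between back-to-back reads of the process clock; the variability of that gap is a floor on any accuracy an application-level timestamp can claim:

```python
import statistics
import time

# Take many pairs of back-to-back clock reads. The gap between them is the
# cost of simply asking for the time in software; its spread (jitter) bounds
# the precision of any application-level timestamp.
samples = []
for _ in range(10_000):
    t0 = time.perf_counter_ns()
    t1 = time.perf_counter_ns()
    samples.append(t1 - t0)

samples.sort()
print(f"median call-to-call gap: {statistics.median(samples)} ns")
print(f"99th percentile gap:     {samples[int(len(samples) * 0.99)]} ns")
```

On a loaded host, scheduler preemption and cache effects push the tail of this distribution far beyond the median – and none of it says anything about how far the clock itself has drifted from UTC, which is a separate synchronisation problem.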
More generally, there are tens of thousands of systems globally, supporting thousands of orders per second. Instrumenting each one to meet the required granularity in such a short space of time is a herculean task. Upgrading these mission-critical platforms is a costly, high-risk project for each and every market participant, and the mandatory documentation of the timing methodology and the annual audit of its functioning will add further ongoing ‘run-the-bank’ cost. In short, the financial impact to the industry will be immense.
For many, the spectre of hardware and application changes to meet this requirement will be the proverbial last straw on top of the inevitable exchange, MTF and associated platform mandatory upgrades. Thinking laterally, a single technology that passively sources and stores transaction records directly from the network – timestamped to microsecond accuracy with no latency or software overhead – minimises cost and, more importantly, risk. Such systems do exist: state-of-the-art FPGA-based technologies, tried and tested in adjacent industries and now set to make their debut in capital markets.
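Passive capture stamps each record at the wire rather than in the application, so the timestamps survive whatever the trading software does afterwards. As a minimal illustration of what such a record looks like (a sketch of the well-known classic libpcap file format, not of any vendor's FPGA product), the code below reads the seconds/microseconds timestamp carried in each per-packet capture header:

```python
import io
import struct

def read_pcap_timestamps(stream):
    """Yield (seconds, fractional) capture timestamps from a classic
    little-endian libpcap stream. Magic 0xA1B2C3D4 means the fractional
    part is microseconds; 0xA1B23C4D would mean nanoseconds."""
    header = stream.read(24)                      # pcap global header
    magic, = struct.unpack("<I", header[:4])
    assert magic in (0xA1B2C3D4, 0xA1B23C4D), "unrecognised pcap magic"
    while True:
        rec = stream.read(16)                     # per-packet record header
        if len(rec) < 16:
            return
        ts_sec, ts_frac, incl_len, _orig_len = struct.unpack("<IIII", rec)
        stream.read(incl_len)                     # skip the packet bytes
        yield ts_sec, ts_frac

# Build a tiny two-packet capture in memory to exercise the reader:
# global header (microsecond magic), then two records 543 microseconds apart.
buf = io.BytesIO()
buf.write(struct.pack("<IHHiIII", 0xA1B2C3D4, 2, 4, 0, 0, 65535, 1))
for sec, usec in [(1483401600, 123456), (1483401600, 123999)]:
    buf.write(struct.pack("<IIII", sec, usec, 4, 4) + b"\x00" * 4)
buf.seek(0)

stamps = list(read_pcap_timestamps(buf))
print(stamps)  # two timestamps, 543 microseconds apart
```

The point of the exercise: the timestamp is part of the capture record itself, written as the packet crosses the tap, with no dependence on the application's clock discipline.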
The financial markets have never faced change on this scale before. The industry must draw inspiration from emerging, cutting-edge technologies in other market verticals to find solutions to this problem that are sufficiently low-risk, affordable and timely.