Fintech Finance presents: The Fintech Magazine 19


COMMENTARY: TIMESTAMPING

A short history of financial time

Regulators require timestamping of trading data. But how do we know it’s accurate? We asked three experienced clock watchers – Cisco’s James Beeken, Leon Lobo at NPL and Txtsmarter’s Hugh Cumberland.

Time might be a human construct – a convenient way for our brains to create a framework for our lives – and Einstein might have planted the notion that it’s entirely relative. But when it comes to financial services, being able to track it accurately, down to a quadrillionth of a second, is both a regulatory requirement and, in parts of the industry, a distinct competitive advantage.

James Beeken, product specialist for the ultra-low-latency product range at Cisco, which provides the physical network architecture to facilitate trading through the world’s exchanges, explains: “You have two things going on in the market. One is the regulatory obligation to ensure the market operates fairly, by the rules. It achieves that by obliging all trading entities to have the ability to reference back to a universal clock source and time-stamp activity to a defined degree of accuracy. That data has to be stored for some considerable while. If a market event occurs that requires investigation – a flash crash, for example – the regulator can go to all the parties that were in and around that incident, access their information and recreate the scenario to understand exactly what happened.
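What that obligation can look like in practice is easier to see with a small example. The sketch below is an illustration only – not anything prescribed by the regulation or described by Beeken – showing reportable events recorded with nanosecond UTC timestamps so that logs from several parties can later be merged into a single timeline. The event fields are hypothetical, and the host clock is assumed to already be disciplined to UTC.

```python
# A minimal, illustrative sketch: record reportable events with nanosecond UTC
# timestamps so that logs from several parties can later be merged and sorted
# into one timeline. Assumes the host clock is already disciplined to UTC
# (e.g. via GNSS or PTP); the fields and names are hypothetical, not a
# MiFID II schema.
import time
from dataclasses import dataclass, field

@dataclass(order=True)
class TradeEvent:
    ts_utc_ns: int                       # nanoseconds since the Unix epoch, in UTC
    venue: str = field(compare=False)
    order_id: str = field(compare=False)
    action: str = field(compare=False)   # e.g. "new", "amend", "cancel", "fill"

def record(venue: str, order_id: str, action: str) -> TradeEvent:
    """Stamp an event against the local UTC-disciplined clock."""
    return TradeEvent(time.time_ns(), venue, order_id, action)

# Reconstructing a scenario only makes sense if every party's clock was
# traceable to the same UTC reference.
log_a = [record("VENUE-A", "A1", "new"), record("VENUE-A", "A1", "fill")]
log_b = [record("VENUE-B", "B7", "new")]
for ev in sorted(log_a + log_b):
    print(ev.ts_utc_ns, ev.venue, ev.order_id, ev.action)
```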

“But trading organisations – our client base, including banks and high-frequency traders (HFTs) – that are looking to eke more margin out of market opportunity also need to understand how their server infrastructure, strategy, and therefore their overall business, is performing. To do that, they need to monitor and analyse their network down to a picosecond level of detail. In recent years, the network has progressed from a millisecond to microsecond, nanosecond, and now picosecond realm of analytical requirement. Not only do we need to know exactly how our current live networks are performing, we also need to be able to understand the exact impact of change, of new bits of hardware, firmware and strategy.”
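Measuring that impact of change ultimately comes down to comparing capture timestamps taken at different points in the network. The sketch below is an assumption for illustration – not Cisco’s tooling – showing only the arithmetic of summarising a latency distribution from paired ingress and egress timestamps; real picosecond-level capture needs specialised hardware, so the values here are plain integers in nanoseconds.

```python
# Illustrative sketch (not Cisco's tooling): given paired capture timestamps
# for the same packets at two points in the network, summarise the latency
# distribution so a hardware or firmware change can be compared before/after.
def latency_stats(ingress_ns: list[int], egress_ns: list[int]) -> dict[str, float]:
    deltas = sorted(e - i for i, e in zip(ingress_ns, egress_ns))
    return {
        "mean_ns": sum(deltas) / len(deltas),
        "p99_ns": deltas[int(0.99 * (len(deltas) - 1))],   # rough 99th percentile
        "max_ns": deltas[-1],
    }

before = latency_stats([0, 10, 20, 30], [850, 866, 872, 881])
after = latency_stats([0, 10, 20, 30], [790, 801, 808, 815])
print("p99 change:", after["p99_ns"] - before["p99_ns"], "ns")   # negative = faster
```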

Being able to time-stamp a data exchange with counterparties maybe many thousands of miles apart – indeed, in different time zones – presupposes that both have accurate clocks with which to record it. That’s where the UK’s National Physical Laboratory (NPL) and similar guardians of the international timescale (also known as Coordinated Universal Time, or UTC) come in.

The NPL is the keeper of UK time – arbiter of the country’s definitive second since UTC replaced Greenwich Mean Time (GMT) as the international standard of civil time in 1972. The NPL uses caesium fountain atomic clocks and primary frequency standard apparatus to realise the internationally accepted scientific definition of a second.

“Currently, the caesium fountains are accurate and stable at one part in 10 to the 16 level, so the 16th decimal place,” says Leon Lobo, head of the National Timing Centre at the NPL. “But we are developing the next generation of clocks, which are accurate and stable at one part in 10 to the 18, the 18th decimal place. That’s vital to develop commercial devices at picosecond or femtosecond [quadrillionth of a second] level.”

He’s acutely aware of how the world’s financial system relies on keeping that UK second ticking in synchronicity with those of other UTC labs. “MiFID II RTS 25 talks about time-stamp traceability for all reportable events to UTC, which is formulated from data submitted monthly by all UTC labs around the world,” explains Lobo. “UTC(NPL) is the UK’s national timescale, and UTC(USNO) is the US Naval Observatory’s, which feeds the GPS constellation for time and positioning. All of these national labs are delivering the time, whether directly over the internet, over RF broadcast, or via GNSS constellations like Galileo, GPS, GLONASS and BeiDou. It’s not about how an organisation receives it, though – GPS receivers or direct feeds from a national lab – but about being able to demonstrate traceability of the time-stamp for regulatory compliance. It is incredibly important to consider the entire chain, back to source. And without a common source, it’s incredibly difficult for regulators to unpick who did what when.

“If you have an infrastructure in one datacentre, and an infrastructure in another datacentre, and you’re time-stamping all this activity, those time-stamps have to be relevant to each other,” he continues. “Organisations go to GPS signals, or direct feeds from organisations like NPL, to look at the whole business performance, across the estate, globally, and understand exactly what that performance is.”
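That need for timestamps in separate datacentres to be relevant to each other comes down to knowing each local clock’s offset from a common reference. As a purely illustrative aside – this is the textbook NTP-style calculation, not a description of how NPL or a GNSS feed delivers time – the offset of a local clock against a reference can be estimated from two pairs of send and receive timestamps:

```python
# Textbook NTP-style offset estimate, shown only to illustrate comparing a
# local clock with a common reference; it is not how NPL delivers time.
# t1 and t4 are read from the local clock, t2 and t3 from the reference,
# all in nanoseconds.
def clock_offset_ns(t1: int, t2: int, t3: int, t4: int) -> tuple[float, int]:
    """Return (estimated offset of the local clock vs the reference, round-trip delay)."""
    offset = ((t2 - t1) + (t3 - t4)) / 2
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Example: the reference is roughly 1.5 microseconds ahead of the local clock.
offset, delay = clock_offset_ns(t1=1_000_000, t2=1_003_500, t3=1_003_600, t4=1_004_100)
print(f"offset ≈ {offset:.0f} ns over a {delay} ns round trip")
```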


