
Data quality: The value of low latency PCAP capture

Patrick Flannery
Group Head of Low Latency, LSEG

The need for exchange data of very high quality, either raw or lightly normalised, has never been greater than it is today. Tick History – PCAP meets these robust standards, and Refinitiv, an LSEG business, is investing further to meet tomorrow’s evolving customer requirements.


  1. Data quality standards for historical data are more rigorous for use cases aligned with the electronic trading lifecycle.
  2. Ensuring lossless capture – the hallmark of high data quality – requires superior technology, processes and people.
  3. Refinitiv is investing in Tick History – PCAP and its capture technology, Real-Time – Ultra Direct, to expand coverage and other aspects of quality to meet the evolving needs of the marketplace.

For more data-driven insights in your inbox, subscribe to the Refinitiv Perspectives weekly newsletter.

As the use of technology in trading grows – with electronification, algorithms, artificial intelligence, machine learning and more – the importance of deploying high-quality data is increasing dramatically.

So, packet-captured (PCAP) historical data that supports use cases aligned with ultra-low latency trading needs to be both extremely robust and precise. With the acquisition of MayStreet in May 2022 by LSEG – Refinitiv’s parent company – there is now significant potential to grow Refinitiv Tick History – PCAP data across several dimensions.

Discover more about Refinitiv Tick History – PCAP, a cloud-based, 20+ petabyte repository of ultra-high-quality global market data, captured directly at the data centre level.

What is data quality?

For electronic trading needs, data quality for market data is determined by five key factors:

  • Accuracy – To be accurate, the market data needs to be valid. That is, the data must contain the values disseminated by the exchange and it needs to be of the highest fidelity.
  • Accessibility – Quality data is accessible when and where you want it.
  • Completeness – This means that data sets must contain all of the data elements provided by the exchange and that all data required for a use case is available.
  • Latency – Data is captured as it comes out of the exchange, with minimal network hops and high-performance network infrastructure (switches/routers) to minimise network latency.
  • Usability – The ability of financial services firms to use data in the format that they prefer.

Data quality elements

Electronic trading needs require historical data with the same robust quality standards as the original feed.

Refinitiv Tick History – PCAP provides data captured directly from exchanges using Real-Time – Ultra Direct, Refinitiv’s ultra-low latency feed handler – with high levels of accuracy, accessibility, completeness, and usability, as well as low levels of latency in the capture process. This ensures that financial services firms can trust Tick History – PCAP for even the most sensitive use cases.

How are gaps in data addressed?

While Tick History – PCAP provides robust historical data today, its data quality can always evolve further to support the developing demands of customers’ use cases.

For example, during T+1 processing, Tick History – PCAP automatically arbitrates between the primary and backup captures at the primary data centre, where both are available and where doing so has no impact on processing time. The arbitrated data is examined for drops the next morning; if any are found, the team investigates whether including backup sites would be beneficial.
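The arbitration step described above can be illustrated with a minimal sketch. This is not Refinitiv's implementation; the `Packet`, `arbitrate` and `find_gaps` names are hypothetical, and the sketch assumes packets carry exchange-assigned sequence numbers.

```python
# Illustrative sketch (not Refinitiv's code): arbitrate two captures of the
# same feed by sequence number, preferring the primary capture, then report
# any sequence gaps that remain after the merge.
from dataclasses import dataclass


@dataclass(frozen=True)
class Packet:
    seq: int        # exchange-assigned sequence number
    payload: bytes  # raw message bytes as captured


def arbitrate(primary, backup):
    """Merge two captures; the primary copy wins wherever both saw a packet."""
    merged = {p.seq: p for p in backup}
    merged.update({p.seq: p for p in primary})  # primary overwrites backup
    return [merged[s] for s in sorted(merged)]


def find_gaps(packets):
    """Return (first_missing, last_missing) ranges left in a merged stream."""
    seqs = [p.seq for p in packets]
    return [(prev + 1, cur - 1)
            for prev, cur in zip(seqs, seqs[1:])
            if cur != prev + 1]
```

Any ranges returned by `find_gaps` after arbitration correspond to the "drops" examined the next morning.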

Many drops occur at the exchange level; TCP replays – utilities for editing and replaying previously captured network traffic – can help remedy these.

Over the coming months, the Tick History – PCAP team plans to include TCP replays with all the data captures. If there is a drop, the technology will automatically capture the replays and deliver those with the feed.
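The planned replay step can be sketched as follows. This is a hedged illustration only, assuming the replayed packets arrive keyed by the same sequence numbers as the original capture; the function names are invented for this example.

```python
# Illustrative sketch (not Refinitiv's code): fold replayed packets into a
# captured stream so the delivered data has no sequence gaps. Replays only
# fill holes; packets seen in the original capture are never overwritten.
def patch_with_replays(captured, replays):
    """captured/replays map sequence number -> payload bytes."""
    patched = dict(replays)
    patched.update(captured)  # original capture takes precedence
    return dict(sorted(patched.items()))


def missing_seqs(stream, first, last):
    """Sequence numbers absent from the stream over [first, last]."""
    return [s for s in range(first, last + 1) if s not in stream]
```

After patching, `missing_seqs` over the session's sequence range should come back empty; anything left indicates a drop the replays could not cover.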

Longer term, the technology will automatically perform the drop examination processes before arbitration, allowing us to determine the best mix of sites to include in that night’s arbitration. Ultimately, this will deliver even more accurate data to customers for deployment in their most sensitive use cases.
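One simple way to picture "determining the best mix of sites" is a greedy coverage selection: keep adding capture sites while they close remaining gaps. This is purely a hypothetical sketch of the idea, not the actual selection logic.

```python
# Hypothetical illustration: choose which capture sites to include in a
# night's arbitration by greedily adding the site that closes the most
# remaining sequence-number gaps, stopping when no site helps further.
def best_site_mix(site_captures, first, last):
    """site_captures: {site_name: set of captured sequence numbers}.
    Returns (chosen_sites, sequence numbers no site captured)."""
    wanted = set(range(first, last + 1))
    chosen, covered = [], set()
    while covered != wanted:
        # Pick the site contributing the most still-missing packets.
        site = max(site_captures,
                   key=lambda s: len(site_captures[s] & (wanted - covered)))
        gain = site_captures[site] & (wanted - covered)
        if not gain:
            break  # no remaining site closes any gap
        chosen.append(site)
        covered |= gain
    return chosen, sorted(wanted - covered)
```

In practice the choice would also weigh processing time, as the article notes, but the sketch shows why examining drops before arbitration lets the site mix be decided automatically.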

End-to-end process flow to achieve data quality

What else does the acquisition of MayStreet by LSEG mean for the data quality of Refinitiv Tick History – PCAP?

Refinitiv is deeply committed to investing in the technology that supports Refinitiv Tick History – PCAP and innovating to enhance the Real-Time – Ultra Direct feed handler – all with a focus on data quality.

In addition to expanding coverage, ongoing improvements include:

  • Switching to independent time sources to add redundancy, reduce latency and improve time accuracy.
  • Undertaking implementation improvements, including enhancements in backhaul timing at all sites worldwide.
  • Upgrading our infrastructure so that data can be captured as near to an exchange’s matching engine as possible using capture switches that are 100 nanoseconds from the wire.

Many other improvements are planned or are underway. For a more detailed exploration of data quality for historical data for electronic trading, and of the future direction of Tick History – PCAP, read our new Expert Talk.


