Banks should target data in the displacement of Libor

By Roy Kirby, Head of Core Products at SIX 

“It’s lights out, and away we go!” This is a familiar phrase for F1 fans and, with the first race of the season last month in Bahrain, one they will be looking forward to hearing again. There are many parallels between F1 and financial markets – from moving at high speed on the track and in trade execution, to in-depth analysis of race performance and trading activity. There is also an important analogy for bank operations and the displacement of Libor.

In anticipation of the new season, the teams will have been working behind the scenes, looking under the bonnet to ensure the cars are optimised and ready for the first race. When it comes to risk-free rate calculations, there is a similarly important problem under the bonnet at financial institutions.

On the surface, the displacement of Libor appears to be going well. Efforts by regulators and industry bodies have seemingly eased the monumental upheaval involved in moving from Libor to new risk-free rates in the sterling and Japanese yen markets. However, with multiple new backward-looking rates, market participants need to take a closer look at how the data feeds into their calculation engines.

The new and varied rates are producing different approaches and processes across regions. Where consensus once existed, we now have divergence. This could create operational headaches as the displacement of Libor continues: the existence of synthetic Libor fallbacks for sterling and yen means the transition still has much further to go, as issuers retain the ability to change the rates underpinning their contracts.
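
To make the fallback mechanics concrete: under the widely adopted ISDA methodology, a contract falling back from Libor typically references the relevant risk-free rate compounded over the period, plus a fixed spread adjustment based on the five-year historical median between the two rates. A minimal sketch of that arithmetic, using illustrative numbers rather than live reference data:

```python
# Illustrative ISDA-style Libor fallback arithmetic.
# Both rate values below are example inputs, not live fixings.
compounded_rfr = 0.0421        # e.g. SONIA compounded in arrears over the period
spread_adjustment = 0.001193   # fixed spread adjustment (the published figure
                               # for 3-month sterling Libor is 11.93bp)
fallback_rate = compounded_rfr + spread_adjustment
print(f"Fallback rate: {fallback_rate:.4%}")   # 4.3293%
```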

The big switch from a single forward-looking rate to multiple backward-looking rates brings numerous complexities. Chief among them is the reluctance of counterparties and issuers to converge on a single rate, leaving market participants to factor multiple rates into the calculations required across the loans, fixed income and rates markets.
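
The computational shift is easy to see in a short sketch. A forward-looking term rate is known at the start of the period from a single fixing; a backward-looking rate must be compounded daily in arrears from a full series of overnight fixings, and is only known at the period's end. The function below is a simplified illustration of the standard compounding-in-arrears convention, with hypothetical fixings; it is not any particular vendor's calculation engine:

```python
from typing import List, Tuple

def compounded_in_arrears(
    fixings: List[Tuple[float, int]],  # (overnight rate, calendar days it applies)
    day_count_basis: int = 365,        # 365 for sterling SONIA; 360 for USD SOFR
) -> float:
    """Annualised rate for the period, compounding each overnight fixing
    over the calendar days it applies (e.g. a Friday fixing counts for 3)."""
    growth, total_days = 1.0, 0
    for rate, days in fixings:
        growth *= 1.0 + rate * days / day_count_basis
        total_days += days
    return (growth - 1.0) * day_count_basis / total_days

# Hypothetical one-week period of overnight fixings (Friday spans the weekend).
week = [(0.0419, 1), (0.0420, 1), (0.0421, 1), (0.0421, 1), (0.0422, 3)]
print(f"Compounded period rate: {compounded_in_arrears(week):.4%}")
```

The practical consequence is a data problem as much as a maths problem: instead of one fixing per contract per period, systems must source, store and link a daily series of fixings per rate, per currency and per convention.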

From a data perspective, the move to new rates exposes old problems. Libor’s role as the dominant rate, used across regions by the majority of issuers, has left computer and data systems largely untouched for years. Why update something that has for so long handled the processes associated with one consistent, ever-present rate? Until recently, there has been no need to change.

As institutions look under the bonnet at the infrastructure behind their rate calculations, it’s clear that the systems in place are outdated and in need of an overhaul. In going from forward-looking to backward-looking rates, it’s as if a high-performance F1 car had switched from petrol to electric power – the current system simply won’t deliver the functionality needed at the highest level. This is especially critical as banks prepare for the big bang next year with the displacement of USD Libor.

Getting the data quickly, to minimise disruption from the pitfalls around issuer contract changes and fallback rates, is crucial. Having a partner – especially one with experience administering an important global rate such as SARON – to provide the relevant reference data and help with data strategy, bringing relevant data out of silos and creating the necessary data linkages, will give banks the cutting edge as they navigate the sharp twists and turns, much like the F1 drivers will this racing season.
