Data volumes have surged amid heightened market volatility, and many front- and back-office systems have struggled to keep up. KX explores how financial institutions can optimize their data management and analytics platforms to cope with the ever-changing investment landscape

In 2019, a common complaint among FX traders was: ‘Where did all the market volatility go?’

In response to this grumbling came the Covid-19 pandemic, which disrupted supply chains and forced many central banks to back off on monetary policy tightening. Volatility is back with a vengeance and shows no sign of abating yet.

This was good news for FX traders, who were able to profit from the wild currency swings. It was less welcome for other market participants, such as corporates, which were forced to review their currency hedges to ensure they were adequately protected.

In the midst of this market chaos, many dealers have done well, profiting from violent market moves or meeting increased client demand for protection against them. According to data from Greenwich Associates, global revenues from US dollar options trading jumped more than 50% between 2019 and 2020 amid the market volatility, from $3.4 billion to $5.3 billion. Revenues in 2021, at $4.6 billion, remained elevated.

However, while the return of volatility to FX markets may be welcome news for some, it also means a much higher sensitivity to risk.

The consequences of putting on an FX trade at the wrong time can be far more serious in volatile market conditions. Dealers also need to be more careful about their hedging strategies. This was highlighted earlier this year, when sharp moves in the Japanese yen against the US dollar left FX options traders nursing their wounds.

As a result, it is increasingly important for dealers to ensure that the information on which they base trading decisions is as accurate and dynamic as possible.

Harry Darrell-Brown, KX

“Traders are increasingly aware that, in order to be competitive, they must respond to changing market dynamics,” says Harry Darrell-Brown, product manager for financial solutions at KX, which provides time-series database technology to financial institutions worldwide. “If they’re not tech-savvy and are out of step with the times, they can quickly be replaced.”

KX’s kdb+ data platform enables companies to store, analyze, process and retrieve large data sets at speed. It has proven particularly popular with high-frequency traders, but has also been deployed in other time-sensitive data applications, such as energy markets, telecommunications, monitoring of manufacturing equipment and even Formula 1 racing.
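kdb+ is queried in its own language, q, and a signature operation in tick analytics is the as-of join, which matches each trade to the most recent quote. As a rough illustration of the idea only – not KX’s implementation – here is a pandas sketch over invented tick data:

```python
import pandas as pd

# Hypothetical tick data for illustration. A tick database's workhorse
# query is the as-of join: enrich each trade with the prevailing quote.
quotes = pd.DataFrame({
    "time": pd.to_datetime(["09:00:00", "09:00:02", "09:00:05"]),
    "sym": ["EURUSD"] * 3,
    "bid": [1.0710, 1.0712, 1.0709],
    "ask": [1.0712, 1.0714, 1.0711],
})
trades = pd.DataFrame({
    "time": pd.to_datetime(["09:00:03", "09:00:06"]),
    "sym": ["EURUSD"] * 2,
    "price": [1.0713, 1.0710],
})

# Match each trade to the latest quote at or before its timestamp,
# per symbol (similar in spirit to q's built-in aj join).
enriched = pd.merge_asof(trades, quotes, on="time", by="sym")
enriched["mid"] = (enriched["bid"] + enriched["ask"]) / 2
print(enriched[["time", "sym", "price", "mid"]])
```

In kdb+ itself this would be a one-line `aj` call over in-memory tables; the point is simply that time-aligned joins like this are the staple query of a tick database.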

The rule of three

Data collection and data analysis have long been the responsibility of back-office technology departments, but this has often left the wider business with a poor understanding of the captured information.

Steve Wilcockson, KX

Steve Wilcockson, product marketing manager at KX, says proactively managing volatile market data comes down to what he calls the rule of three.

“It’s about having the right analytics for the right group at the right time,” he says.

It’s also about being able to effectively adapt data needs to meet the demands of different users.

“You have the quants who design the system; the data engineers who build it; and the traders who use it. Each has a strong interest in data analytics, but each from a slightly different perspective. The platform needs to be able to cater to all of these interests,” says Alex Weinrich, FX and crypto solutions specialist at KX.

Collecting data on a single platform and providing a user-friendly interface for performing analytics enables the entire business to gain valuable insight into the day-to-day business operations.

“Capturing the fluidity between these three different roles is absolutely the right thing to do,” says Wilcockson. “Every participant in the group must be able to work together to build, analyze and understand the business life cycle.”

This enterprise-wide buy-in is critical to the successful deployment of a data analytics tool. “Everyone in the business – from salespeople to senior management – needs to realize the competitive advantage this technology gives them,” says Darrell-Brown. “If they don’t adopt this new way of working, but their competitors do, then they will quickly be outclassed and likely replaced. Nobody wants that to happen to them.”

Data optimization

Processing and managing large volumes of market data can be slow and resource-intensive. This can lead to increased costs, reduced agility and, ultimately, uncompetitive pricing. While any single data set can contain important insights on which business decisions can be made, the real power is unlocked once that data is brought under one roof.

Unfortunately, in many organizations, data still exists in a number of different formats in different places. This is usually a legacy from the way the data was first captured.

Some companies have tried to work around the problem of scattered data sets by sampling the data or creating cached calculations before performing data analysis. However, this is far from optimal and undermines the ability of businesses to have real-time data at their fingertips.

Alex Weinrich, KX

Analytics need to be performed where the data sits, Weinrich says.

“In many cases, companies move data to another location before they can perform analysis on it – which introduces latency,” he says.

Seamlessly combining historical data with real-time data can also yield more valuable insights.

“All major banks and financial institutions are using predictive machine learning today. This means incorporating historical client behavior into data algorithms and adjusting models based on how a particular group of clients or liquidity providers is likely to behave after a market shock,” says Weinrich.
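Weinrich does not describe a specific model, but the general idea of blending a historical prior on client behavior with live observations can be sketched with a toy exponentially weighted estimate. All names, numbers and the spread rule below are invented for illustration, not KX’s method:

```python
class ClientBehaviorModel:
    """Toy sketch: track how likely a client segment is to deal on a
    quote, blending a historical prior with a live stream of fill
    outcomes via an exponentially weighted update."""

    def __init__(self, historical_fill_ratio: float, alpha: float = 0.1):
        self.fill_ratio = historical_fill_ratio  # prior from historical data
        self.alpha = alpha                       # weight on each new observation

    def update(self, filled: bool) -> None:
        # Fold the live observation into the running estimate.
        x = 1.0 if filled else 0.0
        self.fill_ratio += self.alpha * (x - self.fill_ratio)

    def spread_multiplier(self) -> float:
        # Invented rule: widen quoted spreads as fill likelihood drops.
        return 1.0 + (1.0 - self.fill_ratio)

model = ClientBehaviorModel(historical_fill_ratio=0.8)
for outcome in [False, False, False]:  # market shock: clients stop dealing
    model.update(outcome)
print(model.fill_ratio, model.spread_multiplier())
```

After three missed fills the estimated fill ratio decays from the historical 0.8 toward zero and the quoted spread widens accordingly; a production model would be far richer, but the historical-prior-plus-live-update shape is the same.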

Combining different data sets in this way allows companies to use the same code during development as they will use when the system is live.

“If the code or the data you test with is different from what runs in the real world, how do you know what you’re testing is valid? So it has to be the same,” says Darrell-Brown.

For any data platform to be integrated successfully, pre-trade and post-trade analytics that respond to market changes in real time are essential.

“With FX markets in such a state of flux, traders must be able to hedge their portfolios and make sure they have the right FX hedges in place,” says Wilcockson. “To do this, they need the ability to quickly recalibrate models to account for sudden shocks in the markets.”

This means being able to generate statistics in real time, at the point of trade – not just at the end of the day. Key to this are transaction cost analysis, liquidity and venue analysis, and execution analysis.
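As one concrete, simplified example of the first of these, transaction cost analysis often measures slippage per fill against the arrival mid, in basis points. The prices below are invented:

```python
# Minimal transaction-cost-analysis sketch: slippage of each fill
# versus the arrival mid-price, expressed in basis points.

def slippage_bps(side: str, fill_price: float, arrival_mid: float) -> float:
    # Positive = cost to the trader: buys above mid, sells below mid.
    sign = 1.0 if side == "buy" else -1.0
    return sign * (fill_price - arrival_mid) / arrival_mid * 1e4

fills = [
    ("buy", 1.07135, 1.07125),   # bought 1 pip above arrival mid
    ("sell", 1.07110, 1.07125),  # sold 1.5 pips below arrival mid
]
for side, px, mid in fills:
    print(side, round(slippage_bps(side, px, mid), 2))
```

Computed continuously across fills, venues and liquidity providers, even a metric this simple feeds directly into the venue and execution analysis the article mentions.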

“It’s not just about having the right tools – it’s about being able to use them effectively,” says Wilcockson.

With no sign of market volatility abating anytime soon, it is also about being able to future-proof data storage and management.

Richard Kiel, KX

“Optimized cloud storage, combined with exceptional compression capabilities, can provide a highly cost-effective alternative to on-premises data storage,” says Richard Kiel, global head of FX solutions at KX.

An important part of using the cloud is connectivity. Not only do businesses need to be able to access data as quickly as possible, but they also need to be able to do so in a convenient way.

“People need to be able to easily integrate the cloud platform into their workflows, using the languages and functions they already know,” says Wilcockson.

Typical tools that data scientists and engineers already use include Python, Jupyter notebooks and SQL – and any new cloud platform must be able to interact seamlessly with them.
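As a generic illustration of that workflow – using the Python standard library’s sqlite3 module rather than any KX interface – a data scientist might express an aggregation in plain SQL from a notebook:

```python
import sqlite3

# Illustrative only: a kdb+ deployment would use its own Python and SQL
# interfaces. The point is the pattern -- familiar tools (Python + SQL)
# querying a shared data store. All trade data below is invented.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (sym TEXT, price REAL, qty INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("EURUSD", 1.0712, 5), ("EURUSD", 1.0714, 3), ("GBPUSD", 1.2501, 2)],
)

# Volume-weighted average price per symbol, in plain SQL.
rows = conn.execute(
    "SELECT sym, SUM(price * qty) / SUM(qty) AS vwap "
    "FROM trades GROUP BY sym ORDER BY sym"
).fetchall()
print(rows)
```

Because the query is ordinary SQL driven from ordinary Python, it runs unchanged in a Jupyter notebook – which is exactly the kind of accessibility Wilcockson describes.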

“This democratizes who can access the platform immediately and continue to use it—not just today or tomorrow, but next year and beyond,” says Wilcockson.

Outside of finance

While KX’s cloud-enabled platform has proven an invaluable addition to the trading teams of many major banks and fund managers, its ability to efficiently process very large volumes of data means its applications extend beyond the financial world.

“The deep insight you need for e-commerce is the same you need to improve efficiency and automate processes across industries,” says Darrell-Brown. “Again, it’s about having complete, accurate and timely data that can be easily extracted in real time.”

One of the most interesting examples of the use of the platform outside of finance is the world of Formula 1 motor racing.

“If you’re in a racing situation and your competitor comes in for a pit stop, you need to be able to assess whether to adjust your tactics to compensate,” says Wilcockson. “Real-time data combined with predictive algorithms can provide this insight.”

Another successful use case for the technology has been in the supply and delivery of commodities such as natural gas, combining timely data with predictive analytics to identify current trends in supply and demand.

“The thing is, our real-time analytics platform doesn’t have to be limited to finance,” says Weinrich. “Whether it’s business analytics, predictive health, predictive maintenance or predictive motorsports, the ‘power of three’ still applies: the right analytics for the right group at the right time, tailored to that winning combination of modeller, data engineer and system user.”
