Balancing cost optimisation with delivering value-added solutions to clients is one of the biggest challenges facing the global financial services industry. In recent years, both buy-side and sell-side firms have endured painstaking, manual and resource-heavy workflows to put the data clients need at their fingertips.
The reality is that, while the mainframe has served the financial services industry for over half a century, the time has come for newer technologies to finally displace this perennial workhorse. As markets and client needs have evolved, mainframe infrastructure has quickly become outdated, leaving a gap between what the business needs and what technology teams can offer.
In its wake, it has left a legacy of duplicated systems and interfaces, fragmented best-of-breed solutions and intensive workarounds, which are holding firms back as they try to respond to a world in which the integration of ESG and private assets datasets has become the norm. In our recently commissioned industry report, 25% of buy-side and sell-side respondents rated their trust in their data as less than good, and 33% said the same of its timeliness. With the added threat of broader regulations, net zero targets and continued fee scrutiny, financial services firms now find themselves in an unsustainable situation.
Making multi-asset investment, market and reference data, and the processes around it, low-touch, translatable and scalable is becoming an urgent need. From our global market engagement it is clear that firms must replace the sticking plasters, manual re-mapping of data and spaghetti workflows with a trusted data foundation that can secure valuable insights, boost performance and add value to clients.
The question is: if traditional change processes are no longer fit for purpose, how can firms successfully approach operational change today?
Common barriers to operational change
To date, financial services firms have had limited choices, from best-of-breed solutions to outsourcing and costly, front-to-back, multi-year transformations. Rather than delivering long-term value, this cycle has led to severe infrastructure complexity and, often, a single-vendor dependency that has inhibited competitive growth and innovation.
With investment data being a central business asset, owning and controlling organisational data is the first and most vital step to unlocking value from the data you hold today. It was therefore interesting to hear from firms that, while the perception of cost versus value was a common barrier to operational change, the barriers they most often come up against are, surprisingly, higher priorities elsewhere (56%) and the lack of a business sponsor (44%).
From this, we can infer that these firms have not yet identified that data sits at the root of business-wide challenges. And yet we know that firms across the investment chain are spending far too much time, in some cases 50% or more of their day, on manual reconciliation, locating data at source, or attempting to work out positions and exposure from siloed data.
Additionally, what financial services firms don’t always recognise, especially those reliant on a single platform, is that the data models mandated by their vendor of choice often compound the data issues they face. As a result, poor-quality data and inefficient processes cause impacts across a number of workflows, from performance, attribution and exposure calculation to accounting and reporting.
This is also an urgent concern for ESG reporting, where there is still an overwhelming lack of consensus despite attempts to standardise processes, such as the European ESG Template and the recommendations of the Task Force on Climate-related Financial Disclosures, leaving teams with the arduous, manual task of collecting data for regulatory reporting.
Leveraging clean, interoperable data is a clear business priority. To address data-led change and make the transition from cost centre to value generator, operations leaders and CDOs must start asking uncomfortable questions about the costs they bear and the capabilities they are missing, not at the operational level but at the business level.
Questions like: ‘What is the opportunity cost being missed today?’ or ‘Are the data and operations good enough to survive today and deliver resiliency tomorrow?’
Achieving a future-fit, SaaS-native foundation
Tackling the fundamental problem of understanding, accessing and controlling data is something the industry needs now, not in five years’ time. This starts with acknowledging the industry’s appetite for risk: in a recent industry workshop, when asked about their risk appetite for change, only 12% of buy-side and sell-side firms cited a high risk appetite, with the majority leaning to low or medium.
De-risking the adoption of innovative products through Software-as-a-Service (SaaS) investment data management technology empowers firms to focus on the core of their business rather than the distraction of a multi-year transformation. Ultimately, it enables firms to start small with mission-critical areas, realise value quickly and build the business case for further change. It’s operational change on your terms, and it’s what we believe will drive the process of improving efficiency and transparency in financial services.
The time for monolithic data stores and mainframe inefficiency is drawing to a close, and the industry needs to embrace the host of SaaS capabilities available today. A trusted data fabric reduces rather than adds to operational complexity by opening up existing ‘closed’ systems so firms can truly understand and derive value from their data. At the same time, it tackles the challenge of rigid data models, offering the means to migrate flexibly between them so firms can translate and interpret data as they need.
Achieving incremental operational change like this will help the financial services industry finally break through the traditional cycle and the single-vendor dependency that have held innovation back for so long. Our Modern Financial Data Stack is vital to this endeavour, delivering trustworthy data at the core while helping firms safely adapt and evolve towards their future state. Regaining the flexibility and control to innovate on your own terms, and freeing your talent from burnout, is what we believe will drive businesses forward and generate the most value for clients. In the end, it empowers firms to do business the way they want, while winning back productivity and operating margins.
You can read Stephen's article and further industry insights in the latest edition of The Financial Technologist. Download your free copy here.