Data Strategies That Dictate Legacy Overhaul Methods for Established Banks
The banking industry was among the earliest adopters of computing, back in the 1970s, so it’s little wonder that it also bears a great deal of technology debt. An estimated 40% of core banking systems, for example, still run COBOL-derived code on mainframes. Apart from a few digital-first startup institutions, the majority of the sector faces major challenges in adapting its technology to be fit for purpose in 2024.
Customers’ expectations of always-on, real-time services and convenience are in stark contrast with the reality of the core systems that serve them, and established institutions are seeing an exodus of customers to providers able to make good on those demands.
The irony is that older banks possess an incredible amount of data accrued over decades, yet are unable to capitalise on that rich seam of information at their disposal. Systems architected 30 or 40 years ago may be proven, reliable and relatively secure, but were never designed to leverage data in the ways now required to guide service design and refinement at the kinds of speeds needed to remain competitive.
Financial institutions therefore face a dual challenge. Core banking systems need to be replaced or phased out, and the data archives locked into mainframes and legacy storage must be released to improve customer experience, drive innovation, and take the fight to the challenger and neobanks that threaten to out-compete longer-established businesses.
Trusted market intelligence organisation IDC recently published papers in conjunction with Thought Machine, a next-generation core banking provider, on the options available to banks reliant on outdated infrastructure. They describe how fourth-generation cloud technologies now offer a choice of migration strategies to replace and/or update core functions.
Non-exclusive options of progressive, greenfield and ‘big bang’ methods are considered in detail, and the papers assert that an amalgam of these strategies is the most likely to succeed in the majority of cases. All, however, are predicated on embracing technologies such as interoperability via APIs, microservice-based architecture, low-code compatibility, platform agnosticism and scalability/elasticity.
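To make the interoperability point concrete, the sketch below shows the shape of a narrowly scoped, API-first service of the kind a microservice-based core might expose. It is a minimal illustration, not drawn from the papers: the endpoint path, the `Balance` type and the in-memory store are all hypothetical stand-ins.

```go
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Balance is a hypothetical response type; a real core banking
// platform would define such schemas in a published API contract.
type Balance struct {
	AccountID string `json:"account_id"`
	Currency  string `json:"currency"`
	Minor     int64  `json:"minor_units"` // amount in minor units, e.g. pence
}

// balances stands in for the service's own datastore; in a
// microservice architecture each service owns its data.
var balances = map[string]Balance{
	"acc-001": {AccountID: "acc-001", Currency: "GBP", Minor: 152340},
}

func getBalance(w http.ResponseWriter, r *http.Request) {
	id := r.URL.Query().Get("account_id")
	b, ok := balances[id]
	if !ok {
		http.Error(w, "account not found", http.StatusNotFound)
		return
	}
	w.Header().Set("Content-Type", "application/json")
	json.NewEncoder(w).Encode(b)
}

func main() {
	// A single, narrowly scoped endpoint: other services and channels
	// integrate through this contract rather than a shared database.
	http.HandleFunc("/v1/balance", getBalance)
	log.Fatal(http.ListenAndServe(":8080", nil))
}
```

The design choice worth noting is that consumers depend only on the published contract, so the implementation behind it can be replaced incrementally, which is precisely what progressive migration strategies rely on.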
Using methodologies like architecture-as-code, self-healing and baked-in scalability, banks can deliver real-time services and produce downstream data for analysis, all on modular architecture that’s secure and resilient. These methods enable fast, iterative development, lowering costs and time-to-market and widening product portfolios (of both customer-facing and business intelligence applications).
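In practice, self-healing usually reduces to each service publishing machine-readable health signals that an orchestrator acts on, restarting unhealthy instances and routing traffic only to ready ones. The sketch below shows what such liveness and readiness endpoints might look like; the paths and the stubbed dependency check are assumptions for illustration, not part of any particular platform.

```go
package main

import (
	"log"
	"net/http"
	"sync/atomic"
)

// ready flags whether downstream dependencies (database, message
// broker) are reachable; here it is toggled by a stubbed check.
var ready atomic.Bool

func livenessHandler(w http.ResponseWriter, r *http.Request) {
	// Liveness: the process is up. An orchestrator restarts the
	// instance if this endpoint stops answering.
	w.WriteHeader(http.StatusOK)
}

func readinessHandler(w http.ResponseWriter, r *http.Request) {
	// Readiness: traffic is routed here only once dependencies are healthy.
	if !ready.Load() {
		http.Error(w, "dependencies unavailable", http.StatusServiceUnavailable)
		return
	}
	w.WriteHeader(http.StatusOK)
}

func main() {
	ready.Store(true) // stub: a real service would probe its dependencies
	http.HandleFunc("/healthz", livenessHandler)
	http.HandleFunc("/readyz", readinessHandler)
	log.Fatal(http.ListenAndServe(":8081", nil))
}
```

Because the remediation (restart, reschedule, scale out) lives in the platform rather than the application, the same pattern gives baked-in scalability for free: the orchestrator adds instances behind the same readiness gate.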
Once banks engage in revising their core systems, they can begin to develop an ‘enterprise intelligence’ approach: one that unlocks their data assets to inform long-term strategy with empirical evidence, unencumbered by the deep-seated limitations of ageing code. Running on core technology built for interoperability and modularity keeps the future open, allowing emerging technologies (machine learning is often cited in this context) to be implemented and creating the possibility of unique, differentiating products.
An enterprise-wide data strategy unlocks a strategic treasure-trove of information, helping banks surface, via advanced analytics, the insights needed to drive innovation, enhance customer experiences and improve operational efficiency. The myriad advantages of enterprise intelligence (EI) are explored further, and possible roadmaps outlined, in ‘Unlocking Enterprise Intelligence in Banking Systems’, which is available here.
There are clearly challenges on the route to modernisation, not least maintaining reliable, uninterrupted services as work proceeds. The two papers describe the pros and cons of common migration strategies and how institutions might best pick their own course.
It’s important to stress, however, that although there are myriad options, there are a number of ‘givens’, paramount among them the use of cloud-native methodologies and techniques, even when considering a waterfall, ‘big bang’ strategy. It’s also worth noting that cloud-native methods are not solely implemented on cloud providers’ resources. Like organisations in other industries subject to strict governance around data security and compliance, many financial institutions adopt a hybrid topology, often to keep critically sensitive data separate (air-gapped) or to prevent vendor lock-in, among other reasons. Cloud-native technologies allow for this and ensure that, during and after migration, data governance is strictly observed.
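One way such a hybrid topology is enforced in code is to put the governance rule behind a single abstraction, so that where a record lands follows its classification rather than ad-hoc decisions. The sketch below assumes a hypothetical three-tier sensitivity scheme and stubbed storage backends; a real institution’s policy and stores would differ.

```go
package main

import (
	"errors"
	"fmt"
)

// Sensitivity is a hypothetical classification scheme; real schemes
// follow the institution's data governance policy.
type Sensitivity int

const (
	Public Sensitivity = iota
	Internal
	Restricted // must never leave on-premises (air-gapped) storage
)

// Store abstracts a storage backend so topology can change without
// touching business logic, one way cloud-native design avoids lock-in.
type Store interface {
	Put(key string, value []byte) error
}

type onPremStore struct{}

func (onPremStore) Put(key string, value []byte) error {
	fmt.Printf("on-prem: stored %q (%d bytes)\n", key, len(value))
	return nil
}

type cloudStore struct{}

func (cloudStore) Put(key string, value []byte) error {
	fmt.Printf("cloud: stored %q (%d bytes)\n", key, len(value))
	return nil
}

// route enforces the governance rule at a single choke point.
func route(s Sensitivity, onPrem, cloud Store) (Store, error) {
	switch s {
	case Restricted:
		return onPrem, nil
	case Public, Internal:
		return cloud, nil
	}
	return nil, errors.New("unknown classification")
}

func main() {
	onPrem, cloud := onPremStore{}, cloudStore{}
	records := []struct {
		key  string
		s    Sensitivity
		data []byte
	}{
		{"marketing/rates", Public, []byte("...")},
		{"kyc/acc-001", Restricted, []byte("...")},
	}
	for _, rec := range records {
		store, err := route(rec.s, onPrem, cloud)
		if err != nil {
			panic(err)
		}
		store.Put(rec.key, rec.data)
	}
}
```

Because the `Store` interface hides the backend, swapping cloud regions, changing providers or moving a class of data back on-premises becomes a routing change rather than a rewrite, which is what keeps topology decisions reversible.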
The agility and scalability of cloud-native architecture confer on platform architects the ability to distribute core and ancillary systems however required, regardless of hosting platform, remote or on-premises. With the right approach, a bank’s options remain open: it can react quickly to changes in governance, as well as to market conditions, altering its topology at will.
Finally, it’s worth circling back to the reasons for overhauling core infrastructure. Replacing old technology with new should not be an empty gesture driven by a perceived need to progress. The end goals should remain clear in the minds of decision-makers: to enable practical use of existing and future data resources, and to create a technology foundation that is adaptable, secure, compliant and agile. On that basis, banks can innovate faster and at lower cost, and compete with new-generation financial organisations that operate with little technical debt.
Startup banks and neobanks may come to the table with fewer encumbrances, but they lack the powerful body of historic data spanning multiple products that older institutions have. And it’s data, at the end of the day, that is the one resource that truly empowers.
Read the reports, ‘Driving Innovation Through Cloud-native Core Banking Platforms’ and ‘Unlocking Enterprise Intelligence in Banking Systems’ for fuller discussion of the issues covered here. Discover the cloud-native core banking and data services portfolio from Thought Machine by heading here.