Although this article is intended for a general reader, some parts are aimed at a more technical audience.
When technology first made its way into banking, the mainframe played a central role in an industry that still relied on its branch presence for traffic. Updates to the ledger took place at the end of the day, around 5 pm when the branches closed. End-of-day positions were calculated, then collected and placed in a centralized ‘general ledger’ that acted as the single repository for accounting data. These systems suffered from slow and unreliable server communication, as this was before the age of the internet.
As we moved into the 1980s, consumer banking became less branch-centric; the next generation of systems extended to support emerging channels such as ATMs and call centers. As consumer demand changed and 24/7 banking became the norm, banks built systems that authorized payments using basic stand-in logic while the core was offline, then passed the changes on to the core at the end of the day.
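The stand-in pattern described above can be sketched in a few lines. Everything here — the class name, the simple balance-check rule, the batch hand-off — is an illustrative assumption, not any specific bank's implementation.

```python
# Illustrative sketch of the "stand-in" pattern: a lightweight service
# authorizes payments against the last known balances while the core is
# offline, then replays the queued transactions to the core in an
# end-of-day batch. All names and rules here are hypothetical.

class StandInAuthorizer:
    def __init__(self, last_known_balances):
        # Balances as of the core's last end-of-day run.
        self.balances = dict(last_known_balances)
        self.pending = []  # transactions to post to the core later

    def authorize(self, account, amount):
        # Basic stand-in logic: approve only if the last known
        # balance covers the amount.
        if self.balances.get(account, 0) >= amount:
            self.balances[account] -= amount
            self.pending.append((account, -amount))
            return True
        return False

    def end_of_day_batch(self):
        # Hand the queued transactions to the core and reset the queue.
        batch, self.pending = self.pending, []
        return batch

auth = StandInAuthorizer({"acc-1": 100})
print(auth.authorize("acc-1", 60))   # True
print(auth.authorize("acc-1", 60))   # False: only 40 left in the stand-in view
print(auth.end_of_day_batch())       # [('acc-1', -60)]
```

The trade-off this pattern makes is visible even in the sketch: the stand-in view can drift from reality between batches, which is exactly why later generations moved to real-time cores.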
Eventually, a third generation of core banking systems emerged, offering parameterizable product engines that made changes to fees, product portfolios, or card programs cheaper and less risky. However, since these systems were often tightly coupled to their respective UIs, it could be challenging to compose this newfound flexibility in novel ways. They still chiefly use batch-based processing and remain monolithic in style. And since they no longer run on the expensive yet effective mainframe, they can actually be less resilient and perform worse than their predecessors.
In the first decade of the 21st century, facing increasing pressure from the market to ‘move to the cloud,’ banks and their vendors often employed a lift-and-shift strategy. This saw banks containerize their monolithic app server and deploy it on the cloud. Although this technically qualifies as a cloud strategy, it is not an effective one, because the actual benefits, notably elastic scalability, are not accessible. In essence, banks are dealing with the same monolithic server, but now in a more expensive data center.
Sealing the technology gap with tape and hope
Moving to the cloud with multiple generations of core banking platforms proved tricky. Banks typically opted for one of three equally problematic strategies: shimming the mainframe, hollowing out the core, or just cosmetically moving app servers to the cloud.
This approach does have benefits: initial capital expenses are eliminated, much of the infrastructure cost is converted into operational expenses, and in many cases you only pay for what you use. Moving or expanding infrastructure to new regions, or creating redundancies, is significantly easier than buying endless new server blades and hardware.
This shift creates challenges of its own. You are moving software designed for a completely different, self-hosted environment into the cloud. It is like putting an engine from a car made in the 80s into a brand new car. It will most likely work if you try hard enough, but the car is severely limited, because old engines have no sensors to check for problems, optimize fuel consumption, or, say, work in conjunction with a hybrid battery.
There is, however, another way: a cloud-native core banking engine!
Following the previous analogy, with a cloud-native core you are putting a modern engine into a modern car. You can use all the bells and whistles of the cloud, even going completely serverless and using only cloud functions. You are firmly in the 21st century, able to adopt new features as they appear.
On top of that, an emerging term banks use to describe their transformation needs is the “headless core”: a core no longer bound to a specific green-screen terminal window or custom app. It is API-only and stream-based, designed for the 24/7/365 world of the 21st century.
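To make “API-only and stream-based” concrete: any number of channels (a mobile app, a web front end, a partner API) subscribe to the core's event stream and build their own views, rather than being baked into the core. The event shape and field names below are assumptions for illustration, not any vendor's actual API.

```python
# Hypothetical sketch: a headless core emits events on a stream;
# each channel consumes them independently and maintains its own
# local view. Event fields here are illustrative.
import json

def handle_core_event(raw_event, balances):
    """Update a channel-local balance view from one core event."""
    event = json.loads(raw_event)
    if event["type"] == "posting.created":
        acct = event["account_id"]
        balances[acct] = balances.get(acct, 0) + event["amount"]
    return balances

# Simulate a short stream of events from the core.
stream = [
    '{"type": "posting.created", "account_id": "acc-1", "amount": 100}',
    '{"type": "posting.created", "account_id": "acc-1", "amount": -30}',
]
view = {}
for raw in stream:
    view = handle_core_event(raw, view)
print(view)  # {'acc-1': 70}
```

Because the core itself has no UI, adding a new channel means adding a new consumer like this one, with no change to the core.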
| | Mainframe monolith | Hybrid monolith | Cloud-native |
| --- | --- | --- | --- |
| Emergence | 1970–1990s | Late 1990s | Late 2010s |
| Messaging | Batch | Batch + Events | Stream |
| Connectivity | Server/Client + RPC | API + File transfer | API only |
| Portability | Mainframe | App server | Cloud / Agnostic |
| Availability | Single data center | Multi-zonal | Multi-regional |
The core banking engine Vault was purpose-built as cloud-native by Thought Machine from its inception. Vault is built around APIs using a microservice architecture, and the services within Vault constitute a significant portion of all the functionality required to run a bank. For example, they currently power Mox, the virtual bank behind Asia’s first all-in-one numberless bank card.
Vault’s Configuration Layer enables a bank to achieve a wide range of customization without changing anything in the underlying platform. This is highly advantageous and a key part of how Vault’s architecture acts as a counterweight to the “spaghetti” that arises in other systems when customization and platform functionality are not separated.
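The separation works roughly like this: the platform ships a generic product engine, and the bank supplies parameters. The sketch below is an analogy for that idea only; the function and parameter names are invented for illustration and are not Vault's actual configuration format.

```python
# Illustrative sketch of a parameterizable product engine: the engine
# code is generic, and each product is just a bundle of parameters.
# Launching or changing a product means changing configuration, not
# platform code. All names here are hypothetical.

def monthly_fee(product_config, average_balance):
    """Generic fee logic driven entirely by parameters."""
    if average_balance >= product_config["fee_waiver_threshold"]:
        return 0
    return product_config["monthly_fee"]

# Two "products" that differ only in configuration.
basic_account = {"monthly_fee": 5, "fee_waiver_threshold": 1_000}
premium_account = {"monthly_fee": 15, "fee_waiver_threshold": 10_000}

print(monthly_fee(basic_account, 500))       # 5
print(monthly_fee(premium_account, 20_000))  # 0
```

Because the fee rule never mentions a specific product, changing a fee or launching a new account type touches only the configuration dictionaries, which is the property that makes such changes cheaper and less risky.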
Vault is a “headless core,” meaning it is agnostic to the apps and interfaces it provides access to. Whether you are launching a new challenger bank, building a new banking product, or completely re-platforming your existing bank stack, you will need a guide by your side to help you build out all the interfaces and applications needed to achieve the experience you want for your customers.
Vacuumlabs specializes in building digital products and tools for fintechs and banks. We have done this for challengers like Twisto, Cledara, and Railsbank, and have worked with Thought Machine to integrate Vault with various third-party providers into one cohesive app experience on some of their major client successes:
“Vacuumlabs is providing us with experienced software engineers who are helping us build the bank from the ground up.” — CEO of Mox Bank
At Vacuumlabs, we have teamed up with Thought Machine to put together a guide on building a future-proof, cloud-native digital bank with Thought Machine at its core and a world-class customer experience built with help from Vacuumlabs.