
Why banks migrate from SOA to event-driven architectures

Legacy systems are often based on monolithic structures

In the past, financial service providers built their IT infrastructure around legacy systems and frameworks designed to handle specific business functions. These systems were often monolithic and centralized, organized in product-aligned silos, and served as the backbone of operations. The Enterprise Service Bus (ESB) was the go-to architecture for integrating diverse business applications and services across an organization: it acted as a centralized hub, enabling communication and data exchange between different systems, applications, and databases. Software architects mostly relied on mainframes and proprietary, closed-source products such as IBM Integration Bus (IIB), Oracle Service Bus (OSB), or TIBCO ActiveMatrix BusinessWorks to implement the ESB. These tools were built to process large transaction volumes while ensuring system stability and reliability, qualities that are existential for any company and for the banking industry in particular.

Main issues and limitations with ESB-oriented architectures

A steadily growing customer base, transaction volume, and service portfolio created new departments and services that needed to operate and evolve independently of one another. Today's rapidly changing environment also demands the flexibility to communicate with and integrate many different applications and data sources, including third-party public cloud services such as AWS (Amazon Web Services). The idea of smart endpoints and dumb pipes became widely adopted because it makes it easy to decouple, modify, upgrade, or add applications, which is not a strong point of a service-oriented architecture (SOA). Nor is SOA well suited to analyzing and processing large data volumes in real time, or to data persistence and data replay. With growing workloads on expensive mainframe infrastructure and unacceptable delays caused by batch processing, ESB architectures have become impractical for real-time use cases such as instant payments and fraud detection. In addition, proprietary ESBs carry the risk of rising costs over time due to vendor lock-in.

Transforming banking with real-time data mastery

While ESB architectures will continue to have useful applications, there is an ongoing shift towards data-driven applications in which data streams must be processed and reacted to in real time. Event-driven architecture (EDA) is better suited to these requirements: it fosters looser coupling and greater flexibility between services, even with legacy and external partner systems. EDA lets businesses change, evolve, and scale with ease by applying changes where they occur, without touching monolithic code, waiting for a centralized implementation, or jeopardizing running systems. It is no coincidence that Apache Kafka, an open-source, distributed event-streaming platform, is rising in popularity and becoming the industry standard for processing live data. Alongside Swiss companies such as Generali and Ricardo*, financial service providers use Apache Kafka for instant payments, fraud detection, and real-time analysis of customer behavior, as sketched below. Talk to us and find out how we can help you master the digital transition.

*Source: https://www.confluent.io/customers/
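
To illustrate the decoupling that EDA brings, here is a minimal sketch of a payment event being published to, and consumed from, Apache Kafka with the official Java client. The broker address (localhost:9092), the topic name (payments), the consumer group (fraud-detection), and the event payload are assumptions chosen for this example, not details of any specific customer setup.

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class PaymentEventsSketch {

    // Publishes a payment as an immutable event; the producer does not know
    // or care which services consume it.
    static void publishPayment() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Hypothetical payload; keying by account ID keeps events for one
            // account ordered within a partition.
            String event = "{\"accountId\":\"ACC-42\",\"amount\":150.00,\"currency\":\"CHF\"}";
            producer.send(new ProducerRecord<>("payments", "ACC-42", event));
            producer.flush();
        }
    }

    // A fraud-detection service subscribes independently; other consumers
    // (analytics, notifications) can read the same stream without touching
    // the producer or each other.
    static void consumePayments() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("group.id", "fraud-detection");
        props.put("auto.offset.reset", "earliest"); // replay from the start of retained history
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("payments"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
            for (ConsumerRecord<String, String> record : records) {
                System.out.printf("Checking payment for account %s: %s%n",
                        record.key(), record.value());
            }
        }
    }

    public static void main(String[] args) {
        publishPayment();
        consumePayments();
    }
}

Because producer and consumer share nothing but the topic, either side can be modified, replaced, or scaled on its own, and the persisted event log allows new consumers to join later and replay history, which is exactly what a classic ESB pipeline struggles to offer.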
