The End of the Batch Process: How Streaming Technology Will Change the World of Risk

01.09.2020

Batch Processing Is Approaching Its Sunset

Batch processing has been deeply embedded in the banking industry for decades. From end-of-day batch runs in core systems (e.g. calculating interest) to sending data downstream for purposes such as feeding the general ledger, risk systems and reporting systems, batch processing is so deeply entrenched that almost all organisations have developed their operating models around it.

For many years this approach served organisations well, both in terms of technology and operational efficiency, despite some of its challenges.

Key Challenges With the Status Quo

1) Information Delay

  • Information is only available on a T+1 basis
  • Timely risk analysis is constrained by system processing times
  • Decisions are based on estimates and assumptions derived from workaround solutions
  • Management overlays are required

2) Cost of Ownership

  • High cost of implementation
  • High cost of maintenance
  • High cost of vendors
  • High cost of change (both time and dollar value)
  • Performance driven by hardware, leading to a high total cost of ownership (TCO)

3) System & Process Complexity

  • Process and operating models are complex due to system constraints
  • Large applications require large IT support teams
  • Workaround solutions (e.g. spreadsheets) have been introduced
  • Duplication of business logic in various applications (e.g. cash flow projection)

“Change is the only constant”

In recent years, however, a new wave of technology has been disrupting the traditional model. Tools and architectures developed by technology giants such as LinkedIn, Google, Netflix and Facebook to process vast amounts of data every day are being applied in the banking world to transform end-of-day and T+1 processes into near real-time processes.

Architectures like these must be built for scale from the ground up, which is where traditional systems start to fall short.

This new wave of banking technology is based on the fundamental principles of:

1) Data streaming technology, i.e. revolutionising how data is passed, processed and written to databases: moving away from batch transfers and centralised database processing towards event-based updates across multiple databases interconnected through orchestration engines (a minimal sketch follows this list)

2) Targeted scalability, i.e. the ability to automatically scale specific microservices up or down based on demand

3) Microservices architecture, i.e. small, manageable units of code that each perform a specific task, as opposed to one complex application. Each microservice is a reusable service within the ecosystem of systems and consuming services

4) Polyglot architecture, i.e. development frameworks that are coding-language and OS agnostic through containerisation

5) Open source adoption, to continuously tap into new innovation and adapt quickly
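
To make the shift in the first principle concrete, here is a minimal, self-contained Python sketch of the difference between batch and event-based processing. The event type, field names and handler are illustrative assumptions, not a real banking API; the point is simply that each record is handled the moment it arrives rather than waiting for an end-of-day file.

```python
# A minimal sketch of the shift from batch to event-based processing.
# AccountEvent, on_event and update_position are illustrative assumptions,
# not a real banking API.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class AccountEvent:
    account_id: str
    event_type: str      # e.g. "loan_drawdown", "repayment"
    amount: float
    occurred_at: datetime

def update_position(event: AccountEvent) -> None:
    """Stand-in for updating ledgers, risk systems and reports."""
    print(f"{event.occurred_at:%H:%M:%S} {event.account_id}: "
          f"{event.event_type} {event.amount:,.2f}")

def process_batch(events: list[AccountEvent]) -> None:
    """Traditional model: wait for the full end-of-day file, then loop."""
    for event in events:
        update_position(event)

def on_event(event: AccountEvent) -> None:
    """Streaming model: invoked once per record, as each event is
    published, so downstream positions are always current."""
    update_position(event)

on_event(AccountEvent("ACC-001", "loan_drawdown", 250_000.0, datetime.now()))
```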

Banks have started to embrace this new breed of technology in their frontline systems and digital services. Most banks these days have mobile apps, trading platforms and internet banking solutions that are either already built on modern technology architecture or planned for transition in the near future, to maintain their competitive edge and reap the obvious business benefits.

However, the middle and back office functions within a bank are yet to reap the benefits of this technology. Many areas within a bank could be transformed by real-time processing of information and an agile technology stack that can quickly adapt to business or regulatory changes.

Some notable areas that are bound to eventually embrace this technology-driven wave of change, moving to real time, are:

  • Transaction processing, e.g. payments (already occurring in Australia via the NPP), loan drawdowns etc.
  • Stress testing, simulations and calculations for managing financial risk
  • Management and regulatory reporting
  • Markets and trading platforms that receive instant updates from core systems on business transactions, e.g. loan drawdowns

Transforming The World Of Risk Management Via Real-Time Information Processing

“In the midst of every crisis, lies great opportunity” – Albert Einstein

Real-time risk management is a pressing topic in the current environment for banks.

During the COVID-19 crisis, banks have felt operational pain because their existing systems and processes were not geared up to meet internal and external stakeholders’ demands for timely insights and management action. A few examples of this are:

  • Recalibrating models to accommodate macro-economic changes
  • Running scenario tests and reviewing results in an unconstrained environment
  • Accommodating regulatory changes such as the repayment deferral schemes
  • Generating additional reports for management and regulators

The root cause has usually been attributed to process ineffectiveness, the use of spreadsheets, legacy and inflexible systems, and heavy reliance on external vendors, all exacerbated by working-from-home arrangements.

However, one area that has received limited focus is the need for speed in the current environment.

What If

  • Model updates could be done instantaneously?
  • Calculations ran in minutes rather than hours?
  • Simulations could be run, with results reviewed and presented to management the same day?
  • All of this could be achieved at significantly reduced cost?

The introduction of data streaming technology, coupled with modern development architecture, will change the world of risk management. The operating model will look very different from how things are executed today.

  • Data from source systems could be fed into risk systems continuously throughout the day, in real time
  • Finance and risk professionals could run calculations during the day and instantly review results
  • Models and assumptions could be updated on demand
  • Several simulations and stress tests could be run within a day, followed by management meetings
  • Calculations such as risk-weighted assets (RWA), expected credit loss (ECL) and interest rate risk in the banking book (IRRBB) that rely on large data sets and complex statistical formulas could be processed within minutes, with numbers instantly ready for review and analysis (a simplified sketch follows this list)
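
As a simplified illustration of that last point, the Python sketch below updates a portfolio-level ECL figure per exposure event using the standard one-period simplification ECL = PD × LGD × EAD. The function names, data structures and numbers are hypothetical; real ECL models add staging, discounting and scenario weighting.

```python
# Hypothetical sketch of per-event risk recalculation. The simplified
# one-period formula ECL = PD * LGD * EAD is standard, but the names,
# data structures and update flow here are illustrative assumptions.
portfolio_ecl = 0.0
exposures: dict[str, float] = {}  # exposure_id -> current ECL contribution

def exposure_ecl(pd: float, lgd: float, ead: float) -> float:
    """Simplified one-period expected credit loss for a single exposure."""
    return pd * lgd * ead

def on_exposure_event(exposure_id: str, pd: float, lgd: float, ead: float) -> None:
    """Streaming model: adjust the portfolio total the moment one exposure
    changes, instead of recomputing every exposure in an overnight batch."""
    global portfolio_ecl
    new = exposure_ecl(pd, lgd, ead)
    portfolio_ecl += new - exposures.get(exposure_id, 0.0)
    exposures[exposure_id] = new

on_exposure_event("LOAN-42", pd=0.02, lgd=0.45, ead=1_000_000)
print(f"Portfolio ECL: {portfolio_ecl:,.2f}")  # Portfolio ECL: 9,000.00
```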

Of course, not everything will change overnight. Adoption will occur at different speeds across systems, with some systems still bound by upstream batch dependencies.

Many of the benefits of adopting streaming technologies apply even when they are implemented at the single-application level. New systems can realise these benefits immediately, and the benefits grow as the overall ecosystem changes and adapts.

The 4 Key Pillars Of This Streaming Tech Revolution

“Performance optimisation is no longer about more CPU and Hardware, it is about building smarter systems”

This new wave of technology change enabling high-performance data streaming is not the result of a single platform. It is the combination of years of innovation in architecture, technology stack and development approach, glued together to deliver an optimal platform for processing data at high speed.

The flow of data from one system to another may conceptually appear the same; what has changed is the engine under the hood.


There are 4 key pillars that are driving this revolution.

1) Microservices Based Architecture

This is an architectural framework built on the foundational principles of modular, independent services that communicate easily with each other and with external consumers to deliver a business solution.

The idea behind this framework is to move away from the large, monolithic application development model, in which applications are developed and supported by large IT teams. That monolithic architecture has created complexity and rigidity in systems.

A microservices-based approach creates the flexibility needed in the current age to adapt quickly. It also organically distributes accountability across IT teams; with smaller teams, throughput increases. The toy service below gives a feel for how small such a unit can be.
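
The following is a deliberately tiny, standard-library-only Python sketch of a microservice: one small service with one job. The exposure-lookup endpoint and its data are hypothetical assumptions; a production service would add authentication, health checks and observability.

```python
# A toy microservice: one small service, one task. The endpoint and data
# are hypothetical; this is a sketch of the style, not a real system.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

EXPOSURES = {"ACC-001": 250_000.0}  # stand-in for the service's own datastore

class ExposureService(BaseHTTPRequestHandler):
    def do_GET(self):
        account_id = self.path.strip("/")  # e.g. GET /ACC-001
        body = json.dumps({"account_id": account_id,
                           "exposure": EXPOSURES.get(account_id)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("localhost", 8080), ExposureService).serve_forever()
```

Because the service owns one task behind a stable interface, it can be rewritten, scaled or redeployed without touching the rest of the ecosystem, which is the flexibility described above.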

2) Containerisation Based Development

Containerisation is a deployment framework that packages services into logical containers for quick and easy deployment to any OS or cloud. It not only removes deployment dependencies for services written in different coding languages, but also enables targeted auto-scaling of services when demand surges.

Rather than the traditional approach of beefing up CPU and hardware for better performance, containerisation makes optimal use of the existing CPU, delivering higher performance than traditional systems.

It also gives a service the much-needed independence to be deployed in any type of operating environment, e.g. cloud, bare metal or a different OS, making a transition to the cloud (or back) much easier than before. A minimal container definition for the kind of service sketched earlier follows.
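
As an illustration only, a minimal Dockerfile for the small Python service sketched above might look like the following; the file name and base image tag are assumptions, not a prescribed setup.

```dockerfile
# Hypothetical container definition for the small Python service above.
FROM python:3.11-slim            # runtime baked into the image
WORKDIR /app
COPY exposure_service.py .       # the single-purpose service, nothing more
EXPOSE 8080                      # the port the service listens on
CMD ["python", "exposure_service.py"]
```

The same image runs unchanged on a laptop, a bare-metal server or any cloud, which is the portability the paragraph above describes.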

3) Data Stream Processing Technology

In essence, data streaming is a conceptual term describing an unbounded flow of data from one service to another, designed to create minimal interruption and minimal dependencies.

The result is that information is delivered (streamed) to users as soon as it is processed. The best everyday examples of this technology are video streaming apps such as Netflix and YouTube, which can play a video while it is still buffering.

The same concept is being applied to processing other forms of data, e.g. structured or unstructured text.

The core shift in this technology, as opposed to traditional approaches, is the use of an event-based messaging model for data consumption, processing and updates.

As a result, the streaming platform takes full control of the utilisation of its services and the sequencing of events, while continuously delivering processed information to users as it is completed.

When this technology is coupled with containerisation and microservices based architecture, the platform delivers exceptional performance while remaining optimal in its utilisation of CPU.

The other key element of this technology is that it enables per-record processing, as opposed to processing batches of records. This can be hugely beneficial in use cases where information on a single record can trigger a critical business process, reducing lag, e.g. credit assessment per customer as opposed to per portfolio. A hedged sketch of this consumption pattern follows.
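
The snippet below sketches what per-record consumption can look like using Apache Kafka via the kafka-python package (one common client; the article does not prescribe one). It assumes a broker on localhost:9092 and a hypothetical loan-events topic; the credit-assessment stub stands in for real logic.

```python
# Hedged sketch of per-record, event-based consumption with Apache Kafka
# via the kafka-python package. Assumes a broker on localhost:9092; the
# topic name and the assessment stub are illustrative assumptions.
import json
from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "loan-events",                            # hypothetical topic name
    bootstrap_servers="localhost:9092",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

def assess_credit(event: dict) -> None:
    """Stand-in for a real per-customer credit assessment."""
    print(f"re-assessing customer {event['customer_id']} after {event['type']}")

# Each record is handled the moment it lands on the topic: the consumer
# reacts per event rather than waiting for an end-of-day batch file.
for message in consumer:
    assess_credit(message.value)
```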

4) Use of Open Source Technologies & Standards

Open source technologies have existed for decades, but never before have companies adopted them at the scale of the current era of software development.

Especially in the development of software using containers and data stream processing platforms, most leading technology companies are building on an open source technology stack. Prominent examples include:

  • Docker – containerisation platform (used by Netflix, PayPal)
  • Kubernetes – container orchestration and load balancing (originally designed by Google; used by Booking.com, Spotify)
  • Apache Kafka – data stream processing platform (originally developed by LinkedIn; used by Uber, LinkedIn)


This article was co-authored by ElysianNxt and RegCentric.

About ElysianNxt

ElysianNxt is a Belgian-Thai FinTech company focused on real-time risk and finance solutions. By using real-time technology, it enables financial firms to respond faster and more cost-effectively to business, economic and regulatory changes. ElysianNxt is headquartered in Bangkok, Thailand, with further offices in Brussels, Belgium and Jakarta, Indonesia.

About RegCentric

RegCentric is a leading Australian consulting company specialising in transformation in data management, finance, risk management and regulatory reporting in the financial services industry. The RegCentric team consists of highly experienced business and technology consultants who are passionate about helping Australian financial services companies leverage technology to drive efficiency, deliver insight and ensure regulatory compliance. For further information, please visit www.regcentric.com.

Download the Whitepaper
