Data Alchemy for Banks: IReF – The RegTech Initiative for Large-Scale Data Integration and Reporting


The Integrated Reporting Framework[1] (IReF) is a golden opportunity to increase the quality and efficiency of existing reporting processes and, less prominently, of analytics. While the strategy of this data ecosystem has been discussed and is slowly but surely moving forward among traditional incumbents, much less has been said about getting the technical requirements right. Such a large-scale data integration and reporting framework cannot excel without the right architecture and the use of digital disruptors.

The European System of Central Banks[2] (ESCB) identifies two key functionalities of IReF[3] that must be addressed. The first is the collection of granular data provided by the reporting agents, i.e., intermediaries; the second is performing harmonised transformations to ensure data quality. There is also a third, less visible to the casual eye: the generation of the reporting data itself.

Going through the articles on SHS and AnaCredit, it is worth noting that this large-scale data integrator will be based solely on structured data, provided in fixed sizes ranging from a few kilobytes to a few gigabytes. We foresee that each reporting agent will deliver monthly data in the order of terabytes. A complementary point to consider is that the volume of data can grow exponentially during specific timeframes, for example during month-end or quarter-end close. These timeframes are typically shared by all reporting agents, causing an "all hell breaks loose" scenario in which a tsunami of data hits the platform within a few minutes.

IReF cannot deliver on its promise without addressing these critical elements effectively, all of which rely on the selection of the right data architecture and, subsequently, the set-up of the infrastructure.

Although there are numerous aspects that require attention from ESCB when setting up this data ecosystem, we will only touch upon a select few:

Data Delivery Model

What arrangements should be made between intermediaries and the ESCB to facilitate a model of data delivery over a heterogeneous technology landscape? The ESCB should seek confirmation from the intermediaries on the following assumptions: intermediaries keep their data in their own standard relational databases or data stores built on legacy technology; intermediaries have a dedicated team that addresses all data engineering, security, and analytics requirements. Access to non-local data, provided on multiple different technology stacks, can follow a data-product model: intermediaries offer data as products and IReF consumes them. This means that an Application Programming Interface (API) layer will (1) serve as the basis for building reusable patterns (pull vs. push) for IReF to access intermediaries' data downstream and (2) piece together all the data technologies. The ESCB should provide a central API catalog in which all APIs are defined, to ensure consistency and to govern all access by intermediaries and ESCB members.
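A central API catalog of this kind can be sketched as a simple registry: each intermediary registers its data products (the names, owners, and access patterns below are purely illustrative assumptions, not part of any ESCB specification), and IReF resolves a product before consuming it.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ApiProduct:
    """A data product exposed by an intermediary (names are illustrative)."""
    name: str      # e.g. "loans-granular-monthly"
    owner: str     # intermediary identifier
    pattern: str   # "pull" or "push"
    version: str = "v1"


class ApiCatalog:
    """Central registry governing which data products IReF may consume."""

    def __init__(self) -> None:
        self._products: dict[str, ApiProduct] = {}

    def register(self, product: ApiProduct) -> None:
        key = f"{product.owner}/{product.name}/{product.version}"
        if key in self._products:
            raise ValueError(f"duplicate registration: {key}")
        self._products[key] = product

    def lookup(self, owner: str, name: str, version: str = "v1") -> ApiProduct:
        return self._products[f"{owner}/{name}/{version}"]


# Example: a hypothetical intermediary publishes a pull-based data product,
# which IReF later resolves through the governed catalog.
catalog = ApiCatalog()
catalog.register(ApiProduct("loans-granular-monthly", "bank-123", "pull"))
product = catalog.lookup("bank-123", "loans-granular-monthly")
```

In practice the catalog would also carry schemas, access policies, and versioned contracts; the point of the sketch is that one governed lookup point decouples IReF from each intermediary's technology stack.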

Scalability

How can this data integrator scale given the high volume, velocity, and granularity of its data? Distribution is the key to scaling this ecosystem, including the 'tsunami' scenario. Functional leaders should look at emerging technologies that provide a new magnitude of change in terms of computing power, bandwidth, and AI. Which tech trends have distribution at their core and are pushing the frontiers in terms of speed, volume, and granularity? Edge networks could be an approach to dealing with data latency and granular, dedicated data access between intermediaries and IReF. Moreover, geo-distributed edge nodes should integrate with a self-service data platform that forms the core of IReF data processing and visualisation. Hyperscalers provide services and partnership ecosystems for seamless edge-to-cloud connectivity. Serverless architecture might be an approach to large-scale data processing. For the sake of space, we mention data processing for a single intermediary (engines such as Spark can run serverless) or for all intermediaries belonging to an ESCB member, i.e., an NCB (massively parallel data warehouses in hyperscalers can run serverless).
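The scaling idea behind the 'tsunami' scenario can be illustrated in miniature: submissions from many intermediaries arrive at once and are fanned out to a pool of workers, exactly the pattern a serverless or massively parallel backend applies at far larger scale. The validation rule and intermediary names below are invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor


def process_submission(intermediary_id: str, records: list[dict]) -> dict:
    """Validate and summarise one intermediary's monthly submission.

    The rule (reject negative amounts) is a stand-in for the harmonised
    transformations IReF would actually apply.
    """
    valid = [r for r in records if r.get("amount", 0) >= 0]
    return {
        "intermediary": intermediary_id,
        "accepted": len(valid),
        "rejected": len(records) - len(valid),
    }


# Month-end 'tsunami': 50 submissions land simultaneously; each contains one
# valid and one invalid record in this toy example.
submissions = {
    f"bank-{i:03d}": [{"amount": 100 * i}, {"amount": -1}] for i in range(50)
}

# Fan the work out across a worker pool; a serverless engine does the same
# with elastic capacity instead of a fixed-size pool.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(lambda kv: process_submission(*kv), submissions.items()))
```

Because each submission is processed independently, the pool size (here 8 threads) can grow with the load, which is precisely what makes the peak-time spike tractable.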

Business Efficiency Indicators

Will data be delivered, processed, and visualised in batch, near-real-time, or real-time for IReF? What is the acceptable time lag between an intermediary delivering a report and IReF processing it? And how fast should the processed data be propagated to the statistical layer for regulatory compliance dashboards? Functional and business leaders should quantify what timing really means for IReF to make an impact and create value for the ESCB. Based on IReF's archetype of reporting and analytics, we would intuitively surmise that IReF should deliver and process data in near-real-time (notwithstanding our conjecture, the ultimate analysis and decision rest with the ESCB). Existing architectures such as Lambda and Kappa provide approaches to processing data in real-time, near-real-time, or batch. In addition, leaders should not let technology costs dominate these questions: cloud computing costs continue to decline while data services become ever more scalable.

Cost-Cutting: One Platform to Found Them All

How can the ESCB cut costs by integrating with initiatives of sibling entities that are similar in nature to IReF? We see the Digital Euro[4] as a potential candidate to run alongside IReF. IReF and the Digital Euro share the same barriers on how to exchange, share, and combine data within the same environment of a multitude of intermediaries. Other initiatives should be considered for joining such a platform to leverage its extensive user base, robust infrastructure, and collaborative ecosystem, subsequently cutting costs for the ESCB, the Eurosystem, and the ECB. A different question to ask is how costs can be cut by consolidating operations. All initiatives (IReF, the Digital Euro, and others) will have different data sources, data models, and insights, resulting in different ETL processes. The engineering time spent on rebuilding the ETL process of one initiative for another should be minimised. The DataOps method can achieve this goal by automating workflows and eliminating low-value operations such as managing test data and code quality checks. Time lags between the development and utilisation of data ingestion, modelling, and reporting will be greatly reduced by introducing reusable processes and automating testing.
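The reuse argument can be made concrete with a small sketch: a pipeline skeleton whose extract, transform, and load stages are pluggable, so a second initiative swaps stages instead of rebuilding the whole ETL process. The stages and field names below are hypothetical examples, not actual IReF or Digital Euro schemas.

```python
from typing import Callable, Iterable


def make_pipeline(
    extract: Callable[[], Iterable[dict]],
    transform: Callable[[dict], dict],
    load: Callable[[list[dict]], None],
) -> Callable[[], int]:
    """Build a reusable ETL pipeline from pluggable stages (DataOps style).

    The skeleton (and any tests written against it) is shared across
    initiatives; only the three stage functions differ.
    """
    def run() -> int:
        rows = [transform(r) for r in extract()]
        load(rows)
        return len(rows)  # row count doubles as a cheap quality signal
    return run


# One initiative's concrete pipeline: toy source records, a currency
# normalisation step, and an in-memory sink standing in for a warehouse.
sink: list[dict] = []
iref_run = make_pipeline(
    extract=lambda: [{"amount": "100"}, {"amount": "250"}],
    transform=lambda r: {"amount_eur": int(r["amount"])},
    load=sink.extend,
)
count = iref_run()
```

A sibling initiative would call the same `make_pipeline` with its own stages, so automated tests, scheduling, and monitoring wrap the shared skeleton once instead of once per initiative.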

IBM Consulting stands tall as an industry leader, having consistently delivered innovative solutions and strategies that drive success for several Euro institutions in RegTech and SupTech areas such as payments, supervisory processes and portals, data warehousing and lakes. With its unmatched expertise, commitment to excellence, and a strong track record of achievements in Data & Technology Transformation, IBM continues to be a trusted partner for the Banking and Financial Market sector seeking unprecedented growth and groundbreaking transformation.


Authors

Andi Bejleri

Senior Manager

Data & Technology Transformation

Andi.Bejleri@ibm.com

Umesh Nimbalkar

Associate Partner

Banking and Financial Market

umesh.nimbalkar@in.ibm.com

Marinela Bilic-Nosic

Executive Partner

Banking and Financial Market

Marinela.Bilic-Nosic@ibm.com


[1] https://www.ecb.europa.eu/stats/ecb_statistics/co-operation_and_standards/reporting/html/index.en.html

[2] https://www.ecb.europa.eu/ecb/orga/escb/html/index.en.html

[3] https://www.ecb.europa.eu/pub/pdf/other/ecb.iref_overview2023~5897910183.en.pdf

[4] https://www.ecb.europa.eu/paym/digital_euro/html/index.en.html

