
Data volumes are not just growing, they are exploding. Already measured in zettabytes – and perhaps in yottabytes in the not-too-distant future – these volumes are causing more than a headache for today's organisations. These vast pools of data are also putting traditional database architecture to the test.

Nowhere is this problem felt more acutely than in the banking industry, where the situation is exacerbated by a complex raft of issues. For example, many banks have had the same legacy systems in place for decades.

Often these are not fully integrated with others in the organisation and, consequently, many applications still run in a siloed environment. In a recent study by analyst firm Enterprise Strategy Group (ESG), commissioned by InterSystems, 38% of those polled reported that they had between 25 and 100 unique database instances, while another 20% had over 100. Although this was a general survey, not confined to the banking industry, it gives some idea of the scale of the problem.

So although many banks own vast amounts of data, they are often unable to do anything with it – least of all analyse it in real time. This means they frequently lack the capability to provide the open banking demanded by new regulations such as PSD2.

Banks have been addressing new regulations in a piecemeal fashion for too long, and this approach is now catching up with them. With each new ruling they have put a new siloed application in place to meet its specific needs and no more – but there is a limit to how long this can continue. Today's regulations demand an end to data silos, with enterprise-wide integration and the ability to analyse data in real time.

These are broad-brush requirements. At a more granular level, banks must think through the step-by-step processes needed to achieve compliance. Typically, they will need to bring information in from multiple applications, run reporting on this data in real time and generate output in a format that meets the regulator's precise requirements.
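To make that pipeline concrete, here is a minimal sketch in Python of what such a reporting step might look like. The application names, fields and CSV layout are illustrative assumptions, not any regulator's actual schema:

```python
# Minimal sketch of the reporting pipeline described above: pull records
# from several application sources, consolidate them, and emit a report in
# a fixed submission format. Source names, fields and the CSV layout are
# hypothetical placeholders, not a real regulator's schema.
import csv
import io
from datetime import datetime, timezone

# Stand-ins for data pulled from separate banking applications.
payments_app = [{"account": "A-100", "amount": 250.0, "currency": "EUR"}]
lending_app = [{"account": "A-200", "amount": 9000.0, "currency": "EUR"}]

def build_report(*sources):
    """Merge records from each application and stamp them for submission."""
    rows = []
    for source in sources:
        for record in source:
            rows.append({
                "reported_at": datetime.now(timezone.utc).isoformat(),
                **record,
            })
    return rows

def to_submission_csv(rows):
    """Render the consolidated rows in the (hypothetical) required layout."""
    buffer = io.StringIO()
    writer = csv.DictWriter(
        buffer, fieldnames=["reported_at", "account", "amount", "currency"])
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

print(to_submission_csv(build_report(payments_app, lending_app)))
```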

As a result, banks must seek out a data platform that can ingest data from real-time activity, transactional activity and document databases. From there, the platform needs to take on data of different types, from different environments and of different ages, then normalise it and make sense of it. The platform they select must be able to reach out to disparate databases and silos, bring the information back and analyse it in real time.
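As a rough illustration of that normalisation step, the sketch below maps records from three hypothetical sources – a transactional row, a real-time event and a document-database entry – onto one canonical shape. The record shapes and field names are assumptions for illustration, not any particular platform's API:

```python
# Sketch of normalisation: records arrive from transactional, real-time
# and document sources in different shapes, and each is mapped onto a
# single canonical schema. All shapes and names here are illustrative.
from datetime import datetime, timezone

def normalise(record):
    """Map a source-specific record onto one canonical schema."""
    if record.get("source") == "transactions":   # relational-style row
        return {"id": record["txn_id"], "value": record["amount"],
                "when": record["booked_at"]}
    if record.get("source") == "events":         # real-time feed message
        when = datetime.fromtimestamp(record["ts"], tz=timezone.utc)
        return {"id": record["event"]["id"], "value": record["event"]["val"],
                "when": when.isoformat()}
    if record.get("source") == "documents":      # document-database entry
        return {"id": record["_id"], "value": record["body"].get("value"),
                "when": record["body"].get("date")}
    raise ValueError(f"unknown source: {record.get('source')}")

incoming = [
    {"source": "transactions", "txn_id": "T1", "amount": 42.0,
     "booked_at": "2019-03-01T10:00:00"},
    {"source": "events", "ts": 1551434400, "event": {"id": "E9", "val": 7.5}},
    {"source": "documents", "_id": "D3",
     "body": {"value": 13.0, "date": "2019-02-28"}},
]
print([normalise(r) for r in incoming])
```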

This platform must also have the agility to separate out the data banks need from the data they do not need to access. At the same time, as businesses migrate systems and applications to the cloud, they are beginning to use software to 'containerise' their applications and modules. Once these containers have been set up in the cloud, they can be reused by other applications.

It is crucial that a data platform enables data to be interrogated even when it sits in large data sets spread across different silos. This capability allows a bank to comply with regulatory requirements such as answering unplanned, ad hoc questions from industry regulators.
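A toy example of what such cross-silo interrogation can look like: below, two separate SQLite databases stand in for siloed systems, and a single ad hoc query joins them. SQLite's ATTACH is simply a convenient stand-in here for whatever federation mechanism the chosen data platform actually provides:

```python
# Two in-memory SQLite databases act as "silos"; one ad hoc, regulator-style
# query spans both. The schema and data are invented for illustration.
import sqlite3

core = sqlite3.connect(":memory:")
core.execute("ATTACH DATABASE ':memory:' AS risk")  # the second "silo"

core.execute("CREATE TABLE accounts (id TEXT, holder TEXT)")
core.execute("CREATE TABLE risk.exposures (account_id TEXT, exposure REAL)")
core.execute("INSERT INTO accounts VALUES ('A-100', 'Acme Ltd')")
core.execute("INSERT INTO risk.exposures VALUES ('A-100', 1.25e6)")

# An unplanned question spanning both silos: who holds large exposures?
for row in core.execute(
        """SELECT a.holder, e.exposure
           FROM accounts a JOIN risk.exposures e ON a.id = e.account_id
           WHERE e.exposure > 1e6"""):
    print(row)
```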

The advantage of working this way is that it can take the bank far beyond compliance. It will now have a secure, panoramic view of disparate data which can be used for distributed big data processing, predictive and real-time analytics, and machine learning. Real-time and batch data can be analysed simultaneously at scale, allowing developers to embed analytic processing into business processes and transactional applications, enabling programmatic decisions based on real-time analysis.
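As a sketch of what embedding analytics in the transaction path might look like, the example below scores each incoming payment against a running statistic and makes the decision inline rather than in a later batch job. The window size and the simple z-score rule are illustrative assumptions, not a production fraud model:

```python
# Each payment is scored against a rolling statistic and routed inline.
# The threshold and z-score rule are toy assumptions for illustration.
from collections import deque
from statistics import mean, stdev

WINDOW = deque(maxlen=50)   # recent amounts kept for the running statistic

def process_payment(amount):
    """Score the payment in real time, then decide inline."""
    flagged = False
    if len(WINDOW) >= 10:
        mu, sigma = mean(WINDOW), stdev(WINDOW)
        if sigma > 0 and (amount - mu) / sigma > 3:   # simple z-score rule
            flagged = True                            # route to review
    WINDOW.append(amount)
    return "review" if flagged else "approve"

for amt in [100, 120, 95, 110, 105, 98, 102, 115, 99, 101, 5000]:
    print(amt, process_payment(amt))
```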

So although many banks and other financial services organisations may feel they are being swallowed up by data, the need for compliance will ensure this doesn’t happen. The more they are storing on legacy systems, the more they are going to need an updated data platform. If they think carefully about selecting the right one, the move could result in improvements across data management, interoperability, transaction processing and analytics, as well as the means to address today’s and tomorrow’s regulatory demands.

By Jeff Fried, Director, Product Management – Data Platforms, InterSystems
