
Legacy Systems and Data Security in Open Banking

Shuvo G. Roy

The Catalyst for Change

Billed as a game changer by most in the industry, Open Banking witnessed a managed roll out in the UK in April 2018, paving the way for customers to experience enhanced banking services through a variety of authorised providers. The Competition and Markets Authority ushered in Open Banking with the aim to improve the quality of banking and financial services, ensuring banks remain customer-oriented in an extremely competitive market.

Optimistic market forecasts estimate that Open Banking could generate more than £7.2bn by 2022 if various sectors tap into its massive potential.

Open Banking allows secure data sharing through an integration technology called an Application Programming Interface (‘API’), which accesses customers’ account and transaction information and even allows third party providers (‘TPPs’) to initiate payments on behalf of customers, but only upon their explicit approval.
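The ‘only upon explicit approval’ condition is the crux of the model. As a purely illustrative sketch (the class and function names below are invented for this example, not taken from any real Open Banking SDK), the consent gate can be modelled like this:

```python
# Toy model of Open Banking consent: a TPP may read account data
# only while the customer's explicit consent is active.
# All names here are illustrative, not a real API.

class ConsentRegistry:
    def __init__(self):
        self._consents = set()  # active (customer_id, tpp_id) pairs

    def grant(self, customer_id, tpp_id):
        self._consents.add((customer_id, tpp_id))

    def revoke(self, customer_id, tpp_id):
        self._consents.discard((customer_id, tpp_id))

    def is_active(self, customer_id, tpp_id):
        return (customer_id, tpp_id) in self._consents


def read_transactions(registry, accounts, customer_id, tpp_id):
    """Return the customer's transactions, or raise if consent is missing."""
    if not registry.is_active(customer_id, tpp_id):
        raise PermissionError("no active customer consent for this TPP")
    return accounts[customer_id]


registry = ConsentRegistry()
accounts = {"alice": [{"amount": -42.50, "payee": "Grocer"}]}

registry.grant("alice", "budget-app")
print(read_transactions(registry, accounts, "alice", "budget-app"))

registry.revoke("alice", "budget-app")
try:
    read_transactions(registry, accounts, "alice", "budget-app")
except PermissionError as e:
    print("blocked:", e)
```

The same revocable-permission shape, with consent scoped per provider rather than granted once to the bank, is what distinguishes the Open Banking data flow from ordinary screen-scraping.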

As we move into 2019, what has actually changed and what lessons can we learn? Has this ‘great disruptor’ in the banking sector lived up to its initial hype?

A Closed Mind to Open Banking

The CMA reported that in June, there were 1.2 million uses of Open Banking APIs, describing it as a slow but positive start to changing consumer attitudes and revitalising the banking ecosystem for the better.

However, one senior source at a financial technology company told The Daily Telegraph: “The lack of promotion by the big banks has been disappointing and it’s the main reason for the slow take-up”.

So what are the reasons for the slow start? Why are the big banks taking their time?

Anne Boden, CEO and founder of Starling Bank, has been quoted as saying that the big banks “are all using legacy technology that’s 20, 30 or 40 years old… there’s no commercial reason why they want to do it [Open Banking]. Without that it’s a very difficult thing to do.”

Though public sentiment towards Open Banking is far from effusive, remember that it is a complex change and will take time to transform the way banking is done. Open Banking inherently brings a raft of technological and economic risks for the traditional banking model, and navigating those changes will be an uphill task. One of the biggest teething problems in the sector is the legacy technology still running in the major banks, in some instances thirty or forty years old, which prevents them from quickly benefiting from this ambitious regulatory-driven process. The cost of overhauling that legacy technology to allow API integration is prohibitively high, adding further friction to the process of adoption.

However, if banks and financial organisations are eager to monetise the myriad opportunities presented by Open Banking, they need to be quick about overhauling their systems and IT infrastructure. They also need to innovate constantly, bringing out banking apps and other technology-driven solutions to enhance the banking experience for their customers.

Though the CMA provides guidelines on security measures and details of regulated providers, it still fails to address the underlying legacy technology issues that could lead to the loss of customer data in transfer.

Driving the Change

Banks own valuable customer data and are fiercely protective of it. Consumers unfamiliar with the actual applications of Open Banking are reluctant to embrace it, fearing fraudulent transactions and other complications arising from the technology. A general lack of awareness of the risks and benefits associated with Open Banking has further limited its appeal among the masses.

Therefore, the challenge for the banking sector is in implementing these concepts on the ground. Any compromise on customer data will result not only in regulatory penalties but also in damaging press. No wonder then that cyber and data security rank amongst the top priorities of every bank CIO and CEO.

Since Open Banking requires banks to share detailed customer information (other than sensitive payment data), they must undertake due diligence when sharing it, even with the express consent of the customer. Banks and TPPs need to ensure customer consent is obtained with due emphasis on the customer’s ability to understand and appreciate the possible outcomes of providing their data. Since banks are deemed the final custodians of customer information, they have to secure their systems against financial crime and fraud, and meet anti-money laundering (AML) obligations, among other things. Further, a bank’s IT infrastructure will need to be more secure and resilient, as it will now be exposed to threats ported through TPP systems. Banks will have to invest more effort and energy in analysing and discovering potential points of vulnerability, and take adequate measures to address them holistically. Core banking systems need to adopt open API-based peripheral development, delivering quicker implementation cycles and minimal customisation of the core product. Furthermore, the industry’s adoption of API standards should set a benchmark for all involved parties; banks and TPPs should adhere to and promote development in line with these standards.

Finally, it is worth mentioning that many large payment systems and core banking providers have developed Open Banking-compliant solutions. Without going into a lengthy debate on the merits and demerits of each of them, it might suffice to recognise that these systems, along with robust identity and access management systems, can comprise a strong first line of defence for the Open Banking ecosystem.

The Best Has Yet to Come

While the consumer experience may not have altered significantly in the initial rollout of Open Banking, experts opine that it won’t be long before the positive effects of this innovative model trickle down to the end users.

Already, the market is charged with competition and has become riper for innovation. Positive changes are taking place internally and banks are strategising to become more customer-centric and proactive. This will bode well for the long-term relationships banks have with their customers. As we gear up for the next wave of Open Banking, we hope that its innovative model will lead to a level playing field for both customers and banks. For once, innovation will go hand in hand with pragmatism and plain grit, to script the winning equation for the future of banking.

By Shuvo G. Roy, Vice President & Head – Banking Solutions (EMEA), Mphasis


A GDPR storm is coming – are you prepared?

Julian Saunders, CEO and founder, PORT.im

Julian Saunders, CEO and founder of personal data governance company PORT.im, discusses how alleged breaches of GDPR by Facebook and Twitter may just be the beginning

Cast your mind back to early 2018. The world was alive with the sound of GDPR commentary. In the run-up to the May compliance deadline, everything was up for debate. Would it spell the end of marketing as we know it? Was anyone actually compliant? Was it good news or bad news for businesses? And, getting the most airtime – would GDPR be a damp squib like the Cookie Directive?

If you were of the opinion GDPR was a lot of hot air, the intervening months may feel like vindication. GDPR has largely gone off the agenda of most media publications and with it the minds of many business owners. However, we’re merely in the eye of the storm. In the last few weeks Facebook, and now Twitter, have been squarely in the crosshairs of regulators for allegedly failing to comply with GDPR. The EU has issued a stark warning that big fines will be handed down before the end of the year. Similarly, the ICO has ramped up its warnings that major action is likely to be taken. Added to this momentum has been a seemingly endless series of high-profile data breaches with Google+ the latest casualty.

For business owners who put their GDPR compliance on the backburner since May, the warnings could not be clearer: If you aren’t GDPR compliant you’re likely to be in some serious trouble in the next few months.

Facebook has quickly become the poster boy for poor data governance procedures. Cambridge Analytica, data breaches, and GDPR failures have all come in quick succession and provide a case study for businesses on how not to collect and manage data. While it may be tempting to revel in some schadenfreude, a better approach is to see what every business can learn from Facebook and how they can protect themselves from the expected GDPR storm.

First, it should go without saying that financial organisations hold some of the most sensitive personal data. Thankfully, the most important data linked to account information has largely been well protected. However, having high security standards around bank accounts can breed complacency especially when you consider it’s not the only information the average financial company holds. The marketing, customer service and sales departments will all, usually, have their own customer databases which may be subject to vastly different security and governance standards. A breach related to any of this data could be fatal to a financial organisation and result in hefty GDPR fines.

General complacency is kryptonite for data management and protection. For Facebook, its complacency manifested itself in lax standards, questionable practices and a belief it would never be brought to account. For financial organisations, it can lead to blind spots related to data that is deemed less ‘sensitive’. Often, to enable smooth marketing, client management and sales operations, customer data is more readily accessible than financial information, shared with more parties, updated more frequently and inputted into more platforms. Each of these processes increases risk. Compounding this issue is a general lack of education related to the power of this data to do harm. Many would ask, what use is an email address to a hacker? The short answer is, a lot. This is why GDPR seeks to protect every piece of personal data.

If you’ve got to this point in this article and you’re beginning to feel some doubt surrounding your data practices – good. Now is the perfect time to audit and review all your data processes and security standards. The baseline should be – is everything GDPR compliant? If it was in May – is it still compliant? New technology, teams and initiatives can all impact your data processes and result in non-compliance.

If you avoided all of this in the faint hope that GDPR wasn’t going to be an issue, you need to get on it immediately. In this instance, buying in technology and availing yourself of the services of specialist consultants will be the fastest (but not the cheapest) option.

Next, what is the general understanding of your staff? All the procedures and technological safeguards will mean nothing if your colleagues do not understand what GDPR is and the danger of data breaches. Undertaking company-wide training regularly and incorporating data management expertise and ethics into staff development and assessment can be a powerful way to measure and improve education.

Finally, if the worst happens and there’s a breach – are you prepared? Time and again we see that a poorly handled response to a data breach generally does more damage than the breach itself. Again – I’ll point to Facebook and its slow, incomplete and unsatisfactory responses to each and every data issue it has encountered.

Slow responses are symptomatic of a failure to have the right procedures in place. This can be because there is no technology or expertise available to identify the breach in the first instance or the right people are not empowered to make quick decisions. You need to start from the position that any breach, no matter how minor it appears, is serious. It should be reported to a specialist team led by the CEO. Within that team should be the IT lead, marketing, customer service and legal. Consumers should be informed as quickly as possible, both to be GDPR compliant, and to reassure. The business needs to identify who is impacted, how, what went wrong, how it can be fixed and how consumers will be protected in the future. The faster these boxes are ticked and communicated the better the end result – especially if the ICO gets involved. As with anything, practice makes perfect. Conducting wargames and drawing up ideal responses and contingencies with this team could make all the difference.

We now live in a world where the reputation and future of a company can be destroyed by hacks and data breaches. Organisations are generally to blame for this environment. There has long been a culture that personal data is a commodity that businesses can deal with as they wish. Now the wheel has turned. If you’re one of the many business owners that still believe that data governance is just something for the IT department to worry about – you’re going to be in for a big surprise. By the end of the year, a number of large businesses will be hit with near-fatal fines as a warning to other companies. Acting now will ensure that your company is not one of these cautionary tales.


Indian FinTech sector has potential to cross $2.4 billion in earnings by end-2020

Abhishek Kothari, Co-founder, FlexiLoans

2020 is almost here, and it is a perfect time to look back on 2019 and appreciate the highs and lows. By this point in 2019, the words ‘FinTech’, ‘Data Science’ and ‘Machine Learning’ have become relatively common, and implications attached to these words have become apparent to anyone who is a part of the modern world.

FinTech in India has been growing at a significant pace for the last four years as a result of the increasing focus from RBI, government policies, advancing technology and affordable smartphones and data.

In turn, the Indian FinTech ecosystem has finally matured with the public at large, becoming more receptive towards digitization and tax automation. This is owing mainly to the demonetization of 2016 and the introduction of the Goods and Services Tax in 2017. In fact, implementation of GST alone has led to dedicated startups and new business verticals from established brands to help small, medium and large businesses with their taxes.

2019 was expected to be a year with continued momentum, but it came with its share of surprises. The industry did not grow as fast as anticipated, but like everything else in life, there were also moments of delight.

Firstly, the IL&FS liquidity crisis led to a massive trickle-down effect on NBFC lending, which led to a considerable reduction in available debt to smaller NBFCs. Liquidity is the raw material for financial services, and in the absence of a steady supply, many FinTechs grew slower than expected.

Secondly, RBI continues to be silent on some key issues like e-KYC, e-sign and e-NACH, which were the catalysts for a seamless journey and growth. The relevant circulars were expected after the elections, but they have been delayed, leading to a lack of clarity.

Thirdly, UPI and Payments saw a great deal of growth and investments coming in. UPI has been recognized globally as a masterpiece of innovation. With 143 banks live on UPI clocking 1.2Bn transactions in November alone, it has completely transformed the way money moves in India.

2019 was also a year with many FinTechs building real-time, fully automated and intelligent solutions for lending and payments. AI and Machine Learning saw some real takers and many human-led processes were fully automated.

As liquidity continues to come back and the wait for RBI to streamline KYC continues, the trends I see shaping FinTech startups in 2020 involve a highly aware customer and further innovations in data science and data engineering.

Trend 1: Indian users are rapidly moving towards a mobile-first approach for accessing financial services, and they prefer vernacular platforms.

With a 400Mn reach of WhatsApp and thousands of hours of content being created by OTT platforms – Indian consumers are online on their smartphones. YouTube in India has over 1,200 channels with one million subscribers, and this number was only 14 in 2014. 

This provides an unparalleled opportunity for tech companies to build digital journeys and solutions to disrupt almost everything that we know today. Financial Services, Transportation, Logistics, Shopping, Telecom, Healthcare, Education are all going to see newer players challenging the status quo. There is nothing called Digital Strategy now, it’s just Strategy to survive in a Digital India!

FinTech is witnessing the same behavioral shift: more than 95% of users now apply for a loan using a mobile device, while this number was less than 30% three years ago. We have seen 2X conversion on our vernacular pages compared to English landing pages.

Trend 2: Data Science and Engineering are delivering substantial cost efficiencies and better decisions with cutting edge applications of Computer Vision, Optical Character Recognition and Pattern recognition.

FinTech is growing at an exponential pace in India with high applications of data science in aspects like lending, insurance, broking and wealth management. Several lending companies have used image, text, and voice as input data sources to provide accurate decisions and better experiences than their banking counterparts in the last couple of years in India. Optical Character Recognition was meant to read the text inside images and transform that into digital text data. Now, there is an integration of OCR in our daily lives – from scanning documents and credit cards to data entry. The traditional, time-consuming paper-based work has been replaced with an optimized way of collecting the same data. With the enhanced ease in collecting data, data scientists can start their analysis journey quicker.
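The efficiency gain described here comes largely from the structuring step that follows recognition: once an OCR engine (such as Tesseract) has emitted flat text, that text must be turned into fields a decision model can consume. A minimal, hypothetical sketch of that step in Python; the field labels and document layout are invented for illustration:

```python
import re

# Toy post-OCR step: turn the flat text an OCR engine emits into
# structured fields a lender's decision model can consume.
# The field labels below are hypothetical examples.

def extract_fields(ocr_text):
    fields = {}
    for line in ocr_text.splitlines():
        # match lines of the form "Label: value"
        match = re.match(r"\s*([A-Za-z ]+):\s*(.+?)\s*$", line)
        if match:
            key = match.group(1).strip().lower().replace(" ", "_")
            fields[key] = match.group(2)
    return fields

scanned = """Name: Asha Mehta
PAN Number: ABCDE1234F
Monthly Income: 85000"""

record = extract_fields(scanned)
print(record["pan_number"])                   # ABCDE1234F
print(int(record["monthly_income"]) > 50000)  # True
```

Real pipelines add validation and confidence scoring on top of this, but the shape is the same: recognition produces text, structuring produces data, and only then can analysis begin.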

Data Science and Data Engineering are working more closely than ever with T-shaped data scientists becoming popular by the day.

As one of the youngest nations in the world, India has a considerably large section of its population that is significantly more receptive and adaptive to new technology. The result is tech-savvy, zealous entrepreneurs pushing the Indian FinTech industry towards potential earnings to the tune of US$2.4 billion by end-2020.


Accenture to enhance core banking platform with SEC Servizi acquisition

Accenture has completed its acquisition of Italian banking technology service provider SEC Servizi Spa from the Intesa Sanpaolo Group. Accenture now holds 80.8% of SEC Servizi and will also acquire the remaining interests held by other shareholders.

Established in 1972, SEC Servizi is a consortium formed by Italian banks to provide IT services and outsourcing solutions for banks and other financial institutions in the country. Its offerings include application and facility management, centralized back office services and specialized multi-channel, consulting, education and support solutions. The company reportedly manages more than 21 million transactions per day for nearly 1,400 bank branches in Italy and had revenues of EUR 152 million at the end of 2017. Its clients include Banca di Credito Popolare, Banca Italo Romena, Banca Nuova, Veneto Banca and Allianz Bank Financial Advisors, among others. Intesa Sanpaolo acquired SEC Servizi in 2017 as part of the acquisition of certain assets, liabilities and legal relationships of Banca Popolare di Vicenza S.p.A. and Veneto Banca S.p.A., both in compulsory administrative liquidation.

The acquisition of SEC Servizi’s expertise and technology and operational assets will enable Accenture to create an advanced and innovative core banking platform that can support banks in their transition to digital. This transaction will help to establish Accenture as a leader in the banking technology market in Italy, serving SEC Servizi Spa’s existing customers, including Intesa Sanpaolo and other mid-sized financial institutions in Italy.

After slowly recovering from the financial crisis, Italian banks are now looking at modernizing their technology infrastructure and are increasingly relying on digital resources to remain competitive in the market and align their services to the digital-savvy customer. An indication of this is that the number of branches at the end of 2017 was 20 per cent lower than in 2008. Banks such as UniCredit, Intesa Sanpaolo, Monte dei Paschi, Mediobanca and Banca Carige are leading the way with digitalization initiatives ranging from contactless payments and virtual reality branches to robo-advisory services.

For Accenture, this presents an opportune time to enhance its core banking technology services with the acquisition of SEC Servizi Spa.


The spreadsheet challenge as banks move processes to European financial centres in preparation for Brexit

Henry Umney, CEO, ClusterSeven

Uncertainty around Brexit continues, but practical preparations have begun – many banks are now well in the throes of duplicating or moving systems and business processes from London to other financial hubs.

Extricating processes isn’t going to be an easy task. There are two aspects to this separation: formal IT-supported enterprise systems, and grey IT (end-user-supported IT systems). Most banks have the understanding and the ability to disentangle the core enterprise systems effectively. Where banks are likely to come unstuck is where end-user-supported IT, commonly Microsoft Excel spreadsheet-based processes, is deeply linked with the rest of the banking group’s enterprise systems.

If a bank is required to set up a separate entity in the UK, all the data residing in ancillary spreadsheets that feed data into the various systems pertaining to this jurisdiction will need to be delinked/duplicated and housed separately too. For instance, as banks separate their Treasury operations, there will likely be certain processes that heavily rely on common Bloomberg and Reuters market feeds that are owned by or have deep linkages to the banking group’s systems. Similar issues will arise for capital modelling-related processes. While previously a bank might be evaluating business risk based on its aggregated position across its European operation, post-Brexit, determining the UK entity’s risk position will require the financial institution to disconnect and separate the relevant data for this jurisdiction.

Essentially, as banks duplicate their enterprise systems for specific jurisdictions, they need to do the same for the spreadsheet-based application landscape that they rely on operationally.

Disentangling these unstructured but business-critical processes manually will prove impossible and eye-wateringly costly. Typically, spreadsheets surround core systems such as accounting, risk management, trading, compliance and tax. Complete visibility of the spreadsheet-based process landscape is essential to identify the processes that need to be duplicated or extricated for the new entity, but due to the uncontrolled nature of spreadsheet usage there will potentially be thousands of such interconnected applications and no inventory of them.

Banks should consider adopting an automated approach to safely extricating their spreadsheet-based processes. Spreadsheet management technologies can scan and inventory the entire spreadsheet landscape based on very specific complexity rules and criteria. The technology can expose the data lineages of individual files across the spreadsheet environment to accurately reveal the data sources and relationships between the applications.
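Once such a scan has produced an inventory of links, the lineage question (which files and feeds ultimately supply a given model) reduces to a graph traversal. A small sketch under assumed data; the file names and the `links` inventory below are hypothetical, not output from any real scanning product:

```python
from collections import deque

# Hypothetical inventory from a spreadsheet scan: each file or system
# mapped to the upstream files/feeds it pulls data from.
links = {
    "uk_capital_model.xlsx": ["fx_rates.xlsx", "uk_positions.xlsx"],
    "fx_rates.xlsx": ["bloomberg_feed"],
    "uk_positions.xlsx": ["group_treasury.xlsx"],
    "group_treasury.xlsx": ["reuters_feed"],
}

def upstream_sources(target, links):
    """All files/feeds that directly or indirectly feed `target`."""
    seen, queue = set(), deque([target])
    while queue:
        node = queue.popleft()
        for src in links.get(node, []):
            if src not in seen:
                seen.add(src)
                queue.append(src)
    return seen

print(sorted(upstream_sources("uk_capital_model.xlsx", links)))
# ['bloomberg_feed', 'fx_rates.xlsx', 'group_treasury.xlsx',
#  'reuters_feed', 'uk_positions.xlsx']
```

In a Brexit-style separation, everything in that transitive set has to be duplicated, delinked or re-pointed before the UK entity’s capital model can stand on its own.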

This approach is already proven in M&A-type operational transformation situations, which to some extent resemble the Brexit scenario. Aberdeen Asset Management adopted it to separate Scottish Widows Investment Partnership (SWIP) when it bought the business from Lloyds Banking Group. Given the number of convolutedly connected spreadsheets across SWIP’s vast spreadsheet landscape and the complexity of the business processes residing in that environment, manually understanding the lay of the land was unfeasible. Utilising spreadsheet management technology, SWIP inventoried the spreadsheet landscape, identified the business-critical processes, and pinpointed the files that required remediation. Simultaneously, the technology helped expose the data lineage of all the individual files, revealing their data sources and relationships with other spreadsheets. SWIP was able to securely migrate the relevant business processes to Aberdeen Asset Management and, where necessary, decommission the redundant processes.

Post-Brexit too, banks have a lot more to gain from automated spreadsheet management. Spreadsheets will likely be used to set up temporary business processes and solutions for the new operations. Spreadsheet management will embed best-practice-led use of these tools across the lifecycle of such applications – from creation through remediation to decommissioning into formal IT-supported applications – encompassing spreadsheets and their unique data flows. It will also offer banks an in-depth understanding of their data landscape. This will help institute data controls and spreadsheet change management processes so that there is complete transparency and an audit trail, tangibly reducing the operational, financial and regulatory risks caused by spreadsheet error.

 

By Henry Umney, CEO, ClusterSeven

 

About the author

Henry Umney is CEO of ClusterSeven. He joined the company in 2006 and for over 10 years was responsible for the commercial operations of ClusterSeven, overseeing globally all Sales and Client activity as well as Partner engagements. In July 2017, he was appointed CEO and is strongly positioned to take the business forward. 


The Danish startup putting the killing blow into key encryption technology

Danish encryption specialist Sepior, founded in 2014, was started on the back of ground-breaking encryption projects and the support of the EU’s Horizon 2020 programme. In discussion with IBS Intelligence it revealed that it has lots more surprises up its Fairisle jumper

Sepior’s big break came with the EU’s Horizon 2020 initiative, an irony not lost on CEO Ahmet Tuncay – as we spoke to him, the chaos which is Brexit continues to engulf Europe.

Ahmet Tuncay, Sepior’s CEO, said: “Yes, we’re a truly Danish company and found our footing within the Horizon programme, which deals mostly with small to medium enterprise projects, or SMEs. For companies with promising technologies, the EU economic commission provides grants for the ones it believes will become a commercial success. But there’s a fairly high bar to be granted this money: you have to commit to specific milestones and strict targets. The commitment our founders made was: ‘If you give us these funds and support, we’re going to create economic activity within the EU, which means hiring people and growing the company’.”

He continued: “Our obligation was really to take that money and create a piece of commercially viable technology.  At the early stages, specific use cases aren’t as important as the foundational technology and broad market appeal.   Once the foundation is created,  we wanted to be able to acquire institutional funding to go and build a business.  In the long term our obligation is to create jobs, insofar as the EU is concerned, but now we have commitments to our shareholders, so it’s not just jobs that matter today.”

Tuncay says: “If you just look at the size of the market for encryption key management, you’re not going to be impressed by the number, it’s only around a $1 billion market.  But if you take the same technology, repurpose it and, apply it to commercial asset exchanges, which is a $300 billion market, and find a way to participate in a revenue sharing opportunity, you’ve moved yourself from a $1 billion market to a $300 billion market. You then have to figure out how to extract your fair share from that activity.”

The numbers are certainly impressive if you consider the amount brokers and exchanges collect in fees – it’s a vast amount, certainly more than the $1 billion market for encryption key management. It runs to several hundred billion dollars; it is super lucrative, and it’s a great market to be in because few companies are good enough to offer a differentiated service to capture new customers.

Tuncay says: “Our investors recognised that the big pain of cryptocurrency activity is that if you lose the coins, they’re gone forever. So that turns up the need for novel security solutions more than ever. The digital wallet containing the cryptocurrency assets must be hosted in trusted custody, and the transactions involving the wallet must be protected against malicious or incompetent brokers and clients. The need for a higher level of security means having multiple signatures and multiple approvers, which is obviously more secure than having just one. When you have multiple approvers using our ThresholdSig technology versus MultiSig, or multiple signature, technologies, we can deliver very high levels of security and trust along with some operational benefits for the exchange, because the administration of the security policies – adding people, removing people, replacing lost devices, deciding who can participate in those signatures – is all done off-chain, and it’s simple.”

The alternative approach is to use MultiSig, which is all on-chain: when you change the policies you have to broadcast them, telling everyone who the approvers are, which is not really good for security. You may also have to reissue or generate new keys. There is a lot of administrative bureaucracy in that approach. Until recently MultiSig has been the gold standard for multi-approver cryptocurrency transactions, but ThresholdSig provides an equal or higher level of security with a more flexible, lower-administration environment, and it also has the potential to reduce the size of the recorded transaction on the blockchain. With MultiSig, the blocks actually contain the multiple signatures that signed off on the transaction, which of course increases block sizes.

Tuncay says: “With ThresholdSig there’s only one signature that goes on the ledger, so it actually reduces the amount of data on the ledger. It turns out these signatures are a substantial portion of the total transaction size.  So, there’s this kind of tertiary benefit that could end up being quite material, because it means that the blocks can contain more transactions. Blocks are typically fixed in size, so if the transactions are smaller you get more of them onto the chain.  In some of the currencies, like Bitcoin, it’s already hitting capacity on processing.  So, if you can have the highest level of security and smaller transaction sizes it’s going to maximise throughput.”

There is the hope that ThresholdSig transactions will also have lower transaction fees than MultiSig. ThresholdSig transactions appear as a single signature transaction on the blockchain. Historically, single signature transactions are the smallest in size, allowing for maximum transactions per block and typically have the lowest mining transaction fees. Our expectation is that the exchange could end up with lower transaction fees, with higher security and lower administrative overhead. So, there are some very compelling reasons why this technology is going to be relevant to a far wider audience than up to now.
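The block-capacity argument can be made concrete with a toy calculation: with a fixed block size, every signature removed from a transaction frees space for more transactions. The byte figures below are illustrative only, not real Bitcoin constants:

```python
# Toy comparison: MultiSig stores m signatures per transaction on-chain,
# ThresholdSig stores one combined signature. With a fixed block size,
# smaller transactions mean more of them per block.
# All byte figures are illustrative, not real protocol constants.

BLOCK_SIZE = 1_000_000  # bytes per block (fixed)
BASE_TX = 250           # non-signature bytes per transaction
SIG = 72                # bytes per signature

def txs_per_block(signatures_per_tx):
    tx_size = BASE_TX + SIG * signatures_per_tx
    return BLOCK_SIZE // tx_size

multisig_2of3 = txs_per_block(2)  # two signatures recorded on-chain
thresholdsig = txs_per_block(1)   # single combined signature

print(multisig_2of3, thresholdsig)
print(f"capacity gain: {thresholdsig / multisig_2of3:.2f}x")
```

Even in this crude model the single-signature scheme fits noticeably more transactions per block, which is the throughput benefit Tuncay describes for chains already hitting capacity.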

Sepior’s investors were on the front edge of recognising threshold schemes – the cryptographic approach built on multiparty computation – and how that technology could bring real benefits in this use case. As Tuncay says: “We’re focusing on the implementation around cryptocurrency exchanges and hot wallets, but this technology is applicable to a much wider range of applications. So next month we’re going to be making some announcements around more generic blockchain solutions, to provide more privacy on private blockchains in general. There are a whole series of problems with using distributed ledger technology for business, and one of these is scalability. How do you support – for example, in the case of logistics tracking – a container being loaded and shipped from a point in China to a destination in Los Angeles? Sometimes there are 35 or 40 different parties involved in that transaction. These parties don’t necessarily need to know everything on the blockchain. Effectively all the transactions are on the chain, so all the parties that are participating can validate and see their own transactions, but need not see the confidential data of other parties. One strategy for this has been to create virtual blockchains called channels, which is what Hyperledger Fabric uses, but its use creates a messy scalability problem.”

Tuncay says: “If I were to generalise it further: a blockchain is supposed to contain transactions that are immutable because everybody on the chain can validate them, but the downside is that everybody on the chain can also see everything on the chain. So how do you create an application like logistics tracking where there are 30 parties on the chain and you want every party to have a different view of it? Our solution to this – the existing solutions have proven to be unscalable – is based on an access control policy that relies on encryption to make only the intended parts of the chain available to users, based on their permissions.

“There is nothing magical about this, we’re just using our underlying key management system and fabric. But once we make this available, it will also enable the creation of privacy-preserving chains that are massively more scalable than what is possible today. We think there’s value there; again, this is something that we’re going to test out, and we’re involved in activity with several large companies to validate this. We think that it’s worthwhile.”

Fundamentally, Sepior is providing fine-grained control over who has visibility of what on the blockchain. The key words here are ‘threshold cryptography’. Sepior is pioneering and leading the industry in this field, applying key management concepts in a manner that is more scalable and works in distributed environments with a high degree of efficiency. In the case of a crypto wallet, the threshold aspect means that you might have four parties who are able to approve a transaction, with a threshold that says if any three sign, it will be accepted as a valid transaction. So if somebody loses their phone, or their device gets hacked and you no longer want to trust it, that device can be excluded while the remaining parties continue to transact and do business.
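The 3-of-4 idea can be illustrated with Shamir secret sharing, the classic threshold primitive. This is a toy sketch only: unlike the MPC-based scheme described here, it reconstructs the secret in one place, which a real threshold-signature system deliberately never does:

```python
# Toy 3-of-4 threshold via Shamir secret sharing over a prime field.
# Any 3 shares recover the key; any 2 reveal nothing useful.
import random

P = 2**127 - 1  # a Mersenne prime modulus (illustrative choice)

def make_shares(secret, threshold, parties):
    # Random polynomial of degree threshold-1 with the secret at x=0
    coeffs = [secret] + [random.randrange(P) for _ in range(threshold - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, parties + 1)]

def reconstruct(shares):
    # Lagrange interpolation of the polynomial at x = 0
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        secret = (secret + yi * num * pow(den, -1, P)) % P
    return secret

key = 123456789
shares = make_shares(key, threshold=3, parties=4)
assert reconstruct(shares[:3]) == key                       # any 3 work
assert reconstruct([shares[0], shares[2], shares[3]]) == key
assert reconstruct(shares[:2]) != key                       # 2 (almost surely) fail
```

This mirrors the wallet scenario above: if one device is lost or compromised, its share can simply be excluded and any three remaining parties still transact.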

Tuncay says: “When you move into the blockchain application the threshold aspect is more around signing key availability and management. What we’ve done here is to take the key management function and distribute it using multi-party computation (MPC).  We’re able to distribute the key generation and management functions across multiple virtual servers, if you will, in the cloud, such that no individual server has a full key that could be hacked or stolen.  But collectively maybe two out of three of these virtual servers can provide keys for all the users that require access to the content on that blockchain.  This threshold aspect gives a high degree of availability, reliability and integrity of both the encryption and the availability of key management.”

For this Danish company, it looks like blockchain is about to get The Killing it deserves.


Risk professionals have missed the innovation train – but it is not too late to get on board

Mark Davison, Chief Data Officer, Callcredit Information Group

Remember Blockbuster? Woolworths? HMV? Comet?

These brands – once household names – are now a distant memory. But they do lead us to think about what lessons we can learn from their demise. Ultimately, if businesses do not adapt to new technology, they run the risk of falling behind the times, and eventually, failing.

It seems that across almost all sectors, there is a push to innovate. Yet credit and risk professionals continue to rely on traditional techniques, largely ignoring a key tool that could see the revolution of the marketplace – machine learning.

Machine learning is one of the top buzz words being used at the moment – but what does it mean? It is a form of artificial intelligence (AI) that can learn from data, identify patterns and make decisions with minimal human intervention. Once these patterns have been found, they can be used to make predictions and solve a range of data-related problems.
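As a minimal illustration of that "learn from data, identify patterns" loop, here is a toy logistic-regression classifier in plain Python, trained on made-up credit features. Real credit models are of course far richer; this only shows the mechanics:

```python
# Toy logistic regression learning a default-risk pattern from
# synthetic data: (credit utilisation, missed payments) -> outcome.
# Features and labels are invented for illustration.
import math

# (utilisation, missed_payments) -> 1 = defaulted, 0 = repaid
data = [
    ((0.9, 3), 1), ((0.8, 2), 1), ((0.95, 4), 1), ((0.7, 3), 1),
    ((0.2, 0), 0), ((0.3, 0), 0), ((0.1, 1), 0), ((0.4, 0), 0),
]

w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(2000):                  # stochastic gradient descent
    for (x1, x2), y in data:
        p = 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))
        err = p - y                    # gradient of log-loss
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(x1, x2):
    """Estimated probability of default for a new applicant."""
    return 1 / (1 + math.exp(-(w[0] * x1 + w[1] * x2 + b)))

assert predict(0.9, 3) > 0.5   # high utilisation, missed payments: risky
assert predict(0.2, 0) < 0.5   # low utilisation, clean record: safe
```

The fitted weights also hint at the transparency argument made later: they show which inputs drove the score, evidence a lender could put in front of a regulator.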

And for risk professionals, this can offer a wealth of benefits – including ensuring regulatory compliance and enabling transparency in data usage. So it is time to ask, firstly, why is it not more prevalent in the sector? And secondly, what can we do to move towards a future where machine learning is at the centre of financial decision-making?

Why now?

Talk about AI and machine learning started a long time ago, but momentum has picked up in recent years, and businesses have started to not just talk about the technology but actually implement it too.

The reason for this is simple – availability. The sheer amount of quality data and computing power now available, combined with modern technology and approval from regulators around the world, makes it easier than ever to use machine learning to make more informed lending and credit decisions.

Late to the party

The downside is that credit and risk professionals are already late to the party: other industries, such as fintech, marketing and transportation, have already adopted this technology.

However, this should by no means deter the credit and risk sector from starting to use it. In fact, it is even more critical that they get to grips with machine learning or they could be left behind. As an article in the Harvard Business Review summarises ‘AI won’t replace managers, but managers who use AI will replace those who don’t.’

Fear of the unknown

Because the sector handles consumer finances, trust and reliability are crucial, meaning that adopting new technology is often viewed as an uncalculated risk that many are unwilling to take. This puts the sector at a disadvantage relative to others, such as retail and customer service, which are historically more open to trialling new techniques.

Fraud professionals may trust the tried and tested traditional systems that alert them to a potentially fraudulent application, more than machine learning. Or risk analysts for creditors may be hesitant to allow machine learning to make important financial decisions instead of specialists (or pre-existing trusted algorithms).

Despite these reservations, the time has come to move past the scepticism and embrace the fact that machine learning already surrounds us in every branch of society. Alexa uses machine learning to collect data and provide a service tailored to your preferences, Uber uses algorithms to determine arrival times and pick-up locations, and Netflix uses machine learning in its recommendation engine. Machine learning is everywhere, and credit and risk professionals need to innovate before it’s too late.

Better decisions, better lending

As regulators begin to shift the parameters that lenders and financial institutions work within, it is important that credit and risk professionals are performing at the top of their game – and machine learning can help.

In fact, at Callcredit we ran a year-long machine learning trial to highlight the potential benefits of the predictive accuracy that machine learning can offer. The results of the study were very encouraging and point to financial benefits for adopters of the technology in the credit, fraud and insurance arenas.

In one modelled scenario, the level of default in a portfolio of 60,000 credit cards was reduced significantly, resulting in a 10 per cent decrease in overall bad debt. If used with other elements of the customer lifecycle, potential machine learning generated benefits could be even greater.

As the benefits of machine learning become evident and transparency becomes a main focus for the sector, lenders can use machine learning as a new way to explain lending decisions and to demonstrate that the data going in backs up the decision that comes out. By analysing huge amounts of data before a decision is made, machine learning can discover relationships that were previously hard or impossible to see. The lender can then use the machine’s output to justify its decision to a regulator or a consumer.

Businesses can either embrace the innovation that technology brings, or risk being left behind by it. Many industries are improving their efficiency and performance by using smarter technology and it will only be a matter of time before machine learning will be the norm rather than the exception. It is time for credit and risk professionals to get on board and harness this technology to make better decisions for better lending today.

By Mark Davison, Chief Data Officer, Callcredit Information Group


PSD2 – banking on a gamechanger in ecommerce

Dr Rachel Gauci, Senior Legal Counsel at Credorax

The Payments Service Directive 2 (PSD2) came into full force in January 2018, bringing with it dreams of open banking that will transform the way we move and use money.

PSD2 opens up banks’ payments infrastructure and customer data assets to third parties.

Expect PSD2 to bring more options and innovations in payments and information services for consumers.  In this new era, banks are required to provide other third parties such as qualified payment service providers (PSPs) connectivity to access customer account data and to initiate payments.

The Giant Leap to Commoditisation

 From a merchant acquiring bank’s perspective, it is exciting to see all the new opportunities that PSD2 will bring in terms of transparency, fair competition, and entry barriers being broken down for new payment services.

The EU banks’ monopoly on their customers’ account information and payment services will soon be in the distant past.  Bank customers will have the power to give third-party providers permission to retrieve their account data from their banks.

PSD2 makes the role of the merchant acquiring bank more important than ever, because there will be an even stronger need for security and expertise. It takes know-how of the ins and outs of global payment services to truly leverage the benefits PSD2 brings to the payments landscape. There is going to be a greater need to understand the intricacies of helping merchants and retailers connect directly to the consumer’s bank account to initiate payment, and a need to safeguard consumers from bad ecommerce experiences, including fraud.

The Key is in Technology and Innovation

Retailers and ecommerce merchants, as well as other third-party providers, will look to merchant acquirers and ecommerce FinTechs to help them achieve an improved payment experience. They will need help to leverage the power of connecting with banks’ open application programming interfaces (APIs) without having to maintain any of the bank’s other backend systems themselves.

Through the utilization of banks’ APIs, non-banks can enter the financial market without the heavy compliance and infrastructure that banks are required to maintain. This ignites innovation in the financial market and brings fresh ideas about how to shape the banking experience.
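In practice, a third-party provider’s call to a bank’s API might look like the hypothetical sketch below. The endpoint and the consent header are invented placeholders, not any real bank’s API, and a real integration also involves eIDAS certificates and strong customer authentication under PSD2:

```python
# Hypothetical sketch of a PSD2 account-information request from a
# third-party provider. BANK_API and "x-consent-id" are illustrative
# placeholders; real banks publish their own endpoints and headers.
from urllib.request import Request

BANK_API = "https://api.examplebank.com/open-banking/v3.1/accounts"

def build_account_request(access_token, consent_id):
    # The customer's explicit consent travels alongside the OAuth token
    return Request(
        BANK_API,
        headers={
            "Authorization": f"Bearer {access_token}",
            "x-consent-id": consent_id,   # illustrative consent header
            "Accept": "application/json",
        },
    )

req = build_account_request("token-123", "consent-456")
assert req.get_header("Authorization") == "Bearer token-123"
```

The point of the sketch is the shape of the exchange: the non-bank never touches the bank’s backend, only a tokenised, consent-scoped API call.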

However, technology savvy merchant acquiring banks are going to give ecommerce merchants a leg-up in enabling them to quickly deploy their go-to-market strategy and ultimately generate more revenues without the pitfalls.  They will be able to guide them, bringing them within the scope of PSD2 regulation.  They will also be able to provide them with onboarding gateways and beneficial applications to deliver a consolidated view across different types of accounts in a secure and safe way, resulting in better customer insight.  This is why it will be important to partner with the right FinTechs that have the knowledge, technology and services to do all of this.

Ultimately it will be critical for PSPs and online merchants to use payments technology to their advantage and optimize operational procedures in a safe and secure way without losing customers to shopping cart abandonment or have consumers frustrated and not completing their online purchase.  PSD2 requires stronger identity checks of users when they are paying online.  FinTechs that build artificial intelligence (AI) into their ecommerce business will provide better consumer protection against fraud.

The Winning Strategy

In conclusion, PSD2 empowers bank customers, giving them the option to use third-party providers to manage their finances. It wouldn’t be out of the question to use Facebook or Google to pay bills, make P2P transfers and even analyse spending, all while the money sits safely in a bank account. However, newcomer tech companies, and even the well-known big-tech firms, can be risky: they are not familiar enough with the payments market and may provide substandard service to businesses while carrying over their own way of doing business, privacy issues included. Tech-savvy banks are uniquely positioned to launch revolutionary services while mitigating risk. Not only can they provide a breadth of post-PSD2 services to customers, they can also support market newcomers via partnerships.

Consequently, the winning strategy could be: “don’t wait for your retail bank to help you, and don’t wait for the leading big technology firms either, but rather seek a fast mover that’s got your back.” Third parties are expected to build financial services on top of banks’ data and infrastructure, but they will need tech-savvy acquiring banks to help get them there. The winning strategy is to choose an acquiring bank with the know-how to reinforce consumer protection and improve the security of internet payments and account access, within the EU and globally. Seek out and partner with a tech-savvy acquiring bank to get up and running fast. There will be a race to gain market share, and customers will ultimately assemble their own collection of smaller service providers instead of choosing one bank for all their financial needs; the providers that earn a place in that collection will be the most successful.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought for any specific circumstances.

By Dr. Rachel Gauci, Senior Legal Counsel at Credorax

 

 


Five keys to achieving a hyperscale data centre without a hyperscale budget

Kevin Deierling, vice president marketing, Mellanox Technologies

Don’t be daunted by the overwhelming technological resources of today’s market leaders, says Kevin Deierling, vice president marketing, Mellanox Technologies. Times are changing and that exclusive hyperscale architecture is now within reach of any large enterprise.

“How to tame the tech titans”, asked a January 2018 Economist headline on competition in the digital age. A more recent article (“American tech giants are making life tough for startups”) outlines the problems of startups in the tech giants’ “kill-zone” – where investors shy away from any company that appears to be entering the big boys’ territory.

You do not have to be a startup or a direct competitor to the likes of the Super 7 – Amazon, Facebook, Google, Microsoft, Baidu, Alibaba and Tencent – to feel daunted by their sheer market presence and technological dominance. Then there are the second-tier “unicorns” like LinkedIn, Twitter and Instagram, which share the same secret: building massive network infrastructures to achieve unprecedented power to mine data and automate business processes for super-efficiency. How can the average enterprise survive in a commercial environment dominated by such giants?

There are two keys to their market dominance. The first is exceptional reach – not millions of customers, but hundreds of millions or even billions. But the real advantage is having “hyperscale” data centres specifically designed to accommodate and work with such a massive customer base.

Hyperscale

“Hyperscale” describes a data centre architecture that is designed to scale quickly and seamlessly to a massive and expanding population of users and customers, while maintaining reliability, performance and flexibility for ongoing development. Until recently there was nothing available that could deliver such a service, so those giants went ahead to design and build their own hardware and software so they could control every detail and achieve unmatched efficiency. This required teams of computer scientists and specialist skills to manipulate every configurable element – something that could not be achieved using off-the-shelf solutions.

By the end of last year there were nearly 400 such hyperscale data centres in the world, nearly half of them in the USA. There was also a growing number of specialist providers of smart interconnect solutions specifically designed for exceptional performance and minimal latency to serve this market.

What has changed is that those same providers now have their eyes on a very exciting opportunity: to apply their experience and advanced technology to simplify the deployment and lower the cost of hyperscaling to bring it within reach of medium to large enterprises. This is wonderful news for thousands of enterprises that will benefit enormously from hyperscaling. For the providers, it also opens up a far larger market.

There are five key factors that must be considered to take advantage of this opportunity.

Key 1 – High Performance

The faster data travels through a complex system, the more responsive the system and the quicker the benefits. The leading solution providers have been supplying an end-to-end portfolio of 25G, 50G and 100G adapters, cables and switches to these hyperscale data centres, and the resulting intelligence, efficiency and high performance is now well proven. Your own business might not yet need 100G performance, but it no longer makes sense to buy 10G now that the cost of 25G is on a par with it.

Key 2 – Open Networking

In a traditional static network environment, the one-stop-shop approach is efficient and reassuring. But today’s business environment demands agility and an infrastructure that can be extended and optimised to meet less predictable changes. Sometimes that means choosing best-of-breed solutions, sometimes the most cost-efficient ones. An open and fully disaggregated networking platform is now vital for scalability and flexibility, as well as for achieving operational efficiency.

Key 3 – Converged Networks on an Ethernet Storage Fabric

A fully converged network supports compute, communications and storage on a single integrated fabric. To grow a traditional network it was necessary to scale it “up” through the disruptive process of installing further resources into the existing fabric. This is like growing a business by recruiting, training and accommodating extra staff, whereas in today’s business environment it is often more efficient to outsource skills to meet sudden demand. Hyperscale networks are designed to scale “out” using disaggregated hardware, so you can add units of CPU, memory and storage independently – and an integrated, scalable, high-performance network is the key to achieving this.

Key 4 – Software Defined Everything and Virtual Network Acceleration

The hardware required for a converged network (Key 3) is fully integrated with software to orchestrate a virtual environment optimised for the needs of each specific application. The software controller enables the system to be managed from a single screen, and software automation removes most or all of the burden of manual commissioning and ongoing management.

Software defined networking, storage, and virtualization – or software defined everything (SDX) – transforms what would have been an impossibly complex aggregate into an intelligent and responsive whole.

Key 5 – Cloud Software Integration

It goes without saying that you will want your new hyperscale network to be fully integrated with popular cloud platforms such as OpenStack, vSphere, and Azure Stack. It should also support advanced software defined storage solutions such as Ceph, Gluster, Storage Spaces Direct, and VSAN.

One integrated whole

These five key factors show that we have come a long way from a bank’s traditional static data centre – and this is the way to go. The “Super 7” may be way ahead of anything most enterprises can even dream about, but many more companies will face similar pressures for flexible and efficient scalability. A retail or food chain going international could be taking on millions of new customers. Numerous IoT initiatives will see terabytes of data flooding into company systems, and running and evolving new algorithms on that data demands massive in-house capability. The result could be disastrous unless the systems are designed to scale to meet the needs of the business while maintaining performance and reliability.

A recent example was provided by Vault Systems, a company that delivers ASD-certified Government Cloud to Australian federal, state and local government agencies and their partners – managing sensitive data at the highest levels of security. The company wanted an open, flexible 100GbE network that would at the same time maintain its high level of security. It chose a supplier of hyperscale network solutions to the tech giants, but one that also serves high-performance computing, enterprise data centre, cloud, storage and financial services customers that do not have a hyperscale budget or resources. In the words of Vault Systems’ CEO and founder, the resulting system has “contributed to the high performance of our cloud and also given us the confidence and peace of mind that our network is the fastest and most resilient available in market today. We couldn’t be happier with the results we have seen so far.”

Conclusion

All the five keys listed above are bread and butter to the companies that supply those “tech titans”. But don’t be daunted by the thought of asking advice from a company whose customers include giants like Netflix. As a more normal size enterprise you represent their next, even bigger, market opportunity. They will be keen to prove that they can build you hyperscale networking – without a hyperscale budget.


How can banks compete with the tech disruptors?

Digital disruption in the banking industry is something that’s gradually been gathering pace in recent years, but it’s about to get much more prevalent. Enter the GAFAMs. Google, Apple, Facebook, Amazon and Microsoft – the big five global tech companies that have made their presence known by expanding their customer offering and disrupting multiple industries in recent years. In the world of finance, Amazon has just made headlines following the announcement it’s investing in a digital insurer, while Facebook has secured an electronic money license in Ireland.

Banks beware. PSD2 has allowed GAFAMs to access customer data with their permission and use it to provide innovative solutions to their needs and the issues they face when it comes to banking. The GAFAMs have enviable digital prowess and knowledge, not to mention near-limitless funds. Combine this with data-rich customer insight and they could easily change the face of banking forever. So how will this affect the industry as it stands?

 Could challenger banks be the underdog?

Challenger banks have been quietly but effectively shaking things up in the industry, in particular looking at ways customers interact with their bank and providing a more seamless, convenient alternative. The initial Open Banking fears that challenger banks would immediately start stealing vast amounts of market share from high-street banks have been quashed for now, but they have certainly raised standards across the board when it comes to providing a slick customer experience.

So much so that Paul Riseborough, CCO of Metro Bank, has stated that it will take a while before Open Banking starts to get exciting, with real innovation approaching in “about three to five years’ time”. In contrast, however, research by PwC last year revealed that 88 per cent of the financial industry is worried about losing revenue to disruptive innovators. While there is uncertainty regarding challenger banks, it is more likely that the GAFAMs will have the greater power and influence when it comes to innovation and changing how customers engage with the banking industry.

 Finance and tech crossing over

The lines between financial organisations and technology platforms are becoming increasingly blurred, as China’s WeChat app has proven. Launched in 2011 with an initial concept similar to that of WhatsApp, it has since evolved into a much broader service that allows its one billion users around the world to do everything from ordering a taxi to arranging a doctor’s appointment, as well as making money transfers and other banking transactions.

Given that the GAFAMs are all heavily tech-led, if they were to establish a presence in the financial industry and introduce a similar all-encompassing product, retail banks face a further risk of falling behind in customer engagement and losing market share.

 Investing wisely

Amidst the uncertainty and potential threats brought about by GAFAMs, there is opportunity for banks to improve their innovation strategies using information they already have on their customers. McKinsey recently said in a report that banks may be at an advantage compared to the industry’s disruptors, as “customers would not find it attractive to provide third parties access to their data or accounts.” If banks can harness their data in the correct way before the tech goliaths come into view, they could strengthen their customer retention.

RBS is staying ahead of the curve, announcing earlier this year that it plans to launch a digital-only bank to compete with existing challenger banks such as Monzo and Starling. On a more international scale, a PwC survey shows that 84 per cent of Indonesian banks are likely to invest in technology transformation over the next 18 months.

Partnerships and collaboration are also key and fast-becoming a growing trend. Software developers are being encouraged to use existing APIs to build platforms that allow financial organisations to improve both the internal and customer-facing elements of their businesses. Avaloq is a good example; its developer portal aimed at freelancers, fintechs and large banks currently has more than 1,000 developers collaborating and sharing insight with the global financial sector to drive innovation. For retail banks, it’s certainly worth taking advantage of the tech and insight on offer from external parties.

 Going above and beyond

The disruptors and challengers which have already made a mark on the financial services industry have done so by going above and beyond the perceived limits of retail banking. It’s something that retail banks need to take a step back and look at to learn from.

Many are already making strides, such as a group of big banks including Bank of America, Citi and Wells Fargo reacting to newcomer Venmo marking its territory on instant transfers. They’ve partnered with P2P payments app Zelle to integrate directly with their own apps.

Instant transfers follow a wider trend of convenience that consumers have come to expect from all industries. Banks can go even further by looking at non-banking services that make a more positive impact on their customers’ lives. Whether it is the introduction of lifestyle benefits such as high-street discounts, or helping customers to simplify their monthly bills, offering add-ons that increase convenience or reward the customer is likely to make them want to stay. In fact, our ‘Connected Customer’ report shows businesses that offer three or more additional products have considerably higher customer engagement scores, resulting in customers staying longer and spending more.

 Planning ahead

With PSD2 and Open Banking making an impact, it’s all change in the banking industry and as GAFAMs enter the market, banks and fintechs need to plan ahead to maintain their presence and stay relevant to customers.

Innovation and collaboration are the two key ingredients to improve their offering and position. The introduction of GAFAMs and other new players is a healthy addition to the financial sector, as it drives positive change and competition, while customers will reap the benefits.

By Karen Wheeler, Vice President and Country Manager UK, Affinion

 

 
