Banks are desperately trying to hedge their positions as equity and bond values have plummeted, but do they have a full understanding of their credit risk exposure? In most cases, no.
By Volker Lainer, VP of Product Management and Regulatory Affairs at GoldenSource
After years of flatlining market conditions, it is safe to say volatility is back with a vengeance. The knock-on effects of the Covid-19 crisis will make the coming months, and perhaps even years, very testing for financial institutions. Several regulations introduced since the collapse of Lehman Brothers, such as FRTB and BCBS 239, were designed to help banks prepare for a major global economic downturn, but the current levels of volatility will show just how well capitalised banks really are.
Realistically, wholesale bankruptcies at some point in the next few months are all but inevitable as the ripples of the unfolding crisis work their way through the global economy. As the UK Chancellor has acknowledged, we will not be able to save every job and every business. For banks, it is only a matter of time until the first domino falls: at some point, a multinational company, or even a country, will default on its debt.
The nature of global debt makes it very difficult for banks to truly know their credit risk at the corporate level. When Lehman Brothers went under, nobody knew the extent of its exposure because it comprised 2,800 separate legal entities. Regulations like BCBS 239 address some of these problems and encourage banks to have a single view of their customer. However, many banks have implemented their compliance solutions without fundamentally changing the way they aggregate and manage data across the business. The various systems remain separate and do not work in tandem, meaning a parent company can still be registered under different names across a bank’s trading books, so banks are not in a much better position now to run comprehensive risk calculations.
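To make the problem concrete, here is a minimal sketch in Python, using entirely hypothetical trade data and name variants, of how the same parent booked under slightly different names in different systems fragments into several apparent counterparties when exposure is aggregated naively:

```python
# Hypothetical illustration: the same parent booked under different names
# in three trading systems, so a naive group-by splits the exposure and
# understates the true concentration to that counterparty.
from collections import defaultdict

trades = [
    {"system": "rates",    "counterparty": "Lehman Brothers Holdings Inc", "exposure_usd": 40_000_000},
    {"system": "equities", "counterparty": "Lehman Bros Holdings",         "exposure_usd": 25_000_000},
    {"system": "fx",       "counterparty": "LEHMAN BROTHERS HLDGS INC.",   "exposure_usd": 15_000_000},
]

by_name = defaultdict(float)
for trade in trades:
    by_name[trade["counterparty"]] += trade["exposure_usd"]

# Three "different" counterparties appear, and none of them shows the full $80m.
for name, exposure in by_name.items():
    print(f"{name}: {exposure:,.0f}")
```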
They might have successfully kept the regulator happy but, in most cases, they have not achieved the understanding of their credit risk required for the scenarios they may soon find themselves in. To find out their exposure in the event of a major default, many banks would still have to compile numerous reports, consolidate them into a spreadsheet and work out the answer manually.
What is needed is a central, validated model for credit risk at an umbrella level. This model should be able to isolate any entity in question, whether that is a currency or a company, and consolidate the bank’s entire relationship with that entity into a single data set. If, for example, Italy or a major airline were going to default, banks should know what that would mean for them and how it would affect their trading operations. The only way to do this proficiently and at speed is to automate the creation of a single view of their corporate clients.
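As a rough illustration of that umbrella-level view, the sketch below uses hypothetical identifiers, desks and figures, with each booking entity keyed to an assumed ultimate-parent mapping, so that every position rolls up to one validated parent and the question "what does this default mean for us?" can be answered from a single consolidated data set:

```python
# Sketch of an umbrella-level roll-up (hypothetical data): each booking
# entity maps to one validated ultimate parent, and exposures are
# aggregated against that parent rather than against raw counterparty names.

# Validated hierarchy: booking-entity identifier -> ultimate parent identifier.
ultimate_parent = {
    "LEI-AIRLINE-UK":      "LEI-AIRLINE-GROUP",
    "LEI-AIRLINE-DE":      "LEI-AIRLINE-GROUP",
    "LEI-AIRLINE-LEASING": "LEI-AIRLINE-GROUP",
}

trades = [
    {"entity": "LEI-AIRLINE-UK",      "desk": "loans",       "exposure_usd": 120_000_000},
    {"entity": "LEI-AIRLINE-DE",      "desk": "derivatives", "exposure_usd": 45_000_000},
    {"entity": "LEI-AIRLINE-LEASING", "desk": "bonds",       "exposure_usd": 60_000_000},
]

def exposure_to(parent_id: str) -> float:
    """Total exposure to one ultimate parent across every desk and subsidiary."""
    return sum(
        trade["exposure_usd"]
        for trade in trades
        if ultimate_parent.get(trade["entity"]) == parent_id
    )

# "If this group defaulted tomorrow, what would it mean for us?"
print(f"Group exposure: {exposure_to('LEI-AIRLINE-GROUP'):,.0f}")
```

The key design choice in such a model is that the roll-up is keyed on a validated entity identifier and hierarchy rather than on free-text counterparty names, so the consolidated figure is the same regardless of which trading system booked the position.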
Such a capability will also help banks make the best lending decisions and maintain a clear view of risk while loosening lending requirements to keep liquidity flowing in the economy. Several government representatives have urged banks to be less stringent in granting loans at this time, but using reserves for the greater good of the economy should only be done with eyes wide open. That makes it even more important to understand the true risk, so that lending conditions are not relaxed blindly.
Finally, the current pricing volatility is the ultimate test of banks’ operations and of how well their systems can come together in a coherent way. Credit risk solutions are about to be put to the test to see how far they have come since 2008, and we will soon find out how well capitalised these firms really are. Those with the data modelling capabilities to quickly analyse how an inevitable default will affect them will be best placed to hedge against large exposures.