The global banking sector is currently gearing up to implement the Fundamental Review of the Trading Book (FRTB) regulations. FRTB is a new market risk framework that aims to ensure banks hold capital commensurate with the risks they take. Drawn up by the Basel Committee in the aftermath of the 2008 financial crisis, FRTB aims to prevent banks from being exposed to similar systemic losses in the future, mainly by creating a very clear distinction between the trading book and the banking book. The idea is to stop banks being overly flexible in how they report and manage risk, which was one cause of the serious problems of 2008.
The guiding principle of FRTB is that banks should clearly demarcate the trading book and the banking book. The trading book should only include assets which are actively traded on a regular basis. The banking book, on the other hand, must solely consist of assets that are held until maturity – such as customer loans.
FRTB is also bringing in a new method of calculating risk, known as the Standardized Approach (SA). The SA aims to move banks away from the “value at risk” (VaR) measure and towards “expected shortfall” (ES). VaR estimates the loss a given portfolio is unlikely to exceed, at a chosen confidence level and under normal market conditions, over a set time period, usually a day. ES is a more conservative measure: it captures the average loss in the tail beyond the VaR threshold, so it reflects how bad losses are when they do exceed that threshold. The changes to these calculations are designed to help banks take better account of tail risk and variable liquidity horizons.
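The difference between the two measures can be sketched in a few lines. This is a minimal illustration using a simulated P&L sample, not an FRTB-compliant calculation (the actual framework prescribes a 97.5% ES with stressed calibration and liquidity-horizon scaling); the confidence levels below simply mirror the commonly cited 99% VaR and 97.5% ES figures.

```python
import numpy as np

# Hypothetical daily P&L sample for a portfolio (illustrative only).
# In practice this would come from historical simulation or a risk model.
rng = np.random.default_rng(42)
pnl = rng.normal(loc=0.0, scale=1.0, size=10_000)

def var(pnl, level=0.99):
    """Value at risk: the loss threshold exceeded with probability 1 - level."""
    return -np.quantile(pnl, 1 - level)

def expected_shortfall(pnl, level=0.975):
    """Expected shortfall: the average loss in the worst (1 - level) tail."""
    threshold = np.quantile(pnl, 1 - level)
    return -pnl[pnl <= threshold].mean()

print(f"99% VaR:  {var(pnl, 0.99):.2f}")
print(f"97.5% ES: {expected_shortfall(pnl, 0.975):.2f}")
```

Note that at the same confidence level, ES is always at least as large as VaR, because it averages the losses beyond the VaR point rather than stopping at it. That averaging is also why ES is more sensitive to the quality of the tail data, a point that matters for the implementation challenges discussed below.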
Banks are also allowed to devise their own models for calculating risk instead of the SA, although these models must be approved by the regulator. Regardless of which approach is adopted, FRTB introduces new liquidity and reporting standards. By pushing the industry away from the classic VaR approach, these new rules are instigating a fundamental shift in how banks must approach risk management.
Given the scale of the change, the implementation of FRTB is causing significant difficulties for banks. For one, the calculations involved require far more data than was previously the case. And, with ES, the calculations are far more data-sensitive: even small inaccuracies can massively affect the final number. Gathering the required information, and then handling it effectively, isn’t an easy task.
Another difficulty is that FRTB includes a fixed risk-factor eligibility test. Banks need to prove exactly how much risk is involved in every trade – from complex OTC derivatives, to plain swaps, to futures – and that they have the capital available to cover any margin calls, for example.
Doing so requires a huge amount of trade, reference, and pricing data. This is relatively easy to provide for assets that are traded frequently on markets – like major currencies. But how do you find the data for assets that haven’t been traded for a long time, such as bonds that were issued months or years ago? This is very tricky, and can mean that banks are forced to allocate even more capital to risk management, in effect over-compensating for assets whose risk profile cannot be easily observed.
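The eligibility test behind this problem hinges on counting real price observations. A minimal sketch is below, assuming only the headline Basel criterion of at least 24 real-price observations over the trailing 12 months; the full test adds further conditions (for example, on how observations are spread across the year) that are omitted here, and the function and threshold names are illustrative, not from any official text.

```python
from datetime import date, timedelta

# Simplified sketch of a risk-factor eligibility check.
# Assumption: only the headline count criterion is tested -- at least 24
# real-price observations in the trailing 12 months. The full FRTB test
# adds distribution/gap conditions that are ignored here.
MIN_OBSERVATIONS = 24

def is_modellable(observation_dates, as_of):
    """Return True if the risk factor passes the simplified count test."""
    window_start = as_of - timedelta(days=365)
    recent = [d for d in observation_dates if window_start <= d <= as_of]
    return len(recent) >= MIN_OBSERVATIONS

as_of = date(2024, 6, 28)
# A factor observed weekly easily passes...
weekly = [as_of - timedelta(weeks=i) for i in range(52)]
# ...while a bond with no trades in over a year does not.
stale = [as_of - timedelta(days=400 + i) for i in range(30)]
print(is_modellable(weekly, as_of))  # True
print(is_modellable(stale, as_of))   # False
```

Risk factors that fail such a test are treated as non-modellable and attract a punitive capital add-on, which is exactly the over-compensation effect described above.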
For these reasons, FRTB is proving to be an expensive exercise. Banks are finding they need to bring in whole new teams that are capable of analyzing data in the required way. They are having to revamp whole swathes of infrastructure and introduce new technological solutions. And then there’s the indirect cost. Because FRTB requires banks to differentiate more clearly between the trading book and the banking book, the result can often be that less capital is available to trade.
What steps can banks take to ensure they’re complying with FRTB? While the regulators do allow banks to design their own models for calculating risk exposure, many believe it’s much easier to adopt the SA. After all, it’s the standard that the regulators themselves have devised, which cuts out the need to dedicate time and resources to creating a new model that would then need separate regulatory approval.
Then it’s all about quality of data. One of the aims of FRTB is to improve the data banks are using, with the hope that this will lead to more effective risk management practices. Banks will need to onboard a dedicated and reliable data provider, preferably one that collects its information from dozens of different sources across different markets, asset classes, and financial instruments. On top of that, banks will also need to integrate a layer of high-quality data analytics technology. Such technology allows banks to manage their risk according to the methodology they’ve adopted, such as the SA. Establishing sound, high-quality processes – and being able to confidently demonstrate how they work to the regulator – is key to navigating the challenges posed by FRTB implementation.
Regulatory requirements are constantly changing, and FRTB is just one part of the puzzle. Overcoming this challenge is difficult for all institutions. But the key to successful compliance is data. Banks and financial institutions must ensure they’re using the most accurate calculations and valuations for financial instruments, products, and portfolios – the only way to make sure that risk is being measured as effectively as possible. Leveraging accurate data is crucial to complying not just with FRTB, but with the broader sweep of financial regulation.