Business Requirements understanding - Credit Risk Management
The discipline of credit risk management attempts to identify and quantify, ahead in the credit cycle and well before a default occurs, the losses that the provider of a credit facility may face due to potential default by the borrower.
The essential functions of risk management are identifying, measuring, managing, and mitigating risk.
Purpose of measuring risk
- Pricing the risk: Identifying and quantifying credit risk helps the provider charge borrowers interest in line with the associated risk.
- Capital cushion: Under the Basel guidelines, a bank has to keep a certain percentage of its risk weighted assets as a capital cushion (equity capital) on its balance sheet to stay in business. We will see more about the capital cushion and its implications later in this document. Currently, according to the RBI, a bank has to maintain a capital cushion of 9% of its risk weighted assets.
- The quantum of credit risk borne by the lender also impacts the amount of the loan, the lending rate, the tenor of the loan, and the frequency of monitoring.
The Environment of Basel II
The first Basel Accord of 1988 (Basel I) emphasized the importance of minimum capital adequacy to address credit risk through a standardized approach. Capital adequacy is defined as the ratio of capital funds to risk weighted assets. Each asset on the balance sheet carries a risk weight, and the sum of these products is called the risk weighted assets (RWA). In India, the RBI has prescribed risk weights of 0, 20, 50, 75, 100 and 175% for various types of assets. The rule is that a bank's capital should not be less than 8 percent (in India, 9%) of its risk weighted assets.
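To make the capital adequacy arithmetic concrete, here is a minimal sketch with invented balance-sheet figures; the risk weights follow the RBI set quoted above, and the 9% floor is the Indian norm.

```python
# Minimal sketch of the capital adequacy calculation.
# Asset amounts are illustrative; risk weights follow the RBI set quoted above.
assets = [
    ("Cash and government securities", 500.0, 0.00),
    ("Claims on banks",                200.0, 0.20),
    ("Residential mortgages",          300.0, 0.50),
    ("Retail loans",                   150.0, 0.75),
    ("Corporate loans",                400.0, 1.00),
]

# Risk weighted assets: sum of (exposure x risk weight).
rwa = sum(amount * weight for _, amount, weight in assets)

# RBI minimum capital: 9% of RWA (the Basel floor is 8%).
min_capital = 0.09 * rwa

print(f"RWA = {rwa:.1f}")                    # 702.5 in this example
print(f"Minimum capital = {min_capital:.2f}")  # 63.22
```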
The Basel Committee has developed two approaches for calculating regulatory capital for credit risk, the so-called “standardized approach” and “internal ratings based approach” (IRB).
Measuring Credit risk – Basel II Standardized approach
The standardized approach uses external ratings, such as those provided by "external credit assessment institutions", to determine risk weights for capital charges, whereas the IRB approach allows banks to develop their own internal ratings for risk-weighting purposes, subject to meeting specific criteria and supervisory approval.
Under the standardized approach, the ratings assigned by eligible external credit rating agencies are largely used to measure credit risk. The RBI has identified the following external credit rating agencies:
- Domestic credit rating agencies: CRISIL Ltd, CARE Ltd, ICRA Ltd, Fitch India
- International credit rating agencies: S&P, Moody's, Fitch
The mapping shown below is the generalized format given by the Basel Committee; banks use different mappings for different classes of claims, broadly similar to the one shown.
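For reference, the generalized Basel II standardized-approach mapping for claims on corporates is broadly as follows (banks adapt these buckets per claim class):

| Rating      | AAA to AA- | A+ to A- | BBB+ to BB- | Below BB- | Unrated |
|-------------|------------|----------|-------------|-----------|---------|
| Risk weight | 20%        | 50%      | 100%        | 150%      | 100%    |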
Measuring Credit risk – Basel II IRB approach
The internal ratings based (IRB) approach was created with the expectation that it would align capital requirements more accurately with the intrinsic amount of credit risk and, consequently, produce lower overall capital requirements. IRB is more complex to understand and implement, but it is undoubtedly more risk-sensitive and effective than the standardized approach, and it thus encourages banks to improve their internal risk management.
Motivation for banks to move to the IRB approach
- As the picture below summarizes, by moving to the IRB approach banks will be able to quantify risk more accurately and hence may face a lower capital charge. For example, capital requirements should drop substantially at a bank with a prime, well collateralized business portfolio. On the other hand, a bank with a high-risk portfolio will likely face higher capital requirements and, consequently, limits on its business potential. Thus, over time, IRB presents banks with the opportunity to gain competitive advantage by allocating capital to those processes, segments, and markets that demonstrate a strong risk/return ratio.
- Banks will also be able to apply risk-based pricing: borrowers whose risk is high are charged a higher interest rate, so pricing helps compensate for the capital that must be held against unexpected losses.
IRB Components - Expected & Unexpected Losses
The concept of expected and unexpected losses plays an important role in the economic foundation of the IRB. Whereas a bank cannot predict in advance exactly what losses it will suffer over a given period, it can forecast the average level of credit losses. Expected losses (EL) are the average level of reasonably foreseeable credit losses. Unexpected losses (UL) relate to the potential volatility of losses around that average. These expected and unexpected losses are calculated so that they can be covered: EL through provisioning and write-offs, which are regarded as a cost of the banking business, and UL through capital.
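For orientation, the standard IRB decomposition of expected loss (a textbook formula, not specific to this project) is:

$$EL = PD \times LGD \times EAD$$

where PD is the one-year probability of default, LGD the loss given default, and EAD the exposure at default. UL is then typically measured as a high quantile of the loss distribution in excess of EL.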
JP Morgan’s CreditMetrics IRB Model
CreditMetrics is a popular methodology developed by J.P. Morgan. It uses historical default rates to model default probabilities through a credit migration, or transition, matrix. Transition matrices measure the probability of a change in a given credit rating for each debt security over the credit horizon.
Consider the example of a five-year fixed-rate BBB rated loan of $100 million made at 6 percent interest. Based on historical data on publicly traded bonds, the probability that a BBB borrower will stay at BBB over the next year is estimated at 86.93 percent. There is also some probability that the borrower will be upgraded (e.g., to A) or downgraded (e.g., to CCC, or even to default, D). The table below shows the one-year transition probabilities.
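The one-year transition probabilities for a BBB issuer in the widely cited CreditMetrics illustration (the same source as the 86.93 percent figure above) are:

| Year-end rating | AAA  | AA   | A    | BBB   | BB   | B    | CCC  | Default |
|-----------------|------|------|------|-------|------|------|------|---------|
| Probability (%) | 0.02 | 0.33 | 5.95 | 86.93 | 5.30 | 1.17 | 0.12 | 0.18    |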
Rating upgrades and downgrades change the credit risk spreads, or premiums, required on the loan's remaining cash flows. The BBB rated loan's current price is given by
$$P = 6 + \frac{6}{1+r_1+s_1} + \frac{6}{(1+r_2+s_2)^2} + \frac{6}{(1+r_3+s_3)^3} + \frac{106}{(1+r_4+s_4)^4}$$
where the r_i are the forward risk-free rates on zero-coupon Treasury bonds and the s_i are the annual credit spreads on zero-coupon loans of the BBB rating class.
Thus, using the above formula with the appropriate risk adjusted rates, the loan's cash flows are discounted to obtain its current value under each possible year-end rating.
The distribution of loan values on the one-year credit horizon date can be drawn using the transition probabilities and the loan valuations. The mean of this value distribution is $107.09 million. If the loan had retained its BBB rating, its value at the end of the credit horizon would have been $107.55 million. Thus, the expected loss on this loan is $460,000 ($107.55 million minus $107.09 million).
However, at the 99% confidence level (a 1% chance that the loan value falls below $100.12 million), the unexpected loss is found to be $6.97 million, which is nothing but the economic capital for the loan. Portfolio loss distributions are then obtained analogously, by calculating individual asset values for each possible joint migration scenario.
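As a check on the arithmetic, the sketch below reproduces the figures quoted above. The year-end loan values per rating come from the standard CreditMetrics BBB illustration, and the 99% unexpected loss uses the one-sided normal approximation (2.33 standard deviations), which is how the $100.12 million threshold arises.

```python
# Reproduces the CreditMetrics example figures quoted above.
# Order of states: AAA, AA, A, BBB, BB, B, CCC, Default.
# Transition probabilities and year-end loan values ($ millions) follow
# the standard CreditMetrics BBB illustration.
probs  = [0.0002, 0.0033, 0.0595, 0.8693, 0.0530, 0.0117, 0.0012, 0.0018]
values = [109.37, 109.19, 108.66, 107.55, 102.02, 98.10, 83.64, 51.13]

mean = sum(p * v for p, v in zip(probs, values))                   # ~107.09
expected_loss = 107.55 - mean                                      # ~0.46, i.e. $460,000

# Standard deviation of the value distribution.
variance = sum(p * (v - mean) ** 2 for p, v in zip(probs, values))
sigma = variance ** 0.5                                            # ~2.99

# One-sided 99% normal approximation: 2.33 standard deviations.
unexpected_loss = 2.33 * sigma                                     # ~6.97 (economic capital)
print(f"mean={mean:.2f}  EL={expected_loss:.2f}  UL(99%)={unexpected_loss:.2f}")
```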
Data Requirements – IRB approach
One thing common to all IRB models is a robust information system to store and analyze long time series of credit performance data. A bank looking to move to the IRB approach therefore has to have an integrated risk management system that aids better data collection, supports high quality data, and provides scope for detailed technical analysis. Hence, banks aiming to hold lower capital by adopting the advanced approaches must also be prepared to meet these higher information needs.
Credit Risk Management (CRM) System – Data Flow Diagram
Integrated Credit Risk Management (CRM) System – DFD diagram
Based on the understanding and analysis so far, one point that can be stressed for all Indian banks is the importance of developing a completely integrated credit risk management system. The data flow diagram suggested above is one such model. As data is collected from the different sources (customers, rating agencies, external data sources), the centralized data warehousing system amasses it for further processing through the different CRM modules. Finally, the required measures are presented through dashboards for the risk management group to monitor and control risk.
For the design and development of the database system, dimensional modeling is used because of its well-known benefits: easier interpretability and understandability, efficient performance of complex queries, and extensibility to accommodate unexpected new data.
Data understanding and Collection
As discussed in the previous stages, we required data from the corporate (commercial) banking business of a bank as an input to our DW model. As a second input to the model, we also required rating data from an external rating agency (CRISIL, CARE, etc.).
For this purpose we collected data from a reputed nationalised bank of India and scaled it down: in dimensions, to include only the relevant columns, and in size, to include only a limited number of rows, to enable a satisfactory processing speed.
Banking Data
For the purpose of designing and implementing this model, we selected a list of 10 large to moderate sized customers of the bank with varying degrees of exposure (loan size), varying ownership (a mix of private and public companies), and varying degrees of risk.
We then included a range of loan products with various features (secured/unsecured) to which these customers were exposed. The data contained several properties of the loan products, such as product codes, descriptions, product-wise limits, currency, collateral required, and interest rate, which were relevant to our model design. We also collected the industry codes and descriptions used in the model.
We also collected details of more than 20 loan accounts opened by the selected customers in the above-mentioned industries and from the selected range of products. For these accounts we captured details such as account number, product code, balance outstanding, and term of the loan.
Rating Data
In the rating data, we had a table of ratings awarded by the agencies and the probabilities associated with those ratings (as explained in the previous section). From the rating data, we created a rating fact table containing derived multipliers to be used as an input to the model. These multipliers are used to calculate the expected and unexpected portions of the loss in the risk assessment model.
Note: The data that we collected was modified (names and codes changed) and scaled down to maintain confidentiality.
Dimension modeling
Business process
The business process, or measurement event, captured for modeling is each new transaction that occurs in the branch offices with new or existing clients. A separate star schema is created to store information about the ratings; it holds measures such as the risk-free rate and the default spread for each rating. It is generated or updated yearly based on historical data captured from the external data source.
Granularity
The granularity of this dimensional design is the account number for each type of product and for each customer. The assumption here is that each customer can hold multiple products and that each product is recorded and tracked under a unique account number.
Dimensions
Dimensions are designed such that they take on a unique value at the declared grain of the fact table. Dimensions used are Product, Account, Customer, Industry and Date.
Facts – Measures
Extraction, Transformation, and Loading
Extraction, Transformation, and Loading (ETL) processes are responsible for the operations taking place in the back stage of the data warehouse architecture. At a high level, the data are first extracted from the source data stores, which can be On-Line Transaction Processing (OLTP) or legacy systems, files in any format, web pages, various kinds of documents (e.g., spreadsheets and text documents), or even data arriving in a streaming fashion. Typically, only the data that have changed since the previous execution of the ETL process (newly inserted, updated, and deleted information) should be extracted from the sources. After this phase, the extracted data are propagated to a special-purpose area of the warehouse, called the Data Staging Area (DSA), where their transformation, homogenization, and cleansing take place. The most frequently used transformations include filters and checks that ensure the data propagated to the warehouse respect business rules and integrity constraints, as well as schema transformations that ensure the data fit the target data warehouse schema. Finally, the data are loaded into the central data warehouse (DW) and all its counterparts (e.g., data marts and views). In a traditional data warehouse setting, the ETL process periodically refreshes the warehouse during idle or low-load periods of its operation (e.g., every night) and has a specific time window in which to complete. Nowadays, business necessities and demands require near real-time data warehouse refreshment, and significant attention is being drawn to this kind of technological advancement.
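As a toy illustration of this flow (incremental extraction via a watermark, a simple cleansing rule in staging, then the load), here is a minimal sketch; the tables, columns, and SQLite usage are illustrative assumptions, not the project's actual setup.

```python
# Toy ETL sketch: incremental extract, simple cleansing, load.
# All table and column names are illustrative, not the project's schema.
import sqlite3

src = sqlite3.connect("oltp.db")   # stands in for the OLTP source
dw = sqlite3.connect("dw.db")      # stands in for the data warehouse

dw.execute("CREATE TABLE IF NOT EXISTS load_log (load_ts INTEGER)")
dw.execute("""CREATE TABLE IF NOT EXISTS fact_loan_balance (
    account_no TEXT PRIMARY KEY, balance REAL, load_ts INTEGER)""")

# Extract: only rows changed since the last run (watermark pattern).
last_run = dw.execute("SELECT COALESCE(MAX(load_ts), 0) FROM load_log").fetchone()[0]
rows = src.execute(
    "SELECT account_no, balance, updated_ts FROM loan_accounts WHERE updated_ts > ?",
    (last_run,),
).fetchall()

# Transform (staging): enforce a simple business rule before loading.
clean = [(a, b, t) for a, b, t in rows if b is not None and b >= 0]

# Load: upsert into the warehouse fact table and advance the watermark.
dw.executemany(
    "INSERT OR REPLACE INTO fact_loan_balance (account_no, balance, load_ts) VALUES (?, ?, ?)",
    clean,
)
dw.execute(
    "INSERT INTO load_log (load_ts) VALUES (?)",
    (max((t for _, _, t in clean), default=last_run),),
)
dw.commit()
```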
Microsoft Integration Services
Microsoft SQL Server Integration Services (SSIS) is a platform for building high performance data integration solutions, including extraction, transformation, and loading (ETL) packages for data warehousing.
Integration Services includes:
- graphical tools and wizards for building and debugging packages;
- tasks for performing workflow functions such as FTP operations, executing SQL statements, and sending e-mail messages;
- data sources and destinations for extracting and loading data;
- transformations for cleaning, aggregating, merging, and copying data;
- a management service, the Integration Services service, for administering package execution and storage;
- application programming interfaces (APIs) for programming the Integration Services object model.
Thus SSIS was an apt solution for the ETL problem in our project, to cleanse and load data.
Creation of SSIS Packages
We had data sources containing historical loan data from a reputed bank. The data was in the form of an Excel file and an XML file, which we had to load into the SQL Server 2008 database. The data was not clean and had to be cleansed and standardized to meet the data quality required for a data warehouse project built on the bank's loan disbursal and repayment transactions. We created the SSIS package depicted on the next page to load the data from the various data sources.
Database setup and table creation
We created the following tables in SQL Server with proper primary key and foreign key constraints. These tables were used to load data from the various data sources, such as the XML and Excel files (a creation sketch in code follows the list). The tables created are:
- DIM_ACCOUNT
- DIM_CUSTOMER
- DIM_INDUSTRY
- DIM_PRODUCT
- DIM_RATING
- DIMDATE
- FACT_RATING
- FACT_RISK_ESTMN
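As an illustration of how two of these tables might be created with primary and foreign key constraints, here is a minimal sketch; the column names and types are assumptions for illustration, not the project's actual schema (only the MAX_MIN database name comes from the project).

```python
# Minimal sketch: creating two of the tables with PK/FK constraints.
# Column names and types are illustrative assumptions.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=localhost;DATABASE=MAX_MIN;Trusted_Connection=yes;"
)
cur = conn.cursor()

cur.execute("""
CREATE TABLE DIM_CUSTOMER (
    CUSTOMER_KEY  INT PRIMARY KEY,
    CUSTOMER_NAME VARCHAR(100),
    OWNERSHIP     VARCHAR(20)
)
""")

cur.execute("""
CREATE TABLE FACT_RISK_ESTMN (
    ACCOUNT_NO      VARCHAR(20) PRIMARY KEY,
    CUSTOMER_KEY    INT REFERENCES DIM_CUSTOMER(CUSTOMER_KEY),
    BALANCE_OUTSTD  DECIMAL(18, 2),
    UNEXPECTED_LOSS DECIMAL(18, 2)
)
""")

conn.commit()
```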
SSIS Package – Components
Data Source
The data sources were the Excel file and the XML files; proper source connection managers were established for these files.
Data Conversion
Some data was not in the format required by the target table; for example, the required date format was an integer such as 20110801, representing 1 August 2011. Such conversions were performed using the Data Conversion task in SSIS.
Derived Column
Some columns had to be combined into a single column; for example, the source data had separate first name and last name columns. We used the Derived Column task to combine such columns into a single derived column. Another example is computing a net multiplier from two multipliers, as sketched below.
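For readers without SSIS at hand, the sketch below shows equivalent transformations in pandas; the column names, and the assumption that the net multiplier is the product of two multipliers, are ours for illustration.

```python
# Pandas equivalents of the SSIS Data Conversion and Derived Column tasks.
# Column names and the multiplier combination rule are illustrative assumptions.
import pandas as pd

df = pd.DataFrame({
    "FIRST_NAME": ["Asha", "Ravi"],
    "LAST_NAME":  ["Kumar", "Singh"],
    "LOAN_DATE":  pd.to_datetime(["2011-08-01", "2011-09-15"]),
    "EL_MULT":    [0.0046, 0.0031],
    "UL_MULT":    [0.0697, 0.0512],
})

# Data conversion: date -> integer key such as 20110801.
df["DATE_KEY"] = df["LOAN_DATE"].dt.strftime("%Y%m%d").astype(int)

# Derived columns: full name, and a net multiplier from two multipliers.
df["CUSTOMER_NAME"] = df["FIRST_NAME"] + " " + df["LAST_NAME"]
df["NET_MULT"] = df["EL_MULT"] * df["UL_MULT"]   # assumed combination rule

print(df[["CUSTOMER_NAME", "DATE_KEY", "NET_MULT"]])
```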
Data Destination
The data destination, an OLE DB Destination, was our SQL Server MAX_MIN database, which we used for all the database operations in the project.
These were only a few of the many ETL options available in SSIS for loading data in the proper format.
Shown below is the complete package that shows all the data flow tasks for all the Fact and Dimension tables used in the project.
Cube Creation & Deployment
After the ETL process, the data source view and cubes were created and deployed. Snapshots of the created cube and of the browser showing the pivot tables appear below.
Risk Measures Calculation
As the final output, the dashboard requires the unexpected losses and capital charge for each customer and for each product, so measures such as capital charge and unexpected losses are calculated dynamically and stored as calculated members, along the lines sketched below.
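Inside the cube these are calculated members; the underlying logic, expressed here as a plain Python sketch (the multiplier-based formula is our simplification of the rating multipliers described in the rating-data section), is roughly:

```python
# Sketch of the calculated-member logic: unexpected loss and capital
# charge per account, rolled up by customer and product.
# The multiplier-based formula is a simplification for illustration.
import pandas as pd

facts = pd.DataFrame({
    "CUSTOMER": ["C1", "C1", "C2"],
    "PRODUCT":  ["TERM", "WCAP", "TERM"],
    "BALANCE":  [100.0, 40.0, 250.0],        # exposure outstanding
    "UL_MULT":  [0.0697, 0.0697, 0.0512],    # rating-specific, from FACT_RATING
})

# Unexpected loss = exposure x rating-specific UL multiplier; the capital
# charge is taken as the economic capital, i.e. equal to the unexpected loss.
facts["UNEXPECTED_LOSS"] = facts["BALANCE"] * facts["UL_MULT"]
facts["CAPITAL_CHARGE"] = facts["UNEXPECTED_LOSS"]

# Roll up the dashboard measures by customer and product.
summary = facts.groupby(["CUSTOMER", "PRODUCT"])[["UNEXPECTED_LOSS", "CAPITAL_CHARGE"]].sum()
print(summary)
```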
Business Intelligence – Portfolio Risk Management Dashboard
A dashboard may be defined as a tool that displays data coming from many sources. It displays metrics that are the results of simple or complex calculations, and it offers drill-down from summary data to detailed transactions. In our project, a dashboard has been created to display various metrics related to credit risk assessment, such as the unexpected losses borne by the bank and the capital that must be in place to sanction a loan.
This information can be drilled down to granularity by loan product, industry/sector, or customer. Other indicators that can be shown include the total loan taken by each customer, the total loan in each product, the loan and interest charges in each industry, and the various customers and loan amounts for a particular product.
Implementation
- Establish connection: The connection is established from Excel to Analysis Services (via Existing Connections), which hosts the data warehouse containing all the credit related information, such as loan account number, customer name, loan product, loan amount, interest amount, industry, and credit risk rating.
- Generate pivot tables: Create individual pivot tables and pivot charts that display the required information as per the requirements of the user. It is recommended to create an individual pivot table and pivot chart for each query separately.
- Create slicers: Create slicers, as per the requirements, on fields such as customer name, loan product, and industry description, on which the queries will be addressed. Slicers are tools used to slice the information from the cube along chosen dimensions (fig 1).
- Link slicers to pivot tables: Once the slicers are created, link all the slicers to all the pivot tables. This ensures that when a user filters by one slicer, the information related to that selection is reflected in all the pivot charts.
Features of Portfolio Risk management dashboard
- Improved decision making and performance
- Ability to easily assess the credit risk associated with sanctioning loans to customers (corporates) in different industries
- Gives a bird's eye view of all the credit information of a particular customer across different dimensions
- Ability to perform improved analysis through visual representation of performance measures.
- A valuable tool for the risk management team to calculate the capital charge to be held by the bank before sanctioning any loan to a customer
- Acts as a "tactical dashboard": supports measuring the progress of key financing projects and other initiatives
- Time efficient: Saves the time of managers by eliminating the need for multiple reports as this aggregates and drills down to granularity simultaneously
- Reduces the need to create and maintain a large number of static reports
- A well-designed dashboard such as this one is more engaging, and hence more readable, than conventional tabular reports