Describe the principal methods used by investment banks to compute their Value at Risk to movements in market prices. What are the advantages and limitations of using such measures?
1. Introduction
Philippe Jorion defines Value at Risk (VaR) as a model used to "summarise the maximum loss on a portfolio in a given time horizon, within a given confidence level". VaR is the principal method by which financial institutions measure their exposure to risk. In the world of banking today, risk management is becoming an increasingly important subject as banks strive to prevent disasters such as the collapse of LTCM from recurring. Banks face several types of risk: operational, market, credit, liquidity and business risks.
There are four steps to calculating VaR. First, the risk manager must collect all the data regarding losses previously made and information about the risk factors involved. The risk factors must then be identified; the four main risk factors employed in a VaR model are movements in interest rates, equity prices, commodity prices and currency prices. The risk manager must then choose the appropriate method of calculating VaR: Delta Normal, Historic Simulation, or Monte Carlo Simulation. Finally, once all the information and data are inputted, the VaR can be calculated.
This essay will focus on how the principal methods of VaR are calculated and will explain the advantages and disadvantages of each.
2. Delta Normal
The delta-normal method, also known as the variance-covariance approach, relies on the assumption that the driving risk factors are normally distributed. If all positions are linear in the underlying risks, the portfolio standard deviation (and hence the portfolio VaR) is a simple linear transformation of the individual risk factors. For non-linear instruments, such as bonds and options, the true positions are replaced with linear approximations.
To illustrate this method, first let us take an instrument whose value depends on a single underlying risk factor, S. The value of the portfolio at the initial point is V0 = V(S0). We then define Δ0 = ∂V/∂S as the first partial derivative, or the portfolio sensitivity to changes in prices, evaluated at the current position V0. The potential loss in value is computed as dV = Δ0 × dS.
Given the assumption that the distribution is normal, the portfolio VaR can be derived as the product of the exposure and the VaR of the underlying variable: VaR = |Δ0| × α σ S0, where α is the standard normal deviate corresponding to the specified confidence level, e.g. 1.645 for a 95 percent confidence level, and σ is the standard deviation of the rate of change in the price. (Jorion 2001)
Here is an example calculating the risk of a single cash position:
There is a USD-based firm with one asset: JPY 14 billion in cash. What is the 95% worst-case loss over a 1-day period?
We are given the information that, according to RiskMetrics data, the daily price volatility of the JPY/USD exchange rate is 1.78%, using a 95% confidence level. (Note: this implies that one standard deviation equals 1.78%/1.65 = 1.08%.) At prevailing exchange rates the JPY 14 billion position is worth roughly $100 million, so the risk calculated is $100 million × 1.78% = $1.78 million. This means that the 95% worst-case loss due to adverse movements in JPY/USD over 1 day would be $1.78 million (or, equivalently, we have a 5% chance of losing $1.78 million or more overnight).
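The single-position calculation above can be sketched in a few lines. This is a minimal illustration using the figures quoted from the RiskMetrics example (the $100 million is the approximate USD value of the JPY 14 billion position).

```python
# Single-position parametric (delta-normal) VaR sketch.
position_usd = 100_000_000   # approx. USD value of the JPY 14 billion cash position
daily_vol_95 = 0.0178        # 95%-level daily price volatility of JPY/USD (RiskMetrics)

# VaR = exposure x volatility at the chosen confidence level
var_95 = position_usd * daily_vol_95
print(f"1-day 95% VaR: ${var_95:,.0f}")
```

Running this reproduces the $1.78 million worst-case daily loss from the example.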
The next step is to estimate the VaR of a portfolio. The measurement is relatively simple if the portfolio consists only of securities with jointly normal distributions. The portfolio return is Rp,t+1 = Σi wi,t Ri,t+1, where the weights wi,t are indexed by time to recognize the dynamic nature of trading portfolios. Using matrix notation, the portfolio variance is σ²(Rp,t+1) = w′t Σt+1 wt, where Σt+1 is the forecast of the covariance matrix over the horizon. The portfolio VaR is then VaRp = α σ(Rp,t+1) W, where W is the initial portfolio value. The covariance matrix can be obtained from data on the correlations of the underlying risk factors.
To illustrate this, we use a simplified example with only two risk positions. The JPY/USD position has a VaR of $1.78 million, while the THB/USD position's VaR is $1.9 million. The RiskMetrics data set shows a correlation of 55% between JPY/USD and THB/USD.
So using the formula: VaRp = √(VaR1² + VaR2² + 2ρ VaR1 VaR2).
Hence in our example: VaRp = √(1.78² + 1.90² + 2 × 0.55 × 1.78 × 1.90) ≈ $3.24 million, less than the undiversified sum of $3.68 million because the two positions are imperfectly correlated.
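The two-position calculation can be sketched as follows, using the JPY/USD and THB/USD VaR figures and the 55% correlation quoted above (VaRs in $ millions):

```python
import math

# Two-asset diversified VaR: VaRp = sqrt(VaR1^2 + VaR2^2 + 2*rho*VaR1*VaR2)
var_jpy, var_thb, rho = 1.78, 1.90, 0.55

var_portfolio = math.sqrt(var_jpy**2 + var_thb**2 + 2 * rho * var_jpy * var_thb)
undiversified = var_jpy + var_thb   # simple sum, ignoring correlation
print(f"Diversified VaR ${var_portfolio:.2f}m vs undiversified ${undiversified:.2f}m")
```

The diversified figure is below the simple sum of the two VaRs, which is the diversification benefit the correlation term captures.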
2.1 Advantages:
First, it is easy to implement and to calculate because it is based on the assumption of normality: it involves only a simple matrix multiplication and requires only the market values and exposures of current positions, combined with risk data. As a result it saves time and computing resources; Jorion (2001:228) reports a computation time of 0.08s, compared with 66.27s for a full Monte Carlo calculation. Second, it is easily amenable to analysis, since measures of marginal and incremental risk are a by-product of the computation (Jorion 2001:220).
2.2 Limitations:
One limitation is the existence of fat tails in the distribution of returns on most financial assets. This is a problem because VaR attempts to capture the behaviour of the portfolio return in the left tail; a model based on a normal distribution would therefore underestimate the proportion of outliers and hence the true value at risk. The distribution can be adjusted for fat tails, with two main approaches: the normal mixture approach and the generalized error distribution (Dowd 1998:74). Another problem is that the method inadequately measures the risk of non-linear instruments, such as options or mortgages (Jorion 2001:221).
3. Historic or Back-Simulation Approach
The historical simulation method is nonparametric: distributional relationships are already embedded in the historical data being simulated, which removes the need for cumbersome estimation of parameters such as volatilities and correlations. It makes no assumption about the distribution of returns, making it a truly distribution-free approach.
An example helps to illustrate the method. Choose a period and frequency, say 1,000 days, and assume that you held your current portfolio on every one of those 1,000 days. Compute the daily gain or loss for each day and rank these from the worst loss to the highest gain. You can then determine the loss corresponding to your desired tolerance level: if the tolerance level is 5%, this is the 50th worst loss. The VaR is the size of this loss.
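The ranking procedure can be sketched as follows. The P&L series here is randomly generated placeholder data standing in for 1,000 days of actual portfolio history; in practice each number would come from applying today's positions to that day's historical price changes.

```python
import random

# Placeholder for 1,000 days of daily portfolio gain/loss, in $ millions.
random.seed(0)
pnl = [random.gauss(0.0, 1.0) for _ in range(1000)]

# Rank from worst loss to highest gain; at a 5% tolerance level the VaR
# is the size of the 50th worst loss.
pnl.sort()               # ascending: worst loss first
var_95 = -pnl[49]        # 50th worst daily outcome
print(f"1-day 95% historical-simulation VaR: ${var_95:.2f}m")
```

No distributional parameters are estimated anywhere; the VaR is read directly off the empirical distribution.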
The historic approach is also a relatively simple method in which distributions can be non-normal and securities can be non-linear. It involves keeping a historical record of preceding price changes. It is essentially a simulation technique that assumes that whatever the realizations of changes in prices and rates were over the earlier period is what they can be over the forecast horizon. It consists of going back in time, such as over the last 250 days, and applying the weights of the current portfolio to the time series of historical asset returns: Rp,k = Σi wi,t Ri,k, where:
Rp,k is the return of the hypothetical portfolio;
wi,t are the portfolio weights, kept at their current values;
Ri,k is the historical return of asset i in period k.
More generally, a full calculation requires a complete set of prices, such as yield curves, rather than just returns. Hypothetical future prices for scenario k are obtained by applying the historical changes in prices to the current level of prices: S*i,k = Si,0 + ΔSi,k.
A new portfolio value V*p,k is then computed from the full set of hypothetical prices, perhaps incorporating non-linear relationships: V*k = V(S*i,k).
VaR is then obtained from the entire distribution of hypothetical returns, where each historical scenario is assigned the same weight of 1/t. As always, the choice of sample period reflects a trade-off between longer and shorter sample sizes: longer intervals increase the accuracy of the estimates, but if the estimation period is too long, VaR estimates may not reflect the impact of recent events appropriately. Taken together, this indicates that VaR estimates can be significantly influenced by the choice of data.
3.1 Advantages
The historic approach is relatively easy to compute and to understand; distributions can be non-normal, and securities can be non-linear. In particular, if historical data has already been collected for daily marking-to-market, the same data can be stored for later reuse in estimating VaR. It can also easily be adapted to scenario analysis.
3.2 Disadvantages
The main drawback of the historical simulation approach is that it requires extensive data to estimate a reliable VaR figure, because the number of days in the estimation window dictates the number of simulation trials that can be run (Hull and White, 1998). The Basle Committee requires banks to use at least one year of daily returns for historical simulation, but much more data would be needed for reliable VaR estimation (Vlaar 2000).
Moreover it has other drawbacks, such as unstable parameters and changing variances. In addition, the model may not work well with small samples: the discreteness of the tails of the distribution means that the variance of the tail estimates is too high for them to be reliable. Other limitations of the historical simulation approach are that it does not allow prediction beyond the sample window examined (Danielsson and de Vries, 1997) and that it may not be appropriate when the availability of historical data on the underlying risk factors is limited (Hull, 2000).
4. Monte Carlo
The Monte Carlo simulation is very similar to the historic simulation; the difference is that it uses randomly generated values rather than past data. Using this method, there are two steps to calculating VaR. First, all the relevant risk factors must be specified: the stochastic process for the financial variables and its parameters (correlations, volatilities, etc.) must be identified. Second, price paths are constructed. Each price path generates values for the risk factors, which are fed into the pricing models for each security in the portfolio. Over one thousand market-rate scenarios are generated, and the portfolio is re-valued each time to construct a distribution of returns.
Finally, the outcomes are ranked and the VaR at the required confidence level is read from the specified percentile of the distribution. For example, if the 99% confidence level is required, the 10th worst of the 1,000 simulated portfolio values is used; if the 95% level is required, the 50th worst is selected, and so on.
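The two steps can be sketched for a single equity position. The geometric-Brownian-motion price process and the parameter values below are our own illustrative assumptions, not figures from the text.

```python
import math
import random

# Step 1: specify the risk factor and its stochastic process (here, an
# assumed geometric Brownian motion for a single stock price).
random.seed(42)
S0, sigma = 100.0, 0.20        # current price and assumed annual volatility
dt = 1.0 / 252                 # 1-day horizon
n_paths = 10_000

# Step 2: generate price paths and re-value the position under each scenario.
pnl = []
for _ in range(n_paths):
    z = random.gauss(0.0, 1.0)
    S1 = S0 * math.exp(-0.5 * sigma**2 * dt + sigma * math.sqrt(dt) * z)
    pnl.append(S1 - S0)        # gain/loss on the position in this scenario

# Rank the outcomes and read off the required percentile.
pnl.sort()                     # ascending: worst loss first
var_99 = -pnl[99]              # 100th worst outcome of 10,000 -> 99% VaR
print(f"1-day 99% Monte Carlo VaR: {var_99:.2f} per {S0:.0f} of position")
```

The same re-valuation loop would accept any pricing model, which is what makes the method suitable for non-linear instruments such as options.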
4.1 Advantages:
Out of the three methods, Monte Carlo simulation is by far the most powerful and flexible for computing VaR. The simulation can perform sensitivity analysis and stress testing (which can help determine how a portfolio will fare under any situation) without the need to run real tests. This is useful when calculating the VaR of exotic derivatives, such as "knockout" derivatives, which are more complicated and specialised than the simpler, more common vanilla derivatives. The method takes into account a wide range of exposures and risks, including non-linear price risk, volatility risk and model risk. It is also flexible in terms of time variation in volatility, and allows for fat tails, where extreme events occur more commonly than under a normal distribution. Monte Carlo simulation can include time horizons long enough to dramatically change the structure of a portfolio: the effects of time include the time decay of options, the time value of daily cash flows, and the effect of pre-specified trading or hedging strategies. These effects become more important as the time horizon increases, along with uncertainty about the future.
4.2 Limitations:
The main drawback of this method is its computation time, due to the large volume of calculations required. It can often take hours to compute the price paths, which is far from ideal when a quick decision is required. The method is computer-intensive and therefore expensive to implement, owing to the development of the system infrastructure and the software; there is the additional cost of maintaining the system to ensure no errors or system failures occur. Another weakness is model risk: since Monte Carlo relies on stochastic processes for the underlying risk factors, in addition to pricing models for mortgages or options, problems arise if those models are wrong. Sensitivity analysis should be used to check whether the results are robust to changes in the model. Finally, the VaR estimates derived from Monte Carlo simulation are subject to sampling variation, due to the limited number of replications; as the sample size increases, however, the measurement of VaR becomes more accurate.
5. Conclusion
Each of the three methods described in the previous sections has its advantages and disadvantages; which one a risk manager chooses as the most suitable depends on the situation. Delta normal is an easy, fast and accurate measure of Value at Risk, but it is not particularly good at measuring the risk exposure of complex securities. Historic simulation is easy to implement and uses past full valuations of securities; its weakness is that it does not take time variation into account. Monte Carlo simulation can eliminate some of these technical difficulties and is able to measure non-linearity; it is especially useful for calculating risk exposure for securities with no previous data. There is, however, a trade-off between accuracy and speed: the more price paths that are calculated, the longer the computation time.
Value at Risk is a very useful tool for banks, risk managers and regulators. Financial institutions must identify their risk exposures and put in place measures to prevent losses. However, VaR is only a model and its accuracy cannot be guaranteed. The occurrence of fat tails is a critical problem for this measurement: none of the VaR methods can incorporate this factor successfully at 99% confidence levels, because the confidence interval is too narrow to allow for extreme errors. Therefore, it is important to stress-test portfolio positions extensively in addition to VaR.
This measurement is improving with time as more data is available and models are more refined. Possibly the best option would be to check the results of each method against one another and analyse the sources of differences.
Bibliography
Jorion P. (2001). Value at Risk. New York: McGraw-Hill
Dowd K. (1998). Beyond Value at Risk. West Sussex: John Wiley & Sons Ltd.
Crouhy, Galai, and Mark, Risk Management, McGraw-Hill, 2001
Online material:
http://www.riskmetrics.com/courses/
http://www.gloriamundi.org
1 RiskMetrics, a registered trademark of J.P. Morgan & Co., rapidly became an industry standard. Its wide acceptance and the subsequent success of a credit risk measurement methodology led RiskMetrics Group to emerge as an independent company through a spin-off from J.P. Morgan in September 1998.
2 http://www.riskmetrics.com/courses/measuring_risk/parametric.html
3 http://www.riskmetrics.com/courses/measuring_risk/correlations.html
4 This Approach is sometimes called bootstrapping because it involves using the actual distribution of recent historical data without replacement.
5 Hull, John C., and Alan White, 1998, "Value at Risk When Daily Changes in Market Variables Are Not Normally Distributed,"
6 There seems to be no agreement on how long a data window should be in historical simulation in order to obtain reliable VaR estimates. However, a study by Vlaar (2000) on Dutch bond portfolios shows that the estimation of VaR using the historical simulation method based on 750 days still fails to attain the desired accuracy.
7 Vlaar, Peter J.G., 2000, "Value at Risk Models for Dutch Bond Portfolios"
8 Danielsson, Jon, and Casper G. de Vries, 1997, "Value-at-Risk and Extreme Returns"
9 Hull, John C., 2000, Options, Futures, and Other Derivatives, 4th ed.