St. Petersburg University
Graduate School of Management
Master in Corporate Finance
CREDIT RISK MANAGEMENT, MODELING LOAN
PORTFOLIO LOSS DISTRIBUTION, A CASE STUDY
IN BANKING
Master’s Thesis by the 2nd year student
Corporate Finance
Amir Azamtarrahian
Research advisor:
Professor, Alexander Bukhvalov
St. Petersburg
2016
Table of Contents
Introduction ................................................................................................................................................... 5
CHAPTER 1: LITERATURE REVIEW .................................................................................................. 9
1. Literature review ..................................................................................................................................... 9
1.1 Credit risk models ........................................................................................................................... 9
1.2 Loss given Default (LGD).............................................................................................................13
1.3 Default correlation.........................................................................................................................16
CHAPTER 2: METHODOLOGY ........................................................................................................... 20
2.1 Regulations in banking ............................................................................................................................20
2.1.1 Basel II & III ....................................................................................................................................21
2.1.2 Bank balance sheet and Basel ..........................................................................................................22
2.2 Problem formulation ...............................................................................................................................25
2.2.1 Inputs and assumptions ....................................................................................................................25
2.2.1.1 Credit exposure .........................................................................................................................25
2.2.1.2 Probability of default .................................................................................................................26
2.2.1.3 Default correlation.....................................................................................................................32
2.2.1.4 Loss given default (LGD) .........................................................................................................37
2.2.2 Methodology ................................................................................................................................40
2.3 Basel Asymptotic Risk Factor Approach (ARFA) ..............................................................................41
CHAPTER 3: IMPLEMENTATION AND RESULTS ......................................................................... 43
3.1 Managerial prelude..................................................................................................................................43
3.2 Descriptive analysis of loan portfolio .....................................................................................................45
3.3 Model evaluation .....................................................................................................................................47
3.4 Model implementation ............................................................................................................................48
3.4.1 Expected shortfall (tail loss) .............................................................................................................50
3.4.2 Sector credit risk analysis .................................................................................................................53
3.4.3 Structuring the loan portfolio ...........................................................................................................57
3.5 Conclusion...............................................................................................................................................61
References .................................................................................................................................................. 62
Appendix A ................................................................................................................................................ 65
Appendix B ................................................................................................................................................ 67
Appendix C ................................................................................................................................................ 71
Appendix D ................................................................................................................................................ 72
Appendix E ................................................................................................................................................ 73
Appendix F ................................................................................................................................................. 78
Appendix G ................................................................................................................................................ 80
ANNOTATION
Author: Amir Azamtarrahian
Master thesis title: Modeling the loss distribution of a loan portfolio, a study in the banking industry
Faculty: Graduate School of Management
Main field of study: Corporate Finance
Year: 2016
Academic advisor: Alexander Vasilievich Bukhvalov
Description of the goal, tasks and main results: This work carries out research in the field of credit risk management and proposes a model for the loss distribution of a loan portfolio in the banking industry. It was established that the model underlying the Basel requirements works well under favorable economic conditions, but its effectiveness has not been demonstrated during crises and economic downturns. Basel is based on a one-factor Gaussian model, which defines the correlation between loan defaults in the portfolio, and on the Vasicek model for determining the size of reserves. This work proposes a model for the portfolio loss distribution during economic downturns and crises based on a one-factor Student's t copula with a stochastic loss distribution. Moreover, the model provides the term-structure curve of credit risk and extends the Basel model by also introducing a correlation between the probability of default and the loss given default. The study was carried out using the Monte Carlo method and offers an analysis of the influence of the proposed model on lending strategy in comparison with the Basel model.
Keywords: Basel, credit risk, default correlation, copula, loan portfolio loss distribution, Monte Carlo method
Abstract
Master Student's Name: Amir Azamtarrahian
Master Thesis Title: Modeling loan portfolio loss distribution, a case study in banking
Faculty: Graduate School of Management
Main field of study: Master in Corporate Finance
Year: 2016
Academic Advisor's Name: Alexander Bukhvalov
Description of the goal, tasks and main results: This thesis studies credit risk management and proposes a generic model for loan portfolio loss distribution in the banking industry. The Basel model works acceptably well in normal economic situations but not in downturns. It assumes a one-factor Gaussian copula for default correlations and introduces the capital reserve on the ground of the Vasicek process. To model the portfolio loss distribution in a downturn economy, a one-factor Student's t copula with stochastic default correlations is proposed; moreover, the model determines a specific credit curve for each counterparty and extends Basel with correlated PD and LGD as well. The analysis is done through Monte Carlo simulation and studies the influence of the proposed model on lending strategy compared with Basel.
Keywords: Basel, credit risk, default correlation, copula, portfolio loss distribution, Monte Carlo
Introduction
Credit risk can be considered the most critical of all types of risk. It is estimated that financial institutions allocate about 60% of their regulatory capital to credit risk, about 15% to market risk and about 25% to operational risk1. There are two types of credit risk, migration risk and default risk, of which the latter is the subject of this thesis.
Under the regulatory framework of Basel II and III, the required reserves are categorized into two tiers. Tier 1 generally represents shareholders' equity and retained earnings, while Tier 2 comprises subordinated long-term debt, general loan-loss provisions and undisclosed reserves. Banks have to maintain a total capital ratio of 8% of risk-weighted assets2 on the balance sheet, of which at least 6% must be Tier 1 capital, with Tier 2 making up the remainder. Setting aside a specific part of capital to hedge against probable losses limits potential interest income. On the other hand, banks avoid unhedged risk, which prompts them to quantify reserves accurately in order to trade off risk against return and to formulate a more efficient hedging strategy. Likewise, maintaining net interest income within a steady range, as well as decisions concerning the bank's capital structure or service fees, is highly contingent on the amount of these reserves.
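To illustrate the capital ratios above (hypothetical figures, not taken from the bank under study): a bank holding risk-weighted assets of 1,000 units must keep total capital of at least 0.08 x 1,000 = 80 units, of which at least 0.06 x 1,000 = 60 units must be Tier 1 capital.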
Among the major categories of risk that banks are exposed to, such as market, operational, credit, and liquidity risk, this thesis concentrates on the credit default risk of the counterparty corporations in a loan portfolio and provides the bank with a quantitative estimate of the loss distribution and the required economic capital. It comes up with a generic model for credit risk and extends Basel to model the loan portfolio loss distribution. The Basel capital adequacy model generally works well in normal economic situations; however, it does not take into account some types of risk that appear in economic downturns and recessions, such as default contagion and tail dependence of default rates. Moreover, there is empirical evidence that default rates and recovery rates are nonlinearly dependent: expected recoveries tend to decrease more in recessions than they increase in an expansionary economy.
1 Correlation risk modelling and management, Gunter Meissner, 2013
2 Weighted sum of assets based on their corresponding risks
Research goal
The thesis primarily involves "extending the Basel credit risk capital adequacy model for economic downturns through modeling and incorporating empirical evidence on the risk factors and their interactions, and analyzing how this influences the bank's lending strategy".
Motivation and research questions
For banks to operate smoothly, Basel requires them to keep some capital as reserves to absorb probable losses. In this regard, Basel II recommends the Vasicek model as an industry standard. However, the model comes with unrealistic assumptions, such as an identical probability of default for all counterparties in the portfolio and a single default correlation that is constant through time. Moreover, contrary to empirical evidence, it assumes independence between recovery rates and default rates. This thesis extends the model to approach the problem in line with the corresponding empirical and stylized facts. It examines the consequences of applying the Basel model and answers the following questions:
1- How reliable is it to apply Basel in economic downturns?
2- Does it matter to apply a more accurate model?
3- How large is the difference between the Economic Capital (EC) under Basel and under the proposed model?
4- How does it influence the bank loan portfolio structure and lending strategy?
Research gap
Scholars in credit risk modeling typically devote their work to particular components of credit risk modeling, such as the probability of default, recovery rates or default correlations. Moreover, they chiefly concentrate on Analytically Tractable (AT) models, assuming independence or Gaussian processes in order to arrive at closed-form mathematical expressions; this limits the flexibility and also the applicability of the models with regard to the real behavior of risk factors. The extensions of the Vasicek model are mostly applied in pricing credit derivative products such as Collateralized Debt Obligations (CDO) and Credit Default Swaps (CDS), which deal with a portfolio of credit assets and defaultable counterparties. The study of loan portfolios starts with Vasicek 1987, who correlated default rates by a one-factor Gaussian copula and introduced an analytical model for the Large Homogeneous Portfolio (LHP). In his model correlations and recoveries are assumed deterministic and constant. Giese 2005 introduces stochastic recoveries and comes up with correlated default rates and loss given default; furthermore, Frye 2013 models loss given default as a function of default rates. Moreover, Gregory, Burtschell and Laurent 2005 carry out a comparative study of different copulas in pricing synthetic CDO tranches. Hull-White 2004 apply a double Student's t copula to CDOs and to n-th-to-default CDSs. They extend their work in Hull-White 2010 and propose stochastic correlations as well as recovery rates correlated to default rates through Gaussian copulas.
For loan portfolios there is a need to incorporate all of these realities together to model credit risk, considering not only appropriate models for each risk factor but also their empirical interactions and their individual characteristics in economic downturns. Although Hull-White 2010 fulfilled this objective to some extent, they still simplified some stylized facts and ignored tail dependence, a frequently observed phenomenon in economic downturns, when modeling default correlations. Furthermore, they did not account for the negative-negative tail dependence between recovery rates and business cycles. Moreover, some commercial credit risk models like "Credit Metrics", "Credit Risk+" and "Moody's KMV" propose models to forecast credit risk with better accuracy. Credit Suisse recommends "Credit Risk+" and correlates default rates by introducing default rate volatilities rather than a background common factor; Moody's KMV tries to model default correlations by correlating the asset processes of the counterparties in a portfolio; and JP Morgan's "Credit Metrics" model concentrates on transition matrices and tries to simulate portfolio behavior in terms of a Markov chain. Although each product has advantages in some respects, none of them thoroughly addresses the problem by incorporating all of these considerations.
Research design
In order to extend Basel for economic downturns and benefit from the previous proposed
methodologies in loan portfolio, this thesis focuses on modeling interactions and gets Merton model
to calculate probability of default, besides, assumes Vasicek process for Counterparty’s assets value.
The default rates are correlated through t-copula taking into account tail dependence to model
systematic risk recessions. Moreover, it correlates recoveries with default rates through a Clayton
copula to capture the negative-negative tail dependency as the stylized fact in market. In addition, it
releases the constant correlation assumption and comes up with stochastic correlations negatively
correlated with market performance through Gaussian copula. Finally, it takes a sample portfolio of
loans and compares the economic capital with Basel. The main steps in the modeling process is
summarized in the following flow chart,
Figure 1: thesis perspective. The flow chart summarizes the modeling steps: default correlation by a t-copula to model tail dependence; modeling stochastic correlations with a Beta distribution; modeling the dependence of correlations on the systematic risk factor by a Gaussian copula; modelling stochastic recovery rates, sampled from a Beta distribution; modeling the dependency between recoveries and PDs by a Clayton copula to capture negative-negative tail dependence; calculating portfolio credit-VaR and ES using Monte Carlo simulation; comparing results with the Basel framework (EC, portfolio structure and lending strategy); and conclusions and managerial implications.
Chapter one introduces the research objective and poses the thesis questions. Subsequent to a concise literature review, the methodology and problem formulation are presented in the next chapter, where inputs and risk factors are described in detail together with the characteristics of their interactions for the Monte Carlo simulation. Chapter 2 continues with modeling the loss distribution at the portfolio level and proposes the credit-VaR. Finally, the third chapter presents the implementation and managerial implications and provides conclusions by evaluating the results against the Basel capital adequacy accord and the strategies proposed by the model.
CHAPTER 1: LITERATURE REVIEW
This chapter gives an overview of the thesis and of the evolution of the literature on the subject. It studies previous work on credit risk models, and the associated stylized facts for each component of portfolio loss are presented.
1. Literature review
This section starts with a quick definition of credit risk and the Basel regulatory requirements; it then reviews the frequently cited default risk models, focusing on the firm-value models'3 literature. Subsequently, the evolution of papers on modeling Loss Given Default (LGD) and its correlation with default probability is presented, and finally default correlation models are reviewed.
1.1 Credit risk models
Credit risk has proved to be a debated topic, particularly in the aftermath of the 2007-2008 global financial crisis and the appearance of the default contagion phenomenon. It no longer concerns only so-called junk issuers but also the most creditworthy institutions, such as AIG and Lehman Brothers during the crisis. However, despite the recently heated debate, it had long been a concern for policy makers and regulatory institutions in the banking industry, with respect to both regulatory issues and banks' internal risk management policies.
3 Firm-value models and structural models are used interchangeably; option-based credit models is an alternative name as well
Credit models involve estimating default probabilities and the term structure of spreads as the price of default risk. There are two major approaches in credit risk modelling: structural models and intensity-based models, the latter also known as reduced-form models. The former takes default as an endogenous event, while the latter models default as an exogenous variable. Primary work on structural models originates from Merton 1974, in line with the Black-Scholes option pricing model. Merton assumes a company whose liabilities consist of a Zero-Coupon Bond (ZCB) and takes equity as a European option on the company's assets, where the liabilities' par value is the strike price. Accordingly, the risk-neutral probability of default is simply $P(V_T < D)$, which is $N(-d_2)$ in the Black-Scholes framework. In Merton's model, default can only happen if assets fall below the outstanding debt at the time of servicing or refinancing the debt. Other structural models, such as Black and Cox 1976, are similar to the Merton model in that they use the firm's structural variables, such as asset and debt values, as the basis for their modelling. However, Black and Cox state that default can occur at any time, not just at the expiration of the debt; this property puts their model in the family of first-passage-time models. The model allows default to occur as soon as the firm's asset value falls below a certain threshold, which does not necessarily have to be the debt value. The corresponding assumption in the Merton model was contrary to bond safety covenants, which allow bond holders to push a firm into bankruptcy under certain special situations even if the firm has not explicitly defaulted on a payment. Furthermore, Geske 1977 accounts for more complex capital structures by creating two tranches of risky debt. At date T1 the firm is obliged to make the payment F1 for short-term liabilities. The firm cannot sell its assets to meet its obligation; rather, it must go to the capital markets and raise funds (equity or new debt) to finance the payments. Clearly, the ability to raise funds will depend on the amount of debt outstanding. If the present value of all debt outstanding, together with the required payment F1, exceeds the value of the firm, then the shareholders will declare the firm bankrupt. Viewed from time zero, equity holders have a compound option on the assets of the firm: if there is no default by T1, they can exercise their claim, make the payment of F1 dollars and receive a call option on the assets of the firm. Hence, at date zero, they have an option on an option, or a compound option.
Subsequently, the first-passage family was extended by Longstaff-Schwartz 1995, who took a stochastic process for interest rates rather than a constant rate4. Leland and Toft (1996) took the next major step by incorporating bankruptcy costs and tax effects, which allows a formal characterization of optimal capital structure, debt capacity, and credit spreads in a classic trade-off model. The Black-Cox model produces low credit spreads because assets that begin above the barrier cannot reach the barrier immediately by diffusion alone. To increase the spreads, jumps were introduced into the asset value process. Zhou 1997 introduced a jump component to the underlying continuous process, but the model is somewhat intractable. In an alternative approach, the CreditGrades model of Finkelstein et al. 2002 allows the barrier to fluctuate randomly. The uncertainty in the barrier admits the possibility that the firm's asset value may be closer to the default point; this leads to higher short-term spreads than are produced without barrier uncertainty. Moreover, Moody's KMV 2003 came up with a modified structural model whose output, the Distance to Default (DD), is mapped onto an internally developed database of companies with historical default frequencies corresponding to the calculated DD; hence, the outcome is regarded as a real-world probability of default. The work can be seen as a distillation of the main insights gleaned from Black-Cox 1976, Geske 1977 and Longstaff-Schwartz 1995. In their framework the option is a perpetual down-and-out option that can be exercised at any time, repurchase or issue of debt is possible, and restrictions on asset sales exist5. Also, it accommodates five different types of liabilities: short-term liabilities, long-term liabilities, convertible debt, preferred equity and common equity6.
Brigo and Tarenghi 2004 developed AT1P on the ground of the Black-Cox model, providing time dependency in both the volatility and the barrier, hence non-constant business risk and debt level, while, contrary to Zhou, still preserving closed-form pricing formulas; the model is more flexible than Black-Cox in the sense that its parameters can be perfectly calibrated to CDS market data7. The most intriguing characteristic of the model is its independence from the current asset value, which is difficult to estimate, particularly for non-listed companies. In their framework it is possible to rescale the initial value of the firm's assets to $A_0 = 1$ and express the (free) barrier parameter H as a fraction of it; hence it is not necessary to know the real value of the firm. Moreover, Brigo 20098 comes up with the Scenario Barrier Time-Varying AT1P model (SBTV) to reduce the effect of uncertain accounting data, which is accomplished by defining random barriers and calibrating both probabilities and barriers. Here the market price is taken as the average of the prices under the different scenarios, weighted by the calibrated probabilities. The model outputs smoother implied volatilities than AT1P and is efficiently consistent with intensity models.
4 Reduced Form vs. Structural Models of Credit Risk: A Case Study of Three Models, Navneet Arora, Jeffrey R. Bohn, Fanlin Zhu, Moody's KMV, February 17, 2005
5 Default forecasting in KMV, master's thesis, Yuqian Steve Lu, 2008, University of Oxford
6 Modeling default risk, modeling methodology, Crosbie and Bohn (2003)
7 Although a CDS market is not available in Iran, any other suitable proxy gives the model this advantage
8 Credit Calibration with Structural Models: The Lehman case and Equity Swaps under Counterparty Risk, Damiano Brigo, Massimo Morini, Marco Tarenghi, December 22, 2009
It is a stylized fact that structural models are not able to generate positive short-term spreads; this was addressed by adding jumps to the process. Moreover, credit spreads implied by structural models are much lower than observed in real data, which is referred to as the credit spread puzzle. While empirical evidence is still scant, a few empirical researchers have begun to test these model extensions. Lyden and Saraniti (2000) compare the Merton and the Longstaff-Schwartz models and find that both models under-predict spreads; the assumption of stochastic interest rates did not seem to change the qualitative nature of the finding. Eom, Helwege, and Huang (2003) find evidence contradicting conventional wisdom on the bias of structural model spreads. They find that structural models that depart from the Merton framework tend to over-predict spreads for the debt of firms with high volatility or high leverage. For safer bonds, these models, with the exception of Leland-Toft 1996, under-predict spreads. The following table summarizes the evolution of the literature.
Row | Milestone | Description
1 | Merton 1974 | Option-based risky ZCB pricing
2 | Black-Cox 1976 | Came up with the first-passage default time model
3 | Geske 1977 | Introduced short- and long-term debt
4 | Longstaff-Schwartz 1995 | Assumed a mean-reverting stochastic process for interest rates
5 | Zhou 1997 | Added jumps to the underlying process; not AT9
6 | Leland 1998 | Added tax and bankruptcy measures to value risky debt
7 | CreditGrades 2002 | Modelled the barrier as a continuous stochastic process
8 | Moody's KMV 2003 | Commercial model (DD), mixture of previous works
9 | Brigo, Tarenghi 200410 | AT1P, introduced non-constant volatility; model independent of current asset value
10 | Brigo, Tarenghi 2006 | (SBTV) reducing the effect of unreliable accounting data
Table 1: evolution of structural models through time
9 Analytically Tractable
10 Selected model since it does not rely on asset value
In the other category of credit models, known as reduced-form11 models, the random nature of defaults is typically characterized in terms of the first arrival of a Poisson process. Intensity-based models treat the risk of default as an event that arrives exogenously. There are several types of reduced-form models. Lando 1998 and Duffie and Singleton 1999 showed in their work that the price of a risky bond of a company can be calculated with a default-adjusted discount rate; the extra rate is referred to as the intensity. The first model that actively used the concept of default intensity came from Robert Jarrow and Stuart Turnbull 1995. They constructed their model based on two classes of zero-coupon bonds, a risk-free ZCB and a risky one. The paper suggests that when the default intensity is held constant, the risky debt's value is proportional to the risk-free one12, where δ is the recovery rate and μ is the market price of default risk (a positive constant less than 1).
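A minimal sketch of this constant-intensity pricing relation is given below; the numerical inputs are hypothetical and only illustrate the footnoted formula.

```python
import math

def risky_zcb_price(risk_free_price, intensity, mpr, recovery, tau):
    """Jarrow-Turnbull style price of a risky ZCB under a constant default
    intensity: D_risky = [exp(-lam*mu*tau) + (1 - exp(-lam*mu*tau)) * delta] * D_risk_free,
    with lam the intensity, mu the market price of default risk,
    delta the recovery rate and tau the time to maturity."""
    survival = math.exp(-intensity * mpr * tau)
    return (survival + (1.0 - survival) * recovery) * risk_free_price

# Hypothetical inputs: 2% intensity, mu = 0.8, 40% recovery, 1-year maturity
risk_free = math.exp(-0.05 * 1.0)   # risk-free ZCB (unit face) at a 5% yield
print(risky_zcb_price(risk_free, 0.02, 0.8, 0.40, 1.0))
```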
Based on Jarrow and Protter 2004, the difference between these two model classes can be characterized in terms of the information assumed known by the modeler. Structural models assume that the modeler has the same information set as the firm's manager: complete knowledge of all the firm's assets and liabilities. In most situations, this knowledge leads to a predictable default time. In contrast, reduced-form models assume that the modeler has the same information set as the market: incomplete knowledge of the firm's condition. Consequently, for pricing and hedging, reduced-form models are the preferred methodology13. Jarrow concludes that if one is interested in pricing a firm's risky debt or related credit derivatives, then reduced-form models are the preferred approach, since they have been constructed, purposefully, to be based on the information available to the market.
1.2 Loss given Default (LGD)
Loss given default is defined as the amount of funds that is lost by a bank when a borrower defaults on a loan. As defaults and credit events generally end up in court, there is considerable uncertainty as to what an accurate recovery would be if a company defaults. Under the IRB approach, banks may design internal models to calculate capital reserves in light of common characteristics identified by academic and industry studies, and LGD is no exception in this regard.
11 Or intensity-based models
12 $D_{risky}(t,T) = \left[e^{-\lambda\mu(T-t)} + \left(1 - e^{-\lambda\mu(T-t)}\right)\delta\right] \times D_{risk\text{-}free}(t,T)$; Credit derivatives, A primer on credit risk, modelling and instruments, George Chacko, Anders Sjoman et al.
13 Structural versus reduced form models: a new information based perspective, Robert A. Jarrow and Philip Protter, Journal of Investment Management, Vol. 2, No. 2 (2004), pp. 1-10
Historically, recoveries range from 20% to 80%, depending on which definition of default is used. The BIS14 defines four events as default. Schuermann 200415 mentions the debt's place in the capital structure, seniority, the general state of the economy and the industry as the main determinants of LGD. Also, recovery rates differ depending on the stage of the bankruptcy process at which the claim is made16. Schuermann also discusses the empirical distribution of recoveries, with evidence of higher probability mass at lower recoveries. According to Altman and Kishore 1996, one should know the seniority and collateral to predict the recovery rate. Likewise, Gupton, Gates and Carty report that syndicated loan recoveries for senior secured loans were 70% on average, while for unsecured loans they fell to 52%. Moreover, the importance of monitoring was reviewed by Carey 1998, who compared investment-grade and lower-credit-grade debts and attributed the difference in performance of higher-risk instruments to closer monitoring.
Frye 2000 shows that in a recession, recovery is about a third lower than in an expansion. Altman, Brady, Resti and Sironi 2003 suggest that when aggregate default rates are high, recovery rates are low.
Figure 2: recoveries and default rate dependency, Altman 2003
Also, Hu and Perraudin 2002 show that the correlation between recoveries and aggregate default rates for the US is about −20% on average and about −30% when considering only the tails, which implies higher correlations in recessions. Moreover, Altman and Kishore 1996 reveal that some industries, like utilities, are more recession-resistant than others.
14 Basel Committee on Banking Supervision document
15 What do we know about LGD? Federal Reserve Bank of New York, Til Schuermann, 2004
16 Last cash paid, default, Chapter 11, emergence due to liquidation or genuine emergence; it takes on average 2.5 years
Table 2: Industry impact from Altman and Kishore 1996
A recent work by Acharya, Bharath and Srinivasan 2003 found that when industries are in distress, mean LGD is on average 10% to 20% higher than otherwise. In their work, utilities is still the industry sector with the highest recovery. Apart from the industry effect, firm size, contrary to its importance in modeling the probability of default, appears from the literature to have no strong effect on losses once default has occurred. Asarnow and Edwards 1995 find no relation between LGD and loan size in their study of loan data from Citibank's middle-market and large-corporation lending. Likewise, Thorburn 2000 also found that firm size does not matter in determining LGD. Similar results were obtained by Carty and Lieberman 1996 and others as well.
There are various models that connect the LGD rate to the default rate. Frye 2000 assumes recovery is a linear function of the normal risk factor associated with the Vasicek distribution. Pykhtin 2003 parameterizes the amount, volatility and systematic risk of the loan collateral, infers the loan's LGD, and derives a closed-form expression for expected loss and economic capital. Giese 2005 applies econometric estimates of correlations between default rates and loss-given-default rates and calculates their impact on credit risk capital. Frye 2013 models LGD as a function of default rates; in his paper an asymptotic portfolio is assumed, with entities all having the same expected loss and default correlation. In this thesis the model is chiefly inspired by Frye 2000 and Giese 2005 in order to reflect the stylized facts about LGD. Accordingly, LGD is taken to be stochastic and modeled by a Beta distribution which is calibrated to industry norms and correlated with default rates.
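As a rough sketch of this calibration step (the marginal Beta distribution only; the coupling to default rates is added later in the thesis, and the mean and standard deviation below are hypothetical):

```python
import numpy as np

def beta_params_from_moments(mean, std):
    """Method-of-moments Beta(a, b) parameters for an LGD with the given mean
    and standard deviation (both in (0, 1)), a simple way to calibrate a
    stochastic LGD to industry norms."""
    var = std ** 2
    common = mean * (1.0 - mean) / var - 1.0
    if common <= 0:
        raise ValueError("standard deviation too large for a Beta distribution")
    return mean * common, (1.0 - mean) * common

# Hypothetical calibration: mean LGD 45%, standard deviation 25%
a, b = beta_params_from_moments(0.45, 0.25)
lgd_draws = np.random.default_rng(1).beta(a, b, size=10_000)
print(a, b, lgd_draws.mean(), lgd_draws.std())
```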
1.3 Default correlation
This part reviews the historical behavior of correlation through time and introduces models proposed in line with the empirically studied stylized facts.
The degree to which defaults occur together is critical for financial lenders such as commercial banks, credit unions and insurance companies. Default correlation is addressed in the literature from different points of view: some papers deal with the empirical analysis of correlation behavior over time, particularly over business cycles, and document the dynamic character of correlation through time. Others address industry sectors, find correlation clustering both within and between sectors, and find that default correlation between industries is positive, with the exception of the energy sector, which is recession-resistant and has a low or negative correlation with other sectors. Moreover, default correlation within sectors is higher than between sectors, which suggests that systematic factors like recessions or structural weaknesses, such as the general decline of a sector, have a greater impact on defaults than idiosyncratic factors. Hence a lender is advised to hold a sector-diversified loan portfolio to reduce default correlation risk. Systematic risk and correlation are highly dependent: historically, a systematic decline in stocks almost always involves the entire stock market, and the correlation between stocks increases sharply.
Table 3: correlation level and correlation volatility with respect to the state of economy, Meissner 2013
Meissner 2013 monitors the correlation between stocks in the Dow together with the Dow index itself and observes that correlations among Dow stocks rise when the Dow declines; this increase accelerated during the severe decline from 2008 to August 2009, when the average correlation rose from a non-crisis level of about 27% to over 50%. (The red triangle graph represents the Dow.)
Figure 3: Dow and correlation of stocks in Dow, Meissner 2013
To model default correlations, Lucas 1995 proposed the binomial model, taking default as a binary variable. Furthermore, he shows that correlation levels as well as correlation volatilities are higher in economic crises. In the following figure a mean-reverting behavior of correlation through time is noticeable: it shows the monthly average correlation levels and depicts low correlation during strong economic growth, increasing during recessions.
Figure 4: Monthly correlation levels of stocks in the Dow, Meissner 2013
This is also the case for correlation volatility, which tends to increase during economic declines.
Figure 5: Monthly correlation volatility of the stocks in the Dow, Meissner 2013
Given the dynamic nature of correlation, simple statistical correlation measures are of limited use in finance, since most financial dependencies are not linear. Different models have been proposed in the literature. Heston 1993 correlates the stochastic processes of a stock and of its volatility through their diffusion parts. This method is widely used in finance due to its dynamic and versatile characteristics. Zhou 2001 applies Heston-type correlation to derive an analytical expression for the joint default distribution in the Black-Cox first-passage-time framework. In this thesis, default correlation is modelled through the asset-value approach, which correlates defaults through the stochastic processes of asset returns, in the spirit of the Heston methodology as well. Brigo and Pallavicini 2008 apply Heston-type correlation twice: the first correlates the two factors that affect the interest rate process, and the second correlates the interest rate process with the default intensity process.
The other famous, or infamous, correlation model is the copula approach. One-factor copulas were introduced to finance by Oldrich Vasicek in 1987; more versatile, multivariate copulas were applied to finance by David Li 2000. There are many copula models; among them, the one-factor Gaussian copula and, from the Archimedean family, the Clayton, Gumbel and Frank copulas are the most popular in the finance industry. Moreover, some extensions like the t-copula originate from the Student's t distribution and are categorized among two-factor copula models. Contrary to the Gaussian copula, the t-copula is capable of modelling tail dependence. Copulas have found a place in the modelling of correlation in finance: Meissner 2007 and Brigo and Chourdakis 2009 apply a bivariate Gaussian copula to model the dependence between a CDS seller and the reference asset under counterparty credit risk. Basic structural models assume that correlations are constant; empirical evidence suggests that asset correlations are positively related to default rates.
According to the reports by Meissner, correlations are not deterministic and show a mean-reverting behavior through time; more importantly, the default correlation of two firms tends to increase with time. Hence, static models do not capture all the features of default correlations. Hull and White 2010 propose a dynamic correlation model based on the asset-value approach. In their work, the stochastic parts of the asset processes are correlated by a one-factor Gaussian copula, where correlations and recovery rates are both Beta-distributed random variables correlated, again via a Gaussian copula, with the market factor. In each time step a unique LGD and correlation is associated with the market factor; hence, the higher the market factor, the lower the correlation and LGD drawn from the Beta distributions.
This thesis steps forward and extends the basic Vasicek model of Basel using the insights of Hull-White 2010. However, contrary to Hull-White, here a Student's t copula is used to correlate defaults; furthermore, the LGD and correlations are random, and the dependence between LGD and default rates is modeled by a Clayton copula to capture the negative-negative tail dependence between these variables. This methodology is more consistent with empirical results.
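The sketch below illustrates the core of this extension, a one-factor Student's t copula for defaults, under simplifying assumptions: correlation and LGD are held fixed rather than stochastic, and the portfolio inputs are hypothetical.

```python
import numpy as np
from scipy import stats

def simulate_portfolio_losses(pd, ead, lgd, rho, nu, n_sims=50_000, seed=0):
    """One-factor Student-t copula default simulation.  Latent variable per
    obligor: X_i = sqrt(nu / W) * (sqrt(rho) * M + sqrt(1 - rho) * Z_i),
    with M, Z_i standard normal and W ~ chi2(nu) shared across obligors, so
    the X_i are t_nu distributed and exhibit tail dependence.  Obligor i
    defaults when X_i falls below the t_nu quantile of its PD."""
    rng = np.random.default_rng(seed)
    pd, ead, lgd = map(np.asarray, (pd, ead, lgd))
    thresholds = stats.t.ppf(pd, df=nu)               # default barriers
    M = rng.standard_normal((n_sims, 1))              # systematic factor
    Z = rng.standard_normal((n_sims, pd.size))        # idiosyncratic factors
    W = rng.chisquare(nu, size=(n_sims, 1))           # common mixing variable
    X = np.sqrt(nu / W) * (np.sqrt(rho) * M + np.sqrt(1.0 - rho) * Z)
    defaults = (X < thresholds).astype(float)
    return defaults @ (ead * lgd)                     # loss in each scenario

# Hypothetical three-loan portfolio
losses = simulate_portfolio_losses(pd=[0.02, 0.05, 0.01],
                                   ead=[100.0, 50.0, 80.0],
                                   lgd=[0.45, 0.60, 0.40],
                                   rho=0.2, nu=4)
print(losses.mean(), np.quantile(losses, 0.999))      # expected loss and credit-VaR
```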
CHAPTER 2: METHODOLOGY
This chapter concisely reviews regulatory issues and proceeds to the problem motivation and formulation. Subsequent to an introduction of the Basel II and III accords, the bank balance sheet and lending procedure are reviewed; afterwards, each component of expected loss is modeled, including the proposed models for the probability of default and the recovery rate and how their interactions are captured, and finally the analysis at the portfolio level is performed.
2.1 Regulations in banking
Historically, investment opportunities were available only to affluent people, who were considered wealthy enough to afford losses. However, as investment activity grew and all classes of people began to enjoy higher disposable income and to look for new places to put their money, these investors were, in theory, protected from fraudulent activities by the Blue Sky laws (enacted in Kansas in 1911). These state laws were meant to protect people from worthless securities; they are basic disclosure laws that require a company to provide a prospectus on which investors can rely. In this section the most recent and important regulations concerning credit risk analysis are presented.
2.1.1 Basel II & III
Basel II was initially published in 2004 as an international standard for banking regulators to control how much capital banks need to put aside in order to guard against the types of financial and operational risk they face through lending and investment practices. Basel II is based on three pillars. The first pillar deals with the maintenance of regulatory capital for the three major components of risk that a bank faces: credit risk, operational risk and market risk. The second pillar is a regulatory response to the first one; it also provides a framework for dealing with systemic risk, strategic risk, liquidity risk, etc. The Internal Capital Adequacy Assessment Process (ICAAP) is the result of the Pillar I and II accords. The third pillar aims to complement the minimum capital requirements and supervisory review process by developing a set of disclosure requirements which allow market participants to gauge the capital adequacy of an institution. Under Basel II, the risk of counterparty default and credit migration risk were addressed, but mark-to-market losses due to credit value adjustments were not; this problem was addressed in the development of Basel III17.
Figure 6: Basel framework, Moody's Analytics
Following the 2007-2008 financial crisis, the Basel committee considered a major overhaul of the former Basel accords. Although the committee had increased capital requirements, it continued to require extra reserves to cover credit risk in Basel III as well. This was followed by tighter capital regulations to take liquidity risk into account. The first proposal for Basel III was issued in December 2009 and the final version was released a year later. The regulation consists of six parts, covering the definition of capital and its requirements, the capital conservation buffer, the countercyclical buffer, the leverage ratio, liquidity risk and, finally, counterparty credit risk. Under Basel III, bank total capital consists of Tier 1, which represents equity capital such as share capital and retained earnings and should be at least 4.5% of risk-weighted assets at all times; additional Tier 1 items include non-cumulative preferred stock, and total Tier 1 capital should be at least 6% of risk-weighted assets at all times. Tier 2 is debt which is subordinated to depositors. Total Tier 1 and Tier 2 capital must be 8% of risk-weighted assets at all times. The required reserves were more than doubled compared with Basel II. Common equity is regarded by the Basel committee as "going-concern capital"18: when a bank is a going concern, common equity absorbs losses. Tier 2 capital is referred to as "gone-concern capital": when a bank is no longer a going concern, losses have to be absorbed by Tier 2 capital, which ranks below depositors in liquidation.
17 KTH Royal Institute of Technology, Master's Thesis, Dan Franzen, Otto Sjoholm
Basel III is part of the continuous effort to enhance the banking regulatory framework. It is built on the Basel I and Basel II documents and seeks to improve the banking sector's ability to deal with financial and economic stress, improve risk management and strengthen banks' transparency. The Basel committee calls for more capital for "systemically important" banks. This is not a standardized term across countries; however, in the US it is generally taken to mean banks with total assets above $50 billion.19
2.1.2 Bank balance sheet and Basel
Each item in the balance sheet of a bank corresponds to an interest-related income or expense item, and the average yield for the period and the net interest income depend highly on the shape of the yield curve. Banks usually try to overcome the undesirable impact of the yield-curve-flattening20 phenomenon, caused by the narrowing of the rate difference between long-term and short-term borrowing, by charging more for their services. Moreover, the volume of a bank's fee-generating activities may also differ based on interest rate expectations and the demand for loans. For instance, as interest rates rise there is less demand for mortgages, and on the other hand prepayments happen less often due to the higher cost of borrowing again; as a result, fee income and the associated economic value originating from mortgage services may increase or remain stable in these situations. Interest rates can also act jointly with other risk factors facing a bank. In a period of rising interest rates, loan repayments become less certain owing to the higher payment burden or lower borrower earnings; this exposes the bank, particularly for floating-rate borrowings, to credit risk. For a bank with short-term liabilities, rising interest rates will increase the likelihood of liquidity risk and credit quality problems as well.
18 Going concern is an accounting assumption that the entity will be able to continue operating long enough to carry out its commitments, obligations and objectives
19 Basel Committee for Bank Supervision, Basel III, A global regulatory framework for more resilient banks and banking systems
20 Normally yield curves are upward sloping to stimulate spending in recession periods; flattened and downward-sloping yield curves happen in economic booms to slow down the economy and are an indicator of lower short rates in the future
To mitigate credit risk, banks develop certain internal credit analysis procedures alongside national and international regulatory requirements. Evaluating the creditworthiness of a corporate client, together with a rough valuation, is mainly done by the credit department by reviewing the financial statements. This is usually supplemented with site visits to confirm the claims through direct observation and evaluation. The primary financial criteria relate to the capital structure and the major financial ratios, in both cross-sectional and time-series analysis. Besides, the cash generation power and the strategic position of the company in the market are taken into account by the credit team as well. Finally, the credit team determines the approved amount of the loan in addition to the required collateral and other formalities to proceed with lending. In parallel with reviewing the creditworthiness of the company, and after the requirements have been fulfilled, the risk management department is responsible for evaluating the extra risk the loan imposes on the bank and determines the appropriate interest rate to be charged to the counterparty corporation. The job is completed by evaluating the required capital reserves with respect to defaults at the single-trade and portfolio levels, in line with predefined credit limits.
The chart depicts the inner links between the problem components, the model and the Basel regulatory capital: the capital requirements regulations (Basel I, II and III; Pillars I, II and III) address market, credit, operational and liquidity risk; the bank balance sheet sets assets (mortgages, loans to companies, consumers and government bodies, liquid assets such as shares, corporate and government bonds and interbank debt claims, and other assets such as real estate, derivatives and goodwill) against liabilities and equity (customer deposits, certificates of deposit, bonds, derivatives, interbank market funding, and common shares, retained earnings, etc.); and the capital reserve is driven by the counterparties' PDs, LGD and EAD, with unexpected loss equal to CVaR minus EL.
Figure 7: Bank balance sheet, Basel regulatory and model linkage
2.2 Problem formulation
The objective is to devise a generic model for the loan portfolio loss distribution and to evaluate the corresponding credit-VaR and the proposed economic capital consistent with downturn economic conditions. To evaluate the performance of the model against the Basel accord, a sample of bank loans is constructed, and the required capital generated by the model, the proposed portfolio structure and the management strategy for lending are compared with those suggested by the Basel framework.
2.2.1 Inputs and assumptions
The following table illustrates the main inputs and the associated models for each variable. The main interactions are between the default rates themselves and between LGD and default rates. Moreover, the correlations are also modelled as dependent on default rates.
Row | Input | Model | Remark | Basel
1 | Probability of Default (PD) | Historical/rating agencies; Merton; Black-Cox; Brigo AT1P | - | Merton
2 | Loss Given Default | Beta distribution | Stochastic | Deterministic
3 | Default correlation | Beta distribution | J. Lopez study21 | J. Lopez study
4 | Cor(default rate_i, default rate_j) | t-student copula | Vasicek extension | Gaussian copula
5 | Cor(Market, LGD_i) | Clayton copula | Historical data | Independent
6 | Cor(rho_i, M) | Gaussian copula | Stochastic | Deterministic
7 | Exposure | Pure discount loan | Deterministic / loan principal | Deterministic / loan principal
Table 4: problem inputs
2.2.1.1 Credit exposure
Literally, the future credit exposure at time $t$ is defined as the total positive exposure of the bank at time $t$ if the counterparty corporation defaults, assuming a zero recovery rate. The current exposure is simply known at time $t = 0$, and in this case it is the loan principal. The exposure is normally calculated at the trade level and at the counterparty level for a single client. However, in contrast to calculating exposure for derivative contracts, which bears a level of complexity, due to the nature of a loan the exposure is nearly deterministic, specifically for short terms, where there is a trivial probability of a change or shift in the term structure of interest rates, and particularly when the contract is fixed rate. In the bank under study, the loans granted are mainly pure discount loans, where the principal and the corresponding interest accumulate to be paid at maturity; hence such a loan can be treated as a ZCB. Conventionally, the Exposure at Default (EAD) is assumed to be the loan principal.
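For concreteness, a minimal sketch of the claim accumulated by such a pure discount loan at maturity (hypothetical figures; the thesis itself takes EAD as the loan principal):

```python
def pure_discount_loan_claim(principal, rate, years):
    """Accumulated claim of a pure discount loan at maturity: principal plus
    interest compounded annually (a hypothetical illustration)."""
    return principal * (1.0 + rate) ** years

print(pure_discount_loan_claim(100.0, 0.18, 1.0))   # hypothetical 18% loan rate
```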
2.2.1.2 Probability of default
A default event is an event where the counterparty cannot meet its payment obligations to the bank for some reason. Several credit events might lead to default, including bankruptcy, when a company becomes insolvent; failure to pay within a reasonable amount of time after the due date (usually 90 days); a significant downgrade of the credit rating; a credit event after a merger, when the new merged entity is financially weaker than the original entity; and, finally, government action or market disruption, typically confiscation of assets or the effects of war. These events are often categorized as being driven by either market risk or company-specific risk22.
There are three methods to extract the term structure of default rates for a risky entity: obtaining historical default information from rating agencies like Moody's; using structural models like Merton, Black-Cox, etc.; and, finally, the implied approach based on current market data, which resembles extracting implied volatility from current market option prices and is considered the most reliable source for constructing the default term structure23, since current market information reflects the market's agreed perception of its future evolution, and the default rates derived may differ from historical default rates.
The primary advantage of using rating agencies' information is the ease and accessibility of determining ratings for issuers. However, ratings are not perfect, specifically for new structured products, which have been prone to severely inaccurate assessment24. Moreover, agencies do not have the capacity to constantly monitor and update their ratings in real time, and their assessments often lag behind the market. The most serious issue is the applicability of such tables in the Iranian market, given its different market structure, country risk, recovery rates, etc. Although the meaning of an assigned rating, e.g. "B", is standard by definition, the default probability is not static, and default intensities and spreads definitely change over time. Using a diffusion process to describe changes in the value of the firm, Merton 1974 demonstrates that default can be modeled based on Black-Scholes option pricing technology, and Black-Cox came up with a first-passage-time model, applying barrier option technology.
22 Credit derivatives, a primer on credit risk, modelling and instruments, CSMD, page 18
23 David Li 1999, On default correlation: a copula approach
24 Financial simulation modelling, Keith A. Allman, Josh Laurito, Michael Loh, page 111
The alternative frequently used method is the implied approach, using observable market information. Two general markets can be used for this purpose: credit default swaps (CDS) and the bond market. The bank loan market is another option; however, its lower liquidity and the difficulty of obtaining information make it practically unappealing. For more frequently traded CDS with different tenors, bootstrapping is done to calculate the implied default probability for each year. The process can be carried out using bond prices as well. Bonds, however, have additional layers of complexity: their variety, fixed/floating differences, optionality, different covenants and different payment schedules all make modelling bonds more difficult than CDS. Moreover, the probabilities obtained are higher than physical default probabilities, since they encapsulate risks other than default alone.
Li 1998 presents one approach to building the credit curve from market information, based on the default treatment of Duffie and Singleton 1995, and obtains a yield spread curve over Treasuries. The credit curve construction is then based on this spread yield curve and on exogenous assumptions about the recovery rate, based on the seniority and rating of the bonds and the industry of the corporation. Since there is neither a market for CDS nor for bonds in Iran, matching a comparable company overseas with one in the local market does not seem to be a reliable solution; however, this method is currently used by the bank. By and large, the most applicable method the bank can implement lies in the field of structural models, which extract default probabilities from the information available in the financial statements of the counterparty corporation. As far as the bank has access to these data, this category of models is more appealing in practice. However, structural models assume a listed company whose equity value is easily observable from the market; unfortunately, this is not the case in the Iranian market, where most counterparties' stocks are not traded on the exchange, so some valuation analysis has to be done in advance.
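To illustrate the implied approach itself (not the bank's procedure, since no liquid CDS or bond market is available here), a minimal sketch using the standard credit-triangle approximation with hypothetical inputs:

```python
import math

def implied_hazard_rate(cds_spread, recovery):
    """Credit-triangle approximation of a flat hazard rate implied by a CDS
    spread: lambda ~ s / (1 - R).  A full bootstrap over several CDS tenors
    would refine this rough estimate."""
    return cds_spread / (1.0 - recovery)

def cumulative_default_probability(hazard, horizon):
    """Cumulative default probability up to `horizon` years for a flat hazard."""
    return 1.0 - math.exp(-hazard * horizon)

# Hypothetical quote: 150 bp CDS spread, 40% recovery
lam = implied_hazard_rate(0.015, 0.40)
print(lam, cumulative_default_probability(lam, 1.0), cumulative_default_probability(lam, 5.0))
```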
Brigo 2004 proposed a structural model independent of the current value of the company's assets, which makes it appealing for the case of this study. However, in the Basel framework the Vasicek model is applied, which is built on Merton's insight. To calculate the probability of default consistently with Basel, the Merton model and the Vasicek process are introduced below, and a detailed technical review of the selected model, AT1P, is presented in Appendix B.
2.2.1.2.1 Merton model
Merton 197425 assumes the firm's liabilities are a single ZCB and interprets default as the event that the asset value is below the ZCB face value at maturity. Despite its simplicity, it is a widely used model in industry. The following graph illustrates the model.
Figure 8: Merton structural model
Referring to the Black-Scholes option pricing framework, the payoff to creditors is

$D(V_T, T) = \min(V_T, D) = D - (D - V_T)^+$   (1)

Hence creditors are short a put option written on the assets of the borrowing firm with a strike price equal to $D$, the face value of debt. Based on put-call parity, equity is a call option on the firm's assets,

$D(V_t, t) = P(t, T) - \mathrm{Put}_{BS}(V_t, D, r, T - t, \sigma)$ and $E(V_t, t) = \mathrm{Call}_{BS}(V_t, D, r, T - t, \sigma)$   (2)
Merton's insight suggests that the spread between credit-risky debt and otherwise identical risk-free debt is simply the value of this put option. Based on this insight, the main determinants of the credit spread are the maturity of the debt, the leverage $D$, and the business risk of the firm's assets $\sigma$. Following the model, the spread over the risk-free rate is obtainable from the price of a defaultable ZCB, which is

$P^{ZCB}_{defaultable} = (1 - PD)\, e^{-yT} F$ 26   (3)

and the credit spread is $s = y - r_f$, while $(1 - PD)$ is the survival probability. For further discussion please refer to Appendix A.
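A minimal sketch of these formulas with hypothetical inputs (the standard Black-Scholes-Merton expressions; see Appendix A for the full treatment):

```python
import math
from scipy.stats import norm

def merton_pd_and_spread(V0, D, r, sigma, T):
    """Merton-model risk-neutral probability of default N(-d2) and the
    credit spread implied by the resulting price of the risky ZCB."""
    d1 = (math.log(V0 / D) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    pd = norm.cdf(-d2)                                   # P(V_T < D) under Q
    equity = V0 * norm.cdf(d1) - D * math.exp(-r * T) * norm.cdf(d2)
    debt = V0 - equity                                   # value of the risky ZCB
    y = math.log(D / debt) / T                           # continuously compounded yield
    return pd, y - r                                     # PD and credit spread s = y - r

# Hypothetical firm: assets 120, debt face 100, 25% asset volatility, 1-year horizon
print(merton_pd_and_spread(120.0, 100.0, 0.05, 0.25, 1.0))
```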
25 Merton 1974
26 F is the face value, y the yield and PD the probability of default
Practitioners apply the model with some adjustments in their interpretation of the data from the company's financial statements. For instance, since credit-VaR conventionally targets a one-year horizon, the debt in the model is taken as the short-term liabilities plus half of the long-term debt. Moreover, any interim cash outflows within the year, such as interest or dividends, are accrued to the year end and added to the short-term debt due. Besides, in the presence of covenants, the barrier can be defined consistently with the safety covenant. A further issue is that, to obtain the asset value and volatility, the equality
$\sigma_E E_0 = N(d_1)\,\sigma_A V_0$
(4)
holds only instantaneously, which makes the formula practically unattractive. One approach to obtaining the asset volatility is an iterative method (Crosbie and Bohn 2002, Moody's KMV): run the model over a specific historical period, e.g. one year, extract $V_t$ for each time interval, and then calculate the standard deviation of the asset values over the period. The last $V_t$ then serves as $V_0$ for the problem. The alternative method relies on calibration to CDS market data by matching survival probabilities from the model to market CDS prices. Although both solutions are challenging in the Iranian market for the reasons mentioned above, estimating the volatility from a comparable traded company in the corresponding local industry through the iteration method is more reliable than calibrating to CDS data overseas.
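A minimal R sketch of this iteration is given below; it is an illustrative reading of the Crosbie-Bohn procedure under stated assumptions (daily equity observations, a one-year horizon, and hypothetical function names), not the bank's implementation.

# KMV-style iteration: given a history of equity values E, a debt barrier D and
# a risk-free rate r, invert E_t = Call_BS(V_t, D, r, T, sigma) for V_t,
# re-estimate sigma from the implied asset series, and repeat to convergence.
bs_call <- function(V, D, r, T, sigma) {
  d1 <- (log(V / D) + (r + 0.5 * sigma^2) * T) / (sigma * sqrt(T))
  d2 <- d1 - sigma * sqrt(T)
  V * pnorm(d1) - D * exp(-r * T) * pnorm(d2)
}

implied_assets <- function(E, D, r, T, sigma) {
  # invert the call-price relation for the asset value, one observation at a time
  sapply(E, function(e)
    uniroot(function(V) bs_call(V, D, r, T, sigma) - e,
            interval = c(0.5 * e, 2 * (e + D)))$root)
}

kmv_iterate <- function(E, D, r, T = 1, tol = 1e-6, max_iter = 100) {
  sigma <- sd(diff(log(E))) * sqrt(252)        # start from annualised equity volatility
  for (i in seq_len(max_iter)) {
    V <- implied_assets(E, D, r, T, sigma)
    sigma_new <- sd(diff(log(V))) * sqrt(252)  # annualised asset volatility
    if (abs(sigma_new - sigma) < tol) break
    sigma <- sigma_new
  }
  list(V0 = tail(V, 1), sigma_V = sigma)       # the last V_t serves as V_0
}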
2.2.1.2.2 Vasicek model
The Vasicek model is essentially derived from Merton; the difference is that, instead of taking the firm's liabilities to obtain the probability of default (PD), the default probability is given and the debt level is inferred from the PD. To derive the probability of default for a firm while taking systematic risk into account, the Merton asset values are assumed to follow,
$dV_t = (\mu_t - k_t)\,V_t\,dt + \sigma_V V_t\,dS_t + \beta_V V_t\,dB_t$
(5)
where $k_t$ is the payout ratio.
Here $\sigma_V$ is the sensitivity of the asset value to systematic risk and $\beta_V$ is the sensitivity to idiosyncratic risk. Moreover, $dS_t$ is a Wiener process associated with systematic risk and follows $N(0, dt)$, and $dB_t$ is a Wiener process of $N(0, dt)$ associated with idiosyncratic risk. In this
model, $\sigma_V$, $\beta_V$ and $\mu_t$ are assumed constant and $dB_t$, $dS_t$ are independent Wiener processes. The value of the assets at time $t$ is,
(6)
Taking a time horizon of one year, $T = 1$, $B_1$ and $S_1$ follow the $N(0,1)$ distribution and the above equation simplifies to,
(7)
To model the default event, a binary random variable $D$ is introduced as follows; $D = 1$ means that default has occurred, i.e. that after one year the asset value has fallen below the debt level.
$D = \begin{cases} 1, & \text{with probability } PD \\ 0, & \text{with probability } 1 - PD \end{cases}$
(8)
Thus, PD could be expressed as,
(9)
Since $B_1$ and $S_1$ follow $N(0,1)$ and are independent, $\sigma_V S_1$ and $\beta_V B_1$ are $N(0, \sigma_V^2)$ and $N(0, \beta_V^2)$ respectively; hence $\beta_V B_1 + \sigma_V S_1$ is $N(0, \sigma_V^2 + \beta_V^2)$ distributed.
So, by standardizing the random variable,
(10)
Now $\rho_V = \frac{\sigma_V^2}{\sigma_V^2 + \beta_V^2}$ is defined as the proportion of systematic risk; hence, from the above equation,
(11)
The probability of default can then be rewritten as,
(12)
which is equivalent to,
(13)
Hence, $D$ takes the value 0 or 1 if,
(14)
If an estimate of the firm's probability of default is available, then given the systematic factor the conditional probability of default is,
(15)
and, since $B_1$ follows $N(0,1)$,
(16)
This is called the conditional probability of default of the firm (CPD). In order to account for tail dependence in default rates and default correlations, the Vasicek model is extended as follows.
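As a compact illustration of the conditional default probability just described, the following R sketch evaluates the standard Vasicek expression $CPD(M) = \Phi\!\left(\frac{\Phi^{-1}(PD) - \sqrt{\rho}\,M}{\sqrt{1-\rho}}\right)$ for a few values of the systematic factor; it is an illustrative rendering in the notation above, not a reproduction of the thesis code.

# Vasicek conditional probability of default: PD is the unconditional one-year
# PD and rho the proportion of systematic risk defined above.
conditional_pd <- function(PD, rho, M) {
  pnorm((qnorm(PD) - sqrt(rho) * M) / sqrt(1 - rho))
}

# Example: PD = 2%, rho = 0.20, in a bad (M = -2), neutral and good year
conditional_pd(0.02, 0.20, c(-2, 0, 2))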
2.2.1.3 Default correlation
There is general agreement that the state of a country's economy has a direct impact on observed default rates. A report by Standard and Poor's stated that "a healthy economy in 1996 contributed to a significant decline in the total number of corporate defaults. Compared to 1995, defaults were reduced by one-half …" (Standard and Poor's 1997), and a report by Moody's Investors Service (1996) stated that "the sources of default rate volatility are many, but macroeconomic trends are certainly the most influential factors".
The default correlation of two risky entities can be defined with respect to their survival times (or times-to-default) $T_A$ and $T_B$,
$\rho_{AB} = \frac{\mathrm{Cov}(T_A, T_B)}{\sqrt{\mathrm{Var}(T_A)\,\mathrm{Var}(T_B)}}$
(17)
When studying the expected loss in a multi-name loan portfolio, the objective is to extract the loss distribution. There are different methods for correlating the default likelihoods of two entities, such as correlating the stochastic processes of their assets using the Heston (1993) approach. Heston applied the method to negatively correlate stochastic stock returns and stochastic volatility. Default correlation is introduced by correlating the two Brownian motions $dz_1$ and $dz_2$; the instantaneous correlation between them is
$\mathrm{Corr}[dz_1(t), dz_2(t)] = \rho\,dt$
(18)
The Heston correlation approach is a dynamic, versatile and mathematically rigorous correlation model. It allows stochastic processes to be positively or negatively correlated and permits dynamic correlation modeling, since $dz(t)$ is a function of time. Thus, it is an integral part of
correlation modelling in finance (Meissner 2013). Moreover, when applying reduced-form models with stochastic hazard rates, one can correlate the stochastic processes of the default intensities and generate Heston-correlated default probabilities as well. The alternative approach is to use copulas to obtain the joint distribution of the risky entities with the desired correlation. Copula functions allow multiple univariate distributions to be joined into a single multivariate distribution. Numerous types of copula functions exist; among the most popular are the Gaussian and t-Student copulas from the elliptical family, and the Clayton and Gumbel copulas from the Archimedean family.
Figure 9: Copula models, source: http://www.assetinsights.net/Glossary/G_Clayton_Copula.html
Following the above definition of correlation is cumbersome, since it requires $\binom{N}{2}$ pairs of correlations when there are $N$ counterparty corporations. Moreover, to incorporate the systematic risk of default, which usually materializes in recessions and economic downturns, pairwise correlations alone are not enough. Basel II puts the groundwork for capital adequacy on the Vasicek model to account for default correlations and the Credit-VaR calculation. Vasicek (1987) proposed a one-factor Gaussian copula which correlates default probabilities via the asset values. He assumes,
$X_i = \rho_i M + \sqrt{1 - \rho_i^2}\,U_i$
(19)
where $M$ and $U_i$ follow Wiener processes, so that by construction the Wiener process of $X_i$ has a common factor $M$ and an idiosyncratic factor $U_i$. Over the one-year horizon of the credit analysis, the Wiener processes $M$, $U_i$ and $X_i$, with distribution $N(0, dt)$, transform into standard Normal variables. The $\rho_i$ are random (but determined in each period) weights between the common factor and $U_i$, while the $U_i$ are independent of each other and of $M$. $M$ can be modelled as a factor that defines the defaulting environment: when $M$ is low, the $X_i$ tend to be low and the rate at which defaults occur
is relatively high, and the reverse is true when $M$ is high. One possible proxy for $M$ is a variable modelling the evolution of a well-diversified stock index such as the Tehran Exchange Index, TEPIX. Hence, in the one-factor copula framework, instead of defining $\binom{N}{2}$ pairwise correlations, the entities are correlated implicitly. The binary default variable is defined as,
$D_i = 1 \ \text{if}\ X_i \le H_i^{*}$ : Default
$D_i = 0 \ \text{if}\ X_i > H_i^{*}$ : No default
(20)
where $H_i^{*}$ is the default threshold (please refer to Appendix A for further discussion).
And the joint default probability is,
$PD_{ij} = \mathrm{Prob}(X_i \le H_i^{*},\ X_j \le H_j^{*})$
(21)
Moreover, the correlation between the assets of companies $i$ and $j$ is,
$\rho_{ij}^{asset} = \frac{\mathrm{Cov}\!\left(\rho_i M + \sqrt{1-\rho_i^2}\,U_i,\ \rho_j M + \sqrt{1-\rho_j^2}\,U_j\right)}{\sigma(X_i)\,\sigma(X_j)} = \frac{\mathrm{Cov}(\rho_i M, \rho_j M)}{1 \times 1} = \rho_i\rho_j\,\mathrm{Var}(M) = \rho_i\rho_j$
(22)
The parameter $\rho_i$ defines how sensitive the probability of default of company $i$ is to the common factor: the higher $\rho_i$, the more company $i$ is influenced by the common factor $M$. Consequently, the joint probability of default is,
$PD_{ij} = \Phi_2\!\left(H_i^{*}, H_j^{*}, \rho_{ij}^{asset}\right)$
(23)
where $\Phi_2$ is the bivariate cumulative Normal distribution, so that defaults are correlated by a Gaussian copula.
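For concreteness, a minimal R sketch of default simulation under this one-factor Gaussian copula is shown below; the function name and vectorised form are illustrative assumptions, not the bank's implementation.

# One-factor Gaussian copula (equations 19-20): draw the common factor M and
# idiosyncratic shocks U_i, build X_i, and flag default where X_i falls below
# the threshold H_i = qnorm(PD_i).
simulate_defaults_gaussian <- function(PD, rho, n_sims = 1e4) {
  N <- length(PD)
  H <- qnorm(PD)                             # default thresholds
  defaults <- matrix(0L, n_sims, N)
  for (s in seq_len(n_sims)) {
    M <- rnorm(1)                            # common (systematic) factor
    U <- rnorm(N)                            # idiosyncratic factors
    X <- rho * M + sqrt(1 - rho^2) * U       # latent asset variables
    defaults[s, ] <- as.integer(X <= H)
  }
  defaults
}

# Example: 5 loans with 2% PD and factor loading 0.4
d <- simulate_defaults_gaussian(rep(0.02, 5), rep(0.4, 5))
colMeans(d)   # simulated default frequencies, close to the input PDs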
The Gaussian copula was seriously blamed as one of the fundamental causes of the global financial crisis, because it underestimates default correlations in such situations. This goes back to the nature of the bivariate Normal distribution, which exhibits no tail dependence for any value of the correlation parameter (a bivariate copula has tail dependence if $\lim_{y_1 \downarrow 0,\, y_2 \downarrow 0} \mathrm{Prob}\!\left[\tau_2 < N^{-1}(y_2) \mid \tau_1 < N^{-1}(y_1)\right] > 0$, where $\tau_i$ are the default times and $y_i$ the cumulative distribution of $\tau_i$), whereas such behavior, called negative-negative tail dependence, is observed in economic downturns, when companies' tendencies to default increase all together. Contrary to the Gaussian copula, the t-Student copula satisfies the tail-dependence condition and is more desirable for financial-crisis modeling. (Student's t-distribution with $df$ degrees of freedom can be defined as the distribution of the random variable $T = Z/\sqrt{Y/df}$, where $Z$ is standard Normal with expected value 0 and variance 1, $Y$ has a chi-squared distribution with $df$ degrees of freedom, and $Z$ and $Y$ are independent.) The following graphs compare the standard Normal and t-Student distributions for various degrees of freedom; the heavier tails of the t-distribution are clearly observable.
Figure 10: Standard Normal vs. t-student tails
Figure 11: Gaussian Copula vs. t-Copula with df=1, source: http://www.assetinsights.net/Glossary/G_Clayton_Copula.html
In the Vasicek model, in addition to this weakness in modeling tail dependence, the assumptions go further: not only is the pairwise correlation assumed constant and the same for all entities, but the same probability of default is also taken for all entities in the portfolio. To incorporate tail dependence, a one-factor t-Student copula is used here as the preferred alternative. A multivariate t-Student distribution with $df$ degrees of freedom is obtained when multivariate standard Normal variables $X_i$ are divided by $\sqrt{Y/df}$, where $Y$ is a chi-squared variable with $df$ degrees of freedom.
$t_i = \frac{X_i}{\sqrt{Y/df}}, \qquad X_i \sim N(0,1), \qquad Y \sim \chi^2(df)$
(24)
To implement the model, each $X_i$ is determined according to the one-factor model in equation (19) and then divided by $\sqrt{Y/df}$ to obtain the t-Student asset-value variable. For small $df$ this can dramatically increase default correlations. Default occurs once the asset variable (here $t_i$) falls below a threshold; for instance, when $Y/df$ is smaller than one, since each $X_i$ is divided by the same $\sqrt{Y/df}$, the asset values of all counterparties become more extreme, which increases the probability of observing more defaults. Besides, the default thresholds transform from $\Phi^{-1}(PD_i)$, the standard Normal inverse, to the t-Student inverse with $df$ degrees of freedom (in R, qt(PDi, df); rchisq(n, df) generates n chi-squared random variables).
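Extending the Gaussian sketch above, a one-factor t-copula simulation can be written along the following lines; this is again an illustrative sketch, using qt and rchisq as hinted in the text, not the thesis code.

# One-factor t-copula (equation 24): the Gaussian latent variables X_i are
# divided by sqrt(Y/df), with a single chi-squared draw Y shared by all
# counterparties in a scenario, which fattens the joint tails.
simulate_defaults_t <- function(PD, rho, df = 3, n_sims = 1e4) {
  N <- length(PD)
  H <- qt(PD, df)                            # t-Student default thresholds
  defaults <- matrix(0L, n_sims, N)
  for (s in seq_len(n_sims)) {
    M <- rnorm(1)
    U <- rnorm(N)
    X <- rho * M + sqrt(1 - rho^2) * U       # Gaussian one-factor part
    Y <- rchisq(1, df)                       # common chi-squared mixing variable
    t_i <- X / sqrt(Y / df)                  # t-distributed asset variables
    defaults[s, ] <- as.integer(t_i <= H)
  }
  defaults
}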
Moreover, contrary to Basel, where the correlation is assumed to be the same between all pairs, the sensitivity to the common factor, $\rho$, is modeled here as different for each entity. Hull and White (2004) suggest the correlation between the counterparty's equity returns and the market return as a proxy for $\rho$. This allows not only a specific correlation for each entity, but the pairwise correlation also becomes the product $\rho_i\rho_j$. However, here the correlations are not treated as exogenous variables: they come from the relationship between default rates and correlations studied by Lopez (2004),
(25)
Hull approximated the expression with a high level of accuracy through,
(26)
To calculate the probability of default and the default correlation, the corresponding input data must be collected,
Inputs to the model | Source to collect
Current market value of equity (E0) | Current share price × total number of shares, or equity valuation methods (e.g. residual income)
Current market value of assets (V0) | Iteration method (KMV)
Asset volatility (σA) | Comparable traded company analysis via the KMV iteration method
Default barrier | Conventionally STL + ½ × LTL, or the covenant level; in the Merton model the barrier is the short-term liabilities plus all interim cash flows (interest, dividends, etc.) accrued to the end of the year
Payout ratio (if any) | Financial statements
LGD | Beta distribution calibrated to the historical database or industry statistics
Maturity | Given by the loan profile (here assumed 1 year)
Asset growth rate | CAPM
Common factor sensitivity | ρi = Cor(rE, rM) = Cov(rE, rM)/(σ(rE) σ(rM)), or from Lopez (2004)
Table 5: PD and asset correlation inputs and sources to collect
Based on empirical evidence, asset correlations are stochastic and tend to increase when default rates are high. Servigny and Renault (2002) find that correlations are higher in recessions than in expansion periods; similar results are obtained by Das, Freed, Geng and Kapadia (2004). Likewise, Ang and Chen (2002) find that the correlation between equity returns is higher during market downturns. Hull and White (2010) suggest a Beta distribution for the correlation parameter in order to test the impact of stochastic correlation. The Beta distribution is the same for all $i$, and the dependence is modeled by a Gaussian copula: a sample is taken from a standard Normal variable $A$ correlated with $M$ (in Hull-White this correlation is set to $-\sqrt{0.5}$, as is the case here), and $\rho_i$ is then set equal to the quantile of the Beta distribution corresponding to the quantile of $A$ under the standard Normal. Hence, in an economic downturn $M$ falls and, by construction, a higher sensitivity factor is drawn from the Beta distribution. A negative correlation between $M$ and $\rho_i$ corresponds to a positive correlation between default rates and correlations.
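One way to realize this negative correlation between $M$ and $\rho_i$ is sketched in R below; the $-\sqrt{0.5}$ weight follows the text, while the Beta shape parameters and the function name are illustrative assumptions only.

# Stochastic factor loading rho_i: draw A correlated with the market factor M
# (weight -sqrt(0.5), so corr(M, rho_i) < 0), map A to its standard-Normal
# quantile, and read off the same quantile of an illustrative Beta distribution.
draw_rho <- function(M, n, shape1 = 2, shape2 = 5) {
  A <- -sqrt(0.5) * M + sqrt(0.5) * rnorm(n)
  qbeta(pnorm(A), shape1, shape2)
}

# In a bad year (M = -2) the drawn loadings are higher on average
mean(draw_rho(-2, 1e5)); mean(draw_rho(2, 1e5))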
2.2.1.4 Loss given default (LGD)
A model for LGD (one minus the recovery rate) should be able to capture both the general characteristics described in empirical studies and the idiosyncratic features of the specific debt in the bank. According to the distribution of historical recovery data, lower LGD rates are more likely than higher ones, and LGDs have historically varied with the business cycle, which implies that they are contingent on the overall state of the economy as well.
Figure 12: Beta distribution
Furthermore, knowing the industry of a company gives guidance about recovery ratios as well. For example, large industrial or consumer-goods firms with substantial fixed assets to support the debt often have higher recoveries. On the other hand, banks and financial institutions are assumed to have lower recoveries, since they are often taken over by governments that protect depositors and policyholders to the detriment of creditors. Moreover, companies in the same industry usually have similar capital structures, which gives guidance on what recoveries can be expected. Acharya, Bharath and Srinivasan (2003) found that when industries are in distress, mean LGD is on average 10% to 20% higher than otherwise.
Table 6: industry impact on recoveries, Acharya et al. 2003
Geise (2005) suggests a conditional Beta distribution to model loss given default. Although the recovery distribution extends beyond one in the historical density graph, this is a rare case that occurs only for bonds. To make the model consistent with the empirical facts, LGD should be conditional on the common market factor, like the market index in the case of default rates, and should also capture the characteristics of each industry with regard to seniority and so on. To do so, the parameters of the Beta distribution should be calibrated to the industry/seniority statistics, and the newly
calibrated model is then applied in the copula for each specific counterparty corporation. According to the results presented by Peraudin (2002), the correlation between recoveries and aggregate default rates for the US is about -20% on average and about -30% when considering only the tails. Therefore, in order to model the negative-negative tail dependence of LGD and defaults, the Clayton copula is used to generate bivariate random variables with the desired correlation. The copula outputs the standard Normal and the calibrated Beta random variables as marginal distributions with the required correlation structure. Since the asset return is contingent on the common market factor, the LGD is indirectly correlated with the systematic factor as well. For example, in a weak market or recession, e.g. $M = -2$, the associated asset return will be very low by construction, and the Clayton copula will subsequently generate a correlated, very low recovery rate. The detailed process is as described for the asset correlation, but here the Clayton copula is applied to the recoveries. Similarly, $A_i = \sqrt{0.5}\,M + \sqrt{0.5}\,U_i$ is the random variable correlated with $M$, with an arbitrary weight of $\sqrt{0.5}$ that distributes the weights equally between $M$ and the idiosyncratic factor. Mapping $A_i$ onto its CDF gives the corresponding quantile; a uniform random variable $U(0,1)$ is then generated for the partial derivative of the Clayton copula to give $v$, the marginal distribution of the recovery rate, which is mapped onto the Beta distribution at the associated quantile. The result is a random recovery rate from the Beta distribution with the desired correlation pattern with the market factor. A sample is generated with $\alpha = 2$, corresponding to a Kendall $\tau = 0.5$ (the empirical correlation); this parameter can be estimated from a regression on historical default-rate and recovery-rate data. The sample simulation below is generated in R-studio; the LGDs are taken from $Beta(2,8)$.
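A minimal R rendering of this Clayton-copula step is sketched below, assuming the $\alpha = 2$ and $Beta(2,8)$ parameters quoted above; the conditional-inversion formula is the standard one for the Clayton copula, and how the Beta margin is read (recovery rate versus LGD) follows the textual description rather than a reproduction of the thesis code.

# Conditional draw from a Clayton copula (alpha = 2, Kendall tau = 0.5): given
# the first margin's quantile u (the quantile of the asset-return variable A_i),
# invert the conditional copula with an independent uniform w to get the second
# margin v, then map v onto the calibrated Beta distribution.
clayton_conditional <- function(u, alpha = 2) {
  w <- runif(length(u))
  (u^(-alpha) * (w^(-alpha / (alpha + 1)) - 1) + 1)^(-1 / alpha)
}

set.seed(1)
M   <- rnorm(1e4)                             # market-factor scenarios
A   <- sqrt(0.5) * M + sqrt(0.5) * rnorm(1e4) # asset-return variable correlated with M
v   <- clayton_conditional(pnorm(A))          # second margin from the Clayton copula
lgd <- qbeta(v, 2, 8)                         # mapped onto the Beta(2,8) margin
cor(M, lgd, method = "kendall")               # dependence with the market factor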
Figure 13: LGD and default rate dependence model by Clayton Copula
The robustness of the model originates from its consistency with the empirical evidence on negative-negative tail dependence and from the calibrated Beta distribution, which captures the industry and seniority characteristics of the loan. In other words, all drivers of the recovery rates are appropriately modelled in this procedure.
2.2.2 Methodology
The bank has provided a sample of 197 financial statements and the corresponding ratings of counterparty corporations in five sectors: manufacturing, service, real estates, domestic trade and international trade (named "trade" in the thesis).
There are $N$ counterparty corporations with assets $V_i$, $1 \le i \le N$. Initially, the probability of default is calculated based on historical data or the Merton, Black-Cox or AT1P models. The expected loss for each individual counterparty at time $T = 1$ is,
$EL_i = LGD_i \times EAD_i \times D_i(T)$
(27)
Each asset $i$ can be in one of two states at the given horizon $T$: defaulted or not. As an indicator of the asset's status, $D_i(T)$ is a binary variable, zero in case of survival and one if default happens. Hence, based on the model, $D_i(T) = 1$ if $X_i \le H_i^{*}$ and 0 otherwise, and the total expected loss of the portfolio is,
$EL(T) = \sum_{i=1}^{N} EL_i$
(28)
An illustrative flow chart of the methodology is as follows: the recovery rate is stochastic, follows a Beta distribution and is correlated with the PDs via a Clayton copula; the PDs are stochastic, come from the one-factor model and are correlated via a t-copula; the default correlations are stochastic, follow a Beta distribution and are correlated via a Gaussian copula; the exposure is deterministic (the loan principal). These inputs feed the portfolio expected loss $EL(T) = \sum_{i=1}^{N} EL_i$ and the portfolio Credit Value at Risk.
Figure 14: methodology flow chart
The pseudo code for the simulation process is presented in Appendix D. In the following, the Basel methodology for capital adequacy is reviewed together with its assumptions and the initial versions of the extended models. In the next chapter the proposed model is implemented and its outputs are compared with the results from Basel.
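To make the simulation flow concrete before turning to Basel, one Monte Carlo pass is compressed into the R sketch below; it reuses the illustrative functions draw_rho and clayton_conditional sketched earlier and is a simplified stand-in for the pseudo code in Appendix D, not the bank's implementation.

# One Monte Carlo pass of the portfolio loss (equations 27-28), combining the
# stochastic loadings, t-copula defaults and Clayton-linked LGDs sketched above;
# PD and EAD are vectors over the N counterparties.
portfolio_loss_once <- function(PD, EAD, df = 3) {
  N   <- length(PD)
  M   <- rnorm(1)                                 # systematic factor
  rho <- draw_rho(M, N)                           # stochastic factor loadings
  X   <- rho * M + sqrt(1 - rho^2) * rnorm(N)     # latent asset variables
  t_i <- X / sqrt(rchisq(1, df) / df)             # t-copula transformation
  D   <- as.integer(t_i <= qt(PD, df))            # default indicators D_i(T)
  A   <- sqrt(0.5) * M + sqrt(0.5) * rnorm(N)     # state variable for the LGD link
  lgd <- qbeta(clayton_conditional(pnorm(A)), 2, 8)
  sum(lgd * EAD * D)                              # loss realisation, equation (28)
}

losses <- replicate(1e4, portfolio_loss_once(rep(0.02, 197), rep(3e4, 197)))
quantile(losses, 0.999)                           # 99.9% Credit-VaR estimate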
2.3 Basel Asymptotic Risk Factor Approach (ARFA)
The ARFA approach is used by the Basel framework to compute the capital needed to prevent the bank from going bankrupt over a one-year period with probability of at least $q = 0.999$. In the formula, PD is the probability of default, assumed the same for all exposures, $\rho_V$ is the correlation of the firm's assets with the systematic common factor, and $\Phi^{-1}$ is the inverse of the standard Normal distribution,
(29)
To derive the formula, Basel makes the following assumptions,
1. The portfolio is sufficiently fine-grained that the idiosyncratic risk is diversified away and only the systematic risk remains, which is why it is called a single-factor model.
2. Firms' assets are correlated with the systematic risk factor, which is $N(0,1)$ distributed.
3. The loss given default is assumed constant and the same for every exposure.
4. The loans generate no cash flows.
In practice it is virtually impossible to find a portfolio in which the LGDs, PDs and correlations are the same for all exposures. Hence, to compute a more accurate 99.9% CVaR per unit of exposure, Gordy (2003) and Pykhtin and Dev (2002) suggest,
(30)
where index $i$ represents the corresponding PD, LGD and asset correlation of firm $i$ with the systematic risk, and $\omega_i = \frac{EAD_i}{\sum_{i=1}^{N} EAD_i}$ is the exposure weight. The formula still relies on the earlier assumptions of fine granularity and a single systematic factor.
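Written out, the asymptotic single risk factor expression described above can be sketched in R as follows; the closed form used here is the standard Vasicek/Gordy worst-case default rate, offered as an illustration of what equations (29)-(30) describe rather than a transcription of them.

# ASRF 99.9% credit VaR per unit of exposure: a weighted sum over exposures of
# LGD_i times the worst-case (99.9%) conditional default rate of firm i.
asrf_cvar <- function(PD, LGD, rho, EAD, q = 0.999) {
  wcdr <- pnorm((qnorm(PD) + sqrt(rho) * qnorm(q)) / sqrt(1 - rho))
  w    <- EAD / sum(EAD)                  # exposure weights omega_i
  sum(w * LGD * wcdr)                     # CVaR per unit of total exposure
}

# Example: homogeneous portfolio with PD = 2%, LGD = 45%, rho = 0.2
asrf_cvar(rep(0.02, 100), rep(0.45, 100), rep(0.2, 100), rep(1, 100))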
Furthermore, Schonbucher (2002a) and Wehrspohn (2002) showed that, in the one-factor model, if the whole credit portfolio is divided into fine-grained homogeneous clusters in which all assets share the same PD, LGD, EAD, correlation and expiry date, the percentage of capital needed for the whole portfolio with only $q = 1 - \alpha$ probability of default becomes,
(31)
Accordingly, if a portfolio is constructed of homogeneous sub-portfolios, the regulatory capital needed to cover the entire portfolio is simply the sum of the amounts required to cover each sub-portfolio; here the index $k$ represents each homogeneous sub-portfolio. (This section largely follows Lionel Martin, Analysis of IRB Correlation Coefficient with an Application to Credit Portfolio, University of Uppsala, 2013.)
CHAPTER 3: IMPLEMENTATION AND RESULTS
In this part the proposed model is applied to a sample portfolio of loans from the bank. The chapter starts with managerial issues; afterwards, the Basel Economic Capital is compared with the one suggested by the model. Finally, a mathematical model is proposed for loan portfolio optimization, and the efficient frontiers and proposed portfolio structures in the frameworks of Basel and of the model are compared. The chapter investigates whether the model and Basel suggest distinct lending strategies or not.
3.1 Managerial prelude
Studying the likelihood of unexpected losses in a portfolio of exposures is fundamentally important for effective risk management. When default losses are modelled, it can be observed that the most frequent loss amount is much lower than the average, because occasionally extremely large losses are suffered, which raise the average loss. Therefore, a credit provision is required as a means of protecting against distributing excess profits during below-average loss years.
To absorb the expected loss of an exposure portfolio the bank should take appropriate pricing
methods to offer risk-adjusted rates for the loans granted. However, Economic Capital (EC) is required
as a cushion for the risk of unexpected credit default losses in the bank, because the actual level of
losses could be significantly higher than the expected loss.
Knowledge of the credit default loss distribution arising from a portfolio of exposures provides a bank with management information about the amount of capital that the bank is putting at risk by holding that credit portfolio. Given the necessity of economic capital for unexpected losses, a percentile level provides a means of determining the level of economic capital for a required level of confidence. In order to capture a significant proportion of the tail of the credit default loss distribution, the 99.9th percentile of the loss level over a one-year time horizon is conventionally a suitable definition of credit risk economic capital (Credit Suisse, CreditRisk+ document).
Figure 15: how banks treat their loan portfolio loss
It is possible to control the risk of losses that fall within each of the three parts of the loss
distribution in the following ways,
Part of loss distribution | Control mechanism
Up to Expected Loss | Adequate pricing and provisioning
Expected Loss to 99.9% Percentile Loss | Economic capital and/or provisioning
Greater than 99.9% Percentile Loss | Quantified using scenario analysis and controlled with concentration limits
Table 7: how banks treat loan portfolio loss
In the latest version of the Basel proposal for an Internal Ratings-Based (“IRB”) approach
(Basel Committee on Bank Supervision 2001), the bucketing system is required to partition
instruments by internal borrower rating; by loan type (e.g., sovereign vs. corporate vs. project finance);
by one or more proxies for seniority/collateral type, which determines loss severity in the event of
default; and by maturity. More complex systems might further partition instruments by, for example,
country and industry of borrower.
3.2 Descriptive analysis of loan portfolio
Loan data for the 197 companies provided by the bank are presented in Appendix E; this part presents descriptive statistics of the data. From the summary statistics of exposures, the minimum exposure of 2 mln belongs to a counterparty from the "domestic trade" sector, and the maximum of 362,700 mln is from this sector as well. The average exposure is around 29,320 mln and 50% of the counterparties requested a loan below 10,000 mln. The aggregate portfolio exposure is 5,776,872 mln.
Min.   1st Qu.   Median   Mean     3rd Qu.   Max.
2      3,000     10,000   29,320   30,000    362,700
A more illustrative picture of the distribution of exposures is given by the following pie chart. According to this graph, the major concentration of the bank's loan portfolio lies in the "manufacturing" and "domestic trade" sectors; "domestic trade" dominates with 37% of total exposures and "manufacturing" ranks second with 29%. "Real estates" with 15%, "trade" with 13% and "service" with 6% hold the smallest exposures, respectively.
Figure 16: Exposure distribution among sectors (domestic trade 37%, manufacturing 29%, real estates 15%, trade 13%, service 6%)
A summary of the default probabilities in the portfolio is given below. Depending on the credit rating assigned by the bank, the PDs range from about 0.02% to 17.7%.
Min.      1st Qu.   Median    Mean      3rd Qu.   Max.
0.00022   0.01166   0.04546   0.06780   0.17720   0.17720
The distribution of loan amounts within each sector is magnified in figures 17 and 18. In all sectors the exposures are highly right-skewed and hardly exceed 100 billion. However, there are some loans beyond 200 billion in "real estates" and "manufacturing", where the bank is more confident of the collateral and fixed assets on the corporations' balance sheets. This is also the case for "domestic trade", implicitly reflecting the bank's closer business interaction with corporations in domestic trade rather than with companies active in international "trade".
Figure 17: Exposure distribution in each sector
From the pie chart and the illustrative charts below, the bank tends to grant loans to the more creditworthy clients in the "trade" and "service" sectors. This is apparent from the distribution of credit grades by sector in the subsequent charts (figures 17, 18). Moreover, the frequency graph of credit ratings by sector shows that the bank's portfolio chiefly consists of loans rated "Baa", "Ba" and "B". The charts also suggest that the distribution of loan ratings differs across sectors. For instance, in "service", "domestic trade" and "real estates" the majority of loans carry ratings "B", "Ba", "Baa" and "C", whereas in "trade" and "real estates" grades of type "A" are relatively more common. Moreover, the "service" and "manufacturing" sectors carry the least creditworthy loans, rated "CCC", but this is not the case for "real estates".
Figure 18: credit rate distribution in each sector
3.3 Model evaluation
In order to ensure that the model fulfills all expectations concerning the attributes of the variables and their interactions, a sample of outputs is reviewed prior to comparing the results with Basel. The following graphs illustrate how the correlations and LGDs depend on the state of the economy. From the right-hand graph, there is a pinch at the lower right revealing a higher level of dependency between LGD and the economy in bad economic conditions, while the likelihood of co-movement declines as the economy moves to normal or expansionary conditions. Applying the Clayton copula enabled the model to capture such a dependency pattern.
Figure 19: correlations vs. economy status
Figure 20: LGD vs. economy status
Moreover, the left scatter plot demonstrates the negative dependency between the correlations and the economy index, which is created via the Gaussian copula. The t-copula modeled the tail dependence and intensified the co-movement of default rates in extreme economic booms and downturns. This is particularly the case in downturns, when companies are likely to default together. In the Gaussian copula, the probability of such co-movement in extreme cases is zero for every value of the correlation, whereas this is not the case for the t-copula.
Figure 21: default rate correlation by t-copula
Figure 22: default rate of company A and B
The graph on the left depicts the marginal distributions of the default rates for two sample companies in the loan portfolio. The pinched areas at the corners are noticeable, representing higher correlation at the extremes: when company A performs very well or very badly, it is more likely that company B behaves similarly, while this is not the case in normal situations.
3.4 Model implementation
The simulation of the model (mybankBasel() in R) is run on the bank's loan portfolio 100,000 times (with 3 degrees of freedom), and the Credit-VaR at various percentiles is compared with the 99.9% percentile suggested by Basel. To make the portfolio consistent with the Basel assumptions, the average of the default probabilities is taken as the common PD, and the copula correlation is set following J. Lopez (2004), who investigates an empirical relationship between default probabilities and asset correlations.
Figure 23: Model vs. Basel CVaR
The expected loss is 104,186 mln and the standard deviation of the portfolio loss is 179,317 mln. The 99.9% CVaR from Basel lies between the model's 95% and 99% CVaR. This implies that incorporating realities such as stochastic recoveries and correlations, as well as interactions such as the dependency of recoveries, PDs and correlations on the state of the economy, leads to a level of WCDR and economic capital that goes beyond the Basel figures.
95%            99%            99.5%          99.9%          Basel_WCDR     Basel 99.9%
4.081149e+05   8.585043e+05   1.096985e+06   1.690528e+06   3.338937e-01   5.558981e+05
Moreover, the 99.9% percentile worst-case default rate (WCDR) for Basel is 33.38%, which is close to the model's 99% percentile. According to the model, the bank can be 99.9% confident that the number of defaults in the portfolio will not exceed 54% of the total number of loans. Since the portfolio consists of 197 counterparties, with 99.9% confidence the bank will not experience more than 106 defaults within a year; this number is 41, 67 and 77 at the 95%, 99% and 99.5% confidence levels respectively. Under Basel, the WCDR corresponds to 66 defaults in a year with 99.9% confidence.
Percentile for number of defaults:
95%         99%         99.5%       99.9%
0.2131980   0.3451777   0.4060914   0.5431472
Comparing the outputs of Basel and the model reveals that Basel underestimates possible default rates, particularly in recessions, when the tendency known as default contagion appears. The difference at the 99.9% percentile is sometimes more than three times the figure Basel suggests. Furthermore, according to the simulation outputs, Basel implicitly accounts for only 95% to 99% of the confidence it claims: by applying Basel, management believes it is confident in its strategies at the 99.9% level, while incorporating the real-world attributes of the risk factors and their interactions downgrades this confidence to 95%-99%, so the manager is lured by an unreliable confidence in his or her strategic lending process.
The following histogram (figure 24) represents the distribution of the number of defaults. The expected number of defaults for the portfolio is 13.7, implying that the bank will face approximately 14 defaults on average within the year. The bank is supposed to estimate this expected loss and take it into account in the interest charged on the loans granted. The remaining probable defaults should be hedged against by reserves held as economic capital.
Figure 24: No. of defaults distribution
3.4.1 Expected shortfall (tail loss)
Although it is the most commonly applied standard, value-at-risk is not without shortcomings as a risk measure for defining economic capital. Because it is based on a single quantile of the loss distribution, VaR provides no information on the magnitude of the loss incurred in the event that capital is exhausted. A more robust risk measure is expected shortfall (ES), which is, loosely speaking, the expected loss conditional on being in the tail (Gordy 2002). The following histogram (figure 25) demonstrates the tail of the portfolio loss distribution beyond the 99.9% percentile. The
distribution is positively skewed, showing that the likelihood of extreme events is remarkably smaller than that of events close to the VaR at that level. Comparing the CVaR at different percentiles with the corresponding expected shortfalls shows higher values for the expected shortfalls.
One problem with VaR is that, when used in an attempt to limit the risks taken by a bank, it can lead to undesirable results. Suppose a bank requires that the one-year 99.9% VaR of the loan portfolio be kept below 1.6 million. There is a danger that the bank will construct a portfolio where there is a 99.9% chance that the loss is less than 1.6 million and a 0.1% chance that it is 2 million. The bank then satisfies the risk limit imposed but is clearly taking unacceptable risks. Where VaR asks the question 'how bad can things get?', expected shortfall asks 'if things do get bad, what is our expected loss?'. A risk measure used for specifying capital requirements can be thought of as the amount of cash (or capital) that must be added to a position to make its risk acceptable to regulators. Artzner et al. (1999) proposed a number of properties that such a risk measure should have,
- Monotonicity: if a portfolio has lower returns than another portfolio for every state of the world, its risk measure should be greater.
- Translation invariance: if we add an amount of cash K to a portfolio, its risk measure should go down by K.
- Homogeneity: changing the size of a portfolio by a factor λ, while keeping the relative amounts of the different items in the portfolio the same, should result in the risk measure being multiplied by λ.
- Sub-additivity: the risk measure for two portfolios after they have been merged should be no greater than the sum of their risk measures before they were merged.
The first three conditions are straightforward, given that the risk measure is the amount of cash that needs to be added to the portfolio to make its risk acceptable. The fourth condition states that diversification helps reduce risk: when two risks are aggregated, the total of the risk measures corresponding to the risks should either decrease or stay the same. VaR satisfies the first three conditions, but it does not always satisfy the fourth.
Figure 25: portfolio expected shortfall
Risk measures satisfying all four conditions are referred to as coherent. VaR is not always coherent, since it does not satisfy the sub-additivity condition, and this is not just a theoretical issue: risk managers sometimes find that, when the London portfolio is combined with that of New York into a single portfolio for risk management purposes, the total VaR goes up rather than down. In contrast, it can be shown that the expected shortfall measure is coherent. A risk measure can be characterized by the weights it assigns to quantiles of the loss distribution: VaR gives a 100% weighting to the Xth quantile and zero to the others, while expected shortfall gives equal weight to all quantiles above the Xth quantile and zero weight to all quantiles below it. (This comparison of expected shortfall and VaR largely follows www.risk.net and Hull, Risk Management and Financial Institutions.)
By and large, the bank's expected loss beyond a given percentile is greater than the CVaR at the corresponding confidence level. The figures are as follows,
Expected shortfall at a confidence level of:
99.9%: 2,131,408 mln
99.5%: 1,479,650 mln
99%:   1,221,897 mln
95%:     693,484 mln
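Given a vector of simulated portfolio losses such as the losses object sketched earlier, this VaR versus expected-shortfall comparison can be reproduced along the following illustrative lines.

# Credit-VaR and expected shortfall from simulated losses: ES at level q is the
# mean loss over the scenarios beyond the q-quantile.
cvar_es <- function(losses, q = c(0.95, 0.99, 0.995, 0.999)) {
  sapply(q, function(p) {
    var_p <- quantile(losses, p)
    c(CVaR = unname(var_p), ES = mean(losses[losses > var_p]))
  })
}
# cvar_es(losses)   # columns correspond to the 95%, 99%, 99.5% and 99.9% levels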
3.4.2 Sector credit risk analysis
This section analyses the credit risk of each sector in order to evaluate its risk contribution to the portfolio. The distribution of counterparties across sectors is,
domestic trade: 61, manufacturing: 49, real estates: 28, service: 29, trade: 30
The following table depicts the contribution of each sector to the total exposure of the portfolio. "Domestic trade" ranks top and "manufacturing" holds the second-largest exposure, while "real estates", "trade" and "service" sit at the bottom, respectively.
Row   Sector           Exposure (mln)
1     trade            740,000
2     manufacturing    1,670,000
3     domestic trade   2,137,000
4     real estates     900,000
5     service          331,000
Table 8: exposures by sector
A useful exercise in analysing the bank's portfolio credit risk is to evaluate the risk contribution of the sectors, and of the individual counterparties, in order to come up with appropriate strategies (for example, optimizing the structure of the loan portfolio or deciding on the best restructuring strategy in a crisis).
Different percentiles of the Credit-VaR are presented in the table below, where EL is the expected loss of a particular sector. It is noticeable that, again, the Basel 99.9% CVaR lies between the 95% and 99% percentiles of the model for each sector. Moreover, the largest CVaR belongs to the "manufacturing" sector, while "service" occupies the last position in terms of CVaR.
Row   Sector           EL       95%       99%       99.5%     99.9%     Basel
1     manufacturing    44,239   174,143   355,262   443,608   688,585   203,298
2     dom. trade       28,803   141,177   346,613   437,949   664,277   171,487
3     real estates      5,749    32,089    58,308    75,069   152,865    55,932
4     services          6,972    31,362    61,850    76,538   110,338    36,909
5     trade            17,202    82,880   159,136   197,693   283,685    71,166
Table 9: CVaR percentiles by sector (mln)
Figure 26 provides a more illustrative comparison of the CVaR percentiles across sectors. "Manufacturing" and "domestic trade" clearly dominate the others, while "service" and "trade" have the least CVaR, respectively.
Figure 26: CVaR by sector, Basel vs. model
In order to acquire a sensible judgment of the risk contribution of each sector in the portfolio, the corresponding amount of economic capital in each sector is divided by its total exposure; this ratio of economic capital (EC) to exposure is an indicator of the relative riskiness of the sector. The results are depicted in the following graph. They suggest that, although the total exposures of "manufacturing" and "domestic trade" are quite close to each other ("manufacturing" is almost 80% of "domestic trade", 1,670 versus 2,137 bln respectively), "manufacturing" is much riskier than "domestic trade": the ratio of required capital reserve to exposure is 39% and 30% for these sectors respectively. This makes "domestic trade" more appealing to the bank, offering a higher possibility of interest income from larger loans while requiring a lower amount of economic capital than a sector with a similar exposure. Furthermore, "real estates" appears as the safe haven of the portfolio, with the minimum required economic capital, reflecting a considerably lower Credit-VaR than the others. Strategically, of two sectors with the same level of exposure, the bank prefers to grant more loans to the one with the lower EC contribution. This is the case for "real estates" and "trade", whose exposures are relatively close compared with the other pairs in the portfolio: "trade" imposes a riskiness of above 35%, while the figure is around 16% for "real estates". By and large, "domestic trade"
carries a reasonable combination of exposure and risk relative to the other sectors, while "trade" and "service" are the least attractive targets for lending.
Figure 27: Sector EC contribution
Although Basel generates distinct inter-sector CVaR values, it only slightly alters the risk ordering of the sectors; in other words, the different results do not necessarily lead to a completely different strategy for the bank in managing its portfolio credit risk, whether it applies Basel or the model. According to the graph, the order of the first three sectors is the same as the one the model proposes; however, under Basel, "service" is considered the riskiest sector and "trade" ranks fourth compared with its order under the model. Moreover, despite the rather similar riskiness ordering, the estimated levels of EC contribution are far apart, with much less variation across sectors under Basel.
[Chart: Ratio of EC to Exposure, Basel vs. Model, by sector]
Figure 28: Sector EC contribution, Basel vs. model
Accordingly, Basel makes management somewhat indifferent between sectors, because the EC contributions, as a measure of riskiness, are rather close to one another, whereas the model distinguishes between sectors by allocating a wider range of risk contributions; this makes the manager more inclined to formulate a more carefully scrutinized lending strategy.
Obviously, banks look for a portfolio that requires a lower capital reserve, in order to maximize their flexibility in generating interest income from the available capital. Hence, they are naturally more inclined toward higher loan amounts with a lower level of required regulatory capital, which corresponds to the southern and eastern parts of the following graph. Thus, the optimal area of the chart is the bottom right, with loans that combine the maximum exposure with the minimum EC contribution.
Figure 29: risk contribution at counterparty level
The graph depicts a dense cluster of loans from different sectors at the top left; on the other hand, optimal lending opportunities, such as the loans to "real estates" and "domestic trade" in the southeast, are rare. It also reveals loans to "domestic trade" with very large exposures and risk contributions; the bank is advised to be sufficiently cautious in dealing with such borrowers.
3.4.3 Structuring the loan portfolio
In order to evaluate the effect of applying Basel in place of the model, the improved loan portfolio strategy of the bank is modeled in the Markowitz framework. The main question is whether Basel has any influence on the construction of the optimal portfolio and to what extent it changes the optimal solution compared with the results generated by the model.
According to the work of Harry Markowitz (1952) in the early 1950s, each portfolio can be classified along the axes of risk and return. Any portfolio that has a minimal amount of risk for a given amount of return is called efficient, and the line that connects these portfolios in a risk-return graph is called the efficient frontier. In 1993, Terri Gollinger and John Morgan, at the time working with Mellon Bank in Pittsburgh, published the pioneering article "Calculation of an Efficient Frontier for a Commercial Loan Portfolio" in the Journal of Portfolio Management (Gollinger and Morgan 1993). This article takes Markowitz's portfolio theory to the banking sector, and to the allocation and optimization of loan portfolios in particular. Following their approach, here the industry sectors take the place of securities in the Markowitz model, and the risk contribution, the ratio of economic capital to total exposure, is used as a proxy for risk; this ratio represents how risky a sector is, in the sense that a higher EC implies a riskier loan. Just as an investor searches for an optimal combination of risk and return in creating a portfolio of securities, a bank extends loans to those industries that minimize risk (EC contribution) for a given level of return.
Spreads on loans take the place of the returns on securities. According to the hazard rate model, with the PDs (and hence the average hazard rates) and recoveries in hand, the spreads are available from the credit triangle expression below,
$\bar{h} = \frac{\text{spread}}{1 - \text{Recovery}}$
(32)
These spreads are the additional rate the bank charges each sector over the interest paid on deposit accounts. Considering loans of the senior secured class, the average recovery rate is 71.11% according to Moody's. The spreads are calculated accordingly and presented in the following table,
Sector           Risk contr. (σ)   Spread (μ)   Weight
manufacturing    0.39              2.05%        W1
domestic trade   0.30              3.20%        W2
real estates     0.16              1.42%        W3
service          0.31              0.80%        W4
trade            0.36              2.71%        W5
Table 10: loan portfolio risk and returns
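As a quick check on how equation (32) produces the spread column, consider an illustrative average hazard rate together with the Moody's senior-secured recovery quoted above; the hazard rate used here is an assumption for illustration only.

# Spread implied by the credit triangle (equation 32): spread = hbar * (1 - Recovery)
recovery <- 0.7111
hbar     <- 0.07          # illustrative average one-year hazard rate for a sector
hbar * (1 - recovery)     # about 2.0%, comparable in magnitude to the spreads above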
Moreover, the covariance matrix of the loans is constructed from the correlation structure in the t-copula, where the pairwise correlation between loans is defined as $\rho_i\rho_j$; hence the variance of the loan portfolio is,
$W^{T}\hat{\Sigma}W$
(33)
where $\hat{\Sigma}$ is the variance-covariance matrix. Allocating the lending capacity of the bank across the various industries essentially involves finding the industry weights that result in the most efficient solutions. So far, the weights of the industry sectors have been held constant, frozen at 20 percent, to construct an equally weighted portfolio. For the decision parameters, the model proposes the optimal values given the objectives, requirements and constraints defined. In this case, the objective is to optimize the return on the portfolio's assets (loans) by choosing the portfolio shares. Furthermore, there is the requirement that the EC contribution of the portfolio should not exceed a predefined threshold set by the bank, which limits the risk the bank is willing to take on; solutions with a higher return but an EC exceeding this ceiling are discarded.
Objective function: Maximize $\sum_{i=1}^{N} \mu_i w_i$
subject to:
$\sigma_p^{2} = \sum_{i=1}^{N}\sum_{j=1}^{N} w_i w_j \rho_{ij}\sigma_i\sigma_j$
$w_i \ge 0.1 \quad \forall\, i$
$\sum_{i=1}^{N} w_i = 1$
In addition, a constraint that the weights should add up to 100 percent is defined. Also, a
minimum weight of 10 percent is considered, ensuring that the bank keeps a presence in all sectors.
More constraints could be added concerning any national regulatory requirements.
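Before turning to the Excel implementation used in the thesis, the same optimisation can be sketched in R for illustration; the correlation matrix below is a simple placeholder (the thesis builds it from the t-copula loadings $\rho_i\rho_j$, which are not reproduced here), and the random-search solver is a stand-in for a proper nonlinear programme.

# Maximise the portfolio spread subject to a risk ceiling, weights >= 10% and
# weights summing to one, by random search over the feasible simplex.
mu    <- c(0.0205, 0.0320, 0.0142, 0.0080, 0.0271)   # spreads from Table 10
sigma <- c(0.39, 0.30, 0.16, 0.31, 0.36)             # risk contributions from Table 10
rho_m <- matrix(0.3, 5, 5); diag(rho_m) <- 1         # placeholder correlation matrix
Sigma <- outer(sigma, sigma) * rho_m                 # variance-covariance matrix

optimise_weights <- function(mu, Sigma, risk_ceiling, n_draws = 1e5) {
  best_w <- rep(0.2, 5); best_ret <- -Inf
  for (k in seq_len(n_draws)) {
    g <- rgamma(5, 1)
    w <- 0.1 + 0.5 * g / sum(g)                      # respects w_i >= 0.1 and sum(w) = 1
    risk <- sqrt(drop(t(w) %*% Sigma %*% w))
    ret  <- sum(mu * w)
    if (risk <= risk_ceiling && ret > best_ret) { best_ret <- ret; best_w <- w }
  }
  list(weights = best_w, return = best_ret)
}

# Varying the risk ceiling and re-running traces out an approximate efficient frontier
optimise_weights(mu, Sigma, risk_ceiling = 0.20)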
The model is solved with the Excel Solver's GRG Nonlinear method (the Generalized Reduced Gradient, GRG2, code; see http://www.solver.com/excel-solver-algorithms-and-methods-used). The optimal solution found (in the following table) is valid for a risk ceiling that, in this case, was set at 12 percent. According to Markowitz portfolio theory, any portfolio is defined along the axes of risk and return, which implies that a different maximum standard deviation will result in a different optimal portfolio allocation. By varying the risk ceiling and re-running the optimization several times, the efficient frontier is obtained as follows.
risk    return   Portfolio weights
                 manufacturing   dom. trade   real estates   service   trade
0.12    0.0188   0.10            0.18         0.50           0.11      0.11
0.13    0.0211   0.10            0.28         0.37           0.10      0.16
0.14    0.0224   0.10            0.33         0.29           0.10      0.18
0.15    0.0234   0.10            0.37         0.23           0.10      0.20
0.16    0.0243   0.10            0.41         0.17           0.10      0.21
0.17    0.0251   0.10            0.45         0.12           0.10      0.23
0.18    0.0258   0.10            0.522        0.10           0.10      0.178
0.19    0.0261   0.10            0.584        0.10           0.10      0.116
0.20    0.0262   0.10            0.60         0.10           0.10      0.10
Table 11: improved portfolio structure (weights). For comparison, the equally weighted portfolio results in a 2.03% return at a 14% risk level.
From the efficient frontier, the improved composition of the loan portfolio depends on the bank's risk appetite; as in the case of security investments, risk and return go hand in hand. Comparing the efficient frontier from Basel with the one produced by the model, the range of maximum returns for a given risk appetite is the same, but Basel reports a lower level of risk associated with a given rate of return than the model does. Besides, under Basel, portfolios with an EC contribution of less than 3.4% are not feasible, while under the model 12% is the minimum capital reserve ratio attainable on the efficient frontier.
Figure 30: portfolio efficient frontier (model)
Figure 31: efficient frontier (Basel)
Furthermore, given the different efficient frontiers obtained from the two approaches, the average portfolio weights are compared in the following diagram. Although the average weights are not far from each other, if the bank relies on Basel and extracts the optimal portfolio from the Basel outputs, the improved portfolio structure that Basel suggests is in practice not as good as the one proposed by the model.
[Chart: Improved portfolio weights, Basel vs. Model, by sector]
Figure 32: Improved portfolio weights Basel vs. Model
This is examined by comparing the portfolio returns generated under Basel with those suggested by the model at a given level of risk. For instance, based on the solution in the Basel framework, the maximum return for the portfolio is 2.136% at a 13.33% risk level, whereas the maximum return proposed in the framework of the model is 2.158%. In conclusion, the model
provides the bank with better solutions (returns) at a particular level of risk. This improvement is
roughly .5% on average.
3.5 Conclusion
Basel results look quite acceptable in normal economic conditions, but they are not reliable in crises or economic downturns, when extreme values are more likely to occur: applying a more sophisticated model increased the level of economic capital, sometimes to twice the amount Basel suggests. This demonstrates the effect of Basel's simplification of the complex interactions and of its unrealistic assumptions for each of the individual risk factors in portfolio risk.
Moreover, the 99.9% CVaR that Basel suggests lies between the 95% and 99% percentiles of the portfolio loss distribution proposed by the model. In other words, the 99.9% Credit-VaR of Basel reflects roughly 95%, and at most 99%, confidence about the real capital at risk, not 99.9%. This leaves the bank with a considerable level of capital at risk that is not covered by reserves or any other hedging strategy. Moreover, it artificially lures management into adopting strategies with a stated 99.9% confidence, while in practice it is taking more risk than it assumes.
Applying Basel or the model does not have a significant impact on the ranking of the sectors' risk contributions, but the magnitudes and the dispersion of riskiness differ remarkably between the two. Basel's risk contribution ratios are much lower and less varied across sectors, while the differences among sectors are more noticeable under the model. This implies that Basel leaves management somewhat indifferent as to which sectors should be granted more loans, whereas the model makes the choice clearer for the manager through quite distinct risk contribution levels.
Furthermore, the efficient frontiers extracted from Basel and from the model are different, although the returns vary over a similar range; the discrepancy mostly appears in the risk levels. For example, a manager may expect a 2.4% return at the price of 3.8% risk, while the realistic risk level is around 15%. Applying Basel therefore gives managers an unrealistic confidence in the portfolio's risk-return profile.
Last but not least, if the manager relies on Basel, the improved loan portfolio structure will not be the better one in practice; there are inefficiencies of roughly 0.5% relative to the better solution, which is material for large portfolio values.
References
1. A.K.Misra, V. J.Sebastian, Portfolio Optimization of Commercial Banks- An Application of
Genetic Algorithm , 2013
2. André Koch, Optimizing Loan Portfolios, an Oracle white paper, October 2012
3. Basel Committee for Bank Supervision, Basel III, 2010, A global regulatory framework for
more resilient banks and banking system,
4. Brigo, Damiano and Fabio Mercurio. 2006. Interest Rate Models. Berlin: Springer.
5. Brigo, Damiano, Massimo Morini, and Andrea Pallavicini, 2013. Counterparty Credit Risk,
Collateral And Funding,
6. Brigo, Damiano, Massimo Morini, and Marco Tarenghi. "Credit Calibration With Structural
Models: The Lehman Case And Equity Swaps Under Counterparty Risk". SSRN Electronic
Journal.
7. Burtschell, X, Jonathan Gregory, and Jean-Paul Laurent. 2009. "A Comparative Analysis Of
CDO Pricing Models Under The Factor Copula Framework". The Journal Of Derivatives 16
(4): 9-37.
8. Chacko, George. 2006. A Primer on Credit Risk, Modeling and Instruments.
9. Credit Suisse, "CreditRisk+", A Credit Risk Management Framework.
10. Franzen, Dan. 2014. "Credit Valuation Adjustment". Master’s Thesis, KTH-Royal institute
of technology.
11. Fray 2013, Loss given default as a function of the default rate
12. Daróczi, Gergely. 2013. Introduction to R for Quantitative Finance.
13. Geise 2005, the impact of PD/LGD correlation on credit risk capital,
14. Gollinger, Terri L and John B Morgan. 1993. "Calculation Of An Efficient Frontier For A
Commercial Loan Portfolio". The Journal Of Portfolio Management 19 (2): 39-46.
15. Gordy, Michael B. "A Risk-Factor Model Foundation For Ratings-Based Bank Capital
Rules". SSRN Electronic Journal.
16. Gregory, Jon and Jon Gregory. 2012. Counterparty Credit Risk And Credit Value
Adjustment. Hoboken, N.J.: Wiley.
17. Hoffman, Frederick. 2011. "Credit Valuation Adjustment". Master, University of Oxford.
18. http://www.solver.com/excel-solver-algorithms-and-methods-used
19. Hull, John C and Alan D White. 2004. "Valuation Of A CDO And An N -Th To Default
CDS Without Monte Carlo Simulation". The Journal Of Derivatives 12 (2): 8-23.
20. Hull, John C., Mirela Predescu, and Alan White. "The Valuation Of Correlation-Dependent
Credit Derivatives Using A Structural Model". SSRN Electronic Journal.
21. Hull, John. 2006. Options, Futures, And Other Derivatives. Upper Saddle River, N.J.:
Pearson/Prentice Hall.
22. Hull, John. 2012. Risk Management And Financial Institutions + Website. Hoboken, New
Jersey: John Wiley & Sons, Inc.
23. Kealhofer, Stephen. 2003. "Quantifying Credit Risk I: Default Prediction". Financial Analysts Journal 59 (1): 30-44. doi:10.2469/faj.v59.n1.2501.
24. Kealhofer, Stephen. 2003. "Quantifying Credit Risk II: Debt Valuation". Financial Analysts
Journal 59 (3): 78-92. doi:10.2469/faj.v59.n3.2534.
25. Keith A. Allman, Josh Laurito, Michael Loh, Financial simulation modelling, 2011, page
111
26. Kinsey, Jean. 1981. "Determinants Of Credit Card Accounts: An Application Of Tobit
Analysis". J CONSUM RES 8 (2): 172.
27. Leland, Agency Costs, Risk Management, and Capital Structure, 1998
28. Li, David X. "On Default Correlation: A Copula Function Approach". SSRN Electronic
Journal.
29. Löffler, Gunter and Peter N Posch. 2007. Credit Risk Modeling Using Excel And VBA.
Chichester, England: Wiley.
30. Markowitz, Harry. 1952. "Portfolio Selection". The Journal Of Finance 7 (1): 77.
31. Martin, lionel. 2016. "Analysis Of IRB Correlation Coefficient With An Application To
Credit Portfolio". Master thesis, University of Uppsala.
32. Meissner, Gunter. 2014. Correlation Risk Modeling And Management. Singapore: Wiley.
33. Merton, Robert C. 1974. "On The Pricing Of Corporate Debt: The Risk Structure Of Interest
Rates".The Journal Of Finance 29 (2): 449.
Page 63 of 87
34. Michael Miller, Mathematics and statistics for financial risk management, 2th edition, 2014
35. Miller, Michael B. 2012. Mathematics And Statistics For Financial Risk Management.
Hoboken, N.J.: Wiley.
36. Modeling default risk, Crosbie and Bohn, Moody’s KMV, 2002
37. Moody’s Investors Service, corporate Bond defaults and default rates, January 1996
38. Navneet Arora, Jeffrey R. Bohn, Fanlin Zhu, 2005, Reduced Form vs. Structural Models of
Credit Risk: A Case Study of Three Models, , Moody’s KMV February 17,
39. Pykhtin M, 2003, Unexpected recovery risk, Risk, 16(8), 74—78
40. Schuermann, Til. "What Do We Know About Loss Given Default?". SSRN Electronic
Journal. doi:10.2139/ssrn.525702.
41. Standard and Poor’s rating performance 1996, February 1997
42. Steve Lu, Yuqian. 2008. "Default Forecasting In KMV". Master thesis, University of oxford.
43. Vasicek, 1987, Probability of Loss on Loan Portfolio, KMV corporation, USA
44. Zhou, Chunsheng. "A Jump-Diffusion Approach To Modeling Credit Risk And Valuing
Defaultable Securities". SSRN Electronic Journal.
Appendix A
Zero Coupon Bonds (ZCB) and spreads
Let $Z(t,T)$ denote today's value of a riskless ZCB[50] with a payoff of \$1 at $T$. If $R(t,T)$ is the continuously compounded yield to maturity of this bond, then $Z(t,T) = e^{-R(t,T)(T-t)}$, which reflects the time value of money, i.e. the time-$t$ value of \$1 paid at $T$. Now consider a risky ZCB that pays \$1 in the good state with probability $P(t,T)$ and nothing (zero recovery) in the other state with probability $1-P(t,T)$; under the physical probability measure, conditional on all information available at time $t$ (in particular on the issuer not having defaulted by $t$), $P(t,T)$ is the survival probability of the issuer. If $Z_d^0(T,T)$[51] denotes the payoff at time $T$ of this ZCB, which is unknown at time $t$, its expected value given $P(t,T)$ is
$$E_t^P\big[Z_d^0(t,T)\big] = P(t,T)\times \$1 + \big(1-P(t,T)\big)\times \$0 = P(t,T),$$
where $E_t^P[\,\cdot\,]$ denotes the expectation formed on the basis of information available at time $t$, given the survival probability $P(t,T)$. Hence, if default-free and defaultable ZCB prices are available for a continuum of maturities, survival probabilities (and the corresponding default densities) are available for all maturities, and from this a term structure of survival probabilities can be derived.
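As a purely illustrative numerical example (the figures are not taken from the thesis data): if the one-year riskless ZCB trades at $Z(t,T)=0.95$ and the issuer's one-year physical survival probability is $P(t,T)=0.98$, the expected payoff of the zero-recovery risky ZCB is $0.98\times\$1=\$0.98$; discounting this expectation at the riskless rate gives $0.95\times 0.98\approx 0.931$, which is an upper bound on the bond's actual price, since risk-averse investors demand an additional premium for bearing the default risk.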
Risk-Neutral Valuation and Probabilities
Since the ZCB is risky and there is a chance of receiving no payoff, one may want to discount the promised payment further when assessing the current value of the bond. There are two equivalent ways of thinking about this discounting. One can apply a spread $S(t,T)$ over the risk-free rate,
$$Z_d(t,T) = e^{-[R(t,T)+S(t,T)](T-t)},$$
so that the promised payment of the bond is discounted at the higher rate $R(t,T)+S(t,T)$. Alternatively, one can introduce an "artificial" (risk-neutral) survival probability $Q(t,T)$, conditional on the information available at time $t$, and write
$$Z_d^0(t,T) = e^{-R(t,T)(T-t)}\big[Q(t,T)\times 1 + (1-Q(t,T))\times 0\big] = Z(t,T)\,Q(t,T),[52]$$
[50] A ZCB is a bond with no coupons whose only payment is at maturity.
[51] $Z_d^0(T,T)$ denotes the payoff of a defaultable ZCB with zero recovery.
[52] An alternative way of writing the price of a defaultable ZCB is $Z_d(t,T) = Z(t,T)\,E_t^{\mathbb{Q}}[\mathbf{1}_{\tau>T}]$, in which the second factor is the risk-neutral survival probability $Q(t,T)$, assuming the default time $\tau$ is independent of the risk-free rate.
where $1-Q(t,T)$ is the probability attached to the default of the bond issuer. Physical probabilities do not coincide with risk-neutral ones because investors are risk averse: they are ready to pay more for a riskless investment than for a risky one, so today's price of the safe investment must be higher; consequently, physical survival probabilities are larger than risk-neutral ones, and the reverse holds for default probabilities.[53] Assuming the default time $\tau$ is independent of the risk-free rate, this equation represents the prominent result
Price of a risky bond = price of a risk-free bond × risk-neutral survival probability of the issuer.
In order to incorporate all premiums into the loan-adjusted return, the bank should follow the general equation for the adjusted rate,
$$r_{adjusted} = r_{real} + \text{inflation premium} + \text{default risk premium} + \text{liquidity risk premium} + \text{maturity risk premium},$$
and the loan spread is
$$\text{Spread} = \text{default risk premium} + \text{liquidity risk premium}.$$
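To make the implied quantities concrete, the following minimal R sketch (not part of the thesis code; the two bond prices are illustrative placeholders) backs out the risk-neutral survival probability and the credit spread from the prices of a riskless and a zero-recovery risky ZCB:

# Implied risk-neutral survival probability and credit spread from ZCB prices (illustrative sketch).
Tm <- 5                        # maturity in years, with t = 0
Z  <- 0.78                     # price of the riskless ZCB, Z(0, T)
Zd <- 0.70                     # price of the zero-recovery risky ZCB, Zd(0, T)
Q      <- Zd / Z               # risk-neutral survival probability Q(0, T)
R      <- -log(Z) / Tm         # continuously compounded risk-free yield R(0, T)
spread <- -log(Zd) / Tm - R    # credit spread S(0, T); note that it equals -log(Q)/Tm
c(Q = Q, R = R, spread = spread)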
[53] For further discussion and the proof, please refer to Understanding Credit Derivatives and Related Instruments, pp. 148-149.
Appendix B
From Merton to AT1P
According to Merton (1974),[54] default happens at the maturity date $T$, when creditors take over the firm and realize an amount $V_T$.
As reviewed in Chapter 1, Black and Cox (1976) suggested the first model of the first-passage-time family; in addition, they take into account the safety covenants in loan contracts, which enable creditors to take over the borrowing firm when its value $V_t$ falls to a low enough "safety level" $H(t)$. Hitting this barrier is considered an early default, and it makes the default time unpredictable ex ante.
Due to the possibility of default at any time prior to maturity, the spreads generated by Black-Cox are higher than Merton's.[55] The first candidate for the barrier is the face value of the debt discounted to the present time, $L\,P(t,T)$; however, one may cut the counterparty some slack and give it time to recover even if the firm value falls below this level, so the safety level can be chosen to be lower than $L\,P(t,T)$.
Clearly, pricing this bond amounts to solving a barrier-option pricing problem, and first-passage-time models make use of barrier-option techniques. Here the default time $\tau$ is defined as
$$\tau = \inf\{t \geq 0 : V_t \leq H(t)\}$$
if this quantity is smaller than the debt's final maturity $T$, and by $T$ if, in addition, $V_T \leq L$; in all other cases there is no default. If $H(t)$ is the barrier, depending on $t$ and the zero-coupon bond maturity date $T$, then for each counterparty corporation $i$ Black and Cox assume a constant-parameter Geometric Brownian Motion.
[54] Merton (1974).
[55] Suresh Sundaresan, 2013, "A Review of Merton's Model of the Firm's Capital Structure with its Wide Applications".
$$dV_i = (\mu_i - k_i)V_i\,dt + \sigma_i V_i\,dX_i,$$
so that
$$d\ln V_i = \Big(\mu_i - k_i - \frac{\sigma_i^2}{2}\Big)dt + \sigma_i\,dX_i.$$
In these equations $\mu_i$ is the expected growth rate of the assets of company $i$, $\sigma_i$ is the business risk (asset volatility), $k_i$ is the payout ratio, and $X_i$ follows a Wiener process; furthermore, $\mu_i$ and $\sigma_i$ are assumed constant. In Merton's model the firm defaults when its asset value falls below the face value of its liabilities. In the Black-Cox framework, default takes place as soon as the asset value hits the default barrier $H_i$ (the safety covenant) from above. The exponential barrier is defined as
$$H_i(t,T) = \begin{cases} L, & t = T \\ K e^{-\gamma(T-t)}, & t < T \end{cases}$$
where $K$ and $\gamma$ are positive parameters. Black and Cox also assumed that $K e^{-\gamma(T-t)} < L e^{-r(T-t)}$; this assumption means that the safety covenant lies below the present value of the final debt. If $\gamma = 0$, we obtain the special case of a flat barrier. Following Hull and White (2010), corresponding to $H_i$ there is a barrier $H_i^*$ such that company $i$ defaults when $X_i$ falls below $H_i^*$ for the first time. Assuming $X_i(0) = 0$ and a zero payout ratio ($k = 0$),
$$X_i(t) = \frac{\ln V_i(t) - \ln V_i(0) - (\mu_i - \sigma_i^2/2)\,t}{\sigma_i}$$
and
$$H_i^* = \frac{\ln H_i - \ln V_i(0) - (\mu_i - \sigma_i^2/2)\,t}{\sigma_i},$$
where $a_1 = r - k - \gamma - \sigma_A^2/2$ and $a_2 = a_1/\sigma_A^2$.
Hence the default probability is
$$PD_i = \mathrm{Prob}(V_i \leq H_i) = \mathrm{Prob}(X_i \leq H_i^*).$$
Harrison[56] (1990) showed that the probability of first hitting the barrier between times $t$ and $t+T$ is
$$\mathrm{Prob} = \Phi\!\left(\frac{\beta_i + \gamma_i (t+T) - X_i(t)}{\sqrt{T}}\right) + \exp\!\big(2\gamma_i\,(X_i(t) - \beta_i - \gamma_i t)\big)\,\Phi\!\left(\frac{\beta_i + \gamma_i (t-T) - X_i(t)}{\sqrt{T}}\right),$$
where
$$\beta_i = \frac{\ln H_i - \ln V_i(0)}{\sigma_i}, \qquad \gamma_i = \frac{-(\mu_i - \sigma_i^2/2)}{\sigma_i}.$$
[56] Most parts of this derivation follow Hull and White (2010).
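As an illustration only (this sketch is not part of the thesis code, and all parameter values are placeholders), the formula can be evaluated in R for a single firm at $t = 0$, where $X_i(0) = 0$ by construction, taking a flat barrier $H_i = H$ (the special case $\gamma = 0$ of the exponential barrier):

# Black-Cox first-passage default probability over [0, Tm] (illustrative sketch).
black_cox_pd <- function(V0 = 1, H = 0.7, mu = 0.05, sigma = 0.25, Tm = 1) {
  beta  <- (log(H) - log(V0)) / sigma      # standardized barrier level
  gamma <- -(mu - sigma^2/2) / sigma       # slope of the standardized barrier
  X0    <- 0                               # X_i(0) = 0 by construction
  pnorm((beta + gamma*Tm - X0) / sqrt(Tm)) +
    exp(2*gamma*(X0 - beta)) * pnorm((beta - gamma*Tm - X0) / sqrt(Tm))
}
black_cox_pd()   # roughly 0.14 with these placeholder inputs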
Comparing the results from Merton and Black-Cox for different scenarios reveals a relevant difference, which originates from the possibility of early default in the Black-Cox model. Brigo[57] compares the two parameter sets
$$\text{Set 1:}\quad L/V_0 = 0.9,\ \sigma_1 = 0.2; \qquad \text{Set 2:}\quad L/V_0 = 0.2,\ \sigma_1 = 0.9;$$
and the resulting graphs demonstrate a relevant difference.
Brigo and Tarenghi (2004) extended the Black-Cox first-passage model, first by means of time-varying volatility and curved barriers (AT1P) and then further by random barrier and volatility scenarios (the SBTV model). The AT1P model is selected here for two reasons: first, it is less complex than SBTV; second, it does not require the current value of the firm's assets, since it suffices to specify the barrier as a ratio of the asset value level, which increases its applicability in the Iranian market.
Analytically Tractable First-Passage Model (AT1P)
The Analytically Tractable First-Passage (AT1P) model assumes risk-neutral dynamics for the value of the firm, characterized by the risk-free rate $r_t$, the payout ratio $k_t$ and the instantaneous volatility $\sigma_t$:
$$dV_t = (r_t - k_t)V_t\,dt + \sigma_t V_t\,dW_t.$$
[57] Brigo, D., 2011, Credit Risk Management, King's College FM10 master course lecture notes.
The model further assumes a default barrier function, depending on the parameters $H$ and $B$, of the form
$$H(t) = H \exp\!\Big(\int_0^t (r_u - k_u - B\sigma_u^2)\,du\Big).$$
Letting $\tau$ be the first time the firm value $V_t$ hits the barrier $H(t)$ from above, starting from $V_0 \geq H$,
$$\tau = \inf\{t \geq 0 : V_t \leq H(t)\},$$
the survival probability is given analytically by
$$\mathbb{Q}\{\tau > T\} = \Phi\!\left(\frac{\ln\frac{V_0}{H} + \frac{2B-1}{2}\int_0^T \sigma_u^2\,du}{\sqrt{\int_0^T \sigma_u^2\,du}}\right) - \left(\frac{H}{V_0}\right)^{2B-1} \Phi\!\left(\frac{\ln\frac{H}{V_0} + \frac{2B-1}{2}\int_0^T \sigma_u^2\,du}{\sqrt{\int_0^T \sigma_u^2\,du}}\right).$$
Apparently, the barrier varies in time, following the firm and market conditions:
$$H(t) = H \exp\!\Big\{\int_0^t (r_u - k_u - B\sigma_u^2)\,du\Big\} = \frac{H}{V_0}\,\mathrm{E}[V_t] \times \exp\!\Big(-B\int_0^t \sigma_u^2\,du\Big).$$
The first factor is the backbone of the barrier, while the second cuts some slack in high-volatility conditions, controlled by $B$. $H$ and $V_0$ always appear in the formulas through ratios such as $H/V_0$. Therefore, it is possible to rescale the initial value of the firm's assets to $V_0 = 1$ and express the (free) barrier parameter $H$ as a fraction of it. In this case, it is not necessary to know the real value of the firm. Here, $H$ may depend on the level of liabilities, on safety covenants, and in general on the characteristics of the capital structure of the company.
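For constant volatility $\sigma$ and with $V_0$ rescaled to 1 (so that only the ratio $H/V_0$ enters), the survival probability above reduces to a few lines of R. The sketch below is illustrative only and not part of the thesis code; the barrier ratio, volatility, $B$ and horizon are placeholder values.

# AT1P survival probability Q(tau > T) for constant volatility (illustrative sketch).
at1p_survival <- function(H = 0.7, sigma = 0.25, B = 0.5, Tm = 1) {
  S  <- sigma^2 * Tm                          # integral of sigma_u^2 over [0, Tm]
  d1 <- (log(1/H) + (2*B - 1)/2 * S) / sqrt(S)
  d2 <- (log(H)   + (2*B - 1)/2 * S) / sqrt(S)
  pnorm(d1) - H^(2*B - 1) * pnorm(d2)         # survival probability; PD is one minus this
}
at1p_survival()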
Appendix C
Summary properties of Clayton copula[58]
[58] Michael Miller, Mathematics and Statistics for Financial Risk Management, 2nd edition.
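In brief, and following standard references such as the Miller text cited above: the bivariate Clayton copula is $C_\theta(u,v) = (u^{-\theta} + v^{-\theta} - 1)^{-1/\theta}$ with $\theta > 0$; it exhibits lower-tail dependence $\lambda_L = 2^{-1/\theta}$ and no upper-tail dependence, and its Kendall's tau is $\tau = \theta/(\theta+2)$, equivalently $\theta = 2\tau/(1-\tau)$, which is the relation used in the simulation code of Appendix G (alfa <- 2*ttau/(1 - ttau)).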
Appendix D
Simulation pseudo code
Step#1 get the number of counterparties in the portfolio
Step#2 take each company's information, such as industry, balance sheet, credit grade, ...
Step#3 get/calculate the probability of default based on historical data, Merton or Brigo AT1P
Step#4 extract the spread over the locally defined risk-free rate
Step#5 calculate average market sensitivity factors based on Lopez 2004
Step#6 generate one N(0,1) draw as the market status
Step#7 generate N samples from the Beta distribution for correlations, correlated with M
Step#8 generate N independent N(0,1) draws for idiosyncratic risk
Step#9 generate one chi-square random variable with the desired 1 ≤ df ≤ 3
Step#10 implement the one-factor Student-t copula to generate correlated binary default events
Step#11 generate LGDs correlated with the PDs, conditioned on the market status, from the Clayton copula
Step#12 calculate the expected loss (EL)
Step#13 repeat steps #6 to #12 for 100,000 trials
Step#14 aggregate the ELs and generate the histogram and the 99.9% CVaR
Step#15 subtract the expected loss from the 99.9% CVaR to get the economic capital
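The core of one simulation draw (Steps 6 to 11) can be sketched in a few self-contained lines of R. This is an illustrative sketch only: the number of counterparties, the PDs, the exposures and the Beta(2, 5) parameters of the correlation distribution are placeholders, and the factor weights are chosen here so that the pnorm argument is standard normal; the full portfolio implementation is given in Appendix G.

set.seed(1)
NC   <- 5                                   # number of counterparties (placeholder)
PD   <- c(0.02, 0.05, 0.01, 0.10, 0.03)     # placeholder default probabilities
EAD  <- c(100, 80, 120, 60, 90)             # placeholder exposures
dof  <- 3                                   # degrees of freedom of the Student-t copula
ttau <- 0.5                                 # Kendall's tau for the Clayton copula
alfa <- 2 * ttau / (1 - ttau)               # Clayton parameter, theta = 2*tau/(1 - tau)
M    <- rnorm(1)                            # Step 6: systematic (market) factor
chi  <- rchisq(1, dof)                      # Step 9: chi-square mixing variable
rho  <- qbeta(pnorm(-sqrt(0.05)*M + sqrt(0.95)*rnorm(NC)), 2, 5)   # Step 7: correlations linked to M
Z    <- rnorm(NC)                           # Step 8: idiosyncratic shocks
X    <- (rho*M + sqrt(1 - rho^2)*Z) / sqrt(chi/dof)                # Step 10: one-factor t copula
def  <- as.integer(X <= qt(PD, dof))        # binary default indicators
u    <- pnorm(sqrt(0.5)*M + sqrt(0.5)*rnorm(NC))                   # Step 11: driver shared with the market factor
v    <- u * (runif(NC)^(-alfa/(1 + alfa)) + u^alfa - 1)^(-1/alfa)  # conditional Clayton draw
LGD  <- 1 - qbeta(v, 2.571, 1.041)          # recovery from a Beta distribution (Moody's-style parameters)
loss <- sum(def * LGD * EAD)                # Step 12: portfolio loss for this draw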
Appendix E
Bank loan data (IRR)
Row | Sector | EAD | PD | Rec.[59] | Bank Rating | Weight of EAD
1 | Service | 19,707,163,492 | 0.176% | 50% | BBB | 0.3%
2 | domestic trade | 30,000,000,000 | 4.546% | 50% | B | 0.5%
3 | domestic trade | 145,746,276,596 | 4.546% | 50% | B | 2.5%
4 | domestic trade | 3,500,000,000 | 4.546% | 50% | B | 0.1%
5 | domestic trade | 1,573,339 | 1.166% | 50% | BB | 0.0%
6 | trade | 1,568,910,286 | 17.723% | 60% | CC | 0.0%
7 | trade | 17,892,911,203 | 17.723% | 60% | CC | 0.3%
8 | trade | 100,000,000,000 | 17.723% | 60% | CC | 1.7%
9 | trade | 3,000,000,000 | 0.051% | 60% | A- | 0.1%
10 | trade | 5,000,000,000 | 0.051% | 60% | A- | 0.1%
11 | trade | 600,000,000 | 0.051% | 60% | A- | 0.0%
12 | trade | 50,000,000,000 | 4.546% | 60% | B+ | 0.9%
13 | trade | 10,000,000,000 | 4.546% | 60% | B+ | 0.2%
14 | trade | 10,000,000,000 | 4.546% | 60% | B+ | 0.2%
15 | trade | 7,500,000,000 | 17.723% | 60% | C | 0.1%
16 | trade | 5,000,000,000 | 17.723% | 60% | C | 0.1%
17 | trade | 24,200,000,000 | 17.723% | 60% | C | 0.4%
18 | trade | 12,000,000,000 | 4.546% | 60% | B+ | 0.2%
19 | trade | 10,000,000,000 | 4.546% | 60% | B+ | 0.2%
20 | trade | 15,500,000,000 | 4.546% | 60% | B+ | 0.3%
21 | trade | 1,000,000,000 | 0.051% | 60% | A | 0.0%
22 | trade | 13,400,000,000 | 0.051% | 60% | A | 0.2%
23 | trade | 5,000,000,000 | 0.051% | 60% | A | 0.1%
24 | manufacturing | 9,174,656,573 | 17.723% | 70% | CCC- | 0.2%
25 | trade | 30,000,000,000 | 0.051% | 60% | A | 0.5%
26 | trade | 3,000,000,000 | 0.051% | 60% | A | 0.1%
27 | trade | 100,000,000,000 | 0.051% | 60% | A | 1.7%
28 | trade | 2,000,000,000 | 17.723% | 60% | CCC- | 0.0%
29 | trade | 30,000,000,000 | 17.723% | 60% | CCC- | 0.5%
30 | trade | 2,460,000,000 | 17.723% | 60% | CCC- | 0.0%
31 | trade | 20,000,000,000 | 4.546% | 60% | B+ | 0.3%
32 | trade | 150,000,000,000 | 4.546% | 60% | B+ | 2.6%
33 | trade | 10,000,000,000 | 4.546% | 60% | B+ | 0.2%
34 | trade | 50,000,000,000 | 1.166% | 60% | BB | 0.9%
35 | trade | 50,000,000,000 | 1.166% | 60% | BB | 0.9%
36 | trade | 527,605,810 | 1.166% | 60% | BB | 0.0%
37 | manufacturing | 9,673,910,588 | 17.723% | 70% | CCC- | 0.2%
38 | manufacturing | 5,500,000,000 | 17.723% | 70% | CCC- | 0.1%
39 | manufacturing | 15,000,000,000 | 17.723% | 70% | CCC | 0.3%
40 | manufacturing | 100,000,000,000 | 17.723% | 70% | CCC | 1.7%
41 | manufacturing | 20,000,000,000 | 17.723% | 70% | CCC | 0.3%
42 | manufacturing | 700,000,000 | 17.723% | 70% | CCC | 0.0%
43 | manufacturing | 2,000,000,000 | 1.166% | 70% | BB | 0.0%
44 | manufacturing | 4,000,000,000 | 1.166% | 70% | BB | 0.1%
45 | manufacturing | 9,632,332,506 | 1.166% | 70% | BB | 0.2%
46 | manufacturing | 3,000,000,000 | 4.546% | 70% | B+ | 0.1%
47 | manufacturing | 10,000,000,000 | 4.546% | 70% | B+ | 0.2%
48 | manufacturing | 100,000,000,000 | 4.546% | 70% | B+ | 1.7%
49 | manufacturing | 15,000,000,000 | 4.546% | 70% | B+ | 0.3%
50 | manufacturing | 96,000,000,000 | 17.723% | 70% | CC | 1.7%
51 | manufacturing | 20,000,000,000 | 17.723% | 70% | CC | 0.3%
52 | manufacturing | 15,000,000,000 | 17.723% | 70% | CC | 0.3%
53 | manufacturing | 3,929,506,560 | 17.723% | 70% | C | 0.1%
54 | manufacturing | 4,113,085,824 | 17.723% | 70% | C | 0.1%
55 | manufacturing | 5,811,514,594 | 17.723% | 70% | C | 0.1%
56 | manufacturing | 12,641,302,571 | 17.723% | 70% | CCC+ | 0.2%
57 | manufacturing | 10,000,000,000 | 17.723% | 70% | CCC+ | 0.2%
58 | domestic trade | 99,934,275,967 | 1.166% | 50% | BB | 1.7%
59 | manufacturing | 27,000,000,000 | 17.723% | 70% | CCC+ | 0.5%
60 | manufacturing | 10,000,000,000 | 1.166% | 70% | BB+ | 0.2%
61 | manufacturing | 270,000,000,000 | 1.166% | 70% | BB+ | 4.7%
62 | manufacturing | 10,000,000,000 | 1.166% | 70% | BB+ | 0.2%
63 | manufacturing | 2,000,000,000 | 4.546% | 70% | B- | 0.0%
64 | manufacturing | 20,000,000,000 | 4.546% | 70% | B- | 0.3%
65 | manufacturing | 8,500,000,000 | 4.546% | 70% | B- | 0.1%
66 | manufacturing | 10,000,000,000 | 17.723% | 70% | CCC | 0.2%
67 | manufacturing | 50,000,000,000 | 17.723% | 70% | CCC | 0.9%
68 | manufacturing | 30,000,000,000 | 17.723% | 70% | CCC | 0.5%
69 | manufacturing | 50,000,000,000 | 17.723% | 70% | CCC+ | 0.9%
70 | manufacturing | 7,970,400,000 | 17.723% | 70% | CCC+ | 0.1%
71 | manufacturing | 15,000,000,000 | 17.723% | 70% | CCC+ | 0.3%
72 | manufacturing | 5,000,000,000 | 17.723% | 70% | CCC+ | 0.1%
73 | manufacturing | 15,000,000,000 | 17.723% | 70% | CCC+ | 0.3%
74 | manufacturing | 7,448,033,600 | 17.723% | 70% | CCC+ | 0.1%
75 | manufacturing | 200,000,000,000 | 1.166% | 70% | BB+ | 3.5%
76 | manufacturing | 100,000,000,000 | 1.166% | 70% | BB+ | 1.7%
77 | manufacturing | 383,246,654 | 1.166% | 70% | BB+ | 0.0%
78 | manufacturing | 648,615,762 | 4.546% | 70% | B- | 0.0%
79 | manufacturing | 200,000,000,000 | 4.546% | 70% | B- | 3.5%
80 | manufacturing | 45,000,000,000 | 4.546% | 70% | B- | 0.8%
81 | manufacturing | 37,000,000,000 | 1.166% | 70% | BB | 0.6%
82 | manufacturing | 22,334,000,000 | 1.166% | 70% | BB | 0.4%
83 | manufacturing | 50,000,000,000 | 1.166% | 70% | BB | 0.9%
84 | real estates | 50,000,000,000 | 17.723% | 70% | C | 0.9%
85 | real estates | 5,000,000,000 | 17.723% | 70% | C | 0.1%
86 | real estates | 6,000,000,000 | 17.723% | 70% | C | 0.1%
87 | real estates | 4,200,000,000 | 1.166% | 70% | BB | 0.1%
88 | real estates | 3,000,000,000 | 1.166% | 70% | BB | 0.1%
89 | real estates | 3,800,000,000 | 1.166% | 70% | BB | 0.1%
90 | real estates | 3,000,000,000 | 0.051% | 70% | A+ | 0.1%
91 | real estates | 30,000,000,000 | 0.051% | 70% | A+ | 0.5%
92 | real estates | 1,800,000,000 | 0.051% | 70% | A+ | 0.0%
93 | real estates | 5,000,000,000 | 0.051% | 70% | A+ | 0.1%
94 | real estates | 50,000,000,000 | 4.546% | 70% | B | 0.9%
95 | real estates | 5,000,000,000 | 4.546% | 70% | B | 0.1%
96 | real estates | 25,000,000,000 | 4.546% | 70% | B | 0.4%
97 | real estates | 5,000,000,000 | 1.166% | 70% | BB+ | 0.1%
98 | real estates | 30,000,000,000 | 1.166% | 70% | BB+ | 0.5%
99 | real estates | 1,500,000,000 | 1.166% | 70% | BB+ | 0.0%
100 | real estates | 19,000,000,000 | 0.176% | 70% | BBB | 0.3%
101 | real estates | 10,000,000,000 | 0.176% | 70% | BBB | 0.2%
102 | real estates | 300,000,000,000 | 0.176% | 70% | BBB | 5.2%
103 | real estates | 5,000,000,000 | 0.051% | 70% | A | 0.1%
104 | real estates | 5,000,000,000 | 0.051% | 70% | A | 0.1%
105 | real estates | 250,000,000,000 | 0.051% | 70% | A | 4.3%
106 | real estates | 10,000,000,000 | 0.176% | 70% | BBB | 0.2%
107 | real estates | 10,000,000,000 | 0.176% | 70% | BBB | 0.2%
108 | real estates | 18,120,000,000 | 0.176% | 70% | BBB | 0.3%
109 | real estates | 42,000,000,000 | 0.176% | 70% | BBB | 0.7%
110 | service | 2,000,000,000 | 17.723% | 50% | CCC+ | 0.0%
111 | service | 5,000,000,000 | 17.723% | 50% | CCC+ | 0.1%
112 | service | 500,000,000 | 17.723% | 50% | CCC+ | 0.0%
113 | service | 3,000,000,000 | 4.546% | 50% | B- | 0.1%
114 | service | 840,000,000 | 4.546% | 50% | B- | 0.0%
115 | service | 2,000,000,000 | 4.546% | 50% | B- | 0.0%
116 | service | 17,915,000,000 | 1.166% | 50% | BB+ | 0.3%
117 | service | 14,750,000,000 | 4.546% | 50% | B- | 0.3%
118 | service | 32,682,000,000 | 4.546% | 50% | B- | 0.6%
119 | service | 1,900,000,000 | 4.546% | 50% | B- | 0.0%
120 | service | 7,000,000,000 | 0.051% | 50% | A | 0.1%
121 | service | 41,000,000,000 | 0.051% | 50% | A | 0.7%
122 | service | 12,000,000,000 | 0.051% | 50% | A | 0.2%
123 | service | 7,000,000,000 | 1.166% | 50% | BB | 0.1%
124 | service | 29,000,000,000 | 1.166% | 50% | BB | 0.5%
125 | service | 24,000,000,000 | 1.166% | 50% | BB | 0.4%
126 | service | 30,000,000,000 | 4.546% | 50% | B | 0.5%
127 | service | 7,000,000,000 | 4.546% | 50% | B | 0.1%
128 | service | 1,200,000,000 | 4.546% | 50% | B | 0.0%
129 | service | 27,000,000,000 | 17.723% | 50% | CCC+ | 0.5%
130 | service | 1,000,000,000 | 17.723% | 50% | CCC+ | 0.0%
131 | service | 33,000,000,000 | 17.723% | 50% | CCC+ | 0.6%
132 | service | 10,000,000,000 | 17.723% | 50% | CCC+ | 0.2%
133 | domestic trade | 29,942,009,227 | 1.166% | 50% | BB | 0.5%
134 | domestic trade | 50,000,000,000 | 0.176% | 50% | BBB | 0.9%
135 | domestic trade | 27,362,400,000 | 0.176% | 50% | BBB | 0.5%
136 | domestic trade | 16,499,315,040 | 0.176% | 50% | BBB | 0.3%
137 | domestic trade | 180,000,000,000 | 1.166% | 50% | BB- | 3.1%
138 | domestic trade | 500,000,000 | 1.166% | 50% | BB- | 0.0%
139 | domestic trade | 2,100,000,000 | 1.166% | 50% | BB- | 0.0%
140 | domestic trade | 5,000,000,000 | 4.546% | 50% | B | 0.1%
141 | domestic trade | 20,000,000,000 | 4.546% | 50% | B | 0.3%
142 | domestic trade | 6,661,679,675 | 4.546% | 50% | B | 0.1%
143 | domestic trade | 5,000,000,000 | 0.051% | 50% | A | 0.1%
144 | domestic trade | 12,500,000,000 | 0.051% | 50% | A | 0.2%
146 | domestic trade | 42,271,127,360 | 1.166% | 50% | BB | 0.7%
147 | domestic trade | 2,000,000,000 | 1.166% | 50% | BB | 0.0%
148 | domestic trade | 1,000,000,000 | 1.166% | 50% | BB | 0.0%
149 | domestic trade | 40,000,000,000 | 17.723% | 50% | C | 0.7%
150 | domestic trade | 100,000,000,000 | 17.723% | 50% | C | 1.7%
151 | domestic trade | 3,000,000,000 | 17.723% | 50% | C | 0.1%
152 | domestic trade | 362,735,145,342 | 4.546% | 50% | B- | 6.3%
153 | domestic trade | 573,680,000 | 4.546% | 50% | B- | 0.0%
154 | domestic trade | 22,637,600,000 | 4.546% | 50% | B- | 0.4%
155 | domestic trade | 3,353,174,720 | 4.546% | 50% | B | 0.1%
156 | domestic trade | 10,000,000,000 | 4.546% | 50% | B | 0.2%
157 | domestic trade | 20,000,000,000 | 4.546% | 50% | B | 0.3%
158 | domestic trade | 884,568,384 | 17.723% | 50% | CCC- | 0.0%
159 | domestic trade | 5,327,857,143 | 17.723% | 50% | CCC- | 0.1%
160 | domestic trade | 1,852,632,311 | 17.723% | 50% | CCC- | 0.0%
161 | domestic trade | 26,399,545,251 | 4.546% | 50% | B | 0.5%
162 | domestic trade | 200,000,000 | 4.546% | 50% | B | 0.0%
163 | domestic trade | 14,500,000,000 | 4.546% | 50% | B | 0.3%
164 | domestic trade | 1,000,000,000 | 0.022% | 50% | AA | 0.0%
165 | domestic trade | 20,000,000,000 | 0.022% | 50% | AA | 0.3%
166 | domestic trade | 50,000,000,000 | 0.022% | 50% | AA | 0.9%
167 | domestic trade | 1,778,063,249 | 1.166% | 50% | BB | 0.0%
168 | domestic trade | 1,500,000,000 | 1.166% | 50% | BB | 0.0%
169 | domestic trade | 27,402,239,941 | 1.166% | 50% | BB | 0.5%
170 | domestic trade | 50,000,000,000 | 1.166% | 50% | BB+ | 0.9%
171 | domestic trade | 50,000,000,000 | 1.166% | 50% | BB+ | 0.9%
172 | domestic trade | 3,107,613,422 | 1.166% | 50% | BB+ | 0.1%
173 | domestic trade | 4,475,150,000 | 4.546% | 50% | B | 0.1%
174 | domestic trade | 4,300,000,000 | 4.546% | 50% | B | 0.1%
175 | domestic trade | 30,000,000,000 | 4.546% | 50% | B | 0.5%
176 | domestic trade | 30,000,000,000 | 4.546% | 50% | B+ | 0.5%
177 | domestic trade | 10,000,000,000 | 4.546% | 50% | B+ | 0.2%
178 | domestic trade | 1,500,000,000 | 4.546% | 50% | B+ | 0.0%
179 | domestic trade | 100,000,000,000 | 4.546% | 50% | B | 1.7%
180 | domestic trade | 30,000,000,000 | 4.546% | 50% | B | 0.5%
181 | domestic trade | 3,000,000,000 | 4.546% | 50% | B | 0.1%
182 | domestic trade | 20,000,000,000 | 4.546% | 50% | B | 0.3%
183 | domestic trade | 9,000,000,000 | 17.723% | 50% | CCC+ | 0.2%
184 | domestic trade | 15,000,000,000 | 17.723% | 50% | CCC+ | 0.3%
185 | domestic trade | 1,000,000,000 | 17.723% | 50% | CCC+ | 0.0%
186 | domestic trade | 300,000,000,000 | 0.176% | 50% | BBB | 5.2%
187 | domestic trade | 9,280,508,181 | 0.176% | 50% | BBB | 0.2%
188 | domestic trade | 48,546,865,705 | 0.176% | 50% | BBB | 0.8%
189 | domestic trade | 24,244,773,352 | 0.176% | 50% | BBB | 0.4%
190 | service | 500,000,000 | 17.723% | 50% | CCC | 0.0%
191 | manufacturing | 800,000,000 | 4.546% | 70% | B- | 0.0%
192 | manufacturing | 4,500,000,000 | 4.546% | 70% | B- | 0.1%
193 | real estates | 1,670,000,000 | 0.176% | 70% | BBB | 0.0%
194 | real estates | 800,000,000 | 0.176% | 70% | BBB | 0.0%
195 | service | 120,000,000 | 17.723% | 50% | CCC | 0.0%
196 | service | 250,000,000 | 17.723% | 50% | CCC | 0.0%
197 | service | 100,000,000 | 17.723% | 50% | CCC+ | 0.0%
198 | service | 490,000,000 | 17.723% | 50% | CCC+ | 0.0%
[59] Average for the industry.
Appendix F
Sector analysis
Appendix G
Code in R.
# Loan Portfolio Loss Distribution and Optimization
# Author: Amir Azamtarrahian
# Date: April 2016
# This function takes the loans' data and outputs the corresponding credit spreads,
# the optimum loan structure and lending, and the economic capital for each counterparty
# and for the whole portfolio. Moreover, it delivers CVaR at different confidence levels
# plus sensitivity analysis on the model parameters.
##############################################################################
myBaselBank <<- function(Nsim = 100000,
                         # aR, bR: Beta parameters for the recovery of senior secured loans,
                         # mean = 71.18%, st. dev. = 21.09% (source: Moody's)
                         aR = rep(2.571, NC),
                         bR = rep(1.041, NC),
                         dof = 3,    # degrees of freedom for the Student-t copula
                         ttau = .5)  # Kendall's tau for the Clayton copula of recovery and PD rates
{
# Reading the loans' and exposures' data from a CSV file
loanData <<- read.csv("D:/Temp Works/Thesis/R/loanData.csv")
#loanData<<- loanData[197,]
NC<<- length(loanData[,1]) # Number of loans in portfolio
EAD<<- as.vector(loanData[,3]) # Exposures
PD <<- loanData[,4]   # probability of default
rec <<- loanData[,5]  # recovery rates
# This function takes the mean and standard deviation of recovery rates and calibrates a Beta distribution
myRec <<- function(meanRec=.6, sigRec=.309){
library(rootSolve)
# this describes the system of equations for mean and variance of Beta
model <<- function(x) c(F1 = (x[1]/ (x[1]+x[2])) - meanRec,
F2 = (x[1]*x[2]/(((x[1]+x[2])^2)*(x[1]+x[2]+1))) - (sigRec^2))
ss <<- multiroot(f = model, start = c(.1, .1))
# ss is vector of solutions, a & b pars. for Beta distribution
ss$root
}
# This function produces the Basel rho and WCDR
baselII<<- function(p){
ro<<- .12*(1+exp(-50* p))
wcdr<<- pnorm((qnorm(p) + sqrt(ro)*qnorm(.999))/ sqrt(1- ro))
c_var<<- (1-.7118)*wcdr
c(wcdr, c_var)
}
# Starts the clock!
ptm <- proc.time()
assetRho <<- sqrt(.12*(1+exp(-50* PD))) # rho of exposures to the market factor
# this loop gives each exposure a rho based on the Basel formula for rho and PD, and calibrates the Beta distribution
aC<<- rep(0,NC); bC<<- rep(0, NC)
for (k in 1:NC){
myRec( assetRho[k] , .1)
aC[k]<<-ss$root[1]
bC[k]<<-ss$root[2]
}
# alpha is set based on Kendall Tau= sqrt(0.5)
set.seed(11111*runif(1))
M<<- rep(0, Nsim)
chi<<- rep(0, Nsim)
NC<<- NC
alfa<<- 2*ttau/(1-ttau)
rrho<<- matrix(0, nrow=Nsim, ncol=NC)
tPD<<- matrix(0, nrow=Nsim, ncol=NC)
LGD<<- matrix(0, nrow=Nsim, ncol=NC)
N<<- matrix(0, nrow=Nsim, ncol=NC) # Binary variable if default N[i]=1
aggEL<<- matrix(0, nrow=Nsim, ncol=NC)
v<<- matrix(0, nrow=Nsim, ncol=NC)
u<<- matrix(0, nrow=Nsim, ncol=NC)
z1<<- matrix(0, nrow=Nsim, ncol=NC)
u1<<- matrix(0, nrow=Nsim, ncol=NC)
v1<<- matrix(0, nrow=Nsim, ncol=NC)
Rec<<- matrix(0, nrow=Nsim, ncol=NC)
EAD<<- EAD
prD<<- mean(PD)
for( j in 1:Nsim){
M[j]<<- rnorm(1,0,1)
chi[j]<<- rchisq(1,dof)
for (i in 1:NC){
z1[j,i]<<-rnorm(1,0,1)
# correlating correlations to default rates by copula
rrho[j,i]<<- qbeta(pnorm(-sqrt(0.05)* M[j] + sqrt(0.5) *rnorm(1)), aC[i], bC[i])
# generating correlated default rates by t-copula
tPD[j,i]<<- (((rrho[j,i])* M[j] +
sqrt(1-rrho[j,i]^2)*z1[j,i])/sqrt(chi[j]/dof))
# Checking if defaults or not
if (tPD[j,i] <= qt(PD[i], dof)) { N[j,i]<<- 1 } else {N[j,i]<<- 0}
#Generating correlated recovery and default rate by Clayton copula
u[j,i]<<-pnorm(sqrt(0.5)* M[j] + sqrt(0.5) *rnorm(1))
v[j,i]<<- u[j,i]*(((runif(1))^(-alfa/(1+alfa))) + (u[j,i]^alfa) - 1 )^(-1/alfa);
Rec[j,i]<<- qbeta(v[j,i],aR[i],bR[i] )
LGD[j,i]<<- 1- Rec[j,i]
aggEL[j,i]<<- aggEL[j,i] + (LGD[j,i]*N[j,i]*EAD[i])
}
}
# Stop the clock
proc.time() - ptm
h <<- hist(apply(aggEL, 1, sum), col="blue", main="Portfolio Loss Distribution",
           xlab="portfolio loss (mln)", ylab="Frequency")
#abline(v=mean(apply(aggEL, 1, sum)), col="red" , lwd=3, lty=5)
abline(v= NC* mean(EAD)*baselII(prD)[2], col="black" , lwd=4, lty=5)
abline(v= quantile(apply(aggEL, 1, sum), c(.95)) , col="orange" , lwd=3, lty=5)
abline(v= mean(apply(aggEL, 1, sum)), col="pink" , lwd=3, lty=5)
abline(v= quantile(apply(aggEL, 1, sum), c(.99)) , col="green" , lwd=3, lty=5)
abline(v= quantile(apply(aggEL, 1, sum), c(.999)) , col="red" , lwd=3, lty=5)
legend("topright", legend=c("mean Loss","model 95% CVaR","model 99% CVaR",
                            "Basel 99.9% CVaR","model 99.9% CVaR"),
       col=c("pink","orange","green","black", "red"), lty=5, lwd=2,
       bty="n")
# calculating EC of Portfolio, PEC
PEC<<- quantile(apply(aggEL, 1, sum), c(.999)) - mean(apply(aggEL, 1, sum))
mean(apply(aggEL, 1, sum))
c(quantile(apply(aggEL, 1, sum), c(.95)),
quantile(apply(aggEL, 1, sum), c(.99)),
quantile(apply(aggEL, 1, sum), c(.995)),
quantile(apply(aggEL, 1, sum), c(.999)),
baselII(prD)[1], (NC* mean(EAD)*baselII(prD)[2]),
PEC)
}
# Histograms by sector
require(lattice)# plot by each group
histogram(~ (EAD/1000)|factor(Sector), data= loanData, nint = 10, main="Exposure by Sector",
          xlab = "Exposure in bln", type = "density",
panel = function(x, ...) {
panel.histogram(x, col = "darkblue", ...)
panel.mathdensity(dmath = dnorm, col = "red",
args = list(mean=mean(x),sd=sd(x)))
})
require(lattice)# plot by each group
histogram(~ MEB.Rating|factor(Sector), data= loanData, nint = 10,
main="Ratings by Sector", xlab = "Exposure in bln", type = "density",
panel = function(x, ...) {
panel.histogram(x, col = "darkblue", ...)
panel.mathdensity(dmath = dnorm, col = "red",
args = list(mean=mean(x),sd=sd(x)))
})
##########################
plot(LGD[,10], u[,10], pch=".", main="LGD vs Economy Status", xlab="LGD",
     ylab="Economy Status", col="blue")
mean(apply(aggEL,1,sum))
plot(rrho[,10], M, pch=".", main="Cor. vs Economy Status", xlab="Correlations",
     ylab="Economy Status", col="blue")
plot(tPD[,10], tPD[,12], pch=".", main="Default Rates", xlab="Company A",
     ylab="Company B", col="blue")
plot(pt(tPD[,10],5), pt(tPD[,12],5), pch=".", main="Default Rates (margins)",
     xlab="Company A", ylab="Company B", col="blue")
hist(apply(N,1,sum), col="blue", main="Portfolio Loss Distribution", xlab="No. of Defaults",
     ylab="Frequency")
quantile(apply(N,1,sum)/NC, c(.95,.99, .995, .999))
# Expected Shortfall
mean(apply(aggEL,1,sum)[apply(aggEL,1,sum)> quantile(apply(aggEL,1,sum), c(.999))])
hist(apply(aggEL,1,sum)[apply(aggEL,1,sum)> quantile(apply(aggEL,1,sum), c(.999))],
main="ES and Tail distribution at 99.9%", xlab="Portfolio Loss in mln", col="blue")
library(pastecs)
stat.desc(EAD)
# function to generate each sector's expected loss
sectorEL<<- function(secName){
#theSec<<- as.character(secName)
ina<<-which(loanData[,1] %in% loanData[loanData$Sector== secName,1])
hist(apply(aggEL[,ina],1,sum), main= paste("Loss distribution in:", secName),
     xlab="Sector loss (mln)", col="blue")
abline(v= mean(apply(aggEL[,ina],1,sum)), col="black" , lwd=4, lty=5)
abline(v= quantile(apply(aggEL[,ina],1,sum), c(.999)) , col="red" , lwd=3, lty=5)
legend("topright", legend = c(paste("Mean =", round((mean(apply(aggEL[,ina],1,sum))), 1)),
paste("99.9% CVaR =", round((quantile(apply(aggEL[,ina],1,sum), c(.999))),
1))),
bty = "n")
quantile(apply(aggEL[,ina],1,sum), c(.95,.99, .995, .999))
SEC<<- quantile(apply(aggEL[,ina],1,sum), c(.999)) - mean(apply(aggEL[,ina],1,sum))
SRC<<- SEC/sum(EAD[ina])
plot(sum(EAD[ina]), SRC, col =c(1:5), bty="n", pch=19, cex=.75)
hist(apply(N[,ina],1,sum), main= paste("Loss distribution in sector:", secName),
xlab="No. of defaulted firms (mln)", col="blue")
legend("topright", legend = c(paste("Mean =", round((mean(apply(N[,ina],1,sum))), 1)),
paste("99.9% CVaR =", round((quantile(apply(N[,ina],1,sum), c(.999))), 1))),
bty = "n")
c(quantile(apply(aggEL[,ina],1,sum), c(.95,.99, .995, .999))[1],
quantile(apply(aggEL[,ina],1,sum), c(.95,.99, .995, .999))[2],
quantile(apply(aggEL[,ina],1,sum), c(.95,.99, .995, .999))[3],
quantile(apply(aggEL[,ina],1,sum), c(.95,.99, .995, .999))[4],
mean(apply(aggEL[,ina],1,sum)))
# sector analysis of the Basel model
roS <<- .12*(1+exp(-50* mean(PD[ina])))
mean(PD[ina])
(1-.7118)*length(ina)* mean(EAD[ina])* (pnorm((qnorm(mean(PD[ina])) +
   sqrt(roS)*qnorm(.999))/sqrt(1- roS)) - mean(PD[ina]))
roS
}
# CVaR for each company
EC<<- apply(aggEL,2 ,quantile,c(.999))- apply(aggEL,2,mean) #EC for each company at 99.9%
#ina2<<- which(EC %in% EC[EC<0])
EC<<-replace(EC, EC<0, 0)
RC<<- EC/EAD
plot(RC, col= loanData[,2], pch=19)
plot(EC, col= loanData[,2], pch=19 )
plot(EAD, RC, col= loanData[,2], pch=19, main="Counterparty Risk Analysis",
     xlab="Exposure (mln)", ylab="EC contribution/ EAD")
legend("bottomright", legend= levels(loanData[,2]), col =c(1:5), bty="n", pch=19, cex=.75)
####################################################
#sectors RC to EAD
SEAD <<- c(1669761, 2136620, 899890, 330954, 739650)  # mfg, dom. trade, real estate, service, trade
SRC  <<- c(0.39, 0.30, 0.16, 0.31, 0.36)
plot(SEAD, SRC, col= c(1:5), pch=19, main="Sector Risk Analysis", xlab="Exposure (mln)",
     ylab="EC contribution/ EAD")
legend("bottomright", legend= c("Manufacturing", "Domestic Trade", "Real Estates", "Service", "Trade"),
       col =c(1:5), bty="n", pch=19, cex=1)
# legend= c("best solutions"), col=c("blue"), bty="n", pch=17)  # fragment of the optimization-plot legend (call truncated)