Study on Prediction Methods of Financial Crises and Bank Failures


Yuliya Demyanyk – Iftekhar Hasan
Financial crises and bank failures:
a review of prediction methods
Bank of Finland Research
Discussion Papers
35 • 2009

Suomen Pankki
Bank of Finland
PO Box 160
FI-00101 HELSINKI
Finland
Phone: +358 10 8311
http://www.bof.fi
E-mail: [email protected]

Bank of Finland Research
Discussion Papers
35

2009

Yuliya Demyanyk* – Iftekhar Hasan**

Financial crises and bank
failures: a review of prediction
methods
The views expressed in this paper are those of the authors and
do not necessarily reflect the views of the Bank of Finland.

* Federal Reserve Bank of Cleveland.
Email: [email protected].
** Rensselaer Polytechnic Institute and Bank of Finland.
Email: [email protected]. Corresponding author.

We thank Kent Cherny for excellent comments and Qiang Wu
for research assistance.
http://www.bof.fi

ISBN 978-952-462-564-7
ISSN 0785-3572
(print)

ISBN 978-952-462-565-4
ISSN 1456-6184
(online)

Helsinki 2009

Financial crises and bank failures:
a review of prediction methods
Bank of Finland Research
Discussion Papers 35/2009
Yuliya Demyanyk – Iftekhar Hasan
Monetary Policy and Research Department

Abstract
In this article we provide a summary of empirical results obtained in several
economics and operations research papers that attempt to explain, predict, or
suggest remedies for financial crises or banking defaults, as well as outlines of the
methodologies used. We analyze financial and economic circumstances associated
with the US subprime mortgage crisis and the global financial turmoil that has led
to severe crises in many countries. The intent of the article is to promote future
empirical research that might help to prevent bank failures and financial crises.

Keywords: financial crises, banking failures, operations research, early warning
methods, leading indicators, subprime markets

JEL classification numbers: C44, C45, C53, G01, G21

A review of prediction methods for financial crises and bank failures
Bank of Finland Discussion Papers 35/2009
Yuliya Demyanyk – Iftekhar Hasan
Monetary Policy and Research Department

Tiivistelmä (abstract in Finnish, translated)
This paper reviews published empirical studies in economics and operations research that seek to explain the causes of financial crises and bank failures, to predict banking and financial crises, or to examine policy options for managing these crises. The study also summarizes the methods used in the empirical literature on financial and banking crises. In addition, it examines features of the financial system and the economy related to the crisis in the US mortgage market and to the global financial market turmoil, which led to severe crises in many countries. The central aim of the study is to promote future empirical research that could help prevent the emergence of financial and banking crises.

Keywords: financial crises, bank failures, operations research, early warning methods, leading indicators, subprime mortgages

JEL classification numbers: C44, C45, C53, G01, G21

Contents

Abstract
Tiivistelmä (abstract in Finnish)

1 Introduction

2 Review of econometric analyses of the subprime crisis
2.1 Collapse of the US subprime mortgage market
2.2 The subprime crisis is not unique
2.3 Selected analyses of bank failure prediction
2.4 Remedies for financial crises

3 Review of operations research models

4 Concluding remarks

References

1 Introduction
This article reviews econometrics and operations research methods used in
the empirical literature to describe, predict, and remedy financial crises and
mortgage defaults. Such an interdisciplinary approach is beneficial for future
research, as many of the methods used in isolation are not capable of accurately
predicting financial crises and defaults of financial institutions.
Operations research is a complex and interdisciplinary tool that combines
mathematical modeling, statistics, and algorithms. This tool is often employed
by managers and managerial scientists. It is based on techniques that seek to
determine either optimal or near optimal solutions to complex problems and
situations.
Many analytical techniques used in operations research have similarities
with functions of the human brain; they are called 'intelligence techniques.'
For example, Neural Networks (NN) is the most widely used model among the
intelligence techniques.[1] NN models have developed from the field of artificial
intelligence and brain modeling. They have mathematical and algorithmic
elements that mimic the biological neural networks of the human nervous
system. The model uses nonlinear function approximation tools that test
the relationship between independent (explanatory) and dependent (to be
explained) factors. The method considers an interrelated group of artificial
neurons and processes information associated with them using a so-called
connectionist approach, where network units are connected by a flow of
information. The structure of the model changes based on external or internal
information that flows through the network during the learning phase.
Compared to statistical methods, NN have two advantages. The most
important of these is that the models make no assumptions about the statistical
distribution or properties of the data, and therefore tend to be more useful
in practical situations (as most financial data do not meet the statistical
requirements of certain statistical models). Another advantage of the NN
method is its reliance on nonlinear approaches, so that one can be more
accurate when testing complex data patterns. The nonlinearity feature of
NN models is important because one can argue that the relation between
explanatory factors and the likelihood of default is nonlinear (several statistical
methodologies, however, are also able to deal with nonlinear relationships
between factors in the data).
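To make these points concrete, the minimal sketch below (not taken from the papers reviewed here) fits a small feed-forward network to synthetic loan-level data with scikit-learn; the covariates, their scales, and the data-generating process are illustrative assumptions only.

```python
# Minimal sketch: a feed-forward neural network for default prediction,
# illustrating that NN models impose no distributional assumptions and can
# capture nonlinear relations between predictors and default. All data synthetic.
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5_000
# Hypothetical loan-level covariates: credit score, loan-to-value ratio, mortgage rate
X = np.column_stack([
    rng.normal(620, 60, n),      # credit score
    rng.uniform(0.5, 1.1, n),    # combined loan-to-value
    rng.normal(0.08, 0.02, n),   # mortgage rate
])
# Synthetic default indicator with a deliberately nonlinear link
index = -4 + 6 * (X[:, 1] > 0.95) + 25 * X[:, 2] - 0.01 * (X[:, 0] - 620)
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-index))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
scaler = StandardScaler().fit(X_tr)

# One hidden layer of 16 units; weights are learned by back-propagation.
nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
nn.fit(scaler.transform(X_tr), y_tr)
print("out-of-sample accuracy:", nn.score(scaler.transform(X_te), y_te))
```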
This paper is related to the work of Demirguc-Kunt and Detragiache (2005),
who review two early warning methods — the signals approach and the multivariate
probability model — that are frequently used in empirical research analyzing
banking crises. Bell and Pain (xxxx) review the usefulness and applicability
of the leading indicator models used in the empirical research analyzing and
predicting financial crises. The authors note that the models need to be
improved in order to be a more useful tool for policymakers and analysts.
In this review we show that, in the empirical literature aiming to better
predict and analyze defaults and crises, statistical techniques are frequently
accompanied by intelligence techniques to improve model performance.
[1] Chen and Shih (2006) and Boyacioglu et al (2008).
In most of the cases reviewed, models that use operations research techniques
alone or in combination with statistical methods predict failures better than
statistical models alone. In fact, hybrid intelligence systems, which combine
several individual techniques, have recently become very popular.
The paper also provides an analysis of financial and economic circumstances
associated with the subprime mortgage crisis. Many researchers, policymakers,
journalists, and other individuals blame the subprime mortgage market and
its collapse for triggering the global crisis; many also wonder how such a
relatively small subprime market could cause so much trouble around the
globe, especially in countries that did not get involved with subprime lending
or with investment in subprime securities. We provide some insights into this
phenomenon.
The subprime credit market in the United States largely consists of
subprime mortgages. The term 'subprime' usually refers to a loan (mortgage,
auto, etc.) that is viewed as riskier than a regular (prime) loan in the eyes of a
lender. It is riskier because the expected probability of default for these loans
is higher. There are several definitions of subprime available in the industry.
A subprime loan can be (i) originated to a borrower with a low credit score
and/or history of delinquency or bankruptcy, and/or poor employment history;
(ii) originated by lenders specializing in high-cost loans and selling fewer loans
to government-sponsored enterprises (not all high-cost loans are subprime,
though); (iii) part of subprime securities; and (iv) certain mortgages (eg, 2/28
or 3/27 'hybrid' mortgages) generally not available in the prime market.[2]
The subprime securitized mortgage market in the United States boomed
between 2001 and 2006 and began to collapse in 2007. To better picture the
size of this market ($1.8 trillion of US subprime securitized mortgage debt
outstanding),[3] it is useful to compare it with the value of the entire mortgage
debt in the United States (approximately $11.3 trillion)[4] and the value of
securitized mortgage debt ($6.8 trillion).[5] In other words, as of the second
quarter of 2008, the subprime securitized market was roughly one-third of
the total securitized market in the United States, or approximately 16 per
cent of the entire US mortgage debt. Before the crisis, it was believed that a
market of such small size (relative to the US total mortgage market) could
not cause significant problems outside the subprime sphere even if it were
to crash completely. However, we now see a severe ongoing crisis — a crisis
that has affected the real economies of many countries in the world, causing
recessions, banking and financial crises, and a global credit crunch.
The large effect of the relatively small subprime component of the mortgage
market and its collapse was most likely due to the complexity of the market for
the securities that were created based on subprime mortgages. The securities
were created by pooling individual subprime mortgages together; in addition,
[2] See Demyanyk and Van Hemert (2008) and Demyanyk (2008) for a more detailed
description and discussion.
[3] The total value of subprime securities issued between 2000 and 2007, as calculated by
Inside Mortgage Finance, 2008.
[4] Total value of mortgages outstanding in 2Q 2008. Source: Inside Mortgage Finance, 2008.
[5] Total value of mortgage securities outstanding in 2Q 2008. Source: Inside Mortgage Finance, 2008.
the securities themselves were again repackaged and tranched to create more
complicated financial instruments.
The mortgage securities were again split into various new tranches,
repackaged, re-split and repackaged again many times over. Each stage of
the securitization process introduced more leverage for financial institutions
and made valuing the holdings of those financial instruments more difficult.
All this ultimately resulted in uncertainty about the solvency of a number
of large financial firms, as over time the market value of the securities was
heavily discounted in response to tremors in the housing market itself. Also,
the securities were largely traded internationally, which led to spill-overs of the
US subprime mortgage crisis and its consequences across country borders.
The remainder of the paper is organized as follows. Section 2 summarizes
empirical methodologies and findings of studies that apply econometric
techniques. In that section, we outline several analyses of the US subprime
market and its collapse. We show that the crisis, even though significant and
devastating for many, was not unique in the history of the United States or for
other countries around the world. We review the analyses of bank failure and
suggested remedies for financial crises in the literature. Section 3 summarizes
empirical methodologies used in operations research studies analyzing and
predicting bank failures. Section 4 concludes.
2 Review of econometric analyses of the subprime
crisis
In this section we analyze the collapse of the subprime mortgage market in the
United States and outline factors associated with it.
2.1 Collapse of the US subprime mortgage market
The first signs of the subprime mortgage market collapse in the United States
were very high (and unusual even for the subprime market) delinquency and
foreclosure rates for mortgages originated in 2006 and 2007. High rates
of foreclosures, declining home values, borrowers' impaired credit histories,
destabilized neighborhoods, numerous vacant and abandoned properties, the
absence of mechanisms providing entry into and exit out of the distressed
mortgage market (uncertainty froze the market; a limited number of
home sales/purchases occurred), and an overall economic slowdown created a
self-sustaining loop that market forces alone were unable to break.
Demyanyk and Van Hemert (2008) analyzed the subprime crisis empirically,
utilizing a duration statistical model that allows estimating the so-called
survival time of mortgage loans, ie, how long a loan is expected to be
current before the very first delinquency (missed payment) or default occurs,
conditional on never having been delinquent or in default before. The model
also allows controlling for various individual loan and borrower characteristics,
as well as macroeconomic circumstances. According to the estimated results,
the credit score, the cumulative loan-to-value ratio, the mortgage rate, and
house price appreciation have the largest (in absolute terms) marginal
effects and are the most important for explaining cross-sectional differences in
subprime loan performance. However, according to the same estimated model,
the crisis in the subprime mortgage market did not occur because housing
prices in the United States started declining, as many have conjectured. The
crisis had been brewing for at least six consecutive years before signs of it
became visible.
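As an illustration of the kind of duration analysis described above, the sketch below estimates a proportional-hazards model with the lifelines package on synthetic loan-level data; the variable names and the data-generating process are assumptions for illustration and do not reproduce the authors' specification.

```python
# Illustrative duration-model sketch: a Cox proportional-hazards fit on a
# synthetic loan-level data set (not the authors' data or specification).
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 500
fico = rng.normal(620, 50, n)
cltv = rng.uniform(0.6, 1.05, n)
rate = rng.normal(0.08, 0.015, n)
hpa = rng.normal(0.02, 0.04, n)          # house price appreciation

# Survival times fall with risk (higher CLTV and rate, lower FICO and HPA)
risk = 0.02 * (620 - fico) / 50 + 1.5 * (cltv - 0.8) + 20 * (rate - 0.08) - 5 * hpa
t = rng.exponential(scale=36 * np.exp(-risk))
observed = (t < 60).astype(int)          # censor loans still current at 60 months
t = np.minimum(t, 60)

loans = pd.DataFrame({"duration": t, "delinquent": observed,
                      "fico": fico, "cltv": cltv, "rate": rate, "hpa": hpa})
# Hazard ratios summarize each covariate's marginal effect on time-to-delinquency.
CoxPHFitter().fit(loans, duration_col="duration", event_col="delinquent").print_summary()
```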
The quality of subprime mortgages had been deteriorating monotonically
every year since at least 2001; this pattern was masked, however, by house
price appreciation. In other words, the quality of loans did not suddenly
become much worse just before the defaults occurred — the quality was poor
and worsening every year. We were able to observe this inferior quality only
when the housing market started slowing down — when bad loans could no
longer hide behind high house price appreciation, and when bad loans could
no longer be refinanced.
Demyanyk and Van Hemert also show that the above-mentioned
monotonic deterioration of subprime mortgages was a (subprime) market-wide
phenomenon. They split their sample of all subprime mortgages into the
following subsamples: fixed-rate, adjustable-rate (hybrid), purchase-money,
cash-out refinancing, mortgages with full documentation, and mortgages with
low or no documentation. For each of the subsamples, deterioration of the
market is observable. Therefore, one cannot blame the crisis on any single
cause, such as a particularly bad loan type or irresponsible lending — there
were many causes.
Demyanyk (2008) empirically showed that subprime mortgages were, in
fact, a temporary phenomenon, ie, borrowers who took subprime loans seemed
to have used mortgages as temporary bridge financing, either in order to
speculate on house prices or to improve their credit history. On average,
subprime mortgages of any vintage did not last longer than three years:
approximately 80 percent of borrowers either prepaid (refinanced or sold their
homes) or defaulted on the mortgage contracts within three years of mortgage
origination.
Several researchers have found that securitization opened the door to
increased subprime lending between 2001 and 2006, which in turn led to
reduced incentives for banks to screen borrowers and increased subsequent
defaults. For example, Keys et al (2008) investigate the relationship
between securitization and screening standards in the context of subprime
mortgage-backed securities. Theories of financial intermediation suggest that
securitization — the act of converting illiquid loans into liquid securities —
could reduce the incentives of financial intermediaries to screen borrowers.
Empirically, the authors 'exploit a specific rule of thumb [credit score 620]
in the lending market to generate an exogenous variation in the ease of
securitization and compare the composition and performance of lenders'
portfolios around the ad-hoc threshold'. They find that 'the portfolio that
is more likely to be securitized defaults by around 10–25% more than a
similar risk profile group with a lower probability of securitization', even after
accounting for 'selection on the part of borrowers, lenders, or investors'. Their
results suggest that securitization does adversely affect the screening incentives
of lenders.
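The toy calculation below is only meant to illustrate the idea of comparing loans on either side of the 620 rule of thumb; the data are synthetic and the exercise is far simpler than the authors' actual research design.

```python
# Toy illustration (not the authors' design): compare default rates for loans
# just below and just above the FICO 620 rule of thumb, where loans above the
# cutoff are assumed to be easier to securitize.
import numpy as np
import pandas as pd

rng = np.random.default_rng(2)
fico = rng.integers(600, 640, size=20_000)
# Hypothetical data-generating process: loans above 620 default slightly more,
# mimicking weaker screening of easily securitized loans.
base = 0.10 - 0.001 * (fico - 620)
default = rng.uniform(size=fico.size) < np.where(fico >= 620, base * 1.2, base)

df = pd.DataFrame({"fico": fico, "default": default})
df["above_620"] = df["fico"] >= 620
print(df.groupby("above_620")["default"].mean())
```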
Mian and Sufi (2008) show that securitization is associated with increased
subprime lending and subsequent defaults. More specifically, the authors show
that geographical areas (in this case, zip codes) with more borrowers who had
credit application rejections a decade before the crisis (in 1996) had more
mortgage defaults in 2006 and 2007. Mian and Sufi also find that 'prior to the
default crisis, these subprime zip codes (had experienced) an unprecedented
relative growth in mortgage credit'. The expansion in mortgage credit in these
neighborhoods was combined with declining income growth (relative to other
areas) and an increase in securitization of subprime mortgages.
Taylor (2008) blames 'too easy' monetary policy decisions, and the resulting
low interest rates between 2002 and 2004, for causing the monetary excess,
which in turn led to the housing boom and its subsequent collapse. He
compares the housing boom that would have resulted in the US economy
if monetary policy had been conducted according to the historically followed
Taylor rule — a rule that suggested much higher interest rates for the period
— with the actual housing boom. Based on the comparison, there would have
been almost no housing boom with the higher rates. No boom would have
meant no subsequent bust. The author dismisses the popular hypothesis of an
excess of world savings — a 'savings glut' — that many use to justify the low
interest rates in the economy, and shows that there was, in fact, a global savings
shortage, not an excess. Also, comparing monetary policy in other countries
with that in the United States, Taylor notices that the housing booms were
largest in countries where deviations of the actual interest rates from those
suggested by the Taylor rule were the largest.
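For reference, the standard form of the rule Taylor proposed in 1993 is (the paper under review does not reproduce the formula, so the coefficients shown are the textbook ones)

    i_t = r^* + \pi_t + 0.5(\pi_t - \pi^*) + 0.5(y_t - \bar{y}_t),

where i_t is the recommended nominal policy rate, r^* the equilibrium real interest rate, \pi_t current inflation, \pi^* the inflation target, and (y_t - \bar{y}_t) the output gap. With inflation at target and a closed output gap, the rule prescribes a policy rate near r^* + \pi^*; rates held well below this benchmark in 2002 to 2004 are what Taylor labels the monetary excess.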
There is a large literature that analyzes mortgage defaults. The analysis is
important for understanding the subprime mortgage crisis, which was triggered
by a massive wave of mortgage delinquencies and foreclosures. Important
contributions to this literature include Deng (1997), Ambrose and Capone
(2000), Deng et al (2000), Calhoun and Deng (2002), Pennington-Cross
(2003), Deng et al (2005), Clapp et al (2006), and Pennington-Cross and
Chomsisengphet (2007).
2.2 The subprime crisis is not unique
Demyanyk and Van Hemert (2008) show evidence that the subprime mortgage
crisis in the United States seems, in many respects, to have followed the
classic lending boom-and-bust cycle documented by Dell'Ariccia et al (2008).
First, a sizeable boom occurred in the subprime mortgage market. Depending
on the definition of 'subprime', the market grew from three to seven times
larger between 1998 and 2005 (see Mayer and Pence (2008) for measures
of the size and the increase of the subprime mortgage market based on
US Department of Housing and Urban Development and LoanPerformance
definitions). Second, a definitive collapse of the market occurred in 2007,
which was reflected in high delinquency, foreclosure, and default rates. A
year later, the subprime mortgage crisis spilled over into other credit markets,
creating a much larger financial crisis and global credit crunch. Third, the
periods preceding the collapse were associated with a loosening of underwriting
standards, deteriorating loan quality, and increasing loan riskiness that were
not backed up by an increasing price of this extra risk. In fact, the
subprime-prime spread was actually declining over the boom period.
Increasing riskiness in the market, together with the decreasing price of
this risk, leads to an unsustainable situation, which in turn leads to a market
collapse. The subprime episode fits into this boom-bust framework easily.
Moreover, not only have Demyanyk and Van Hemert (2008) shown that the
crisis followed a classic path known to policymakers and researchers in several
countries, but they have also shown that analysts could have foreseen the crisis
as early as late 2005. It is not clear, though, whether the crisis could have
been prevented at that point. Comparing the findings of Dell'Ariccia et al
(2008) and Demyanyk and Van Hemert (2008), it appears the United States
(in 2007); Argentina (in 1980); Chile (in 1982); Sweden, Norway, and Finland
(in 1992); Mexico (in 1994); and Thailand, Indonesia, and Korea (in 1997) all
experienced the culmination of similar (lending) boom-bust scenarios, but in
very different economic circumstances.
Reinhart and Rogoff (2008), who analyzed macro indicators in the United
States preceding the financial crisis of 2008 and 18 other post-World War II
banking crises in industrial countries, also found striking similarities among all
of them. In particular, the countries experiencing the crises seem to share
significant increases in housing prices before the financial crises
commenced. Even more striking is evidence that the United States had a much
higher growth rate in its house prices than the so-called Big Five countries in
their crises (Spain in 1977, Norway in 1987, Finland in 1991, Sweden in 1991,
and Japan in 1992). In comparing the real rates of growth in equity market
price indexes, the authors again find that pre-crisis similarities are evident
among all the crisis countries. Also, in comparing the current account as a
percentage of gross domestic product (GDP), not only are there similarities
between countries, but the United States had larger deficits than those of the
other countries before their crises, reaching more than six percent of GDP.
The authors noted, however, that there is great uncertainty associated with
the still ongoing 2008–2009 crisis in the United States; therefore, it is not
possible to project the path of crisis resolution based on the experiences of
other countries.
2.3 Selected analyses of bank failure prediction
Demirguc-Kunt and Detragiache (1998) study the determinants of the
probability of a banking crisis around the world in 1980–1994 using a
multivariate Logit model. They find that bank crises are more likely in
countries with low GDP growth, high real interest rates, high inflation rates,
and an explicit deposit insurance system. Countries that are more susceptible
to balance of payments crises also have a higher probability of experiencing
banking crises.
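A minimal sketch of this type of multivariate Logit is given below, using statsmodels on a synthetic country-year panel; the variable names, coefficients and data are invented for illustration and are not the authors' data set.

```python
# Sketch of a multivariate Logit of the kind used in this literature: a
# country-year panel with a crisis dummy regressed on macro covariates.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 800                                        # country-year observations
panel = pd.DataFrame({
    "gdp_growth":  rng.normal(0.03, 0.03, n),
    "real_rate":   rng.normal(0.02, 0.04, n),
    "inflation":   rng.normal(0.06, 0.05, n),
    "deposit_ins": rng.integers(0, 2, n),      # explicit deposit insurance dummy
})
# Synthetic crisis indicator: more likely with low growth, high rates and inflation
idx = (-3 - 20 * panel.gdp_growth + 10 * panel.real_rate
       + 5 * panel.inflation + 0.5 * panel.deposit_ins)
panel["crisis"] = (rng.uniform(size=n) < 1 / (1 + np.exp(-idx))).astype(int)

X = sm.add_constant(panel[["gdp_growth", "real_rate", "inflation", "deposit_ins"]])
print(sm.Logit(panel["crisis"], X).fit(disp=0).summary())
```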
Demirguc-Kunt and Detragiache (2002) specifically investigate the relation
between explicit deposit insurance and stability in the banking sector across
countries. The authors confirm and strengthen the findings of Demirguc-Kunt
and Detragiache (1998) that explicit deposit insurance can harm bank stability.
This happens because banks may be encouraged by the insurance to finance
high-risk and high-return projects, which in turn can lead to more bank
losses and failures. The authors find that deposit insurance has a more
negative impact on the stability of banks in countries where the institutional
environment is weak, where the coverage offered to depositors is more extensive,
and where the scheme is run by the government rather than by the private
sector.
Demirguc-Kunt et al (2006) examine what happens to the structure of the
banking sector following a bank crisis. The authors find that individuals and
companies leave weaker banks and deposit their funds in stronger banks; at
the same time, the aggregate bank deposits relative to countries' GDP do not
significantly decline. Total aggregate credit declines in countries after banking
crises, and banks tend to reallocate their asset portfolios away from loans and
improve their cost efficiency.
Wheelock and Wilson (2000) analyze which factors predict bank failure
in the United States in particular. The authors use competing-risks hazard
models with time-varying covariates. They find that banks with lower
capitalization, higher ratios of loans to assets, poor quality loan portfolios
and lower earnings have a higher risk of failure. Banks located in states where
branching is permitted are less likely to fail. This may indicate that an ability
to create a branch network, and an associated ability to diversify, reduces
banks' susceptibility to failure. Further, the more efficiently a bank operates,
the less likely the bank is to fail.
Berger and DeYoung (1997) analyze instances when US commercial banks
face increases in the proportion of nonperforming loans and reductions in cost
efficiency between 1985 and 1994. The authors find that these instances are
interrelated and Granger-cause each other.
2.4 Remedies for financial crises
Caprio et al (2008) indicate that recent financial crises often occur because of
booms in macroeconomic sectors; the crises are revealed following 'identifiable
shocks' that end the booms. Importantly, the underlying distortions
of economic markets build up for a long time before the crisis is identified
(Demyanyk and Van Hemert (2008) identify such a process for the US subprime
mortgage crisis). Caprio et al (2008) discuss the role of financial deregulation
in predicting crises and identify a mechanism for interaction between
governments and regulated institutions. The authors propose a series of
reforms that could prevent future crises, such as lending reform, rating agency
reform and securitization reform. Most importantly, according to the authors,
regulation and supervision should be re-strengthened to prevent such crises in
the future.
In his research, Hunter (2008) attempts to understand the causes of, and
solutions for, financial crises. He defines the beginning of the recent crisis
in the United States as the point in time when inter-bank lending stopped
in the Federal Funds Market. Following this definition, the US crisis began
around October 8, 2008, when the Federal Funds Rate hit a high of seven
percent during intraday trading. According to Hunter, the primary reason
for the trading halt was that banks were unsure about the exposure of their
counterparties to MBS risk: 'If a bank has a large share of its asset portfolio
devoted to MBS, then selling MBS to get operating cash is infeasible when
the price of MBS has declined significantly. Banks in this situation are on the
brink of insolvency and may indeed have difficulty repaying loans they receive
through the Federal Funds Market'. The author suggests several solutions to
the crisis. Among them, he emphasizes the importance of transparency in the
operation of and analysis by MBS insurers and bond rating agencies. He also
stresses the development of a systematic way of evaluating counterparty risk
within the financial system. In the short term, he suggests that the Fed could
encourage more borrowing through the Discount Window.
Diamond and Rajan (2009) also analyze the causes of the recent US
financial crisis and provide some remedies for it. According to the authors,
the first reason for the crisis was a misallocation of investment, which occurred
because of the mismatch between the soft information on which loan officers
based credit decisions and the hard information (like credit scores) that the
securities trading agencies used to rate mortgage bonds. This was not a big
problem as long as house prices kept rising. However, when house prices began
to decline and defaults started increasing, the valuation of securities based on
loans became a big problem (as the ratings may not truly capture the risk of
loans within those securities). The second reason for the crisis was excessive
holdings of these securities by banks, which is associated with an increased
default risk. To solve or mitigate the crisis, Diamond and Rajan first suggest
that the authorities can offer to buy illiquid assets through auctions and house
them in a federal entity. The government should also ensure the stability of the
financial system by recapitalizing those banks that have a realistic possibility
of survival, and merging or closing those that do not.
Brunnermeier (2008) tries to explain the economic mechanisms that caused
the housing bubble and the turmoil in the financial markets. According to the
author, there are three factors that led to the housing expansion. The first is
a low interest-rate and mortgage-rate environment for a relatively long time
in the United States, likely resulting from large capital inflows from abroad
(especially from Asian countries) and accompanied by the lax interest rate
policy of the Federal Reserve. Second, the Federal Reserve did not move
to prevent the buildup of the housing bubble, most likely because it feared a
possible deflationary period following the bursting of the Internet stock bubble.
Third, and most importantly, the US banking system had been transformed
from a traditional relationship banking model, in which banks issue loans and
hold them until they are repaid, to an 'originate-to-distribute' banking model,
in which loans are pooled, tranched and then sold via securitization. This
transformation can reduce banks' monitoring incentives and increase their
likelihood of failure if they hold a large amount of such securities without fully
understanding the associated credit risk.
Brunnermeier further identifies several economic mechanisms through
which the mortgage crisis was amplified into a broader financial crisis. All
of the mechanisms begin with the drop in house prices, which eroded the
capital of financial institutions. At the same time, lenders tightened lending
standards and margins, which caused fire sales, further pushing down prices
and tightening credit supplies. When banks became concerned about their
ability to access capital markets, they began to hoard funds. Consequently,
with the drop in balance sheet capital and difficulties in accessing additional
funding, banks that held large amounts of MBS failed (eg, Bear Stearns,
Lehman Brothers, and Washington Mutual), causing a sudden shock to the
financial market.
Several researchers conclude that the ongoing crisis does not reflect a failure
of free markets, but rather a reaction of market participants to distorted
incentives (Demirguc-Kunt and Serven, 2009). Demirguc-Kunt and Serven
argue that the 'sacred cows' of financial and macro policies are not 'dead'
because of the crisis. Managing a systemic panic requires policy decisions to be
made in different stages: the immediate containment stage and a longer-term
resolution accompanied by structural reforms. Policies employed to reestablish
confidence in the short term, such as providing blanket guarantees or the
government buying large stakes in the financial sector, are fraught with moral
hazard problems in the long term and might be interpreted by the market as
permanent deviations from well-established policy positions. The long-term
financial sector policies should align private incentives with the public interest
without taxing or subsidizing private risk-taking (Demirguc-Kunt and Serven,
2009). Although well designed prudential regulations cannot completely
eliminate the risk of crises, they can make crises less frequent. However,
balancing the short- and long-term policies becomes complex in the framework
of an integrated and globalized financial system.
Analyzing the Asian financial crisis, Johnson et al (2000) present evidence
that country-level corporate governance practices and institutions, such as the
legal environment, have an important effect on currency depreciations and
stock market declines during financial crisis periods. The authors borrow from
the corporate governance literature (see Shleifer and Vishny, 1997) theoretical
arguments that corporate governance is an effective mechanism for minimizing
agency conflicts between inside managers and outside stakeholders. The
authors empirically show that corporate governance — measured as efficiency
of the legal system, corruption and rule of law — explains more of the variation
in exchange rates and stock market performance than do macroeconomic
variables during the Asian crisis.
Angkinand (2009) reviews methods used to evaluate the output loss from
financial crises. The author argues that an empirical methodology estimating
the total output loss per crisis from the deviation of actual output from the
potential output trend — the gap approach — estimates the economic costs
of crises better than a methodology that estimates a dummy variable to capture
the crisis — the dummy variable approach — because the output costs of different
crisis episodes vary significantly.
A book by Barth et al (2009) provides a descriptive analysis explaining how
the crisis emerged in the United States and what actions the US government
is taking to remedy the economic and credit market contractions. A valuable
contribution of the study is a list of US bailout allocations and obligations.
This list is also frequently updated and reported on the Milken Institute web
page.[6]

[6] http://www.milkeninstitute.org/publications/publications.taf?function=detail&ID=38801185&cat=resrep.
3 Review of operations research models
In this section, we describe selected operations research models that are
frequently used in the empirical literature to predict defaults or failures of
banks and that could be used to predict defaults of loans or non-financial
institutions.
Predicting the default risk for banks, loans and securities is a classic, yet
timely issue. Since the work of Altman (1968), who suggested using the
so-called 'Z score' to predict firms' default risk, hundreds of research articles
have studied this issue (for reference, see two review articles: Kumar and Ravi
(2007) and Fethi and Pasiouras (2009)).
Several studies have shown that intelligence modeling techniques used in
operations research can be applied to predicting bank failures and crises.
For example, Celik and Karatepe (2007) find that artificial neural network
models can be used to forecast the rates of non-performing loans relative
to total loans, capital relative to assets, profit relative to assets, and equity
relative to assets. In another example, Alam et al (2000) demonstrate that
fuzzy clustering and self-organizing neural networks provide classification tools
for identifying potentially failing banks.
Most central banks have employed various Early Warning Systems (EWS)
to monitor the risk of banks for years. However, the repeated occurrence
of banking crises during the past two decades — such as the Asian crisis, the
Russian bank crisis, and the Brazilian bank crisis — indicates that safeguarding
the banking system is no easy task. According to the Federal Deposit Insurance
Corporation Improvement Act of 1991, regulators in the United States must
conduct on-site examinations of bank risk every 12–18 months. Regulators use
a rating system (the CAMELS rating) to indicate the safety and soundness
of banks. CAMELS ratings include six parts: capital adequacy, asset quality,
management expertise, earnings strength, liquidity and sensitivity to market
risk.
Davis and Karim (2008a) evaluate statistical and intelligence techniques in
their analysis of banking crises. Specifically, they compare the logistic
regression (Logit) and the Signal Extraction EWS methods.[7] They find
that the choice of estimation model makes a difference in terms of indicator
performance and crisis prediction. Specifically, the Logit model performs better
as a global EWS and Signal Extraction is preferable as a country-specific
EWS. Davis and Karim (2008b) test whether EWS based on the Logit
and binomial tree approaches (this technique is described below) could have
helped predict the current subprime crisis in the US and UK. Using twelve
macroeconomic, financial and institutional variables, they find that among
global EWS for the US and UK, the Logit performs best. However, this
model, like many others, has only limited ability to predict crises.

[7] The term 'signal extraction' refers to a statistical tool that allows for isolation of a
pattern in the data — the signal — from noisy or raw time-series data.
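The sketch below illustrates the basic idea behind the signal extraction (signals) approach mentioned above: an indicator issues a warning when it crosses a threshold, and the threshold is chosen to minimize the noise-to-signal ratio in a historical sample. The indicator, data and threshold grid are invented for illustration; this is not Davis and Karim's implementation.

```python
# Signals-approach illustration: pick the warning threshold for an indicator
# that minimizes its noise-to-signal ratio in a (synthetic) historical sample.
import numpy as np

rng = np.random.default_rng(4)
credit_growth = rng.normal(0.08, 0.05, 400)                             # hypothetical indicator
crisis_ahead = rng.uniform(size=400) < np.clip(credit_growth, 0, None)  # synthetic link

def noise_to_signal(threshold):
    signal = credit_growth > threshold
    hit = np.mean(signal & crisis_ahead) / max(np.mean(crisis_ahead), 1e-9)       # warnings before crises
    false = np.mean(signal & ~crisis_ahead) / max(np.mean(~crisis_ahead), 1e-9)   # false alarms
    return false / hit if hit > 0 else np.inf

grid = np.quantile(credit_growth, np.linspace(0.5, 0.95, 10))
best = min(grid, key=noise_to_signal)
print(f"chosen threshold: {best:.3f}, noise-to-signal: {noise_to_signal(best):.2f}")
```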
West (1985) uses the Logit model, along with factor analysis, to measure
and describe banks' financial and operating characteristics. Data were taken
from Call and Income Reports, as well as Examination Reports, for 1,900
commercial banks in several states of the US. According to the analysis, the
factors identified by the Logit model as important descriptive variables for
the banks' operations are similar to those used for CAMELS ratings. He
demonstrates that his combined method of factor analysis and Logit estimation
is useful when evaluating banks' operating conditions.
Among the statistical techniques for analyzing and predicting bank failures,
Discriminant Analysis (DA) was the leading technique for many years (eg,
Karels and Prakash (1987), Haslem et al (1992)). There are three subcategories
of DA: Linear, Multivariate, and Quadratic. One drawback of DA is that it
requires a normal distribution of regressors.[8] When regressors are not normally
distributed, maximum likelihood methods, such as Logit, can be used.[9] DA is
a tool for analyzing cross-sectional data. If one needs to analyze time series
data on bank, firm, or loan defaults, hazard or duration analysis models can
be used instead of DA models.[10]
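The short sketch below contrasts linear discriminant analysis with a Logit on the same synthetic bank data, echoing the point that DA assumes (approximately) normally distributed regressors while Logit does not; the variables and data-generating process are assumptions made for illustration.

```python
# Quick sketch: Linear Discriminant Analysis versus Logit on the same synthetic
# bank data, one regressor roughly normal and one heavily skewed.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(5)
n = 1_000
X = np.column_stack([
    rng.normal(0.10, 0.03, n),     # capital ratio (roughly normal)
    rng.exponential(0.02, n),      # nonperforming-loan ratio (skewed)
])
index = -3 - 40 * (X[:, 0] - 0.10) + 60 * X[:, 1]
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-index))).astype(int)

for name, model in [("LDA", LinearDiscriminantAnalysis()), ("Logit", LogisticRegression())]:
    print(name, cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean().round(3))
```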
Canbas et al (2005) propose an Integrated Early Warning System (IEWS)
that combines DA, Logit, Probit, and Principal Component Analysis (PCA)
and can help predict bank failure. First, they use PCA to detect three
financial components that significantly explain the changes in the financial
condition of banks. They then employ DA, Logit and Probit regression
models. By combining all of these together, they construct an IEWS. The
authors use data for 40 privately owned Turkish commercial banks to test the
predictive power of the IEWS, concluding that the IEWS has more predictive
ability than the other models used in the literature.
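In the spirit of such an integrated system, the sketch below chains principal component extraction with a Logit classifier using scikit-learn; it is not Canbas et al's specification, and the ratios and failure labels are synthetic.

```python
# Sketch in the spirit of an integrated early warning system: extract a few
# principal components from bank financial ratios, then feed them to a Logit.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(6)
ratios = rng.normal(size=(200, 12))                 # 12 hypothetical financial ratios
failed = (ratios[:, :3].mean(axis=1) + 0.5 * rng.normal(size=200) < -0.5).astype(int)

iews = make_pipeline(StandardScaler(), PCA(n_components=3), LogisticRegression())
iews.fit(ratios, failed)
print("in-sample accuracy:", iews.score(ratios, failed))
```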
Among intelligence techniques, Neural Networks (NN) is the most widely
used. The NN model developed out of the fields of artificial intelligence
and brain modeling, and contains mathematical and algorithmic elements that
mimic the biological neural networks of the human nervous system. The
method considers an interrelated group of artificial neurons and processes
information associated with them using a so-called connectionist approach,
[8] Martin (1977) is an early study that uses both Logit and DA statistical methods to
predict bank failures in the period from 1975 to 1976, based on data obtained from the
Federal Reserve System. The author finds that the two models produce similar classifications
in terms of identifying failures/non-failures of banks.
[9] As in, for example, Martin (1977), Ohlson (1980), Kolari et al (2002) and Demyanyk (2008).
[10] See Cole and Gunther (1995), Lane et al (1986), Molina (2002), among many others.
where network units are connected by a flow of information. The structure
of NN models changes based upon external or internal information that flows
through the network during the learning phase and uses nonlinear function
approximation tools to test the relationship between explanatory factors.
Boyacioglu et al (2008) apply various NN, Support Vector Machine
(SVM) and multivariate statistical methods to the bank failure prediction
problem in Turkey. They use financial ratios similar to those used in CAMELS
ratings. In the category of NN, four different architectures are employed,
namely MLP, CL, SOM and LVQ (the details of these architectures are
not described in this review). The multivariate statistical methods tested
are multivariate discriminant analysis, K-means cluster analysis, and Logit
regression analysis. According to the comparison, MLP and LVQ can be
considered the most successful models in predicting the financial failure of
banks in the sample.
The Back-Propagation Neural Network (BPNN) model is a multilayer NN
model. The first layer is constructed from input units, the middle layer consists
of hidden units, and the last layer consists of output units. Each upper layer
receives inputs from units of the lower level and transmits output to units of the
layer above it. The important feature of BPNN is that the errors at the hidden
units are calculated by back-propagating the errors of the output layer back
through the network. BPNN overcomes the classification restriction of a
single-layer network, and it is one of the most commonly used methods for
classification and prediction problems. Many studies compare the classification
and prediction accuracy of BPNN and other methods and find that, in most
cases, BPNN outperforms other models.
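The minimal sketch below implements a single-hidden-layer network trained by back-propagation on a synthetic failed/non-failed classification task, purely to illustrate the forward and backward passes described above; the inputs and labels are invented.

```python
# Minimal back-propagation sketch: one hidden layer, sigmoid activations,
# trained with gradient descent on a synthetic failed/non-failed task.
import numpy as np

rng = np.random.default_rng(7)
X = rng.normal(size=(400, 5))                      # 5 hypothetical CAMELS-style ratios
y = (X[:, 0] + X[:, 1] ** 2 - X[:, 2] > 0.5).astype(float).reshape(-1, 1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

W1, b1 = rng.normal(scale=0.5, size=(5, 8)), np.zeros(8)   # input -> hidden
W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)   # hidden -> output
lr = 0.5

for _ in range(2000):
    # forward pass
    H = sigmoid(X @ W1 + b1)
    out = sigmoid(H @ W2 + b2)
    # backward pass: propagate the output error back through the hidden layer
    d_out = (out - y) * out * (1 - out)
    d_hidden = (d_out @ W2.T) * H * (1 - H)
    W2 -= lr * H.T @ d_out / len(X); b2 -= lr * d_out.mean(axis=0)
    W1 -= lr * X.T @ d_hidden / len(X); b1 -= lr * d_hidden.mean(axis=0)

print("training accuracy:", ((out > 0.5) == (y > 0.5)).mean())
```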
For example, Tam (1991) uses a BPNN model to predict bank failures in
a sample of Texas banks one year and two years prior to their failures. The
input variables he uses are based on the CAMELS criteria. He finds that BPNN
outperforms all other methods, such as DA, Logit, and K-nearest neighbor (this
method is described below), in terms of predictive accuracy. Similarly,
several other studies, briefly described below, find that BPNN offers better
prediction or better classification accuracy than other methods.
Ravi and Pramodh (2008) propose a Principal Component Neural Network
(PCNN) architecture for bankruptcy prediction in commercial banks. In this
architecture, the hidden layer is completely replaced by what is referred to as
a 'principal component layer'. This layer consists of a few selected components
that perform the function of hidden nodes. The authors tested the framework
on data from Spanish and Turkish banks. According to the estimated results,
hybrid models that combine PCNN with several other models outperform the
other classifiers used in the literature in predicting bank bankruptcy.
Tam and Kiang (1992) compare the power of linear discriminant analysis
(LDA), Logit, K-Nearest Neighbor (described below), Interactive Dichotomizer
3 (ID3), feedforward NN and BPNN on bank failure prediction problems.
They find that BPNN outperforms the other techniques for a one-year-prior
training sample, while DA outperforms the others for a two-years-prior training
sample. However, for holdout samples, BPNN outperforms the others in
both the one-year-prior and the two-years-prior samples. In the jackknife
method, BPNN also outperforms the others in both the one-year-prior and the
two-years-prior holdout samples. In all, they conclude that NN outperforms
the DA method.
Bell (1997) compares Logit and BPNN models in predicting bank failures.
In his study, he uses 28 candidate predictor variables. The architecture of
the BPNN has twelve input nodes, six hidden nodes and one output node. He
finds that neither the Logit nor the BPNN model dominates the other in terms
of predictive ability. However, BPNN is found to be better for complex decision
processes.
Swicegood and Clark (2001) compare DA, BPNN and human judgment in
predicting bank failures. The authors use data from bank Call Reports. They
find that BPNN outperforms the other models in identifying underperforming
banks.
Olmeda and Fernandez (1997) compare the accuracy of bankruptcy
prediction methods that include classifiers in a stand-alone model with those in
a hybrid system, which integrates several classifiers. They propose a framework
for formulating the optimal mixture of the technologies as an optimization
problem and solve it using a genetic algorithm. Using data from the Spanish
banking system, they find that BPNN performs the best, Logit the second best,
and multivariate adaptive regression splines (MARS), C4.5 (not described in
this review) and DA follow in that order. The authors then combine models
using a voting scheme and a compensation aggregation method. They find
that the prediction rates produced by the combined models are higher than
those produced by the stand-alone models.
The Trait Recognition technique develops a model from different segments
of the distribution of each variable and the interactions of these segments
with one or more other variables' segmented distributions. It uses two sets
of discriminators, the 'safe traits' and the 'unsafe traits', known as features.
These features can then be used to predict bank failures by voting on each
bank and classifying it as 'failed' or 'non-failed'. Trait recognition is a
nonparametric approach that does not impose any distributional assumptions
on the testing variables already contained within the data. The advantage
of the trait recognition approach is that it exploits information about the
complex interrelations of variables. The power of this approach depends on
the adequate selection of cut points for each of the variables, so that all failed
banks can be located below some threshold and all non-failed banks above it.
Kolari et al (2002) develop an EWS based on Logit and the Trait
Recognition method for large US banks. The Logit model correctly classifies
over 96% of the banks one year prior to failure and 95% of the banks two years
prior to failure. For the Trait Recognition model, half of the original sample is
used. They find that with data classification both one year and two years prior
to failure, the accuracy of the Trait Recognition model is 100%. Therefore,
they conclude that the Trait Recognition model outperforms the Logit model
in terms of type-I and type-II errors.
Lanine and Vander Vennet (2006) employ a Logit model and a Trait
Recognition approach to predict failures among Russian commercial banks.
The authors test the predictive power of the two models based on their
prediction accuracy using holdout samples. Although both models perform
better than the benchmark, the Trait Recognition approach outperforms Logit
in both the original and the holdout samples. Among the predictor variables,
they find that expected liquidity plays an important role in bank failure
prediction, as do asset quality and capital adequacy.
The Support Vector Machine (SVM) technique is based on the Structural
Risk Minimization (SRM) principle from computational learning theory, which
was introduced by Vapnik (1995). In the SVM method, input data are structured
as two sets of vectors in a multi-dimensional space. The purpose is to maximize
the margin between the two data sets. In order to calculate the margin, two
parallel hyperplanes need to be constructed, one on each side of the separating
hyperplane, which are forced against the two data sets. A good separation
can be achieved by the hyperplane that has the largest distance from the
neighboring data points of both classes; the larger the margin, the better the
generalization error of the classifier. In sum, SVM uses a special linear model
and the optimal separating hyperplane to achieve the maximum separation
between two classes. The training points that are closest to the maximum
margin hyperplane are called support vectors. Such models are utilized in
Vapnik (1995), Boyacioglu et al (2008), Chen and Shih (2006) and Huang et
al (2004), among others.
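A minimal scikit-learn sketch of an SVM classifier is shown below; the financial ratios and failure labels are synthetic, and the example is only meant to illustrate fitting a maximum-margin classifier and inspecting its support vectors.

```python
# Sketch: a support vector machine separating failed from non-failed banks
# (synthetic ratios). The support vectors are the training points closest to
# the maximum-margin hyperplane.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(8)
X = rng.normal(size=(300, 4))                       # hypothetical financial ratios
y = (X[:, 0] - X[:, 1] + 0.3 * rng.normal(size=300) > 0).astype(int)

svm = SVC(kernel="rbf", C=1.0).fit(X, y)
print("support vectors per class:", svm.n_support_)
print("training accuracy:", svm.score(X, y))
```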
The Decision Tree (DT) technique, which comes from research on machine
learning, uses a recursive partitioning algorithm to induce rules on a given
data set. Most decision tree algorithms are used for solving classification
problems. However, algorithms like classification and regression trees (CART)
can also be used for solving prediction problems. In this case, a binary decision
tree needs to be developed through a set of IF-THEN rules. These rules can
be used to accurately classify cases (eg, banks). A number of algorithms
are used for building decision trees, including CHAID (chi-squared automatic
interaction detection), CART, C4.5 and C5.0. For more information, see
Marais et al (1984) and Frydman et al (1985).
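The sketch below fits a small CART-style tree with scikit-learn and prints the fitted splits as IF-THEN rules; the two ratios and the failure rule are illustrative assumptions.

```python
# Sketch: a small CART-style classification tree whose fitted splits can be
# read as IF-THEN rules for classifying banks as failed or non-failed.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(9)
X = np.column_stack([rng.normal(0.08, 0.03, 500),      # capital ratio
                     rng.uniform(0.0, 0.15, 500)])     # nonperforming-loan ratio
y = ((X[:, 0] < 0.05) | (X[:, 1] > 0.10)).astype(int)  # synthetic failure rule

tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
print(export_text(tree, feature_names=["capital_ratio", "npl_ratio"]))
```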
The Rough Set technique is a mathematical method for modeling
incomplete data based on a concept introduced by Pawlak (1982). It
approximates a usually vague objective with predefined categories, which
can then be iteratively analyzed. See Greco et al (1998) for details.
Case-Based Reasoning (CBR) is a method similar to the cognitive process
humans follow in solving problems intuitively. CBR can be represented by a
schematic cycle comprising four steps. The first step is to retrieve the most
similar cases. The second is to reuse the cases to attempt to solve the problem.
The third is to revise the proposed solution, if necessary. And the fourth is to
retain the new solution as a part of a new case. The CBR methodology enables
an analyst to predict the failure of a company based on failures of other
companies that occurred in the past.
The Nearest Neighbor technique classifies an object in the class of its
nearest neighbor in the measurement space, using a certain distance measure
such as local metrics, global metrics, or Mahalanobis or Euclidean distance.
The method has a variety of applications, ranging from analyzing settlement
patterns in a landscape to spam classification, or any other distribution of
objects and events. One can determine whether objects or events are random,
clustered, or distributed regularly. The K-nearest neighbor (K-NN) technique
is a modified Nearest Neighbor technique. In this model, K is a positive,
usually small, integer. An object (for example, a bank) is assigned to the class
most common amongst its K nearest neighbors (the class is either 'failed' or
'non-failed').
Zhao et al (2009) compare the performance of several factors that are used
for predicting bank failures based on Logit, DT, NN, and K-NN models. The
authors find that the choice of model is important for the explanatory power
of the predictors.
The Soft Computing technique is a hybrid system combining intelligence
and statistical techniques. Specifically, it refers to a combination of
computational techniques used to model and analyze complex phenomena.
Compared to traditional 'hard' computing techniques — which use exact
computations and algorithms — soft computing is based on inexact
computation, trial-and-error reasoning, and subjective decision making. Such
computation builds on mathematical formalization of cognitive processes
similar to those of human minds. More information is available in Back and
Sere (1996), Jo and Han (1996), and Tung et al (2004).
Data Envelopment Analysis (DEA) is a non-parametric performance
method used to measure the relative efficiencies of organizational or
decision-making units (DMUs). DEA applies linear programming to the
observed inputs consumed and outputs produced by decision-making units
(such as branches of a bank or departments of an institution). It constructs an
efficient production frontier based on best observed practices. Each DMU's
efficiency is then measured against this computed frontier. The relative
efficiency is calculated as the ratio of the weighted sum of all outputs to
the weighted sum of all inputs. The weights are selected to achieve Pareto
optimality for each DMU.
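The sketch below solves a small input-oriented CCR DEA model in multiplier form with scipy, one linear program per DMU: output weights u and input weights v are chosen to maximize the weighted output of the evaluated unit, subject to its weighted input equaling one and no unit scoring above one. The branch-level inputs and outputs are invented for illustration.

```python
# Minimal CCR DEA sketch (multiplier form), one linear program per DMU.
import numpy as np
from scipy.optimize import linprog

inputs = np.array([[20.0, 300.0],    # eg, staff and operating cost per branch
                   [30.0, 200.0],
                   [40.0, 400.0],
                   [20.0, 200.0]])
outputs = np.array([[100.0, 50.0],   # eg, loans and deposits per branch
                    [ 80.0, 60.0],
                    [120.0, 40.0],
                    [ 90.0, 70.0]])
n, m = inputs.shape
_, s = outputs.shape

for o in range(n):
    # decision variables z = [u (s output weights), v (m input weights)]
    c = np.concatenate([-outputs[o], np.zeros(m)])            # maximize u.y_o
    A_eq = np.concatenate([np.zeros(s), inputs[o]]).reshape(1, -1)
    b_eq = [1.0]                                              # v.x_o = 1
    A_ub = np.hstack([outputs, -inputs])                      # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
    print(f"DMU {o}: efficiency = {-res.fun:.3f}")
```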
Luo (2003) uses the DEA model to evaluate the profitability and marketability
efficiencies of large banks. In the model, the author analyzes banks' revenue
and profit as the measured outputs of both efficiencies. He finds that
marketability inefficiency creates more problems for the analyzed banks than
profitability inefficiency. In an application to the prediction of banking crises,
the findings suggest that the overall technical efficiency of the profitability
performance is associated with the likelihood of bank failure.
Avkiran (2009) analyzes the profit efficiency of commercial banks in the
United Arab Emirates by applying a standard DEA and a network DEA
(NDEA) technique. The author mentions that the standard DEA does not
provide sufficient detail to identify the specific sources of inefficiency; network
DEA gives access to this underlying diagnostic information, because each
division of an institution can be treated as an independent DMU under the
NDEA. Note that the efficiency measures derived from stochastic DEA do not
account for statistical noise; the impact of measurement error on efficiency
is generally overlooked and it is not possible to conduct formal statistical
inference by using stochastic DEA.
Kao and Liu (2004) formulate a DEA model of interval data for use in
evaluating the performance of banks. Their study makes advance predictions
of the performance of 24 Taiwanese banks based on uncertain financial data
(reported in ranges) and also presents the predicted efficiency scores (again
in ranges). They find that the model-predicted efficiency scores are similar to
the actual efficiency scores (calculated from the data). They also show that the
poor performance of the two banks taken over by the Financial Restructuring
Fund of Taiwan could actually have been predicted in advance using their
method.
Tsionas and Papadakis (2009) provide a statistical framework that can
be used with stochastic DEA. In order to make inferences on the efficiency
scores, the authors use a Bayesian approach to the problem, set up around
simulation techniques. They also test the new methods on the efficiency of
Greek banks, and find that the majority of the Greek banks operate close to
market best practices.
Cielen et al (2004) compare the performance of a DEA model, Minimized
Sum of Deviations (MSD), and a rule induction (C5.0) model in bankruptcy
prediction. MSD is a combination of linear programming (LP) and DA. Using
data from the National Bank of Belgium, they find that MSD, DEA and C5.0
obtain correct classification rates of failure for 78.9%, 86.4% and 85.5%
of banks, respectively. They conclude that DEA outperformed the C5.0 and
MSD models in terms of accuracy.
Kosmidou and Zopounidis (2008) develop a bank failure prediction
model based on a multicriteria decision technique called UTilites Additives
DIScriminants (UTADIS). The purpose of the UTADIS method is to develop a
classification model through an additive value function. Based on the values
obtained from the additive value function, the authors classify banks into
multiple groups by comparing them with reference profiles (also called
cut-off points). UTADIS is well suited to ordinal classification problems
and is not sensitive to statistical problems because the additive utility
function is estimated through mathematical linear programming techniques
instead of statistical methods. Using a sample of US banks for the years
1993–2003, the authors use this technique to differentiate between failed
and non-failed US banks. The results show that UTADIS is quite efficient
for the evaluation of bank failure as early as four years before it occurs.
The authors also compare UTADIS with other traditional multivariate data
analysis techniques and find that UTADIS performs better and could be used
efficiently for predicting bank failures.
The Multicriteria Decision Aid (MCDA) method is a model that allows
for the analysis of several preference criteria simultaneously. Zopounidis
and Doumpos (1999b) apply MCDA to sorting problems, where a set of
alternative actions is classi?ed into several prede?ned classes. Based on the
multidimensional nature of ?nancial risk, Doumpos and Zopounidis (2000)
propose a new operational approach called the Multi-Group Hierarchical
Discrimination (M.H.DIS) method — which originates from MCDA — to
determine the risk classes to which the alternatives belong. Using World
Bank data, the authors apply this method to develop a model which classi?es
143 countries into four risk classes based on their economic performance and
creditworthiness. The authors conclude that this method performs better than
traditional multiple discriminant analysis.
There are several other models not discussed in this section, such as Fuzzy
Logic (FL) techniques, the Evolutionary Approach, and others.
MCDA can be used for credit ratings and assessments of bank soundness.
For example, Gaganis et al (2006) apply an MCDA model using the UTADIS
method to classify banks into three groups based on their soundness. The
sample includes 894 banks from 79 countries, and the model is developed
through a tenfold cross-validation procedure. Their results show that asset
quality, capitalization and the market in which banks operate are the most
important criteria in classifying the soundness of banks. Profitability and
efficiency are also important factors associated with banks' performance.
Furthermore, they find that UTADIS outperforms DA and Logit in terms of
classification accuracy. Zopounidis and Doumpos (1999a) also explore whether
the UTADIS method is applicable to the analysis of business failure, comparing
it with DA and with standard Logit and Probit statistical models.
Pasiouras et al (2007) test whether an MCDA model can replicate Fitch's
credit ratings of Asian banks. Five financial and five non-financial variables
measuring bank and country characteristics are included in the model, which
is tested through tenfold cross-validation. The results show that
'equity/customer and short-term funding, net interest margin and return on
average equity' are the most important financial variables, while 'the number
of shareholders, the number of subsidiaries and the banking environment of
the country' are the most important non-financial factors. The authors compare
the accuracy of this prediction model with that of DA and ordered Logit; they
find that MCDA is more efficient and that it replicates the Fitch credit ratings
with 'satisfactory accuracy'.
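The evaluation design these studies rely on, tenfold cross-validation with DA and Logit
as benchmarks, can be sketched as follows on synthetic data that stand in for the
bank-level ratios; the sketch does not reproduce the authors' UTADIS or MCDA models,
their variables, or their samples.

    # Minimal sketch: tenfold cross-validation of two benchmark classifiers
    # (discriminant analysis and logistic regression) on synthetic data.
    from sklearn.datasets import make_classification
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold, cross_val_score

    # Synthetic stand-in: 600 'banks', 8 ratios, 3 soundness groups.
    X, y = make_classification(n_samples=600, n_features=8, n_informative=5,
                               n_classes=3, random_state=0)
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    for name, model in [("DA", LinearDiscriminantAnalysis()),
                        ("Logit", LogisticRegression(max_iter=1000))]:
        acc = cross_val_score(model, X, y, cv=cv, scoring="accuracy")
        print(f"{name}: mean 10-fold accuracy = {acc.mean():.3f}")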
Niemira and Saaty (2004) use a multiple-criteria decision-making model to
predict the likelihood of a financial crisis based on an Analytic Network Process
(ANP) framework. They test the model on the US banking crisis of the 1990s
and find that the ANP analysis provides a structure that can reduce judgmental
forecast error through improved reliability of information processing. They
conclude that the ANP framework is more flexible and comprehensive than
traditional models and is a promising methodology for forecasting the
probability of crises.
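The computational core of the ANP can be shown with a small example: interdependent
factors are arranged in a column-stochastic 'supermatrix', and their limiting priorities
are read off the limit of its powers. The three-factor matrix below is purely hypothetical
and is not Niemira and Saaty's crisis network.

    # Minimal ANP sketch: limiting priorities of a hypothetical 3-factor supermatrix.
    import numpy as np

    # Each column describes how strongly a factor is influenced by the others
    # (columns sum to one, i.e. the supermatrix is column stochastic).
    W = np.array([[0.2, 0.5, 0.3],
                  [0.5, 0.2, 0.4],
                  [0.3, 0.3, 0.3]])

    limit = np.linalg.matrix_power(W, 100)   # powers converge for a primitive stochastic matrix
    priorities = limit[:, 0]                 # every column converges to the same priority vector
    print("limit priorities:", np.round(priorities, 3))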
Ng et al (2008) propose a Fuzzy Cerebellar Model Articulation Controller
(FCMAC) model based on a compositional rule of inference, called
FCMAC-CRI(S). The new architecture integrates fuzzy systems and NNs to
create a hybrid structure called a neural fuzzy network, which operates
through localized learning. It takes public financial information as input and
analyzes patterns of financial distress through fuzzy IF-THEN rules. Such
processing can provide a basis for an EWS and offer insights into various
aspects of financial distress. The authors compare the accuracy of
FCMAC-CRI(S) with Cox's proportional hazards model and the
GenSoFNN-CRI(S) network model and find that the new approach performs
better than both benchmark models.
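A flavor of the fuzzy IF-THEN reasoning involved can be conveyed with a toy example
that maps two publicly observable ratios to a distress score through triangular
membership functions and two rules. It illustrates rule-based fuzzy inference only; it is
not the FCMAC-CRI(S) architecture, its localized learning scheme, or the authors' rule
base, and the membership functions and rules are assumptions made for the example.

    # Toy fuzzy IF-THEN inference on hypothetical financial ratios.
    def tri(x, a, b, c):
        """Triangular membership function rising from a, peaking at b, falling to c."""
        return max(min((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

    def distress_score(capital_ratio, npl_ratio):
        low_cap  = tri(capital_ratio, 0.00, 0.02, 0.06)
        high_cap = tri(capital_ratio, 0.04, 0.10, 0.20)
        high_npl = tri(npl_ratio, 0.05, 0.15, 0.30)
        low_npl  = tri(npl_ratio, -0.01, 0.02, 0.06)
        # Rule 1: IF capital is low AND problem loans are high THEN distress is high (0.9).
        # Rule 2: IF capital is high AND problem loans are low THEN distress is low (0.1).
        rules = [(min(low_cap, high_npl), 0.9),
                 (min(high_cap, low_npl), 0.1)]
        fire = sum(w for w, _ in rules)
        return sum(w * out for w, out in rules) / fire if fire > 0 else 0.5

    print(distress_score(0.03, 0.18))   # thinly capitalized, many problem loans -> near 0.9
    print(distress_score(0.12, 0.01))   # well capitalized, clean loan book -> near 0.1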
4 Concluding remarks
This article summarizes empirical economics and operations research studies
that aim to explain, predict, and remedy financial crises and bank failures
in the United States and other countries. The paper provides an analysis of
the financial and economic circumstances associated with the subprime
mortgage crisis in the United States, along with an extensive review of
intelligent techniques used in the operations research literature to predict
bank failures. We suggest that operations research techniques be applied more
broadly in analyses of financial crises.
References
Alam, P — Booth, D — Lee, K — Thordarson, T (2000) The use of
fuzzy clustering algorithm and self-organizing neural networks for
identifying potentially failing banks: an experimental study. Expert
Systems with Applications, 18, 185—199.
Altman, E I (1968) Financial ratios, discriminant analysis and the
prediction of corporate bankruptcy. Journal of Finance, 23(4), 589—609.
Angkinand, A P (2009) Output loss and recovery from banking and
currency crises: Estimation issues. Working paper.
Ambrose, B — Capone, C (2000) The hazard rates of first and second
defaults. Journal of Real Estate Finance and Economics, 20(3), 275—293.
Avkiran, N K (xxxx) Opening the black box of efficiency analysis: An
illustration with UAE banks. Omega.
Back, B T — Sere, K (1996) Neural network and genetic algorithm for
bankruptcy prediction. Expert Systems and Applications, 11(4), 407—413.
Barth, J R — Li, T — Lu, W — Phumiwasana, T — Yago, G (2009) The Rise and
Fall of the U.S. Mortgage and Credit Markets: A Comprehensive
Analysis of the Meltdown. John Wiley & Sons.
Bell, T B (1997) Neural nets or the logit model? A comparison of
each model’s ability to predict commercial bank failures. International
Journal of Intelligent Systems in Accounting, Finance and Management, 6,
249—264.
Bell, J — Pain, D (xxxx) Leading indicator models of banking crises — a
critical review. Bank of England Financial Stability Review.
Berger, A N — DeYoung, R (1997) Problem loans and cost efficiency in
commercial banks. Journal of Banking and Finance, 21, 849—870.
Boyacioglu, M A — Kara, Y — Baykan, O K (2008) Predicting bank
financial failures using neural networks, support vector machines
and multivariate statistical methods: A comparative analysis in the
sample of savings deposit insurance fund (SDIF) transferred banks
in Turkey. Expert Systems with Applications, 36(2).
Brunnermeier, M K (2008) Deciphering the liquidity and credit crunch
2007—08. Working paper.
Calhoun, C — Deng, Y (2002) A dynamic analysis of fixed- and
adjustable-rate mortgage terminations. Journal of Real Estate Finance
and Economics, 24, 9—33.
Canbas, S — Cabuk, A — Kilic, S B (2005) Prediction of commercial bank
failure via multivariate statistical analysis of financial structures: the
Turkish case. European Journal of Operational Research, 166, 528—546.
Caprio Jr, G — Demirguc-Kunt, A — Kane, E J (2008) The 2007 meltdown in
structured securitization. Working paper.
Celik, A E — Karatepe, Y (2007) Evaluating and forecasting banking
crises through neural network models: An application for Turkish
banking sector. Expert Systems with Applications, 33, 809—815.
Chen, W H — Shih, J Y (2006) A study of Taiwan’s issuer credit
rating systems using support vector machines. Expert Systems with
Applications, 30, 427—435.
Cielen, A — Peeters, L — Vanhoof, K (2004) Bankruptcy prediction using
a data envelopment analysis. European Journal of Operational Research,
154, 526—532.
Clapp, J — Deng, Y — An, X (2006) Unobserved heterogeneity in models
of competing mortgage termination risks. Real Estate Economics, 34(2),
243—273.
Cole, R — Gunther, A (1995) A CAMEL rating’s shelf life. Federal Reserve
Bank of Dallas Review, 13—20.
Davis, E P — Karim, D (2008a) Comparing early warning systems for
banking crises. Journal of Financial Stability, 4, 89—120.
Davis, E P — Karim, D (2008b) Could early warning systems have helped
to predict the sub-prime crisis? National Institute Economic Review, 206,
35—47.
Dell’Ariccia, G — Igan, D — Laeven, L (2008) Credit booms and lending
standards: Evidence from the subprime mortgage market. Working
paper.
Demirguc-Kunt, A — Detragiache, E (1998) The determinants of banking
crises in developing and developed countries. IMF Staff Papers.
Demirguc-Kunt, A — Detragiache, E (2002) Does deposit insurance
increase banking system stability? An empirical investigation.
Journal of Monetary Economics, 49, 1373—1406.
Demirguc-Kunt, A — Detragiache, E (2005) Cross-country empirical
studies of systemic bank distress: A survey. Working paper 3719, World
Bank Policy Research.
Demirguc-Kunt, A — Detragiache, E — Gupta, P (2006) Inside the crisis:
An empirical analysis of banking systems in distress. Journal of
International Money and Finance, 25, 702—718.
Demirguc-Kunt, A — Serven, L (2009) Are all the sacred cows dead?
Implications of the financial crisis for macro and financial policies.
Working paper 4807, World Bank.
Demyanyk, Y (2008) Quick exits of subprime mortgages. Federal Reserve
Bank of St. Louis Review, 92(1), March/April.
Demyanyk, Y — Van Hemert, O (2008) Understanding the subprime
mortgage crisis. Review of Financial Studies, forthcoming.
Deng, Y (1997) Mortgage termination: An empirical hazard model
with stochastic term structure. Journal of Real Estate Finance and
Economics, 14(3), 309—331.
Deng, Y — Pavlov, A — Yang, L (2005) Spatial heterogeneity in mortgage
terminations by refinance, sale and default. Real Estate Economics,
33(4), 739—764.
Deng, Y — Quigley, J M — Van Order, R A (2000) Mortgage terminations,
heterogeneity and the exercise of mortgage options. Econometrica,
68(2), 275—308.
Diamond, D W — Rajan, R (2009) The credit crisis: Conjectures about
causes and remedies. Working paper.
Doumpos, M — Zopounidis, C (2000) Assessing financial risks using a
multicriteria sorting procedure: The case of country risk assessment.
Omega: The International Journal of Management Science, 29(1), 97—109.
Fethi, M D — Pasiouras, F (2009) Assessing bank performance with
operational research and artificial intelligence techniques: A survey.
Working paper.
Frydman, H — Altman, E I — Kao, D (1985) Introducing recursive
partitioning for financial classification: The case of financial distress.
Journal of Finance, 40(1), 269—291.
Gaganis, C — Pasiouras, F — Zopounidis, C (2006) A multicriteria decision
framework for measuring banks’ soundness around the world. Journal
of Multi-Criteria Decision Analysis, 14, 103—111.
Greco, S — Matarazzo, B — Slowinski, R (1998) A new rough set approach
to multicriteria and multiattribute classification. Rough Sets and
Current Trends in Computing, 60—67.
Haslem, J A — Scheraga, C A — Bedingfield, J P (1992) An analysis
of the foreign and domestic balance sheet strategies of the U.S.
banks and their association to profitability performance. Management
International Review, First Quarter.
Huang, Z — Chen, H — Hsu, C J — Chen, W H — Wu, S (2004) Credit
rating analysis with support vector machines and neural networks:
a market comparative study. Decision Support Systems, 37, 543—558.
Hunter, G (2008) Anatomy of the 2008 financial crisis: An economic
analysis postmortem. Working paper.
Jo, H — Han, I (1996) Integration of case-based forecasting, neural
network and discriminant analysis for bankruptcy prediction. Expert
Systems with Applications, 11(4), 415—422.
Johnson, S — Boone, P — Breach, A — Friedman, E (2000) Corporate
governance in the Asian financial crisis. Journal of Financial Economics,
58, 141—186.
Kao, C — Liu, S T (2004) Predicting bank performance with financial
forecasts: A case of Taiwan commercial banks. Journal of Banking and
Finance, 28, 2353—2368.
Karels, G V — Prakash, A J (1987) Multivariate normality and forecasting
of business bankruptcy. Journal of Business Finance and Accounting, 14(4).
Keys, B J — Mukherjee, T — Seru, A — Vig, V (2008) Did securitization lead
to lax screening? Evidence from subprime loans. Working paper.
Kolari, J — Glennon, D — Shin, H — Caputo, M (2002) Predicting large
US commercial bank failures. Journal of Economics and Business, 54(4),
361—387.
Kosmidou, K — Zopounidis, C (2008) Predicting US commercial bank
failures via a multicriteria approach. International Journal of Risk
Assessment and Management, 9, 26—43.
Kumar, R P — Ravi, V (2007) Bankruptcy prediction in banks and firms
via statistical and intelligent techniques — A review. European Journal
of Operational Research, 180, 1—28.
Lane, W R — Looney, S W — Wansley, J W (1986) An application of the
Cox proportional hazards model to bank failure. Journal of Banking
and Finance, 10, 511—531.
Lanine, G — Vander Vennet, R (2006) Failure predictions in the Russian
bank sector with logit and trait recognition models. Expert Systems
with Applications, 30, 463—478.
Luo, X (2003) Evaluating the profitability and marketability efficiency
of large banks: An application of data envelopment analysis. Journal
of Business Research, 56, 627—635.
Mayer, C — Pence, K (2008) Subprime mortgages: What, where, and to
whom? Working paper, Federal Reserve Board.
Marais, M L — Patel, J — Wolfson, M (1984) The experimental design
of classification models: An application of recursive partitioning
and bootstrapping to commercial bank loan classifications. Journal
of Accounting Research, 22, 87—113.
Martin, D (1977) Early warning of bank failure: A logit regression
approach. Journal of Banking and Finance, 1, 249—276.
Mian, R A — Sufi, A (2008) The consequences of mortgage credit
expansion: Evidence from the U.S. mortgage default crisis. Quarterly
Journal of Economics, Forthcoming.
Molina, C A (2002) Predicting bank failures using a hazard model: the
Venezuelan banking crisis. Emerging Market Review, 2, 31—50.
Niemira, M P — Saaty, T L (2004) An analytic network process model
for financial-crisis forecasting. International Journal of Forecasting, 20,
573—587.
Ng, C S — Quek, C — Jiang, H (2008) FCMAC-EWS: A bank failure
early warning system based on a novel localized pattern learning
and semantically associative fuzzy neural network. Expert Systems with
Applications, 34, 989—1003.
Ohlson, J A (1980) Financial ratios and the probabilistic prediction of
bankruptcy. Journal of Accounting Research, 18, 109—131.
Olmeda, I — Fernandez, E (1997) Hybrid classifiers for financial
multicriteria decision making: The case of bankruptcy prediction.
Computational Economics, 10, 317—335.
Pasiouras, F — Gaganis, C — Doumpos, M (2007) A multicriteria
discrimination approach for the credit rating of Asian banks. Annals
of Finance, 3, 351—367.
Pawlak, Z (1982) Rough sets. International Journal of Computer and
Information Science, 11, 341—356.
Pennington-Cross, A (2003) Credit history and the performance of
prime and nonprime mortgages. Journal of Real Estate Finance and
Economics, 27(3), 279—301.
Pennington-Cross, A — Chomsisengphet, S (2007) Subprime refinancing:
Equity extraction and mortgage termination. Real Estate Economics,
35(2), 233—263.
Ravi, V — Pramodh, C (2008) Threshold accepting trained principal
component neural network and feature subset selection: Application
to bankruptcy prediction in banks. Applied Soft Computing, 8(4),
1539—1548.
Reinhart, C M — Rogoff, K S (2008) Is the 2007 U.S. sub-prime financial
crisis so different? Working paper 13761, NBER.
Shleifer, A — Vishny, R (1997) A survey of corporate governance. The
Journal of Finance, 52(2), 737—783.
Swicegood, P — Clark, J A (2001) Off-site monitoring systems for
predicting bank underperformance: A comparison of neural
networks, discriminant analysis, and professional human judgment.
International Journal of Intelligent Systems in Accounting, Finance and
Management, 10, 169—186.
Tam, K Y (1991) Neural network models and the prediction of bank
bankruptcy. Omega: The International Journal of Management Science,
19(5), 429—445.
Tam, K Y — Kiang, M (1992) Predicting bank failures: A neural network
approach. Decision Sciences, 23, 926—947.
Taylor, J B (2008) The financial crisis and the policy responses: An
empirical analysis of what went wrong. Working paper.
Tsionas, E G — Papadakis, E N (2009) A Bayesian approach to statistical
inference in stochastic DEA. Omega, Forthcoming.
Tung, W L — Quek, C — Cheng, P (2004) GenSo-EWS: A novel
neural-fuzzy based early warning system for predicting bank failures.
Neural Networks, 17, 567—587.
Vapnik, V N (1995) The Nature of Statistical Learning Theory.
Springer-Verlag.
West, R C (1985) A factor analytic approach to bank condition. Journal
of Banking and Finance, 9, 253—266.
Wheelock, D C — Wilson, P W (2000) Why do banks disappear? The
determinants of U.S. bank failures and acquisitions. The Review of
Economics and Statistics, 82, 127—138.
Zhao, H — Sinha, A P — Ge, W (2009) Effects of feature construction
on classification performance: An empirical study in bank failure
prediction. Expert Systems with Applications, 36(2), 2633—2644.
Zopounidis, C — Doumpos, M (1999a) Business failure prediction using
the UTADIS multicriteria analysis method. Journal of Operational
Research Society.
Zopounidis, C — Doumpos, M (1999b) A multicriteria decision aid
methodology for sorting decision problems: The case of financial
distress. Computational Economics, 14(3), 197—218.
