

Interview with Operational Risk Expert


By IAFM Research

23 February, 2009

This interview is an open debate about the approaches that banks have taken to quantify operational risk for Basel II: what the problems have been, what some of the solutions are, and whether Basel II really works at all.

IAFM // Obviously, the biggest challenge for op risk modelling is the tail of the loss distribution.
 
MD // That would be true, and it is no secret. If the loss distribution were not a limiting function (that is, if the curve did not tend towards the x-axis and narrow as you move further out), there would be plenty of data and modelling exercises would be really straightforward. However, most banks would also be broke, because losing a substantial amount of money would be far more common.
 
So we are going to have less data, because as the probability drops off, so does the number of events.
 
IAFM // My understanding is that there are basically two solutions to the lack of data - you either try to get more insight out of the data you do have, OR you try to add qualitative information to the data. Is it right to say that these are two separate schools of thought and that those separate schools still exist today?
 
MD // When it comes to modelling operational risk under Basel II there are really three distinct schools of thought. They are:

(1) sbAMA. The Scenario Based Approach is based on the assessment of forward-looking what-if scenarios. The outputs of the scenarios are entered into an operational risk model, where simulation techniques such as Monte Carlo are used to compute regulatory capital.
 
(2) RDCA. The Risk Drivers and Controls Approach (formerly known as the scorecard approach) uses a series of weighted questions (some of which can be interpreted as scenarios) whose answers yield a score that can be aggregated to allow the allocation of capital between business units.
 
(3) LDA. The Loss Data Approach bases the computation of capital on historic loss data. Standard statistical techniques, such as those the insurance industry has used for years, as well as some more complex derived methods, including extreme value theory, are used to compute regulatory capital. Estimation of exposure is usually performed on the frequency and severity distributions of events separately within the Basel event classifications; the results are then aggregated to give a clear dimension of Value at Risk (see the sketch after this list).
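To make the LDA mechanics concrete, the sketch below simulates a compound annual loss distribution, assuming a Poisson event frequency and lognormal severities. The distribution choices and parameters are purely illustrative assumptions, not figures from the interview.

import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions: Poisson frequency, lognormal severity.
# The parameters below are placeholders, not calibrated values.
lam = 25                # expected number of loss events per year
mu, sigma = 10.0, 2.0   # lognormal parameters of a single loss

n_sims = 100_000
annual_losses = np.empty(n_sims)

for i in range(n_sims):
    n_events = rng.poisson(lam)                      # simulate the event count
    severities = rng.lognormal(mu, sigma, n_events)  # simulate each loss amount
    annual_losses[i] = severities.sum()              # aggregate to an annual loss

# Capital is often read off as a high quantile of the aggregate distribution,
# e.g. the 99.9% level used under Basel II.
var_999 = np.quantile(annual_losses, 0.999)
print(f"Simulated 99.9% annual loss quantile: {var_999:,.0f}")

In practice the frequency and severity distributions would be fitted per Basel event class and business line before aggregation; the single compound distribution here is just the simplest possible illustration of the approach.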

IAFM // Is it right to say that there are these two separate schools of thought and that those separate schools still exist today?
 
MD // In theory yes, but life is not such a dichotomy. When you ask people whether they believe politically in the right wing or the left wing, and what they like and dislike, and then extend that to risk, you will find that most risk analysts are no different. That is, they land somewhere between LDA and sbAMA: a hybrid approach.
 
There are other, less obvious reasons that drive the existence of the hybrid. Lack of data is one, but more important is that the Loss Data Approach is backward looking. If an analyst includes scenarios in their tail estimates, their model, or hypothetical probability distribution function, will take on a forward-looking perspective, something the regulators are keen on driving home.

IAFM // Is it also accurate to say that EVT is basically the get-more-out-of-the-data school?
 
MD // One thing I say about extreme value theory is that there is nothing extreme about it. Take this statement from Jack King's book Operational Risk Measurement and Modelling; he puts it well.
 
Extreme Value Theory offers a parametric statistical approach for the extreme values of data. Its roots are in the physical sciences and it has recently been applied to insurance.
Of course we are now applying it to banking, but it is nothing more than a set of parametric distributions for the largest (or smallest) values (GEV) and excess values over a threshold (GPD) from a set of underlying losses.
He likens it to building a wall to keep the sea off a path. How tall should the wall be, considering the highest and lowest tides over a year? How about ten years of high and low tides, or 100 years without a breach; how tall would you build the wall?
 
In this way EVT can answer that question, using curve estimates (method of moments or maximum likelihood estimators) fitted at the threshold, in the lower, data-rich end of the curve.
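As a sketch of that peaks-over-threshold idea, the example below fits a Generalised Pareto Distribution to the excesses above a threshold by maximum likelihood and extrapolates a tail quantile from the fit. The loss data, the threshold choice and the target confidence level are all assumptions made for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Placeholder loss data: in practice this would be the bank's historic losses.
losses = rng.lognormal(mean=10.0, sigma=2.0, size=5_000)

# Peaks over threshold: keep only the excesses above a chosen threshold.
threshold = np.quantile(losses, 0.95)
excesses = losses[losses > threshold] - threshold

# Fit a Generalised Pareto Distribution to the excesses by maximum likelihood.
shape, loc, scale = stats.genpareto.fit(excesses, floc=0)

# Extrapolate a high quantile of the loss distribution from the GPD fit.
p_exceed = len(excesses) / len(losses)   # probability of exceeding the threshold
q = 0.999                                # target confidence level
tail_prob = (1 - q) / p_exceed           # conditional tail probability within the GPD
var_999 = threshold + stats.genpareto.ppf(1 - tail_prob, shape, loc=0, scale=scale)
print(f"GPD-based 99.9% loss estimate: {var_999:,.0f}")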
 
We still need data, not so much in the extreme but in the mid range of the curve, to dimension the extreme. Most banks have been so poor at classifying losses correctly and attributing loss events to business units that they don't have very much data in either the extreme or the normal range.

IAFM // While Bayesian techniques are advocated by the mix-data-with-scenarios school?
 
MD // Bayesian techniques are a completely different mathematical approach, one which looks at the contribution of data to forward estimates. A Bayesian approach is more causal and draws conclusions on how one variable informs or contributes to another.
 
As Martin Neil from Agena writes:
 
A Bayesian Network is a way of describing the relationships between causes and effects and is made up of nodes and arcs.
We call this Bayes' theorem of propagation. So if we go back to our three key approaches, LDA, sbAMA and RDCA: RDCA is the Bayes end of the approaches.
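A toy illustration of that propagation: a two-node network in which a weak control drives a loss event, with Bayes' theorem used to update belief about the cause once the effect is observed. Every probability below is invented for the example.

# Toy two-node causal network: ControlWeak -> LossEvent.
# All probabilities are invented for illustration only.

p_control_weak = 0.10        # prior: P(control is weak)
p_loss_given_weak = 0.40     # P(loss event | control weak)
p_loss_given_strong = 0.02   # P(loss event | control strong)

# Forward propagation: marginal probability of observing a loss event.
p_loss = (p_loss_given_weak * p_control_weak
          + p_loss_given_strong * (1 - p_control_weak))

# Backward propagation (Bayes' theorem): after observing a loss event,
# how likely is it that the underlying control is weak?
p_weak_given_loss = p_loss_given_weak * p_control_weak / p_loss

print(f"P(loss event)          = {p_loss:.3f}")
print(f"P(control weak | loss) = {p_weak_given_loss:.3f}")

This is the sense in which the RDCA sits at the Bayes end of the spectrum: evidence about effects (losses, control failures) updates beliefs about their drivers, rather than a curve simply being fitted to the losses themselves.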
 

IAFM // One of my contacts claimed that the future of operational risk modelling will be based around Bayesian methods. Do you agree?
 
MD // Personally I am a big fan of Bayesian methods because they show what drives failure, and that then supports management decisions on which events are creating the worst hazards. Curve fitting (LDA) does not provide this type of information; it may show the maximum potential loss given a set of data, but how does that assist management in reducing that loss?

I believe Bayesian methods offer great relief to the operational risk capital problem; however, they are not without their difficulties. Firstly, Bayesian approaches are a lot of work, but I also fear that operational risk departments in banks have a tendency to lack assertiveness and to go with common consensus. Bayesian approaches can raise political concerns at implementation, because they capture data which can be manipulated by staff before the variables are connected to the causal model. There is a moral dilemma in the assessment process. People, in short, lie, for whatever reason; often not in the way you would believe, and often they don't know they have myopic vision.
 
For example, years ago I worked with a risk department that exaggerated its control positions downwards to grab budget for improvements, and then another department that refused to see the error of its ways; how dare any loss be attributed to them. Bayesian models, when mixed with people, generally carry stacks of error (myopic, Type 1, Type 2 and criminal).
 

IAFM // If so (Bayesian is the future), why has it taken so long for Bayesian methods to win converts, given that researchers first started trying to apply them to op risk back in 2000/01?
 
MD // Oh, Bayes has been around a little longer than that. Thomas Bayes (c. 1702 to 17 April 1761) was a British mathematician and Presbyterian minister, known for having formulated a specific case of the theorem that bears his name, Bayes' theorem, which was published posthumously.
 
To answer your question in short:

>> Expensive to capture the relevant variables (time consuming)
>> Difficult to define which variables to capture (requires operational insight)
>> Hard to remove human error or manipulation of data from the variable capture (the integrity of the system can be easily compromised)

IAFM // One or two of my contacts were very pessimistic about the possibility of ever being able to model op risk to the kind of confidence level that regulators require. Can it be done, in your view?
 
MD // In this sense I would agree, and I can side with them in many respects. However, not doing something worthy in its intentions because it is hard to achieve is not an excuse. If we are ever to improve the standard delivery of banking, we are going to need diligent measures of what can go wrong, so that we target our limited resources where they can best be applied.
 
Can it be done? Yes, but be aware that as we smooth the kernel we introduce error. It can be done, but the result will have error; nothing is a perfect measure of anything. There is always going to be propagated error in our model that we will fail to capture or be able to explain.
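A small sketch of that smoothing point: applying two different kernel bandwidths to the same small loss sample gives noticeably different tail estimates, so the smoothing choice alone introduces error. The sample and the bandwidth values are assumptions made purely for illustration.

import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
losses = rng.lognormal(mean=10.0, sigma=2.0, size=500)   # small, skewed sample

# Two kernel density estimates of the same data with different bandwidths.
kde_narrow = stats.gaussian_kde(losses, bw_method=0.1)
kde_wide = stats.gaussian_kde(losses, bw_method=1.0)

# Resample from each smoothed density and read off a tail quantile:
# the bandwidth choice alone moves the estimate.
q_narrow = np.quantile(kde_narrow.resample(100_000, seed=2), 0.999)
q_wide = np.quantile(kde_wide.resample(100_000, seed=2), 0.999)
q_raw = np.quantile(losses, 0.999)

print(f"Empirical 99.9% quantile : {q_raw:,.0f}")
print(f"Narrow-bandwidth KDE     : {q_narrow:,.0f}")
print(f"Wide-bandwidth KDE       : {q_wide:,.0f}")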
 
Whether it can be done also depends on how we define what has to be done to begin with. At present, in the world of banking, that definition is lacking. Amazingly, there are still plenty of operational risk analysts out there who do not have the inclination, interest or perseverance to find a way forwards, and thus the confusion continues. The regulators themselves contribute massively to this failing, and the responsibility for it being a difficult programme does not lie solely on the heads of the banks themselves.
 
This is not so easy to print or write, of course, as it may be misinterpreted as being utterly irreverent, but nonetheless it is blatantly obvious that what is lacking is insight from some of those responsible for directing the discipline in their organizations.

Is this discovery something out of this world?
 
Not really! Like all statistical measures, there are going to be some banks that have it sorted, run by outstanding analysts, and other institutions that press forwards with less time, presence and interest. The secret of Basel II is not whether it can be done, but whether it narrows the standard deviation between poor banks and good banks when it comes to risk awareness and control diligence. Basel II aims to bring all banks up to a specific benchmark or standard, cutting out the left end of the tail in this case, and in that respect it is working.
 
So we ask ourselves what happened to the US banking community?
 
Well, they actually didn't subscribe to Basel II until it was too late, but that is another discussion all unto itself.

About the Authors

IAFM Research is the authoring team for IAFM-specific research for members.
