Financial Services in Crisis: Operational Risk Management to the Rescue!
February 16, 2009

by Allan D. Grody, a founding board member of the Journal of Risk Management in Financial Institutions, founding professor (retired) of New York University's graduate course in risk management systems, and founding partner (retired) of Coopers & Lybrand's financial services consulting practice. He began his business career in General Electric's finance business and later went on to hold increasingly responsible positions in investment management, the securities industry, and in international banking. He is President of Financial InterGroup Advisors and Financial InterGroup Holdings Ltd, and an advisory board member of ARC Best Practices Ltd.

Abstract

In today's financial crisis, the capital reserves held by banks have become the measure by which an organisation counts down to failure, rather than the system to proactively prevent it. Indeed, the bankruptcies and near-death experiences of a number of financial institutions provide yet further evidence of risk management controls gone awry. The question then has to be asked: what is it that offers a financial institution its greatest protection against failure, if it is not capital? Quite simply, it is the risk culture embedded in its people and processes. At the core of any risk culture are the incentives for individual reward that balance risk and return with short-term self-interest and long-term stakeholder goals. Furthermore, it is the embedded early warning systems that highlight growing exposures to risk. Here, the greatest hope for preventing such crises in the future lies in the final piece of Basel II - the yet to be implemented operational risk framework - the first-ever truly global regulation intended to foster a risk-adjusted performance culture.

We have been here before

In the modern era we have long known about the problems of models and valuation prices, and of markets freezing up. Portfolio insurance, the innovation of Leland, O'Brien, Rubinstein (LOR), a pioneer in quantitative investment strategies, was the culprit when the listed exchange and centralised counterparty markets froze up in October 1987. Back then, the stock, futures and options markets were convulsed by a cascade of sell orders prompted by the LOR model's blind adherence to its pre-programmed algorithms. The lesson learned then was that the equity markets were tightly coupled to the options and futures markets - trouble in one quickly cascaded into the other. The lesson of today's credit crunch is the same - the models went awry and the defaults began. This time, however, it is the debt markets that are interconnected, and these markets too can freeze up when distress in one market segment cascades into other segments.

When Drexel Burnham, the first firm to segment risk through models, went bankrupt in 1990, a casualty of declining confidence not unlike that which enveloped Bear Stearns, it caused liquidity to dry up in the non-investment grade (junk) bond market. Drexel was the only market maker for many of these bonds. Then as now, the back-tested models that showed the risk-adjusted default rates on junk bonds to be no worse than those on investment-grade bonds failed to account for the future of these markets once junk bonds were issued under a new standard of care. Junk bonds were sold to institutional accounts by a Wall Street whose incentive pay schemes rewarded those who made it to bonus day and took their winnings off the table immediately. The bonds' holders were left to determine whether the 'experiment' would pay off for them in the future. Prior to the revelation of Drexel's back tests 'proving' junk was not junk, junk bonds were the province of private placement agents, typically owner/founders of smaller public companies, who placed this debt within a small circle of sophisticated speculators, and friends and family investors.

With its new-found 'academic' legitimacy, the risk of junk bond issuance was distributed across the world, to all manner of yield-seeking individuals and institutions. This sounds quite similar to today's 'academic' legitimacy of securitised mortgage obligations, but with one difference - back then, credit rating agencies still called junk 'junk'. Nonetheless, Drexel's creditors in bankruptcy were still left trying to value its debt, much like the attempts to value the securitised debt of today's equivalent of junk bonds.

Fast-forward to today's forward-looking risk models, built on back-tested default rates of subprime mortgages. As is now known, these models did not factor in the looser standards of credit quality being applied by the mortgage agents, who also took the money and ran. And, of course, one should not forget Long Term Capital Management, whose adherence to mathematical models of arbitrage, compounded by huge leverage, sent it into a tailspin when models based on past experience did not quite work that way in the future.

In the modern era of information technology, much has been learned about control weaknesses and fraudsters, and this too has been around for a long time. Nick Leeson, the trader-fraudster behind the Barings collapse, used the infamous 88888 error account as the vehicle to conceal his fraud for some time. Fast-forward 13 years and Jérôme Kerviel was using a dummy pending counterparty account to hide his fraud. Even the early warnings of the extraordinary build-up of futures positions from listed futures market operators - Eurex and Osaka in the cases of Kerviel and Leeson, respectively - were deflected from reaching the 'right management people' at their respective firms, who 'would have acted, had they only known'.

An even more egregious and pervasive trading fraud occurred in the 1990s, uncovered by academic researchers who applied statistical analysis retrospectively to reveal artificially wide spreads on highly liquid companies like Apple and Microsoft in the US over-the-counter (OTC) equities markets. The largest of the Wall Street dealing firms paid enormous fines as a result of the rampant collusion by traders who defrauded the public for their own benefit. Inherent in the 'heads I win, tails you lose' culture are the seeds of all these frauds, and incentive compensation schemes tied to performance are in need of serious repair.

Governance then and now

The complexity of the modern financial services company defies the once-cherished risk management technique of the partnership, the predominant governance model only four short decades ago. Then, the Monday morning meeting, at which the partners would decide the week's allocation of company capital, was all the risk management that was needed. With their money literally and figuratively in their pockets, the partners would then walk out onto the trading floor and manage their own and their partners' capital. They sat at their desks, directly overseeing all aspects of the process they had established to trade and control their business, and commanded, quite literally, a full view of everyone they managed.

Today, these investment and capital market partnerships are global in scope, many embedded in multinational banking institutions, almost all now owned by public shareholders. Partners have been replaced by staff whose pay schemes reward those who take big risks with shareholders' capital - the capital of 'other people's money' fame - each year at bonus time. The clients' real bets are left to 'pay off' over a longer-term horizon, should the financial instrument or investment strategy ever perform as designed, or rather, as marketed.

Product complexity

A typical example of the complexity of today's institutional products can be found in the description of Citibank's super senior collateralised debt obligations (CDOs), mainly collateralised by residential mortgage securities. In order to price these securities, Citi's model estimates housing price changes, unemployment rates, interest rates, credit scores, documentation status and changes in mortgage-holder age, in addition to other criteria such as loan-to-value and debt-to-income ratios. The model also estimates the impact of the geographic concentration of mortgages and the discount rates for each level of exposure. Finally, the model projects future cash flows using housing prices, interest rates, unemployment rates and discount rates. Can Citibank, or anyone for that matter, get all this right, all the time? Surely financial markets are vulnerable to bad assumptions if these products have to be marked-to-model when mark-to-market prices dry up? Of course they are. Result: today's subprime debacle, liquidity crisis and the almost complete nationalisation of the US housing finance market.
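To make concrete how sensitive such mark-to-model valuations are to their assumptions, consider a deliberately stripped-down sketch of a scenario-driven cash flow projection. This is emphatically not Citi's model; every function, coefficient and figure below is a hypothetical illustration of the general technique the description implies.

```python
# A toy mark-to-model valuation of a mortgage pool. All parameters and
# the default-rate formula are hypothetical illustrations, not Citi's.

def annual_default_rate(base_rate, hpi_change, unemployment):
    """Hypothetical link from economic inputs to mortgage defaults:
    defaults rise as house prices fall and unemployment rises.
    The coefficients are illustrative, not estimated."""
    return max(0.0, base_rate - 0.5 * hpi_change + 0.3 * (unemployment - 0.05))

def mark_to_model_value(pool_balance, coupon, years, discount_rate,
                        base_rate, hpi_change, unemployment, recovery=0.5):
    """Project and discount one scenario's cash flows on a mortgage pool.
    A real model would run many scenarios and layer in geographic
    concentration, credit scores, documentation status and borrower age."""
    d = annual_default_rate(base_rate, hpi_change, unemployment)
    value, balance = 0.0, pool_balance
    for t in range(1, years + 1):
        defaults = balance * d
        cash_flow = (balance - defaults) * coupon + defaults * recovery
        balance -= defaults
        value += cash_flow / (1 + discount_rate) ** t
    # surviving principal assumed repaid at maturity
    return value + balance / (1 + discount_rate) ** years

stressed = mark_to_model_value(100e6, 0.06, 5, 0.08, 0.02, -0.10, 0.07)
benign = mark_to_model_value(100e6, 0.06, 5, 0.08, 0.02, 0.00, 0.05)
print(f"stressed scenario: {stressed:,.0f}   benign scenario: {benign:,.0f}")
```

Even in this toy version, moving the assumed housing price path and unemployment rate shifts the computed value materially, which is precisely the fragility described above when mark-to-market prices dry up.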

Another example of the complexity of these products can be found in a recent financial engineering conference description where the solutions to the problem of the faulty CDO structure were described thus:

'…industry participants have still largely relied on first generation models, with well documented practical and theoretical limitations. For example, the Gaussian Copula framework as commonly applied to pricing synthetic CDOs is static in nature and not arbitrage-free, and its treatment of bespoke portfolios is ad hoc. Furthermore, given their complexity, the valuation of cash CDOs still commonly relies on simple bond-models and matrix pricing, and the application of more sophisticated option-pricing models for these instruments is not common practice.'
There is obviously something wrong here, but mere mortals (including perhaps auditors and regulators), with all due respect, cannot discover the problem.
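For readers who want to see what the quotation is criticising, here is a minimal sketch of the static one-factor Gaussian copula commonly used for synthetic CDOs: default correlation enters through a single common factor, with no dynamics and no arbitrage-free evolution of spreads. The parameters are illustrative only.

```python
# Minimal one-factor Gaussian copula default simulation (illustrative).
import numpy as np
from statistics import NormalDist

def portfolio_defaults(n_names, default_prob, rho, n_sims, seed=0):
    """Name i defaults when sqrt(rho)*M + sqrt(1-rho)*Z_i falls below
    Phi^{-1}(p). M is the single common factor; Z_i are idiosyncratic."""
    rng = np.random.default_rng(seed)
    threshold = NormalDist().inv_cdf(default_prob)
    M = rng.standard_normal((n_sims, 1))
    Z = rng.standard_normal((n_sims, n_names))
    X = np.sqrt(rho) * M + np.sqrt(1 - rho) * Z
    return (X < threshold).sum(axis=1)

# Senior-tranche risk hinges on the assumed correlation:
for rho in (0.1, 0.3):
    d = portfolio_defaults(n_names=125, default_prob=0.02, rho=rho, n_sims=50_000)
    print(f"rho={rho}: mean defaults {d.mean():.2f}, 99th pct {np.percentile(d, 99):.0f}")
```

The tail of the default distribution - and hence the loss on a senior tranche - swings widely with the assumed correlation, the single hardest parameter to observe.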

A further example of complexity, this one in the retail product space, can be found in Genworth Financial's Foundation Annuity product. Its Life Time Income Plus Benefit provides guaranteed withdrawals for the life of the annuitant, at least equal to purchase payments and with upside potential, provided the annuitant meets certain conditions, such as allocating the contract to directed investment strategies and limiting initial gross withdrawals in any benefit year to an amount no greater than the withdrawal limit. The withdrawal limit is calculated each valuation day by taking the greatest of the contract value on the prior contract date, the withdrawal base and the roll-up value, and multiplying it by the withdrawal factor. The withdrawal base is equal to the initial purchase payment, adjusted as subsequent purchase payments are received. The roll-up value is equal to the initial purchase payment adjusted on each valuation date to reflect both the additional payments made up to the current valuation date and the roll-up rate multiplied by the cumulative purchase payments.
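As one way to check one's own reading of that synopsis, the withdrawal-limit arithmetic can be written out directly. The figures and rates below are hypothetical, and the code reflects only this author's reading of the product description; the actual contract language governs.

```python
# A worked sketch of the withdrawal-limit arithmetic described above.
# All figures and rates are hypothetical illustrations.

def withdrawal_limit(prior_contract_value, withdrawal_base, roll_up_value,
                     withdrawal_factor):
    """Greatest of the three reference values, times the withdrawal factor."""
    return max(prior_contract_value, withdrawal_base, roll_up_value) * withdrawal_factor

def compute_roll_up_value(purchase_payments, roll_up_rate):
    """Cumulative purchase payments plus the roll-up rate applied to them
    (one reading of the synopsis; the prospectus wording is denser)."""
    cumulative = sum(purchase_payments)
    return cumulative + roll_up_rate * cumulative

payments = [100_000, 20_000]                        # initial + subsequent payment
base = sum(payments)                                 # withdrawal base: 120,000
roll_up = compute_roll_up_value(payments, 0.05)      # 126,000 at a 5% roll-up rate
print(withdrawal_limit(110_000, base, roll_up, 0.05))  # 6,300 allowed this benefit year
```

Even this small worked example takes several definitions to get right, which rather makes the point about what the average purchaser is being asked to follow.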

How can the average person understand what he or she is buying in such products? It is no wonder that the US president has recently established a Financial Literacy Task Force to find means to better educate society in all manner of financial matters. How a financial institution monitors the risk that the enumerated parameters are all adhered to is hard to understand, especially when one realises the above synopsis covers but one aspect of a multi-featured product. While the sales agent takes a cut up front, the client is making a decision that will bind them for the long term; to make that decision appropriately, they must understand all the relevant features of the product.

Now, the purpose here is not to solve the problems of the risk inherent in customer choices related to the complexity of the products offered by financial institutions. Rather, it is to discuss what financial institutions can do about the risks - to themselves and their counterparties - of offering these products, as well as any other operational risks to which they are exposed. It is no wonder then that the industry has turned to reporting mechanisms based on a new mantra - risk management - as an all-encompassing concept to view and explain the overwhelming complexity of modern financial products. How this risk management mantra, translated by regulators into a risk management framework, will hold up is still a work in progress. The greatest hope for preventing such crises in the future lies in the final piece of Basel II - the yet to be implemented operational risk framework - the first-ever truly global regulation intended to foster a risk-adjusted performance culture.

Does capital do it?

Regulators have always fostered an expectation that capital is what sustains banks in periods of stress and prevents them from failing. Perhaps a more appropriate view of the capital reserves that banks are forced to hold, with an eye to the current financial crisis, is that they constitute the measure by which the organisation counts down to failure, rather than the system that proactively prevents it. So, if it is not capital, what is it that offers a bank its greatest protection against failure? Quite simply, it is the risk culture embedded in its people and processes. At the core of any risk culture are the incentives for individual compensation that balance risk and return with short-term self-interest and long-term stakeholder goals, along with the embedded early warning systems that highlight growing exposure to risk.

Indeed, the near-death experiences of Société Générale and Bear Stearns are egregious examples of risk management controls gone awry - failures that were never made visible in the form of escalating risk exposure measurements. But surfacing escalating risk exposure is precisely what the Basel II founding fathers envisaged for operational risk management when it was first proposed in 1999, and it is what still offers the greatest hope for preventing crises in the future.

Basel II defines operational risk as 'the risk of loss resulting from inadequate or failed internal processes, people and systems or from external events'. In more specific language, it further defines the coverage of operational risk as risks arising from such recent headline-grabbing events as internal and external fraud, errant employment practices, faulty workplace safety rules and procedures, risks of improper clients, products and business practices, losses from damage to physical assets, business interruptions and systems failures, and failures in trade execution, delivery and process management.

Many of the recent events can be slotted into one or more of the above operational risk categories. Citibank reports that its market value-at-risk number does not include CDO positions because they are hard to value due to the absence of prices or model inputs - a data problem; MF Global reports a $141m trading loss due to a reported systems control problem allowing a trader to avoid contract limits; Merrill acknowledges $43bn of OTC derivative cash flow improperly recorded on both sides of the balance sheet; Credit Suisse has a $2.8bn write-down due to valuation model pricing errors and use of stale prices; Société Générale reports a net $4.9bn loss due to trader fraud; and Bear Stearns nearly collapses because it was unable to price its mortgage portfolios, among other things.
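For reference, the seven Level-1 event types in Basel II's operational risk taxonomy can be written down explicitly, together with one plausible - and purely illustrative - slotting of the incidents just cited; several arguably span more than one category.

```python
# The seven Basel II Level-1 operational risk event types, with an
# illustrative (this author's, not an official) mapping of the events above.
from enum import Enum

class BaselEventType(Enum):
    INTERNAL_FRAUD = "Internal fraud"
    EXTERNAL_FRAUD = "External fraud"
    EMPLOYMENT_PRACTICES = "Employment practices and workplace safety"
    CLIENTS_PRODUCTS_PRACTICES = "Clients, products and business practices"
    DAMAGE_TO_ASSETS = "Damage to physical assets"
    BUSINESS_DISRUPTION = "Business disruption and system failures"
    EXECUTION_DELIVERY = "Execution, delivery and process management"

illustrative_mapping = {
    "Societe Generale trader fraud": BaselEventType.INTERNAL_FRAUD,
    "MF Global limit-control failure": BaselEventType.BUSINESS_DISRUPTION,
    "Merrill double-counted OTC cash flows": BaselEventType.EXECUTION_DELIVERY,
    "Credit Suisse stale-price write-down": BaselEventType.EXECUTION_DELIVERY,
}
for event, category in illustrative_mapping.items():
    print(f"{event}: {category.value}")
```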

Unfinished business

The financial industry has not yet found its way onto the difficult Basel II path that leads towards the creation of monitoring systems that provide a common measurement framework applied to the risks inherent in business operating environments. Instead it is building operational risk management systems around self-assessments, indicators and scenario analyses - an expedient but, if left to stand for the long term, errant path.

It may seem extreme to suggest that an entire industry may not be on the right path, but there is real evidence that this could well be the case. Consider this comment from the most sophisticated of banks, as represented by the Advanced Measurement Approach Group of the Risk Management Association, in a May 2007 response to the US federal regulators' outline of an operational risk supervisory framework: 'Practically speaking, the requirement to produce comprehensive management reports including "changes in factors signaling an increased risk of future losses" cannot be met at this point in time or in the near future. In many instances, operational risk factors that led to a particular event cannot be uniquely determined retrospectively, let alone detecting a change in factors that signals an increase in future losses.' This statement raises a key question: what is the value of a global risk management, capital adequacy or economic capital regime if the banks applying it, by their own admission, are unable 'at this point in time or in the near future' to fulfil a requirement as fundamental as demonstrating the link between changes in risk factors and past and likely future negative outcomes? The truth is that without this capability, risk management programmes have very little value.

The failure of risk management, particularly the modelling of past losses to predict future capital requirements, is coming under increasing scrutiny. In fact, in a blunt statement made in a recent Wall Street Journal editorial, William Isaac, head of the Federal Deposit Insurance Corporation (FDIC) during the turbulent period of the US Savings & Loan crisis in the early 1980s, recommends throwing out Basel II before it even gets implemented in US banks. Further, Lee Pickard, who as head of the Securities and Exchange Commission's (SEC) Trading and Markets Division helped write the first set of capital adequacy rules for investment firms in 1975, following the Wall Street 'paper crisis' of the 1960s, blames the failure of three of the largest brokerage firms on the SEC, for having allowed them to adopt the Basel II guidelines in 2004 despite their highly leveraged positions.

So what is the problem here? Is it simply that risk managers have focused on adapting models of past losses as a means of calculating operational risk capital, while forgoing the challenge of monitoring current operating metrics as a predictor of potential future loss events? To use a metaphor, measuring the seismic activity of a volcano is the better indicator of a potential eruption, while an historical perspective on past eruptions predicts the eruption's potential magnitude. It would seem to this author that a sounder approach would have financial institutions implementing the required components of Basel II that signal 'an increased risk of future losses' along with the potential size of those losses. Here regulators need to provide more detailed guidance, as the industry has failed to do so itself.
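The volcano metaphor can be made concrete with a toy calculation: current operating indicators scale the estimated event frequency (the seismic activity), while loss history supplies the severity (the magnitude of past eruptions). All numbers and the scaling rule below are hypothetical illustrations, not a regulatory method.

```python
# Toy expected-loss estimate: current indicators drive frequency,
# loss history drives severity. Illustrative numbers only.
import statistics

def scaled_frequency(baseline_events_per_year, indicator_ratios):
    """Scale baseline event frequency by the geometric mean of current
    indicator levels relative to normal (e.g. 1.4 means unreconciled
    positions are running 40% above their normal level)."""
    product = 1.0
    for r in indicator_ratios:
        product *= r
    return baseline_events_per_year * product ** (1 / len(indicator_ratios))

historical_losses = [120_000, 45_000, 300_000, 80_000]   # past loss events
severity = statistics.mean(historical_losses)             # magnitude from history
frequency = scaled_frequency(2.0, [1.4, 1.1, 1.6])         # 'seismic activity' now
print(f"Indicated annual expected loss: {frequency * severity:,.0f}")
```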

In early 2007, at a select gathering of 20 top-of-the-industry operational risk management professionals in London, each was asked to complete a survey on topical issues. By far the area of greatest agreement was that operational risk management would benefit if it were possible to measure exposure to risk of potential future losses directly, as is the case in credit and market risk. Thus, there is clear evidence that operational risk managers really do covet this capability. But where to begin? Perhaps by realigning thinking regarding who is accountable for what in the management of operational risks.

Operating management partnering with risk managers

A critical and accepted function of operating management is to respond to changes in operations that affect profitability. It would be logical then that they should also be rewarded if their actions have the effect of mitigating risk in line with the risk tolerances expressed by management in the form of capital reserves. Risk managers, meanwhile, monitor exposure to risk in accordance with the risk appetite set by their boards of directors which, in turn, should be linked to sound model-based predictions of future losses and capital requirements.

It follows, but it has not yet been generally accepted, that creating an approach to the measurement of risk in operating environments should be the province of business management. Equating such measures to loss frequency and severity of losses, which in turn is equated to capital reserves, should be the province of risk managers.

Operating line managers need to report in quantitative terms how the risk they manage is affected by factors such as high transaction counts, non-reconciled positions, failures to deliver securities, overtime hours, absenteeism, systems downtime, number of unauthorised accesses, number of password changes per employee and number of internal non-client accounts opened - that is, if these factors are recorded in the first place. They are certainly not available in aggregated form for executive management in a manner that equates changes in these operating factors to real-time or near real-time measures of risk exposures and, in turn, to operational loss predictions and capital requirements.
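A minimal sketch of how such operating factors might be captured as key risk indicators and rolled up for executive reporting follows. The field names, baselines and escalation thresholds are hypothetical illustrations, not an established standard.

```python
# Hypothetical key risk indicator (KRI) records rolled up for management.
from dataclasses import dataclass

@dataclass
class KeyRiskIndicator:
    name: str
    value: float          # today's observed level
    normal_level: float   # agreed baseline for this desk or process
    threshold: float      # escalation trigger as a multiple of normal

    def ratio(self) -> float:
        return self.value / self.normal_level

    def breached(self) -> bool:
        return self.ratio() >= self.threshold

daily_kris = [
    KeyRiskIndicator("failed securities deliveries", 340, 200, 1.5),
    KeyRiskIndicator("unreconciled positions", 95, 80, 1.5),
    KeyRiskIndicator("systems downtime (minutes)", 12, 30, 1.5),
]

# Aggregate view for executive management: which indicators are escalating?
for kri in daily_kris:
    flag = "ESCALATE" if kri.breached() else "ok"
    print(f"{kri.name}: {kri.ratio():.2f}x normal [{flag}]")
```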

So, what is the right path to detecting a potential for a loss before it becomes one? An examination of the measurable properties contained in actual business processes offers the greatest possibilities. Research and pilots in this area suggest that risk exposure quantification can be achieved through a system of standardised weightings and risk factors applied to the transaction values supported by business processes comprising different types of operational activity. Combined with the actual status of risk mitigation versus generally accepted best practices, an aggregatable management information risk exposure measurement system can be constructed around the measures of risk inherent in financial transactions.
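Under the approach just described, such a system might look like the following sketch: standardised risk weights applied to the transaction values flowing through each business process, scaled by how fully best-practice mitigants are in place. The weights, mitigation scores and process names are all hypothetical.

```python
# Illustrative standardised-weighting exposure measure, per the idea above.

def process_exposure(transaction_value, risk_weight, mitigation_score):
    """Exposure = value x standardised risk weight, scaled up as the
    process falls short of best practice (mitigation_score in [0, 1],
    with 1.0 meaning fully compliant with generally accepted practice)."""
    return transaction_value * risk_weight * (2.0 - mitigation_score)

processes = [
    # (process, daily transaction value, risk weight, mitigation score)
    ("OTC derivatives confirmation", 2_000_000_000, 0.0004, 0.6),
    ("equities settlement",          5_000_000_000, 0.0001, 0.9),
    ("retail payments",                800_000_000, 0.0002, 0.8),
]

total = sum(process_exposure(v, w, m) for _, v, w, m in processes)
for name, v, w, m in processes:
    print(f"{name}: {process_exposure(v, w, m):,.0f}")
print(f"Aggregate risk exposure measure: {total:,.0f}")
```

The design choice worth noting is that every term in the calculation is observable day by day, so the aggregate moves as exposures accumulate rather than after losses occur.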

But it will be difficult to deliver this final piece of Basel II without the active engagement of line operating management. They need to be focused on ways of directly measuring the operational exposures they manage, as described above. When this capability is in place they can then join forces with their risk management colleagues to construct the bridge that will connect risk-exposure measurements in the operating business space with predictive loss models in the risk management space and thereby complete that final piece of Basel II's risk management framework.

A confirming view of uniting business managers with risk managers was noted recently in a study carried out by UK research firm Datamonitor and the European Financial Management & Marketing Association. The study stated that the recent well-publicised failures in risk management had highlighted the necessity of risk and finance integration. It went on to say that the current climate is making a persuasive business case for a combined approach to risk and financial reporting. And finally, it stated that those firms that continue to see the risk disciplines as separate from the underlying business itself will be doomed to repeat previous mistakes.

The price of not implementing the Basel II operational risk framework

The bankruptcy of New Century Mortgage and Lehman Brothers; the takeover in liquidation of IndyMac by the FDIC; the UK Government's takeover of Northern Rock; the Federal Reserve's bailout of Bear Stearns; and the nationalisation of AIG, Fannie Mae and Freddie Mac are but the most recent examples of failures to monitor risk exposures as they accumulate. Today, the public's interest is more at stake than in the past. Financial institutions are mainly public shareholder companies. They make significant contributions to global economic value, are among the largest employers, and are among the largest-value components of the major market indices. They are now so large that their failures resonate around the globe and shock the financial system at its core.

When they fail there is significant consequence - and they fail with some regularity. Witness in previous eras the decoupling of the US stock markets and the FDIC's Savings & Loan bailout of the 1980s; the SEC's SIPC broker-dealer liquidations and the failure of Germany's Herstatt Bank in the 1970s; and the fraud and subsequent bankruptcy of commodity broker Ira Haupt and the bailout of brokerage firm Hayden Stone in the 1960s. Each of these events ultimately had a fundamental impact on mitigating risk in the global financial system, primarily through the establishment of various mutualised intra-industry risk-mitigating payment and settlement mechanisms. Today's financial crisis points to similar solutions.

The US Department of the Treasury's recent report, 'Blueprint for a Modernised Financial Regulatory Structure', recognises the need for a new regulatory structure focused on payment and settlement systems. The President's Working Group report, authored at about the same time, points to more structure in the OTC derivatives and structured products markets, creating a central counterparty regime within a more automated payment and settlement system. Similar initiatives are underway internationally to remove risks between financial institutions, and between their central securities depositories and contract market clearinghouses. Further, regulators worldwide are demanding more interactive, automated inputs for their regulatory reports, requiring even more structure, precision and standardisation of data. Surely more mutualised risk-sharing will be the result of all these efforts.

The industry now faces mitigating risk in individual financial institutions. Here, the process should start with a proactive management information system that equates profitability measures and operational metrics with the disciplines of risk management. Thereafter, by tying risk-adjusted performance to measurements of exposures to risk as they accumulate in the operating environment, it will finally be possible to correlate these exposures to loss history for modelling capital requirements. This is what Basel II's operational risk framework is all about.
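Once such exposure measures accumulate alongside loss history, the final correlation step described above becomes straightforward statistics. The series and the linear fit below are hypothetical illustrations of that step, not calibrated data.

```python
# Toy illustration: correlate an exposure measure with realised losses,
# then translate today's exposure into an indicated loss level.
import statistics

exposure_index = [100, 120, 150, 140, 180, 210]    # quarterly exposure measure
realised_losses = [0.8, 1.1, 1.6, 1.3, 2.0, 2.4]   # quarterly op losses ($m)

r = statistics.correlation(exposure_index, realised_losses)
slope, intercept = statistics.linear_regression(exposure_index, realised_losses)

print(f"correlation: {r:.2f}")
print(f"indicated loss at exposure 230: {slope * 230 + intercept:.2f}m")
```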

Conclusion

In its final implementation notice this past year, Basel II's US regulators called for 'consistent and comprehensive capture and assessment of data elements needed to identify, measure, monitor, and control the bank's operational risk exposure. This includes identifying the nature, type(s), and underlying cause(s) of the operational loss event(s).'

It is important to reject the push-back voiced by others, and quoted earlier, that 'operational risk factors that led to a particular event cannot be uniquely determined retrospectively, let alone detecting a change in factors that signals an increase in future losses'.

Industry must get on with implementing risk management systems that do determine operational risk factors that lead to a particular event and that can detect a change in factors that signals an increase in future losses. Come the next financial crisis, it is such efforts that will save individual institutions and it is what the Basel II overseers expect.



