Shifting the analysis from broad classifications of asset/liability types to the elements that actually drive the value and risk of these instruments makes it possible to do dynamic modeling at a much higher level of accuracy. It enables regulators, financial institution executives, auditors, and others to anticipate the impact of systemic or institutional stress. When structured financial contracts are in place, information technology will make it possible to (1) run standard risk assessments far more rapidly, (2) perform ongoing discovery analysis to identify new sources of dynamic and static risk, and (3) create “risk isoquants”: a key set of metrics that measure what would need to happen to reach a failure point, whether institution-specific or systemic. These metrics can also be monitored by regulators, central banks, and investors.
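To make the “risk isoquant” idea concrete, here is a minimal sketch (all names, numbers, and the single-factor model are illustrative assumptions, not an industry method): given a portfolio whose assets reprice under a stress factor, bisect for the smallest shock that drives capital to zero, i.e., the failure point along that stress dimension.

```python
# Hypothetical "risk isoquant" sketch: find the size of a single stress factor
# (e.g., a house-price decline) at which capital reaches the failure point.
# All field names and figures are illustrative assumptions.

def portfolio_value(assets, shock):
    """Revalue each asset under a uniform percentage shock, scaled by its sensitivity."""
    return sum(a["value"] * (1 - shock * a["shock_beta"]) for a in assets)

def failure_shock(assets, liabilities, lo=0.0, hi=1.0, tol=1e-6):
    """Bisect for the smallest shock at which capital (assets - liabilities)
    falls to zero. Returns None if no failure point lies in [lo, hi]."""
    if portfolio_value(assets, hi) - liabilities > 0:
        return None  # even the worst searched shock leaves capital positive
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if portfolio_value(assets, mid) - liabilities > 0:
            lo = mid  # still solvent at this shock; failure point is higher
        else:
            hi = mid  # insolvent here; failure point is at or below mid
    return hi

assets = [
    {"value": 800.0, "shock_beta": 1.0},  # mortgage book, fully exposed
    {"value": 200.0, "shock_beta": 0.1},  # liquid securities, lightly exposed
]
print(round(failure_shock(assets, liabilities=900.0), 4))  # → 0.122
```

A regulator or risk manager would compute such failure points along many stress dimensions at once; the set of shock combinations that just reach failure is the isoquant.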
Financial institutions need more effective risk modeling techniques. “Effective” means shifting the focus from static to dynamic analysis and examining the component terms of each financial contract instead of a multitude of groupings of roughly similar instruments. It’s an intriguing idea because, in theory, it would make it possible to understand, measure, and rank the sources of future risk faster and more accurately than existing stress test methodologies do.
The carefully crafted risk management banking reforms of the past 20 years did not prevent the financial collapse that began to unfold in 2007. There are many efforts under way to try to prevent a repeat of that debacle, and since they address different issues, they take different forms. One of the frequently cited root causes of the panic was the lack of transparency in the value of complex assets (such as mortgage-backed securities, swaps, and non-exchange-traded options) held by financial institutions.
I think an important reason this lack of visibility developed was the mismatch between increasingly complex 21st-century securities trading methods and the 20th-century (or even 19th-century) legal and statutory underpinnings of the assets/liabilities themselves. While parts of the financial services industry were using increasingly sophisticated IT systems to value and trade financial instruments, the underlying information available about those instruments in this trading context was (and remains) fundamentally flawed.
The crisis has also caused people to re-examine how best to manage risk. Some are asking whether it’s even possible to measure and model risk. I think the answer is yes, but in many cases the traditional approaches to modeling risk need to be replaced with methods that are more concrete and tied to each individual asset and liability. Until recently, such an approach was not practical because the information technology needed to support it either did not exist or was not cost-effective. Today, however, the lower cost of processing speed and memory, in-memory computing, near-real-time residential real estate price databases, and complex-event processing techniques (to name just a few) have made it possible to bring risk management much closer to the reality of underlying assets and liabilities and make it much less of a theoretical exercise.
To make the approach workable requires mapping the terms and conditions (Ts&Cs) of financial contracts to a standard taxonomy. This method is better aligned with today’s financial markets. Not only would it make it far easier to monitor and measure risk in all assets held in financial institutions, but it would also facilitate all aspects of the securitization process. Up until the 1980s, most loans and other such financial assets were held to maturity and serviced by the originator. Swaps and other derivatives were in their infancy. Collecting a standard set of data about financial contract terms wasn’t necessary because financial institutions had all the information they needed from internal sources, and they had every incentive to ensure that they limited loss exposure to at least reasonably prudent standards. There was no need to monitor these assets using IT automation, which in any case would have been impractical well into the 1990s because the cost would have been prohibitively high. Today, though, the increased flexibility of financial markets worldwide to match the needs of borrowers and lenders, combined with hedges and other derivatives to offset or shade risks, makes it necessary to understand and monitor these assets/liabilities at an elemental level. Luckily, the information technologies necessary are readily available and suitably cheap.

Creating and managing a taxonomy of all of the terms and conditions of the full range of financial contracts would not be a simple undertaking, but it is hardly rocket science. While the initial version should strive to be as complete as possible, the taxonomy must be flexible enough to incorporate new elements to adapt to market changes. Even so, some of these contracts inevitably will need to have nonstandard terms, and that must also be accommodated, but the vast majority of the value of financial assets in the hands of financial institutions can be defined, analyzed, and monitored using structured frameworks.
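To illustrate what a taxonomy entry might look like, here is a minimal data-model sketch. Every field name is an assumption for illustration, not a reference to any industry standard; the point is that standard terms become structured data while nonstandard riders are still accommodated in a catch-all extension.

```python
# Illustrative contract-terms record (field names are assumptions, not a
# standard). Structured fields cover the common Ts&Cs; the `nonstandard`
# mapping accommodates custom riders so the taxonomy still covers the
# whole contract.

from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ContractTerms:
    instrument_type: str                   # e.g., "fixed-rate mortgage", "interest-rate swap"
    notional: float
    currency: str
    rate_type: str                         # "fixed" or "floating"
    rate: float                            # coupon, or spread over the index if floating
    maturity: str                          # ISO date
    index: Optional[str] = None            # reference rate for floating contracts
    collateral_type: Optional[str] = None
    prepayment_allowed: bool = False
    nonstandard: dict = field(default_factory=dict)  # catch-all for custom riders

loan = ContractTerms(
    instrument_type="fixed-rate mortgage",
    notional=250_000.0, currency="USD",
    rate_type="fixed", rate=0.045, maturity="2041-06-01",
    collateral_type="residential real estate", prepayment_allowed=True,
    nonstandard={"recourse": "limited to collateral"},
)
```

Once contracts are captured this way, the same record feeds valuation, risk monitoring, and securitization disclosure without re-keying.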
Along with this, once the taxonomy is in place, it might be possible to replace today’s verbiage-rich legal templates with a more consistent, modular (Lego-like) structure, so that drafting loan agreements, securitized leases, indentures, and other financial contracts becomes more of a “fill in the blanks” exercise than a custom-built project. While securities lawyers and clever investors (those who can find value in poorly drafted instruments) may suffer, everyone else will benefit from a more efficient process and more transparent financial assets/liabilities.
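A toy sketch of the “fill in the blanks” idea, assuming a small library of standard clauses keyed by contract terms (the clause wording and keys are invented for illustration): the legal text is assembled from structured terms rather than drafted from scratch.

```python
# Minimal modular-drafting sketch: standard clauses are templates, and the
# contract text is assembled from structured terms. Clause wording and
# dictionary keys are illustrative assumptions only.

from string import Template

CLAUSES = {
    "rate_fixed": "Interest shall accrue at a fixed rate of $rate% per annum.",
    "prepayment": "The borrower may prepay in whole or in part without penalty.",
}

def draft(terms):
    """Assemble contract text from structured terms by filling clause templates."""
    parts = [Template(CLAUSES["rate_fixed"]).substitute(rate=terms["rate_pct"])]
    if terms.get("prepayment_allowed"):
        parts.append(CLAUSES["prepayment"])
    return "\n".join(parts)

print(draft({"rate_pct": 4.5, "prepayment_allowed": True}))
```

Because every blank maps back to a taxonomy element, the drafted text and the machine-readable terms can never drift apart.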
Though it was not immediately apparent, the financial services industry had created a “garbage-in-junk-out” information system. The solution is not to return to the past. One interesting idea that has emerged on the risk management front is developing better modeling and analysis to make the process more effective. We need to reform how we get information about financial assets/liabilities and modernize risk management and analysis techniques to use that better information. Today, risk analysts synthesize a view of an entity’s portfolio. While the assumptions used to create these synthetic models are fine in a steady state, they are useless under extreme conditions.
It’s important to understand that the solution to today’s imbalance between 21st-century trading methods and 20th-century financial instrument structures cannot be to try to return to the past. The genie is out of the bottle and the tools are available to address the imbalance. Instead of attempting to stamp out financial innovation, we need to use these tools to enable more intelligent regulation, risk management, and investing.
Asset bubbles and their aftermaths were the most prominent feature of finance in the first decade of the 21st century. In the United States, each bursting bubble led to a re-examination of the controls that might or should have been in place to prevent the bubble from forming, or at least to limit its size and extent.
Once a system of structured Ts&Cs is in place, information technology will be able to substantially improve the analysis and reporting of financial assets and liabilities, thereby increasing transparency and enabling more effective risk management. It will bring the structure of financial instruments and contracts back in line with today’s trading technology and practices.
Using today’s information technology, it would become possible (and essential) to create financial models that take advantage of standardized definitions of Ts&Cs for regulatory and management purposes. A standard taxonomy and framework for risk modeling will enable more flexible and dynamically accurate assessment of risk by financial regulators, company managements and boards, external rating agencies, and any interested third party with access to the data. It also allows risk managers to focus on the underlying detailed structure of the assets and liabilities rather than on some artificial classification of them, bringing the process closer to the source and substance of their value and risk.
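A small sketch of what element-level aggregation buys you (contract records and figures are invented for illustration): instead of bucketing by asset class, group contracts by the term that actually drives the risk, here the floating-rate index, and sum the notional exposure that reprices with each index.

```python
# Element-level risk aggregation sketch (illustrative data): exposure is
# grouped by the contract term that drives repricing risk -- the floating-rate
# index -- rather than by asset-class labels like "loan" or "swap".

from collections import defaultdict

contracts = [
    {"type": "loan", "notional": 100.0, "index": "SOFR"},
    {"type": "swap", "notional": 250.0, "index": "SOFR"},
    {"type": "bond", "notional": 50.0,  "index": None},   # fixed rate, no index
    {"type": "loan", "notional": 75.0,  "index": "EURIBOR"},
]

def exposure_by_index(contracts):
    """Sum notional exposure per reference index across all contract types."""
    totals = defaultdict(float)
    for c in contracts:
        if c["index"] is not None:  # only floating contracts reprice with an index
            totals[c["index"]] += c["notional"]
    return dict(totals)

print(exposure_by_index(contracts))  # {'SOFR': 350.0, 'EURIBOR': 75.0}
```

Note that a loan and a swap land in the same bucket when they share the risk-driving term, which is exactly what an asset-class classification hides.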
Today, it can easily take a couple of weeks to assess the value of a single mortgage-backed security because the information needed to make this determination is scattered and often must be gathered through a combination of techniques. In theory, all of the information needed to assess the value of an MBS could be gathered electronically in a matter of seconds or less. In reality, it will take an overhaul of contractual requirements of all parties and the securities’ indentures to make this possible.
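A deliberately oversimplified sketch of what loan-level valuation looks like once the data is electronic (the two-outcome model, flat discount rate, and all figures are assumptions, not a real MBS pricing method): each underlying loan’s value is its discounted performing payoff weighted against its recovery on default, and the pool value is just the sum.

```python
# Toy loan-level MBS valuation (all numbers and the two-outcome model are
# illustrative assumptions). The point: with loan terms available
# electronically, pool valuation reduces to an aggregation over loans.

def loan_value(balance, rate, years, default_prob, recovery, discount):
    """Expected present value of one loan: discounted performing payoff
    weighted against recovery-on-default."""
    performing = balance * (1 + rate) ** years / (1 + discount) ** years
    defaulted = balance * recovery  # assumed recovered immediately, undiscounted
    return (1 - default_prob) * performing + default_prob * defaulted

pool = [
    {"balance": 200_000, "rate": 0.05, "years": 5, "default_prob": 0.02, "recovery": 0.6},
    {"balance": 150_000, "rate": 0.06, "years": 5, "default_prob": 0.10, "recovery": 0.5},
]

mbs_value = sum(loan_value(discount=0.04, **loan) for loan in pool)
print(round(mbs_value, 2))
```

Real MBS valuation adds prepayment behavior, tranching, and servicing cash flows, but none of that changes the structural point: the bottleneck today is gathering the loan-level inputs, not the arithmetic.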