When you're - like me - a born professional optimist who nevertheless sometimes worries about the unavoidable misery in the world, you ask yourself this question:
Why does God not act?
Think about this question and try to answer it before reading any further.
The answer to this question is very simple:
God does not act because he's conscious of everything
The moral of this anecdote is that when you're fully aware of all the risks and their possible impact, chances are high that you won't be able to take any well-argued decision at all, because every decision will eventually fail if your objective is to rule out all possible risks.
You see, a question has come up that we can't agree on,
perhaps because we've read too many books.
Bertolt Brecht, Life of Galileo (Leben des Galilei)
On the other hand, if you're not risk-conscious at all about a decision to be taken, you'll most probably take the wrong decision.
'Mathematically Confident'
So this leaves us with the inevitable conclusion that, in our eagerness to take risk-based decisions, a reasoned decision is nothing more than the somehow optimized outcome of a weighted sum of a limited number of subjectively perceived risks. 'Perceived' and 'weighted', because we're unaware of certain risks, or 'filter', 'manipulate' or 'model' risks in such a way that we can be 'mathematically confident'. In other words, we've become victims of the "My calculator tells me I'm right!" effect.
Risk Consciousness Fallacy
This way of taking risk-based decisions has the 'advantage' that practice will prove it's never quite right. This implies you can gradually 'adjust', 'improve' or 'optimize' your decision model endlessly.
Endlessly, up to the point where you've included so many new or adjusted risk sources and possible impacts that the degrees of freedom for taking a 'confident' decision have become zero.
Risk & Investment Management Crisis
After a number of crises - in particular the 2008 systemic crisis - we've come to realize that:
- There are many more types of risk than we thought there would be
- Most types of risk are nonlinear rather than linear
- New risks are constantly 'born'
- We'll never be able to identify or significantly control every possible kind of risk
- Our current (outdated) investment model can't capture nonlinear risk
- Most (investment) risks depend heavily on political measures and policy
- Investment risks are driven more by artificial and political factors than by statistics
- Market Values are 'manipulable' and therefore 'artificial'
- Risk free rates are volatile, unsure and decreasing
- Traditional mathematically calculated 'confidence levels' fall short (model risk)
- As Confidence Levels rise, Confidence Intervals and Value at Risk increase (see the short sketch after this list)
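To make that last bullet concrete, here is a minimal sketch, not taken from the article, showing how the Value at Risk threshold widens as the required confidence level rises; the normal distribution and the mean/volatility figures are purely illustrative assumptions.

```python
# Minimal illustrative sketch (assumed normal returns, made-up parameters):
# the VaR threshold widens as the required confidence level rises.
from scipy.stats import norm

mu, sigma = 0.0, 0.02                       # assumed mean and volatility of returns
for cl in (0.95, 0.99, 0.999):
    var = -(mu + sigma * norm.ppf(1 - cl))  # loss quantile at confidence level cl
    print(f"confidence level {cl:.1%} -> VaR {var:.2%}")
```

Under these assumptions the VaR at 99.9% is roughly twice the VaR at 95%, which illustrates why ever higher confidence levels translate into ever larger capital requirements.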
Fallacy
One of the most basic implicit fallacies in investment modeling is that mathematical confidence levels based on historical data are treated as 'trusted' confidence levels for future projections. The key point is that a confidence level is itself a conditional (Bayesian) probability.
Let's briefly illustrate this.
A calculated model confidence level (CL) is only valid under the 'condition' that the 'risk structure' (e.g. mean, standard deviation, higher moments, etc.) of the analysed historical data set (H) used for modeling is also valid in the future (F). This implies that our traditional confidence level is in fact a conditional probability: P(confidence level = x% | F = H).
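To spell out the underlying probability step (a short clarification using the article's own symbols; CI denotes the VaR confidence interval), the law of total probability gives:

\[
P(x \in \mathrm{CI}) \;=\; P(x \in \mathrm{CI} \mid F = H)\,P(F = H) \;+\; P(x \in \mathrm{CI} \mid F \neq H)\,P(F \neq H)
\]

The second term is unknown but non-negative, so multiplying the conditional confidence level by P(F = H) alone, as in the example below, yields a conservative lower bound for the true unconditional confidence level.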
Example
- The (increasing) Basel III confidence level is set at P( x ∈ VaR-Confidence-Interval | F = H) = 99.9%, in accordance with a one-year default level of 0.1% (= 1 - 99.9%).
- Now roughly estimate the probability P(F = H) that the risk structure of the historical data set (H) of asset classes and obligations used for the Basel III calculations will also be 100% valid in the near future (F).
- Let's assume that, given the enormous shifts in our economy, you rate this probability - optimistically, and assuming independence - at P(F = H) = 95% for the next year.
- The actual unconditional confidence level now becomes P( x ∈ VaR-Confidence-Interval) = P( x ∈ VaR-Confidence-Interval | F = H) × P(F = H) = 99.9% × 95% = 94.905% (see the short sketch below).
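Here is a minimal sketch of the example's arithmetic; the 99.9% conditional level is the Basel III figure quoted above, while the 95% estimate of P(F = H) is the article's illustrative assumption, not a measured value.

```python
# Sketch of the example's arithmetic; 0.95 for P(F=H) is an illustrative assumption.
conditional_cl = 0.999               # Basel III: P(x in VaR-interval | F = H)
p_structure_holds = 0.95             # assumed P(F = H) for the next year

# Dropping the unknown (non-negative) contribution of the F != H scenario
# reproduces the article's figure, read as a conservative lower bound.
unconditional_cl = conditional_cl * p_structure_holds
print(f"unconditional confidence level: {unconditional_cl:.3%}")     # ~94.905%
print(f"implied one-year exceedance:    {1 - unconditional_cl:.3%}") # ~5.095%
```

In other words, a seemingly comfortable 99.9% model confidence level shrinks to roughly 95% once even a mild doubt about the stability of the historical risk structure is taken into account.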
Unconditional confidence levels of financial institutions will thus be in line with the poor confidence levels of our own economic forecasts.
A detailed Societe Generale (SG) report tells us that not only economic forecasts like GDP growth, but also stocks, cannot be forecast by analysts.
Over the period 2000-2006, the average 24-month forecast error in the US was 93% (12-month: 47%). With an average 24-month forecast error of 95% (12-month: 43%), Europe does no better. Forecasts with errors of this magnitude are totally worthless.
Confidence Level Crisis
Just focusing on sky-high risk confidence levels of 99.9% or more prevents financial institutions from taking the risks that are fundamental to their existence. 'Taking risk' is part of the core business of a financial institution. Eliminating risk will therefore kill financial institutions in the long run. One way or another, we have to deal with this Confidence Level Crisis.
The way out
The way for financial institutions to get out of this risk paradox is to recognize, identify and examine nonlinear and systemic risks, and to structure not only capital but also assets and obligations in such a (dynamic) way that they are financially and economically 'crisis-proof'. All this without being blinded by a 'one point' theoretical confidence level.
Actuaries, econometricians and economists can help by developing nonlinear, interactive asset models that demonstrate how (and how much) returns, risks and strategies are interrelated in a dynamic economic environment of continuing crises.
This way, boards, management and investment advisory committees are supported in their continuous decision process to add value for all stakeholders across all assets, obligations and capital.
Calculating small default probabilities on the order of the Planck constant (6.62606957 × 10⁻³⁴ J·s) is useless. Only strategies that prevent defaults make sense.
Let's get more confident! ;-)
Sources/Links
- SG-Report: Mind Matters (Forecasting fails)
- Are Men Overconfident Users?