This post is motivated by the September edition of IEEE Signal Processing Magazine. Most of us are familiar with risk management, a topic of importance in both engineering and finance. It grew out of our desire to control nature's unexpected phenomena; though such phenomena can never be fully controlled, this branch of science has helped reduce the devastation they would otherwise inflict. Broadly, we can state risk management as the study and mitigation of rare events that have potentially devastating outcomes.
The problem of risk management is closely related to reliability theory, which calls for robust solutions to the real-world problems arising around us. The events addressed range from Gulf oil spills to global warming to acts of terrorism. Improved methodologies that focus not only on point estimation, or maximal-probability events, but also on the occurrence, prediction and cost of outliers are the need of the hour, especially in domains like security and defence. The cost we pay for the lack of innovative research in these arenas is beyond bounds.
The study of risk management can be branched into two parts:
1. The statistical study and theory of extreme events [1]. For instance, telecommunication traffic research rests on packet-size distributions, which share similar heavy-tailed properties with financial data [2].
2. Modelling and predicting the future distribution of losses. The derived loss-distribution function should reflect not only the conditional uncertainty under various model assumptions but also its unconditional nature. This naturally calls for a Bayesian approach, on which research is still rudimentary. Our emphasis in this domain should be on the far right tail: the extreme losses or errors, where we might have to sacrifice overall predictive accuracy to get a better estimate of rare events.
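Both branches can be illustrated with a minimal sketch in Python using NumPy. The Pareto loss model, the tail index, and every numeric value below are illustrative assumptions of mine, not figures from the references: heavy-tailed samples exceed extreme thresholds far more often than light-tailed ones, and a far-right-tail quantile captures what a point estimate like the mean misses.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
alpha = 1.5  # tail index; the tail gets heavier as alpha decreases (assumed value)

# Heavy-tailed "losses" from a classical Pareto distribution on [1, inf),
# versus light-tailed losses from an exponential with the same mean.
pareto_losses = 1.0 + rng.pareto(alpha, size=n)  # NumPy's pareto is Lomax; +1 gives classical Pareto
mean_loss = alpha / (alpha - 1.0)                # theoretical Pareto mean = 3.0
expo_losses = rng.exponential(scale=mean_loss, size=n)

# Branch 1: extreme events. Exceedances of 10x the mean are routine
# under heavy tails and essentially never occur under light tails.
threshold = 10 * mean_loss
p_heavy = np.mean(pareto_losses > threshold)
p_light = np.mean(expo_losses > threshold)

# Branch 2: far-right-tail loss estimates. The mean says little about rare
# losses; a 99.9% quantile (a value-at-risk-style number) targets them directly.
var_999 = np.quantile(pareto_losses, 0.999)

print(p_heavy, p_light, var_999)
```

Under these assumptions the heavy-tailed exceedance probability is roughly two orders of magnitude larger than the light-tailed one, and the 99.9% quantile sits far above the mean, which is exactly why tail-focused estimates matter.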
The events occurring around us should be closely analyzed for correlations, from which we can design a model. Many of the events we observe follow well-studied distribution models, which makes our job easier; the skill we need to develop is to look around us for parameters that capture the correlation. Correlation induces predictability, which in turn completes our conditional analysis. The tougher part of the cake is the unconditional behaviour of the model, which remains a heavily intuitive study.
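The correlation-to-predictability step can be sketched on synthetic data (all variables and coefficients here are assumed for illustration): once an observable parameter x is correlated with the quantity of interest y, conditioning on x strictly reduces the residual uncertainty relative to the unconditional model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Synthetic illustration: an observable covariate x correlated with
# the quantity of interest y (true correlation 0.8 by construction).
x = rng.normal(size=n)
y = 0.8 * x + 0.6 * rng.normal(size=n)

corr = np.corrcoef(x, y)[0, 1]

# Without the covariate, the best predictor of y is its mean; with it,
# a simple linear fit removes the predictable (conditional) part.
var_unconditional = np.var(y)
slope = np.cov(x, y)[0, 1] / np.var(x)
var_conditional = np.var(y - slope * x)

print(corr, var_unconditional, var_conditional)
```

The residual variance after conditioning drops by a factor of 1 - corr², which is the sense in which correlation "completes" the conditional analysis; the unconditional spread of y is what remains hard to pin down.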
Readers who complain of a lack of research arenas should try tuning their minds to these problems. I am sure by now you might be dreaming of umpteen thesis reports in your name :)
REFERENCES
[1] S. Resnick, "Heavy-Tail Phenomena: Probabilistic and Statistical Modeling," New York: Springer-Verlag, 2007.
[2] W. E. Leland, M. S. Taqqu, W. Willinger, and D. V. Wilson, "On the self-similar nature of Ethernet traffic," IEEE/ACM Transactions on Networking, vol. 2, no. 1, pp. 1-15, 1994.