TRANSCRIPT
R in Finance and Insurance
8th R/Rmetrics Workshop and Summer School
June 26–28, 2014, Paris, France
WORKSHOP
BOOKLET
WELCOME
Program
Abstracts
SPONSORS
Rmetrics www.rmetrics.org
www.ethz.ch
www.finance.ch
www.neuraltechsoft.com
www.rstudio.org
www.federal-finance.fr
www.suravenir.fr
www.natixis.com
www.credit-suisse.com
ORGANIZATION
CHAIRMEN
PATRICK HÉNAFF UNIVERSITÉ PARIS I, FRANCE
DIETHELM WÜRTZ ETH ZURICH SWITZERLAND
PROGRAM COMMITTEE
PATRICK HÉNAFF UNIVERSITÉ PARIS I, FRANCE
MAHENDRA MEHTA NEURALTECHSOFT MUMBAI
DIETHELM WÜRTZ ETH ZURICH SWITZERLAND
STEFAN THEUSSL RAIFFEISEN RESEARCH, AUSTRIA
CONFERENCE OFFICE
VIKTORIA FORGACS RMETRICS ASSOCIATION, ZURICH
PAULA BERDUGO UNIVERSITÉ PARIS I, FRANCE
WELCOME
Welcome to the 8th R/Rmetrics Workshop and Summer School, organized by ETH Zurich,
Université Paris I and the LabEx RéFi. After seven years in the Swiss Alps, the conference
meets this year in the historic Collège des Bernardins, a 13th century former monastery in the
Latin Quarter of Paris.
We are very glad that you found the time to come to this inspiring place. Many of you have
traveled from the U.S., Europe and various places in Asia.
With the R/Rmetrics meetings, we hope to provide a forum where institutional investors and
risk managers from banks and insurance companies, researchers from industry and
academia, and students too, can exchange ideas and engage in stimulating discussions.
About 40 participants are attending the conference, and the mixture, as planned, is quite
heterogeneous. About one third is from academia, one half from the software and financial
industries, and the rest are students supported by scholarships provided by generous
sponsors. We thank all of them very much.
We wish you an interesting conference with many inspiring and stimulating discussions.
Patrick Hénaff
Diethelm Würtz
Program
26 June 2014
Tutorial (9:00–10:15)
  Past Performance is Not Necessarily Indicative of Future Results: Stability Concepts in Modern Portfolio Design – Diethelm Wuertz (ETH Zurich)

Keynote Session I (11:00–12:30)
  Bayesian Model Choice – Christian Robert (Université Paris-Dauphine)

Keynote Session II (14:00–15:30)
  Accurate Methods for Approximate Bayesian Computation Filtering – Veronika Czellar (EM Lyon)

Session I: Shiny Apps (16:00–17:30)
  A Passive Hedging Monitor – Omid Pakseresht and Jan Witte (Record Currency Management, Ltd)
  Dynamic Portfolio Strategies – Marc Weibel (Zurich University of Applied Sciences)
  Rmetrics Apps – Venetia Christodoulopoulou, Tobias Setz and Diethelm Wuertz (ETH Zurich)
27 June 2014
Session II: Statistical Models of Return (9:00–10:30)
  Explicit Models for Bilateral Distributions with Heavy Tails – Patrice Kiener (InModelia)
  Intermediate and Long Memory Time Series – Karl-Kuno Kunze (Fractional View GmbH)
  Robust and Bias-corrected Estimation of the Coefficient of Tail Dependence – Christophe Dutang (Université du Maine)

Keynote Session III (11:00–12:30)
  Economic Scenarios Generation – Thierry Moudiki and Frédéric Planchet (ISFA)

Keynote Session IV (14:00–15:30)
  Anomalous Price Impact and Critical Liquidity in Financial Markets – Jean-Philippe Bouchaud (École Polytechnique and CFM)

Session III: Portfolio Construction and Optimization (16:00–17:30)
  Dynamic style allocation of characteristic-based equity portfolios – Marjan Wauters (KU Leuven)
  The ROI Package in Action: Portfolio Optimization and Beyond – Stefan Theussl (Raiffeisen Research)
  A Bayesian Investment Strategy: For a Better Design of Portfolios – Tobias Setz (ETH Zurich)
28 June 2014
Session IV: Portfolio Construction and Optimization (9:00–10:30)
  New Directions in Tactical Asset Management: The Analysis of Spot and Forward FX Markets – Jan H. Witte (Record Currency Management, Ltd)
  Stability Analysis of the Swiss Performance Index – Cyril Bachelard (OLZ)
  Risk Parity for CVaR and Downside Risk – Evgeny Bauman (Markov Processes International, LLC)

Session V: Topics in Financial Statistics (11:00–12:15)
  Quantifying Model Risk in Pricing Path-dependent Derivatives – Vineet Virmani (Indian Institute of Management)
  Modeling Life Expectancy at Birth for Multi-population: a Cointegration Approach – A. Ntamjokouen (Bergamo University)
  Informed trades, uninformed trades and market resiliency: Evidence from a limit order book market – Rajat Tayal (Indira Gandhi Institute of Development Research)
Past Performance is Not Necessarily Indicative of Future Results: Stability Concepts in Modern Portfolio Design
Diethelm Würtz, Tobias Setz, Venetia Christodoulopoulou
Swiss Federal Institute of Technology, ETH Zurich
Curriculum for Scientific Computing & Institute for Theoretical Physics
Rmetrics Association Zurich

We present new, unconventional methods based on predictive Bayesian change point (BCP) analytics and on morphological shape factor analytics (MSFA) to design portfolios. Our approach makes optimization obsolete and comes with many additional advantages compared with conventional investment strategies.

In the first part of the tutorial we report on ideas based on concepts of stability analysis that give an alternative view on performance and risk in portfolios. The main topic we address is the detection of vulnerabilities in the dynamics of financial return series. For this we use the "Product Partition Method" of Barry and Hartigan [1993] in the implementation of Erdman and Emerson [2008], based on Bayesian statistics and Markov Chain Monte Carlo simulations.

In the second part of the tutorial we present a multivariate Morphological Shape Factor Analysis of the feasible set of a portfolio. The feasible set is approximated by an ellipse characterized by its first four geometric moments. The moments, Hu [1962], Flusser et al. [2009], allow us to define the center of mass, the area, the orientation, and the eccentricity. Rolling these objects over a running window allows us to perform a time series analysis and to predict the future behavior of the portfolio. Robust statistics and Bayesian stability analytics can then be used to create valuable measures for the performance and the risk of portfolios.
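As a minimal illustration of the change point machinery named above (not the authors' own code), the bcp package of Erdman and Emerson implements the Barry and Hartigan product partition model via MCMC; the simulated return series and its regime change are hypothetical:

```r
# Sketch: Bayesian change point analysis of a simulated return series
# using the bcp package of Erdman and Emerson (2008).
library(bcp)

set.seed(1)
# Simulated daily returns: a calm regime followed by a volatile, drifting regime
returns <- c(rnorm(250, mean = 0.0004, sd = 0.010),
             rnorm(250, mean = -0.0010, sd = 0.025))

fit <- bcp(returns)

# Posterior probability of a change point at each observation;
# spikes near index 250 flag the simulated regime change
plot(fit)
which(fit$posterior.prob > 0.5)
```

In the tutorial's setting, such posterior change point probabilities are the raw material for the stability measures discussed above.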
References:
D. Barry and J.A. Hartigan [1993], A Bayesian Analysis of Change Point Problems, Journal of the American Statistical Association 88, 309.
Ch. Erdman and J.W. Emerson [2008], A Fast Bayesian Change Point Analysis for the Segmentation of Microarray Data, Bioinformatics 24, 2143.
M.K. Hu [1962], Visual Pattern Recognition by Moment Invariants, IRE Trans. on Inf. Theory 8, 179.
J. Flusser, T. Suk and B. Zitova [2009], Moments and Moment Invariants in Pattern Recognition, John Wiley & Sons, ISBN 978-0-470-69987-4.
Tutorial presented at the R/Rmetrics Workshop and Summer School, Paris, 26–28 June 2014.
Contact Address: [email protected], www.rmetrics.org
Bayesian Model Choice
Christian Robert
Université Paris-Dauphine
This talk will provide (a) an illustrated survey of the most common computational techniques for conducting Bayesian model choice through the standard Bayes factor, along with (b) theoretical investigations on approximate Bayesian computation (ABC) model choice and (c) a new methodological proposal restating testing within a mixture setting.
It covers joint works with J.-M. Marin, K. Mengersen, N. Pillai, and J. Rousseau:
http://arxiv.org/abs/1205.5658
http://arxiv.org/abs/1110.4700v4
http://arxiv.org/abs/1102.4432
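To fix ideas on the "standard Bayes factor" route to model choice, here is a toy conjugate example (not taken from the talk; model and data are hypothetical) where both marginal likelihoods are available in closed form:

```r
# Bayes factor for M0: x ~ N(0, 1) versus M1: x ~ N(mu, 1), mu ~ N(0, 1).
set.seed(2)
x <- rnorm(30, mean = 0.8, sd = 1)
n <- length(x)

# log marginal likelihood under M0 (no free parameter)
logm0 <- sum(dnorm(x, 0, 1, log = TRUE))

# Under M1, integrating mu out gives x ~ N(0, I + 11'), with
# det(I + 11') = 1 + n and (I + 11')^{-1} = I - 11'/(1 + n)
logm1 <- -n / 2 * log(2 * pi) - 0.5 * log(1 + n) -
         0.5 * (sum(x^2) - sum(x)^2 / (1 + n))

# B10 > 1 favours the model with a free mean for this sample
B10 <- exp(logm1 - logm0)
B10
```

ABC model choice, one of the talk's topics, replaces such exact marginal likelihoods with simulation-based approximations when they are intractable.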
Accurate Methods for Approximate Bayesian Computation Filtering
Laurent Calvet and Veronika Czellar
The Approximate Bayesian Computation (ABC) filter extends the particle filtering methodology to general state-space models in which the density of the observation conditional on the state is intractable. We provide an exact upper bound for the mean squared error of the ABC filter, and derive sufficient conditions on the bandwidth and kernel under which the ABC filter converges to the target distribution as the number of particles goes to infinity. The optimal convergence rate decreases with the dimension of the observation space but is invariant to the complexity of the state space. We show that the adaptive bandwidth commonly used in the ABC literature can lead to an inconsistent filter. We develop a plug-in bandwidth guaranteeing convergence at the optimal rate, and demonstrate the powerful estimation, model selection, and forecasting performance of the resulting filter in a variety of examples.

Keywords: bandwidth, kernel density estimation, likelihood estimation, model selection, particle filter, state-space model, value-at-risk forecasts.
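The role of the kernel and bandwidth can be seen in a static toy analogue of the filter (this is not the Calvet–Czellar algorithm; the model and settings are hypothetical): particles drawn from the prior are weighted by a Gaussian kernel applied to the distance between simulated and observed data.

```r
# Toy ABC with a Gaussian kernel: y | theta ~ N(theta, 1), theta ~ N(0, 2).
set.seed(3)
y_obs <- 1.5
h     <- 0.2                          # bandwidth: smaller h = closer to exact

theta <- rnorm(1e5, 0, sqrt(2))       # particles from the prior
y_sim <- rnorm(1e5, theta, 1)         # pseudo-observations, one per particle
w     <- dnorm((y_sim - y_obs) / h)   # kernel weights in the bandwidth h
w     <- w / sum(w)

# Kernel-weighted posterior mean vs the exact conjugate answer (2/3) * y_obs
c(abc = sum(w * theta), exact = 2 / 3 * y_obs)
```

Shrinking h reduces the kernel-induced bias at the cost of fewer effective particles, which is precisely the bandwidth trade-off the abstract's convergence results address.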
A Passive Hedging Monitor
Omid Pakseresht and Jan Witte
Record Currency Management, Ltd
We present a Passive Hedging Monitor in Shiny. The application considers different international bond and equity portfolios, and visualises the historic effects of exchange rate movements. The effects of exchange rate protection (i.e., FX hedging) can be studied comfortably, and several pieces of conventional wisdom (e.g., that you should always hedge your bonds) can be investigated easily.
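The arithmetic behind such a comparison can be sketched in a few lines of base R (hypothetical simulated data, not the monitor's code): an unhedged foreign-asset return is approximately the local-currency return plus the currency return, while a fully hedged position strips the currency component (the forward premium is ignored here for simplicity).

```r
# Hedged vs unhedged returns on a foreign bond position (hypothetical data)
set.seed(4)
r_local <- rnorm(60, 0.004, 0.02)   # monthly local-currency returns
r_fx    <- rnorm(60, 0.000, 0.03)   # monthly exchange rate returns

r_unhedged <- r_local + r_fx        # currency risk retained
r_hedged   <- r_local               # currency risk removed

# Hedging leaves the mean roughly unchanged but removes the FX volatility
round(c(sd_unhedged = sd(r_unhedged), sd_hedged = sd(r_hedged)), 4)
```

The Shiny monitor described above does this comparison on historical data across portfolios, rather than on simulated returns.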
About FinDiv
Marc Weibel (Zurich University of Applied Sciences)
FinDiv allows users to assess the true diversification level of a given portfolio, as well as the risk contribution of each asset.
Description
This risk decomposition and portfolio diversification tool has been developed within a Bachelor's project at the Zurich University of Applied Sciences.

Users can upload their own asset or asset-class time series, together with the portfolio allocation over time; the tool then decomposes the portfolio's risk and displays the mean-diversification frontier, putting into perspective the expected return of a given portfolio and its corresponding diversification level.
Users also have the possibility of taking their own return expectations into account in the risk modeling procedure. To manage diversification we use a natural set of uncorrelated bets, the Minimum-Torsion Bets, which are the uncorrelated factors closest to the factors used by the portfolio manager. The contributions to risk from the Minimum-Torsion Bets constitute a generalization of the marginal contributions to risk used in traditional risk parity.
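The Minimum-Torsion construction itself is involved, but the classical volatility risk decomposition it generalizes fits in a few lines (hypothetical covariance matrix and weights, not FinDiv's code):

```r
# Euler decomposition of portfolio volatility into per-asset risk contributions
Sigma <- matrix(c(0.04, 0.01, 0.00,
                  0.01, 0.09, 0.02,
                  0.00, 0.02, 0.16), 3, 3)   # hypothetical covariance matrix
w <- c(0.5, 0.3, 0.2)                        # portfolio weights

sigma_p <- sqrt(drop(t(w) %*% Sigma %*% w))  # portfolio volatility
mcr     <- drop(Sigma %*% w) / sigma_p       # marginal contributions to risk
rc      <- w * mcr                           # risk contributions per asset

# The contributions sum back exactly to the portfolio volatility
c(portfolio_vol = sigma_p, sum_of_contributions = sum(rc))
```

The Minimum-Torsion Bets replace the raw assets with the nearest set of uncorrelated factors before performing this kind of attribution.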
Explicit models for bilateral distributions with heavy tails
Patrice KienerInModelia, 5 rue Malebranche, 75005 Paris, France
We present two new models that describe with great accuracy bilateral distributions with heavy tails like those that occur in financial markets. The main interest of these new models is the physical meaning of the tail parameters: they correspond exactly to the power exponents that appear in the Pareto and Karamata formulas.

The three-parameter model (median, scale, tail) corresponds to symmetric distributions and takes explicit forms for the density function, the cumulative distribution function, and the quantile function. The four-parameter model (median, scale, left tail, right tail), or (median, scale, tail, eccentricity), corresponds to asymmetric distributions and takes an explicit form for the quantile function.

The models are presented along with examples on well-known stocks and indexes. Their use in risk measurement, random processes, and quantitative finance is discussed. R was the preferred software to assess these new models. The core functions that define them have been gathered in a new R package, which will be briefly described.
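The explicit models themselves are not reproduced in the abstract. As a related base-R sketch, the Hill estimator recovers the Pareto power exponent that the tail parameters of these models are stated to match (simulated Pareto data, hypothetical settings):

```r
# Hill estimator of a Pareto tail exponent on simulated data
set.seed(5)
alpha <- 3                                   # true Pareto tail exponent
x <- (1 - runif(1e5))^(-1 / alpha)           # Pareto(alpha) via inverse CDF

k  <- 500                                    # number of upper order statistics
xs <- sort(x, decreasing = TRUE)
hill <- 1 / mean(log(xs[1:k] / xs[k + 1]))   # Hill estimate of alpha

c(true = alpha, hill = round(hill, 2))
```

In the authors' models the analogous exponent is read off directly from the fitted tail parameter rather than estimated from order statistics.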
References
[1] B. Mandelbrot, Sur certains prix spéculatifs : faits empiriques et modèle basé sur les processus stables additifs non gaussiens de Paul Lévy, C. R. Acad. Sci. Paris, vol. 254, (1962) 3968-3970.
[2] J-P. Bouchaud, M. Potters, Théorie des risques financiers, CEA Aléa Saclay, 1997.
[3] L. De Haan, On regular variation and its application to the weak convergence of sample extremes, Mathematical Centre Tracts 32, Mathematisch Centrum Amsterdam, 1975.
[4] The R Project for Statistical Computing, http://www.r-project.org
Announcing package Intermediate and Long Memory Time Series (ILMTS)
Karl-Kuno Kunze∗
Fractional View GmbH, Graz†
Ostfalia University of Applied Sciences Wolfsburg, Germany‡
March 2014
Abstract
The proposed lightning talk will present the very recently started project on intermediate and long memory time series estimation, simulation, testing, and pertaining diagnostics. As work will still be in progress in June, an outline of the package together with some examples will be shown.

By now, long memory processes have made it into introductory textbooks on time series and the notion is known to a broad audience. However, only the very basic concepts are touched upon and, as applies to all of statistics, a naïve use of only the most common estimators and tests can entail misleading results.

For long memory processes, autocorrelations are not summable. Intermediate memory processes share some properties with the former but have summable autocorrelations. To name but a few challenges when dealing with intermediate and long memory processes, care has to be taken when
• applying asymptotic thresholds to tests on series of finite size, as asymptotics are sometimes reached only for very long time series, which are unrealistic, e.g., for financial time series;

• separating short memory effects, such as autoregressive processes of finite order, from long memory effects, as estimators may be heavily biased when short memory is present;

• applying some tests (e.g., the modified rescaled range test) to samples of finite size;

• discerning long memory from structural breaks;

• performing rolling or time-varying analysis, as choosing the window size is not straightforward.
The package includes a wide variety of time- and frequency-domain estimators and tests. Special focus is given to finite-size effects, as key findings of research or financial analytics may be seriously affected. Simulation techniques accepted in the literature are included in the package. Broadly used time series objects are accepted as input and output. The package will be available on www.fractionalview.com until acceptance by CRAN.
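The second pitfall listed above can be demonstrated in base R (this is an illustrative sketch, not ILMTS code): on a finite sample, a short-memory AR(1) process can spuriously inflate the aggregated-variance estimate of the Hurst exponent H (H = 0.5 means no long memory).

```r
# Aggregated-variance Hurst estimate on a purely short-memory AR(1) series
set.seed(6)
x <- arima.sim(list(ar = 0.7), n = 8192)     # short memory only, d = 0

agg_var <- function(x, m) {                  # variance of block means, block size m
  blocks <- colMeans(matrix(x[1:(m * (length(x) %/% m))], nrow = m))
  var(blocks)
}
m <- 2^(2:7)
v <- sapply(m, agg_var, x = x)

# For long memory, log var ~ (2H - 2) log m; slope -1 corresponds to H = 0.5
slope <- coef(lm(log(v) ~ log(m)))[2]
H_hat <- unname(1 + slope / 2)
H_hat                                        # inflated above 0.5 by the AR structure
```

Separating such short-memory bias from genuine long memory is exactly the kind of finite-size diagnostic the package aims to provide.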
∗[email protected]†Head of Research and Development, Merangasse 73, 8010 Graz, Austria‡Professor for Business Mathematics and Statistics
Robust and bias-corrected estimation of the coefficient of tail dependence
Christophe Dutang1,⋆, joint work with Yuri Goegebeur2 and Armelle Guillou3

1. Laboratoire Manceau de Mathématiques, Université du Maine, Avenue Olivier Messiaen, F-72000 Le Mans, France
⋆Contact author: [email protected]
2. Department of Mathematics and Computer Science, University of Southern Denmark, Campusvej 55, 5230 Odense M, Denmark
3. Institut de Recherche Mathématique Avancée, UMR 7501, Université de Strasbourg et CNRS, 7 rue René Descartes, 67084 Strasbourg cedex, France
Keywords: bias-correction, tail dependence, robustness, tail quantile process, insurance risk modelling
Multivariate extreme value statistics deals with the estimation of the tail of a multivariate distribution function based on a random sample. Of particular interest is the estimation of the extremal dependence between two or more variables. Modelling tail dependence is a crucial problem in actuarial science (see e.g. Joe (2011)), firstly because of the forthcoming Solvency II regulation framework, which obliges insurers and mutuals to compute 99.5% quantiles.

Secondly, tail dependence can be used in the daily work of actuaries, for instance for pricing an excess-of-loss reinsurance treaty (see Cebrian et al. (2003), and the references therein), and for approximating very large quantiles of the distribution of sums of possibly dependent risks (Barbe et al. (2006)). In finance, obvious applications also arise, see e.g. Charpentier and Juri (2006), and Poon et al. (2004). Therefore, accurate modelling of extremal events is needed to better understand the relationship of possibly dependent risks in the tail.

As proposed in Dutang et al. (2014), we introduce a robust and asymptotically unbiased estimator for the coefficient of tail dependence in multivariate extreme value statistics. The estimator is obtained by fitting a second order model to the data by means of the minimum density power divergence criterion. The asymptotic properties of the estimator are investigated.

This approach is implemented in the R package RTDE. The efficiency of our methodology is illustrated on a small simulation study and by a real dataset from the actuarial context.
References
Barbe, P., A.-L. Fougères, and C. Genest (2006). On the tail behavior of sums of dependent risks. ASTIN Bulletin 36, 361–373.

Cebrian, A., M. Denuit, and P. Lambert (2003). Analysis of bivariate tail dependence using extreme value copulas: An application to the SOA medical large claims database. Belgian Actuarial Bulletin 3, 33–41.

Charpentier, A. and A. Juri (2006). Limiting dependence structures for tail events, with applications to credit derivatives. Journal of Applied Probability 43, 563–586.

Dutang, C., Y. Goegebeur, and A. Guillou (2014). Robust and bias-corrected estimation of the coefficient of tail dependence. Accepted in Insurance: Mathematics and Economics.

Joe, H. (2011). Tail dependence in vine copulae. In Dependence Modeling, pp. 165–187. World Sci. Publ., Hackensack, NJ.

Poon, S., M. Rockinger, and J. Tawn (2004). Extreme-value dependence in financial markets: diagnostics, models, and financial implications. Review of Financial Studies 17, 581–610.
An Economic Scenarios Generator in the Solvency II Framework
Thierry Moudiki and Frédéric Planchet
ISFA
In the past ten years, we have observed a generalization of the use of economic valuations in different frameworks used by insurers: regulation (Solvency II), accounting (IFRS), and financial reporting (MCEV). This has led insurers to use methods originally developed for pricing financial instruments to calculate their liabilities. On this occasion, many challenges have emerged, particularly in life insurance:
• long duration of life insurance liabilities;
• no market;
• partially endogenous risk factors;
• volatility of the value which does not reflect the risks carried.
The aim of this presentation is to present the specific elements induced by the use of economic valuations in the insurance business, and the consequences for the construction of an Economic Scenarios Generator (ESG).
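One typical ESG building block is a stochastic short-rate model simulated over many paths. The following sketch (hypothetical parameters, not the authors' generator) uses a Vasicek rate, dr = a(b - r)dt + sigma dW, discretised with the Euler scheme:

```r
# Monte Carlo paths of a Vasicek short rate (Euler discretisation)
set.seed(8)
a <- 0.5; b <- 0.03; sigma <- 0.01        # mean reversion, long-run level, vol
dt <- 1 / 12; n_steps <- 120; n_paths <- 1000

r <- matrix(0.02, n_paths, n_steps + 1)   # start all paths at 2%
for (t in 1:n_steps) {
  r[, t + 1] <- r[, t] + a * (b - r[, t]) * dt +
                sigma * sqrt(dt) * rnorm(n_paths)
}

# Paths mean-revert from 2% toward the long-run level b = 3%
round(c(start = mean(r[, 1]), after_10y = mean(r[, n_steps + 1])), 4)
```

In a full Solvency II ESG, such rate paths would be calibrated to market prices and combined with equity, credit, and inflation models under a market-consistent measure.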
Anomalous Price Impact and Critical Liquidity in Financial Markets
Jean-Philippe Bouchaud
École Polytechnique & CFM
Two striking empirical stylized facts of market microstructure are now firmly established. One is the long-range correlation of the direction of trades, and the other is the square-root dependence of the impact on the volume of a metaorder. We argue that both are a consequence of the fact that markets operate in a regime of vanishing liquidity. Using minimal assumptions, we show that the local volume in the (latent) order book is a linear function of the price depth, which implies a concave impact function. Our analytical results are confirmed in detail using a numerical model of order flow based on realistic assumptions.
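The square-root law referred to above is commonly written impact(Q) ≈ Y · sigma · sqrt(Q/V); the numerical sketch below uses hypothetical parameter values purely to illustrate its concavity (it is not from the talk):

```r
# Square-root impact law with illustrative (hypothetical) parameters
Y <- 0.7               # empirical prefactor, of order one
sigma_daily <- 0.02    # daily volatility
V <- 1e6               # daily traded volume (shares)
Q <- c(1e3, 1e4, 1e5)  # metaorder sizes

impact <- Y * sigma_daily * sqrt(Q / V)

# Concavity: a 100x larger metaorder moves the price only 10x more
round(impact, 5)
```

This concavity, independent of instrument details, is the "anomalous" feature the talk traces back to critically low latent liquidity near the current price.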
Dynamic style allocation of characteristic-based equity portfolios
David Ardia∗ Kris Boudt† Marjan Wauters‡
Abstract
Many ETFs track simple equity indices in which the portfolio weights are a normalized version of the individual stock characteristics, such as the stock's market capitalization, its earnings or its risk. These characteristic-based portfolios claim to be good proxies for the mean–variance efficient portfolio, but earlier research has shown that their performance is time-dependent. The differences in life-cycle performance give rise to a diversification opportunity. Combining investment styles allows an investor to benefit from the diversification opportunity and exploit the life-cycle specificity.

The widespread use of characteristic-based portfolios raises several important research questions that are addressed in this article. First, can the mean–variance criterion be used to create dynamic style portfolios that allocate across those characteristic-based portfolios? Second, is there a gain in switching from mean–variance allocation at the most granular equity level to mean–variance allocation across the characteristic-based portfolios? Third, can we improve performance by replacing the mean–variance criterion with the diversification objective of the equal-weight or equally-weighted risk contribution allocation?

In this article, we construct dynamic style portfolios to exploit the expected timing gains and apply this to the S&P 500 universe over the period 1990-2012. We use the mean–variance criterion to create dynamic style portfolios and find that an investor is willing to pay an annual fee of 2.9% to 5.4% to switch from mean–variance allocation at the asset level to a characteristic-based allocation. Furthermore, we find that imposing structure reduces the impact of estimation error and improves performance. We find similar results for other criteria (like equal weighting) and when accounting for turnover constraints.
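The mean–variance allocation across style portfolios reduces, in its unconstrained form, to w ∝ Σ⁻¹μ. A minimal sketch with hypothetical inputs (style names, expected returns, and covariances are invented for illustration, not the paper's estimates):

```r
# Unconstrained mean-variance weights across three style portfolios
mu <- c(cap_weight = 0.06, fundamental = 0.08, low_risk = 0.07)   # hypothetical
Sigma <- matrix(c(0.030, 0.025, 0.018,
                  0.025, 0.035, 0.020,
                  0.018, 0.020, 0.025), 3, 3)                     # hypothetical

w <- solve(Sigma, mu)   # w proportional to Sigma^{-1} mu
w <- w / sum(w)         # normalise to full investment
round(w, 3)             # weights may be negative: the criterion is unconstrained
```

Allocating at this coarse style level, rather than over hundreds of individual stocks, is what the paper finds reduces the impact of estimation error in μ and Σ.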
∗Département de finance, assurance et immobilier, Université Laval, Québec City (Québec), Canada; [email protected]
†Vrije Universiteit Brussel and V.U. University Amsterdam; [email protected]
‡KU Leuven; [email protected]; Corresponding author.
The ROI Package in Action: Portfolio Optimization and Beyond
Stefan Theußl
May 6, 2014
Affiliation:
Dep. Quant Research/Emerging Markets, Raiffeisen RESEARCH, Austria
Abstract:
Currently, R (R Development Core Team, 2014) and a wide variety of contributed packages on CRAN as well as other package repositories offer tools to solve many different optimization problems. The development of many extensions to R over recent years (see e.g., the CRAN Task View on Optimization and Mathematical Programming, Theußl, 2014) shows that there is demand beyond the built-in routines. With the increasing popularity of R in the area of finance, higher demand emerges for appropriate tools to handle different optimization problem classes transparently. However, the user interfaces to available optimizers and the output, i.e., the format of the returned solution, often differ considerably. Therefore, offering an integrative multi-purpose optimization framework for R to solve various types of problems, such as portfolio optimization problems, seems desirable.

In this talk we present the R Optimization Infrastructure package (ROI, Theußl et al., 2013), an extensible framework for modeling and solving linear as well as nonlinear (possibly mixed-integer) optimization problems. It offers a modelling layer which is able to communicate via so-called plug-ins with many different (open source and commercial) solvers. These solvers are capable of handling optimization problems of various classes.

We show how one can create and solve portfolio optimization models using the ROI package. Furthermore, we give an outlook on how to use ROI beyond portfolio optimization.
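A small sketch of the kind of model the talk discusses, using ROI's modelling layer (the covariance matrix is hypothetical, and the quadprog plug-in, ROI.plugin.quadprog, must be installed for this solver choice): a long-only minimum-variance portfolio, minimising w'Σw subject to full investment.

```r
# Minimum-variance portfolio as a quadratic program in ROI
library(ROI)

Sigma <- matrix(c(0.04, 0.01, 0.00,
                  0.01, 0.09, 0.02,
                  0.00, 0.02, 0.16), 3, 3)   # hypothetical covariance matrix

qp <- OP(objective   = Q_objective(Q = 2 * Sigma),  # ROI uses 1/2 w'Qw, so Q = 2*Sigma
         constraints = L_constraint(L   = rbind(rep(1, 3)),
                                    dir = "==",
                                    rhs = 1),       # weights sum to one
         bounds      = V_bound(li = 1:3, lb = rep(0, 3)))   # long-only

sol <- ROI_solve(qp, solver = "quadprog")
round(solution(sol), 3)
```

Swapping the solver argument (e.g., to a conic or mixed-integer plug-in) leaves the model formulation untouched, which is the point of the plug-in architecture described above.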
References
R Development Core Team. R: A Language and Environment for Statistical Computing. R Foundation for Statistical Computing, Vienna, Austria, 2014. URL http://www.R-project.org/. ISBN 3-900051-07-0.

Stefan Theußl. CRAN task view: Optimization and mathematical programming, 2014. URL http://CRAN.R-project.org/view=Optimization.

Stefan Theußl, Kurt Hornik, and David Meyer. ROI: R Optimization Infrastructure, 2013. URL http://CRAN.R-project.org/package=ROI. R package version 0.1-0.
A Bayesian Investment Strategy: For a Better Design of Portfolios
Tobias Setz, Diethelm Würtz*, Venetia Christodoulopoulou
Swiss Federal Institute of Technology, Zurich, Switzerland *Contact author: [email protected]
Keywords: Portfolio Design, Bayesian Change Points, MCMC Simulations, Shiny

We present an investment strategy based on predictive Bayesian Change Point (BCP) stability analytics [4] and Markov Chain Monte Carlo (MCMC) simulations [2, 3] for a better design of portfolios. Our approach makes optimization obsolete and comes with many additional advantages compared with the standard Markowitz investment approach [1].
Our portfolios are characterized by a high degree of stability of the underlying price process resulting in a steady increase of returns, low drawdowns, short recovery times, and low volatilities. We present results for several performance and risk measures and compare them with common benchmarks to demonstrate the advantage of the stabilization process.
As an example we discuss a Euro-based ETF portfolio built from Large and Small Cap Equities, REITs, and Government Bonds. All calculations were done in R using the Rmetrics package family. Furthermore, an R Shiny web application that visualizes stability forecasts and portfolio rebalancing will be demonstrated.
References
[1] Markowitz Harry (1952). Portfolio Selection. Journal of Finance Vol. 7 No. 1, 77-91.
[2] Barry Daniel, and Hartigan John A. (1993). A Bayesian Analysis for Change Point Problems. Journal of the American Statistical Association 88, 309-319.
[3] Erdman Chandra, and Emerson John W. (2008). A fast Bayesian change point analysis for the segmentation of microarray data. Bioinformatics 24, 2143-2148.
[4] Würtz Diethelm, Chalabi Yohan, Ellis Andrew, and Theussl Stefan (2010). Proceedings of the Singapore Conference on "Computational Finance and Financial Engineering", pp. 205-213.

Presented at the R/Rmetrics Workshop and Summer School, Paris, 26–28 June 2014.
New Directions in Tactical Asset Management: The Analysis of Spot and Forward FX Markets
Jan Hendrik Witte1, Tobias Setz2, Diethelm Würtz2, Venetia Christodoulopoulou2
1Record Currency Management, Windsor, UK; 2Swiss Federal Institute of Technology, Zurich, CH
R/Rmetrics Workshop and Summer School, Paris, 26–28 June 2014
Modern Portfolio Theory goes back to Harry Markowitz [1952]. When he published his article Portfolio Selection more than half a century ago, our knowledge of mathematical finance, econometrics, and statistics, as well as computer science, was much less developed than the options and tools we have available today. In this presentation, we would like to share new ideas based on modern concepts of stability analytics, which allow for an alternative view on performance and risk in funds and portfolios and their impact on indexation techniques and tactical asset management.

The main topics we would like to address are based on statistical methods for the identification of instabilities. We would like to detect and explore vulnerabilities in the dynamics behind financial markets. With our new approach, we would like to reinvestigate and contribute to a better understanding of performance and risk. The indicators and measures we use allow us to create a distinct view of the weaknesses of classical allocation methods and modern portfolio theory, and to venture into alternative and new directions.

We have introduced (Würtz et al. [2010]) a method to identify and locate change points and structural breaks in financial and economic time series. We named this method BCP (Bayesian change point) Stability Analytics, an approach based on the work of Barry and Hartigan [1993] on the partitioning of a time series into strings with different parameter settings. The parameters are the same within each partition, but change from one to the next. The method makes use of Bayesian statistics and a Markov Chain Monte Carlo approach (implemented by Erdman and Emerson [2008]). The analytics can be used to explore financial markets and financial investments before, during, and after critical financial and economic periods, which include, for example, the recent sub-prime crisis or the European debt crisis.

From a "stabilization" of a financial time series, we expect a steady increase in the cumulative returns together with low volatility, small drawdowns, and short recovery times. We show that BCP Stability Analytics can detect, analyse, quantify, and even predict vulnerabilities of a time series process to external forces. Thus, we can define figures to measure the structure and strength of instabilities appearing over time. As a practical example, we assess the fragility of different currencies in spot and forward FX markets. We show how wealth-protected FX indices can be constructed, and how they can be combined in an FX portfolio. As a valuable tool to visualize the results, we additionally present an R Shiny app.

References
Markowitz Harry [1952], Portfolio Selection, Journal of Finance Vol. 7 No. 1, pp. 77-91.

Barry Daniel, and Hartigan John A. [1993], A Bayesian Analysis for Change Point Problems, Journal of the American Statistical Association 88, 309.

Erdman Chandra, and Emerson John W. [2008], A fast Bayesian change point analysis for the segmentation of microarray data, Bioinformatics Vol. 24, 2143.

Würtz Diethelm, Chalabi Yohan, Ellis Andrew, and Theussl Stefan [2010], Proceedings of the Singapore Conference on "Computational Finance and Financial Engineering", pp. 205-213, Finance Online Publishing, Zurich.
Note: All calculations were done in the R, R Shiny, and Rmetrics software environments.

Contact Address: Diethelm Würtz, [email protected]
Stability Analysis of the Swiss Performance Index
Cyril Bachelard2, Tobias Setz1, Diethelm Würtz1, and Lorenz Beyeler2
1Institute for Theoretical Physics, ETH Zurich
2OLZ & Partners Asset and Liability Management AG, Berne
May 2014
Abstract
"Bayesian Statistics" and "Markov Chain Monte Carlo" methods are two powerful approaches. They allow us to locate and quantify structural changes and structural breaks in the dynamics of financial return series. Würtz et al. [2010] used this and related approaches to define new financial stability measures to evaluate performance and risk on a common level.

In this talk we use the "Product Partition Method" of Barry and Hartigan [1992, 1993] in the implementation of Erdman and Emerson [2008] to define a new investment indicator and rebalancing scheme for the Swiss Performance Index, SPI. The aim is to better protect the capital when investing into the Swiss market. Our investment strategy is applied to the SPI Index, to its sector indices, and to more than 100 equities listed in the index.

The result is a capital-protected investment with a stable positive trend in the wealth, essentially lower drawdowns, shorter recovery times from losses, and much lower volatilities.
Talk presented at the R/Rmetrics Workshop and Summer School, Paris, June 2014
Contact Address: [email protected]
References
D. Würtz, Y. Chalabi, A. Ellis, W. Chen, S. Theussl, Proceedings of the Singapore Workshop on Computational Finance, Editors: Diethelm Würtz, Mahendra Mehta, David Scott, and Juri Hinz, Rmetrics Publishing, ISBN/EAN 9783906041087, www.rmetrics.org, p. 209, 2010.

D. Barry and J.A. Hartigan, Product Partition Models for Change Point Problems, The Annals of Statistics 20, 260, 1992.

D. Barry and J.A. Hartigan, A Bayesian Analysis of Change Point Problems, Journal of the American Statistical Association 88, 309, 1993.

Ch. Erdman and J.W. Emerson, A Fast Bayesian Change Point Analysis for the Segmentation of Microarray Data, Bioinformatics 24, 2143, 2008.
Risk Parity for CVaR and Downside Risk
Evgeny Bauman
Quantitative Research Department
Markov Processes International LLC
Summit, NJ, 07091, US
Summary
Risk Parity is one of the most popular heuristic asset allocation methods today [1, 2]. The idea of the
technique is to construct a portfolio balanced in such a way that the risk contribution of each asset is
the same. Originally, the Risk Parity approach used portfolio volatility as the risk measure. There
have been attempts to apply the Risk Parity methodology to other risk measures: Downside Volatility, Value at Risk,
Conditional Value at Risk (CVaR), etc. Principles of Risk Parity portfolio construction were developed in
[2] for coherent measures as introduced in [3].
We consider the problems of Risk Parity and Risk Budgeting portfolios for two measures: Downside
Volatility and CVaR. CVaR (also known as Expected Shortfall) is the expected loss given that the loss is beyond a
given quantile of the loss distribution [4]. Downside Volatility (semi-variance) is the expected value of the
squared negative deviations from a specified “target” rate of return [5]. Both measures are coherent.
However, algorithms for these two cases have not yet been well developed. The distinctive feature of
these two measures is the possible existence of a risk-free portfolio. We suggest a new optimization
formulation for this problem, which allows us to develop efficient algorithms. The conditions for the
existence of a Risk Parity portfolio are investigated, and a comparative analysis of CVaR Risk Parity,
Downside Risk Parity and Volatility Risk Parity portfolios is carried out.
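The volatility-based Risk Parity baseline against which the CVaR and downside variants are compared can be sketched as follows. This Python snippet uses the standard log-barrier formulation (minimising 0.5 w'Cw minus the sum of log w_i, whose first-order condition forces equal risk contributions); the covariance matrix is invented for illustration, and this is not the author's CVaR or Downside Risk algorithm.

```python
import numpy as np
from scipy.optimize import minimize

def risk_parity_weights(cov):
    """Equal-risk-contribution weights for the volatility risk measure.
    The first-order condition of the log-barrier objective is
    (C w)_i = 1 / w_i, i.e. w_i (C w)_i is equal across assets."""
    n = cov.shape[0]
    obj = lambda w: 0.5 * w @ cov @ w - np.sum(np.log(w))
    grad = lambda w: cov @ w - 1.0 / w
    res = minimize(obj, np.full(n, 1.0), jac=grad, method="L-BFGS-B",
                   bounds=[(1e-8, None)] * n)
    w = res.x
    return w / w.sum()          # rescale to a fully invested portfolio

# Illustrative covariance matrix for three assets (vols 20%, 30%, 40%).
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])
w = risk_parity_weights(cov)
rc = w * (cov @ w)              # per-asset risk contributions (up to a factor)
```

The riskier the asset, the smaller its weight, while all entries of `rc` come out equal, which is the defining property of the Risk Parity portfolio.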
Key words and phrases: Risk Parity, Risk Measure, Downside Volatility, CVaR.
References:
[1]. Thiagarajan R.S. and Schachter B. (2011). Risk Parity: Rewards, Risks, and Research
Opportunities, Journal of Investing, Spring, 79-89.
[2]. Roncalli T. (2014). Introduction to Risk Parity and Budgeting, CRC Press, Taylor & Francis
Group, Boca Raton, London, New York, 410 p.
[3]. Artzner P., Delbaen F., Eber J.-M., and Heath D. (1999): Coherent Measures of Risk,
Mathematical Finance 9(3), 203-228.
[4]. Rockafellar R.T. and Uryasev S.P. (2002). Conditional value-at-risk for general loss
distributions, Journal of Banking and Finance 26, 1443–1471.
[5]. Bauman E. and Markov M. (2010). Downside Risk Optimization via Quasi-Gradient Algorithm,
Proceedings of the 24th European Conference on Operational Research (EURO 2010, Lisbon, 11–14
July 2010), 260.
Quantifying Model Risk in Pricing of
Path-dependent Derivatives
Vineet Virmani
Indian Institute of Management
Ahmedabad, India
March 29, 2014
Abstract
Model selection and model uncertainty go hand-in-hand in derivatives pricing. However, while there is uncertainty associated with the selection of any model, the context is paramount. Pricing path-dependent derivatives, particularly those whose payoff depends crucially on ‘forward volatility’, inevitably involves working with sophisticated stochastic volatility models. This introduces two levels of model uncertainty in the valuation of such products: the risk associated with using an ‘incorrect’ model and that associated with errors in the calibration of model parameters.
This study is an attempt at understanding model uncertainty in the pricing of barrier and cliquet options (a portfolio of ‘forward-setting’ call spreads) when using four alternative stochastic volatility models (Heston, Bates, Double Lognormal and Double Heston). After discussing issues in pricing cliquet-like products, we study calibration issues in some detail and try to identify the ‘best’ optimization routine within R for the task (preliminary evidence points towards Differential Evolution). Drawing from Cont (2006), the study ends with a review of ways of quantifying model risk in practice.
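In the R setting of the talk, Differential Evolution is typically accessed through the DEoptim package; as a hedged sketch of the same calibration workflow, the following Python snippet applies SciPy's analogous `differential_evolution` routine to a toy problem. The quadratic smile below stands in for a full Heston pricer, which is far too long to reproduce here; all parameter values are invented for illustration.

```python
import numpy as np
from scipy.optimize import differential_evolution

# A quadratic volatility smile sigma(k) = a + b*k + c*k^2 in
# log-moneyness k stands in for a full stochastic-volatility pricer.
def model_vols(params, k):
    a, b, c = params
    return a + b * k + c * k ** 2

k_grid = np.linspace(-0.3, 0.3, 13)
true_params = (0.20, -0.15, 0.40)            # "market" smile to recover
market_vols = model_vols(true_params, k_grid)

def loss(params):
    # Sum of squared vol errors: the usual calibration objective.
    return np.sum((model_vols(params, k_grid) - market_vols) ** 2)

# Differential Evolution needs only box bounds, no gradients or a
# good starting point, which is why it is attractive for calibration.
result = differential_evolution(loss, seed=1, tol=1e-10,
                                bounds=[(0.05, 0.5), (-1.0, 1.0), (0.0, 2.0)])
```

The practical issues studied in the talk (choice of bounds, seeding, convergence tolerance) carry over directly to the R routine.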
Modeling Life Expectancy at Birth for Multi-Population: A Cointegration Approach
A. Ntamjokouen, Bergamo University
S. Habermann, Cass Business School
G. Consigli, Bergamo University
Abstract
The continuous improvements in mortality rates and life expectancy over the last century have attracted particular attention from academics, life insurers, financial engineers and pension planners, especially in developed countries. Mortality-linked securities such as longevity bonds (the EIB/BNP as well as Swiss Re bonds), survivor swaps or mortality forwards (q-forwards) have recently appeared in the industry to help operators hedge such risks. A classical longevity bond proposed in the literature has coupon payments linked to the lifetime of the last survivor in an insurance reference portfolio. It therefore appears crucial to improve the accuracy of future life expectancy projections. In this paper, we investigate the time-varying dependency associated with the common trends that drive regional life expectancy within Canada. We compare the three main models that have recently appeared in the literature, namely the autoregressive integrated moving average (ARIMA) model, the vector autoregressive (VAR) model and the vector error correction model (VECM), to analyse the common factors that have determined a progressive shift of life expectancy in six Canadian regions. The VECM shows better performance than VAR and ARIMA in terms of backtesting and ability to capture the dynamics of common life expectancy. These results contrast with what has been done in the existing literature. Findings from this analysis are useful for local insurers in their goal to project life expectancy improvements and to forecast future trends.
KEY WORDS: Life expectancy at birth, VECM, VAR, ARIMA, Confidence Interval
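A VECM fit itself requires a full econometrics stack; as a minimal sketch of the error-correction idea behind it, the following Python snippet runs the Engle-Granger two-step procedure on two synthetic series that share a common stochastic trend, loosely mimicking two regional life expectancy series. The data-generating process and all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
# A single common stochastic trend drives both "regional" series.
trend = np.cumsum(rng.normal(0.02, 0.1, n))
y1 = trend + rng.normal(0, 0.05, n)
y2 = 0.9 * trend + 1.0 + rng.normal(0, 0.05, n)

# Step 1: estimate the cointegrating relation y1 = alpha + beta * y2 by OLS.
X = np.column_stack([np.ones(n), y2])
alpha, beta = np.linalg.lstsq(X, y1, rcond=None)[0]
resid = y1 - alpha - beta * y2        # the "equilibrium error"

# Step 2: error-correction regression  dy1_t = gamma * resid_{t-1} + eps_t.
dy1 = np.diff(y1)
gamma = np.linalg.lstsq(resid[:-1, None], dy1, rcond=None)[0][0]
# A negative gamma means deviations from the common trend are corrected
# over time; a VECM estimates this jointly for all regions at once.
```

Here `beta` recovers the ratio between the two series' trend loadings and `gamma` is negative, the signature of error correction that ARIMA models fitted region by region cannot exploit.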
Informed trades, uninformed trades and market
resiliency: Evidence from a limit order book
market
Rajat Tayal
Indira Gandhi Institute of Development Research, Mumbai, India
April 27, 2014
Abstract
Advantages of speed, simplicity, anonymity and low costs have driven the rapid adoption of the electronic limit order book as the market microstructure for trading equities, bonds, foreign exchange and derivatives worldwide. However, unlike traditional markets such as the NYSE, where the market maker stands ready to supply liquidity at any time, limit order book markets depend on public limit orders to provide liquidity, raising questions regarding the resiliency of the market mechanism under stress.
Resiliency addresses an important question for market participants and regulators: when trades change market prices and lead to temporary pricing errors, how fast are these pricing errors eliminated through the competitive action of value traders, dealers and other market participants? The seminal literature on market resiliency by Kyle (1985) defines it as the speed with which pricing errors caused by “uninformative order flow shocks” are corrected or neutralised in the market.
Conventional studies of market resiliency examine the replenishment mechanism of the order book after any liquidity demand shock, not just those that are “random” and “uninformative”. This paper is the first study to attempt the measurement of resiliency using a novel proprietary dataset of informed and uninformed trades and high-frequency order book information from one of the fastest-growing stock markets in the world.
We make use of intraday trades in stocks that are part of index funds traded on the National Stock Exchange of India. Index fund trades in the underlying securities forming the fund are identified as ‘uninformed trades’ done in response to investment in the scheme and subscriptions/redemptions in the index fund. High-frequency order book information at a one-second frequency is used to characterise the prices and liquidity of the market. An intraday event study framework is implemented in R to estimate the resiliency of the stocks.
Various dimensions of the resiliency of the stocks are explored, including cross-sectional heterogeneity, asymmetry of resiliency in response to buy and sell trades, intraday seasonality, and cross-sectional determinants of the resiliency of stocks.
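The intraday event study of the talk relies on proprietary data and R code; as a hedged sketch of one way resiliency can be quantified, the snippet below models the post-trade pricing error as an AR(1) process and estimates its half-life, the number of periods until half of a pricing error has been eliminated. The persistence parameter and the data are synthetic, not from the study.

```python
import numpy as np

rng = np.random.default_rng(7)
phi_true, n = 0.8, 2000
e = np.zeros(n)
for t in range(1, n):
    # Pricing error decays geometrically while new shocks keep arriving.
    e[t] = phi_true * e[t - 1] + rng.normal(0.0, 0.01)

# Estimate persistence by OLS of e_t on e_{t-1} (no intercept needed:
# the error is mean zero by construction), then convert to a half-life.
phi_hat = np.linalg.lstsq(e[:-1, None], e[1:], rcond=None)[0][0]
half_life = np.log(0.5) / np.log(phi_hat)   # periods to halve a shock
```

A more resilient stock corresponds to a smaller persistence and hence a shorter half-life; comparing half-lives after informed versus uninformed trades is one way to express the asymmetries the paper investigates.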
JEL Classification: G10, G14, C58.
Keywords: Market microstructure, limit order book, resiliency, event studies, high frequency.