
A robust statistical approach to select adequate error distributions for financial returns
Hambuckers, Julien; Heuchenne, Cédric. Journal of Applied Statistics (in press)
In this article, we propose a robust statistical approach to select an appropriate error distribution in a classical multiplicative heteroscedastic model. In a first step, unlike the traditional approach, we do not use any GARCH-type estimation of the conditional variance. Instead, we use a recently developed nonparametric procedure (Mercurio and Spokoiny, 2004): the Local Adaptive Volatility Estimation (LAVE). The motivation for using this method is to avoid a possible misspecification of the conditional variance model. In a second step, we suggest a set of estimation and model selection procedures (Berk-Jones tests, kernel density-based selection, censored likelihood scores, coverage probabilities) based on the resulting residuals. These methods allow us to assess the global fit of a set of distributions as well as to focus on their behavior in the tails, giving us the capacity to map the strengths and weaknesses of the candidate distributions. A bootstrap procedure is provided to compute the rejection regions in this semiparametric context. Finally, we illustrate our methodology through a small simulation study and an application to three time series of daily returns (UBS stock returns, BOVESPA returns and EUR/USD exchange rates).

Estimating the out-of-sample predictive ability of trading rules: a robust bootstrap approach
Hambuckers, Julien; Heuchenne, Cédric. Journal of Forecasting (in press)
In this paper, we provide a novel way to estimate the out-of-sample predictive ability of a trading rule.
Usually, this ability is estimated using a sample-splitting scheme, since true out-of-sample data are rarely available. We argue that this method makes poor use of the available data and creates data-mining possibilities. Instead, we introduce an alternative .632 bootstrap approach. This method allows us to build in-sample and out-of-sample bootstrap datasets that do not overlap but exhibit the same time dependencies. We show in a simulation study that this technique drastically reduces the mean squared error of the estimated predictive ability. We illustrate our methodology on IBM, MSFT and DJIA stock prices, where we compare 11 trading rule specifications. For the considered datasets, two different filter rule specifications have the highest out-of-sample mean excess returns. However, none of the tested rules can beat a simple buy-and-hold strategy when trading at a daily frequency.

Modeling operational losses: a conditional Generalized Pareto regression based on a single-index assumption
Hambuckers, Julien; Heuchenne, Cédric. Scientific conference (2016, March 09)
In this paper, we consider a regression model in which the tail of the conditional distribution of the response can be approximated by a Generalized Pareto distribution. Our model is based on a semiparametric single-index assumption on the conditional tail index, while no further assumption on the conditional scale parameter is made.
The underlying dimension reduction assumption makes the procedure of prime interest when the dimension of the covariates is high, a case in which purely nonparametric techniques fail while purely parametric ones are too rough to fit the data correctly. We propose an iterative algorithm to perform the practical implementation. Our results are supported by simulations. To illustrate the proposed approach, the method is applied to a novel database of operational losses from the bank UniCredit.

Modeling operational losses: a conditional Generalized Pareto regression based on a single-index assumption
Hambuckers, Julien; Heuchenne, Cédric. Scientific conference (2016, February 24)
In this paper, we consider a regression model in which the tail of the conditional distribution of the response can be approximated by a Generalized Pareto distribution. Our model is based on a semiparametric single-index assumption on the conditional tail index, while no further assumption on the conditional scale parameter is made. The underlying dimension reduction assumption makes the procedure of prime interest when the dimension of the covariates is high, a case in which purely nonparametric techniques fail while purely parametric ones are too rough to fit the data correctly. We propose an iterative algorithm to perform the practical implementation. Our results are supported by simulations.
To illustrate the proposed approach, the method is applied to a novel database of operational losses from the bank UniCredit.

Modeling the dependence between extreme operational losses and economic factors: a conditional semi-parametric Generalized Pareto approach
Hambuckers, Julien; Heuchenne, Cédric. Conference (2015, December)
In this paper, we model the severity distribution of operational losses, conditional on some covariates. Previous studies [Chernobai et al., 2011; Cope et al., 2012; Chavez-Demoulin et al., 2014a] suggest that this distribution might be influenced by macroeconomic and firm-specific factors. We introduce a conditional Generalized Pareto model, where the shape parameter is an unknown function of a linear combination of the covariates. More precisely, we rely on a single-index assumption to perform a dimension reduction that makes it possible to use univariate nonparametric techniques. Hence, we suffer neither from too strong parametric assumptions nor from the curse of dimensionality. We then develop an iterative approach to estimate this model, based on the maximisation of a semiparametric likelihood function. Finally, we apply this methodology to a novel database provided by the bank UniCredit. We use firm-specific factors to estimate the conditional shape parameter of the severity distribution. Our analysis suggests that the leverage ratio of the company, the proportion of revenue coming from fees, and the risk category have an important impact on the tail thickness of this distribution, and thus on the probability of suffering large operational losses.
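The single-index GPD idea above can be sketched numerically. In the sketch below, a fixed exponential link stands in for the paper's unspecified (nonparametric) link function, the data are synthetic, and all parameter values are illustrative; this is a toy version of the estimation step, not the authors' algorithm:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)

# Synthetic covariates and an illustrative "true" index direction
# (values chosen for the demo, not taken from the paper).
n = 2000
X = rng.normal(size=(n, 2))
beta_true = np.array([0.6, -0.4])

# Single-index assumption: the GPD shape xi depends on x only through
# beta' x; an exponential link stands in for the nonparametric link.
xi = 0.1 * np.exp(X @ beta_true)
losses = stats.genpareto.rvs(c=xi, scale=1.0, random_state=rng)

def nll(theta):
    """Negative log-likelihood of the conditional GPD model."""
    b, log_scale = theta[:2], theta[2]
    xi_hat = 0.1 * np.exp(X @ b)
    return -np.sum(stats.genpareto.logpdf(losses, c=xi_hat,
                                          scale=np.exp(log_scale)))

start = np.array([0.0, 0.0, 0.0])
res = optimize.minimize(nll, start, method="Nelder-Mead",
                        options={"maxiter": 5000})
beta_hat, scale_hat = res.x[:2], np.exp(res.x[2])
```

In the paper, the link itself is estimated nonparametrically and no assumption is made on the conditional scale; both are simplified here so that the likelihood maximisation fits in a few lines.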
What are the determinants of the operational losses severity distribution? A multivariate analysis based on a semiparametric approach
Hambuckers, Julien; Heuchenne, Cédric. Poster (2015, June)
In this paper, we analyse a database of around 41,000 operational losses from the European bank UniCredit. We investigate three kinds of covariates (firm-specific, financial and macroeconomic) and study their relationship with the shape parameter of the severity distribution. To do so, we introduce a semiparametric approach to estimate the shape parameter of the severity distribution, conditionally on large sets of covariates. Relying on a single-index assumption to perform a dimension reduction, this approach avoids both the curse of dimensionality of purely multivariate nonparametric techniques and too restrictive parametric assumptions. We show that taking into account variables measuring the economic well-being of the bank can cause the required Operational Value-at-Risk to vary drastically. In particular, a high pre-tax ROE, efficiency ratio and stock price are associated with a low shape parameter of the severity distribution, whereas a high market volatility, leverage ratio and unemployment rate are associated with higher tail risks. Finally, we argue that the considered approach could be an interesting tool to improve the estimation of the parameters in a Loss Distribution Approach, and an interesting methodology to study capital requirement variations through scenario analyses.
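Why the shape parameter matters for capital can be seen from the standard peaks-over-threshold quantile formula, VaR_q = u + (sigma/xi) * [((n/N_u)(1-q))^(-xi) - 1], a textbook GPD result rather than code from the poster. The numbers below are purely illustrative:

```python
import numpy as np

def gpd_var(q, u, sigma, xi, n, n_exceed):
    """Peaks-over-threshold VaR: quantile q of the loss distribution
    when excesses over threshold u follow a GPD(sigma, xi)."""
    p = n / n_exceed * (1.0 - q)  # tail probability rescaled to excesses
    return u + sigma / xi * (p ** (-xi) - 1.0)

# Illustrative numbers: 41,000 losses, 500 excesses over a threshold
# u = 100,000; only the shape parameter xi differs between the two calls.
v_low = gpd_var(0.999, 1e5, 5e4, 0.5, 41000, 500)
v_high = gpd_var(0.999, 1e5, 5e4, 0.9, 41000, 500)
```

Raising xi from 0.5 to 0.9 here inflates the 99.9% quantile by roughly two thirds, which is the sense in which covariates that move the shape parameter can make the required Operational Value-at-Risk vary drastically.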
Nonparametric and bootstrap techniques applied to financial risk modeling
Hambuckers, Julien. Doctoral thesis (2015)
For the purpose of quantifying financial risks, risk managers need to model the behavior of financial variables. However, the construction of such mathematical models is a difficult task that requires careful statistical approaches. Among the important choices that must be addressed, we can list the error distribution, the structure of the variance process, and the relationship between parameters of interest and explanatory variables. In particular, one may wish to avoid procedures that rely either on too rigid parametric assumptions or on inefficient estimation procedures. In this thesis, we develop statistical procedures that tackle some of these issues, in the context of three financial risk modelling applications. In the first application, we are interested in selecting the error distribution in a multiplicative heteroscedastic model without relying on a parametric volatility assumption. To avoid this uncertainty, we develop a set of model estimation and selection tests relying on nonparametric volatility estimators and focusing on the tails of the distribution. We illustrate this technique on UBS, BOVESPA and EUR/USD daily returns. In the second application, we are concerned with modeling the tail of the operational losses severity distribution, conditionally on several covariates. We develop a flexible conditional GPD model, where the shape parameter is an unspecified link function (nonparametric part) of a linear combination of covariates (single-index part), avoiding the curse of dimensionality.
We successfully apply this technique to two original databases, using macroeconomic and firm-specific variables as covariates. In the last application, we provide an efficient way to estimate the predictive ability of trading algorithms. Instead of relying on subjective and noisy sample-splitting techniques, we propose an adaptation of the .632 bootstrap technique to the time series context. We apply these techniques to stock prices to compare 12,000 trading rule parametrizations and show that none can beat a simple buy-and-hold strategy.

A semiparametric model for Generalized Pareto regression based on a dimension reduction assumption
Hambuckers, Julien; Heuchenne, Cédric. E-print/Working paper (2015)
In this paper, we consider a regression model in which the tail of the conditional distribution of the response can be approximated by a Generalized Pareto distribution. Our model is based on a semiparametric single-index assumption on the conditional tail index, while no further assumption on the conditional scale parameter is made. The underlying dimension reduction assumption makes the procedure of prime interest when the dimension of the covariates is high, a case in which purely nonparametric techniques fail while purely parametric ones are too rough to fit the data correctly. We derive the asymptotic normality of the estimators that we define, and propose an iterative algorithm to perform their practical implementation. Our results are supported by simulations and a practical application to a public database of operational losses.

Identifying the best technical trading rule: a .632 bootstrap approach
Hambuckers, Julien; Heuchenne, Cédric. Conference (2014, December 07)
In this paper, we estimate the out-of-sample predictive ability of a set of trading rules. Usually, this ability is estimated using a rolling-window sample-splitting scheme, since true out-of-sample data are rarely available. We argue that this method makes poor use of the available information and creates data-mining possibilities. Instead, we introduce an alternative bootstrap approach, based on the .632 bootstrap principle. This method allows us to build in-sample and out-of-sample bootstrap datasets that do not overlap and exhibit the same time dependencies. We illustrate our methodology on IBM and Microsoft daily stock prices, where we compare 11 trading rule specifications. For the datasets considered, two different filter rule specifications have the highest out-of-sample mean excess returns. However, none of the tested rules can beat a simple buy-and-hold strategy when trading at a daily frequency.

Estimating the out-of-sample predictive ability of trading rules: a robust bootstrap approach
Hambuckers, Julien; Heuchenne, Cédric. E-print/Working paper (2014)
In this paper, we estimate the out-of-sample predictive ability of a set of trading rules. Usually, this ability is estimated using a rolling-window sample-splitting scheme, since true out-of-sample data are rarely available. We argue that this method makes poor use of the available information and creates data-mining possibilities.
Instead, we introduce an alternative bootstrap approach, based on the .632 bootstrap principle. This method allows us to build in-sample and out-of-sample bootstrap datasets that do not overlap and exhibit the same time dependencies. We illustrate our methodology on IBM and Microsoft daily stock prices, where we compare 11 trading rule specifications. For the datasets considered, two different filter rule specifications have the highest out-of-sample mean excess returns. However, none of the tested rules can beat a simple buy-and-hold strategy when trading at a daily frequency.

A new methodological approach for error distributions selection in Finance
Hambuckers, Julien; Heuchenne, Cédric. E-print/Working paper (2014)
In this article, we propose a robust methodology to select the most appropriate error distribution candidate in a classical multiplicative heteroscedastic model. In a first step, unlike the traditional approach, we do not use any GARCH-type estimation of the conditional variance. Instead, we use a recently developed nonparametric procedure (Mercurio and Spokoiny, 2004): the Local Adaptive Volatility Estimation (LAVE). The motivation for using this method is to avoid a possible misspecification of the conditional variance model. In a second step, we suggest a set of estimation and model selection procedures (Berk-Jones tests, kernel density-based selection, censored likelihood scores, coverage probabilities) based on the resulting residuals. These methods allow us to assess the global fit of a given distribution as well as to focus on its behavior in the tails. Finally, we illustrate our methodology on three time series (UBS stock returns, BOVESPA returns and EUR/USD exchange rates).
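A stripped-down version of the second (selection) step can be sketched as follows: a plain rolling-window standard deviation stands in for the LAVE estimator, and a left-tail censored log-likelihood score (in the spirit of the censored likelihood scores mentioned above) compares two candidate error distributions on the standardized residuals. Data and settings are synthetic and illustrative, not those of the paper:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic heteroscedastic returns r_t = sigma_t * eps_t with
# heavy-tailed (Student-t) errors. A rolling-window standard deviation
# stands in for the paper's LAVE volatility estimator.
n = 5000
sigma_t = 0.01 * (1.0 + 0.5 * np.sin(np.linspace(0.0, 20.0, n)))
eps = stats.t.rvs(df=3, size=n, random_state=rng) / np.sqrt(3.0)
r = sigma_t * eps

window = 50
vol = np.array([r[t - window:t].std() for t in range(window, n)])
resid = r[window:] / vol  # approximately standardized residuals

def censored_loglik(dist, x, threshold):
    """Left-tail censored log-likelihood: observations above `threshold`
    contribute only through the probability of the censored region."""
    tail = x < threshold
    ll_tail = dist.logpdf(x[tail]).sum()
    ll_rest = (~tail).sum() * np.log(dist.sf(threshold))
    return ll_tail + ll_rest

thr = np.quantile(resid, 0.10)  # focus on the 10% left tail
score_norm = censored_loglik(stats.norm(*stats.norm.fit(resid)), resid, thr)
df_, loc_, sc_ = stats.t.fit(resid)
score_t = censored_loglik(stats.t(df_, loc_, sc_), resid, thr)
```

The candidate with the higher censored score describes the left tail better; on heavy-tailed data like this, the Student-t candidate should dominate the normal.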
A new methodological approach for error distributions selection in Finance
Hambuckers, Julien; Heuchenne, Cédric. Conference (2014, April)
In this article, we propose a robust methodology to select the most appropriate error distribution candidate in a classical multiplicative heteroscedastic model. In a first step, unlike the traditional approach, we do not use any GARCH-type estimation of the conditional variance. Instead, we use a recently developed nonparametric procedure (Mercurio and Spokoiny, 2004): the Local Adaptive Volatility Estimation (LAVE). The motivation for using this method is to avoid a possible misspecification of the conditional variance model. In a second step, we suggest a set of estimation and model selection procedures (Berk-Jones tests, kernel density-based selection, censored likelihood scores, coverage probabilities) based on the resulting residuals. These methods allow us to assess the global fit of a given distribution as well as to focus on its behavior in the tails. Finally, we illustrate our methodology on three time series (UBS stock returns, BOVESPA returns and EUR/USD exchange rates).

A new methodological approach for error distributions selection
Hambuckers, Julien; Heuchenne, Cédric. Conference (2013, December 15)
Since the 2008 financial crisis, increasing attention has been devoted to the selection of an adequate error distribution in risk models, in particular for Value-at-Risk (VaR) predictions.
We propose a robust methodology to select the most appropriate error distribution candidate in a classical multiplicative heteroscedastic model. In a first step, unlike the traditional approach, we do not use any GARCH-type estimation of the conditional variance. Instead, we use a recently developed nonparametric procedure: the Local Adaptive Volatility Estimation (LAVE). The motivation for using this method is to avoid a possible misspecification of the conditional variance model. In a second step, we suggest a set of estimation and model selection tests based on the resulting residuals. These methods allow us to assess the global fit of a given distribution as well as to focus on its behaviour in the tails. Finally, we illustrate our methodology on three time series (UBS stock returns, BOVESPA returns and EUR/USD exchange rates).

A new methodological approach for error distributions selection
Hambuckers, Julien; Heuchenne, Cédric. Scientific conference (2013, November)
Since the 2008 financial crisis, increasing attention has been devoted to the selection of an adequate error distribution in risk models, in particular for Value-at-Risk (VaR) predictions. We propose a robust methodology to select the most appropriate error distribution candidate in a classical multiplicative heteroscedastic model. In a first step, unlike the traditional approach, we do not use any GARCH-type estimation of the conditional variance. Instead, we use a recently developed nonparametric procedure: the Local Adaptive Volatility Estimation (LAVE). The motivation for using this method is to avoid a possible misspecification of the conditional variance model.
In a second step, we suggest a set of estimation and model selection tests based on the resulting residuals. These methods allow us to assess the global fit of a given distribution as well as to focus on its behaviour in the tails. Finally, we illustrate our methodology on three time series (UBS stock returns, BOVESPA returns and EUR/USD exchange rates).

New issues for the goodness-of-fit test of the error distribution: a comparison between Sinh-arcsinh and Generalized Hyperbolic distributions
Hambuckers, Julien; Heuchenne, Cédric. Conference (2013, April 30)
In this article, we consider a multiplicative heteroskedastic structure of financial returns and propose a methodology to study the goodness-of-fit of the error distribution. We use non-conventional estimation and model selection procedures (Berk-Jones (1978) tests, Sarno and Valente (2004) hypothesis testing, the Diks et al. (2011) weighting method), based on the local volatility estimator of Mercurio and Spokoiny (2004) and the bootstrap methodology, to compare the fit performances of candidate density functions. In particular, we introduce the sinh-arcsinh distributions (Jones and Pewsey, 2009) and show that this family of density functions provides better bootstrap IMSE and better weighted Kullback-Leibler distances.

New issues for the goodness-of-fit test of the error distribution: a comparison between Sinh-arcsinh and Generalized Hyperbolic distributions
Hambuckers, Julien; Heuchenne, Cédric. Conference (2013, April 19)
In this article, we consider a multiplicative heteroskedastic structure of financial returns and propose a methodology to study the goodness-of-fit of the error distribution.
We use non-conventional estimation and model selection procedures (Berk-Jones (1978) tests, Sarno and Valente (2004) hypothesis testing, the Diks et al. (2011) weighting method), based on the local volatility estimator of Mercurio and Spokoiny (2004) and the bootstrap methodology, to compare the fit performances of candidate density functions. In particular, we introduce the sinh-arcsinh distributions (Jones and Pewsey, 2009) and show that this family of density functions provides better bootstrap IMSE and better weighted Kullback-Leibler distances.

Comments on 'The time inconsistency factor: how banks adapt to their savers mix' (C. Laureti and A. Szafarz, working paper, 2012)
Hambuckers, Julien. Scientific conference (2012, October 23)
Discussion of 'The Time-Inconsistency Factor: How Banks Adapt to their Savers Mix' by C. Laureti and A. Szafarz (working paper, 2012).

Modeling rare events with non-normal distributions: an application in finance with the sinh-arcsinh function (Modélisation d'évènements rares à l'aide de distributions non normales : application en finance avec la fonction sinh-arcsinh)
Hambuckers, Julien. Master's dissertation (2011)
In 2008, the financial crisis highlighted the relative inaccuracy of market risk forecasting models in the financial industry. In particular, extreme events were shown to be regularly underestimated. This problem, initially developed in the seminal work of Mandelbrot (1963), is mainly due to financial models relying on the normal law while empirical evidence shows strong leptokurticity in financial time series.
This stylized effect is particularly damaging for the forecasting of indicators like Value-at-Risk (VaR). In this study, we tackle this problem by testing a newly developed probability distribution never used before in finance: the sinh-arcsinh function. Creating different datasets from nonparametric and GARCH models, we fit common density functions (normal, t location-scale, GED, generalized hyperbolic) and the sinh-arcsinh function to the data. We show that, on the leptokurtic datasets extracted from the DJA and the NIKKEI 225, the sinh-arcsinh function achieves a better fit than any other function tested. We also test simple VaR models using the normal, Student's t and sinh-arcsinh distributions, to assess the operational efficiency of the sinh-arcsinh function. We show that models using the sinh-arcsinh function provide more accurate in-sample and out-of-sample VaR forecasts than models using the normal law.
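For reference, the sinh-arcsinh family used in the dissertation has a closed-form standard density (Jones and Pewsey, 2009): with S(x) = sinh(delta * asinh(x) - eps), f(x) = delta * cosh(delta * asinh(x) - eps) * exp(-S(x)^2 / 2) / sqrt(2 * pi * (1 + x^2)), where eps controls skewness and delta controls tail weight. A minimal implementation (the parameter values below are illustrative):

```python
import numpy as np
from scipy import integrate

def sinh_arcsinh_pdf(x, eps=0.0, delta=1.0):
    """Jones-Pewsey sinh-arcsinh density: eps controls skewness,
    delta controls tail weight (delta < 1 gives tails heavier than
    the normal; eps = 0, delta = 1 recovers N(0, 1))."""
    x = np.asarray(x, dtype=float)
    z = delta * np.arcsinh(x) - eps
    s, c = np.sinh(z), np.cosh(z)
    return delta * c / np.sqrt(2.0 * np.pi * (1.0 + x ** 2)) * np.exp(-0.5 * s ** 2)

# A skewed, heavier-tailed member; the density integrates to one
# by construction (change of variables from a standard normal).
mass, _ = integrate.quad(lambda t: sinh_arcsinh_pdf(t, eps=-0.5, delta=0.8),
                         -np.inf, np.inf)
```

Setting eps = 0 and delta = 1 recovers the standard normal, which is one reason the family nests the Gaussian benchmark used in the VaR comparison.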