ORBi Collection: Quantitative methods in economics & management
http://hdl.handle.net/2268/71
Constrained Vehicle Routing and speed optimization Problem (CVRsoP) for electric vehicles.
http://hdl.handle.net/2268/182055
Title: Constrained Vehicle Routing and speed optimization Problem (CVRsoP) for electric vehicles.
<br/>
<br/>Author, co-author: Bay, Maud; Limbourg, Sabine
Exact and Heuristic Solution Methods for a VRP with Time Windows and Variable Service Start Time
http://hdl.handle.net/2268/181793
Title: Exact and Heuristic Solution Methods for a VRP with Time Windows and Variable Service Start Time
<br/>
<br/>Author, co-author: Michelini, Stefano; Arda, Yasemin; Crama, Yves; Küçükaydin, Hande
<br/>
<br/>Abstract: We consider a VRP with time windows in which the total cost of a solution depends on the duration of the vehicle routes, and the starting time of each vehicle is a decision variable. We first develop a branch-and-price algorithm in which the pricing subproblem is an elementary shortest path problem with resource constraints (ESPPRC). We discuss research on this exact solution methodology, based on a bidirectional dynamic programming approach for the ESPPRC, and on the design of a matheuristic.
A Simple Method for Designing Shewhart X̄ and S2 Control Charts to Guarantee the In-Control Performance
http://hdl.handle.net/2268/181722
Title: A Simple Method for Designing Shewhart X̄ and S2 Control Charts to Guarantee the In-Control Performance
<br/>
<br/>Author, co-author: Faraz, Alireza; Heuchenne, Cédric; Woodall, Bill
<br/>
<br/>Abstract: The in-control performance of the Shewhart X̄ and S2 control charts with estimated in-control parameters has been evaluated by a number of authors. Results indicate that an unrealistically large amount of Phase I data is needed to reach the desired in-control average run length (ARL) value in Phase II. To overcome this problem, it has been recommended that the control limits be adjusted, based on a bootstrap method, to guarantee that the in-control ARL is at least a specified value with a certain specified probability. In our paper we present simple formulas for the required control limits, so that practitioners do not have to use the bootstrap method. An assumption of normality is required. The advantage of our proposed method is its simplicity: there is no bootstrapping, and the control chart constants do not depend on the Phase I sample data.
TSP model for electric vehicle deliveries, considering speed, loading and path slope.
http://hdl.handle.net/2268/181419
Title: TSP model for electric vehicle deliveries, considering speed, loading and path slope.
<br/>
<br/>Author, co-author: Bay, Maud; Limbourg, Sabine
<br/>
<br/>Abstract: In the current Transport White Paper, the European Union presents a roadmap for a more competitive and sustainable European transport system. Concerning Urban Freight Transport, responsible for about a quarter of the CO2 emissions of the transport sector, one of the goals of the EU is to achieve essentially CO2-free city logistics in major urban centres by 2030, by developing and deploying new and sustainable fuels and propulsion systems. The gradual phasing out of ‘conventionally-fuelled’ vehicles from the urban environment contributes to reducing oil dependence, greenhouse gas emissions and local air and noise pollution. To meet European air quality standards, the authorities of some major European cities have already introduced Low Emission Zones, where access to urban areas is limited to freight vehicles that meet certain emission standards.
Greater use of low-emission urban trucks based on electric, hydrogen and hybrid technologies would reduce not only air emissions but also noise, allowing road infrastructure to be used more efficiently through night deliveries that avoid the morning and afternoon peak periods. In addition to their role in the reduction of polluting emissions, low-emission vehicles also help mitigate the dependence of the transportation sector on high fossil fuel prices. Electric vehicles have the potential to be powered by renewable energy sources such as wind and solar energy.
However, weaknesses can be found in the limited capacity of the battery and the time needed to recharge, and consequently in the limited driving range of electric vehicles. The main determinants of final energy consumption are the vehicle size and engine characteristics, the load factor, the driving pattern, the gradient, which represents the average topology of the country, the speed and the acceleration. To maximize the driving range, a routing model that aims at minimizing the energy consumption has to be developed.
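The consumption drivers listed above (load, speed, slope, recuperation) can be made concrete with a small illustrative arc-energy function. This is only a sketch of a textbook longitudinal-dynamics model, not the paper's actual formulation; every parameter name and default value here is an assumption:

```python
import math

def arc_energy(distance_m, slope_rad, speed_ms, total_mass_kg,
               c_rr=0.01, c_d=0.7, area_m2=4.0, rho=1.2,
               regen_eff=0.6, g=9.81):
    """Energy (J) drawn from the battery on one arc of the tour.

    Sums rolling resistance, the slope (grade) force and aerodynamic
    drag; when the net tractive energy is negative (downhill), only a
    fraction `regen_eff` is recovered through regenerative braking.
    """
    rolling = c_rr * total_mass_kg * g * math.cos(slope_rad)   # N
    grade = total_mass_kg * g * math.sin(slope_rad)            # N
    drag = 0.5 * rho * c_d * area_m2 * speed_ms ** 2           # N
    net = (rolling + grade + drag) * distance_m                # J
    return net if net >= 0 else regen_eff * net

# A loaded vehicle climbing consumes energy; a lighter vehicle on the
# same descent recovers part of it.
uphill = arc_energy(1000, math.radians(3), 13.9, 4500)
downhill = arc_energy(1000, math.radians(-3), 13.9, 3500)
```

In a routing model, summing `arc_energy` over the arcs of a candidate tour (with the load decreasing after each delivery) gives the energy objective the abstract describes.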
Our paper focuses on the Electric Vehicle Traveling Salesman Problem (EV-TSP): given n cities, find the shortest tour, i.e., the shortest directed cycle containing all cities. The classical objective is to minimize the cost of the tour that fulfils the delivery requests at each location. In this paper we consider the energy cost, and we present an extension of the classical problem that minimizes the remaining storage capacity of the electric vehicles at the destination node, knowing that there is no recharge operation on the tour. The objective function accounts not just for the travel distance but also for the load of the vehicle and for its speed, while the energy consumption of the engine additionally depends on the path to travel, on the slope of the roads and on the vehicle specifications. Moreover, the negative consumption that may occur due to regenerative braking and kinetic energy capture on downhill paths is taken into consideration. A mathematical model is described and computational experiments are performed.
Quadratizations of pseudo-Boolean functions
http://hdl.handle.net/2268/181402
Title: Quadratizations of pseudo-Boolean functions
<br/>
<br/>Author, co-author: Rodriguez Heck, Elisabeth; Crama, Yves
What the heck is Revenue Management
http://hdl.handle.net/2268/181277
Title: What the heck is Revenue Management
<br/>
<br/>Author, co-author: Lurkin, Virginie; Schyns, Michael; Garrow, Laurie
<br/>
<br/>Abstract: The airline industry changed dramatically in 1978 when it was deregulated. Operations research analysts played a critical role after deregulation by developing algorithms and decision-support systems designed to help airlines maximize their revenue. More than thirty-five years after deregulation, the airline industry faces new challenges. The increased use of the Internet as the major distribution channel and the increased market penetration of low-cost carriers have led to a growing interest in using discrete choice models to model air travel demand as the collection of individuals' decisions.
Early detection of university students in potential difficulty
http://hdl.handle.net/2268/181053
Title: Early detection of university students in potential difficulty
<br/>
<br/>Author, co-author: Hoffait, Anne-Sophie; Schyns, Michael
<br/>
<br/>Abstract: The rate of success in the first year at university in Belgium is very low compared with foreign universities. The University of Liège, like other universities, has already taken various initiatives.
By identifying early the students who have a high probability of facing difficulties if nothing is done, universities could develop adapted measures and attack the problem with more emphasis where it is most needed, and while action is still possible.
We therefore want to develop a decision tool able to identify these students in order to help them. To that end, we consider three standard data mining methods: logistic regression, artificial neural networks and decision trees, and we focus on early detection, i.e. before the students start at the university.
We then adapt these three methods, as well as the classification framework, in order to increase the probability of correctly identifying the students. In our approach, we do not restrict the classification to two extreme classes, e.g. failure or success, but create subcategories for different levels of confidence: high risk of failure, risk of failure, expected success, or high probability of success. The algorithms are modified accordingly, giving more weight to the classes that really matter.
Note that this approach remains valid for any other classification problem in which the focus is on some extreme classes, e.g. fraud detection or credit default.
We check whether the success/failure factors we identify are similar to those reported in the literature.
We also perform a ``what-if'' sensitivity analysis. The goal is to measure in more depth the impact of some factors and of some remedies, e.g. complementary training or a reorientation.
Functions of binary variables
http://hdl.handle.net/2268/181035
Title: Functions of binary variables
<br/>
<br/>Author, co-author: Crama, Yves
What are the determinants of the operational losses severity distribution? A multivariate analysis based on a semiparametric approach.
http://hdl.handle.net/2268/180102
Title: What are the determinants of the operational losses severity distribution? A multivariate analysis based on a semiparametric approach.
<br/>
<br/>Author, co-author: Hambuckers, Julien; Heuchenne, Cédric; Lopez, Olivier
<br/>
<br/>Abstract: In this paper, we analyse a database of around 41,000 operational losses from the European bank UniCredit. We investigate three kinds of covariates, namely firm-specific, financial and macroeconomic covariates, and we study their relationship with the shape parameter of the severity distribution. To do so, we introduce a semiparametric approach to estimate the shape parameter of the severity distribution, conditional on large sets of covariates. Relying on a single-index assumption to perform a dimension reduction, this approach avoids the curse of dimensionality of purely multivariate nonparametric techniques as well as overly restrictive parametric assumptions. We show that taking into account variables measuring the economic well-being of the bank can cause the required operational Value-at-Risk to vary drastically. In particular, a high pre-tax ROE, efficiency ratio and stock price are associated with a low shape parameter of the severity distribution, whereas a high market volatility, leverage ratio and unemployment rate are associated with higher tail risks. Finally, we discuss how the proposed approach could be an interesting tool to improve the estimation of the parameters in a Loss Distribution Approach and to study variations in capital requirements through scenario analyses.
Nonparametric and bootstrap techniques applied to financial risk modeling
http://hdl.handle.net/2268/180100
Title: Nonparametric and bootstrap techniques applied to financial risk modeling
<br/>
<br/>Author, co-author: Hambuckers, Julien
<br/>
<br/>Abstract: For the purpose of quantifying financial risks, risk managers need to model the behavior of financial variables. However, the construction of such mathematical models is a difficult task that requires careful statistical approaches. Among the important choices that must be addressed, we can list the error distribution, the structure of the variance process, and the relationship between the parameters of interest and the explanatory variables. In particular, one may want to avoid procedures that rely either on overly rigid parametric assumptions or on inefficient estimation procedures.
In this thesis, we develop statistical procedures that tackle some of these issues, in the context of three financial risk modelling applications. In the first application, we are interested in selecting the error distribution in a multiplicative heteroscedastic model without relying on a parametric volatility assumption. To this end, we develop a set of model estimation and selection tests relying on nonparametric volatility estimators and focusing on the tails of the distribution. We illustrate this technique on UBS, BOVESPA and EUR/USD daily return series.
In the second application, we are concerned with modeling the tail of the operational losses severity distribution, conditional on several covariates. We develop a flexible conditional GPD model, where the shape parameter is an unspecified link function (nonparametric part) of a linear combination of covariates (single-index part), avoiding the curse of dimensionality. We successfully apply this technique to two original databases, using macroeconomic and firm-specific variables as covariates.
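As background to the GPD tail modeling described above, here is a minimal sketch of fitting an unconditional Generalized Pareto distribution to threshold exceedances by the classical method of moments. The thesis's semiparametric single-index estimator is considerably more involved; this only illustrates the distribution being fitted:

```python
import random
from statistics import mean, pvariance

def gpd_moment_fit(exceedances):
    """Method-of-moments estimates (shape xi, scale sigma) for a
    Generalized Pareto sample, valid when the true shape is below 1/2
    (finite variance): xi = (1 - m^2/v)/2, sigma = m*(m^2/v + 1)/2.
    """
    m = mean(exceedances)
    v = pvariance(exceedances)
    ratio = m * m / v
    xi = 0.5 * (1.0 - ratio)
    sigma = 0.5 * m * (ratio + 1.0)
    return xi, sigma

# Exponential exceedances correspond to a GPD with true shape xi = 0
# and scale 1, so the estimates should land near (0, 1).
random.seed(42)
sample = [random.expovariate(1.0) for _ in range(20000)]
xi_hat, sigma_hat = gpd_moment_fit(sample)
```

A heavier tail (larger `xi_hat`) means a riskier severity distribution, which is exactly the quantity the thesis models as a function of covariates.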
In the last application, we provide an efficient way to estimate the predictive ability of trading algorithms. Instead of relying on subjective and noisy sample-splitting techniques, we propose an adaptation of the .632 bootstrap technique to the time-series context. We apply these techniques to stock prices to compare 12,000 parametrizations of trading rules and show that none can beat a simple buy-and-hold strategy.
A semiparametric model for Generalized Pareto regression based on a dimension reduction assumption
http://hdl.handle.net/2268/180099
Title: A semiparametric model for Generalized Pareto regression based on a dimension reduction assumption
<br/>
<br/>Author, co-author: Hambuckers, Julien; Heuchenne, Cédric; Lopez, Olivier
<br/>
<br/>Abstract: In this paper, we consider a regression model in which the tail of the conditional distribution of the response can be approximated by a Generalized Pareto distribution. Our model is based on a semiparametric single-index assumption on the conditional tail index, while no further assumption is made on the conditional scale parameter. The underlying dimension reduction assumption makes the procedure of prime interest when the dimension of the covariates is high, a case in which purely nonparametric techniques fail while purely parametric ones are too rough to fit the data correctly. We derive the asymptotic normality of the estimators that we define, and propose an iterative algorithm for their practical implementation. Our results are supported by simulations and by a practical application to a public database of operational losses.
A cyclical square-root model for the term structure of interest rates
http://hdl.handle.net/2268/179626
Title: A cyclical square-root model for the term structure of interest rates
<br/>
<br/>Author, co-author: Platania, Federico; Moreno, Manuel
<br/>
<br/>Abstract: This paper presents a cyclical square-root model for the term structure of interest rates, assuming that the spot rate converges to a certain time-dependent long-term level. The model incorporates the fact that the interest rate volatility depends on the interest rate level, and specifies the mean reversion level and the interest rate volatility using harmonic oscillators. In this way, we incorporate a good deal of flexibility and provide high analytical tractability. Under these assumptions, we compute closed-form expressions for the values of different fixed income and interest rate derivatives. Finally, we analyse the empirical performance of the cyclical model versus the one proposed in Cox et al. (1985) and show that it outperforms this benchmark, providing a better fit to market data.
Real options valuation under uncertainty
http://hdl.handle.net/2268/179601
Title: Real options valuation under uncertainty
<br/>
<br/>Author, co-author: Platania, Federico; Lambert, Marie; Moreno, Manuel
<br/>
<br/>Abstract: In this paper we develop a novel valuation model and methodology to value a pharmaceutical R&D project based on the real options approach. The real options approach makes it possible to optimally abandon the project before completion whenever the investment cost turns out to be larger than the expected net cash flow stream. In addition, the proposed model accounts for two different sources of uncertainty, namely technical and economic risk. The model incorporates a novel economic state vector in which each economic state captures the interaction among different market and economic forces, using Fourier series as the particular basis for the economic function space. In this sense, the Fourier series are considered as an aggregate of the forces that play a relevant role in the evolution of the process, determining the cash flow structure and allowing us to properly define the economic scenario in which the project will be developed.
Sequential testing of k-out-of-n systems with imperfect tests
http://hdl.handle.net/2268/179226
Title: Sequential testing of k-out-of-n systems with imperfect tests
<br/>
<br/>Author, co-author: Wenchao, Wei; Coolen, Kris; Talla Nobibon, Fabrice; Leus, Roel
<br/>
<br/>Abstract: A k-out-of-n system configuration requires that, for the overall system to be functional, at least k out of the total of n components be working. We consider the problem of sequentially testing the components of a k-out-of-n system in order to learn the state of the system, when the tests are costly and the individual component tests are imperfect, meaning that a test can identify a component as working when in reality it is down, and vice versa. Each component is tested at most once. Since tests are imperfect, even when all components are tested, the state of the system is not necessarily known with certainty, and so reaching a lower bound on the probability of correctness of the system state is used as a stopping criterion for the inspection.
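The two probabilistic ingredients above (updating a component's state after an imperfect test, and the probability that at least k of n components work) can be sketched in a few lines. This is an illustration only, not the paper's testing policies; the sensitivity/specificity values and priors are assumed for the example:

```python
def posterior_working(prior, outcome, sens=0.95, spec=0.90):
    """P(component works | test outcome) via Bayes' rule, for an
    imperfect test with P(pass | working) = sens and
    P(fail | down) = spec."""
    if outcome:  # test reports "working"
        num = sens * prior
        den = num + (1 - spec) * (1 - prior)
    else:        # test reports "down"
        num = (1 - sens) * prior
        den = num + spec * (1 - prior)
    return num / den

def prob_k_out_of_n(probs, k):
    """P(at least k of the independent components work): the tail of a
    Poisson-binomial distribution, computed by dynamic programming."""
    dist = [1.0]  # dist[j] = P(exactly j working among those processed)
    for p in probs:
        new = [0.0] * (len(dist) + 1)
        for j, q in enumerate(dist):
            new[j] += q * (1 - p)
            new[j + 1] += q * p
        dist = new
    return sum(dist[k:])

# A 2-out-of-3 system; two imperfect test outcomes have been observed.
priors = [0.8, 0.8, 0.8]
priors[0] = posterior_working(priors[0], True)   # first test passed
priors[1] = posterior_working(priors[1], False)  # second test failed
p_system_up = prob_k_out_of_n(priors, 2)
```

Once `p_system_up` (or its complement) exceeds the required correctness bound, inspection can stop, which is the stopping criterion the abstract refers to.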
We define different classes of inspection policies and we examine the global optimality of each class. We find that a globally optimal policy for diagnosing k-out-of-n systems with imperfect tests can be found in polynomial time when the predictive error probabilities are the same for all the components. Of the three policy classes studied, the dominant policies always contain a global optimum, while elementary policies are compact in representation. The newly introduced class of so-called `interrupted block-walking' policies combines these merits of global optimality and compactness.
Quand les maths nous transportent...
http://hdl.handle.net/2268/178957
Title: Quand les maths nous transportent...
<br/>
<br/>Author, co-author: Crama, Yves
<br/>
<br/>Abstract: Three centuries ago, it was said in the Prussian city of Königsberg that a walker could not cross in succession the seven bridges linking the different parts of the city without crossing the same bridge twice. In 1736, Leonhard Euler gave an elegant proof of this claim, thereby producing one of the first applications of mathematics to the construction of optimized itineraries.
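Euler's observation can be stated as a simple criterion: a connected multigraph admits a closed walk using every edge exactly once only if every vertex has even degree. A small sketch, using the usual textbook encoding of the four Königsberg land masses A, B, C, D and the seven bridges:

```python
from collections import Counter

def has_eulerian_circuit(edges):
    """Euler (1736): a connected multigraph has a closed walk using
    every edge exactly once iff every vertex has even degree.
    (Connectivity is assumed, not checked.)"""
    degree = Counter()
    for u, v in edges:
        degree[u] += 1
        degree[v] += 1
    return all(d % 2 == 0 for d in degree.values())

# The seven bridges of Königsberg: degrees are 5, 3, 3, 3 (all odd),
# so no such walk exists, as Euler proved.
koenigsberg = [("A", "B"), ("A", "B"), ("A", "C"),
               ("A", "C"), ("A", "D"), ("B", "D"), ("C", "D")]
```

By contrast, a simple cycle such as a triangle satisfies the criterion, since every vertex then has degree 2.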
Today, mathematical modelling has become an indispensable tool for decision-makers in the transport world and, more generally, for the management and planning of logistics activities. Whether it is computing the fastest route from Liège to Marbella, optimizing the delivery rounds of a large retail chain, loading ships or aircraft while ensuring their stability, reducing the excess inventory of a pharmaceutical company, or pricing high-speed train tickets according to the available seats, in each case mathematics, combined with computer science, makes it possible to formulate the problem precisely and to provide relevant answers.
This talk will present some typical applications of mathematics in logistics and will give an overview of the methods involved.
Quand les maths nous transportent...
http://hdl.handle.net/2268/178956
Title: Quand les maths nous transportent...
<br/>
<br/>Author, co-author: Crama, Yves
<br/>
<br/>Abstract: Three centuries ago, it was said in the Prussian city of Königsberg that a walker could not cross in succession the seven bridges linking the different parts of the city without crossing the same bridge twice. In 1736, Leonhard Euler gave an elegant proof of this claim, thereby producing one of the first applications of mathematics to the construction of optimized itineraries.
Today, mathematical modelling has become an indispensable tool for decision-makers in the transport world and, more generally, for the management and planning of logistics activities. Whether it is computing the fastest route from Liège to Marbella, optimizing the delivery rounds of a large retail chain, loading ships or aircraft while ensuring their stability, reducing the excess inventory of a pharmaceutical company, or pricing high-speed train tickets according to the available seats, in each case mathematics, combined with computer science, makes it possible to formulate the problem precisely and to provide relevant answers.
This talk will present some typical applications of mathematics in logistics and will give an overview of the methods involved.
<br/>
<br/>Commentary: Talk intended for students in the upper secondary cycle.
What is the impact of ticket-level fare information on classic itinerary choice models?
http://hdl.handle.net/2268/178941
Title: What is the impact of ticket-level fare information on classic itinerary choice models?
<br/>
<br/>Author, co-author: Lurkin, Virginie; Garrow, Laurie; Schyns, Michael
<br/>
<br/>Abstract: I have been invited by Prof. Dr. Catherine Cleophas as an external guest to the "Revenue Management Colloquium" in Aix-la-Chapelle, from March 25 to March 26. The aim is to present my own doctoral project and to discuss the presentations of the other PhD students.
Early detection of university students in potential difficulty
http://hdl.handle.net/2268/178867
Title: Early detection of university students in potential difficulty
<br/>
<br/>Author, co-author: Hoffait, Anne-Sophie; Schyns, Michael
<br/>
<br/>Abstract: The rate of success in the first year at university in Belgium is very low compared with foreign universities. The University of Liège, like other universities, has already taken various initiatives.
By identifying early the students who have a high probability of facing difficulties if nothing is done, universities could develop adapted measures and attack the problem with more emphasis where it is most needed, and while action is still possible.
We therefore want to develop a decision tool able to identify these students in order to help them. To that end, we consider three standard data mining methods: logistic regression, artificial neural networks and decision trees, and we focus on early detection, i.e. before the students start at the university.
We then adapt these three methods, as well as the classification framework, in order to increase the probability of correctly identifying the students. In our approach, we do not restrict the classification to two extreme classes, e.g. failure or success, but create subcategories for different levels of confidence: high risk of failure, risk of failure, expected success, or high probability of success. The algorithms are modified accordingly, giving more weight to the classes that really matter.
Note that this approach remains valid for any other classification problem in which the focus is on some extreme classes, e.g. fraud detection or credit default.
Finally, simulations are conducted to measure the performance of the three methods, with and without the suggested adaptations. We check whether the success/failure factors we identify are similar to those reported in the literature.
We also perform a ``what-if'' sensitivity analysis. The goal is to measure in more depth the impact of some factors and of some remedies, e.g. complementary training or a reorientation.
Economic statistical design of nonparametric control charts
http://hdl.handle.net/2268/178825
Title: Economic statistical design of nonparametric control charts
<br/>
<br/>Author, co-author: Marcos Alvarez, Alejandro
<br/>
<br/>Abstract: In this work, we apply the economic statistical design framework to nonparametric control charts. To this end, we develop bounds for the type II error probability, i.e. the false negative rate, of the nonparametric charts, which are then used within the model. We implement the optimization problem defining economic statistical design and use it to find the design parameters of nonparametric control charts. We then compare the behavior of this design with that of parametric and nonparametric charts on different probability distributions and for different values of the distribution parameters. Finally, we briefly analyse the results obtained, emphasizing the differences between the economic statistical design of parametric and nonparametric control charts. We also list a number of advantages and shortcomings of both approaches, so that the interested reader can decide which control chart is best suited to a given application.
A Markov chain model of power indices in corporate structures
http://hdl.handle.net/2268/178393
Title: A Markov chain model of power indices in corporate structures
<br/>
<br/>Author, co-author: Crama, Yves; Leruth, Luc; Wang, Su
<br/>
<br/>Abstract: This paper proposes to use a game-theoretic framework to analyze complex corporate networks, notably to measure the ``amount of control'' of both direct and indirect shareholders. The values of the indices are defined by complex voting games composed of interlocked weighted majority games. The paper proposes a characterization of the corporate networks in which the notion of ``control'' is well defined, as well as an algorithm that consistently estimates the power indices when this is the case.
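The power indices mentioned in the abstract can be illustrated on a single weighted majority game. The sketch below computes the normalized Banzhaf index by exhaustive coalition enumeration; the shareholding weights and quota are made-up example values, and the paper's consistent estimation algorithm for interlocked games is of course more elaborate:

```python
from itertools import combinations

def banzhaf(weights, quota):
    """Normalized Banzhaf power index of a weighted majority game:
    each player's share of the swings, i.e. winning coalitions that
    would lose without that player."""
    n = len(weights)
    swings = [0] * n
    for r in range(n + 1):
        for coalition in combinations(range(n), r):
            total = sum(weights[i] for i in coalition)
            if total >= quota:  # winning coalition
                for i in coalition:
                    if total - weights[i] < quota:  # i is critical
                        swings[i] += 1
    s = sum(swings)
    return [c / s for c in swings]

# A simple-majority game: one large shareholder and three small ones.
index = banzhaf([40, 20, 20, 20], quota=51)
```

Even in this tiny example the 40% shareholder commands half of the voting power, more than its weight share, which is the kind of gap between ownership and ``amount of control'' that the paper studies in full corporate networks.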