Research Seminar 2019-2020
Titles, abstracts and documents:
by Virginie Lurkin, Eindhoven University
N1 – 220 – Etilux
by Freek Van Gils, Tilburg University
B33 – Trifac 1
by Oscar Tellez, INSA Lyon
N1 – 220 – Etilux
by Marie Kratz, ESSEC Business School & CREAR Risk Center
N1 – 025
Abstract : It is widely accepted that risk measurements are pro-cyclical: in times of crisis they overestimate future risk, while in quiet times they underestimate it. We lay out a methodology to evaluate the amount of pro-cyclicality in the way financial institutions measure risk, and identify two main factors explaining this pro-cyclical behavior: the clustering and return-to-the-mean of volatility, which could have been anticipated but had not yet been quantified, and, more surprisingly, the very way risk is measured, independently of this return-to-the-mean effect. We provide CLTs and FCLTs for functionals of quantile and dispersion estimators to support these empirical findings theoretically.
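The empirical side of the abstract can be illustrated with a minimal sketch (a toy simulation with invented parameters, not the authors' methodology): historical Value-at-Risk computed as an empirical quantile of a recent return window tracks the current level of volatility rather than future risk, which is the pro-cyclical pattern described above.

```python
# Toy illustration: historical VaR as an empirical quantile of past
# returns, the kind of quantile estimator the abstract's CLTs concern.
import random

random.seed(0)
# Simulated "quiet" and "crisis" windows (volatilities are invented).
quiet = [random.gauss(0.0, 0.01) for _ in range(250)]
crisis = [random.gauss(0.0, 0.03) for _ in range(250)]

def hist_var(returns, alpha=0.99):
    """Value-at-Risk at level alpha as the empirical quantile of losses."""
    losses = sorted(-r for r in returns)
    k = int(alpha * len(losses))
    return losses[min(k, len(losses) - 1)]

# The VaR estimated from the recent window inflates in turbulent times
# and shrinks in quiet times: pro-cyclicality of the measurement itself.
procyclical = hist_var(quiet) < hist_var(crisis)
print(procyclical)
```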
by Marc Bourreau, Telecom ParisTech
B31 – Sem 11
Abstract : In this paper, we consider a market where a platform has private information about which product markets are profitable. We study under which conditions the platform benefits from selling its market information exclusively to one participating seller rather than to any interested seller. We show that exclusive dealing is the preferred data monetization strategy if the relative value of operating as a monopoly seller in a profitable market is sufficiently high. In this case, we show that the platform also prefers selling its data exclusively to a seller rather than vertically integrating into the product market.
by Aurelie Slechten, Lancaster University
B31 – 2/92
by Arnaud Dufays, UNamur
N1 – 138
Abstract : Change-point (CP) VAR models face a dimensionality curse due to the proliferation of parameters that arises when new breaks are detected. To handle large data sets, we introduce the Sparse CP-VAR model, which determines which parameters truly vary when a break is detected. By doing so, the number of new parameters to estimate at each regime is drastically reduced and the CP dynamics become easier to interpret. The Sparse CP-VAR model disentangles the dynamics of the mean parameters and the covariance matrix. The latter is driven by an infinite hidden Markov framework while the former uses CP dynamics with shrinkage prior distributions. A simulation study highlights that the framework correctly detects the number of breaks per model parameter, and that it takes advantage of common breaks in the cross-sectional dimension to estimate them more precisely. Our applications to financial and macroeconomic systems highlight that the Sparse CP-VAR model helps in interpreting the detected breaks. It turns out that many spillover effects have only zero regimes, meaning that they are zero for the entire sample period. Forecasting-wise, the Sparse CP-VAR model is competitive against several recent time-varying parameter and CP-VAR models in terms of log predictive densities.
by Giovanni Felici, Istituto di Analisi dei Sistemi ed Informatica, Consiglio Nazionale delle Ricerche, Roma
N3 – 033
Abstract : Feature selection is receiving increasing attention in Machine Learning and Statistics. In the context of linear regression, feature selection is often formulated as a regularization problem, where the regressors are selected with the help of a term associated with the size of the regression coefficients. Such an approach has led to the well-established Ridge and Lasso methods. More recently, approaches based on Mixed Integer Programming (MIP) have been introduced to directly control the size of the active set. Although computationally demanding, such approaches exhibit interesting properties and are gaining popularity due to the increasing power of solvers. In this talk I will introduce the basic concepts of regularization in regression and a recent MIP-based method with reduced computational burden and improved performance in the presence of feature collinearity and signals that vary in nature and strength.
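The "direct control of the active set" that MIP approaches offer can be illustrated with a toy sketch (invented data, not the speaker's method): best-subset selection done here by brute-force enumeration, which is exactly the combinatorial problem a MIP formulation solves at scale via binary variables.

```python
# Toy best-subset selection: pick the cardinality-2 support with the
# smallest residual sum of squares. A MIP would encode the same choice
# with binary indicator variables instead of enumeration.
import itertools

# Synthetic data: y = 2*x0 + 1*x2 exactly; x1 is a collinear distractor.
X = [[1.0, 0.9, 0.0],
     [2.0, 2.1, 1.0],
     [3.0, 2.8, 0.0],
     [4.0, 4.2, 1.0],
     [5.0, 5.1, 0.0]]
y = [2.0, 5.0, 6.0, 9.0, 10.0]

def lstsq(cols):
    """Ordinary least squares on the selected columns (normal equations)."""
    A = [[X[i][j] for j in cols] for i in range(len(X))]
    k = len(cols)
    AtA = [[sum(A[i][p] * A[i][q] for i in range(len(A))) for q in range(k)]
           for p in range(k)]
    Aty = [sum(A[i][p] * y[i] for i in range(len(A))) for p in range(k)]
    # Gauss-Jordan elimination (fine for this tiny SPD system).
    for p in range(k):
        piv = AtA[p][p]
        for q in range(k):
            AtA[p][q] /= piv
        Aty[p] /= piv
        for r in range(k):
            if r != p:
                f = AtA[r][p]
                for q in range(k):
                    AtA[r][q] -= f * AtA[p][q]
                Aty[r] -= f * Aty[p]
    return Aty

def rss(cols):
    beta = lstsq(cols)
    return sum((y[i] - sum(b * X[i][j] for b, j in zip(beta, cols))) ** 2
               for i in range(len(X)))

# Enforce |active set| <= 2 directly, as a MIP cardinality constraint would.
best = min(itertools.combinations(range(3), 2), key=rss)
print(best)  # the true support (0, 2) is recovered despite collinearity
```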
by Alexandre de Cornière, Toulouse School of Economics
B31 – Sem 11
Abstract : The question of data has been at the center of recent debates around competition policy in the digital era. Concerns in this area are wide-ranging, and encompass privacy, collusion, barriers to entry, exploitative practices, and data-driven mergers.
Data can serve several purposes: for instance it can be used to improve algorithms, to target advertising, or to offer personalized discounts to consumers. While this heterogeneity of uses for data has sparked a large literature in economics, the multiplicity of models makes it difficult to draw general conclusions about the competitive effects of data.
In this paper we introduce data into a competition-in-utility framework à la Armstrong and Vickers (2001). The three key features of data are that (i) it allows firms to generate more revenue for a given level of utility, (ii) it is a byproduct of firms’ economic activity, and (iii) it is a club good (non-rival and excludable).
We provide a sufficient condition for data to be pro-competitive, and apply it to several environments illustrating the variety of uses for data. This analysis sheds light on the tension between the static and dynamic effects of data: it is precisely when data increases short-term equilibrium consumer surplus that it can be used as a barrier to entry or result in market tipping.
We then use this framework to study data-driven mergers. We consider two data-connected markets A and B: the data obtained as a byproduct of firm A’s activity can be used by firms in market B. While the concerns expressed so far by antitrust authorities revolve around the idea of input foreclosure, we show that a merger between firms in the A and B markets also affects the incentives to collect data. A critical condition for the merger to be desirable is that data trade be impossible absent the merger.
by Elias Carroni, University of Bologna
B31 – Sem 7
Abstract : This article studies incentives for a premium provider (Superstar) to offer exclusive contracts to competing platforms mediating the interactions between consumers and firms. When platform competition is intense, more consumers subscribe to the platform hosting the Superstar exclusively. This mechanism is self-reinforcing, as firms follow consumer decisions and (some) exclusively join the platform with the Superstar. Exclusivity always benefits firms and may benefit consumers. Moreover, when the Superstar is integrated with a platform, non-exclusivity becomes more likely than if the Superstar were independent. This analysis provides several implications for managers and policy makers operating in digital and traditional markets.
by Alper Sen, Bilkent University
N1 – 1715
Abstract : Shortages are highly costly in retail, but are less of a concern for store managers, as their exact amounts are usually not recorded. In order to align incentives and attain desired service levels, retailers need to design mechanisms in the absence of information on shortage quantities.
We consider the incentive design problem of a retailer that delegates stocking decisions to its store managers who are privately informed about local demand. The headquarters knows that the underlying demand process at a store is one of J possible Wiener processes, whereas the store manager knows the specific process. The store manager creates a single order before each period. The headquarters uses an incentive scheme that is based on the end-of-period leftover inventory and on a stock-out occasion at a prespecified inspection time before the end of a period. The problem for the headquarters is to determine the inspection time and the significance of a stock-out relative to leftover inventory in evaluating the performance of the store manager. We formulate the problem as a constrained nonlinear optimization problem in the single period setting and a dynamic program in the multiperiod setting. We show that the proposed “early inspection” scheme leads to perfect alignment when J=2 under mild conditions. In more general cases, we show that the scheme performs strictly better than inspecting stock-outs at the end and achieves near-perfect alignment. Our numerical experiments, using both synthetic and real data, reveal that this scheme clearly outperforms centralized ordering systems that are common practice and can lead to considerable cost reductions.
by Vikas Agarwal, Georgia State University
N1 – 138
Abstract : We investigate hedge funds’ unobserved performance (UP), measured as the risk-adjusted return difference between a fund firm’s reported return and the hypothetical portfolio return derived from its disclosed long equity holdings. We find that high UP is (i) positively associated with measures of managerial incentives, discretion, and skill, and (ii) driven by a fund firm’s frequent trading in equity positions, derivatives usage, short selling, and confidential holdings. Fund firms with high UP outperform fund firms with low UP by more than 6% p.a. after accounting for typical hedge fund risk factors and fund characteristics.
by Wynne Lam, University of East Anglia
B31 – Sem 11
Abstract : Data protection legislation such as the European GDPR uses publicity, fines and a new consumer opt-in requirement to incentivise firms to protect personal data. We argue that the significance of the opt-in requirement is best understood by taking account of consumer loss aversion. Digital firms attract consumers by offering service enhancements and data security which require separate types of investment. We set out the conditions under which opt-in increases both types of investment and when security comes at the expense of service quality. We also study the welfare consequences of regulation for different types of consumer. We show that data protection legislation which includes the opt-in requirement may dilute the effect of fines but also be less harmful to investment in service enhancement.
by Marie Baratto, HEC Liège – (2:00 pm): Food Waste by Anaïs Lemaire, HEC Liège
N1 – 220
Abstract : Introduction to the research topics “Kidney exchange problem” (Marie Baratto) and “Food Waste” (Anaïs Lemaire) in the context of the advanced course “Advanced Topics in Supply Chain Management” given by Yasemin Arda.
by Rudolf Giffinger, Technische Universität Wien
N1 – 1701
Abstract : The Smart City discussion has intensified in recent years. Along with changing urban challenges, the discussion concentrates on different topics and, most of all, on different implementation methodologies, and shows a shift in its objectives. Generally speaking, today we can increasingly distinguish between the understanding of Smart City 1.0, SC 2.0 and, recently, SC 3.0.
In this contribution, I look at these changes and discuss some examples with their respective advantages and disadvantages. Finally, I argue that the concept of so-called urban living labs supports the idea of ‘open innovation’ and therefore provides new possibilities and obstacles for the inclusion of local actors with local knowledge.
by Catherine Casamatta, TSE (Toulouse School of Economics)
N1 – 1701
Abstract : We offer an overlapping generations equilibrium model of cryptocurrency pricing and confront it with new data on bitcoin transactional benefits and costs. The model emphasizes that the fundamental value of the cryptocurrency is the stream of net transactional benefits it will provide, which depend on its future prices. The link between future and present prices implies that returns can exhibit large volatility unrelated to fundamentals. We construct an index measuring the ease with which bitcoins can be used to purchase goods and services, and we also measure costs incurred by bitcoin owners. Consistent with the model, estimated transactional net benefits explain a statistically significant fraction of bitcoin returns.
by Anne Guisset, Université Saint-Louis, Bruxelles
B33 – 2/8
Abstract : In Belgium, the representation of socio-economic actors and their participation in political decision-making are organized through social concertation. Historically, the representative organizations of workers and of employers come together within specific bodies in order to reach consensual agreements that may then take effect in the political decision-making process. The question is therefore how actors carrying the interests of the social economy can be represented and take part in this system. The doctoral research conducted on this subject brings to light the intentions of social-economy actors with regard to social concertation, and sheds light on the marginal positions they currently occupy in the relevant Belgian, Brussels, Flemish and Walloon bodies.
by Diego Cattaruzza, Centrale Lille
N1 – 220
Abstract : E-commerce is a thriving market around the world and suits very well the busy lifestyle of today’s customers. In 2016, US consumers purchased more online than in stores. Meanwhile, this growth poses a huge challenge for transportation companies, especially in last-mile delivery, whose hefty share of total parcel delivery cost often reaches or even exceeds 50%, making it a top concern for many companies. In order to gain a competitive advantage, companies are exploring creative and innovative solutions to improve the efficiency of last-mile delivery. Nowadays, the most common delivery service is home delivery: customers wait at home to receive their orders. Besides home delivery, companies like Amazon, FedEx, etc., have developed locker delivery. When customers shop online, they can choose a nearby locker as a pickup location. In the last two years, a new concept called trunk delivery has emerged, in which goods are delivered to the trunk of a car. Volvo launched its in-car delivery service in Sweden in 2016, allowing Volvo owners to choose their cars as delivery locations. What interests us is to combine all these delivery services and study an efficient last-mile delivery system. The problem that arises in this context is a vehicle routing problem with synchronization characteristics, since delivering an order to a customer’s car requires the simultaneous presence of the shipper and the car at the same location. In this seminar we will formally introduce the problem and its mathematical formulation. We then present the algorithms that we developed to tackle the problem and the results obtained on benchmark instances.
by Thomas Kneib, Georg-August-Universität Göttingen
N1 – 1701
Abstract : We propose a generic framework for Bayesian inference in distributional regression models, in which each parameter of a potentially complex response distribution, and not only the mean, is related to a structured additive predictor. The latter is composed additively of a variety of different functional effect types such as nonlinear effects, spatial effects, random coefficients, interaction surfaces or other (possibly nonstandard) basis function representations. To enforce specific properties of the functional effects such as smoothness, informative multivariate Gaussian priors are assigned to the basis function coefficients. Inference can then be based on computationally efficient Markov chain Monte Carlo simulation techniques, where a generic procedure makes use of distribution-specific iteratively weighted least squares approximations to the full conditionals. We will discuss practical aspects of distributional regression across different applications, concerning for example the analysis of income inequality in Germany.
by Humberto Brea Solis, Skema Business School
B31 – Sem 1
Abstract : This study investigates the export behavior of SMEs in an emerging country, Russia. Instead of isolating individual conditions that lead companies to export, we focus on the conjunctural causation of exporting and the presence of equifinality and asymmetry in the decision to export. This last causal pattern is explored by contrasting the sufficient and necessary conditions for non-exporting SMEs with those for SMEs that export. To perform our analysis, we selected independent variables frequently used in the export literature and studied the export behavior of private Russian manufacturing SMEs in 2013. The outcome of our empirical analysis shows the importance of considering causal complexity when assessing exporting behavior. In particular, we were able to identify sufficient and necessary conditions for non-exporting behavior, but only necessary conditions related to exporting.
by Ludovic Phalippou, University of Oxford (Saïd Business School)
N1 – 1701
Abstract : We introduce a new statistical methodology to form portfolios of private market funds with similar risk-return profiles, which we interpret as private factors. We find that two private factors are priced. The first priced factor, dominated by European Buyout funds, is well spanned by commonly used public equity factors. The second priced factor, dominated by value-add real estate and mezzanine funds, is unspanned by public equity factors. A non-priced factor, dominated by natural resources funds, provides significant diversification benefits and an inflation hedge despite being well spanned by public equity factors.
by Vincent Jacquet, UC Louvain
B33 – 2/8
Abstract : A growing number of theorists and practitioners advocate the development of a participatory and deliberative conception of democracy to remedy the contemporary malaise of representative government. This trend is marked by the development of mechanisms in which citizens, recruited through random selection, take part in an experience of public deliberation. The most standardized examples are citizens’ panels, consensus conferences and deliberative polls, but also criminal juries, which have a longer history. The seminar proposes to explore these practices by addressing two questions: (1) Are citizens interested in such a mode of participation? (2) What are the external consequences of these mechanisms?
Registration with Elodie Dessy: firstname.lastname@example.org
by Sabine Garroy, Université de Liège
B33 – 2/8
Abstract : In Europe, a social enterprise engages in an economic activity while pursuing a social purpose and implementing inclusive governance. Following the discourse of international bodies, with a view to favouring their development and sustainability, social enterprises should not be taxed like commercial enterprises, insofar as such a tax burden could, in the long run, threaten their viability (hence the need for an “appropriate and coherent tax framework”). Through a retrospective, descriptive and prospective study, we conclude that, as far as income taxes are concerned, this appropriate and coherent tax framework does not exist in Belgium (part one), and that establishing such a framework would require a paradigm shift: it is no longer the realization of profits that should determine the tax regime, but the use made of those profits (part two). This seminar will notably address the quasi-crossed evolutions of legal-entity law and tax law: while over time the income tax regime of social enterprises has moved towards entrenching the activity criterion (for the same activity carried out, the same tax should in principle apply), under the effect of the Code des sociétés et des associations, legal-entity law will see an entrenchment of the purpose criterion (associations, foundations and companies will be distinguished solely by their purpose).
by Dan Kärreman, Copenhagen Business School
N1 – 1701
Abstract : Over its history, organization studies has too rarely interrogated a fundamental, yet deceptively simple, question: What is an organization? Although the lion’s share of scholarship in management and organization studies conceives of organizations as entities within which communication occurs, “Communication Constitutes Organization” (CCO) scholarship has attracted interest because it makes a productive reversal, that is, by asking how organization happens in communication. CCO scholarship holds that organizations, as well as organizational phenomena, come into existence, persist, and are transformed in and through interconnected communication practices. The presentation aims to resituate CCO theorizing within the linguistic turn and to position CCO with respect to other lines of scholarship underwritten by a rich conception of language and discourse. We contribute to CCO scholarship with reflections on three theoretical orientations and provide a set of possibilities for its further development.
by Boris Fays, HEC Liège
N1 – 1701
Abstract : This paper examines the relation between the information contained in the first two Greeks of ATM options – Delta and Gamma – and the pricing of stocks. Specifically, I sort the cross-section of US stocks on a Probability Adjusted Implied Volatility Spread (PAVS), defined as the difference in the normalized Delta/Gamma ratio of a zero-delta straddle strategy. I show that a zero-cost trading strategy on this measure provides statistically significant average monthly returns. In particular, the new metric improves on the spread from the put-call parity deviations of Cremers and Weinbaum (2010), as it implicitly recovers the probability distribution of stock returns contained in the option pricing model, to obtain market participants’ views about future stock prices.
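For readers unfamiliar with the two Greeks involved, here is a minimal Black-Scholes computation of Delta and Gamma for an ATM call (parameter values are hypothetical; this is an illustration of the inputs, not the paper's PAVS construction itself).

```python
# Black-Scholes Delta and Gamma for a European call, built from the
# standard normal CDF/PDF; all numbers below are invented for the demo.
from math import log, sqrt, exp, erf, pi

def norm_cdf(x):
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def norm_pdf(x):
    return exp(-0.5 * x * x) / sqrt(2.0 * pi)

def bs_delta_gamma(S, K, r, sigma, T):
    """Call Delta = N(d1); Gamma = n(d1) / (S * sigma * sqrt(T))."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return norm_cdf(d1), norm_pdf(d1) / (S * sigma * sqrt(T))

# ATM example: spot = strike = 100, 3-month maturity.
delta, gamma = bs_delta_gamma(S=100.0, K=100.0, r=0.01, sigma=0.2, T=0.25)
print(delta, gamma)  # Delta is close to 0.5 for an ATM call
```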
by Aleksandar Andonov (University of Amsterdam)
N1 – 1701
Abstract : We investigate the characteristics of infrastructure as an asset class from the investment perspective of a limited partner. While non-U.S. institutional investors gain exposure to infrastructure assets through a mix of direct investments and private fund vehicles, U.S. investors predominantly invest in infrastructure through private funds. We find that the stream of cash flows delivered by private infrastructure funds to institutional investors is very similar to that delivered by other types of private equity, as reflected by the frequency and amounts of net cash flows. U.S. public pension funds perform worse than other institutional investors in their infrastructure fund investments, although they are exposed to underlying deals with very similar project stage, concession terms, ownership structure, industry, and geographical location. By selecting funds that invest in projects with poor financial performance, U.S. public pension funds have created an implicit subsidy to infrastructure as an asset class, which we estimate within the range of $730 million to $3.16 billion per year depending on the benchmark.
by Hande Yaman Paternotte (KU Leuven)
N1 – 1711
Abstract : Team as a Service (TaaS) is a new outsourcing service that enables on-demand creation and management of distributed teams for fast-growing companies. The companies that use the TaaS model form a team according to the needs of a given project and provide managerial service throughout. Motivated by this new application, we study the Team Formation Problem (TFP), in which the quality and cost of communication are taken into account using the proximity of the potential members in a social network. Given a set of required skills, TFP aims to construct a capable team that can communicate and collaborate effectively. We study TFP with two measures of communication effectiveness: sum of distances and diameter. We compare the quality of solutions using various performance measures. Our experiments indicate the following. First, considering only the sum of distances or the diameter as the communication cost may yield solutions that perform poorly on other performance measures. Second, the heuristics in the literature find solutions that are significantly far from optimal, whereas a general purpose solver is capable of solving small and medium size instances to optimality. To find solutions that perform well on several performance measures, we propose the diameter-constrained TFP with a sum-of-distances objective. To solve larger sizes, we propose a novel branch-and-bound approach: we formulate the problem as a quadratic set covering problem, use the reformulation-linearization technique and relaxation to decompose it into a series of linear set covering problems, and impose the relaxed constraints through branching. Our computational experiments show that the algorithm is capable of solving large sizes that are intractable for the solver. Finally, we study a two-stage stochastic variant of the problem with uncertain communication costs and report the results of preliminary experiments.
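The two communication objectives mentioned in the abstract can be made concrete with a small sketch (toy skills and distances invented for illustration, not data from the talk): enumerate all skill-covering teams and optimize each objective separately. Note that the two criteria can select different teams, which is why optimizing only one of them may perform poorly on the other.

```python
# Tiny Team Formation instance: pick a team covering skills {a, b, c}
# while minimizing either the sum of pairwise distances or the diameter.
import itertools

skills = {0: {"a"}, 1: {"b"}, 2: {"c"}, 3: {"a", "b"}}
required = {"a", "b", "c"}
# Symmetric social-network distances between candidates (invented).
d = {(0, 1): 1, (0, 2): 3, (0, 3): 2, (1, 2): 3, (1, 3): 2, (2, 3): 4}

def dist(i, j):
    return 0 if i == j else d[(min(i, j), max(i, j))]

def covering_teams():
    """All teams whose combined skills cover the requirements."""
    for r in range(1, len(skills) + 1):
        for team in itertools.combinations(skills, r):
            if required <= set().union(*(skills[i] for i in team)):
                yield team

def sum_dist(team):
    return sum(dist(i, j) for i, j in itertools.combinations(team, 2))

def diameter(team):
    return max((dist(i, j) for i, j in itertools.combinations(team, 2)),
               default=0)

best_sum = min(covering_teams(), key=sum_dist)    # total communication cost
best_diam = min(covering_teams(), key=diameter)   # worst pairwise distance
print(best_sum, best_diam)  # the two objectives pick different teams here
```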
by Le Minh Nguyen, Japan Advanced Institute of Science and Technology
N1 – 1701
Abstract : Deep Learning methods have revolutionized the field of AI, and have shown much promise in applications such as computer vision and natural language processing. In this talk, we will focus on the legal domain. Specifically, we will present our current research (algorithms and results) on deep learning methods applied to analyzing legal texts. Such texts are particularly problematic for classical NLP algorithms, as they are syntactically complex and long, which makes it hard to create computational models of the underlying language. We will describe how deep learning helps in overcoming some of these challenges.
by Patrick Roger (Université de Strasbourg)
N1 – 1701
Abstract : Conventional finance models indicate that the magnitude of stock prices should not influence portfolio choices or future returns. This view is contradicted, however, by empirical evidence. In this paper, we report the results of an experiment showing that trading prices, in experimental markets, are processed differently by participants depending on their magnitude. Our experiment has two consecutive treatments: one where the fundamental value is a small number (the small price market) and a second where the fundamental value is a large number (the large price market). Small price markets exhibit greater mispricing than large price markets. We obtain this result both between participants and within participants. Our findings show that price magnitude influences the way people perceive the distribution of future returns. This result is at odds with standard finance theory but is consistent with: (1) a number of observations in the empirical finance and accounting literature; and (2) evidence in neuropsychology on the use of different mental scales for small and large numbers.
by Axel Gautier (HEC Liège)
B31 – Sem 6
Abstract : In Wallonia (Belgium), the public support for residential solar photovoltaic panels has two interesting features. First, the subsidies for solar production, in the form of tradable green certificates, were particularly generous, which encouraged households to have large installations. Second, the region used a net metering system to record power exchanges with the grid. With net metering, the meter runs backwards when power is supplied to the grid, and if production exceeds consumption on a yearly basis, the production surpluses are freely available for consumption. In this context, we test for a possible rebound effect. Based on a large sample of residential PV installations in Wallonia, we observe in our data and estimations that a large proportion of households oversize their installation to benefit from the subsidies and later consume most of their excess production. The effect is highly significant and production surpluses are almost entirely consumed. There is thus evidence of a strong increase in energy consumption by residential PV owners.
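The incentive described in the abstract can be sketched with back-of-envelope arithmetic (all numbers invented): under yearly net metering, any production above consumption is billed at nothing, so consuming that surplus carries zero marginal cost, which is the channel for the rebound effect.

```python
# Hypothetical household with an oversized PV installation under
# yearly net metering (figures are illustrative, not from the study).
production_kwh = 5000.0   # yearly PV production of the oversized system
consumption_kwh = 3500.0  # yearly household consumption
price_per_kwh = 0.25      # assumed retail electricity price

# Net metering bills only the yearly net draw from the grid.
billed = max(consumption_kwh - production_kwh, 0.0) * price_per_kwh
# Any surplus is freely available, so consuming it costs nothing extra.
surplus = max(production_kwh - consumption_kwh, 0.0)
print(billed, surplus)  # 0.0 to pay, 1500.0 kWh of free consumption
```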
by Donal Palcic and Darragh Flannery (University of Limerick)
B31 – Sem 6
Abstract : TBA
by Thomas Bauwens (University of Utrecht)
N1 – Room 1701
Abstract : The promotion of “energy democracy” in low carbon transitions has become a significant topic for energy social science research. Furthermore, research on community energy has tended to assume that grassroots energy initiatives lead to citizen involvement and shared local benefit, despite limited empirical evidence. Here we report the results of cross-national research examining civic participation amongst the members of two renewable energy cooperatives, one based in the UK and the other in Belgium. Results show that levels of participation beyond financial investment vary markedly, with higher levels for relatively low-cost, less active activities (e.g. reading newsletters) and lower levels for more active, time-consuming activities (e.g. attending the Annual General Meeting, volunteering). Social identification is the most important factor explaining multiple forms of participation, with environmental and economic factors less important. The concept of “stealth democracy” is used to discuss the findings and draw out the benefits and risks associated with the importance of social factors for engaging communities in energy transitions.
Biography : Thomas Bauwens is an economist specialized in environmental and energy issues with a pronounced interest in multi-disciplinarity. He is currently a post-doctoral researcher at Utrecht University’s Copernicus Institute of Sustainable Development. He is conducting research on the roles of circular start-up hubs in the transition towards a circular economy within the framework of a research project funded by the Netherlands Organisation for Scientific Research. He holds a PhD in economics from the Centre for Social Economy within HEC-Management School of the University of Liège (Belgium) and a master’s degree in economics from the Economics School of Louvain. During his PhD, he was a long-term visiting researcher at the Environmental Change Institute, University of Oxford. After that he completed a post-doctoral research stay at the Swiss Federal Institute of Technology in Lausanne (EPFL) within the Human-Environment Relations in Urban Systems (HERUS) Laboratory.
by Xavier Lambin (Toulouse School of Economics)
B31 – Sem 6
Abstract : Following the development of decentralized generation and smart appliances, energy communities have become a phenomenon of increasing interest. While the benefits of such communities have been discussed, there is growing concern that inadequate grid tariffs may lead to excess adoption of such business models. Furthermore, snowball effects may be observed through the impact these communities have on grid tariffs. We show that restricting the analysis to a simple cost-benefit exercise is far from satisfactory. We therefore use the framework of cooperative game theory to account for the ability of communities to share gains between members. The interaction between energy communities and the DSO then results in a non-cooperative equilibrium. We provide mathematical formulations and intuitions for these effects and carry out realistic numerical applications in which communities can invest jointly in solar panels and batteries. We show that such a snowball effect may indeed be observed, but that its magnitude and welfare effects depend on the grid tariff structure that is implemented, possibly leading to over-investment in PV.
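The gain-sharing idea from cooperative game theory can be illustrated with a standard computation (a hypothetical three-household community with invented coalition values; the talk's actual model is richer): the Shapley value splits the grand coalition's saving according to each member's average marginal contribution.

```python
# Shapley value for a toy energy community: v(S) is the yearly saving
# a coalition S achieves by investing jointly (all values invented).
from itertools import permutations

v = {(): 0, ("a",): 10, ("b",): 10, ("c",): 0,
     ("a", "b"): 30, ("a", "c"): 20, ("b", "c"): 20,
     ("a", "b", "c"): 42}

def worth(coalition):
    return v[tuple(sorted(coalition))]

def shapley(players):
    """Average each player's marginal contribution over all join orders."""
    phi = {p: 0.0 for p in players}
    orders = list(permutations(players))
    for order in orders:
        seen = []
        for p in order:
            phi[p] += worth(seen + [p]) - worth(seen)
            seen.append(p)
    return {p: x / len(orders) for p, x in phi.items()}

phi = shapley(["a", "b", "c"])
print(phi)  # the shares sum to the grand coalition's worth, v(abc) = 42
```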
by Michaël Schyns & Quentin Valembois (HEC Liège)
N1 – Room 025
Abstract : We live in a digital world where technology makes it possible to profoundly transform many management processes. Machine and deep learning, blockchain, intelligent machines and autonomous vehicles, digital twins, IoT, e-textiles, and immersion via virtual, augmented and mixed reality are some of the most promising technologies for companies. In this talk, we will focus mainly on immersive technologies and present them briefly. With virtual reality, (almost) everything is possible! By putting on a virtual reality headset, the user momentarily disconnects from the real world and is immersed in a graphical and audio environment that becomes their new reality. In this fully controllable new universe, they can experiment with various tasks, sometimes delicate ones, in complete safety and at lower cost. Thanks to its cousin, augmented reality, a person can enrich their (real) environment with layers of information and thus be supported in decision-making and in steering processes.
Au HEC Digital Lab, dans le cadre de nos missions d’enseignement, de recherche et de service à la communauté, régulièrement pour répondre à des demandes d’entreprises partenaires, nous concevons de tels environnements et testons comment les rendre les plus immersifs et efficaces. Nous sommes également un membre fondateur du groupe Teaching With VR qui, au niveau de l’Université, prépare de nouvelles formes d’enseignement basée sur ces technologies.
Lors de cette séance, nous utiliserons des casques Oculus, Vive et Hololens pour présenter quelques-uns de nos environnements: apprentissage d’une langue étrangère, validation d’un projet de construction (HEC), gestion d’un entrepôt logistique, sensibilisation à la sécurité sur chantier et aux procédures d’urgence, formation au pilotage d’un Boeing 737, guidage pour la conception de palettes, tourisme et marketing, visite de sites…
by Samuel Ferey (Université de Nancy)
B31 – Sem 6
Abstract : Cases of damage caused by multiple tortfeasors in civil liability law often leave both case law and legal scholarship in a delicate position. Numerous paradoxes, such as causal overdetermination, have cast doubt on the relevance of legal notions of causation applied to these cases. This article shows that modelling these situations in terms of cooperative games resolves a number of these paradoxes by providing allocation rules among co-tortfeasors while maintaining a coherent approach to legal causation.
by Tina Ehrke-Rabel (University of Graz)
N1 – Room 1701
Abstract : Algorithms are used by businesses to attract customers in the most targeted way possible or to select the best employees, and they are increasingly used by public bodies to prevent or prosecute crime, to calculate certain lump-sum expenses to be deducted from the tax base, or to select taxpayers for audit. Intelligence services use big-data algorithms even more intrusively to protect national security. In theory, big-data analytics informed by behavioural economics can render taxation "close to invisible" to taxpayers and thus increase efficiency and effectiveness for the State and convenience for the taxpayer.
However, liberal democracies are founded on the concept of personal autonomy and the free choice of the individual. Interferences with personal autonomy are subject to strict conditions.
The workshop will approach the subject from a utopia in which taxpayers are no longer aware of their status as taxpayers, because filing tax returns and collaborating (answering requests and submitting to audits) have been supplanted by a big tax brother that not only watches but also scores taxpayers and interacts directly with their payment service providers. It will be argued that, although we are still distant from this utopia, we may be approaching it. Whether this is the solution we want for tackling tax non-compliance will be discussed. I plead for upholding and defending the core values of liberal democracies and for refraining from equating tax non-compliance with serious crimes such as terrorism or money laundering. I nevertheless believe that big-data analytics, if properly framed, have the potential to increase the efficiency and effectiveness of tax administrations without undermining individual autonomy.
Biography : Prof. Ehrke-Rabel is Full Professor at the University of Graz (Austria) and Director of the Institute of Tax and Fiscal Law.
She studied Law at the University of Graz and at the University of Paris-Nanterre and Paris I Panthéon Sorbonne.
She has been active in research on tax procedure in an international, European and comparative context for a number of years. Her expertise lies predominantly in European tax law, Value Added Tax, tax procedure and constitutional aspects of taxation. She has recently published papers on co-operative compliance in tax matters, the impact of sanctions on taxpayers' compliance and, more generally, the challenges of globalization and digitalization for tax collection and enforcement.
by Pascal François (HEC Montréal)
N1 – Room 1701
Abstract : Invoking the scale invariance property used by most option pricing models, Bates (2005) presents a model-free method to compute the delta and the gamma from the shape of the volatility smile. That promising approach yields hedging strategies which solely rely on market information and whose performance had, to the best of our knowledge, yet to be tested on a large-scale sample. That is what this paper does using roughly 6 million daily hedges of S&P500 calls and puts from January 1996 to April 2016.
Our first major finding is that smile-implied delta-neutral and delta-gamma-neutral hedging strategies outperform their Black-Scholes counterparts neither in a mean-variance framework nor under a tail-risk metric.
In light of these disappointing results, we turn our attention to managing volatility-related Greeks. We extend the relation that holds in the stochastic implied volatility model to build smile-implied vegas, vannas and volgas from smile-implied gammas. We find that the smile-implied delta-gamma-vega neutral hedging strategy strongly outperforms its Black-Scholes counterpart, especially for put options. Adding the vanna or the volga into the hedging strategy does not improve the quality of replication. All in all, our findings suggest that option market information is particularly worthwhile for managing volatility risk, not underlying price risk.
by Vassil Kirov (European Trade Union Institute)
N1 – Room 1715
Abstract : While digitalization as such is not new, its rapid and extensive impacts are deeply transforming many aspects of life, including work and employment. Both very pessimistic and very optimistic scenarios about the future of work predict the end of working as we know it. In the context of digitalization, shifts are apparent in how work and employment are carried out: the emergence of new types of work organization and labour (for instance, through new means of allocating work such as crowd employment and the sharing economy); occupational shifts and new identities (including de-professionalization and user-generated content); and new work relationships arising from the geographical dispersion of labour (involving the strategic use of location and potential pressure on regulatory regimes). This presents new challenges for policy actors and stakeholders and complicates the potential and arenas for regulation. The objectives of the presentation are to critically discuss the theoretical debates in sociology related to the digitalization of work and employment and to address the related policy challenges. The presentation is based on the recently published book: Meil, P., Kirov, V. (eds.), (2017), "Policy Implications of Virtual Work", Basingstoke: Palgrave.
by Yan Alperovych (EM Lyon)
N1 – Room 1701
Abstract : There is ample evidence that Governmental Venture Capital (GVC) funds play a negligible role in the performance of target ventures, except when they syndicate with private Venture Capital (PVC) investors. Yet the origination of PVC-GVC syndicates has received no academic attention. Interestingly, lead PVC investors very often decide to invite GVC into their syndicate. In this paper we study why PVC would choose to do so. While most of the motivations for syndication highlighted in the existing literature do not seem to explain the choice of PVC to syndicate with GVC, we identify three motivations that could be relevant. PVC investors may partner with GVC (i) to build a track record, if they do not have one, (ii) to diversify their portfolio and invest outside their industry "comfort zone", and (iii) to invest in risky opportunities, using GVC as a way to lower their exposure to the venture's risk. We provide evidence on these three causes for syndication with GVC based on a unique international sample of syndicates in which PVC is the lead investor. We show that less experienced PVC investing in industries where they are less comfortable are more likely to invite a GVC partner. In first rounds, industry riskiness also seems to motivate this choice. We also show that PVC that partner with GVC more often in the first years of their life later build a track record faster and syndicate more with other PVC. Interesting differences emerge across geographical locations: the track-record argument is more strongly supported in Europe, while the comfort-zone effect is driven by US-based syndicates. This result suggests that GVC plays a role in the development of VC markets, but that this role differs across economies.
by Morteza Davari (KU Leuven)
N1 – Room 119
Abstract : We consider a single-machine scheduling problem with release dates and inventory constraints. Each job has a deterministic processing time and has an impact (either positive or negative) on the central inventory level. We aim to find a sequence of jobs such that the makespan is minimized while all release dates and inventory constraints are met. We show that the problem is strongly NP-hard even when the capacity of the inventory is infinite. To solve the problem, we introduce two mixed integer programming (MIP) formulations, a dynamic programming (DP) approach and also a branch-and-bound (B&B) algorithm.
This problem has interesting applications in scheduling loading/unloading operations at warehouses. In a loading/unloading terminal, trucks arrive at a dock either to deliver (unload) or to pick up (load) a certain number of final products. The terminal includes a storage space used to hold these final products in inventory.
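To make the inventory constraint concrete, the following sketch (not the authors' MIP, DP or B&B algorithms; job data are hypothetical) evaluates a fixed job sequence on a single machine, returning its makespan when all release dates and inventory bounds are respected:

```python
def evaluate_sequence(jobs, capacity, initial_inventory=0):
    """Check a job sequence against release dates and inventory bounds.

    Each job is (release_date, processing_time, inventory_delta), where
    inventory_delta is positive for a delivery (unload) and negative for
    a pickup (load). Returns the makespan if feasible, else None.
    """
    time, inventory = 0, initial_inventory
    for release, proc, delta in jobs:
        time = max(time, release) + proc   # wait for release, then process
        inventory += delta                 # inventory updated at completion
        if inventory < 0 or inventory > capacity:
            return None                    # stock-out or capacity overflow
    return time

# a feasible sequence of three jobs with inventory capacity 4
print(evaluate_sequence([(0, 2, 3), (1, 1, -2), (5, 2, -1)], capacity=4))
```

An exact method such as the branch-and-bound described above would search over permutations, using a check of this kind at each leaf and bounds at internal nodes.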
by Leonardo Madio (CORE, UCLouvain)
B31 – Sem 6
Abstract : Data brokers collect customer information from several sources and sell it to other firms. We propose a simple model of the sector, in which two data brokers can sell customer-level information to a downstream firm. The information enables the downstream firm to price discriminate. The optimal strategy of the data brokers depends on the nature of the information available. In particular, we show that when data are "additive" or "supra-additive", both brokers find it optimal to compete and the downstream firm acquires information from both. However, when data are "sub-additive", the combined value being lower than the sum of the values of the two datasets, data brokers prefer to share their data and sell them jointly.
by Samedi Heng (HEC Liège)
N1 – 126
Abstract : User Stories (US) are the most commonly used requirements artifacts within agile methods such as eXtreme Programming (XP) and Scrum. They are written in natural language, either as free prose or following a specific template. In practice, many templates have been proposed without any semantics attached to the syntactic elements they use. In this research, we first propose a unified model for US templates and provide semantics for each syntactic element. We then adapt a goal-oriented modeling language, called the Rationale Tree (RT), for analyzing US sets. The RT identifies dependencies between US and provides a global view of the system, which improves planning activities and therefore the development life cycle of agile methods.
by Dimitri Paolini (University of Sassari)
B31 – Sem 6
Abstract : A streaming platform obtains contents from content providers and offers paying commercial spaces to advertisers. Users value contents' variety and are heterogeneously bothered by ads. When the size of the market is large, the platform offers only a paying subscription, whereas a small market results in the offer of a menu of subscriptions, with ad-intolerant users paying a positive price and moderately-averse users opting for a free-of-charge solution.
by Pinar Tufan (KU Leuven)
N1 – 1715
Abstract : Within the psychological contract framework, this research examines the employment relationships of contingent workers. To do so, it introduces contingency-related psychological contracts, which focus on the unique needs and motivators of contingent workers. The central aim is to investigate the relationship between contingent employees' perceptions of breach/fulfillment of contingency-related psychological contracts and their work outcomes (i.e., job satisfaction, turnover intention, organizational identification, and organizational citizenship behavior). This research gives special attention to one particular group of contingent workers: those involved in any form of triangular employment relationship (e.g., agency workers and platform workers).
by Julien Hambuckers (HEC Liège)
N1 – Room 1701
Abstract : In this paper, we take a new look at the relationship between interest rate differentials and exchange rates.
To do so, we introduce a forecasting approach that accounts for high-order dependencies in daily exchange rates.
Our model relies on a GARCH-in-mean specification in which the innovations are assumed to follow a sinh-arcsinh (SH) distribution with a time-varying asymmetry parameter that depends on interest rates. This model is used to predict the direction of change of three major currency pairs (USD/EUR, USD/GBP and USD/CHF) over the period 1999-2016. We find the dynamic asymmetry specification to be significant and driven by interest rates. Economically speaking, we show that, notwithstanding a better in-sample performance than benchmark models for all pairs, we are able to obtain a positive out-of-sample performance for the USD/CHF pair.
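As a minimal sketch of the innovation distribution (the Jones and Pewsey sinh-arcsinh transform of a standard normal; the parameter values below are illustrative, not those estimated in the paper), one can simulate SH draws and see how the asymmetry parameter shifts the distribution:

```python
import math
import random

def sinh_arcsinh_sample(epsilon, delta, rng):
    """Draw one sinh-arcsinh innovation: X = sinh((asinh(Z) + epsilon)/delta).

    epsilon controls asymmetry (epsilon = 0 gives a symmetric law) and
    delta > 0 controls tail weight; epsilon = 0, delta = 1 recovers N(0,1).
    """
    z = rng.gauss(0.0, 1.0)
    return math.sinh((math.asinh(z) + epsilon) / delta)

rng = random.Random(42)
symmetric = [sinh_arcsinh_sample(0.0, 1.0, rng) for _ in range(20000)]
skewed = [sinh_arcsinh_sample(0.5, 1.0, rng) for _ in range(20000)]
print(sum(symmetric) / len(symmetric))  # close to 0
print(sum(skewed) / len(skewed))        # shifted right when epsilon > 0
```

In the paper's specification, epsilon would itself be a function of the interest-rate differential, so the sign and size of the asymmetry vary over time.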
N1 – Room 138
Abstract : not available
by Declan Dineen (University of Limerick, Ireland)
B31 – Ricardo
Abstract : This paper explores different two-stage semi-parametric methods to determine technical efficiency in long-term care provision in Ireland. Technical efficiency (TE) scores in the health sector literature are often estimated using either non-parametric conventional Data Envelopment Analysis (DEA) or the fully parametric Stochastic Frontier Analysis (SFA) approach. While the SFA technique accounts for noise in the data and also allows for an unbiased estimation of the factors affecting efficiency, it requires the specification of a functional form for the production or cost function. DEA, on the other hand, does not impose these restrictions. However, estimating the efficiency determinants from conventional DEA scores in a second stage may lead to biased estimates of both the TE scores and the efficiency-determining variables. This paper tries to fill the gap between the conventional DEA and SFA approaches and applies alternative semi-parametric methods. The methods considered do not impose any functional-form restrictions on the data, and they also account for random sampling variability and reduce the bias in both the estimated TE scores and the estimated efficiency determinants. We model TE in terms of an input distance function, which allows us to estimate input-oriented technical efficiencies by investigating how much the input vector can be proportionally reduced while holding the output vector fixed. Output is measured as total patient days, while inputs are measured as medical staff, non-medical staff and the number of beds in long-term care units. Furthermore, we use several efficiency-determining variables, divided into objective quality characteristics and conventional determinants such as size, ownership, location, chain membership, case mix and the age of the nursing home facilities.
Additionally, we split the sample into different groups: private and public nursing homes; chain and non-chain private units; and nursing homes located in rural and urban areas. We find notable differences in the results between the conventional DEA and the semi-parametric methods. We also derive some conclusive findings with respect to the estimated TE scores and the efficiency determining variables for the long-term care sector in Ireland.
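To illustrate what an input-oriented TE score measures, here is a toy constant-returns-to-scale example for the special single-input, single-output case, where efficiency reduces to each unit's productivity relative to the best observed productivity (the data are hypothetical; the paper's multi-input setting requires a linear program per unit, and its semi-parametric corrections go well beyond this):

```python
def dea_crs_input_efficiency(units):
    """Input-oriented technical efficiency under constant returns to scale
    for single-input, single-output units given as (input, output) pairs.

    A score of 1.0 means the unit is on the observed frontier; 0.75 means
    its input could in principle be scaled down to 75% for the same output.
    """
    best = max(y / x for x, y in units)          # frontier productivity
    return [(y / x) / best for x, y in units]

# hypothetical nursing homes: (staffed beds, annual patient days)
homes = [(10, 1000), (20, 1500), (40, 4000)]
print(dea_crs_input_efficiency(homes))           # frontier units score 1.0
```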
by Hugues Langlois (HEC Paris)
N1 – 1701
Abstract : We estimate international factor models with time-varying factor exposures and risk premia at the individual stock level using a large unbalanced panel of 58,674 stocks in 46 countries over the 1985-2017 period. We consider market, size, value, momentum, profitability, and investment factors aggregated at the country, regional, and world level. The country market in excess of the world or regional market is required in addition to world or regional factors to capture the factor structure for both developed and emerging markets. We do not reject mixed CAPM models with regional and excess country market factors for 76% of the countries. We do not reject mixed multi-factor models in 80% to 94% of countries. Value and momentum premia show more variability over time and across countries than profitability and investment premia. The excess country market premium is statistically significant in many developed and emerging markets but economically larger in emerging markets.
by Chris Benner (UC Santa Cruz)
N1 – 126
Abstract : The U.S. is experiencing unprecedented economic, social and ideological stratification, linked in part to rapid technological change, economic restructuring, and the growth of "narrowcast" media. All of these contribute to a fragmentation of the very knowledge base of society. We can learn how to address growing inequality and insecurity from certain metropolitan regions that have overcome fragmentation by building diverse knowledge communities and embracing a common regional destiny. Recognizing the importance of collectively produced information and knowledge serves as a basis for rethinking the future of work and employment. This can help us develop a range of new income strategies and policies, including perhaps a universal technology dividend, to address the high levels of inequality and insecurity we currently face.
by Farnaz Nickpour (University of Liverpool)
N1 – 1701
Abstract : Moving beyond initial notions of 'disability', 'ageing' and 'physical accessibility', inclusive design needs to embrace wider contemporary contexts and explore the full spectrum of 'human diversity'. This means moving from 'physicality' to the overall 'quality' of life, and examining the non-physical, 'psycho-social' elements of inclusivity. This talk explores this concept and its dimensions, and charts the way forward.
by Katie Turnbull (Texas A&M Transportation Institute (TTI))
N1 – 120
Abstract : This presentation examines the use of transformational technologies and innovative services to enhance mobility and accessibility in Smart Cities. It explores current examples and possible future opportunities to provide improved services to all segments of society and to address environmental concerns.
by Biley-Adelphe Ekponon (HEC Montréal, HEC Liège)
N1 – 1701
Abstract : This paper investigates the relative impact of various types of systematic risk on corporate asset prices. The equity risk premium and credit spreads are priced in a consumption-based corporate finance model with time-varying macroeconomic conditions. We decompose the risk premia into different sources of systematic-risk compensation and show that long-run risk commands most of the risk premium (about 70%) for both equities and bonds. The role of long-run risk in the equity risk premium is amplified in recessions but remains stable over the business cycle for credit spreads. The relative importance of short- vs. long-run risk also varies in the cross-section. An empirical analysis over the period 1952-2016 supports the main predictions of the model.
by Sourour Elloumi and Arnaud Lazare (UMA-CEDRIC, ENSTA, Palaiseau, France)
N1 – 220
Abstract : This paper addresses the resolution of unconstrained binary polynomial programs (P). We propose a new three-phase algorithm to solve (P). The first phase consists in reformulating (P) into a quadratic program (QP) using standard linearization inequalities. In the second phase, we reformulate (P) into a convex quadratic program (QPC). This convexification is computed thanks to a semidefinite relaxation. We compute the optimal value of the continuous relaxation of (QPC) using the binary identity. Moreover, in order to start the third phase (the branch-and-bound phase) with a tight bound, we use new valid equalities depending on the chosen quadratization. These equalities greatly improve the quality of the bound, as shown by testing our method on several benchmark instances and comparing it to other polynomial solvers.
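The first phase, reducing a binary polynomial to a quadratic program via auxiliary product variables and standard linearization inequalities, can be checked by brute force on a toy instance (the objective below is a hypothetical example, not one of the paper's benchmarks):

```python
from itertools import product

def cubic_objective(x1, x2, x3):
    # toy unconstrained binary polynomial: minimize x1*x2*x3 - 2*x1 - x2
    return x1 * x2 * x3 - 2 * x1 - x2

def quadratized_objective(x1, x2, x3, y):
    # y stands in for the product x1*x2, making the objective quadratic
    return y * x3 - 2 * x1 - x2

def linearization_ok(x1, x2, y):
    # standard linearization inequalities forcing y = x1*x2 on binaries
    return y <= x1 and y <= x2 and y >= x1 + x2 - 1 and y >= 0

best_cubic = min(cubic_objective(*x) for x in product((0, 1), repeat=3))
best_quad = min(
    quadratized_objective(x1, x2, x3, y)
    for x1, x2, x3, y in product((0, 1), repeat=4)
    if linearization_ok(x1, x2, y)
)
print(best_cubic, best_quad)  # the two optima coincide
```

The later phases (semidefinite-based convexification and branch-and-bound) operate on this quadratic reformulation; on binary points the identity x*x = x leaves the objective value unchanged, which is what the convexification exploits.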
by Giuseppe Lucio Gaeta (University of Naples, Italy)
N1 – 224
Abstract : From 2002 to 2012, the annual number of new Ph.D. holders in Italy rose from approximately 4,000 to 12,000. Using recently released Italian cross-sectional data, the seminar will provide an in-depth examination of Italian Ph.D. holders' career prospects. The seminar will focus on the issue of vertical mismatch (i.e., overeducation and overskilling), inspecting its incidence and its detrimental consequences on wages.
by Louis Wehenkel (Montefiore Institute, ULiège)
N1 – 115
Abstract : The talk introduces the topic of electric power system reliability management and explains the need to evolve its algorithmic tools in order to comply with the European targets for cleaner and more efficient electric energy supply. It then reports on a four-year research effort carried out within the European FP7 project GARPUR (http://www.garpur-project.eu) to develop consistent probabilistic reliability management approaches for electric power system planning, asset management, and operation. The resulting tools will be based on recent progress in large-scale optimization and machine learning methods.
by Denada Ibrushi (HEC Montréal, HEC Liège)
N1 – 1701
Abstract : The relative contributions of cash-flow and discount-rate news to the conditional variance of market returns exhibit significant variation over time. We identify lagged changes in PPI inflation as the main macroeconomic determinant of this time variation. We analyze the economic importance of these results by allowing for time variation in cash-flow and discount-rate betas in the Campbell and Vuolteenaho (2004) asset pricing framework. A conditional version of their two-beta framework not only provides reasonable estimates of risk prices and relative risk-aversion coefficients but also outperforms other models in accounting for the cross-sectional variation in expected returns.
by Gilbert Laporte (HEC Montréal)
N1 – 030
Abstract : The United Nations Humanitarian Response Depot (UNHRD) is a logistics service provider for the United Nations. In its 2014-2017 strategic plan, it targeted the reduction of operational costs on its network. We have analyzed the potential cost benefits of adding a regional distribution center in Kampala, Uganda, to the existing network in order to better respond to humanitarian crises in East Africa. To this end we have used fieldwork, network optimization, simulation and statistical analyses to assess the benefits of prepositioning high-demand non-food items in Kampala and to propose a robust stocking solution. The study is based on an actual case and uses real data. The UNHRD has already implemented the proposed solution for an estimated expected annual cost reduction of 21% with respect to the previous network configuration. This work was done jointly with Émilie Dufour, Julie Paquette and Marie-Ève Rancourt.
by David Ardia (Université de Neuchâtel, Switzerland)
Abstract : We perform a large-scale empirical study to compare the forecasting performance of single-regime and Markov-switching GARCH (MSGARCH) models from a risk management perspective. We find that, for daily, weekly, and ten-day equity log-returns, MSGARCH models yield more accurate Value-at-Risk, Expected Shortfall, and left-tail distribution forecasts than their single-regime counterpart. Also, our results indicate that accounting for parameter uncertainty improves left-tail predictions, independently of the inclusion of the Markov-switching mechanism.
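For reference, the single-regime baseline compared here can be sketched as a one-step-ahead Value-at-Risk forecast from a Gaussian GARCH(1,1); the parameters and return series below are illustrative placeholders (in practice the parameters are estimated by maximum likelihood, and MSGARCH lets them switch with a hidden Markov regime):

```python
import math
import random
from statistics import NormalDist

def garch11_var_forecast(returns, omega, alpha, beta, level=0.05):
    """One-step-ahead Value-at-Risk from a Gaussian GARCH(1,1).

    Variance recursion: sigma2_{t+1} = omega + alpha * r_t**2 + beta * sigma2_t.
    The VaR is the level-quantile of the predictive return distribution,
    z_level * sigma_{t+1}, a negative number for small levels.
    """
    sigma2 = omega / (1.0 - alpha - beta)      # start at unconditional variance
    for r in returns:
        sigma2 = omega + alpha * r * r + beta * sigma2
    return NormalDist().inv_cdf(level) * math.sqrt(sigma2)

rng = random.Random(7)
returns = [rng.gauss(0.0, 0.01) for _ in range(500)]  # simulated daily returns
print(garch11_var_forecast(returns, omega=1e-6, alpha=0.08, beta=0.90))
```

An MSGARCH forecast would instead mix several such recursions, weighted by the filtered regime probabilities.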
by Dimitra Kyriakopoulou (UCL, CORE)
B33 – TriFac2
Abstract : One of the implications of the intertemporal capital asset pricing model (CAPM) is that the risk premium of the market portfolio is a linear function of its variance. Yet, estimation theory of classical GARCH-in-mean models with linear-in-variance risk premium requires strong assumptions and is incomplete. We show that exponential-type GARCH models such as EGARCH or Log-GARCH are more natural in dealing with linear-in-variance risk premia. For the popular and more difficult case of EGARCH-in-mean, we derive conditions for the existence of a unique stationary and ergodic solution and invertibility following a stochastic recurrence equation approach. We then show consistency and asymptotic normality of the quasi maximum likelihood estimator under weak moment assumptions. An empirical application estimates the dynamic risk premia of a variety of stock indices using both EGARCH-M and Log-GARCH-M models.
by Fabian Gaessler (Max Planck Institute)
B31 – Domat
Abstract : Firms in the pharmaceutical industry typically rely on a period of market exclusivity derived from patent protection and data exclusivity to recoup their investments in R&D. The invalidation of patent rights during drug development renders data exclusivity the sole source of protection and shifts the period of market exclusivity on the R&D project level. Invalidation therefore constitutes a natural experiment that allows us to identify how the duration of market exclusivity affects firms’ incentives to innovate. Our analysis is based on a novel data set that links the development histories of drug candidates with underlying patent data. We identify causal effects relying on an instrument for the potentially endogenous patent invalidation. Our findings highlight that shorter durations of market exclusivity reduce the likelihood of successful drug commercialization.
by Iannis Kerkemezos (University of Rotterdam)
B31 – Sem.Comte
Abstract : This paper presents a novel strategy to empirically identify engagement in anti-competitive behaviour by firms in monopoly and duopoly markets, and provides an estimate of the resulting price premium that consumers pay. This is done by examining transitions across market structures that take place because of firm entry or exit, and by classifying different types of markets within a given structure based on their structural history. Using panel data from the U.S. airline industry between 1993 and 2014, we find that a "quiet-life" duopoly prices significantly higher than a duopoly that comes about through entry into a monopoly, and that a "quiet-life" monopoly prices significantly lower than a monopoly that comes about through exit from a duopoly, but still significantly higher than both types of duopoly. These effects, both economically and statistically significant, suggest that collusion is likely to occur in duopoly markets and that entry-deterring strategies, such as limit pricing, are likely to be deployed in monopoly markets.
by Lorenzo Cicatiello (University of Naples "L'Orientale")
Abstract : Since the 1990s, revitalizing the railways has become a key objective of the EU's transport policy. The EU's efforts have focused on the liberalization of these services and the creation of a single European railway area. The declared goal has been to shift the balance between transport modes, from private to public modes, and specifically from car and plane to rail. One of the most strategic implications of this modal shift is its positive impact on the environment, as rail is usually considered a 'green' mode of transport, at least 'greener' than cars and planes in terms of CO2 emissions. This research seminar seeks to answer the following question: Has railway liberalization corresponded to lower CO2 emissions? Drawing on a panel approach covering a large set of countries, we developed a preliminary econometric model in which per capita CO2 emissions from transport serve as the dependent variable and the OECD's Energy, Transport and Communications Regulation (ETCR) indicator on railway liberalization as the independent variable.
by Thomas Bonesire (HEC Liège)
Abstract : This paper studies the investment and financing choices of an entrepreneur willing to fund her own business. Her project is characterized by non-tradability on the market, low divisibility and high risk. We adopt a portfolio approach to identify the entrepreneur's optimal portfolio allocation and her cost of capital (the project's hurdle rate) given her specific investment and financing constraints. The entrepreneur's optimal portfolio is related to that of an unconstrained investor with similar risk aversion. We derive the entrepreneur's optimal investment curve and show that it is related to the Capital Market Line. The entrepreneur's cost of capital is positively influenced by her risk aversion and by the project's (co)variances and size. We show that the cost of capital can be decreased by combining the project with other assets, and increasingly so when the allocation is optimized. A numerical analysis on realistic cases then provides estimates of the entrepreneur's cost of capital and optimal diversification. Finally, we build on our portfolio approach to investigate how public authorities may relax the entrepreneur's constraints. In sum, our portfolio approach contributes to explaining the entrepreneur's cost-of-capital puzzle: why the cost of capital of entrepreneurs is not that high relative to the risk taken.
by Thomas Renault (Université Paris I Panthéon-Sorbonne – Laboratoire PRISM; Catholic University of Lille – IESEG School of Management)
Abstract : Social media can help investors gather and share information about stock markets. However, it also presents opportunities for fraudsters to spread false or misleading statements in the marketplace. Analyzing millions of messages sent on the social media platform Twitter about small capitalization firms, we find that an abnormally high message activity on social media is associated with a large price increase on the event day and followed by a sharp price reversal over the next week. Examining users’ characteristics, we find that the price reversal pattern is stronger when the events are generated by the tweeting activity of deleted/deactivated Twitter accounts, by the tweeting activity of suspicious accounts, or by the tweeting activity of accounts dedicated to tracking pump-and-dump schemes. Overall, our findings are consistent with the patterns of a pump-and-dump scheme, where fraudsters/promoters use social media to temporarily inflate the price of small capitalization stocks.
by Steven Vallas (Northeastern University, Boston)
N1 1711 – 1715
Abstract : Active labor market policies have assumed growing importance since the end of the nineties. The concept of flexicurity (Wilthagen and Tros, 2004) has been particularly influential. Although flexicurity has received severe criticism under the financial crisis and austerity measures (Burroni and Keune, 2011), the concrete impact of such policy orientations, presumed to reduce the risks of precariousness and job insecurity for nonstandard workers, varies strongly from country to country. The result has generated a host of dilemmas concerning policy responses to long-term unemployment; labor market subsidies for older, disabled, or minority workers; and how best to combine security for workers facing transitions with the flexibility that firms need to compete on the global stage. Moreover, the search for new trade-offs between flexibility and security does not emanate only from public labor policies: it also results from experimental initiatives emerging at the micro level in the private or non-profit sectors (Pulignano, Doerflinger, and DeFranceschi, 2016; Lorquet, Pichault and Orianne, 2015). The papers in this workshop confront these issues, addressing theoretical and empirical questions regarding how labor market uncertainties might best be managed in an era of rapid economic change.
by Marianne Verdier (CRED, Paris 2)
B31 – Sem.3
Abstract : To be defined
by François Pichault (LENTIC – HEC Liège), Giseline Rondeaux (LENTIC – HEC Liège), Fanny Fox (LENTIC – HEC Liège), Frédéric Naedenoen (LENTIC – HEC Liège), Christophe Dubois (CRIS – FaSS – Uliege), Jean-François Orianne (CRIS – FaSS – Uliege), Julie Gérard (CRIS – FaSS – Uliege), Sophie Thunus (FSP – UCL)
B31 – Sem.6
Abstract : The aim of this seminar is to invite participants to reflect on their own research practice through an exchange of experiences and good practices. The meeting seeks to reveal the particularities of the fields, approaches and postures of various practitioners of social science research, and to highlight several different ways of doing research. Through testimonies, empirical accounts and methodological reflections offered by researchers from CRIS and LENTIC, we will explore the differences and similarities between so-called “academic” research and so-called “intervention” research.
by Johannes Johnen (CORE, UCL)
B33 – TriFac2
Abstract : We introduce a simple model of a market in which consumers must make tradeoffs between “browsing” more products superficially, and “studying” fewer products in detail. Each firm chooses two price components, a headline price and an additional price, and specifies conditions under which a consumer can avoid the additional price. Each consumer can either fully understand the offer of one firm (studying), or look at only the headline prices of two firms (browsing). In equilibrium, high-value consumers browse and pay the additional price, but low-value consumers study to avoid the additional price. Although high-value consumers pay higher total prices, the average price consumers pay is decreasing in the share of high-value consumers. This result is consistent with evidence that a number of essential products are more expensive in lower-income neighborhoods, and our model also helps explain why entry into such neighborhoods does not solve the problem. More importantly, our framework generates a novel and powerful competition-policy-based argument for regulating the additional price or other secondary product features. In contrast to existing arguments that such regulations may be ineffective or even distortionary, we show that they have a multiplier effect: because consumers do not need to worry about the regulated feature, they do more browsing, enhancing competition. In many situations, the increase in competition also increases efficiency, but we identify a class of situations in which there is a tradeoff between competition and efficiency.
by Robert Somogyi (CORE, UCL)
B31 – Sem.3
Abstract : This paper studies zero-rating, an emerging business practice consisting in a mobile internet service provider (ISP) excluding the data generated by certain content providers (CPs) from its consumers’ monthly data cap. Being at odds with the principle of net neutrality, these arrangements have recently attracted regulatory scrutiny all over the world. I analyze the zero-rating incentives of a monopolistic ISP facing a capacity constraint in a two-sided market where consumption provides utility for homogeneous consumers as well as advertising revenue for CPs. Focusing on a market with two CPs competing with each other and with all other content, which is never zero-rated, I identify parameter regions in which zero, one or two CPs are zero-rated. Surprisingly, the ISP may zero-rate content when content is either very unattractive or very attractive for consumers, but not in the intermediate region. I show that zero-rating benefits consumers if content is attractive, whereas it may decrease social welfare in the case of unattractive content.
by Olivier De Groote (KU Leuven)
B31 – Sem.3
Abstract : This paper addresses the impact of study programs in secondary education on long-run educational and labor market outcomes. I estimate a dynamic model of educational decisions that allows for observed and unobserved differences in initial ability. It is novel in that it adds unobserved effort as a choice variable, along with the choice of study program. This replaces traditional approaches, which assume end-of-year performance follows an exogenous law of motion. I use the model to calculate how each study program contributes to different outcomes, and I investigate policies that aim to match students to the right program. I find that academically rigorous programs are important for improving higher education outcomes, while vocational programs prevent dropout, grade retention and unemployment. At the same time, policies that encourage underperforming students to switch to less academic programs do not have a negative impact on higher education outcomes, and they substantially reduce grade retention and dropout. I also find that ignoring the fact that students choose their effort level generates biases in counterfactual predictions.
by Marco Lübbecke (RWTH Aachen University)
N1 – 1701
Abstract : Many practical optimization problems can be formulated as models with an enormous number of variables. These variables typically represent combinatorial objects like subsets, configurations, or permutations. We are able to solve such models, even to optimality, because variables can be dynamically generated via column generation. Embedded in branch-and-bound, we obtain branch-and-price. Formally, one may arrive at such models via a Dantzig-Wolfe reformulation of some integer program. One benefit of the reformulation is a potentially stronger relaxation. We describe efforts to develop a generic solver that is able to automatically perform the reformulation, and solve the latter by branch-and-price. An important part of this is to detect exploitable model structure, where we present our state of knowledge and identify blind spots where more research is needed.
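For orientation, the column-generation loop described in the abstract can be sketched on the textbook one-dimensional cutting-stock problem: the restricted master LP is solved, its duals feed a knapsack pricing subproblem, and any pattern with negative reduced cost is added as a new column. This is a generic illustration (using SciPy's LP solver), not the speaker's generic solver:

```python
# Illustrative column generation for one-dimensional cutting stock.
from scipy.optimize import linprog

def price_pattern(duals, widths, W):
    """Pricing subproblem: unbounded knapsack maximizing the total dual
    value of a cutting pattern that fits in a roll of width W."""
    dp = [0.0] * (W + 1)       # dp[c] = best dual value with capacity c
    take = [-1] * (W + 1)      # piece type added at capacity c (-1: none)
    for c in range(1, W + 1):
        dp[c], take[c] = dp[c - 1], -1
        for i, w in enumerate(widths):
            if w <= c and dp[c - w] + duals[i] > dp[c]:
                dp[c], take[c] = dp[c - w] + duals[i], i
    pattern, c = [0] * len(widths), W
    while c > 0:               # reconstruct the pattern
        if take[c] == -1:
            c -= 1
        else:
            pattern[take[c]] += 1
            c -= widths[take[c]]
    return dp[W], pattern

def column_generation(widths, demands, W, max_iter=100):
    """LP relaxation of min sum_p x_p s.t. A x >= d, generating
    columns (cutting patterns) dynamically."""
    n = len(widths)
    # initial columns: homogeneous patterns, one per piece width
    patterns = [[W // widths[i] if j == i else 0 for j in range(n)]
                for i in range(n)]
    for _ in range(max_iter):
        res = linprog(
            c=[1.0] * len(patterns),
            A_ub=[[-p[i] for p in patterns] for i in range(n)],  # Ax >= d as -Ax <= -d
            b_ub=[-d for d in demands],
            bounds=[(0, None)] * len(patterns),
            method="highs",
        )
        duals = [-m for m in res.ineqlin.marginals]  # demand duals (>= 0)
        value, pat = price_pattern(duals, widths, W)
        if value <= 1.0 + 1e-7:    # no column with negative reduced cost
            return res.fun, patterns
        patterns.append(pat)
    return res.fun, patterns
```

Embedding this loop in branch-and-bound, with branching rules compatible with the pricing problem, yields branch-and-price as described in the talk.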
by Dainis Zegners (U Cologne)
B31 – Sem.3
Abstract : An important strategy for sellers to build a reputation is to practice introductory pricing. By selling at a lower introductory price, sellers increase demand, induce more buyers to provide feedback, and thus build a reputation more quickly. I examine a form of introductory pricing that is particularly popular in digital markets: offering digital content for free. I argue that offering free content to build a reputation can be a double-edged strategy. It does not only attract buyers with a high preference, but also buyers with a low preference. Low-preference buyers give worse feedback than high-preference buyers, inducing a negative selection effect on a seller’s reputation. I estimate the strength of this effect using data from an online self-publishing platform where I observe authors either selling their e-books at a price or giving them away as free content. By exploiting the fact that I observe ratings for free and purchased versions of the same e-book, I show that those buyers who obtain an e-book as free content rate it worse than those buyers purchasing the e-book at a positive price, consistent with a negative selection effect on reputation.
by Paul Karehnke (UNSW Business School (Sydney))
N1 – 1701
Abstract : We draw on the skewness literature to propose performance evaluation tests which extend the linear factor regression approach and are designed for investments with option-like returns. These tests deliver conclusions valid for all risk-averse mean-variance-skewness investors and are more flexible and better able to account for non-linearities in returns than option-based factor models. Applied to mutual funds and hedge funds, our tests usually suggest selecting different funds than standard tests, and find that a significant fraction, 11%, of hedge funds add value to investors, whereas this is an insignificant 4% for mutual funds. We also analyze the economic significance of these option-like returns, their out-of-sample persistence, and their relation to subsequent fund flows.
by Emmanuel Kamdem (Professeur des Universités, ESSEC, Université de Douala, Cameroun)
N1 – 035
Abstract : To be defined
by Roman Kozhan (Warwick Business School)
N1 – 1701
Abstract : We show short selling in corporate bonds forecasts future bond returns. Short selling predicts bond returns where private information is more likely, in high-yield bonds, particularly after Lehman’s collapse. Short selling predicts returns following both high and low past bond returns. This, together with short selling increasing following past buying order imbalances, suggests short sellers trade against price pressures as well as trade on information. Short selling predicts bond returns both in the individual bonds that are shorted and in other bonds by the same issuer. Past stock returns and short selling in stocks predict bond returns, but do not eliminate bond short selling predicting bond returns. Bond short selling does not predict the issuer’s stock returns. These results show bond short sellers contribute to efficient bond prices and that short sellers’ information flows from stocks to bonds, but not from bonds to stocks.
by Emmanuel Kamdem (Professeur des Universités, ESSEC, Université de Douala, Cameroun)
B31 – Salle du conseil
Abstract : To be defined
by Wim Marneffe (UHasselt)
B31 – Sem.3
Abstract : Notwithstanding the benefits of competition within the lawyer profession, economic theory supports the existence of a lawyer-induced litigation effect. Given concerns about the growing litigiousness in many European countries and the growing awareness that observed increases in lawyers might further induce litigation rates, policymakers require a thorough understanding of the relationship between the number of lawyers and litigation rates. Utilizing a European cross-country dataset, we contribute to the scant empirical literature on the lawyer-induced litigation hypothesis. To address the endogeneity problems that arise when estimating the effect of the number of lawyers on litigation, we use two strategies. Following existing literature, we first estimate our model by means of the 2SLS procedure. Second, we exploit the instrumental variable approach based on the linear GMM estimator of Arellano and Bond (1991). To date, the Arellano-Bond estimator has not yet been used to address the endogeneity concerns between lawyers and litigation rates despite its advantages and popularity in other research areas. The estimations result in a positive and significant effect of lawyers that is robust across regressions. We discuss the policy implications of our findings.
by Pascal François (HEC Montréal)
N1 – 1701
Abstract : We present a model for CVA calculation in which the recovery rate is inferred from the term structure of CDS spreads. We show that the assumption of constant recovery is inconsistent with observed CDS spreads and leads to substantial underestimation of the CVA. We further document that the degree of misspecification induced by the constant recovery assumption is affected by the ratings of the two counterparties and the phase of the business cycle. Assuming constant recovery biases CVA calculations through two channels: estimating the default probability and measuring the impact of correlation and the associated wrong-way risk.
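As background, the standard discretized unilateral CVA formula that such models refine can be sketched as follows, with the default intensity implied from a flat CDS spread via the usual credit-triangle approximation λ ≈ s/(1 − R). This is a generic textbook sketch, not the authors' model:

```python
import math

def cva(spread, recovery, r, exposures, times):
    """Discretized unilateral CVA:
        CVA = (1 - R) * sum_i DF(t_i) * EE(t_i) * (S(t_{i-1}) - S(t_i)),
    with survival S(t) = exp(-lam * t) and credit-triangle hazard
    lam = spread / (1 - R) implied from a flat CDS spread."""
    lam = spread / (1.0 - recovery)
    total, prev_t = 0.0, 0.0
    for t, ee in zip(times, exposures):
        df = math.exp(-r * t)  # risk-free discount factor
        # default probability in (t_{i-1}, t_i]
        dpd = math.exp(-lam * prev_t) - math.exp(-lam * t)
        total += (1.0 - recovery) * df * ee * dpd
        prev_t = t
    return total
```

Even in this simple setting, for a fixed observed spread the recovery assumption moves both the loss given default and the implied default probabilities, and the two effects do not cancel, which is why misspecifying recovery as constant can bias the CVA.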
by Maxime Ogier (Centrale Lille)
N1 – 220
Abstract : In this presentation we address an integrated warehouse order picking problem. The aim is to solve industrial instances provided by a partner specialized in ready-to-wear clothing. The warehouse is divided into a picking area and a storage area; we focus on the picking area. It contains a set of aisles, each composed of a set of storage positions. For each period of the working day, each position contains several pieces of a unique product, defined by its reference. The warehouse is not automated, and the order pickers can prepare up to K parcels in a given picking route. For each period of the working day, a set of customer orders has to be prepared. An order is a set of product references, each associated with a quantity, i.e. the number of pieces required. The problem consists in jointly deciding: (1) the assignment of references to the storage positions in the aisles that need to be filled up; (2) the division of orders into several parcels, respecting weight and size constraints; (3) the batching of parcels into groups of size K, which implicitly defines the routing in the picking area. The routing is assumed to follow a return policy, i.e. an order picker enters and leaves each aisle from the same end. The objective is to minimize the total routing cost. In order to deal with large industrial instances (hundreds of clients, thousands of positions and product references) in a short computation time, we propose a heuristic method based on the split and dynamic programming paradigms.
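The split paradigm mentioned at the end can be illustrated on the batching subproblem (3): given an ordered sequence of parcels, cutting it into consecutive batches of size at most K is a shortest-path problem on a DAG, solvable by dynamic programming. The sketch below uses an assumed return-policy cost (twice the deepest visited position per aisle) and allows batches of size up to K; it is a simplified illustration, not the speaker's actual cost model:

```python
def batch_cost(parcels, batch):
    """Routing cost of one picking route under a return policy: the
    picker enters and leaves each visited aisle from the same end, so
    an aisle costs twice the depth of its deepest required position.
    `parcels` maps parcel id -> list of (aisle, depth) positions."""
    deepest = {}
    for p in batch:
        for aisle, depth in parcels[p]:
            deepest[aisle] = max(deepest.get(aisle, 0), depth)
    return sum(2 * d for d in deepest.values())

def split_batching(sequence, parcels, K):
    """Split an ordered parcel sequence into consecutive batches of
    size <= K minimizing total routing cost (shortest path on a DAG)."""
    n = len(sequence)
    cost = [float("inf")] * (n + 1)
    cost[0] = 0.0
    pred = [0] * (n + 1)
    for j in range(1, n + 1):
        for i in range(max(0, j - K), j):  # batch = sequence[i:j]
            c = cost[i] + batch_cost(parcels, sequence[i:j])
            if c < cost[j]:
                cost[j], pred[j] = c, i
    batches, j = [], n                     # reconstruct the batches
    while j > 0:
        batches.append(sequence[pred[j]:j])
        j = pred[j]
    return cost[n], list(reversed(batches))
```

For example, with parcels a and b sharing an aisle, grouping them in one route saves the duplicated walk into that aisle, which is exactly the trade-off the DP optimizes over all consecutive cuts.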
by Nicolas Moreno (HEC-Liège)
N1 – 1701
Abstract : This research aims to provide new insights into the risk drivers underlying the value premium documented by Fama and French (1993). Using news stories supplied by Thomson Reuters, we seek to explain comovement in value and growth stock returns from common information factors embedded in news. We also distinguish between information coming from common news stories and idiosyncratic shocks concerning a smaller subset of firms. This framework allows us to answer three questions: 1) Is there a common risk factor in the news which explains the value anomaly? 2) Is idiosyncratic news information priced? 3) Does the amount of media attention a piece of information receives matter more than the polarity of that information?