Econometrics and Statistics Seminar
Time and place: Thursdays, 14:00-15:00 h in the Faculty Lounge (Room 0.036), Juridicum, Adenauerallee 24-42, 53113 Bonn
May 09, 2019
Ana Colubi (Justus-Liebig-University Giessen)
Title: "On functional representations to deal with (fuzzy) set-valued data"
Abstract: Numerous experimental studies involve semi-quantitative information, elicited from experts or measured in a non-precise way, which can be modeled with interval data (fluctuations, grouped data, etc.) or fuzzy data (ratings, opinions, perceptions, etc.). A general framework to analyze these kinds of inexact data with statistical tools developed for Hilbertian random variables will be presented. The space of nonempty convex and compact (fuzzy) subsets of R^p has traditionally been used to handle this kind of imprecise data. Mathematically, these elements can be characterized via the support function, which is compatible with the usual Minkowski addition and naturally embeds the considered space into a cone of a separable Hilbert space. The support function embedding has interesting properties, but it lacks an intuitive interpretation for imprecise data. Moreover, although the Minkowski addition is very natural when p = 1, for p > 1 the shapes obtained when two sets are aggregated are apparently unrelated to the original sets, because the operation tends to convexify. An alternative and more intuitive functional representation will be introduced to circumvent these difficulties. The imprecise data will be modeled by star-shaped sets in R^p. These sets will be characterized through a center and the corresponding polar coordinates, which have a clear interpretation in terms of location and imprecision and lead to a natural directional extension of the Minkowski addition. The structures required for a meaningful statistical analysis from the so-called ontic perspective are introduced, and how to determine the representation in practice is discussed.
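The support function embedding mentioned in the abstract can be illustrated with a toy example in the simplest case p = 1, where the sets are compact intervals. The sketch below (function names and values are illustrative, not from the talk) shows the key property that Minkowski addition of sets corresponds to pointwise addition of their support functions:

```python
def support_interval(a, b):
    """Support function of the interval [a, b] on the unit sphere of R^1,
    i.e. on the two directions u = -1 and u = +1: h(u) = sup_{x in [a,b]} u*x."""
    return {-1: -a, 1: b}

def minkowski_sum(h1, h2):
    """Minkowski addition of sets = pointwise addition of support functions."""
    return {u: h1[u] + h2[u] for u in h1}

hA = support_interval(1, 3)   # the interval [1, 3]
hB = support_interval(2, 5)   # the interval [2, 5]
hS = minkowski_sum(hA, hB)    # support function of the Minkowski sum [3, 8]
print(hS[1], -hS[-1])         # upper and lower endpoints: 8 3
```

For p = 1 the sum of intervals is again an interval, so the operation is fully intuitive; the interpretability problems the talk addresses arise for p > 1, where sampling the support function over many unit directions is needed and aggregation tends to convexify the shapes.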
May 16, 2019
Johan Vikström (IFAU - Uppsala)
Title: "Empirical Monte Carlo Evidence on Estimation of Timing-of-Events Models"
Abstract: This paper uses an Empirical Monte Carlo simulation approach to study the estimation of Timing-of-Events (ToE) models. We exploit rich Swedish data on job seekers, with information on participation in a training program, to simulate placebo treatment durations. We first use these simulations to examine which covariates are major confounders that should be included in selection models. We then show that jointly including specific types of short-term employment-history variables (notably, the share of time spent in employment), together with baseline socio-economic characteristics and regional and inflow-timing information, removes the selection bias. Next, we omit sets of covariates and estimate ToE models with discrete distributions for the ensuing systematic unobserved heterogeneity. In many cases the ToE approach provides accurate effect estimates, especially if calendar-time variation is taken into account. However, assuming too many or too few support points for the unobserved heterogeneity may lead to large biases. Information criteria, in particular those penalizing parameter abundance, are useful for selecting the number of support points.
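The role of discrete unobserved heterogeneity in such duration models can be illustrated with a minimal simulation: a stylized mixed proportional hazard with two support points (all values below are illustrative, not taken from the paper). It shows the dynamic selection that ToE models must account for: units with a high unobserved hazard exit early, so the pool of survivors is negatively selected on the unobservable:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# discrete unobserved heterogeneity with two support points (illustrative values)
v = rng.choice([0.5, 2.0], size=n)
x = rng.standard_normal(n)            # observed covariate
hazard = v * np.exp(0.8 * x)          # stylized mixed proportional hazard
t = rng.exponential(1.0 / hazard)     # exponential durations given the hazard

# dynamic selection: mean unobserved heterogeneity among long survivors
print(v.mean(), v[t > np.median(t)].mean())
```

Ignoring this selection biases naive hazard estimates; the ToE approach instead estimates the support points and their probabilities jointly with the duration model, which is why choosing their number (the paper's information-criteria question) matters.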
June 27, 2019
Matei Demetrescu (University of Kiel)
Title: "Testing the predictability of stock returns with smoothly varying deterministic mean"
Abstract: Checking whether stock returns can be predicted using financial valuation ratios and other fundamentals faces several methodological and empirical challenges. Most importantly, the typical putative predictor variable exhibits high persistence, which leads to nonstandard limiting distributions of the OLS estimator and the associated t statistic in predictive regressions. While several methods deal with the issue of nonstandard distributions, the high predictor persistence also opens the door to spurious regression findings induced by time-varying mean components of stock returns if these are not properly controlled for. Such control requires additional information, which may not be available in practice. Here, we take a different approach and robustify IVX predictive regression (Kostakis et al., 2015, Review of Financial Studies 28, 1506–1553) against the presence of smooth trend components. To this end, we employ a particular local mean adjustment scheme to account for possibly time-varying means. The limiting distribution of the resulting t statistic is derived under sequences of local alternatives, and a wild bootstrap implementation improving the finite-sample behavior is provided. Compared to IVX predictive regression, there is a price to pay for robustness in terms of power; at the same time, the IVX statistic without adjustment spuriously rejects the null of no predictability in the presence of ignored time-varying deterministic mean components.
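The IVX idea underlying the talk can be sketched in a few lines: the highly persistent predictor is filtered into a mildly integrated instrument, which then replaces the predictor in the estimation. The toy simulation below follows the standard IVX construction of Kostakis et al.; parameter values are illustrative, and the local mean adjustment that the talk adds is deliberately omitted:

```python
import numpy as np

rng = np.random.default_rng(1)
n, c_z, beta_z = 500, 1.0, 0.95

# highly persistent predictor (near unit root)
rho = 1 - 1 / n
x = np.zeros(n)
for t in range(1, n):
    x[t] = rho * x[t - 1] + rng.standard_normal()

# returns generated under the null of no predictability
y = rng.standard_normal(n - 1)

# IVX instrument: mildly integrated filter of the predictor's first differences
rho_z = 1 - c_z / n ** beta_z
dx = np.diff(x, prepend=0.0)
z = np.zeros(n)
for t in range(1, n):
    z[t] = rho_z * z[t - 1] + dx[t]

# IV estimate of the predictive slope: instrument x_{t-1} with z_{t-1}
b_ivx = (z[:-1] @ y) / (z[:-1] @ x[:-1])
print(b_ivx)
```

Because the instrument is only mildly persistent, the resulting t statistic has a standard limiting distribution; the talk's contribution is to keep this property when the return series additionally contains a smoothly varying deterministic mean.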
June 27, 2019
Christian Conrad (University of Heidelberg)
Title: "Modelling the Forecast Errors: the MEM GARCH model"
Abstract: We suggest a multiplicative mixed-frequency component MEM GARCH model. The model consists of a daily (high-frequency) GARCH component and one or more low-frequency components. The low-frequency components are based on MEM equations for the cumulated standardized forecast errors of the GARCH component within the low-frequency periods. We derive conditions for strict and weak stationarity of the MEM GARCH and discuss misspecification testing. Since the new model is dynamically complete, it is straightforward to construct multi-step-ahead volatility forecasts. We apply the model to forecasting the volatility of the S&P 500 and three international stock indices. The MEM GARCH significantly outperforms the nested one-component GARCH out-of-sample. (Joint work with Robert Engle, Stern School of Business.)
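A stylized version of the multiplicative two-component idea can be simulated as follows. This is a hedged sketch, not the authors' specification: the low-frequency component tau is a deterministic placeholder here, whereas in the MEM GARCH it follows a MEM equation driven by the cumulated standardized forecast errors of the GARCH part, and all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)
T, days_per_period = 2000, 22            # ~22 trading days per low-frequency period
omega, alpha, beta = 0.02, 0.08, 0.90    # illustrative GARCH(1,1) parameters

# placeholder low-frequency volatility component, one value per period
n_periods = T // days_per_period + 1
tau = 1.0 + 0.5 * np.sin(2 * np.pi * np.arange(n_periods) / 24)

h = np.empty(T)                          # daily (high-frequency) variance component
r = np.empty(T)                          # simulated returns
h[0] = omega / (1 - alpha - beta)        # unconditional variance of the GARCH part
for t in range(T):
    m = t // days_per_period
    # multiplicative structure: conditional variance = tau_m * h_t
    r[t] = np.sqrt(tau[m] * h[t]) * rng.standard_normal()
    if t + 1 < T:
        # GARCH recursion on the component-standardized squared return
        h[t + 1] = omega + alpha * r[t] ** 2 / tau[m] + beta * h[t]

print(round(r.var(), 3))
```

The multiplicative structure is what makes the model dynamically complete in the sense of the abstract: given both components, the one-step conditional variance is fully specified, so multi-step-ahead forecasts follow by iterating the two recursions.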