Bent Nielsen
Professor of Econometrics
Nuffield College
My research focuses on the theory of econometric modelling and forecasting. I am currently investigating two areas:
The theoretical properties of algorithms such as the Forward Search and Autometrics;
Cohort models such as the age-period-cohort model and the chain ladder method used to forecast future liabilities in general insurance.
Cumulated sum of squares statistics for nonlinear and nonstationary regressions
February 2020|Scholarly edition
A likelihood approach to Bornhuetter–Ferguson analysis
December 2019|Journal article|Risks
A new Bornhuetter–Ferguson method is suggested herein. This is a variant of the traditional chain ladder method. The actuary can adjust the relative ultimates using externally estimated relative ultimates. These correspond to linear constraints on the Poisson likelihood underpinning the chain ladder method. Adjusted cash flow estimates are obtained as constrained maximum likelihood estimates. The statistical derivation of the new method is provided in the generalised linear model framework. A related approach in the literature, combining unconstrained and constrained maximum likelihood estimates, is presented in the same framework and compared theoretically. A data illustration is provided using a motor portfolio from a Greek insurer.
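As a rough illustration of the Poisson GLM reading of the chain ladder described in the abstract (this is not code from the paper; the toy triangle and variable names are invented, and statsmodels is assumed available):

```python
# Minimal sketch: chain ladder reserving as a Poisson GLM
# (illustrative only; data and names are made up).
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Toy incremental run-off triangle: rows = accident years, cols = development years.
triangle = pd.DataFrame(
    [[100., 60., 30.],
     [110., 70., np.nan],
     [120., np.nan, np.nan]],
)
long = triangle.stack().rename("claims").reset_index()
long.columns = ["accident", "development", "claims"]

# Poisson GLM with accident-year and development-year factors:
# its maximum likelihood forecasts reproduce the classical chain ladder.
model = smf.glm("claims ~ C(accident) + C(development)",
                data=long, family=sm.families.Poisson()).fit()

# Forecast the unobserved lower triangle; the sum is the reserve.
lower = pd.DataFrame(
    [(i, j) for i in range(3) for j in range(3) if i + j > 2],
    columns=["accident", "development"],
)
print(f"Chain ladder reserve estimate: {model.predict(lower).sum():.1f}")
```

The row-plus-column factor structure is what ties the GLM back to the classical development-factor calculation; constraining the accident-year (ultimate) parameters is where a Bornhuetter–Ferguson-type adjustment would enter.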
Analysis of the forward search using some new results for martingales and empirical processes (vol 22, pg 1131, 2016)
November 2019|Journal article|Bernoulli
Partial Cointegrated Vector Autoregressive Models with Structural Breaks in Deterministic Terms
October 2019|Journal article|Econometrics
This paper proposes a class of partial cointegrated models allowing for structural breaks in the deterministic terms. Moving-average representations of the models are given. It is then shown that, under the assumption of martingale difference innovations, the limit distributions of partial quasi-likelihood ratio tests for cointegrating rank have a close connection to those for standard full models. This connection facilitates a response surface analysis that is required to extract critical information about moments from large-scale simulation studies. An empirical illustration of the proposed methodology is also provided.
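The partial models with structural breaks studied in this paper are not available in standard software. Purely as a point of reference, a full-model cointegrating-rank test in the Johansen tradition can be run with statsmodels (a sketch on simulated data, not the paper's procedure):

```python
# Reference point only: standard full-model Johansen rank test,
# not the partial-model tests with breaks developed in the paper.
import numpy as np
from statsmodels.tsa.vector_ar.vecm import coint_johansen

rng = np.random.default_rng(3)
n = 400
common = np.cumsum(rng.normal(size=n))              # one common stochastic trend
y = np.column_stack([common + rng.normal(size=n),
                     common + rng.normal(size=n)])  # two cointegrated series

res = coint_johansen(y, det_order=0, k_ar_diff=1)
print("trace statistics:", res.lr1)                 # rank 1 should be detected
print("5% critical values:", res.cvt[:, 1])
```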
Models where the Least Trimmed Squares and Least Median of Squares estimators are maximum likelihood
September 2019|Working paper
The Least Trimmed Squares (LTS) and Least Median of Squares (LMS) estimators are popular robust regression estimators. The idea behind the estimators is to find, for a given h, a sub-sample of h good observations among n observations and to estimate the regression on that sub-sample. We find models, based on the normal and the uniform distribution respectively, in which these estimators are maximum likelihood. We provide an asymptotic theory for the location-scale case in those models. The LTS estimator is found to be √h-consistent and asymptotically standard normal. The LMS estimator is found to be h-consistent and asymptotically Laplace.
JEL codes: C01, C13. Keywords: Chebychev estimator, LMS, LTS, least squares estimator, normal distribution, regression, robust statistics, uniform distribution
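A minimal sketch of the two estimators in the pure location case (my own illustration, not the paper's implementation): for location, the optimal h-subsample consists of h consecutive order statistics, so a scan over sorted windows suffices.

```python
# Illustrative LTS and LMS location estimators via a window scan
# over order statistics (location case only).
import numpy as np

def lts_location(x, h):
    """Least Trimmed Squares: mean of the h-window with smallest sum of squares."""
    xs = np.sort(x)
    windows = np.lib.stride_tricks.sliding_window_view(xs, h)
    means = windows.mean(axis=1)
    ssr = ((windows - means[:, None]) ** 2).sum(axis=1)
    return means[np.argmin(ssr)]

def lms_location(x, h):
    """Least Median of Squares: midpoint (Chebychev-type) of the shortest h-window."""
    xs = np.sort(x)
    ranges = xs[h - 1:] - xs[: len(xs) - h + 1]
    i = np.argmin(ranges)
    return (xs[i] + xs[i + h - 1]) / 2

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0, 1, 80), rng.normal(10, 1, 20)])  # 20% outliers
print(lts_location(x, h=50), lms_location(x, h=50))
```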
Uniform Consistency of Marked and Weighted Empirical Distributions of Residuals
May 2019|Working paper|Department of Economics Discussion Paper Series
A uniform weak consistency theory is presented for the marked and weighted empirical distribution function of residuals. New and weaker sufficient conditions for uniform consistency are derived. The theory allows for a wide variety of regressors and error distributions. We apply the theory to 1-step Huber-skip estimators. These estimators describe the widespread practice of removing outlying observations from an initial estimation of the model of interest and updating the estimation in a second step by applying least squares to the selected observations. Two results are presented. First, we give new and weaker conditions for consistency of the estimators. Second, we analyse the gauge, which is the rate of false detection of outliers and which can be used to decide the cut-off in the rule for selecting outliers.
Keywords: 1-step Huber-skip, asymptotic theory, empirical processes, gauge, marked and weighted empirical processes, non-stationarity, robust statistics, stationarity
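The 1-step Huber-skip procedure the abstract refers to can be sketched in a few lines (an illustration under textbook assumptions, not the paper's code; the cut-off c is a user choice):

```python
# Illustrative 1-step Huber-skip: initial OLS, drop observations with
# large absolute residuals, re-fit OLS on the retained observations.
import numpy as np

def huber_skip_1step(y, X, c=2.576):
    """c = 2.576 flags roughly 1% of Gaussian observations (two-sided)."""
    beta0, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta0
    sigma = np.sqrt(resid @ resid / (len(y) - X.shape[1]))
    keep = np.abs(resid) <= c * sigma            # outlier selection rule
    beta1, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
    return beta1, keep

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=n)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=n)
y[:10] += 15                                     # contaminate 5% of observations
beta, keep = huber_skip_1step(y, X)
print(beta, "fraction flagged:", 1 - keep.mean())
```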
The analysis of marked and weighted empirical processes of estimated residuals
May 2019|Working paper|Department of Economics Discussion Paper Series
An extended and improved theory is presented for marked and weighted empirical processes of residuals of time series regressions. The theory is motivated by 1-step Huber-skip estimators, where a set of good observations is selected using an initial estimator and an updated estimator is found by applying least squares to the selected observations. In this case, the weights and marks represent powers of the regressors and the regression errors, respectively. The inclusion of marks is a non-trivial extension of previous theory and requires refined martingale arguments.
Keywords: 1-step Huber-skip, non-stationarity, robust statistics, stationarity
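The object of study can be computed directly. A minimal sketch of a marked and weighted empirical distribution of residuals (notation and scaling simplified; the grid, powers, and the simple regression set-up are my own illustration):

```python
# Illustrative marked and weighted empirical distribution of residuals:
# weights are a power of the regressor, marks a power of the residual.
import numpy as np

def marked_weighted_edf(resid, x, grid, weight_pow=1, mark_pow=1):
    """n^{-1} * sum_i x_i^p * resid_i^q * 1{resid_i <= c}, for c in grid."""
    w = x ** weight_pow            # weights: powers of the regressor
    m = resid ** mark_pow          # marks: powers of the residual
    ind = resid[:, None] <= grid[None, :]
    return (w * m) @ ind / len(resid)

rng = np.random.default_rng(2)
n = 500
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)
beta = (x @ y) / (x @ x)           # simple least squares slope
resid = y - beta * x
print(marked_weighted_edf(resid, x, grid=np.linspace(-3, 3, 13)))
```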