
DIFFERENCING TRANSFORMATIONS AND INFERENCE IN PREDICTIVE REGRESSION MODELS

Published online by Cambridge University Press:  09 October 2014

Lorenzo Camponovo*
Affiliation:
University of St. Gallen
*
*Address correspondence to Lorenzo Camponovo, School of Economics and Political Science, Department of Economics, University of St. Gallen, Bodanstrasse 6, CH-9000 St. Gallen, Switzerland; e-mail: Lorenzo.Camponovo@unisg.ch.

Abstract

The limit distribution of conventional test statistics for predictability may depend on the degree of persistence of the predictors. Therefore, diverging results and conclusions may arise because of the different asymptotic theories adopted. Using differencing transformations, we introduce a new class of estimators and test statistics for predictive regression models whose Gaussian limit distribution is insensitive to the degree of persistence of the predictors. This desirable feature allows us to construct Gaussian confidence intervals for the parameter of interest in stationary, nonstationary, and even locally explosive settings. Besides the limit distribution, we also study the efficiency and the rate of convergence of our new class of estimators. We show that the rate of convergence is $\sqrt n $ in stationary cases, while it can be arbitrarily close to $n$ in nonstationary settings, still preserving the Gaussian limit distribution. Monte Carlo simulations confirm the high reliability and accuracy of our test statistics.
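The setting the abstract describes can be illustrated with a minimal simulation. The sketch below assumes a standard predictive regression $y_t = \alpha + \beta x_{t-1} + u_t$ with an AR(1) predictor $x_t = \rho x_{t-1} + v_t$; the function names and the use of the plain OLS t-statistic are illustrative assumptions, not the paper's differencing-based estimator. When $\rho$ is close to one, the finite-sample distribution of this conventional t-statistic drifts away from N(0,1), which is precisely the problem the differencing transformations are designed to avoid.

```python
import numpy as np

# Illustrative sketch (assumed setup, not the paper's method):
# predictive regression  y_t = alpha + beta * x_{t-1} + u_t
# with AR(1) predictor   x_t = rho * x_{t-1} + v_t.

def simulate(n, rho, beta=0.0, seed=0):
    """Simulate (y, lagged x) under the null beta = 0 by default."""
    rng = np.random.default_rng(seed)
    v = rng.standard_normal(n + 1)
    u = rng.standard_normal(n)
    x = np.zeros(n + 1)
    for t in range(1, n + 1):
        x[t] = rho * x[t - 1] + v[t]
    y = beta * x[:-1] + u          # alpha = 0 for simplicity
    return y, x[:-1]

def ols_tstat(y, x):
    """Conventional OLS t-statistic for the slope coefficient."""
    X = np.column_stack([np.ones_like(x), x])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ coef
    s2 = resid @ resid / (len(y) - 2)
    var = s2 * np.linalg.inv(X.T @ X)[1, 1]
    return coef[1] / np.sqrt(var)

# With a highly persistent predictor (rho = 0.99), the t-statistic's
# distribution over many simulated samples is no longer well
# approximated by N(0,1), motivating persistence-robust inference.
t = ols_tstat(*simulate(n=500, rho=0.99))
print(t)
```

Repeating this over many seeds and comparing the empirical quantiles of `t` against standard normal quantiles makes the persistence-induced distortion visible; the paper's differenced estimators are constructed so that this comparison remains valid for any degree of persistence.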

Information

Type
ARTICLES
Copyright
Copyright © Cambridge University Press 2014 
