In a previous article we showed that ordinary least squares with panel corrected standard errors is superior to the Parks generalized least squares approach to the estimation of time-series-cross-section models. In this article we compare our proposed method with another leading technique, Kmenta's “cross-sectionally heteroskedastic and timewise autocorrelated” model. This estimator uses generalized least squares to correct for both panel heteroskedasticity and temporally correlated errors. We argue that it is best to model dynamics via a lagged dependent variable rather than via serially correlated errors. The lagged dependent variable approach makes it easier for researchers to examine dynamics and allows for natural generalizations in a manner that the serially correlated errors approach does not. We also show that the generalized least squares correction for panel heteroskedasticity is, in general, no improvement over ordinary least squares and is, in the presence of parameter heterogeneity, inferior to it. In the conclusion we present a unified method for analyzing time-series-cross-section data.
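The approach advocated above, ordinary least squares on a model with a lagged dependent variable, with panel-corrected standard errors to handle panel heteroskedasticity, can be sketched directly from the standard PCSE formula. The simulation below is illustrative only (the unit count, time span, and coefficient values are invented for the example, not taken from the article): it estimates y_it = ρ·y_i,t−1 + β·x_it + e_it by OLS on a balanced panel and then computes panel-corrected standard errors using the contemporaneous covariance of the OLS residuals.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 10, 40           # hypothetical panel: N units observed over T periods
rho, beta = 0.5, 1.0    # assumed true dynamics and covariate effect

# Simulate a balanced panel with panel-heteroskedastic errors
# (error variance differs across units).
x = rng.normal(size=(N, T))
e = rng.normal(size=(N, T)) * np.linspace(0.5, 2.0, N)[:, None]
y = np.zeros((N, T))
for t in range(1, T):
    y[:, t] = rho * y[:, t - 1] + beta * x[:, t] + e[:, t]

# Stack unit-by-unit, dropping t = 0 where the lag is undefined.
Y = y[:, 1:].ravel()
X = np.column_stack([y[:, :-1].ravel(), x[:, 1:].ravel()])

# Ordinary least squares point estimates.
b, *_ = np.linalg.lstsq(X, Y, rcond=None)
resid = (Y - X @ b).reshape(N, T - 1)

# Panel-corrected standard errors:
#   Sigma_ij = (1/T') sum_t e_it e_jt   (contemporaneous covariance)
#   Omega    = Sigma (kron) I_T'        (balanced-panel case)
#   Cov(b)   = (X'X)^-1 X' Omega X (X'X)^-1
Sigma = resid @ resid.T / (T - 1)
Omega = np.kron(Sigma, np.eye(T - 1))
XtX_inv = np.linalg.inv(X.T @ X)
cov = XtX_inv @ X.T @ Omega @ X @ XtX_inv
pcse = np.sqrt(np.diag(cov))

print("estimates:", b)   # [rho_hat, beta_hat]
print("PCSEs:   ", pcse)
```

Note how the dynamics appear as an ordinary regressor (the lag of y), so the estimated ρ can be inspected, tested, and generalized like any other coefficient, which is the flexibility the serially-correlated-errors formulation lacks.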