
Some Exact Formulae for Autoregressive Moving Average Processes

Published online by Cambridge University Press:  18 October 2010

Victoria Zinde-Walsh*
Affiliation:
McGill University

Abstract

This paper demonstrates that, for a stationary autoregressive moving average process observed over a finite span, the inverse of the covariance matrix differs from the matrix of the covariances of the inverse process by a matrix of low rank. The formula for the exact inverse of the covariance matrix of a scalar or multivariate process is provided. We obtain approximations based on this formula and evaluate some of the approximate results in the existing literature. Applications to computational algorithms and to the distributions of two-step estimators are discussed. In addition, the paper gives the formula for the determinant of the covariance matrix, which is useful in exact maximum likelihood estimation, and it lists expressions for the autocovariances of scalar autoregressive moving average processes.
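For intuition, the following is a minimal numerical sketch, not the paper's exact formula, of the low-rank property stated in the abstract, worked out for the simplest case. It assumes an MA(1) process with unit innovation variance, takes the inverse process to be the corresponding AR(1) process with innovation variance 1/σ² = 1 (the scaling convention is an assumption here), and inspects the difference between the inverse of the covariance matrix and the covariance matrix of the inverse process; the sample size, parameter value, and variable names are illustrative only.

```python
# Illustrative sketch (assumed setup, not the paper's formula):
# MA(1) process x_t = e_t + theta*e_{t-1}, unit innovation variance.
# Its inverse process is taken to be the AR(1) process (1 + theta*L) y_t = u_t
# with innovation variance 1, so gamma_inv(k) = (-theta)^{|k|} / (1 - theta^2).
import numpy as np

n, theta = 50, 0.6

# Covariance matrix of the MA(1) process:
# gamma(0) = 1 + theta^2, gamma(1) = theta, gamma(k) = 0 for |k| > 1.
gamma = np.zeros(n)
gamma[0], gamma[1] = 1.0 + theta**2, theta
Sigma = np.array([[gamma[abs(j - k)] for k in range(n)] for j in range(n)])

# Covariance matrix of the inverse (AR(1)) process under the scaling above.
Sigma_inv_proc = np.array([[(-theta) ** abs(j - k) / (1.0 - theta**2)
                            for k in range(n)] for j in range(n)])

# Difference between the exact inverse and the inverse-process covariance matrix;
# its numerical rank should be small relative to n.
D = np.linalg.inv(Sigma) - Sigma_inv_proc
print("numerical rank of the difference:", np.linalg.matrix_rank(D, tol=1e-8))
print("corner vs. central diagonal entry:", abs(D[0, 0]), abs(D[n // 2, n // 2]))
```

Running this with n = 50 and θ = 0.6 gives a difference whose numerical rank is far below n and whose largest entries sit near the corners of the matrix, which is the pattern the low-rank statement suggests.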

Type
Research Article
Copyright
Copyright © Cambridge University Press 1988 

