Properties of estimators for the parameter of the first order moving average process

Date
1981
Authors
Macpherson, Brian
Department
Statistics
Abstract

The first order moving average time series $Y_t$ is defined by $Y_t = e_t + \beta e_{t-1}$, $t = 1, 2, \ldots$, where the $e_t$ are independent identically distributed random variables and $\beta$ is a constant. For normally distributed $e_t$, maximizing the likelihood function with respect to $\beta$ is approximately equivalent to minimizing the sum of squares function. A technique that can be used to achieve this minimization is the modified Gauss-Newton nonlinear least squares procedure. Details of the Gauss-Newton procedure are presented, and difference equations for the derivatives used in the computations are developed. Alternative initial estimators of $\beta$ and of $e_0$ are discussed.

The approximate bias of the least squares estimator of $\beta$ is obtained. For the model $Y_t = e_t + \beta e_{t-1}$ with $|\beta| < 1$, the bias is $n^{-1}\beta + O(n^{-2})$, and for the model $Y_t = \mu + e_t + \beta e_{t-1}$ with $|\beta| < 1$, the bias is $n^{-1}(2\beta - 1) + O(n^{-2})$.

Most estimation procedures for the parameter of the model restrict the parameter space to the open interval $-1 < \beta < 1$. The nonlinear least squares estimator can be defined on the closed parameter space $-1 \le \beta \le 1$, and it is shown to be consistent for $\beta = \pm 1$.

The results of a Monte Carlo study of estimators of $\beta$ are reported. The empirical mean, variance, and mean square error of the estimates are tabulated for 1,000 realizations at various combinations of parameter value and sample size. Several $t$-statistics are considered and tests of goodness of fit are performed. The nonlinear least squares estimator is found to be substantially more efficient than the other estimators considered, particularly for large $|\beta|$. The empirical bias agrees well with the theoretical bias, particularly for large sample sizes and small values of $|\beta|$. The empirical variance exceeds the large sample variance for all sample size and parameter combinations. The $t$-statistics approximate Student's $t$ distribution only for large sample sizes, and the approximation is closer for $\beta$ near zero.
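To illustrate the estimation procedure described in the abstract, the following is a minimal sketch (in Python with NumPy, not the code used in the thesis) of conditional nonlinear least squares for the MA(1) model, assuming $e_0 = 0$ and using a modified Gauss-Newton iteration with step halving. The function names (gauss_newton_ma1, residuals_and_derivs) and the Monte Carlo settings in the demonstration are illustrative choices rather than those of the thesis; the demonstration simply compares the empirical bias of the estimator with the $n^{-1}\beta$ approximation quoted above.

```python
# Sketch of conditional nonlinear least squares for the MA(1) model
# Y_t = e_t + beta * e_{t-1}, via a modified Gauss-Newton iteration with
# step halving. e_0 is fixed at zero here; the thesis discusses alternative
# starting values for beta and e_0.

import numpy as np


def residuals_and_derivs(y, beta):
    """Recursively compute e_t(beta) and d_t = d e_t / d beta, with e_0 = 0.

    Difference equations:
        e_t = y_t - beta * e_{t-1}
        d_t = -e_{t-1} - beta * d_{t-1}
    """
    n = len(y)
    e = np.zeros(n)
    d = np.zeros(n)
    e_prev, d_prev = 0.0, 0.0
    for t in range(n):
        e[t] = y[t] - beta * e_prev
        d[t] = -e_prev - beta * d_prev
        e_prev, d_prev = e[t], d[t]
    return e, d


def gauss_newton_ma1(y, beta0=0.0, max_iter=50, tol=1e-8):
    """Minimize S(beta) = sum e_t(beta)^2 by modified Gauss-Newton."""
    beta = beta0
    e, d = residuals_and_derivs(y, beta)
    s = e @ e
    for _ in range(max_iter):
        step = -(d @ e) / (d @ d)        # Gauss-Newton step for one parameter
        lam = 1.0                        # step halving until S decreases
        while lam > 1e-10:
            # keep the estimate in the closed interval [-1, 1]
            beta_new = np.clip(beta + lam * step, -1.0, 1.0)
            e_new, d_new = residuals_and_derivs(y, beta_new)
            s_new = e_new @ e_new
            if s_new < s:
                break
            lam /= 2.0
        if abs(beta_new - beta) < tol:
            beta = beta_new
            break
        beta, e, d, s = beta_new, e_new, d_new, s_new
    return beta


if __name__ == "__main__":
    # Small Monte Carlo check of the stated O(1/n) bias approximation
    # n^{-1} * beta for the zero-mean model (illustrative settings only).
    rng = np.random.default_rng(0)
    beta_true, n, reps = 0.5, 100, 1000
    estimates = []
    for _ in range(reps):
        e = rng.standard_normal(n + 1)
        y = e[1:] + beta_true * e[:-1]
        estimates.append(gauss_newton_ma1(y))
    print("empirical bias     :", np.mean(estimates) - beta_true)
    print("n^{-1} beta approx :", beta_true / n)
```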

Copyright
1981