Date of Award
1990
Doctor of Philosophy
Wayne A. Fuller
Assume observations Y_t, defined on a complete probability space (Ω, F, P), are generated by the model Y_t = f_t(θ⁰) + e_t, t = 1, 2, …, where θ⁰ is the true parameter vector lying in a subset Θ of a Euclidean space; f_t is a known function of both ω and θ, with ω ∈ Ω and θ ∈ Θ; f_t(ω, ·) is almost surely twice continuously differentiable with respect to θ in some open, convex neighborhood S of θ⁰; and the e_t are unobservable random variables with zero mean. The e_t can be independent or martingale differences. The sums of squares of the first partial derivatives of f_t(θ) with respect to different components of the parameter θ are permitted to increase at different rates. Also, f_t can be a function of an increasing number of lagged values of the dependent variable Y_t as the index t increases.

Under some regularity conditions it is demonstrated that there is a solution of the least squares equations which is a strongly consistent estimator of θ⁰. Furthermore, the properly normalized estimator has an asymptotic distribution. The limiting distribution of some components of the least squares estimator can be nonnormal. With differential rates, as the sample size n increases, different components of the least squares estimator require different normalizers.

Our theory can be applied to the estimation of regression models with autoregressive errors and to nonlinear models with a time trend or random walks among the explanatory variables. Another example of the application of our theory is the ordinary least squares estimation of the parameters of the autoregressive moving average model of order (1,1) with an autoregressive unit root.
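The differential-rates phenomenon can be seen in a very simple special case: least squares applied to a model with a time trend among the explanatory variables. The following Python sketch is illustrative only, not the dissertation's method; the linear trend model, parameter values, and function names are all assumptions made for the example. The intercept estimator's error shrinks at rate √n while the trend coefficient's error shrinks at rate n^(3/2), so the two components need different normalizers.

```python
import numpy as np

def ols_trend(n, rng):
    # Fit Y_t = a + b*t + e_t by least squares; a linear special case
    # of a model with a time trend regressor (values are illustrative).
    t = np.arange(1.0, n + 1.0)
    y = 1.0 + 0.5 * t + rng.standard_normal(n)
    X = np.column_stack([np.ones(n), t])
    theta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return theta_hat

def mc_sd(n, reps=2000, seed=0):
    # Monte Carlo standard deviation of each estimated component.
    rng = np.random.default_rng(seed)
    est = np.array([ols_trend(n, rng) for _ in range(reps)])
    return est.std(axis=0)
```

Quadrupling n roughly halves the intercept's Monte Carlo standard deviation (rate √n) but divides the slope's by about eight (rate n^(3/2)), so a single normalizer cannot work for both components.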
A small Monte Carlo study of the estimators for the autoregressive moving average model indicates that the large sample results provide a reasonable approximation for sample sizes on the order of 100.
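A Monte Carlo experiment of this general flavor can be sketched as follows. This is a hedged illustration, not a reproduction of the dissertation's study: it simulates an ARMA(1,1) whose autoregressive root is exactly one and regresses Y_t on Y_{t-1} by ordinary least squares; the MA parameter value, replication counts, and function names are assumptions for the example.

```python
import numpy as np

def sim_arma11(n, theta=0.5, rng=None):
    # Y_t = Y_{t-1} + u_t with u_t = e_t + theta*e_{t-1}:
    # an ARMA(1,1) with an autoregressive unit root.
    rng = np.random.default_rng() if rng is None else rng
    e = rng.standard_normal(n + 1)
    u = e[1:] + theta * e[:-1]
    return np.cumsum(u)

def ols_rho(y):
    # OLS slope from regressing Y_t on Y_{t-1} (no intercept).
    y0, y1 = y[:-1], y[1:]
    return float(y0 @ y1 / (y0 @ y0))

def mc(n, reps=1000, seed=1):
    # Monte Carlo mean and spread of the autoregressive estimate.
    rng = np.random.default_rng(seed)
    r = np.array([ols_rho(sim_arma11(n, rng=rng)) for _ in range(reps)])
    return r.mean(), r.std()
```

In such simulations the estimate piles up near unity, and its spread shrinks roughly like 1/n rather than 1/√n, consistent with a nonnormal limiting distribution for n(ρ̂ − 1).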
Digital Repository @ Iowa State University, http://lib.dr.iastate.edu/
Sarkar, Sahadeb, "Nonlinear least squares estimators with differential rates of convergence" (1990). Retrospective Theses and Dissertations. 9463.