Time Series Analysis and Examples
Consider the univariate AR($p$) process
$$ y_t = \alpha_0 + \sum_{i=1}^{p}\alpha_i y_{t-i} + \epsilon_t $$
Define the design matrix $\mathbf{X}$:
$$ \mathbf{X} = \begin{bmatrix} 1 & y_p & \cdots & y_1 \\ \vdots & \vdots & & \vdots \\ 1 & y_{T-1} & \cdots & y_{T-p} \end{bmatrix} $$
Let $\mathbf{y} = (y_{p+1},\ldots,y_T)'$. The least squares estimate,
$$ \hat{\mathbf{a}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y} $$
is the approximation to the maximum likelihood estimate of $\mathbf{a} = (\alpha_0,\alpha_1,\ldots,\alpha_p)'$ if the $\epsilon_t$ are assumed to be Gaussian error disturbances. Combining $\mathbf{X}$ and $\mathbf{y}$ as
$$ \mathbf{Z} = [\mathbf{X}\;\vdots\;\mathbf{y}] $$
the $\mathbf{Z}$ matrix can be decomposed as
$$ \mathbf{Z} = \mathbf{Q}\mathbf{U} = \mathbf{Q}\begin{bmatrix} \mathbf{R} & \mathbf{w}_1 \\ \mathbf{0} & w_2 \end{bmatrix} $$
where $\mathbf{Q}$ is an orthogonal matrix and $\mathbf{R}$ is an upper triangular matrix, $\mathbf{w}_1$ is a $(p+1)\times 1$ vector, and $w_2$ is a scalar.
The least squares estimate that uses the Householder transformation is computed by solving the linear system
$$ \mathbf{R}\hat{\mathbf{a}} = \mathbf{w}_1 $$
The unbiased residual variance estimate is
$$ \hat{\sigma}^2 = \frac{1}{T-p}\,w_2^2 $$
and
$$ \mathrm{AIC} = (T-p)\log(\hat{\sigma}^2) + 2(p+1) $$
In practice, least squares estimation does not require the orthogonal matrix $\mathbf{Q}$. The TIMSAC subroutines compute the upper triangular matrix $\mathbf{R}$ without computing the matrix $\mathbf{Q}$.
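The recursion above can be sketched in numpy. This is a minimal illustration, not the TIMSAC implementation: the AR(2) series is simulated, and `np.linalg.qr(..., mode='r')` plays the role of the Householder reduction that returns only the upper triangular factor, so $\mathbf{Q}$ is never formed.

```python
import numpy as np

# Illustrative sketch of the Householder-based least squares fit above.
# The simulated AR(2) series and its coefficients are assumptions;
# p, T, X, w_1, and w_2 follow the notation of the text.
rng = np.random.default_rng(0)
p, T = 2, 200
alpha = np.array([0.5, 0.6, -0.3])            # alpha_0, alpha_1, alpha_2
y = np.zeros(T)
for t in range(p, T):
    y[t] = alpha[0] + alpha[1] * y[t-1] + alpha[2] * y[t-2] + rng.normal()

# Design matrix X (rows t = p+1, ..., T) and response vector
X = np.column_stack([np.ones(T - p)] +
                    [y[p - i:T - i] for i in range(1, p + 1)])
yv = y[p:]

# Householder QR of Z = [X | y]; mode='r' returns only the upper
# triangular factor U, so Q is never computed.
U = np.linalg.qr(np.column_stack([X, yv]), mode='r')
R, w1, w2 = U[:p + 1, :p + 1], U[:p + 1, -1], U[p + 1, -1]

a_hat = np.linalg.solve(R, w1)                # solve R a-hat = w_1
sigma2_hat = w2 ** 2 / (T - p)                # residual variance estimate
```

Because $\mathbf{Q}$ is orthogonal, $w_2^2$ equals the residual sum of squares of the fit, which is why the variance estimate needs only the last element of $\mathbf{U}$.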
Consider the additive time series model
$$ y_t = T_t + S_t + \epsilon_t, \qquad \epsilon_t \sim N(0,\sigma^2) $$
Practically, it is not possible to estimate the parameters $\mathbf{a} = (T_1,\ldots,T_T,S_1,\ldots,S_T)'$, since the number of parameters exceeds the number of available observations.
Let $\nabla_L^m$ denote the seasonal difference operator with $L$ seasons and degree $m$; that is, $\nabla_L^m = (1-\mathrm{B}^L)^m$. Suppose that $T = L \cdot n$. Some constraints on the trend and seasonal components need to be imposed such that the sum of squares of $\nabla^k T_t$, $\nabla_L^m S_t$, and $\sum_{i=0}^{L-1} S_{t-i}$ is small. The constrained least squares estimates are obtained by minimizing
$$ \sum_{t=1}^{T}\left\{ (y_t - T_t - S_t)^2 + d^2\left[ s^2(\nabla^k T_t)^2 + (\nabla_L^m S_t)^2 + z^2\Bigl(\sum_{i=0}^{L-1} S_{t-i}\Bigr)^2 \right] \right\} $$
Using matrix notation,
$$ \min_{\mathbf{a}}\; \|\mathbf{y} - \mathbf{M}\mathbf{a}\|^2 + (\mathbf{a}-\mathbf{a}_0)'\mathbf{D}'\mathbf{D}(\mathbf{a}-\mathbf{a}_0) $$
where $\mathbf{M} = [\mathbf{I}_T\;\vdots\;\mathbf{I}_T]$, $\mathbf{y} = (y_1,\ldots,y_T)'$, and $\mathbf{a}_0$ is the initial guess of $\mathbf{a}$. The matrix $\mathbf{D}$ is a $3T \times 2T$ control matrix whose structure varies according to the order of differencing in trend and season.
where
$$ \mathbf{D} = d\begin{bmatrix} s\mathbf{G}_k & \mathbf{0} \\ \mathbf{0} & \mathbf{E}_m \\ \mathbf{0} & z\mathbf{F} \end{bmatrix}, \qquad \mathbf{E}_m = \mathbf{C}_m \otimes \mathbf{I}_L $$
and $\mathbf{F}$ is the $T \times T$ matrix representation of the seasonal summation operator $\sum_{i=0}^{L-1}\mathrm{B}^i$. The $\mathbf{G}_k$ matrix is the matrix representation of the difference operator $\nabla^k$ and has the same structure as the matrix $\mathbf{C}_m$, and $\mathbf{I}_L$ is the $L \times L$ identity matrix.
The solution of the constrained least squares method is equivalent to that of maximizing the function
$$ L(\mathbf{a}) = \exp\left\{-\frac{1}{2\sigma^2}(\mathbf{y}-\mathbf{M}\mathbf{a})'(\mathbf{y}-\mathbf{M}\mathbf{a})\right\} \exp\left\{-\frac{1}{2\sigma^2}(\mathbf{a}-\mathbf{a}_0)'\mathbf{D}'\mathbf{D}(\mathbf{a}-\mathbf{a}_0)\right\} $$
Therefore, the PDF of the data $\mathbf{y}$ is
$$ f(\mathbf{y}\mid\sigma^2,\mathbf{a}) = (2\pi)^{-T/2}\,\sigma^{-T}\exp\left\{-\frac{1}{2\sigma^2}(\mathbf{y}-\mathbf{M}\mathbf{a})'(\mathbf{y}-\mathbf{M}\mathbf{a})\right\} $$
The prior PDF of the parameter vector $\mathbf{a}$ is
$$ \pi(\mathbf{a}\mid\mathbf{D},\sigma^2,\mathbf{a}_0) = (2\pi)^{-T}\,\sigma^{-2T}\,|\mathbf{D}'\mathbf{D}|^{1/2}\exp\left\{-\frac{1}{2\sigma^2}(\mathbf{a}-\mathbf{a}_0)'\mathbf{D}'\mathbf{D}(\mathbf{a}-\mathbf{a}_0)\right\} $$
When the constant $d$ is known, the estimate $\hat{\mathbf{a}}$ of $\mathbf{a}$ is the mean of the posterior distribution, where the posterior PDF of the parameter $\mathbf{a}$ is proportional to the function $L(\mathbf{a})$. It is obvious that $\hat{\mathbf{a}}$ is the minimizer of $\|g(\mathbf{a}\mid d)\|^2$, where
$$ g(\mathbf{a}\mid d) = \begin{bmatrix} \mathbf{y} - \mathbf{M}\mathbf{a} \\ \mathbf{D}(\mathbf{a}-\mathbf{a}_0) \end{bmatrix} $$
The value of $d$ is determined by the minimum ABIC procedure. The ABIC is defined as
$$ \mathrm{ABIC} = T\log\left(\frac{1}{T}\,\|g(\hat{\mathbf{a}}\mid d)\|^2\right) + \log|\mathbf{M}'\mathbf{M} + \mathbf{D}'\mathbf{D}| - \log|\mathbf{D}'\mathbf{D}| $$
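The minimum ABIC search can be sketched as a grid search over $d$. This is an illustrative implementation, not the TIMSAC one: the quarterly series ($L=4$), the orders $k=2$ and $m=1$, the weights $s=z=1$, the zero initial guess $\mathbf{a}_0 = \mathbf{0}$, and the candidate grid for $d$ are all assumptions.

```python
import numpy as np

# Hedged sketch of choosing d by minimum ABIC for the constrained
# least squares trend/seasonal decomposition above.
rng = np.random.default_rng(1)
L, n = 4, 20
T = L * n
t = np.arange(T)
y = 0.002 * t**2 + np.tile([1.0, -0.5, 0.3, -0.8], n) \
    + 0.3 * rng.normal(size=T)

step = lambda lag: np.eye(T) - np.eye(T, k=-lag)
Gk = np.linalg.matrix_power(step(1), 2)       # (1-B)^2, trend penalty
Em = step(L)                                  # (1-B^L), seasonal penalty
F = sum(np.eye(T, k=-i) for i in range(L))    # sum_{i=0}^{L-1} B^i

M = np.hstack([np.eye(T), np.eye(T)])         # y = T_t + S_t + noise
Z = np.zeros((T, T))

def abic(d):
    D = d * np.block([[Gk, Z], [Z, Em], [Z, F]])
    A = np.vstack([M, D])
    b = np.concatenate([y, np.zeros(3 * T)])  # initial guess a_0 = 0
    a_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
    g2 = np.sum((b - A @ a_hat) ** 2)         # ||g(a-hat | d)||^2
    ld_post = np.linalg.slogdet(M.T @ M + D.T @ D)[1]
    ld_prior = np.linalg.slogdet(D.T @ D)[1]
    return T * np.log(g2 / T) + ld_post - ld_prior

grid = [0.25, 0.5, 1.0, 2.0, 4.0]
d_best = min(grid, key=abic)
```

The difference matrices here keep their initial-condition rows, so each block is lower triangular with a unit diagonal and $\mathbf{D}'\mathbf{D}$ stays nonsingular, which the $\log|\mathbf{D}'\mathbf{D}|$ term requires.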
In this section, the mathematical formulas for state space
modeling are introduced. The Kalman filter algorithms are
derived from the state space model. As an example, the state
space model of the TSDECOMP subroutine is formulated.
Define the following state space model:
$$ \mathbf{x}_t = \mathbf{F}\mathbf{x}_{t-1} + \mathbf{G}\mathbf{w}_t $$
$$ y_t = \mathbf{H}_t\mathbf{x}_t + \epsilon_t $$
where $\epsilon_t \sim N(0,\sigma^2)$ and $\mathbf{w}_t \sim N(\mathbf{0},\mathbf{Q})$.
If the observations, $(y_1,\ldots,y_T)$, and the initial conditions, $\mathbf{x}_{0|0}$ and $\mathbf{P}_{0|0}$, are available, the one-step predictor $\mathbf{x}_{t|t-1}$ of the state vector $\mathbf{x}_t$ and its mean square error (MSE) matrix $\mathbf{P}_{t|t-1}$ are written as
$$ \mathbf{x}_{t|t-1} = \mathbf{F}\mathbf{x}_{t-1|t-1} $$
$$ \mathbf{P}_{t|t-1} = \mathbf{F}\mathbf{P}_{t-1|t-1}\mathbf{F}' + \mathbf{G}\mathbf{Q}\mathbf{G}' $$
Using the current observation, the filtered value of $\mathbf{x}_t$ and its variance $\mathbf{P}_{t|t}$ are updated:
$$ \mathbf{x}_{t|t} = \mathbf{x}_{t|t-1} + \mathbf{K}_t e_t $$
$$ \mathbf{P}_{t|t} = (\mathbf{I} - \mathbf{K}_t\mathbf{H}_t)\mathbf{P}_{t|t-1} $$
where $e_t = y_t - \mathbf{H}_t\mathbf{x}_{t|t-1}$ and
$$ \mathbf{K}_t = \mathbf{P}_{t|t-1}\mathbf{H}_t'\left[\mathbf{H}_t\mathbf{P}_{t|t-1}\mathbf{H}_t' + \sigma^2\right]^{-1} $$
The log-likelihood function is computed as
$$ \ell = -\frac{1}{2}\sum_{t=1}^{T}\log(2\pi v_{t|t-1}) - \sum_{t=1}^{T}\frac{e_t^2}{2\,v_{t|t-1}} $$
where $v_{t|t-1} = \mathbf{H}_t\mathbf{P}_{t|t-1}\mathbf{H}_t' + \sigma^2$ is the conditional variance of the one-step prediction error $e_t$.
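The prediction, update, and log-likelihood recursions above can be sketched directly in numpy. This is a minimal illustration for a scalar observation; the local-level model ($\mathbf{F} = \mathbf{G} = \mathbf{H} = 1$) and the noise variances are assumptions chosen only to exercise the recursion.

```python
import numpy as np

# Minimal sketch of the Kalman filter recursions above.
def kalman_loglik(y, F, G, Q, H, sigma2, x0, P0):
    x = np.asarray(x0, float).reshape(-1, 1)
    P = np.asarray(P0, float)
    n = x.shape[0]
    ll = 0.0
    for yt in y:
        # one-step prediction of the state and its MSE matrix
        x = F @ x
        P = F @ P @ F.T + G @ Q @ G.T
        # innovation e_t and its conditional variance v_{t|t-1}
        e = (yt - H @ x).item()
        v = (H @ P @ H.T).item() + sigma2
        K = P @ H.T / v                       # Kalman gain K_t
        # measurement update of the filtered state and variance
        x = x + K * e
        P = (np.eye(n) - K @ H) @ P
        ll += -0.5 * (np.log(2 * np.pi * v) + e * e / v)
    return ll

rng = np.random.default_rng(2)
F = G = H = np.eye(1)
Q = np.array([[0.1]])
level = np.cumsum(rng.normal(0.0, np.sqrt(0.1), size=100))
y = level + rng.normal(0.0, 1.0, size=100)
ll = kalman_loglik(y, F, G, Q, H, sigma2=1.0,
                   x0=np.zeros(1), P0=np.eye(1))
```

Maximizing this log-likelihood over the unknown variances is how the filter supports parameter estimation in the state space framework.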
Consider the additive time series decomposition
$$ y_t = T_t + S_t + TD_t + u_t + \mathbf{x}_t'\boldsymbol{\beta}_t + \epsilon_t $$
where $\mathbf{x}_t$ is a $(K\times 1)$ regressor vector and $\boldsymbol{\beta}_t$ is a $(K\times 1)$ time-varying coefficient vector. Each component has the following constraints:
$$ \nabla^k T_t = w_{1t}, \qquad w_{1t} \sim N(0,\tau_1^2) $$
$$ \nabla_L^m S_t = w_{2t}, \qquad w_{2t} \sim N(0,\tau_2^2) $$
$$ u_t = \sum_{i=1}^{p}\alpha_i u_{t-i} + w_{3t}, \qquad w_{3t} \sim N(0,\tau_3^2) $$
$$ \beta_{jt} = \beta_{j,t-1} + w_{(3+j)t}, \qquad w_{(3+j)t} \sim N(0,\tau_{3+j}^2), \quad j=1,\ldots,K $$
$$ TD_t = \sum_{i=1}^{6}\gamma_{it}\,\bigl(D_t(i) - D_t(7)\bigr) $$
where $\nabla^k = (1-\mathrm{B})^k$ and $\nabla_L^m = (1-\mathrm{B}^L)^m$.
The AR component $u_t$ is assumed to be stationary. The trading-day component $D_t(i)$ represents the number of the $i$th day of the week in time $t$.
If $k = 2$, $p = 2$, $m = 1$, and $L = 12$ (monthly data), the state vector is defined as
$$ \mathbf{z}_t = (T_t, T_{t-1}, S_t, S_{t-1}, \ldots, S_{t-11}, u_t, u_{t-1}, \gamma_{1t}, \ldots, \gamma_{6t})' $$
The matrix $\mathbf{F}$ is
$$ \mathbf{F} = \mathrm{block\,diag}(\mathbf{F}_1, \mathbf{F}_2, \mathbf{F}_3, \mathbf{F}_4) $$
where
$$ \mathbf{F}_1 = \begin{bmatrix} 2 & -1 \\ 1 & 0 \end{bmatrix}, \qquad \mathbf{F}_2 = \begin{bmatrix} 0 & \cdots & 0 & 1 \\ & \mathbf{I}_{11} & & \mathbf{0} \end{bmatrix}, \qquad \mathbf{F}_3 = \begin{bmatrix} \alpha_1 & \alpha_2 \\ 1 & 0 \end{bmatrix}, \qquad \mathbf{F}_4 = \mathbf{I}_6 $$
The matrix $\mathbf{G}$ can be denoted as
$$ \mathbf{G} = \begin{bmatrix} \mathbf{g}_1 & \mathbf{0} & \mathbf{0} \\ \mathbf{0} & \mathbf{g}_2 & \mathbf{0} \\ \mathbf{0} & \mathbf{0} & \mathbf{g}_3 \\ \mathbf{0} & \mathbf{0} & \mathbf{0} \end{bmatrix} $$
where $\mathbf{g}_1 = \mathbf{g}_3 = (1\;\;0)'$ and $\mathbf{g}_2 = (1\;\;0\;\cdots\;0)'$ is a $12 \times 1$ vector.
Finally, the matrix $\mathbf{H}_t$ is time-varying,
$$ \mathbf{H}_t = (1\;\;0\;\;\;1\;\;0\;\cdots\;0\;\;\;1\;\;0\;\;\;\mathbf{h}_t') $$
where $\mathbf{h}_t = \bigl(D_t(1)-D_t(7),\;\ldots,\;D_t(6)-D_t(7)\bigr)'$.
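Assembling these block matrices is mechanical and easy to get wrong by one index, so a small numpy sketch may help. It follows the $k=2$, $p=2$, $m=1$, $L=12$ layout reconstructed above; the AR(2) coefficients and the weekday counts $D_t(i)$ are illustrative assumptions, not values from the text.

```python
import numpy as np

# Hedged sketch: building F, G, and one row of H_t for the
# TSDECOMP-style state space model above (state dimension 22).
def block_diag(*blocks):
    """Stack matrices along the diagonal, zeros elsewhere."""
    rows = sum(b.shape[0] for b in blocks)
    cols = sum(b.shape[1] for b in blocks)
    out = np.zeros((rows, cols))
    r = c = 0
    for b in blocks:
        out[r:r + b.shape[0], c:c + b.shape[1]] = b
        r += b.shape[0]
        c += b.shape[1]
    return out

alpha1, alpha2 = 0.6, -0.2                    # assumed AR coefficients

F1 = np.array([[2.0, -1.0], [1.0, 0.0]])      # trend: (1-B)^2 T_t = w_1t
F2 = np.zeros((12, 12))                       # season: (1-B^12) S_t = w_2t
F2[0, 11] = 1.0
F2[1:, :-1] = np.eye(11)
F3 = np.array([[alpha1, alpha2], [1.0, 0.0]]) # stationary AR(2) component
F4 = np.eye(6)                                # constant trading-day gammas
F = block_diag(F1, F2, F3, F4)                # 22 x 22 transition matrix

G = np.zeros((22, 3))                         # one shock per component
G[0, 0] = 1.0                                 # w_1t enters T_t
G[2, 1] = 1.0                                 # w_2t enters S_t
G[14, 2] = 1.0                                # w_3t enters u_t

def H_t(D):
    """Observation row; D[i-1] = count of the i-th weekday in month t."""
    h = D[:6] - D[6]                          # D_t(i) - D_t(7), i = 1..6
    return np.concatenate([[1.0, 0.0],
                           [1.0] + [0.0] * 11,
                           [1.0, 0.0], h])

D_example = np.array([4, 4, 4, 4, 5, 5, 5], float)  # assumed day counts
H = H_t(D_example)                            # length-22 row of H_t
```

Only `H_t` changes with $t$, through the weekday counts; `F` and `G` are fixed, which is what makes the Kalman recursions of the previous section directly applicable.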