ARMA and ARIMA
Univariate Time Series Models
• Where we attempt to predict returns using only information contained in their
past values.
Some Notation and Concepts
• A Strictly Stationary Process
A strictly stationary process is one where
P{y_{t_1} ≤ b_1, ..., y_{t_n} ≤ b_n} = P{y_{t_1+m} ≤ b_1, ..., y_{t_n+m} ≤ b_n}
i.e. the probability measure for the sequence {y_t} is the same as that for {y_{t+m}} ∀ m.
• A Weakly Stationary Process
If a series satisfies the next three equations, it is said to be weakly or covariance
stationary:
1. E(y_t) = µ, t = 1, 2, ..., ∞
2. E[(y_t − µ)(y_t − µ)] = σ² < ∞
3. E[(y_{t_1} − µ)(y_{t_2} − µ)] = γ_{t_2 − t_1} ∀ t_1, t_2
Univariate Time Series Models (cont’d)
• So if the process is covariance stationary, all the variances are the same and all
the covariances depend only on the difference between t_1 and t_2. The moments
E[(y_t − E(y_t))(y_{t+s} − E(y_{t+s}))] = γ_s, s = 0, 1, 2, ...
are known as the covariance function.
• The covariances, γ_s, are known as autocovariances.
• However, the value of the autocovariances depends on the units of measurement
of y_t.
• It is thus more convenient to use the autocorrelations, which are the
autocovariances normalised by dividing by the variance:
τ_s = γ_s / γ_0, s = 0, 1, 2, ...
• If we plot τ_s against s = 0, 1, 2, ... then we obtain the autocorrelation function or
correlogram.
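As a minimal sketch (not part of the original slides), the sample correlogram can be computed directly from the definition τ_s = γ_s / γ_0; the function name `sample_acf` is our own:

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelations tau_s = gamma_s / gamma_0 for s = 0, ..., max_lag."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    y_dem = y - y.mean()                        # demean the series
    gamma0 = np.sum(y_dem**2) / T               # sample variance (gamma_0)
    # gamma_s is the average of products of observations s periods apart
    gammas = [np.sum(y_dem[s:] * y_dem[:T - s]) / T for s in range(max_lag + 1)]
    return np.array(gammas) / gamma0

# For iid noise the correlogram should be near zero at all lags s > 0
rng = np.random.default_rng(42)
noise = rng.standard_normal(500)
print(sample_acf(noise, 5))   # first entry is always 1 (tau_0 = gamma_0 / gamma_0)
```

Note that τ_0 = 1 by construction, which is why the correlogram is usually plotted from lag 1 onwards.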
A White Noise Process
• A white noise process is one with (virtually) no discernible structure. A
definition of a white noise process is
E(y_t) = µ
Var(y_t) = σ²
γ_{t−r} = σ² if t = r, and 0 otherwise
• Thus the autocorrelation function will be zero apart from a single peak of 1
at s = 0. Approximately, τ̂_s ∼ N(0, 1/T), where T = sample size.
• We can use this to do significance tests for the autocorrelation coefficients
by constructing a confidence interval.
• For example, a 95% confidence interval would be given by ±1.96 × 1/√T. If
the sample autocorrelation coefficient, τ̂_s, falls outside this region for any
value of s, then we reject the null hypothesis that the true value of the
coefficient at lag s is zero.
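The ±1.96/√T band can be illustrated against simulated white noise (a sketch, not from the slides; variable names are our own):

```python
import numpy as np

rng = np.random.default_rng(0)
T = 100
y = rng.standard_normal(T)            # white noise: no structure beyond lag 0
y_dem = y - y.mean()
gamma0 = np.sum(y_dem**2) / T

band = 1.96 / np.sqrt(T)              # 95% bound under H0: tau_s = 0 (here 0.196)
for s in range(1, 6):
    tau_hat = np.sum(y_dem[s:] * y_dem[:-s]) / T / gamma0
    outside = abs(tau_hat) > band
    print(f"lag {s}: tau_hat = {tau_hat:+.3f}, outside +/-{band:.3f}? {outside}")
```

With T = 100 the band is exactly ±0.196, which is the bound used in the worked example below.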
Joint Hypothesis Tests
• We can also test the joint hypothesis that all m of the τ_k correlation coefficients
are simultaneously equal to zero using the Q-statistic developed by Box and
Pierce:
Q = T Σ_{k=1}^{m} τ_k²
where T = sample size, m = maximum lag length.
• The Q-statistic is asymptotically distributed as a χ²_m.
• However, the Box–Pierce test has poor small sample properties, so a variant
has been developed, called the Ljung–Box statistic:
Q* = T(T + 2) Σ_{k=1}^{m} τ_k² / (T − k) ∼ χ²_m
• This statistic is very useful as a portmanteau (general) test of linear dependence
in time series.
An ACF Example
• Question:
Suppose that a researcher had estimated the first 5 autocorrelation coefficients
using a series of length 100 observations, and found them to be (from 1 to 5):
0.207, -0.013, 0.086, 0.005, -0.022.
Test each of the individual coefficients for significance, and use both the Box–
Pierce and Ljung–Box tests to establish whether they are jointly significant.
• Solution:
A coefficient would be significant if it lies outside (−0.196, +0.196) at the 5%
level (since 1.96/√100 = 0.196), so only the first autocorrelation coefficient is
significant.
Q = 5.09 and Q* = 5.26
Compared with a tabulated χ²(5) = 11.1 at the 5% level, so the 5 coefficients
are jointly insignificant.
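The arithmetic in this solution can be checked directly from the two formulas (a small sketch, not part of the original slides):

```python
# Worked example: T = 100, first five sample autocorrelations
tau = [0.207, -0.013, 0.086, 0.005, -0.022]
T = 100

# Box-Pierce: Q = T * sum of squared autocorrelations
Q = T * sum(t**2 for t in tau)

# Ljung-Box: Q* = T(T+2) * sum of tau_k^2 / (T - k)
Q_star = T * (T + 2) * sum(t**2 / (T - k) for k, t in enumerate(tau, start=1))

print(f"Q  = {Q:.2f}")        # 5.09
print(f"Q* = {Q_star:.2f}")   # 5.26
print("jointly significant at 5%?", Q_star > 11.07)   # chi2(5) critical value
```

Both statistics fall well below the χ²(5) critical value of about 11.07, confirming joint insignificance.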
Moving Average Processes
• Let u_t (t = 1, 2, 3, ...) be a sequence of independently and identically
distributed (iid) random variables with E(u_t) = 0 and Var(u_t) = σ², then
y_t = µ + u_t + θ_1 u_{t−1} + θ_2 u_{t−2} + ... + θ_q u_{t−q}
is a qth order moving average model, MA(q).
• Its properties are
E(y_t) = µ;  Var(y_t) = γ_0 = (1 + θ_1² + θ_2² + ... + θ_q²)σ²
Covariances:
γ_s = (θ_s + θ_{s+1}θ_1 + θ_{s+2}θ_2 + ... + θ_q θ_{q−s})σ²  for s = 1, 2, ..., q
γ_s = 0  for s > q
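The covariance formula above translates directly into code. A sketch (the function name `ma_acf` is our own; σ² cancels in the autocorrelations, so it is set to 1):

```python
import numpy as np

def ma_acf(theta, max_lag):
    """Theoretical autocorrelations of an MA(q) process.

    Uses gamma_s = (theta_s + theta_{s+1}*theta_1 + ... + theta_q*theta_{q-s}) * sigma^2
    with theta_0 = 1, and gamma_s = 0 for s > q.  sigma^2 cancels in tau_s = gamma_s/gamma_0.
    """
    th = np.r_[1.0, theta]                  # prepend theta_0 = 1
    q = len(th) - 1
    gammas = [np.sum(th[s:] * th[:q + 1 - s]) if s <= q else 0.0
              for s in range(max_lag + 1)]
    return np.array(gammas) / gammas[0]

# Example: the MA(2) from the problem below, theta_1 = -0.5, theta_2 = 0.25
print(ma_acf([-0.5, 0.25], 4))
```

The autocorrelations cut off to exactly zero beyond lag q, which is the distinguishing feature of an MA(q) correlogram.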
Example of an MA Problem
1. Consider the following MA(2) process:
X_t = u_t + θ_1 u_{t−1} + θ_2 u_{t−2}
where u_t is a zero mean white noise process with variance σ².
(i) Calculate the mean and variance of X_t
(ii) Derive the autocorrelation function for this process (i.e. express the
autocorrelations, τ_1, τ_2, ... as functions of the parameters θ_1 and θ_2).
(iii) If θ_1 = −0.5 and θ_2 = 0.25, sketch the acf of X_t.
Solution
(i) If E(u_t) = 0, then E(u_{t−i}) = 0 ∀ i.
So
E(X_t) = E(u_t + θ_1 u_{t−1} + θ_2 u_{t−2}) = E(u_t) + θ_1 E(u_{t−1}) + θ_2 E(u_{t−2}) = 0
Var(X_t) = E[X_t − E(X_t)][X_t − E(X_t)]
but E(X_t) = 0, so
Var(X_t) = E[(X_t)(X_t)]
= E[(u_t + θ_1 u_{t−1} + θ_2 u_{t−2})(u_t + θ_1 u_{t−1} + θ_2 u_{t−2})]
= E[u_t² + θ_1² u_{t−1}² + θ_2² u_{t−2}² + cross-products]
But E[cross-products] = 0 since Cov(u_t, u_{t−s}) = 0 for s ≠ 0.
Solution (cont’d)
So Var(X_t) = γ_0 = E[u_t² + θ_1² u_{t−1}² + θ_2² u_{t−2}²]
= σ² + θ_1²σ² + θ_2²σ²
= (1 + θ_1² + θ_2²)σ²
(ii) The acf of X_t:
γ_1 = E[X_t − E(X_t)][X_{t−1} − E(X_{t−1})]
= E[X_t][X_{t−1}]
= E[(u_t + θ_1 u_{t−1} + θ_2 u_{t−2})(u_{t−1} + θ_1 u_{t−2} + θ_2 u_{t−3})]
= E[θ_1 u_{t−1}² + θ_1θ_2 u_{t−2}²]
= θ_1σ² + θ_1θ_2σ²
= (θ_1 + θ_1θ_2)σ²