Stochastic models - time series.
Random process: an infinite collection of consistent distributions (the probabilities exist).
Random function: a family of random variables, e.g. {Y(t), t in Z}.
Specified if given F(y1,...,yn; t1,...,tn) = Prob{Y(t1) ≤ y1, ..., Y(tn) ≤ yn} that are
symmetric: F(yπ(1),...,yπ(n); tπ(1),...,tπ(n)) = F(y1,...,yn; t1,...,tn), π a permutation
compatible: F(y1,...,ym, ∞,...,∞; t1,...,tm, tm+1,...,tn) = F(y1,...,ym; t1,...,tm)
Finite dimensional distributions
First-order: F(y;t) = Prob{Y(t) ≤ y}
Second-order: F(y1,y2;t1,t2) = Prob{Y(t1) ≤ y1 and Y(t2) ≤ y2}
and so on
Other methods
i) Y(t;ω), ω: random variable
ii) urn model
iii) probability on function space
iv) analytic formula, e.g. Y(t) = cos(λt + φ), λ: fixed, φ: uniform on (-π, π]
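To make iv) concrete, a minimal NumPy sketch (the frequency value 0.3, the stretch length 100, and the seed are arbitrary, illustrative choices): each realization is a deterministic cosine whose phase is a single uniform draw.

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 0.3                       # fixed frequency lambda (illustrative value)
t = np.arange(100)              # a finite stretch of the index t in Z

# Three realizations of Y(t) = cos(lambda*t + phi), phi ~ Uniform(-pi, pi]
for _ in range(3):
    phi = rng.uniform(-np.pi, np.pi)
    y = np.cos(lam * t + phi)
    print(np.round(y[:5], 3))
```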
There may be densities. The Y(t) may be discrete, angles, proportions, ...
Kolmogorov extension theorem: to specify a stochastic process, give the distribution of every finite subset {Y(t1),...,Y(tn)} in a consistent way, for t1,...,tn in the index set.
Moment functions. Mean function:
cY(t) = E{Y(t)} = ∫ y dF(y;t)
= ∫ y f(y;t) dy if continuous
= Σj yj f(yj;t) if discrete
Linearity: E{α1 Y1(t) + α2 Y2(t)} = α1 c1(t) + α2 c2(t); vector-valued case
Mean level, signal plus noise: Y(t) = S(t) + ε(t), S(.): fixed
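A minimal check of the mean function, reusing the cosine process above (all parameter values are illustrative): averaging many independent realizations at each t approximates cY(t) = E{Y(t)}, which is 0 here because the phase is uniform on (-π, π].

```python
import numpy as np

rng = np.random.default_rng(1)
lam, n, reps = 0.3, 50, 10000
t = np.arange(n)

# One realization of cos(lambda*t + phi) per row; the column means
# estimate the mean function cY(t) = E{Y(t)} = 0 for every t.
phis = rng.uniform(-np.pi, np.pi, size=reps)
Y = np.cos(lam * t[None, :] + phis[:, None])     # shape (reps, n)
print(np.abs(Y.mean(axis=0)).max())              # small, up to Monte Carlo error
```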
Second moments. Autocovariance function:
cYY(s,t) = cov{Y(s),Y(t)} = E{Y(s)Y(t)} - E{Y(s)}E{Y(t)}
non-negative definite: Σj Σk αj αk cYY(tj, tk) ≥ 0 for scalars αj
Crosscovariance function: c12(s,t) = cov{Y1(s),Y2(t)}
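A sketch of the autocovariance definition on the same cosine process (parameters and seed are illustrative): np.cov across independent realizations estimates cov{Y(s),Y(t)}, which for this process equals (1/2) cos(λ(t-s)).

```python
import numpy as np

rng = np.random.default_rng(2)
lam, n, reps = 0.3, 50, 20000
t = np.arange(n)
phis = rng.uniform(-np.pi, np.pi, size=reps)
Y = np.cos(lam * t[None, :] + phis[:, None])     # (reps, n): one realization per row

# Sample autocovariance matrix: entry (s, t) estimates cYY(s, t) = cov{Y(s), Y(t)}.
C = np.cov(Y, rowvar=False)
s, u = 3, 10
print(round(C[s, s + u], 3), round(0.5 * np.cos(lam * u), 3))   # estimate vs exact value
```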
Stationarity. Joint distributions of {Y(t+u1),...,Y(t+uk-1), Y(t)} do not depend on t, for k = 1,2,...
Often reasonable in practice, at least for some time stretches.
Replaces "identically distributed".
Under stationarity:
mean E{Y(t)} = cY for t in Z
autocovariance function cov{Y(t+u),Y(t)} = cYY(u), t,u in Z, u: lag; = E{Y(t+u)Y(t)} if mean 0
autocorrelation function ρYY(u) = corr{Y(t+u),Y(t)}, |ρYY(u)| ≤ 1
crosscovariance function cov{X(t+u),Y(t)} = cXY(u)
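Under stationarity these quantities can be estimated from a single stretch of data; a minimal sketch of the usual sample autocorrelation (the divisor n and the white-noise test series are conventional, illustrative choices):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelation rho_hat(u) = c_hat(u) / c_hat(0) from one data stretch."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    ybar = y.mean()
    c0 = np.sum((y - ybar) ** 2) / n
    return np.array([np.sum((y[u:] - ybar) * (y[:n - u] - ybar)) / n / c0
                     for u in range(max_lag + 1)])

rng = np.random.default_rng(3)
z = rng.normal(size=1000)              # white noise: rho(u) should be near 0 for u > 0
print(sample_acf(z, 5).round(3))
```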
joint density: Prob{x < Y(t+u) < x+dx and y < Y(t) < y+dy} = f(x,y|u) dx dy
Some useful models (Chatfield notation).
Purely random / white noise: a sequence {Zt} of uncorrelated variables, often mean 0. Building block for the models below.
Random walk: Xt = Xt-1 + Zt, Zt purely random. Not stationary: var{Xt} = t σZ² grows with t.
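A small simulation, assuming Gaussian purely random terms (sizes and seed are illustrative): cumulative sums of white noise give a random walk, and the variance at time t grows like t, so the process is not stationary.

```python
import numpy as np

rng = np.random.default_rng(4)
reps, n = 5000, 200
Z = rng.normal(size=(reps, n))      # purely random / white noise, mean 0, variance 1
X = np.cumsum(Z, axis=1)            # random walk: X_t = X_{t-1} + Z_t

# var{X_t} is about t * sigma_Z^2, so it increases with t (non-stationary).
print(round(X[:, 9].var(), 1), round(X[:, 99].var(), 1), round(X[:, 199].var(), 1))
```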
Moving average, MA(q): Xt = β0 Zt + β1 Zt-1 + ... + βq Zt-q, Zt purely random. From (*), stationary.
MA(1) example: β0 = 1, β1 = -0.7
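A quick simulation check of this example, assuming Gaussian white noise (series length and seed are illustrative): for an MA(1) the a.c.f. is β1/(1+β1²) ≈ -0.47 at lag 1 and 0 at larger lags.

```python
import numpy as np

rng = np.random.default_rng(5)
beta1, n = -0.7, 100000
Z = rng.normal(size=n + 1)
X = Z[1:] + beta1 * Z[:-1]             # MA(1): X_t = Z_t + beta1 * Z_{t-1}  (beta0 = 1)

c0 = np.mean(X * X)                    # mean is 0, so no centering needed
rho1_hat = np.mean(X[1:] * X[:-1]) / c0
rho2_hat = np.mean(X[2:] * X[:-2]) / c0
print(round(beta1 / (1 + beta1**2), 3), round(rho1_hat, 3), round(rho2_hat, 3))
```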
Backward shift operator: B Xt = Xt-1.
Linear process: an infinite moving average Xt = Σj≥0 βj Zt-j. Need a convergence condition on the coefficients (e.g. Σ|βj| < ∞) for the sum to be well defined.
Autoregressive process, AR(p): Xt = α1 Xt-1 + ... + αp Xt-p + Zt.
First-order, AR(1): Xt = α Xt-1 + Zt, Markov.
(*) As a linear process: Xt = Σj≥0 α^j Zt-j.
For convergence/stationarity: |α| < 1.
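A minimal AR(1) simulation (α = 0.6, Gaussian Zt, and the series length are illustrative choices), checking the linear-process a.c.f. ρ(u) = α^u against sample values:

```python
import numpy as np

rng = np.random.default_rng(6)
alpha, n = 0.6, 100000               # |alpha| < 1 for stationarity
Z = rng.normal(size=n)
X = np.empty(n)
X[0] = Z[0]
for t in range(1, n):
    X[t] = alpha * X[t - 1] + Z[t]   # AR(1): X_t = alpha * X_{t-1} + Z_t

c0 = np.mean(X * X)
for u in (1, 2, 3):
    print(u, round(np.mean(X[u:] * X[:-u]) / c0, 3), round(alpha ** u, 3))
```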
a.c.f.: from (*), ρ(u) = α^|u| for the AR(1).
p.a.c.f.: corr{Y(t),Y(t-m) | Y(t-1),...,Y(t-m+1)}, correcting linearly; = 0 for m > p when Y is AR(p).
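A sketch of the p.a.c.f. cutoff (the helper sample_pacf and the AR(2) coefficients 0.5, -0.3 are illustrative, not from the slides): the lag-m value is taken as the last coefficient when X_t is regressed on its m most recent past values, and it should be near 0 for m > p.

```python
import numpy as np

def sample_pacf(x, max_lag):
    """p.a.c.f. at lag m: last coefficient when X_t is regressed on X_{t-1},...,X_{t-m}."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    out = []
    for m in range(1, max_lag + 1):
        y = x[m:]
        D = np.column_stack([x[m - j: len(x) - j] for j in range(1, m + 1)])
        coef, *_ = np.linalg.lstsq(D, y, rcond=None)
        out.append(coef[-1])
    return np.array(out)

rng = np.random.default_rng(7)
n = 20000
Z = rng.normal(size=n)
X = np.empty(n)
X[0], X[1] = Z[0], Z[1]
for t in range(2, n):                      # AR(2): X_t = 0.5 X_{t-1} - 0.3 X_{t-2} + Z_t
    X[t] = 0.5 * X[t - 1] - 0.3 * X[t - 2] + Z[t]
print(sample_pacf(X, 5).round(3))          # lags 1 and 2 clearly nonzero, lags 3-5 near 0
```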
In the general case, the p.a.c.f. at lag m is the coefficient of Y(t-m) in the best linear predictor of Y(t) based on Y(t-1),...,Y(t-m). Useful for prediction.
Yule-Walker equations for AR(p). Correlate each side of Xt = α1 Xt-1 + ... + αp Xt-p + Zt with Xt-k, k > 0, to get ρ(k) = α1 ρ(k-1) + ... + αp ρ(k-p).
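A sketch of using the Yule-Walker equations to recover AR coefficients from the sample a.c.f. (the solver below and the AR(2) example with α = (0.5, -0.3) are illustrative):

```python
import numpy as np

def yule_walker(x, p):
    """Solve rho(k) = alpha_1 rho(k-1) + ... + alpha_p rho(k-p), k = 1,...,p,
    with sample autocorrelations in place of the true rho."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    c = np.array([np.sum(x[u:] * x[:n - u]) / n for u in range(p + 1)])
    rho = c / c[0]
    R = np.array([[rho[abs(j - k)] for k in range(p)] for j in range(p)])  # Toeplitz in rho
    return np.linalg.solve(R, rho[1:p + 1])

rng = np.random.default_rng(8)
n = 20000
Z = rng.normal(size=n)
X = np.empty(n)
X[0], X[1] = Z[0], Z[1]
for t in range(2, n):                      # AR(2) with alpha1 = 0.5, alpha2 = -0.3
    X[t] = 0.5 * X[t - 1] - 0.3 * X[t - 2] + Z[t]
print(yule_walker(X, 2).round(3))          # roughly [0.5, -0.3]
```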
Cumulants: multilinear functionals; 0 if some subset of the variates is independent of the rest; 0 for orders > 2 in the normal case; the normal is determined by its moments.
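To illustrate the normal case, a small sketch (sample size and parameters are arbitrary) computing sample versions of the first four cumulants of Gaussian data; the third and fourth should be near 0.

```python
import numpy as np

rng = np.random.default_rng(9)
x = rng.normal(loc=2.0, scale=1.0, size=200000)

d = x - x.mean()
k1 = x.mean()                        # first cumulant: mean
k2 = np.mean(d ** 2)                 # second cumulant: variance
k3 = np.mean(d ** 3)                 # third cumulant: ~ 0 for normal data
k4 = np.mean(d ** 4) - 3 * k2 ** 2   # fourth cumulant: ~ 0 for normal data
print(round(k1, 3), round(k2, 3), round(k3, 3), round(k4, 3))
```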