Stochastic models - time series.

## PowerPoint Slideshow about 'Stochastic models - time series.' - Antony


Presentation Transcript

Stochastic models - time series.

Random process.

an infinite collection of consistent distributions

probabilities exist

Random function.

a family of random variables, e.g. {Y(t), t in Z}

F(y1,...,yn; t1,...,tn) = Prob{Y(t1) ≤ y1, ..., Y(tn) ≤ yn}

that are symmetric

F(yπ(1),...,yπ(n); tπ(1),...,tπ(n)) = F(y1,...,yn; t1,...,tn), π a permutation

compatible

F(y1,...,ym, ∞,...,∞; t1,...,tm, tm+1,...,tn) = F(y1,...,ym; t1,...,tm)

Finite dimensional distributions

First-order

F(y;t) = Prob{Y(t) ≤ y}

Second-order

F(y1,y2; t1,t2) = Prob{Y(t1) ≤ y1 and Y(t2) ≤ y2}

and so on

i) Y(t;ω), ω: random variable

ii) urn model

iii) probability on function space

iv) analytic formula

Y(t) = cos(νt + θ)

ν: fixed; θ: uniform on (-π, π]

The Y(t) may be discrete, angles, proportions, ...
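The analytic-formula example can be checked by simulation. A minimal sketch (assuming NumPy; the frequency ν, sample size, and seed are illustrative) estimating E{Y(t)} for Y(t) = cos(νt + θ) with θ uniform on (-π, π]:

```python
import numpy as np

rng = np.random.default_rng(0)

def cosine_process(t, n_paths=100_000, nu=1.0):
    """Draw n_paths realisations of Y(t) = cos(nu*t + theta),
    with random phase theta ~ Uniform(-pi, pi]."""
    theta = rng.uniform(-np.pi, np.pi, size=n_paths)
    return np.cos(nu * t + theta)

# Averaging over the random phase: E{Y(t)} = 0 for every t
y = cosine_process(t=3.0)
mean_estimate = y.mean()
```

Because θ sweeps a full cycle, the mean function is identically zero, whatever t.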

Kolmogorov extension theorem. To specify a stochastic process, give the distribution of any finite subset {Y(t1),...,Y(tn)} in a consistent way, for t1,...,tn in A

Mean function

cY(t) = E{Y(t)} = ∫ y dF(y;t)

= ∫ y f(y;t) dy if continuous

= Σj yj f(yj;t) if discrete

E{α1Y1(t) + α2Y2(t)} = α1c1(t) + α2c2(t)

vector-valued case

mean level - signal plus noise: Y(t) = S(t) + ε(t), S(.): fixed

autocovariance function

cYY(s,t) = cov{Y(s),Y(t)} = E{Y(s)Y(t)} - E{Y(s)}E{Y(t)}

non-negative definite

Σj Σk αjαk cYY(tj, tk) ≥ 0, αj scalars
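Non-negative definiteness can be illustrated numerically. A sketch (NumPy assumed; it uses the autocovariance cYY(s,t) = ½cos(ν(s−t)) of the cosine-process example, with illustrative time points):

```python
import numpy as np

nu = 1.0
t = np.arange(6, dtype=float)  # time points t_1, ..., t_n (illustrative)

# Autocovariance matrix C[j, k] = c_YY(t_j, t_k) = 0.5 * cos(nu * (t_j - t_k))
C = 0.5 * np.cos(nu * (t[:, None] - t[None, :]))

# Quadratic form sum_j sum_k a_j a_k c_YY(t_j, t_k) for arbitrary scalars a_j
rng = np.random.default_rng(1)
a = rng.standard_normal(len(t))
quad_form = a @ C @ a

# Equivalently, all eigenvalues of C are >= 0 (up to round-off)
eigenvalues = np.linalg.eigvalsh(C)
```

Any valid autocovariance function must pass this check, whatever time points and scalars are chosen.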

crosscovariance function

c12(s,t) = cov{Y1(s),Y2(t)}

Stationarity.

Joint distributions,

{Y(t+u1),...,Y(t+uk-1),Y(t)},

do not depend on t for k=1,2,...

Often reasonable in practice

- for some time stretches

Replaces "identically distributed"

E{Y(t)} = cY for t in Z

autocovariance function

cov{Y(t+u),Y(t)} = cYY(u), t, u in Z; u: lag

= E{Y(t+u)Y(t)} if mean 0

autocorrelation function ρ(u) = corr{Y(t+u),Y(t)}, |ρ(u)| ≤ 1

crosscovariance function

cov{X(t+u),Y(t)} = cXY(u)
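Under stationarity, the autocovariance and autocorrelation can be estimated from a single series. A sketch (NumPy assumed; the divide-by-n estimate and the white-noise test series are illustrative choices):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample autocorrelation rho(u) = c_YY(u) / c_YY(0), u = 0..max_lag,
    using the usual biased (divide-by-n) autocovariance estimate."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    n = len(y)
    c = np.array([np.sum(y[u:] * y[:n - u]) / n for u in range(max_lag + 1)])
    return c / c[0]

# White noise: rho(0) = 1 and rho(u) ~ 0 for u != 0
rng = np.random.default_rng(2)
rho = sample_acf(rng.standard_normal(5000), max_lag=5)
```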

not stationary

ρ(0) = 1, ρ(1) = -.7

first-order autoregression, AR(1): Y(t) = α1Y(t-1) + ε(t); Markov

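A first-order autoregression can be simulated directly. A sketch (NumPy assumed; the coefficient −.7 matches the example above, and the sample size and burn-in are illustrative); for a stationary AR(1) the lag-1 autocorrelation equals the coefficient:

```python
import numpy as np

def simulate_ar1(alpha, n, burn_in=500, seed=3):
    """Simulate Y(t) = alpha * Y(t-1) + eps(t), eps i.i.d. N(0, 1)."""
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n + burn_in)
    y = np.zeros(n + burn_in)
    for t in range(1, n + burn_in):
        y[t] = alpha * y[t - 1] + eps[t]
    return y[burn_in:]  # drop the transient so the series is near-stationary

y = simulate_ar1(alpha=-0.7, n=20_000)
lag1_corr = np.corrcoef(y[:-1], y[1:])[0, 1]  # should be near -0.7
```

With |α1| ≥ 1 (e.g. the random walk, α1 = 1) the recursion has no stationary solution.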

Linear process

Y(t) = Σj aj ε(t - j), ε(t): white noise

For convergence/stationarity: Σj |aj| < ∞
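A finite linear process (an MA(2) with illustrative coefficients; NumPy assumed) shows the characteristic autocovariance: nonzero only at lags where the aj overlap:

```python
import numpy as np

rng = np.random.default_rng(4)
a = np.array([1.0, 0.5, 0.25])        # coefficients a_0, a_1, a_2 (illustrative)
eps = rng.standard_normal(50_002)

# Y(t) = a_0*eps(t) + a_1*eps(t-1) + a_2*eps(t-2)
y = a[0] * eps[2:] + a[1] * eps[1:-1] + a[2] * eps[:-2]

def acov(y, u):
    """Sample autocovariance at lag u (mean-corrected, divide by n)."""
    y = y - y.mean()
    return np.mean(y[u:] * y[:len(y) - u])

c1 = acov(y, 1)  # theory: a_0*a_1 + a_1*a_2 = 0.625
c3 = acov(y, 3)  # theory: 0 -- no shared eps terms at lag 3
```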

p.a.c.f.

corr{Y(t),Y(t-m) | Y(t-1),...,Y(t-m+1)}, conditioning linearly

= 0 for m > p when Y is AR(p)

Useful for prediction
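The p.a.c.f. at lag m can be computed as the last coefficient of the best linear predictor of Y(t) from its m most recent values. A least-squares sketch (NumPy assumed; the AR(1) test series, coefficient .6, and sample size are illustrative):

```python
import numpy as np

def pacf(y, m):
    """Partial autocorrelation at lag m: the coefficient of Y(t-m) in the
    least-squares regression of Y(t) on Y(t-1), ..., Y(t-m)."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    n = len(y)
    X = np.column_stack([y[m - k : n - k] for k in range(1, m + 1)])
    coef, *_ = np.linalg.lstsq(X, y[m:], rcond=None)
    return coef[-1]

# AR(1) with coefficient .6: p.a.c.f. near .6 at lag 1, near 0 at lags > 1
rng = np.random.default_rng(5)
eps = rng.standard_normal(20_000)
y = np.zeros_like(eps)
for t in range(1, len(y)):
    y[t] = 0.6 * y[t - 1] + eps[t]

p1, p2 = pacf(y, 1), pacf(y, 2)
```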

Yule-Walker equations for AR(p).

Correlate, with X(t-k), each side of

X(t) = α1X(t-1) + ... + αpX(t-p) + ε(t)

giving cXX(k) = α1cXX(k-1) + ... + αpcXX(k-p), k > 0
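Solving the Yule-Walker system with sample autocovariances recovers the AR coefficients. A sketch (NumPy assumed; the AR(2) coefficients .5 and −.3 are illustrative):

```python
import numpy as np

def yule_walker(y, p):
    """Solve c(k) = sum_j alpha_j c(k - j), k = 1..p, for alpha_1..alpha_p,
    using biased sample autocovariances c(u)."""
    y = np.asarray(y, dtype=float) - np.mean(y)
    n = len(y)
    c = np.array([np.sum(y[u:] * y[:n - u]) / n for u in range(p + 1)])
    R = np.array([[c[abs(j - k)] for k in range(p)] for j in range(p)])  # Toeplitz
    return np.linalg.solve(R, c[1:])

# Recover the coefficients of a simulated AR(2)
rng = np.random.default_rng(6)
eps = rng.standard_normal(50_000)
y = np.zeros_like(eps)
for t in range(2, len(y)):
    y[t] = 0.5 * y[t - 1] - 0.3 * y[t - 2] + eps[t]

alpha_hat = yule_walker(y, p=2)  # should be near (0.5, -0.3)
```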

Cumulants.

a multilinear functional

= 0 if some subset of the variates is independent of the rest

= 0 for order > 2 in the normal case

the normal is determined by its moments
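The vanishing of higher-order cumulants in the normal case can be seen in the third cumulant k3 = E{(Y − μ)³}, which is zero for a normal variable but not for a skewed one. A sketch (NumPy assumed; sample sizes illustrative; the Exp(1) comparison has k3 = 2):

```python
import numpy as np

rng = np.random.default_rng(7)

def third_cumulant(x):
    """Sample third cumulant k3 = mean((x - mean(x))**3)."""
    return np.mean((x - np.mean(x)) ** 3)

k3_normal = third_cumulant(rng.standard_normal(200_000))  # theory: 0
k3_exp = third_cumulant(rng.exponential(size=200_000))    # theory: 2
```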
