Point processes on the line. Nerve firing.

Stochastic point process. Building blocks

Process on R {N(t)}, t in R, with consistent set of distributions

Pr{N(I1)=k1 ,..., N(In)=kn }   k1 ,...,kn integers ≥ 0

I's Borel sets of R.

Consistency example. If I1 , I2 disjoint

Pr{N(I1)=k1 , N(I2)=k2 , N(I1 ∪ I2)=k3 }

=1 if k1 + k2 =k3

= 0 otherwise

Guttorp book, Chapter 5
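
A minimal numeric sketch of the consistency requirement above (not from the slides; the Poisson process, the rate 5.0, and the two intervals are illustrative assumptions): counts of a simulated process on disjoint Borel sets add up to the count on their union.

import numpy as np

rng = np.random.default_rng(0)

def simulate_poisson(rate, T):
    # homogeneous Poisson process on [0, T): return sorted point times
    n = rng.poisson(rate * T)
    return np.sort(rng.uniform(0.0, T, size=n))

def N(points, a, b):
    # counting measure N(I) for the interval I = (a, b]
    return np.sum((points > a) & (points <= b))

pts = simulate_poisson(rate=5.0, T=10.0)
I1, I2 = (1.0, 2.0), (4.0, 6.0)                     # disjoint Borel sets (intervals)
k1, k2 = N(pts, *I1), N(pts, *I2)
k3 = np.sum(((pts > I1[0]) & (pts <= I1[1])) |      # N(I1 U I2), counted directly
            ((pts > I2[0]) & (pts <= I2[1])))
print(k1, k2, k3, k1 + k2 == k3)                    # consistency: always True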


Points: ... ≤ τ-1 ≤ 0 < τ1 ≤ ...

discontinuities of {N}

N(t) = #{0 < τj ≤ t}

Simple: τj ≠ τk if j ≠ k

points are isolated

dN(t) = 0 or 1

Surprise. A simple point process is determined by its void probabilities

Pr{N(I) = 0} I compact
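
A small sketch (assumed setup, not from the slides): the counting function N(t) for a simulated Poisson process, plus an empirical check of one void probability Pr{N(I) = 0} against the exact Poisson value exp{-rate |I|}.

import numpy as np

rng = np.random.default_rng(1)
rate, T = 2.0, 5.0
I = (0.0, 0.7)                                   # a compact interval

def counting_function(points, t):
    # N(t) = #{0 < tau_j <= t}, for a sorted array of point times
    return np.searchsorted(points, t, side="right")

voids, trials = 0, 10_000
for _ in range(trials):
    pts = np.sort(rng.uniform(0.0, T, size=rng.poisson(rate * T)))
    voids += (counting_function(pts, I[1]) - counting_function(pts, I[0])) == 0

print(voids / trials, np.exp(-rate * (I[1] - I[0])))   # empirical vs. exact void probability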


Conditional intensity. Simple case

History Ht = {τj ≤ t}

Pr{dN(t)=1 | Ht } = λ(t:ω)dt   a r.v.

Has all the information

Probability points in [0,T) are t1 ,...,tN

Pr{dN(t1)=1,..., dN(tN)=1} =

λ(t1)...λ(tN) exp{-∫λ(t)dt} dt1 ... dtN

[1-λ(h)h][1-λ(2h)h] ... λ(t1)λ(t2) ...
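
A hedged sketch of the corresponding log-likelihood, Σj log λ(tj) - ∫λ(t)dt; the intensity lam and the point times below are illustrative assumptions, not the slides' data.

import numpy as np

def point_process_loglik(points, lam, T, ngrid=100_000):
    # log L = sum_j log lam(t_j) - integral_0^T lam(t) dt  (Riemann sum for the integral)
    grid = np.linspace(0.0, T, ngrid)
    integral = np.mean(lam(grid)) * T
    return np.sum(np.log(lam(np.asarray(points)))) - integral

lam = lambda t: np.exp(0.5 + 0.1 * t)      # illustrative intensity of the form exp{a + b t}
pts = [0.3, 1.1, 2.4, 2.9]                 # illustrative point times in [0, T)
print(point_process_loglik(pts, lam, T=3.0))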


Parameters. Suppose points are isolated

dN(t) = 1 if point in (t,t+dt]

= 0 otherwise

1. (Mean) rate/intensity.

E{dN(t)} = pN(t)dt

= Pr{dN(t) = 1}

Σj g(τj) = ∫ g(s)dN(s)

E{Σj g(τj)} = ∫ g(s)pN(s)ds

Trend: pN(t) = exp{α + βt}   Cycle: α + β cos(γt + δ) ≥ 0
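
A sketch under stated assumptions (α, β, the test function g, and the thinning construction are all illustrative): simulate an inhomogeneous Poisson process with rate pN(t) = exp{α + βt} and check E{Σj g(τj)} = ∫ g(s)pN(s)ds by Monte Carlo.

import numpy as np

rng = np.random.default_rng(2)
alpha, beta, T = 0.0, 0.3, 5.0
pN = lambda t: np.exp(alpha + beta * t)
g  = lambda t: np.sin(t) ** 2                    # any bounded test function

lam_max = pN(T)                                  # the rate is increasing, so its max is at T
def simulate_by_thinning():
    # candidates from a rate-lam_max Poisson process, kept with probability pN(t)/lam_max
    cand = rng.uniform(0.0, T, size=rng.poisson(lam_max * T))
    return np.sort(cand[rng.uniform(0.0, lam_max, size=cand.size) < pN(cand)])

sums = [g(simulate_by_thinning()).sum() for _ in range(5_000)]
grid = np.linspace(0.0, T, 100_000)
print(np.mean(sums), np.mean(g(grid) * pN(grid)) * T)   # the two values should be close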


Product density of order 2.

Pr{dN(s)=1 and dN(t)=1}

= E{dN(s)dN(t)}

= [δ(s-t)pN(t) + pNN (s,t)]dsdt

Factorial moment


Autointensity.

Pr{dN(t)=1|dN(s)=1}

= (pNN (s,t)/pN (s))dt   s ≠ t

= hNN(s,t)dt

= pN (t)dt if increments uncorrelated


Covariance density/cumulant density of order 2.

cov{dN(s),dN(t)} = qNN(s,t)dsdt   s ≠ t

= [δ(s-t)pN(s) + qNN(s,t)]dsdt generally

qNN(s,t) = pNN(s,t) - pN(s) pN(t)   s ≠ t


Identities.

1. Σj,k g(τj ,τk) = ∫∫ g(s,t)dN(s)dN(t)

Expected value.

E{∫∫ g(s,t)dN(s)dN(t)}

= ∫∫ g(s,t)[δ(s-t)pN(t) + pNN (s,t)]dsdt

= ∫ g(t,t)pN(t)dt + ∫∫ g(s,t)pNN(s,t)dsdt


2. cov{Σ g(τj), Σ h(τk)}

= cov{∫ g(s)dN(s), ∫ h(t)dN(t)}

= ∫∫ g(s)h(t)[δ(s-t)pN(s) + qNN(s,t)]dsdt

= ∫ g(t)h(t)pN(t)dt + ∫∫ g(s)h(t)qNN(s,t)dsdt
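
A Monte Carlo check of identity 2 in the simplest case the slides support (homogeneous Poisson process, for which qNN vanishes, so the covariance reduces to pN ∫ g(t)h(t)dt); the rate and the functions g, h are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)
pN, T = 4.0, 2.0
g = lambda t: t
h = lambda t: np.cos(t)

gs, hs = [], []
for _ in range(20_000):
    pts = rng.uniform(0.0, T, size=rng.poisson(pN * T))   # one Poisson realization on [0, T)
    gs.append(g(pts).sum())                               # sum_j g(tau_j)
    hs.append(h(pts).sum())                               # sum_k h(tau_k)

grid = np.linspace(0.0, T, 100_000)
print(np.cov(gs, hs)[0, 1], pN * np.mean(g(grid) * h(grid)) * T)   # should be close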


Product density of order k.

t1,...,tk all distinct

Prob{dN(t1)=1,...,dN(tk)=1}

=E{dN(t1)...dN(tk)}

= pN...N (t1,...,tk)dt1 ...dtk


Proof of Central Limit Theorem via cumulants in i.i.d. case.

Normal distribution facts.

1. Determined by its moments

2. Cumulants of order > 2 identically 0

Y1, Y2, ... i.i.d. mean 0, variance σ², all moments E{Y^k}

k=1,2,3,4,... existing

Sn = Y1 + Y2 + ... + Yn   E{Sn} = 0   var{Sn} = nσ²

cumr Sn = n κr   κr = cumr Y = cum{Y,...,Y}

cumr {Sn / √n} = n κr / n^(r/2)

→ 0 for r = 3, 4, ...

→ σ² for r = 2, as n → ∞
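
An illustrative numeric check (my example, not the slides'): sample cumulants of Sn/√n for centered Exp(1) summands, using the fact that a sum of n Exp(1) variables is Gamma(n, 1). Orders 3 and 4 shrink toward 0 while order 2 stays near σ² = 1.

import numpy as np

rng = np.random.default_rng(4)

def sample_cumulants(x):
    # cumulants of orders 2-4 from central moments: c2 = m2, c3 = m3, c4 = m4 - 3 m2^2
    x = x - x.mean()
    m2, m3, m4 = (np.mean(x ** k) for k in (2, 3, 4))
    return m2, m3, m4 - 3 * m2 ** 2

for n in (10, 100, 1000):
    Sn = rng.gamma(n, 1.0, size=500_000) - n      # S_n for centered Exp(1) Y's (Gamma trick)
    Z = Sn / np.sqrt(n)
    print(n, [round(c, 4) for c in sample_cumulants(Z)])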


Cumulant density of order k.

t1,...,tk distinct

cum{dN(t1),...,dN(tk)}

= qN...N (t1 ,...,tk)dt1 ...dtk


Stationarity.

Joint distributions,

Pr{N(I1+t)=k1 ,..., N(In+t)=kn}   k1 ,...,kn integers ≥ 0

do not depend on t for n=1,2,...

Rate.

E{dN(t)} = pN dt

Product density of order 2.

Pr{dN(t+u)=1 and dN(t)=1}

= [δ(u)pN + pNN (u)]dtdu


Autointensity.

Pr{dN(t+u)=1|dN(t)=1}

= (pNN (u)/pN)du   u ≠ 0

= hNN(u)du

Covariance density.

cov{dN(t+u),dN(t)}

= [δ(u)pN + qNN (u)]dtdu
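
A sketch with assumed binned estimators of these quantities (the Poisson process, the window T, and the bin width du are illustrative): estimate pN and the autointensity hNN(u) = pNN(u)/pN from one realization; for a homogeneous Poisson process hNN(u) should be roughly flat at pN.

import numpy as np

rng = np.random.default_rng(5)
rate, T, du, umax = 3.0, 500.0, 0.25, 10.0
pts = np.sort(rng.uniform(0.0, T, size=rng.poisson(rate * T)))

pN_hat = pts.size / T                                  # rate estimate N(T)/T

lags = pts[None, :] - pts[:, None]                     # all pairwise differences
lags = lags[(lags > 0) & (lags < umax)]                # positive lags up to umax
counts, _ = np.histogram(lags, bins=np.arange(0.0, umax + du, du))
pNN_hat = counts / (T * du)                            # binned estimate of p_NN(u)
hNN_hat = pNN_hat / pN_hat                             # autointensity estimate

print(round(pN_hat, 3), np.round(hNN_hat[:6], 3))      # h_NN(u) ~ p_N = 3 for a Poisson process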



Algebra/calculus of point processes.

Consider process {τj , τj+u}. Stationary case

dN(t) = dM(t) + dM(t+u)

Taking "E", pNdt = pMdt+ pMdt

pN = 2 pM
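
A minimal sketch of this identity (assuming M is a homogeneous Poisson process and a fixed shift u, both illustrative): build N with points {τj, τj+u} and compare the estimated rates.

import numpy as np

rng = np.random.default_rng(6)
pM, T, u = 2.0, 1000.0, 0.5
tauM = np.sort(rng.uniform(0.0, T, size=rng.poisson(pM * T)))     # points tau_j of M
tauN = np.sort(np.concatenate([tauM, tauM + u]))                  # points of N: {tau_j, tau_j + u}

pM_hat = tauM.size / T
pN_hat = np.sum((tauN >= 0) & (tauN < T)) / T                     # N's points falling in [0, T)
print(round(pM_hat, 3), round(pN_hat, 3), round(2 * pM_hat, 3))   # p_N ~ 2 p_M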


Taking "E" again, stationary point process," Journal of the Royal Statistical Society B Vol. 38 (1976), pp. 60-66


Association. Measuring? Due to chance?

Are two processes associated? E.g. a time series (t.s.) and a point process (p.p.)

How strongly?

Can one predict one from the other?

Some characteristics of dependence:

E(XY) ≠ E(X) E(Y)

E(Y|X) = g(X)

X = g(ε), Y = h(ε), ε a r.v.

f (x,y) ≠ f (x) f(y)

corr(X,Y) ≠ 0


Bivariate point process case (Journal of the Royal Statistical Society B, Vol. 38 (1976), pp. 60-66).

Two types of points (σj , τk)

Crossintensity.

Prob{dN(t)=1|dM(s)=1}

=(pMN(t,s)/pM(s))dt

Cross-covariance density.

cov{dM(s),dN(t)}

= qMN(s,t)dsdt   no δ( ) term
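
A sketch with an assumed binned estimator of qMN (the delay-plus-jitter construction of N from M is purely illustrative): the estimated cross-covariance density should peak near the built-in delay and sit near 0 elsewhere.

import numpy as np

rng = np.random.default_rng(7)
pM, T, du, umax = 2.0, 500.0, 0.1, 3.0
sig = np.sort(rng.uniform(0.0, T, size=rng.poisson(pM * T)))       # points sigma_j of M
tau = np.sort(sig + 1.0 + 0.05 * rng.standard_normal(sig.size))    # points tau_k of N: delayed, jittered M

lags = tau[None, :] - sig[:, None]                                  # tau_k - sigma_j
lags = lags[(lags > 0) & (lags < umax)]
counts, edges = np.histogram(lags, bins=np.arange(0.0, umax + du, du))
pMN_hat = counts / (T * du)                                         # binned cross-product density
qMN_hat = pMN_hat - (sig.size / T) * (tau.size / T)                 # q_MN(u) = p_MN(u) - p_M p_N

print((edges[:-1] + du / 2)[np.argmax(qMN_hat)])                    # peak near the delay u = 1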


Mixing.

cov{dN(t+u),dN(t)} small for large |u|

|pNN(u) - pNpN| small for large |u|

hNN(u) = pNN(u)/pN ~ pN for large |u|

∫ |qNN(u)|du < ∞

See preceding examples


The Fourier transform. Regularity conditions.

Functions, A(), - <  < 

 |A()|d finite

FT. a(t) =  exp{it)A()d

Inverse A() =(2)-1 exp{-it} a(t) dt

unique

C()=  A() +  B()

c(t) =  c(t) +  b(t)



Convolution (filtering).

d(t) = ∫ b(t-s) c(s) ds

D(λ) = B(λ)C(λ)

Discrete FT.

a(t) = Σs exp{-i2πts/T} A(2πs/T)   s, t = 0,1,...,T-1

A(2πs/T) = T⁻¹ Σt exp{i2πst/T} a(t)

FFTs exist
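
A quick numeric sketch of the convolution theorem with numpy's FFT (the sequences b and c are arbitrary test data): the DFT of a circular convolution equals the product of the DFTs.

import numpy as np

rng = np.random.default_rng(8)
T = 64
b, c = rng.standard_normal(T), rng.standard_normal(T)

# circular convolution d(t) = sum_s b(t - s mod T) c(s)
d = np.array([np.sum(b[(t - np.arange(T)) % T] * c) for t in range(T)])

print(np.allclose(np.fft.fft(d), np.fft.fft(b) * np.fft.fft(c)))   # True: D = B C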


Dirac delta.

 H() () d = H(0)

 exp {it}() d  = 1

inverse

() = (2)-1 exp {-it}dt


Power spectral density. Frequency-side, λ, vs. time-side, t

λ/2π: frequency (cycles/unit time)

Non-negative

Unifies analyses of processes of widely varying types
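
A sketch with an assumed estimator (binned counts and a raw periodogram; the rate, window, and bin width are illustrative): for a homogeneous Poisson process the power spectrum is flat at pN/2π, which the crude estimate below should roughly reproduce.

import numpy as np

rng = np.random.default_rng(9)
rate, T, dt = 5.0, 1000.0, 0.05
pts = np.sort(rng.uniform(0.0, T, size=rng.poisson(rate * T)))

counts, _ = np.histogram(pts, bins=np.arange(0.0, T + dt, dt))     # binned increments dN
d = np.fft.rfft(counts - counts.mean())                            # finite Fourier transform
I = np.abs(d) ** 2 / (2 * np.pi * T)                               # raw periodogram estimate

print(np.mean(I[1:]), rate / (2 * np.pi))                          # roughly equal for a Poisson process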


Examples.


Spectral representation. Stationary increments - Kolmogorov.

