Lecture 11 – Stochastic Processes







Lecture 11 – Stochastic Processes
• Topics
• Definitions
• Review of probability
• Realization of a stochastic process
• Continuous vs. discrete systems
• Examples
Basic Definitions
• Stochastic process: System that changes over time in an uncertain manner
• Examples
• Automated teller machine (ATM)
• Printed circuit board assembly operation
• Runway activity at airport
• State: Snapshot of the system at some fixed point in time
• Transition: Movement from one state to another
Elements of Probability Theory

Experiment: Any situation where the outcome is uncertain.

Sample Space, S: All possible outcomes of an experiment (we will call them the “state space”).

Event: Any collection of outcomes (points) in the sample space. A collection of events E1, E2, …, En is said to be mutually exclusive if Ei ∩ Ej = ∅ for all i ≠ j, i, j = 1, …, n.

Random Variable (RV): Function or procedure that assigns a real number to each outcome in the sample space.

Cumulative Distribution Function (CDF), F(·): Probability distribution function for the random variable X such that

F(a) = Pr{X ≤ a}

Time: Either continuous or discrete parameter.
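The random variable and CDF definitions above can be made concrete with a small sketch (not from the slides): take X = sum of two fair dice and compute F(a) = Pr{X ≤ a} by enumerating the 36 equally likely outcomes of the sample space.

```python
# Sample space of the experiment: all ordered rolls of two fair dice.
sample_space = [(i, j) for i in range(1, 7) for j in range(1, 7)]

def X(outcome):
    """Random variable: assigns a real number (the dice total) to each outcome."""
    return outcome[0] + outcome[1]

def F(a):
    """CDF: F(a) = Pr{X <= a}, with all 36 outcomes equally likely."""
    return sum(1 for o in sample_space if X(o) <= a) / len(sample_space)

print(F(1), F(7), F(12))  # 0.0, 21/36 ≈ 0.583, 1.0
```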

Components of Stochastic Model

State: Describes the attributes of a system at some point in time.

s = (s1, s2, . . . , sv); for ATM example s = (n)

• Convenient to assign a unique nonnegative integer index to each possible value of the state vector. We call this X and require a distinct value of X for each s ∈ S.
• For ATM example, X = n.
• In general, Xt is a random variable.

Transition: Caused by an event and results in movement from one state to another. For ATM example,

[State-transition diagram legend: # = state, a = arrival, d = departure]

Model Components (continued)

Activity: Takes some amount of time – duration. Culminates in an event.

For ATM example: service completion.

Stochastic Process: A collection of random variables {Xt}, where t ∈ T = {0, 1, 2, . . .}.

Realization of the Process

[Figure: sample path showing the number in system, n, over time (no transient response).]
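A realization (sample path) of the ATM process {Xt} can be generated by simulation. A minimal sketch, assuming illustrative per-period arrival and departure probabilities p and q (these values are not from the lecture):

```python
import random

random.seed(1)  # reproducible sample path

# Hypothetical sketch: one realization {X_t} of the ATM process, where
# X_t = number in system at step t. p and q are illustrative assumptions.
p, q = 0.4, 0.5
path = [0]
for t in range(20):
    n = path[-1]
    u = random.random()
    if u < p:                  # arrival event: one more customer in system
        n += 1
    elif n > 0 and u < p + q:  # departure event: only possible if nonempty
        n -= 1
    path.append(n)
print(path)  # one realization of the process
```

Rerunning with a different seed produces a different sample path; each run is one realization of the same stochastic process.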

Deterministic Process

Markovian Property

Given that the present state is known, the conditional probability of the next state is independent of the states prior to the present state.

Present state at time t is i: Xt = i

Next state at time t + 1 is j: Xt+1 = j

Conditional Probability Statement of Markovian Property:

Pr{Xt+1= j | X0 = k0, X1 = k1,…,Xt = i} = Pr{Xt+1= j | Xt = i}

for t = 0, 1,…, and all possible sequences i, j, k0, k1, . . . , kt–1

Interpretation: Given the present, the past is irrelevant in determining the future.

Transitions for Markov Processes

State space: S = {1, 2, . . . , m}

Probability of going from state i to state j in one move: pij

State-transition matrix

Theoretical requirements: 0 ≤ pij ≤ 1, Σj pij = 1, for all i = 1, …, m

         0     1     2
    0 [ 0.6   0.3   0.1 ]
P = 1 [ 0.8   0.2   0.0 ]
    2 [ 1.0   0.0   0.0 ]
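The matrix above can be checked against the theoretical requirements and used to sample transitions. A sketch using the slide's three-state P; note that the sampler looks only at the current state i, which is exactly the Markovian property:

```python
import random

random.seed(0)

# The slide's state-transition matrix, one list per row.
P = [[0.6, 0.3, 0.1],
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]

# Theoretical requirements: every p_ij in [0, 1] and each row sums to 1.
for row in P:
    assert all(0.0 <= p <= 1.0 for p in row)
    assert abs(sum(row) - 1.0) < 1e-9

def next_state(i):
    """Sample X_{t+1} given X_t = i; the past is never consulted (Markov)."""
    u, acc = random.random(), 0.0
    for j, pij in enumerate(P[i]):
        acc += pij
        if u < acc:
            return j
    return len(P) - 1  # guard against floating-point roundoff

path = [0]
for _ in range(10):
    path.append(next_state(path[-1]))
print(path)
```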

Discrete-Time Markov Chain
• A discrete state space
• Markovian property for transitions
• One-step transition probabilities, pij, remain constant over time (stationary)
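Because the one-step probabilities are stationary, the n-step transition probabilities are simply the entries of the matrix power P^n. A sketch using the three-state example matrix above (pure-Python matrix multiply for self-containment):

```python
# n-step transition probabilities via matrix powers: (P^n)_ij = Pr{X_n = j | X_0 = i}.
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

P = [[0.6, 0.3, 0.1],
     [0.8, 0.2, 0.0],
     [1.0, 0.0, 0.0]]

Pn = P
for _ in range(49):  # compute P^50
    Pn = matmul(Pn, P)

# For this chain, every row of P^50 has converged to (nearly) the same
# limiting distribution: the starting state no longer matters.
print([round(x, 4) for x in Pn[0]])
```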

Simple Example

[Figures: state-transition matrix and state-transition diagram]

Game of Craps
• Roll 2 dice
• Outcomes
• Win = 7 or 11
• Lose = 2, 3, 12
• Point = 4, 5, 6, 8, 9, 10
• If point, then roll again.
• Win if point is rolled again
• Lose if 7
• Otherwise roll again, and so on
• (There are other possible bets not included here.)
Transition Matrix for Game of Craps

Probability of win = Pr{ 7 or 11 } = 6/36 + 2/36 = 8/36 ≈ 0.222

Probability of loss = Pr{ 2, 3, 12 } = 1/36 + 2/36 + 1/36 = 4/36 ≈ 0.111
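These first-roll probabilities can be verified by enumerating all 36 equally likely rolls of the two dice; exact fractions avoid the rounding issues above.

```python
from fractions import Fraction

# All 36 equally likely totals of two fair dice.
rolls = [i + j for i in range(1, 7) for j in range(1, 7)]

def pr(outcomes):
    """Exact probability that the total lands in the given set."""
    return Fraction(sum(1 for s in rolls if s in outcomes), len(rolls))

p_win = pr({7, 11})              # 6/36 + 2/36 = 8/36
p_lose = pr({2, 3, 12})          # 1/36 + 2/36 + 1/36 = 4/36
p_point = 1 - p_win - p_lose     # 24/36: establish a point, roll again
print(p_win, p_lose, p_point)    # 2/9, 1/9, 2/3
```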

Examples of Stochastic Processes

Single-stage assembly process with single worker, no queue

State = 0, worker is idle

State = 1, worker is busy

Multistage assembly process with single worker, no queue

State = 0, worker is idle

State = k, worker is performing operation k = 1, . . . , 5
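The single-stage, no-queue system can be sketched as a two-state chain (0 = idle, 1 = busy). The per-period arrival probability a and completion probability c below are illustrative assumptions, not values from the lecture:

```python
import random

random.seed(2)

# Hypothetical two-state chain for the single-stage, no-queue system.
a, c = 0.3, 0.4
P = [[1 - a, a],   # idle: stay idle, or an arrival starts service
     [c, 1 - c]]   # busy: service completes, or work continues

state, busy = 0, 0
steps = 100_000
for _ in range(steps):
    state = 0 if random.random() < P[state][0] else 1
    busy += state

# Long-run fraction of time busy; theoretically a / (a + c) = 3/7 ≈ 0.429.
print(busy / steps)
```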

Examples (continued)

Multistage assembly process with single worker and queue

(Assume 3 stages only; i.e., 3 operations)

s = (s1, s2), where s1 = operation in progress (k = 1, 2, 3; s1 = 0 if the worker is idle) and s2 = number of jobs waiting in the queue.

What You Should Know About Stochastic Processes
• Definition of a state and an event.
• Meaning of realization of a system (stationary vs. transient).
• Definition of the state-transition matrix.
• How to draw a state-transition network.
• Difference between a continuous and discrete-time system.
• Common applications with multiple stages and servers.