
CSE 535 – Mobile Computing Lecture 14: Basic Probability & Queuing Theory




  1. CSE 535 – Mobile Computing, Lecture 14: Basic Probability & Queuing Theory Sandeep K. S. Gupta School of Computing and Informatics Arizona State University

  2. Agenda Basic Probability Queuing Theory

  3. Based on slides by G. Varsamopoulos Probability Review

  4. Presentation Outline • Basic probability review • Important distributions • Markov Chains & Random Walks • Poisson Process • Queuing Systems

  5. Basic Probability concepts

  6. Basic Probability concepts and terms • Basic concepts by example: die rolling • (Intuitively) coin tossing and die rolling are Stochastic Processes • The outcome of a die roll is an Event • The values of the outcomes can be expressed as a Random Variable, which assumes a new value with each event. • The likelihood of an event (or instance) of a random variable is expressed as a real number in [0,1]. • Die roll: Random variable • instances or values: 1, 2, 3, 4, 5, 6 • Events: Dice=1, Dice=2, Dice=3, Dice=4, Dice=5, Dice=6 • Pr{Dice=1} = 1/6 ≈ 0.1667 • Pr{Dice=2} = 1/6 ≈ 0.1667 • The outcome of a roll is independent of any previous rolls (given that the constitution of the die is not altered)
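A minimal sketch of these concepts in Python, assuming a fair six-sided die; the function name, trial count, and seed are illustrative choices, not part of the lecture:

```python
import random

def roll_frequencies(trials=100_000, seed=42):
    """Roll a fair die `trials` times and return the relative
    frequency of each face; each should be close to Pr{Dice=k} = 1/6."""
    rng = random.Random(seed)
    counts = {face: 0 for face in range(1, 7)}
    for _ in range(trials):
        counts[rng.randint(1, 6)] += 1  # each roll is independent of the previous ones
    return {face: c / trials for face, c in counts.items()}

freqs = roll_frequencies()
```

The frequencies also illustrate the basic property on the next slide: summed over all six faces they total 1.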

  7. Basic Probability Properties • The sum of probabilities of all possible values of a random variable is exactly 1. • Pr{Coin=HEADS} + Pr{Coin=TAILS} = 1 • The probability of the union of two mutually exclusive events of the same variable is the sum of the separate probabilities of the events • Pr{ Dice=1 ∪ Dice=2 } = Pr{Dice=1} + Pr{Dice=2} = 1/3

  8. Properties of two or more random variables • Tossing two or more coins simultaneously (such that each does not touch any of the others) is a set of (statistically) independent processes (as are the related variables and events). • The probability of the intersection of two or more independent events is the product of the separate probabilities: • Pr{ Coin1=HEADS ∩ Coin2=HEADS } = Pr{Coin1=HEADS} · Pr{Coin2=HEADS}

  9. Conditional Probabilities • The likelihood of an event can change if another event is known to have occurred. • Notation: Pr{A | B} = Pr{A ∩ B} / Pr{B} • Usually we use conditional probabilities like this: Pr{A ∩ B} = Pr{A | B} · Pr{B}

  10. Conditional Probability Example • Problem statement (die picking): • There are 3 identical sacks with colored dice. Sack A has 1 red and 4 green dice, sack B has 2 red and 3 green dice and sack C has 3 red and 2 green dice. We choose 1 sack randomly and pull a die while blind-folded. What is the probability that we chose a red die? • Thinking: • If we pick sack A, there is 0.2 probability that we get a red die • If we pick sack B, there is 0.4 probability that we get a red die • If we pick sack C, there is 0.6 probability that we get a red die • For each sack, there is 1/3 probability that we choose that sack • Result (R stands for “picking red”): Pr{R} = Pr{R|A}Pr{A} + Pr{R|B}Pr{B} + Pr{R|C}Pr{C} = (0.2 + 0.4 + 0.6)/3 = 0.4
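The sack example above can be checked by Monte Carlo simulation. This is a sketch, not part of the lecture; the function name, trial count, and seed are arbitrary:

```python
import random

def estimate_red(trials=100_000, seed=1):
    """Estimate Pr{red} for the dice-picking example: three sacks
    with 1, 2, and 3 red dice out of 5; the exact answer is 0.4."""
    sacks = {"A": 1/5, "B": 2/5, "C": 3/5}  # Pr{red | chosen sack}
    rng = random.Random(seed)
    reds = 0
    for _ in range(trials):
        p_red = sacks[rng.choice("ABC")]  # pick a sack uniformly at random
        if rng.random() < p_red:          # draw one die from that sack
            reds += 1
    return reds / trials
```

With enough trials the estimate converges to the total-probability result Pr{R} = 0.4.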

  11. Moments and expected values • mth moment: E[X^m] • Expected value is the 1st moment: μX = E[X] • mth central moment: E[(X − μX)^m] • 2nd central moment is called variance: Var[X] = σX² = E[(X − μX)²]

  12. Discrete Distributions

  13. Geometric Distribution • Based on random variables that can have 2 outcomes (A and its complement Ā): Pr{A} = p, Pr{Ā} = 1 − p • Expresses the probability of the number of trials needed to obtain the event A. • Example: what is the probability that we need k die rolls in order to obtain a 6? • Formula: Pr{X = k} = (1 − p)^(k−1) p (for the dice example, p = 1/6)
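The geometric pmf above can be written directly; a minimal sketch (the function name is an illustrative choice):

```python
def geometric_pmf(k, p):
    """Pr{first success occurs on trial k}: k-1 failures, then one
    success, i.e. (1 - p)**(k - 1) * p."""
    return (1 - p) ** (k - 1) * p

# Dice example: probability the first 6 appears on roll k, with p = 1/6
p6 = 1 / 6
first_roll = geometric_pmf(1, p6)   # just p = 1/6
third_roll = geometric_pmf(3, p6)   # (5/6)^2 * 1/6
```

Summing the pmf over k = 1, 2, … approaches 1, as required of a probability distribution.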

  14. Geometric distribution example • A wireless network protocol uses a stop-and-wait transmission policy. Each packet has a probability pE of being corrupted or lost. • What is the probability that the protocol will need 3 transmissions to send a packet successfully? • Solution: 3 transmissions is 2 failures and 1 success, therefore: Pr{X = 3} = pE² (1 − pE) • What is the average number of transmissions needed per packet? E[X] = 1/(1 − pE)
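The stop-and-wait example above, as a small sketch; the function names are illustrative, and the success probability per attempt is 1 − pE as in the slide:

```python
def prob_n_transmissions(n, p_e):
    """Pr{exactly n transmissions}: n-1 corrupted/lost attempts
    followed by one success (geometric distribution)."""
    return p_e ** (n - 1) * (1 - p_e)

def mean_transmissions(p_e):
    """Expected transmissions per packet: geometric mean 1/p with
    success probability p = 1 - p_e."""
    return 1 / (1 - p_e)
```

For instance, with pE = 0.1, three transmissions occur with probability 0.1² · 0.9 = 0.009.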

  15. Binomial Distribution • Expresses the probability of some number of events occurring out of a larger set of trials • Example: we roll the die n times. What is the probability that we get k 6’s? • Formula: Pr{X = k} = C(n, k) p^k (1 − p)^(n−k) (for the dice example, p = 1/6)
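The binomial pmf above, sketched in Python (the function name is an illustrative choice):

```python
from math import comb

def binomial_pmf(k, n, p):
    """Pr{exactly k successes in n independent trials},
    C(n, k) * p**k * (1 - p)**(n - k)."""
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

# Dice example: probability of exactly k sixes in n rolls, with p = 1/6
p_two_sixes_in_ten = binomial_pmf(2, 10, 1 / 6)
```

As a sanity check, summing over k = 0 … n gives 1.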

  16. Binomial Distribution Example • Every packet has n bits. There is a probability pB that a bit gets corrupted. • What is the probability that a packet has exactly 1 corrupted bit? n pB (1 − pB)^(n−1) • What is the probability that a packet is not corrupted? (1 − pB)^n • What is the probability that a packet is corrupted? 1 − (1 − pB)^n

  17. Continuous Distributions • Continuous Distributions have: • Probability density function: fX(x) • Distribution function: FX(x) = Pr{X ≤ x} = ∫ from −∞ to x of fX(t) dt

  18. Continuous Distributions • Normal Distribution: f(x) = (1/(σ√(2π))) e^(−(x−μ)²/(2σ²)) • Uniform Distribution: f(x) = 1/(b − a) for a ≤ x ≤ b, 0 otherwise • Exponential Distribution: f(x) = λe^(−λx) for x ≥ 0

  19. Poisson Process

  20. Poisson Distribution • Poisson Distribution is the limit of Binomial when n → ∞ and p → 0, while the product np is constant. In such a case, assessing or using the probability of a specific event makes little sense, yet it makes sense to use the “rate” of occurrence (λ = np). • Example: if the average number of falling stars observed every night is λ, then what is the probability that we observe k stars fall in a night? • Formula: Pr{X = k} = (λ^k / k!) e^(−λ)

  21. Poisson Distribution Example • In a bank branch, a rechargeable Bluetooth device sends a “ping” packet to a computer every time a customer enters the door. • Customers arrive with Poisson distribution of λ customers per day. • The Bluetooth device has a battery capacity of m Joules. Every packet takes n Joules, therefore the device can send u = m/n packets before it runs out of battery. • Assuming that the device starts fully charged in the morning, what is the probability that it runs out of energy by the end of the day? Pr{X > u} = 1 − Σ from k=0 to u of (λ^k / k!) e^(−λ)

  22. Poisson Process • It is an extension of the Poisson distribution in time • It expresses the number of events in a period of time given the rate of occurrence. • Formula: Pr{N(t) = k} = ((λt)^k / k!) e^(−λt) • A Poisson process with rate λ has, over an interval of length t, expected value λt and variance λt. • The time intervals between two events of a Poisson process have an exponential distribution (with mean 1/λ).
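The connection stated above between exponential interarrival times and Poisson counts can be demonstrated by simulation. A sketch with illustrative rate, horizon, and seed:

```python
import random

def count_events(lam, t, rng):
    """Count arrivals in [0, t] by accumulating exponential
    interarrival times with mean 1/lam (a Poisson process)."""
    n, clock = 0, 0.0
    while True:
        clock += rng.expovariate(lam)
        if clock > t:
            return n
        n += 1

rng = random.Random(3)
samples = [count_events(4.0, 2.0, rng) for _ in range(50_000)]
mean = sum(samples) / len(samples)  # theory predicts lam * t = 8
```

The empirical mean (and, similarly, the variance) of the counts approaches λt, as the slide states.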

  23. Poisson dist. example problem • An interrupt service unit takes t0 sec to service an interrupt before it can accept a new one. Interrupt arrivals follow a Poisson process with an average of λ interrupts/sec. • What is the probability that an interrupt is lost? • An interrupt is lost if 2 or more interrupts arrive within a period of t0 sec: Pr{N(t0) ≥ 2} = 1 − e^(−λt0) − λt0 e^(−λt0)
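The loss probability above (2 or more arrivals in a window of t0 seconds) is one minus the Poisson probabilities of 0 and 1 arrivals. A sketch; the function name is an illustrative choice:

```python
from math import exp

def p_interrupt_lost(lam, t_0):
    """Pr{2 or more Poisson(lam) arrivals within t_0 seconds}:
    1 - P(0 arrivals) - P(1 arrival)."""
    a = lam * t_0  # expected number of arrivals in the window
    return 1.0 - exp(-a) - a * exp(-a)
```

For λt0 = 1 this gives 1 − 2e⁻¹ ≈ 0.264; for very small λt0 the loss probability is negligible, matching intuition.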

  24. Queueing Theory

  25. Basic Queueing Process • Arrivals • Arrival time distribution • Calling population (infinite or finite) • Service • Number of servers (one or more) • Service time distribution • Queue • Capacity (infinite or finite) • Queueing discipline • Together these components define the “Queueing System”

  26. Examples and Applications • Call centers (“help” desks, ordering goods) • Manufacturing • Banks • Telecommunication networks • Internet service • Intelligence gathering • Restaurants • Other examples….

  27. Labeling Convention (Kendall-Lee) Interarrival time distribution / Service time distribution / Number of servers / Queueing discipline / System capacity / Calling population size Notes: FCFS first come, first served; LCFS last come, first served; SIRO service in random order; GD general discipline; M Markovian (exponential interarrival times, Poisson number of arrivals); D Deterministic; Ek Erlang with shape parameter k; G General

  28. Labeling Convention (Kendall-Lee) Examples: M/M/1, M/M/5, M/G/1, M/M/3/LCFS, Ek/G/2/∞/10, M/M/1/∞/∞/100

  29. Terminology and Notation • State of the system: Number of customers in the queueing system (includes customers in service) • Queue length: Number of customers waiting for service = State of the system − number of customers being served • N(t) = State of the system at time t, t ≥ 0 • Pn(t) = Probability that exactly n customers are in the queueing system at time t

  30. Terminology and Notation • λn = Mean arrival rate (expected # arrivals per unit time) of new customers when n customers are in the system • s = Number of servers (parallel service channels) • μn = Mean service rate for the overall system (expected # customers completing service per unit time) when n customers are in the system Note: μn represents the combined rate at which all busy servers (those serving customers) achieve service completion.

  31. Terminology and Notation When arrival and service rates are constant for all n, λ = mean arrival rate (expected # arrivals per unit time) μ = mean service rate for a busy server 1/λ = expected interarrival time 1/μ = expected service time ρ = λ/(sμ) = utilization factor for the service facility = expected fraction of time the system’s service capacity (sμ) is being utilized by arriving customers (λ)

  32. Terminology and Notation: Steady State When the system is in steady state, then Pn = probability that exactly n customers are in the queueing system L = expected number of customers in queueing system = Σ from n=0 to ∞ of n Pn Lq = expected queue length (excludes customers being served) = Σ from n=s to ∞ of (n − s) Pn

  33. Terminology and Notation: Steady State When the system is in steady state, then 𝒲 = waiting time in system (includes service time) for each individual customer W = E[𝒲] 𝒲q = waiting time in queue (excludes service time) for each individual customer Wq = E[𝒲q]

  34. Little’s Formula Demonstrates the relationships between L, W, Lq, and Wq • Assume λn = λ and μn = μ (arrival and service rates constant for all n) • In a steady-state queue, L = λW, Lq = λWq, and W = Wq + 1/μ • Intuitive Explanation: a customer who spends an average time W in the system sees, on departure, the λW customers (on average) who arrived during that time still present.

  35. Little’s Formula (continued) • This relationship also holds true for λ̄ (expected arrival rate) when the λn are not equal: L = λ̄W, where λ̄ = Σ from n=0 to ∞ of λn Pn • Recall, Pn is the steady state probability of having n customers in the system

  36. Heading toward M/M/s • The most widely studied queueing models are of the form M/M/s (s = 1, 2, …) • What kind of arrival and service distributions does this model assume? • Reviewing the exponential distribution…. • If T ~ exponential(α), then fT(t) = αe^(−αt) for t ≥ 0, and P(T > t) = e^(−αt)

  37. Exponential Distribution Reviewed If T ~ exponential(α), then E[T] = 1/α Var(T) = 1/α²

  38. Property 1: Strictly Decreasing The pdf of the exponential, fT(t) = αe^(−αt), is a strictly decreasing function of t

  39. Property 2: Memoryless The exponential distribution has lack of memory, i.e. P(T > t+s | T > s) = P(T > t) for all s, t ≥ 0. Example: P(T > 15 min | T > 5 min) = P(T > 10 min) The probability distribution has no memory of what has already occurred.

  40. Property 2: Memoryless • Prove the memoryless property • Is this assumption reasonable? • For interarrival times • For service times
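The proof asked for above follows in one line from the tail probability P(T > t) = e^(−αt):

```latex
\[
P(T > t+s \mid T > s)
  = \frac{P(T > t+s,\; T > s)}{P(T > s)}
  = \frac{P(T > t+s)}{P(T > s)}
  = \frac{e^{-\alpha(t+s)}}{e^{-\alpha s}}
  = e^{-\alpha t}
  = P(T > t).
\]
```

The middle step uses the fact that the event T > t+s implies T > s.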

  41. Property 3: Minimum of Exponentials The minimum of several independent exponential random variables has an exponential distribution If T1, T2, …, Tn are independent r.v.s, Ti ~ expon(αi) and U = min(T1, T2, …, Tn), then U ~ expon(α1 + α2 + … + αn) Example: If there are n servers, each with exponential service times with rate μ, then U = time until next service completion ~ expon(nμ)
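The minimum-of-exponentials property can be checked empirically: the mean of the minimum should be 1/(α1 + … + αn). A sketch with illustrative rates, trial count, and seed:

```python
import random

def min_exponential_mean(rates, trials=200_000, seed=7):
    """Empirical mean of U = min(T1, ..., Tn), Ti ~ expon(rates[i]).
    Theory: U is exponential with rate sum(rates), so mean 1/sum(rates)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += min(rng.expovariate(r) for r in rates)
    return total / trials

m = min_exponential_mean([1.0, 2.0, 3.0])  # theory: 1 / (1 + 2 + 3) = 1/6
```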

  42. Property 4: Poisson and Exponential If the times between events Xn ~ expon(α), then the number of events occurring by time t, N(t) ~ Poisson(αt) Note: E[N(t)] = αt, thus the expected number of events per unit time is α

  43. Property 5: Proportionality For all positive values of t, and for small Δt, P(T ≤ t + Δt | T > t) ≈ αΔt i.e. the probability of an event in an interval Δt is proportional to the length of that interval

  44. Property 6: Aggregation and Disaggregation The Poisson process is unaffected by aggregation and disaggregation • Aggregation: if N1 ~ Poisson(λ1), N2 ~ Poisson(λ2), …, Nk ~ Poisson(λk) are independent, their sum N ~ Poisson(λ) with λ = λ1 + λ2 + … + λk • Disaggregation: if N ~ Poisson(λ) and each event is routed to stream i with probability pi (p1 + p2 + … + pk = 1), then Ni ~ Poisson(piλ)

  45. Using the notations below – the closed forms for various queueing systems are on the next slide. Note the change in notation from previous slides.

  46. M/G/1 … M/M/1 … M/D/1 Ignore the formulas for terms not defined earlier
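As one concrete case, the standard M/M/1 closed forms can be sketched and checked against Little's formula from slide 34. The function name and the example rates λ = 2, μ = 5 are illustrative, and the formulas assume λ < μ (ρ < 1) so a steady state exists:

```python
def mm1_metrics(lam, mu):
    """Standard steady-state metrics for an M/M/1 queue (requires lam < mu)."""
    rho = lam / mu                # utilization factor
    L = rho / (1 - rho)           # expected customers in system
    W = 1 / (mu - lam)            # expected time in system
    Lq = rho ** 2 / (1 - rho)     # expected queue length
    Wq = rho / (mu - lam)         # expected wait in queue
    return L, W, Lq, Wq

L, W, Lq, Wq = mm1_metrics(2.0, 5.0)
```

Little's relationships L = λW, Lq = λWq, and W = Wq + 1/μ all hold exactly for these closed forms.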
