
Discrete Probabilities



  1. Discrete Probabilities CS5303 – Logical Foundations of Computer Science

  2. Applications of probabilities to CS • Algorithms (average complexity) • Probabilistic Algorithms • Modeling and Simulation • Queuing processes • Signal processing

  3. Definition • Probability: the study of processes involving randomness (card games, numbers of phone calls, accesses to a network, how long a system survives, etc.) • Probability theory is the mathematical theory behind random experiments and processes

  4. Trials • Trial/Observation: the result of a random experiment, ω ∈ Ω, where Ω is the set of all possible results Exercise: what is Ω for • I throw 3 quarters in the air and observe heads or tails • I throw 3 quarters and count the number of heads • I throw two distinguishable dice and look at the combination • Same question with the two dice indistinguishable

  5. Events • A random event A is a set of results: A = {ω ∈ Ω | A is realized if ω is the result of the experiment} Exercise: in the case where the two dice are distinguishable, what is A: "the total is at least 10"?

  6. Relations between events • Certain event ↔ Ω • Impossible event ↔ ∅ • Opposite event ↔ Aᶜ • And ↔ ∩ • Or ↔ ∪ • Incompatible events ↔ A ∩ B = ∅ • Exhaustive system ↔ partition • Implication ↔ inclusion A ⊆ B

  7. Probabilisable space • Definition: T ⊆ P(Ω) is a σ-algebra if 1. Ω ∈ T 2. A ∈ T implies Aᶜ ∈ T 3. for all n ∈ ℕ, A_n ∈ T implies ∪_{n∈ℕ} A_n ∈ T • Condition 3 is closure under countable unions • If Ω is finite or countable, we will often take T = P(Ω) (does it satisfy 1, 2, 3?) • Definition: (Ω, T) is called a probabilisable space
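The slide's question ("does T = P(Ω) satisfy 1, 2, 3?") can be checked mechanically for a small finite Ω, where countable unions reduce to finite unions. A minimal sketch (`powerset` and `is_sigma_algebra` are hypothetical helper names, not from the slides):

```python
from itertools import chain, combinations

def powerset(omega):
    """All subsets of a finite set Ω, as frozensets."""
    s = list(omega)
    return {frozenset(c) for c in chain.from_iterable(
        combinations(s, r) for r in range(len(s) + 1))}

def is_sigma_algebra(T, omega):
    """Check conditions 1-3 on a finite family T of subsets of Ω.
    For finite Ω, closure under pairwise unions suffices for condition 3."""
    omega = frozenset(omega)
    if omega not in T:                       # 1. Ω ∈ T
        return False
    if any(omega - a not in T for a in T):   # 2. closed under complement
        return False
    return all(a | b in T for a in T for b in T)  # 3. closed under unions

omega = {1, 2, 3}
assert is_sigma_algebra(powerset(omega), omega)   # P(Ω) satisfies 1, 2, 3
# {∅, Ω, {1}} fails: the complement {2, 3} of {1} is missing
assert not is_sigma_algebra({frozenset(), frozenset(omega), frozenset({1})}, omega)
```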

  8. Probability Space • Definition: let (Ω, T) be a probabilisable space. Then P: T → [0,1] is a probability if: 1. For all A in T, P(A) ∈ [0,1] 2. P(Ω) = 1 3. If (A_n)_{n∈ℕ} is a family of elements of T, pairwise disjoint, then P(∪_n A_n) = ∑_n P(A_n) • Definition: (Ω, T, P) is a probability space

  9. Exercise • Let Ω be finite and non-empty, and T = P(Ω). Define P: T → [0,1] by P(A) = |A|/|Ω| for A in T. Show that P is a probability.
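As a sanity check (not a proof), the uniform measure P(A) = |A|/|Ω| can be tested numerically on the two-distinguishable-dice sample space from slide 4, using exact fractions; the event A is the slide 5 exercise "the total is at least 10":

```python
from fractions import Fraction
from itertools import product

# Ω: two distinguishable dice; uniform P(A) = |A| / |Ω|
omega = set(product(range(1, 7), repeat=2))

def P(A):
    return Fraction(len(A & omega), len(omega))

assert P(omega) == 1                          # axiom 2
A = {w for w in omega if sum(w) >= 10}        # "total is at least 10"
B = {w for w in omega if w[0] == 1}
assert 0 <= P(A) <= 1                         # axiom 1
assert P(A | B) == P(A) + P(B) - P(A & B)     # additivity (A, B disjoint here)
print(P(A))  # prints 1/6
```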

  10. Properties • P(A ∪ B) = P(A) + P(B) − P(A ∩ B) • P(Aᶜ) = 1 − P(A) • If A ⊆ B then P(A) ≤ P(B) • P(∪_{i=1..n} A_i) ≤ ∑_{i=1..n} P(A_i) • If (A_n) is an increasing sequence for ⊆ then P(∪_{n∈ℕ} A_n) = lim_n P(A_n) • If (A_n) is a decreasing sequence for ⊆ then P(∩_{n∈ℕ} A_n) = lim_n P(A_n) • Inclusion-exclusion: P(∪_{i=1..n} A_i) = ∑_{k=1..n} (−1)^{k−1} ∑_{i_1<…<i_k} P(A_{i_1} ∩ … ∩ A_{i_k}) Exercise: A and B shoot at a target, independently. A hits 25% of the time and B hits 40%. What is the probability that the target is hit?
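One way to work the target exercise, assuming the two shots are independent (so that P(A ∩ B) = P(A)·P(B)) and applying the first property above:

```python
from fractions import Fraction

# P(A hits) = 25%, P(B hits) = 40%, shots assumed independent
pA, pB = Fraction(1, 4), Fraction(2, 5)
# P(A ∪ B) = P(A) + P(B) − P(A ∩ B), with P(A ∩ B) = P(A)·P(B)
p_hit = pA + pB - pA * pB
print(p_hit)  # prints 11/20, i.e. 0.55
```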

  11. Conditional probabilities and independent events • Probability that someone is color blind? • Probability that a man is color blind? • Probability that a woman is color blind? • Definition: let (Ω, T, P) be a probability space and A a non-null event; then P(B|A) = P(A ∩ B)/P(A) • Property: P(·|A) is a probability over (Ω, T)

  12. Example • Consider families with 2 children. Then Ω = {BB, GG, BG, GB}, writing the children in decreasing order of age. Knowing there is at least one boy, what is the probability that there are 2? • A = {BB, GB, BG} (at least one boy), B = {BB} (two boys) • P(B|A) = |A ∩ B|/|A| = 1/3
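The same answer can be obtained by enumerating Ω directly, a useful habit for checking conditional-probability intuitions:

```python
from fractions import Fraction
from itertools import product

omega = set(product("BG", repeat=2))          # {BB, BG, GB, GG}, uniform
A = {w for w in omega if "B" in w}            # at least one boy
B = {("B", "B")}                              # two boys
P = lambda E: Fraction(len(E), len(omega))

p_B_given_A = P(B & A) / P(A)                 # P(B|A) = P(A ∩ B)/P(A)
print(p_B_given_A)  # prints 1/3
```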

  13. Generalization • Property: let A_1, …, A_n be events with P(A_1 ∩ … ∩ A_{n−1}) > 0; then • P(A_1 ∩ … ∩ A_n) = P(A_1) P(A_2|A_1) P(A_3|A_1 ∩ A_2) … P(A_n|A_1 ∩ … ∩ A_{n−1}) Proof is by induction on n Exercise: consider the transmission of a yes/no message in a given population. Each person transmits the message received with probability p and the opposite message with probability q = 1 − p. Let X_n be the message received by the nth individual I_n. We assume that I_0 gives yes to I_1. What is P(X_n = yes)?
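One way to work the transmission exercise: writing u_n = P(X_n = yes), conditioning on X_n gives the recurrence u_{n+1} = p·u_n + q·(1 − u_n) with u_1 = 1 (I_1 receives yes for sure), whose solution is u_n = 1/2 + (2p − 1)^{n−1}/2. A sketch checking the recurrence against that closed form:

```python
def p_yes(n, p):
    """P(X_n = yes): I_1 receives 'yes' for sure, then each step keeps
    the message with probability p and flips it with probability 1-p."""
    u = 1.0                        # u_1 = P(X_1 = yes) = 1
    for _ in range(n - 1):
        u = p * u + (1 - p) * (1 - u)
    return u

p, n = 0.9, 5
closed = 0.5 + 0.5 * (2 * p - 1) ** (n - 1)   # solving the recurrence
assert abs(p_yes(n, p) - closed) < 1e-12
```

Note that u_n → 1/2 as n grows whenever 0 < p < 1: after many hops the received message carries no information.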

  14. Bayes Formula • Lemma (total probability): let (Ω, T, P) be a probability space and (A_k)_{k∈ℕ} an exhaustive system; then if B is an event: P(B) = ∑_{k∈ℕ} P(B|A_k) P(A_k) Exercise: prove it • Theorem: if B is not null then, under the hypotheses of the lemma: P(A_n|B) = P(A_n) P(B|A_n) / ∑_{k∈ℕ} P(B|A_k) P(A_k)

  15. Application • A city is divided into 3 political areas:
Area number | % of electors | score of C
1 | 30 | 40
2 | 50 | 48
3 | 20 | 60
An elector is picked at random. What is the probability that he/she voted for C? Given that he/she voted for C, what is the probability that he/she is from area 3?
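One way to work this application, applying the total-probability lemma and then Bayes' formula with the three areas as the exhaustive system:

```python
# Electoral areas from the table: (share of electors, score of C)
areas = {1: (0.30, 0.40), 2: (0.50, 0.48), 3: (0.20, 0.60)}

# Total probability: P(C) = Σ_k P(C | area k) · P(area k)
p_C = sum(share * score for share, score in areas.values())

# Bayes: P(area 3 | C) = P(area 3) · P(C | area 3) / P(C)
p_3_given_C = areas[3][0] * areas[3][1] / p_C
print(round(p_C, 2), round(p_3_given_C, 2))  # prints 0.48 0.25
```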

  16. Random Variables • Definition: let (Ω, T, P) be a probability space. A (real-valued) random variable on this space is a function X: Ω → ℝ such that for any open interval I of ℝ, X⁻¹(I) ∈ T • Similarly, a discrete random variable from Ω to a countable set D is such that X⁻¹(d) ∈ T for any d in D

  17. Distribution • Definition: let X be a d.r.v. The probability distribution of X is defined by P_X(d) = P(X = d) • The repartition (cumulative distribution) function of X is defined by F_X(d) = P(X < d) = ∑_{d' < d} P_X(d')
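A small sketch of both definitions for a fair die, with F_X using the strict inequality P(X < d) exactly as defined above:

```python
from fractions import Fraction

# Fair die: distribution P_X and repartition function F_X(d) = P(X < d)
D = range(1, 7)
P_X = {d: Fraction(1, 6) for d in D}
F_X = lambda d: sum(P_X[dp] for dp in D if dp < d)

assert F_X(1) == 0                   # nothing strictly below 1
assert F_X(4) == Fraction(1, 2)      # P(X < 4) = P({1, 2, 3})
assert F_X(7) == 1                   # everything is strictly below 7
```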

  18. Mathematical Expectation • Definition: if X is a d.r.v. then its mathematical expectation is defined by E(X) = ∑_{ω∈Ω} X(ω) P(ω) = ∑_{d∈D} d · P(X = d) • The mathematical expectation is a linear form over the space of discrete random variables (defined on the same sets) • Property: if Y = f(X) with f: D → ℝ, then E(Y) = ∑_{d∈D} f(d) · P(X = d)
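The definition and the two properties (linearity, and E(f(X)) = ∑ f(d)·P(X = d)) can be illustrated on a fair die:

```python
from fractions import Fraction

D = range(1, 7)                                # fair die
P_X = {d: Fraction(1, 6) for d in D}

# E(f(X)) = Σ_{d∈D} f(d) · P(X = d)
E = lambda f: sum(f(d) * P_X[d] for d in D)

assert E(lambda d: d) == Fraction(7, 2)        # E(X) = 3.5
# Linearity: E(2X + 3) = 2·E(X) + 3
assert E(lambda d: 2 * d + 3) == 2 * E(lambda d: d) + 3
```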

  19. Variance - Covariance • The quadratic moment (when it exists) is defined by E(X²) = ∑_{d∈D} d² · P(X = d) • The variance is defined by var(X) = E(X²) − (E(X))² • The covariance of X, Y is defined by Γ(X,Y) = E[(X − E(X))(Y − E(Y))] = E(XY) − E(X)E(Y) • The coefficient of correlation is defined by ρ(X,Y) = Γ(X,Y) / (σ(X) σ(Y)), where σ(X) = √var(X)
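A sketch of these formulas on a fair die X, paired with the fully dependent variable Y = 7 − X (so that σ(Y) = σ(X) and ρ comes out exactly −1):

```python
from fractions import Fraction

D = range(1, 7)                                # fair die X; Y = 7 - X
P = Fraction(1, 6)
E = lambda f: sum(f(d) * P for d in D)

EX = E(lambda d: d)
var_X = E(lambda d: d * d) - EX ** 2                       # E(X²) − (E(X))²
cov = E(lambda d: d * (7 - d)) - EX * E(lambda d: 7 - d)   # Γ(X, Y)
rho = cov / var_X                       # σ(X)·σ(Y) = var(X) since Y = 7 − X

assert var_X == Fraction(35, 12)
assert rho == -1                        # Y decreases exactly as X increases
```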

  20. Properties • If X and Y are independent random variables then • E(XY)=E(X)E(Y) • var(X+Y) = var(X)+var(Y)
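Both independence properties can be checked exhaustively on two independent fair dice, using the uniform joint distribution over the 36 outcomes:

```python
from fractions import Fraction
from itertools import product

omega = list(product(range(1, 7), repeat=2))   # two independent fair dice
P = Fraction(1, len(omega))
E = lambda f: sum(f(x, y) * P for x, y in omega)
var = lambda f: E(lambda x, y: f(x, y) ** 2) - E(f) ** 2

# E(XY) = E(X)·E(Y)
assert E(lambda x, y: x * y) == E(lambda x, y: x) * E(lambda x, y: y)
# var(X + Y) = var(X) + var(Y)
assert var(lambda x, y: x + y) == var(lambda x, y: x) + var(lambda x, y: y)
```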

  21. Classical distributions • Bernoulli law: X: Ω → {0,1}; the law is denoted B(p). P(X=1) = p, P(X=0) = 1 − p • Find E(X) and var(X) • Binomial law: sum of n independent Bernoulli laws; denoted b(n,p): P(X=k) = (n choose k) p^k (1−p)^{n−k} • Find E(X) and var(X) • Poisson law: X: Ω → ℕ follows a Poisson law of parameter λ if P(X=k) = e^{−λ} · λ^k / k! • Find E(X) and var(X) • For n large and p small, the Poisson law of parameter λ = np is an approximation of the binomial law
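A sketch of the binomial and Poisson mass functions, checking E(X) = np numerically and illustrating the approximation claim for n large, p small (the tolerance 1e-3 is an illustrative choice, not a stated bound):

```python
from math import comb, exp, factorial

def binom_pmf(k, n, p):
    """b(n,p): P(X = k) = (n choose k) p^k (1-p)^(n-k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k, lam):
    """Poisson(λ): P(X = k) = e^{-λ} λ^k / k!."""
    return exp(-lam) * lam**k / factorial(k)

n, p = 1000, 0.002            # n large, p small
lam = n * p                   # matching parameter λ = np
E_binom = sum(k * binom_pmf(k, n, p) for k in range(n + 1))
assert abs(E_binom - n * p) < 1e-6           # E(X) = np for b(n,p)

# The Poisson pmf tracks the binomial pmf pointwise
for k in range(10):
    assert abs(binom_pmf(k, n, p) - poisson_pmf(k, lam)) < 1e-3
```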
