
CSE 221: Probabilistic Analysis of Computer Systems


Presentation Transcript


  1. CSE 221: Probabilistic Analysis of Computer Systems
     Topics covered: Course outline and schedule; Introduction (Sec. 1.1-1.4)

  2. General information
     CSE 221: Probabilistic Analysis of Computer Systems
     Instructor: Swapna S. Gokhale
     Phone: 6-2772
     Email: ssg@engr.uconn.edu
     Office: ITEB 237
     Lecture time: Mon/Fri 12:30 – 1:45 pm
     Office hours: By appointment (I will hang around for a few minutes at the end of each class).
     Web page: http://www.engr.uconn.edu/~ssg/cse221.html
     (Lecture notes, homeworks, and general announcements will be posted on the web page.)

  3. Course goals
  • Appreciation of and motivation for the study of probability theory.
  • Definition of a probability model.
  • Application of discrete and continuous random variables.
  • Computation of expectation and moments.
  • Application of discrete- and continuous-time Markov chains.
  • Estimation of the parameters of a distribution.
  • Testing hypotheses about distribution parameters.

  4. Expected learning outcomes
  • Sample space and events:
    • Define the sample space (set of outcomes) of a random experiment and identify events of interest and independent events on that sample space.
    • Compute conditional and posterior probabilities using Bayes' rule (sketched below this slide).
    • Identify and compute probabilities for a sequence of Bernoulli trials.
  • Discrete random variables:
    • Define a discrete random variable on a sample space along with its associated probability mass function.
    • Compute the distribution function of a discrete random variable.
    • Apply special discrete random variables to real-life problems.
    • Compute the probability generating function of a discrete random variable.
    • Compute the joint pmf of a vector of discrete random variables.
    • Determine whether a set of random variables is independent.
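A minimal sketch of the Bayes'-rule computation named above (the notation and the numbers here are assumed for illustration, not taken from the slides): if events B_1, ..., B_n partition the sample space and A is an event with P(A) > 0, then

  \[ P(B_j \mid A) = \frac{P(A \mid B_j)\,P(B_j)}{\sum_{i=1}^{n} P(A \mid B_i)\,P(B_i)} \]

For example, if a part comes from supplier B_1 with probability 0.6 or supplier B_2 with probability 0.4, and it is defective (event A) with probability 0.01 or 0.05 respectively, then P(B_2 | A) = (0.05)(0.4) / [(0.01)(0.6) + (0.05)(0.4)] = 0.020 / 0.026 ≈ 0.77.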

  5. Expected learning outcomes (contd.)
  • Continuous random variables:
    • Define general distribution and density functions.
    • Apply special continuous random variables to real problems.
    • Define and apply the concepts of reliability, conditional failure rate, hazard rate, and the inverse bath-tub curve (the basic reliability quantities are sketched below this slide).
  • Expectation and moments:
    • Obtain the expectation, moments, and transforms of special and general random variables.
  • Stochastic processes:
    • Define and classify stochastic processes.
    • Derive the metrics for Bernoulli and Poisson processes.
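A brief sketch of the reliability and hazard-rate quantities named above, in notation assumed here (X is a lifetime random variable with density f and distribution function F):

  \[ R(t) = P(X > t) = 1 - F(t), \qquad h(t) = \frac{f(t)}{R(t)} = \frac{f(t)}{1 - F(t)} \]

For an exponential lifetime with rate \lambda, R(t) = e^{-\lambda t} and h(t) = \lambda, a constant failure rate; non-constant hazard rates give rise to the bath-tub-type curves mentioned above.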

  6. Expected learning outcomes (contd.)
  • Discrete-time Markov chains:
    • Define the state space, state transitions, and transition probability matrix.
    • Compute the steady-state probabilities (sketched below this slide).
    • Analyze the performance and reliability of a software application based on its architecture.
  • Statistical inference:
    • Understand the role of statistical inference in applying probability theory.
    • Derive the maximum likelihood estimators for general and special random variables.
    • Test two-sided hypotheses concerning the mean of a random variable.
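A minimal sketch of the steady-state computation named above (the chain and its numbers are assumed for illustration): for a finite, irreducible, aperiodic DTMC with transition probability matrix P, the steady-state vector \pi solves

  \[ \pi = \pi P, \qquad \sum_i \pi_i = 1 \]

For example, a two-state chain with P(1→1) = 0.9, P(1→2) = 0.1, P(2→1) = 0.5, P(2→2) = 0.5 gives \pi_1 = 0.9\pi_1 + 0.5\pi_2, hence \pi_1 = 5\pi_2, and with \pi_1 + \pi_2 = 1 this yields \pi_1 = 5/6 and \pi_2 = 1/6.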

  7. Expected learning outcomes (contd.)
  • Continuous-time Markov chains:
    • Define the state space, state transitions, and generator matrix.
    • Compute the steady-state or limiting probabilities (sketched below this slide).
    • Model real-world phenomena as birth-death processes and compute their limiting probabilities.
    • Model real-world phenomena as pure-birth and pure-death processes.
    • Model and compute system availability.
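A brief sketch of the CTMC limiting probabilities named above, under standard assumptions (irreducible chain with generator matrix Q; the notation is assumed here): the limiting vector \pi solves

  \[ \pi Q = 0, \qquad \sum_i \pi_i = 1 \]

For a birth-death process with birth rates \lambda_k and death rates \mu_k, the balance equations \lambda_k \pi_k = \mu_{k+1} \pi_{k+1} give \pi_k = \pi_0 \prod_{i=0}^{k-1} \lambda_i / \mu_{i+1}; with constant rates \lambda and \mu (the M/M/1 queue) this reduces to \pi_k = (1-\rho)\rho^k with \rho = \lambda/\mu < 1.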

  8. Textbooks
  • Required textbook:
    • K. S. Trivedi, Probability and Statistics with Reliability, Queuing and Computer Science Applications, Second Edition, John Wiley.
    • (The book will be available the week of Sept. 6.)

  9. Course topics
  • Introduction (Ch. 1, Sec. 1.1-1.5, 1.7-1.11):
    • Sample space and events, event algebra, probability axioms, combinatorial problems, independent events, Bayes' rule, Bernoulli trials.
  • Discrete random variables (Ch. 2, Sec. 2.1-2.4, 2.5.1-2.5.3, 2.5.5, 2.5.7, 2.7-2.9):
    • Definition of a discrete random variable; probability mass and distribution functions; Bernoulli, Binomial, Geometric, Modified Geometric, Poisson, and Uniform pmfs; probability generating function; discrete random vectors; independent random variables.
  • Continuous random variables (Ch. 3, Sec. 3.1-3.3, 3.4.6, 3.4.7):
    • Probability density and cumulative distribution functions, Exponential and Uniform distributions, reliability and failure rate, Normal distribution.

  10. Course topics (contd.)
  • Expectation (Ch. 4, Sec. 4.1-4.4, 4.5.2-4.5.7):
    • Expectation of single and multiple random variables, moments and transforms.
  • Stochastic processes (Ch. 6, Sec. 6.1, 6.3, and 6.4):
    • Definition and classification of stochastic processes, Bernoulli and Poisson processes.
  • Discrete-time Markov chains (Ch. 7, Sec. 7.1-7.3):
    • Definition, transition probabilities, steady-state concept; application of discrete-time Markov chains to software performance and reliability analysis.
  • Statistical inference (Ch. 10, Sec. 10.1, 10.2.2, 10.3.1):
    • Motivation; maximum likelihood estimates for the parameters of the Bernoulli, Binomial, Geometric, Poisson, Exponential, and Normal distributions; parameter estimation of discrete-time Markov chains (DTMCs); hypothesis testing.

  11. Course topics (contd.)
  • Continuous-time Markov chains (Ch. 8, Sec. 8.1-8.3, 8.4.1):
    • Definition, generator matrix, computation of steady-state/limiting probabilities, birth-death processes, M/M/1 and M/M/m queues, pure-birth and pure-death processes, availability analysis.

  12. Course topics and exams calendar
  Week #1 (Aug. 28):
    1. Aug. 28: Logistics, Introduction, Sample space, Events
    2. Sept. 1: Event algebra, Probability axioms, Combinatorial problems
  Week #2 (Sept. 4):
    Sept. 4: Labor Day (no class)
    3. Sept. 8: Combinatorial problems, Conditional probability, Independent events
  Week #3 (Sept. 11):
    Sept. 11: No class
    4. Sept. 15: Bayes' rule, Bernoulli trials (HW #1)
  Week #4 (Sept. 18):
    5. Sept. 18: Discrete random variables, Mass and distribution functions
    6. Sept. 22: Bernoulli, Binomial, and Geometric pmfs
  Week #5 (Sept. 25):
    7. Sept. 25: Poisson pmf, Probability generating function (PGF)
    8. Sept. 29: Discrete random vectors, Independent random variables (HW #2)

  13. Course topics and exams calendar (contd.)
  Week #6 (Oct. 2):
    9. Oct. 2: Continuous random variables, Uniform & Normal distributions
    10. Oct. 6: Exponential distribution, Reliability, Failure rate (HW #3)
  Week #7 (Oct. 9):
    11. Oct. 9: Expectation of random variables, Moments
    12. Oct. 13: Multiple random variables, Transform methods
  Week #8 (Oct. 16):
    13. Oct. 16: Moments and transforms of some distributions
    14. Oct. 20: Stochastic processes, Bernoulli and Poisson processes (HW #4)
  Week #9 (Oct. 23):
    15. Oct. 23: Discrete-time Markov chains
    16. Oct. 27: Discrete-time Markov chains
  Week #10 (Oct. 30):
    17. Oct. 30: Discrete-time Markov chains (HW #5)
    18. Nov. 3: Statistical inference, Parameter estimation
  Week #11 (Nov. 6):
    19. Nov. 6: Statistical inference, Parameter estimation
    Nov. 10: No class

  14. Course topics and exams calendar (contd.)
  Week #12 (Nov. 13):
    20. Nov. 13: Hypothesis testing (HW #6)
    21. Nov. 17: Continuous-time Markov chains, Birth-death process (Project)
  Week #13 (Nov. 20): Thanksgiving (no class)
  Week #14 (Nov. 27):
    22. Nov. 27: Simple queuing models
    23. Dec. 1: Simple queuing models (contd.)
  Week #15 (Dec. 4):
    24. Dec. 4: Pure-birth/pure-death processes, Availability analysis (HW #7)
    25. Dec. 8: Overview

  15. Assignment/Homework logistics
  • There will be approximately one homework per topic.
  • One week will be allocated to complete each homework.
  • Homeworks will not be graded, but I encourage you to do them since the exam problems will be similar to the homework problems.
  • The solution to each homework will be provided after a week.
  • The homework schedule is as follows:
    • HW #1 (Handed: Sept. 15, Lectures #1-#4)
    • HW #2 (Handed: Sept. 29, Lectures #5-#8)
    • HW #3 (Handed: Oct. 6, Lectures #9-#10)
    • HW #4 (Handed: Oct. 20, Lectures #11-#14)
    • HW #5 (Handed: Oct. 30, Lectures #15-#17)
    • HW #6 (Handed: Nov. 13, Lectures #18-#20)
    • HW #7 (Handed: Dec. 4, Lectures #21-#24)

  16. Exam logistics
  • Exams will have problems similar to those in the homeworks.
  • Exam I (Oct. 6): Lectures 1 through 8
  • Exam II (Nov. 3): Lectures 9 through 14
  • Exam III (Dec. 1): Lectures 15 through 20
  • Exams will be take-home.

  17. Project logistics
  • The project will be handed out the week before Thanksgiving and will be due in the last week of classes.
  • 2-3 problems:
    • Experimenting with design options to explore tradeoffs and to determine which system has better performance, reliability, etc.
    • Parameter estimation and hypothesis testing with real data.
    • May involve some programming (can be done using Java, Matlab, etc.).
  • The project report must describe:
    • The approach used to solve the problem.
    • Results and analysis.

  18. Grading system
  • Homeworks – 0% (ungraded)
  • Midterms – 45% (three midterms, 15% each)
  • Project – 25% (two to three problems)
  • Final – 30% (heavy emphasis on the final)

  19. Attendance policy
  • Attendance is not mandatory, but attending classes helps!
  • Many examples and derivations presented in class are not in the book.
  • Problems and examples covered in class are fair game for the exams.
  • Not everything covered in class is in the lecture notes.

  20. Feedback Please provide informal feedback early and often, before the formal review process.

  21. Introduction and motivation
  • Why study probability theory?
  • Answer questions such as:

  22. Probability model
  • Examples of random/chance phenomena:
  • What is a probability model?

  23. Sample space
  • Definition:
  • Example: Status of a computer system
  • Example: Status of two components: CPU and memory (sketched below)
  • Example: Outcomes of three coin tosses (sketched below)
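One possible rendering of the last two example sample spaces (the labels used here are assumed, not taken from the slides): if each of the two components, CPU and memory, is either up (u) or down (d), then

  \[ S = \{ (u,u),\ (u,d),\ (d,u),\ (d,d) \} \]

and for three coin tosses, writing H for heads and T for tails,

  \[ S = \{ HHH, HHT, HTH, HTT, THH, THT, TTH, TTT \}, \qquad |S| = 2^3 = 8 \]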

  24. Types of sample space
  • Based on the number of elements in the sample space:
  • Example: Coin toss (see below)
  • Countably finite/infinite
  • Countably infinite
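A short illustration of the two countable cases listed above (the examples are assumed here): a single coin toss has the countably finite sample space \( \{H, T\} \), while the number of tosses needed to observe the first head has the countably infinite sample space \( \{1, 2, 3, \ldots\} \).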

  25. Events
  • Definition of an event:
  • Example: Sequence of three coin tosses (sketched below)
  • Example: System up
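A minimal sketch of how such events can be written down (the particular events chosen here are assumed for illustration): an event is a subset of the sample space, so for three coin tosses the event "at least two heads" is

  \[ A = \{ HHH, HHT, HTH, THH \} \subset S \]

and in the two-component example, "system up" could be the subset of states in which every required component is up, e.g., A = {(u,u)} if both the CPU and the memory are needed.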

  26. Events (contd.)
  • Universal event
  • Null event
  • Elementary event
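For reference, the standard meanings of the three terms above, stated in the notation of the earlier coin-toss example: the universal event is the entire sample space S and always occurs; the null event is the empty set \( \emptyset \) and never occurs; an elementary event contains exactly one outcome, e.g., \( \{HHH\} \).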
