
Business Statistics For Contemporary Decision Making, 9th Edition


Presentation Transcript


  1. Business Statistics For Contemporary Decision Making, 9th Edition. Ken Black. Chapter 4: Probability

  2. Learning Objectives 1. Describe what probability is and when one would use it. 2. Differentiate among three methods of assigning probabilities: the classical method, relative frequency of occurrence, and subjective probability. 3. Deconstruct the elements of probability by defining experiments, sample spaces, and events; classifying events as mutually exclusive, collectively exhaustive, complementary, or independent; and counting possibilities. 4. Compare marginal, union, joint, and conditional probabilities by defining each one.

  3. Learning Objectives 5. Calculate probabilities using the general law of addition, along with a joint probability table, the complement of a union, or the special law of addition if necessary. 6. Calculate joint probabilities of both independent and dependent events using the general and special laws of multiplication. 7. Calculate conditional probabilities with various forms of the law of conditional probability, and use them to determine if two events are independent. 8. Calculate conditional probabilities using Bayes' rule.

  4. 4.1 Introduction to Probability • Inferential statistics involves taking a sample and using sample statistics to infer the values of population parameters. • Laws of probability can often allow the researcher to assign a probability that the inference is correct.

  5. 4.1 Introduction to Probability Classical Method of Assigning Probabilities • An experiment is a process that produces outcomes. • An event is an outcome of an experiment. • In the classical method, the probability of an individual event is determined by the ratio of the number of outcomes in which the event occurs (ne) to the total number of possible outcomes (N). • Each outcome is equally likely. • P(E) = ne / N, where N = total number of possible outcomes and ne = the number of outcomes in which event E occurs.

  6. 4.1 Introduction to Probability Classical Method of Assigning Probabilities, continued. • Because ne can never be greater than N, the highest value of a probability is 1. • The lowest probability, if none of the N possibilities has the desired characteristic, e, is 0. • Thus 0 ≤ P(E) ≤ 1. • Example: Machine A always produces 40% of products, and always has a 10% defective rate. • Thus the probability that a product is defective and came from Machine A is 0.04. • A priori probability: the probability can be determined before the experiment takes place.
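
The classical calculation can be sketched in a few lines of Python. This is an illustrative sketch, not part of the original slides: the die example is an assumed illustration of P(E) = ne / N, while the Machine A figures come from the slide above.

```python
# Minimal sketch of the classical (a priori) method: count the outcomes
# in which the event occurs and divide by all equally likely outcomes.

def classical_probability(favorable_outcomes, sample_space):
    """P(E) = n_e / N for equally likely outcomes."""
    return len(favorable_outcomes) / len(sample_space)

# Assumed illustration: rolling one die, probability of an even number.
sample_space = [1, 2, 3, 4, 5, 6]
evens = [x for x in sample_space if x % 2 == 0]
print(classical_probability(evens, sample_space))  # 0.5

# Machine A example from the slide: 40% of products, 10% defective,
# so a priori P(defective and from Machine A) = 0.40 * 0.10 = 0.04.
print(round(0.40 * 0.10, 2))  # 0.04
```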

  7. 4.1 Introduction to Probability Relative Frequency of Occurrence • Probability of an event occurring is equal to the number of times the event has occurred in the past divided by the total number of opportunities for the event to have occurred. • P(E) = (number of times the event has occurred) / (total number of opportunities for the event to have occurred). • Based on historical data, and the past may or may not be a good predictor of the future. • Example: A company wants to know the probability that its inspectors will reject a shipment from a particular supplier. • They have received 90 shipments in the past and rejected 10. • Thus the probability that they will reject the next shipment is 10/90 ≈ 0.11.
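
A relative-frequency probability is just a ratio of historical counts; the short Python sketch below (not from the slides) uses the shipment figures quoted above.

```python
# Relative frequency of occurrence: past occurrences divided by the
# total number of past opportunities for the event.
rejected_shipments = 10
total_shipments = 90

p_reject = rejected_shipments / total_shipments
print(round(p_reject, 2))  # 0.11
```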

  8. 4.1 Introduction to Probability Subjective Probability • Based on the insights or feelings of the person determining the probability. • Different individuals may (correctly or incorrectly) assign different numeric probabilities to the same event. • Examples: • An experienced airline mechanic estimates the probability that a particular plane will have a certain type of defect. • A doctor assigns a probability to the expected life span of a patient with cancer.

  9. 4.2 Structure of Probability • Experiment: a process that produces outcomes. • Sampling every 200th bottle of cola and weighing it. • Auditing every 10th account. • Testing drugs on patients and recording outcomes. • Event: an outcome of an experiment. • There are 10 bottles that are too full. • There are 3 accounts with problems. • Half of the patients improve. • Elementary event: an event that cannot be decomposed or broken down into other events. • Suppose that the experiment is to roll a die. • Elementary events are to roll a 1, a 2, a 3, etc. • Elementary events are denoted by lowercase letters. • In this case, there are six elementary events: e1 = 1, e2 = 2, e3 = 3, e4 = 4, e5 = 5, e6 = 6.

  10. 4.2 Structure of Probability • Sample Space: a complete roster/listing of all elementary events for an experiment • Rolling a single die: sample space is {1,2,3,4,5,6} • For the experiment of rolling 2 dice, the sample space has 36 elementary events.

  11. 4.2 Structure of Probability Unions and Intersections • Set notation is the use of braces to group numbers. • The union of sets X and Y is denoted X ∪ Y. • An element is part of the union if it is in set X, set Y, or both. • “X or Y” • The intersection of sets X and Y is denoted X ∩ Y. • An element is part of the intersection if it is in both set X and set Y. • “X and Y”
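
Python's built-in set type mirrors this notation directly. The sketch below is illustrative only and uses assumed example sets, not values from the slides.

```python
# Union and intersection with Python sets (assumed example values).
X = {1, 2, 3, 4}
Y = {3, 4, 5, 6}

print(X | Y)  # union: {1, 2, 3, 4, 5, 6} -> "X or Y"
print(X & Y)  # intersection: {3, 4}      -> "X and Y"
```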

  12. 4.2 Structure of Probability Mutually Exclusive Events • Mutually exclusive events are events with no common outcomes: P(X ∩ Y) = 0. • Occurrence of one event precludes the occurrence of the other event. • Example: if you toss a coin and get heads, you cannot get tails on that toss.

  13. 4.2 Structure of Probability Independent Events Events are independent if the occurrence of one event does not affect the occurrence or nonoccurrence of the other event. • The probability of someone wearing glasses is unlikely to affect the probability that the person likes milk. • Many events are not independent. • The probability of carrying an umbrella changes when the weather forecast predicts rain. If events X and Y are independent, then P(X|Y) = P(X) and P(Y|X) = P(Y), where P(X|Y) is the probability that X occurs given that Y has occurred.

  14. 4.2 Structure of Probability Collectively Exhaustive Events A list of collectively exhaustive events contains all possible elementary events for an experiment. • The elementary events in a sample space are mutually exclusive and collectively exhaustive. Complementary Events All elementary events not in the event X are in its complementary event, X′, which is called “not X.” • P(X′) = 1 − P(X).

  15. 4.2 Structure of Probability Counting the Possibilities The mn Counting Rule: • If an operation can be done m ways and a second operation can be done n ways, then there are mn ways for the two operations to occur in order. • A cafeteria offers 5 salads, 4 meats, 8 vegetables, 3 breads, 4 desserts, and 3 drinks. A meal consists of one selection from each category. • How many meals are available? • 5 × 4 × 8 × 3 × 4 × 3 = 5,760 Sampling from a Population with Replacement: • Sampling n items from a population of size N with replacement provides N^n possibilities. • Six lottery numbers are drawn from the digits 0–9, with replacement. • There are 10^6 = 1,000,000 possible outcomes.
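
Both counting rules reduce to simple multiplication; here is a quick sketch (not part of the slides) using the cafeteria and lottery numbers above.

```python
# mn counting rule: one choice from each of the six menu categories.
meals = 5 * 4 * 8 * 3 * 4 * 3
print(meals)       # 5760

# Sampling with replacement: N**n possibilities.
N, n = 10, 6       # digits 0-9, six lottery numbers drawn with replacement
print(N ** n)      # 1000000
```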

  16. 4.3 Marginal, Union, Joint, and Conditional Probabilities • Marginal probability, P(X): the probability of X occurring. • Union probability, P(X ∪ Y): the probability of X or Y occurring. • Joint probability, P(X ∩ Y): the probability of X and Y occurring. • Conditional probability, P(X|Y): the probability of X occurring given that Y has occurred.

  17. 4.4 Addition Laws The General Law of Addition is used to find the probability of the union of two events: P(X ∪ Y) = P(X) + P(Y) − P(X ∩ Y).

  18. 4.4 Addition Laws Addition Law Example: • Workers were asked which changes in office design would increase productivity. • 70% chose noise reduction (N); 67% chose more storage space (S). • 56% chose BOTH noise reduction AND more storage space. • What is the probability that a person chooses EITHER noise reduction OR more storage space? • P(N ∪ S) = P(N) + P(S) − P(N ∩ S) = .70 + .67 − .56 = .81
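
The same union probability can be checked numerically; this sketch (not part of the slides) plugs the survey percentages into the general law of addition.

```python
# General law of addition: P(N or S) = P(N) + P(S) - P(N and S).
p_n, p_s, p_n_and_s = 0.70, 0.67, 0.56

p_n_or_s = p_n + p_s - p_n_and_s
print(round(p_n_or_s, 2))      # 0.81
print(round(1 - p_n_or_s, 2))  # 0.19 -> probability of neither N nor S
```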

  19. 4.4 Addition Laws Joint Probability Tables A joint probability table displays the intersection (joint) probabilities along with the marginal probabilities of a given problem. • The joint probability table for the office productivity problem (N = noise reduction, S = increased storage space). • Inner cells show joint probabilities. • Outer cells show marginal probabilities.

               S      Not S    Total
      N       .56      .14      .70
      Not N   .11      .19      .30
      Total   .67      .33     1.00
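
The table's inner cells follow from the three given probabilities by subtraction; the sketch below (not from the slides) derives them in Python.

```python
# Build the joint probability table for the office-design problem from
# P(N) = .70, P(S) = .67, and P(N and S) = .56.
p_n, p_s, p_n_and_s = 0.70, 0.67, 0.56

cells = {
    ("N", "S"): p_n_and_s,                            # .56
    ("N", "not S"): p_n - p_n_and_s,                  # .14
    ("not N", "S"): p_s - p_n_and_s,                  # .11
    ("not N", "not S"): 1 - (p_n + p_s - p_n_and_s),  # .19
}
for cell, prob in cells.items():
    print(cell, round(prob, 2))
```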

  20. 4.4 Addition Laws The complement of a union is the probability that the outcome is neither X nor Y: P(neither X nor Y) = 1 − P(X ∪ Y). • For the office productivity problem, since the probability of N or S was 0.81, the probability that a worker chooses neither noise reduction nor storage space is 1 − 0.81 = 0.19.

  21. 4.4 Addition Laws The Special Law of Addition applies to mutually exclusive events: P(X ∪ Y) = P(X) + P(Y). • Since there is no intersection of X and Y, there is no need to subtract the joint probability (because it is zero). • For example, if workers are asked which factor most hinders their productivity, and 20% cite lack of direction, 18% cite too much work, 18% cite lack of support, 8% cite an inefficient process, 7% cite lack of supplies, and 29% cite other reasons, what is the probability that a worker cites too much work or an inefficient process? • P(too much work ∪ inefficient process) = .18 + .08 = .26

  22. 4.5 Multiplication Laws General Law of Multiplication • Used to find the joint probability: P(X ∩ Y) = P(X) · P(Y|X). • Example: A company has 140 employees, of which 30 are supervisors (S). Eighty of the employees are married (M), and 20% of the married employees are supervisors. If a company employee is randomly selected, what is the probability that the employee is married and is a supervisor? • P(M) = 80/140 ≈ .57 and P(S|M) = .20 • P(M ∩ S) = P(M) · P(S|M) = (80/140)(.20) ≈ .11
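
A numeric check of the joint probability, as a sketch that is not part of the slides:

```python
# General law of multiplication: P(M and S) = P(M) * P(S | M).
p_m = 80 / 140          # probability a randomly chosen employee is married
p_s_given_m = 0.20      # probability a married employee is a supervisor

p_m_and_s = p_m * p_s_given_m
print(round(p_m, 2), round(p_m_and_s, 2))  # 0.57 0.11
```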

  23. 4.5 Multiplication Laws Special Law of Multiplication • If X and Y are independent, P(X ∩ Y) = P(X) · P(Y). • Example: A study found that 28% of Americans believe that the ATM has had a most significant impact on everyday life (A). A different study found that 71% of workers believe that working in a team reduces stress (S). These studies are unrelated, so they can be considered independent. • What is the probability that a randomly selected person believes that an ATM is significant AND is less stressed working in a team? • P(A ∩ S) = P(A) · P(S) = (.28)(.71) ≈ .20
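
Because the two surveys are treated as independent, the joint probability is a plain product. A short sketch (not from the slides):

```python
# Special law of multiplication for independent events: P(A and S) = P(A) * P(S).
p_a, p_s = 0.28, 0.71

p_a_and_s = p_a * p_s   # valid only because A and S are treated as independent
print(round(p_a_and_s, 4))  # 0.1988
```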

  24. 4.5 Conditional Probability Law of Conditional Probability • P(X|Y) = P(X ∩ Y) / P(Y) • Example: Recall that in the office productivity study, 70% favored noise reduction (N), 67% favored increases in storage space (S), and 56% believed that both would improve productivity. • What is the probability that a randomly selected person believes storage space would increase productivity, given that he or she believes that noise reduction improves productivity? • P(S|N) = P(N ∩ S) / P(N) = .56 / .70 = .80

  25. 4.5 Conditional Probability Independent Events If events X and Y are independent, then P(X|Y) = P(X) and P(Y|X) = P(Y). • Example: Recall that in the office productivity study, 70% favored noise reduction, 67% favored increases in storage space, and 56% believed that both would improve productivity. • Are these events independent? • P(S|N) = .56/.70 = .80, while P(S) = .67. Since these probabilities are not equal, the events are not independent. • Could also check P(N|S) = .56/.67 ≈ .84 against P(N) = .70.
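
Both the conditional probability and the independence check come straight from the ratio P(N ∩ S) / P(N); the sketch below is illustrative only, not part of the slides.

```python
# Conditional probability and independence check for the survey data.
p_n, p_s, p_n_and_s = 0.70, 0.67, 0.56

p_s_given_n = p_n_and_s / p_n
print(round(p_s_given_n, 2))   # 0.8

# Independence would require P(S | N) == P(S); here 0.80 != 0.67.
print(abs(p_s_given_n - p_s) < 1e-9)  # False -> N and S are not independent
```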

  26. 4.6 Revision of Probabilities: Bayes’ Rule Bayes’ Rule extends the use of the law of conditional probabilities to allow revision of original probabilities with new information: P(Xi|Y) = P(Xi) · P(Y|Xi) / [P(X1) · P(Y|X1) + P(X2) · P(Y|X2) + … + P(Xn) · P(Y|Xn)] • The denominator is a weighted average of the conditional probabilities, with the weights being the prior probabilities. • The formula allows statisticians to incorporate new information to revise probability estimates.

  27. 4.6 Revision of Probabilities: Bayes’ Rule Example: A printer company sells cartridges under its own brand name that are produced by two suppliers: Alamo (A) and South Jersey (SJ). Once the cartridges are relabeled, it is hard to tell which company they came from. • 65% of cartridges come from Alamo: P(A) = .65. • 35% of cartridges come from South Jersey: P(SJ) = .35. • These are the prior probabilities. The defective rates for the two companies also vary. • 8% of Alamo cartridges are defective: P(D|A) = .08. • 12% of South Jersey cartridges are defective: P(D|SJ) = .12. If a randomly selected cartridge is defective, what is the probability that it came from Alamo?

  28. 4.6 Revision of Probabilities: Bayes’ Rule Example, continued. If a randomly selected cartridge is defective, what is the probability that it came from Alamo? • P(A|D) = P(A) · P(D|A) / [P(A) · P(D|A) + P(SJ) · P(D|SJ)] = (.65)(.08) / [(.65)(.08) + (.35)(.12)] = .052 / .094 ≈ .553 • If a random cartridge is picked up and it is defective, there is about a 55.3% chance that it came from Alamo. (The .052 in the numerator is the joint probability that a cartridge is both from Alamo and defective.)
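
The revision can be verified numerically; this sketch (not part of the original slides) applies Bayes' rule to the cartridge figures above.

```python
# Bayes' rule: revise the prior P(A) after observing that the cartridge is defective.
p_a, p_sj = 0.65, 0.35                  # prior probabilities
p_d_given_a, p_d_given_sj = 0.08, 0.12  # defective rates by supplier

p_d = p_a * p_d_given_a + p_sj * p_d_given_sj   # total probability of a defect
p_a_given_d = p_a * p_d_given_a / p_d

print(round(p_d, 3))          # 0.094
print(round(p_a_given_d, 3))  # 0.553
```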

  29. 4.6 Revision of Probabilities: Bayes’ Rule Example, continued. • We now revise the probability that, given that a cartridge is defective, it came from Alamo. • Revised (posterior) probabilities: P(A|D) = .052/.094 ≈ .553 and P(SJ|D) = .042/.094 ≈ .447.
