
Entropy and the 2nd Law of Thermodynamics



  1. Entropy and the 2nd Law of Thermodynamics If you do a Google search on the term “entropy” or on the phrase “2nd Law of Thermodynamics”, you will get millions of hits… but they won’t all say the same thing! There are several different ways to state these ideas; they sound so different from each other, but they’re all related.

  2. Flipping a Fair Coin If we toss two fair coins, there are four possible outcomes. It will be easier to list these outcomes with letters (H means “heads” and T means “tails”): TT TH HT HH. These four outcomes are equally likely with fair coins. That’s true whether we flip one coin twice, or two coins at the same time and then arrange them in some sequential order.

  3. Now we flip 3 fair coins When we list the possible outcomes, we see that there are eight (8): TTT TTH THT HTT THH HTH HHT HHH. We’ll be exploring the outcomes with larger numbers of coins, so it will be convenient for us to categorize these outcomes. We’ll categorize them by the number of heads: H = 0 (1 outcome), H = 1 (3 outcomes), H = 2 (3 outcomes), H = 3 (1 outcome).
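The enumeration and categorization above can be checked in a few lines of Python (an illustrative sketch, not part of the original slides; the variable names are mine):

```python
from itertools import product
from collections import Counter

# List every outcome for 3 fair coins, then group the outcomes
# by how many heads each one contains.
outcomes = ["".join(seq) for seq in product("HT", repeat=3)]
by_heads = Counter(s.count("H") for s in outcomes)

print(len(outcomes))                    # 8
print(dict(sorted(by_heads.items())))   # {0: 1, 1: 3, 2: 3, 3: 1}
```

Changing `repeat=3` to any other coin count regenerates the corresponding categories.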

  4. Now we flip 4 fair coins The 16 possible outcomes can again be categorized by the number of heads, with the number of specific outcomes in each category: H = 0 (1 outcome), H = 1 (4), H = 2 (6), H = 3 (4), H = 4 (1). We’ve seen these numbers before… they’re the rows of Pascal’s Triangle! It’s easy to calculate the numbers for each category – just use the “combination” function: C(n, k) = n! / (k! (n − k)!).
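The combination function mentioned above is available directly in Python's standard library as `math.comb`; a minimal sketch for the 4-coin case:

```python
from math import comb

# Number of specific outcomes with k heads out of n = 4 coins.
n = 4
counts = [comb(n, k) for k in range(n + 1)]

print(counts)        # [1, 4, 6, 4, 1] -- a row of Pascal's Triangle
print(sum(counts))   # 16, the total number of outcomes (2**4)
```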

  5. Entropy and the 2nd Law of Thermodynamics “Entropy” is just the number of specific outcomes for a given category. These numbers are shown in the table below, and are plotted on the vertical axis of the bar chart at right. The 2nd Law of Thermodynamics simply says that the system is most likely to be observed in the category having the largest entropy. Duh!
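Under this counting definition of entropy, the 2nd Law's "most likely category" is simply the category with the largest count. A small sketch (again using 4 coins; names are mine):

```python
from math import comb

# "Entropy" here = number of specific outcomes in each category.
n = 4
entropy = {k: comb(n, k) for k in range(n + 1)}

# The 2nd Law: we most likely observe the category with largest entropy.
most_likely = max(entropy, key=entropy.get)
print(most_likely, entropy[most_likely])   # 2 6 -- half heads, 6 outcomes
```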

  6. Switching to Probabilities So, we can use the combination function to calculate the number of specific outcomes in each category. The total number of all outcomes is given by 2^n, where n is the number of coins flipped. Therefore, it is also easy to turn these counts into a probability of observing each category. Thus, for n = 10 coins, P(H = 0.4) = C(10, 4)/2^10 = 210/1024, for example. If we list the resulting probability for each category, we have something called a “probability distribution”. The following slides show probability distributions for increasing numbers of coins.
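The full 10-coin probability distribution, including the P(H = 0.4) = 210/1024 example above, can be generated like this (a sketch using exact fractions; keyed by the fraction of heads k/n):

```python
from math import comb
from fractions import Fraction

n = 10
# Probability of each category: C(n, k) specific outcomes out of 2**n total.
dist = {k / n: Fraction(comb(n, k), 2 ** n) for k in range(n + 1)}

print(dist[0.4])            # 105/512, i.e. 210/1024 in lowest terms
print(sum(dist.values()))   # 1 -- the category probabilities sum to 1
```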

  7. Evolution of the Coin System Alternatively, we could flip the coins one at a time and keep track of the evolving fraction of heads. Each coin flip has a random outcome, yet after many flips, the system will evolve toward its most likely category. Also, the variation away from the equilibrium value (i.e., the zig-zags in the graph) will fade away after many flips.
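This one-flip-at-a-time evolution is easy to simulate (a minimal sketch; the seed and the 10,000-flip count are my arbitrary choices, not from the slides):

```python
import random

random.seed(0)      # arbitrary seed, just for reproducibility
heads = 0
history = []        # running fraction of heads after each flip
for n in range(1, 10_001):
    heads += random.random() < 0.5
    history.append(heads / n)

# Early values zig-zag; after many flips the fraction hugs 0.5.
print(history[9], history[999], history[-1])
```

Plotting `history` against flip number reproduces the kind of graph the slide describes.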

  8. Evolution of the Peg and Washer System We began with 160 units of energy uniformly distributed among 40 particles, and this system quickly evolved to a more likely distribution of energy. With only 40 particles in our system, the variation in entropy persists, even at equilibrium… with a mole of particles, any variation would be undetectable.
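The slides don't spell out the collision rule for the peg-and-washer board, but a toy version shows the same behavior. My assumed rule: each "collision" picks two particles at random and moves one unit of energy from one to the other. A uniform start evolves toward many low-energy particles and a few high-energy ones:

```python
import random
from collections import Counter

random.seed(1)             # arbitrary seed
N, E = 40, 160             # 40 particles, 160 units of energy
energy = [E // N] * N      # start uniform: 4 units per particle

for _ in range(1450):      # same collision count as the slide's example
    donor, receiver = random.sample(range(N), 2)
    if energy[donor] > 0:  # a particle can't give energy it doesn't have
        energy[donor] -= 1
        energy[receiver] += 1

# Total energy is conserved, but the distribution is no longer uniform.
print(sum(energy))                      # 160
print(sorted(Counter(energy).items()))
```

This is a sketch under an assumed exchange rule, not the actual simulation behind the slide's figure.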

  9. View of a Typical Example of an Evolved System The image at left shows an entropy board after 1450 collisions. Describe the distribution of energy in this system.

  10. Discussion Questions
  1) If I toss 10 fair coins, which of the following specific outcomes is more likely: HHHHHHHHHH or HTHHTHTTTH?
  2) The first example above is from the category H = 1.0 while the second is from the category H = 0.5. Which category are we more likely to observe if we toss 10 coins?
  3) Let’s say I toss 10 coins and then arrange them in some order. Then I tell you only which of the 11 categories I’ve got, but ask you to guess the specific order of heads and tails. Which category would give you the lowest chance of success? This is meant to show why the best definition may be, “Entropy is a measure of the amount of missing information”. With this new definition, restate the 2nd Law of Thermodynamics.
  4) Using the trend observed in slides 7–12, what would the probability distribution look like if we tossed one billion coins?
  5) Evaluate this statement from the new Next Generation Science Standards: “Uncontrolled systems always evolve toward more stable states—that is, toward more uniform energy distribution”.
