
Thermodynamics and the Gibbs Paradox


Presentation Transcript


  1. Thermodynamics and the Gibbs Paradox Presented by: Chua Hui Ying Grace Goh Ying Ying Ng Gek Puey Yvonne

  2. Overview • The three laws of thermodynamics • The Gibbs Paradox • The Resolution of the Paradox • Gibbs / Jaynes • Von Neumann • Lin Shu Kun’s revolutionary idea • Conclusion

  3. The Three Laws of Thermodynamics • 1st Law: energy is always conserved • 2nd Law: the entropy of the universe never decreases • 3rd Law: the entropy of a perfect crystalline substance is taken as zero at the absolute temperature of 0 K

  4. Unravel the mystery of The Gibbs Paradox

  5. The mixing of non-identical gases Before

  6. After: shows an obvious increase in entropy (disorder)

  7. The mixing of identical gases Before

  8. After: shows zero increase in entropy, as the process is reversible

  9. Comparing the two scenarios of mixing, we realize that there is a CONTRADICTION! The entropy of mixing is the same however slight the difference between the gases, yet it drops abruptly to zero the moment the gases become identical.
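
To make the contradiction concrete, here is a minimal numerical sketch (not from the slides) of the classical ideal-gas entropy of mixing, ΔS = −nR Σ xᵢ ln xᵢ; the function name mixing_entropy is our own illustrative choice.

```python
import math

R = 8.314  # gas constant, J/(mol K)

def mixing_entropy(n_total, mole_fractions):
    """Classical ideal-gas entropy of mixing: dS = -n*R*sum(x*ln x)."""
    return -n_total * R * sum(x * math.log(x) for x in mole_fractions if x > 0)

# 1 mol of gas A mixed with 1 mol of a different gas B: dS = 2*R*ln 2 > 0,
# and the value is the same no matter how slight the difference between A and B.
print(mixing_entropy(2.0, [0.5, 0.5]))   # ~11.5 J/K

# "Mixing" a gas with itself: only one component (x = 1), so dS = 0.
print(mixing_entropy(2.0, [1.0]))        # 0.0
```

The same formula gives a fixed positive value for any two distinct gases and zero for identical ones, which is exactly the discontinuity the paradox turns on.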

  10. To resolve the contradiction • We look at how the paradox has been addressed by • Gibbs / Jaynes • Von Neumann • Lin Shu Kun

  11. Gibbs’ opinion • When 2 non-identical gases mix and entropy increases, we imply that the gases can be separated and returned to their original states • When 2 identical gases mix, it is impossible to separate the two gases into their original states, as there is no recognizable difference between them

  12. Gibbs’ opinion (2) • Thus, these two cases stand on different footings and should not be compared with each other • The entropy change produced by mixing gases of different kinds is independent of the nature of the gases • Hence it is also independent of the degree of similarity between them

  13. [Graph (Gibbs’ view): entropy S, from S = 0 to Smax, plotted against similarity Z, from Z = 0 to Z = 1.]

  14. Jaynes’ explanation • The entropy of a macrostate is given as S(X) = k log W(C), where S(X) is the entropy associated with a chosen set of macroscopic quantities X, and W(C) is the phase volume occupied by all the microstates in a chosen reference class C

  15. Jaynes’ explanation (2) • This thermodynamic entropy S(X) is not a property of a microstate, but of a certain reference class C(X) of microstates • For entropy to always increase, we need to specify the variables we want to control and those we want to change • Any manipulation of variables outside this chosen set may cause us to see an apparent violation of the second law
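
As an illustration of Jaynes’ point, here is a toy lattice-gas counting of our own (not from the slides): the value of S(X) = k log W(C) depends on which microstates we admit into the reference class C. Counting configurations with or without particle labels changes the entropy by exactly k ln N!, the bookkeeping term at the heart of the mixing argument. The function name entropy_lattice_gas and the numbers are illustrative assumptions.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_lattice_gas(n_sites, n_particles, labeled=False):
    """S = k_B * ln W for N particles on M >= N lattice sites (toy model).

    The 'reference class' choice here is whether microstates that differ only
    by a relabelling of the particles count as distinct (labeled=True) or as
    the same microstate (labeled=False).
    """
    # ln[M! / (M - N)!] counts labelled placements
    ln_w = math.lgamma(n_sites + 1) - math.lgamma(n_sites - n_particles + 1)
    if not labeled:
        ln_w -= math.lgamma(n_particles + 1)  # divide W by N!
    return k_B * ln_w

M, N = 10**6, 10**3
diff = entropy_lattice_gas(M, N, labeled=True) - entropy_lattice_gas(M, N, labeled=False)
print(diff, k_B * math.lgamma(N + 1))  # the two reference classes differ by exactly k_B * ln N!
```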

  16. Von Neumann’s Resolution • Makes use of the quantum mechanical approach to the problem • He derives an equation for the entropy of mixing in terms of a parameter α that measures the overlap between the quantum states of the gases, i.e. the degree of similarity between them

  17. Von Neumann’s Resolution (2) • Hence when α = 0 (orthogonal states) the entropy of mixing is at its highest, and when α = 1 (identical states) it is at its lowest • Therefore entropy decreases continuously with increasing similarity
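
The equation itself did not survive the transcript, so the sketch below is only a plausible stand-in: a standard quantum-mechanical toy calculation in which the entropy of a 50/50 mixture of two internal states falls continuously from its maximum (ln 2 per particle, in units of k_B) to zero as the overlap α = |⟨a|b⟩| goes from 0 (orthogonal) to 1 (identical). It reproduces the behaviour described on the slide but is not necessarily the exact formula von Neumann derived.

```python
import math

def mixing_entropy_per_particle(overlap):
    """Entropy (in units of k_B) of a 50/50 mixture of two pure internal
    states |a> and |b> whose overlap is |<a|b>| = overlap.

    The density matrix rho = (|a><a| + |b><b|) / 2 has eigenvalues
    (1 + overlap)/2 and (1 - overlap)/2, and S = -sum(p * ln p).
    """
    eigenvalues = [(1 + overlap) / 2, (1 - overlap) / 2]
    return -sum(p * math.log(p) for p in eigenvalues if p > 0)

for alpha in [0.0, 0.25, 0.5, 0.75, 1.0]:
    print(alpha, round(mixing_entropy_per_particle(alpha), 4))
# alpha = 0 (orthogonal, fully distinguishable) -> ln 2 ~ 0.693 (maximum)
# alpha = 1 (identical states)                  -> 0            (minimum)
```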

  18. [Graph (von Neumann’s view): entropy S, from S = 0 to Smax, plotted against similarity Z, from Z = 0 to Z = 1, with entropy decreasing continuously as similarity increases.]

  19. Resolving the Gibbs Paradox using the revised relation between entropy and similarity proposed by Lin Shu Kun • He draws a connection between information theory and entropy • He proposes that entropy increases continuously with the similarity of the gases

  20. Why does “entropy increase with similarity”? • It follows from Lin’s propositions that • entropy is the degree of symmetry, and • information is the degree of non-symmetry • We analyse three concepts: (1) high symmetry = high similarity, (2) entropy = information loss, and (3) similarity = information loss

  21. (1) high symmetry = high similarity • Symmetry is a measure of indistinguishability • High symmetry contributes to high indistinguishability • Similarity can be described as a continuous measure of imperfect symmetry • High symmetry → indistinguishability → high similarity • Hence high symmetry can be described as high similarity!

  22. (2) entropy = information loss • An increase in entropy means an increase in disorder • A decrease in entropy reflects an increase in order • A more ordered system is more highly organized and thus possesses greater information content

  23. Do you have any idea what the picture is all about?

  24. From the previous example • Greater entropy results in less information being registered • Hence higher entropy means higher information loss • Conversely, if the system is more ordered, its entropy is lower and thus less information is lost • Therefore entropy = information loss

  25. (3) similarity = information loss • For a system of distinguishable particles: information on N particles = a different piece of information for each particle = N pieces of information • For a system of indistinguishable particles: information on N particles = information on 1 particle = 1 piece of information • Hence high similarity (high symmetry) → greater information loss (see the sketch below)
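
Below is a literal toy encoding of this counting argument (our own illustration, not from the slides); the function names pieces_of_information and information_loss are invented for the sketch.

```python
def pieces_of_information(particles):
    """Slide-style counting: one piece of information per distinguishable
    kind of particle present in the system."""
    return len(set(particles))

def information_loss(particles):
    """Information lost relative to the fully distinguishable case, where
    each of the N particles would carry its own piece of information."""
    return len(particles) - pieces_of_information(particles)

all_different = ["A", "B", "C", "D"]  # N = 4 distinguishable particles
all_identical = ["A", "A", "A", "A"]  # N = 4 indistinguishable particles

print(pieces_of_information(all_different), information_loss(all_different))  # 4 0
print(pieces_of_information(all_identical), information_loss(all_identical))  # 1 3
```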

  26. Concepts explained: (1) high symmetry = high similarity, (2) entropy = information loss, and (3) similarity = information loss • After establishing the links between these concepts: if a system is highly symmetrical → high similarity → greater information loss → higher entropy

  27. The mixing of identical gases (revisited) Before

  28. After

  29. Lin’s Resolution of the Gibbs Paradox • Compared to the non-identical gases, we have less information about the identical gases • According to his theory, less information = higher entropy • Therefore, the mixing of identical gases should also result in an increase in entropy. No Paradox!

  30. Comparing the 3 graphs: Gibbs, von Neumann and Lin. [Three panels of entropy S, from S = 0 to Smax, plotted against similarity Z, from Z = 0 to Z = 1, one for each resolution.]
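
To make the comparison concrete, here is a purely schematic sketch of the three entropy-versus-similarity curves. The linear shapes for the von Neumann and Lin curves are chosen only for illustration; the slides do not give the actual functional forms.

```python
def gibbs_entropy(z, s_max=1.0):
    """Gibbs / Jaynes: mixing entropy stays at S_max for any Z < 1 and
    drops discontinuously to 0 when the gases become identical (Z = 1)."""
    return s_max if z < 1.0 else 0.0

def von_neumann_entropy(z, s_max=1.0):
    """Von Neumann: mixing entropy decreases continuously from S_max to 0."""
    return s_max * (1.0 - z)

def lin_entropy(z, s_max=1.0):
    """Lin: mixing entropy increases continuously from 0 to S_max."""
    return s_max * z

for z in [0.0, 0.5, 0.99, 1.0]:
    print(z, gibbs_entropy(z), von_neumann_entropy(z), lin_entropy(z))
```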

  31. Why are there different ways of resolving the paradox? • They rest on different ways of considering entropy • Lin (static entropy): considers the configurations of fixed particles in a system • Gibbs and von Neumann (dynamic entropy): depends on the changes in the dispersal of energy among the microstates of atoms and molecules

  32. We cannot compare the two ways of resolving the paradox! • Since Lin’s definition of entropy is essentially different from that of Gibbs and von Neumann, it is unjustified to compare the two ways of resolving the paradox.

  33. Conclusion • The Gibbs Paradox poses a problem for the second law because of an inadequate understanding of the system involved • Lin’s novel idea sheds new light on entropy and information theory, but it also leaves unresolved grey areas for further exploration

  34. Acknowledgements • We would like to thank Dr. Chin Wee Shong for her support and guidance throughout the semester, Dr. Kuldip Singh for his kind support, and all who have helped in one way or another
