Farewell To Entropy


Authors: Arieh Ben-Naim

Edition: WS

ISBN: 9789812707079, 9812707077

Size: 2 MB (2433087 bytes)

Pages: 411/411

File format:

Language: English

Publishing Year:

Category:


The principal message of this book is that thermodynamics and statistical mechanics will benefit from replacing the unfortunate, misleading and mysterious term entropy with a more familiar, meaningful and appropriate term such as information, missing information or uncertainty. This replacement would facilitate the interpretation of the driving force of many processes in terms of informational changes and dispel the mystery that has always enshrouded entropy.
It has been 140 years since Clausius coined the term entropy, and almost 50 years since Shannon developed the mathematical theory of information, which was subsequently also named entropy. In this book, the author advocates replacing entropy with information, a term that has become widely used in many branches of science.
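For readers who want the formula behind Shannon's measure of information mentioned above, the standard definition is reproduced here as a reminder; the choice of logarithm base is a convention that only fixes the unit of information, and the notation below is not taken from the book itself:

H(p_1, \ldots, p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i

where (p_1, ..., p_n) is a probability distribution over n outcomes; H is largest for the uniform distribution and zero when one outcome is certain.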
The author also takes a new and bold approach to thermodynamics and statistical mechanics. Information is used not only as a tool for predicting distributions but also as the fundamental concept of thermodynamics, a role held until now by the term entropy.
The topics covered include the fundamentals of probability and information theory; the general concept of information as well as the particular concept of information as applied in thermodynamics; the re-derivation of the Sackur–Tetrode equation for the entropy of an ideal gas from purely informational arguments; the fundamental formalism of statistical mechanics; and many examples of simple processes whose driving force is analyzed in terms of information.
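As a point of reference for the informational re-derivation mentioned above, the Sackur–Tetrode expression for the entropy of a monatomic ideal gas of N particles in a volume V at temperature T can be written in one common form as

S = N k_B \left[ \ln\!\left( \frac{V}{N} \left( \frac{2\pi m k_B T}{h^2} \right)^{3/2} \right) + \frac{5}{2} \right]

where m is the particle mass, k_B is Boltzmann's constant and h is Planck's constant; this is the standard textbook form, which the book recovers from purely informational arguments rather than from classical thermodynamic ones.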
Contents: Elements of Probability Theory; Elements of Information Theory; Transition from the General MI to the Thermodynamic MI; The Structure of the Foundations of Statistical Thermodynamics; Some Simple Applications.

Table of contents :
Contents……Page 8
List of Abbreviations……Page 14
Preface……Page 16
1.1 A Brief History of Temperature and Entropy……Page 28
1.2 The Association of Entropy with Disorder……Page 36
1.3 The Association of Entropy with Missing Information……Page 46
2.1 Introduction……Page 60
2.2.1 The sample space, denoted Ω……Page 63
2.2.2 The field of events, denoted F……Page 64
2.2.3 The probability function, denoted P……Page 66
2.3 The Classical Definition……Page 70
2.4 The Relative Frequency Definition……Page 72
2.5 Independent Events and Conditional Probability……Page 77
2.5.1 Conditional probability and subjective probability……Page 85
2.5.2 Conditional probability and cause and effect……Page 89
2.5.3 Conditional probability and probability of joint events……Page 91
2.6 Bayes’ Theorem……Page 92
2.6.1 A challenging problem……Page 99
2.6.2 A more challenging problem: The three prisoners’ problem……Page 101
2.7 Random Variables, Average, Variance and Correlation……Page 103
2.8.1 The binomial distribution……Page 113
2.8.2 The normal distribution……Page 117
2.8.3 The Poisson distribution……Page 120
2.9 Generating Functions……Page 121
2.10 The Law of Large Numbers……Page 127
3 Elements of Information Theory……Page 130
3.1 A Qualitative Introduction to Information Theory……Page 131
3.2 Definition of Shannon’s Information and Its Properties……Page 137
3.2.1 Properties of the function H for the simplest case of two outcomes……Page 139
3.2.2 Properties of H for the general case of n outcomes……Page 141
3.2.3 The consistency property of the missing information (MI)……Page 152
3.2.4 The case of an infinite number of outcomes……Page 157
3.2.4.1 The uniform distribution of locations……Page 159
3.2.4.2 The normal distribution of velocities or momenta……Page 161
3.2.4.3 The Boltzmann distribution……Page 164
3.3 The Various Interpretations of the Quantity H……Page 165
3.4 The Assignment of Probabilities by the Maximum Uncertainty Principle……Page 171
3.5 The Missing Information and the Average Number of Binary Questions Needed to Acquire It……Page 176
3.6 The False Positive Problem, Revisited……Page 197
3.7 The Urn Problem, Revisited……Page 199
4 Transition from the General MI to the Thermodynamic MI……Page 204
4.1 MI in Binding Systems: One Kind of Information……Page 205
4.1.2 Two different ligands on M sites……Page 206
4.1.3 Two identical ligands on M sites……Page 209
4.1.4 Generalization to N ligands on M sites……Page 10
4.2 Some Simple Processes in Binding Systems……Page 213
4.2.1 The analog of the expansion process……Page 214
4.2.2 A pure deassimilation process……Page 217
4.2.3 Mixing process in a binding system……Page 221
4.2.4 The dependence of MI on the characterization of the system……Page 223
4.3.1 The locational MI……Page 228
4.3.2 The momentum MI……Page 231
4.3.3 Combining the locational and the momentum MI……Page 232
4.4 Comments……Page 234
5 The Structure of the Foundations of Statistical Thermodynamics……Page 238
5.1 The Isolated System; The Micro-Canonical Ensemble……Page 240
5.2 System in a Constant Temperature; The Canonical Ensemble……Page 247
5.3 The Classical Analog of the Canonical Partition Function……Page 255
5.4 The Re-interpretation of the Sackur–Tetrode Expression from Informational Considerations……Page 259
5.5 Identifying the Parameter β for an Ideal Gas……Page 262
5.6 Systems at Constant Temperature and Chemical Potential; The Grand Canonical Ensemble……Page 263
5.7 Systems at Constant Temperature and Pressure; The Isothermal Isobaric Ensemble……Page 269
5.8 The Mutual Information due to Intermolecular Interactions……Page 271
6 Some Simple Applications……Page 278
6.1 Expansion of an Ideal Gas……Page 279
6.2 Pure, Reversible Mixing; The First Illusion……Page 282
6.3 Pure Assimilation Process; The Second Illusion……Page 284
6.3.1 Fermi–Dirac (FD) statistics; Fermions……Page 286
6.3.2 Bose–Einstein (BE) statistics; Bosons……Page 287
6.3.3 Maxwell–Boltzmann (MB) statistics……Page 288
6.4 Irreversible Process of Mixing Coupled with Expansion……Page 292
6.5 Irreversible Process of Demixing Coupled with Expansion……Page 295
6.6 Reversible Assimilation Coupled with Expansion……Page 297
6.7 Reflections on the Processes of Mixing and Assimilation……Page 299
6.8 A Pure Spontaneous Deassimilation Process……Page 311
6.9 A Process Involving only Change in the Momentum Distribution……Page 314
6.10 A Process Involving Change in the Intermolecular Interaction Energy……Page 317
6.11 Some Baffling Experiments……Page 320
6.12 The Second Law of Thermodynamics……Page 325
A Newton’s binomial theorem and some useful identities involving binomial coefficients……Page 344
B The total number of states in the Fermi–Dirac and the Bose–Einstein statistics……Page 346
C Pair and triplet independence between events……Page 348
D Proof of the inequality |R(X, Y)| ≤ 1 for the correlation coefficient……Page 349
E The Stirling approximation……Page 353
F Proof of the form of the function H……Page 354
G The method of Lagrange undetermined multipliers……Page 358
H Some inequalities for concave functions……Page 361
I The MI for the continuous case……Page 367
J Identical and indistinguishable (ID) particles……Page 370
K The equivalence of Boltzmann’s and Jaynes’ procedures to obtain the fundamental distribution of the canonical ensemble……Page 377
L An alternative derivation of the Sackur–Tetrode equation……Page 379
M Labeling and un-labeling of particles……Page 382
N Replacing a sum by its maximal term……Page 383
O The Gibbs paradox (GP)……Page 387
P The solution to the three prisoners’ problem……Page 390
1. Thermodynamics and Statistical Thermodynamics……Page 400
2. Probability and Information Theory……Page 403
4. Cited References……Page 405
Index……Page 408
