Entropy – Increased by Stirring, Decreased by Observation
[Interactive demo: a two-dimensional grid of 400 sites, with stirring controls (start / once / stop) and Peek and Purify buttons.]
Conceptually, entropy is a measure of how much we don’t know about the system. See reference 1 for details.
Formally, entropy is defined in terms of probability; specifically:
$$ S \;=\; \sum_i P_i \, \log\!\left(\frac{1}{P_i}\right) \tag{1} $$
where P_i is the probability of the i-th microstate and the sum runs over all possible microstates of the system.
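If you prefer to see that in executable form, here is a minimal sketch (in Python; the language, and the choice of log base 2 so that S comes out in bits, are mine, not anything mandated by the physics):

```python
import math

def entropy(probs):
    """Equation 1: S = sum over i of P_i * log(1/P_i), summed over microstates.
    Log base 2 gives bits; natural log would give the same answer in nats."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0.0)

print(entropy([0.5, 0.5]))   # 1.0 -- two equally likely microstates: one bit
print(entropy([1.0, 0.0]))   # 0.0 -- one certain microstate: nothing unknown
```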
In the grid shown here, each site can have two possible states: occupied or unoccupied. The coloring shows the probability of occupation: white means unoccupied, i.e. 0% probability, while solid black (or solid red) means 100% probability, and shades of gray indicate intermediate probabilities.
The system is set up so that we can observe some of the sites ("above the veil") and not observe other sites ("below the veil").
In accordance with equation 1, sites above the veil make zero contribution to the total entropy. That’s because P_i log(1/P_i) is zero when P_i = 1, and also (in the limit) when P_i = 0.
Sites below the veil contribute to the entropy in the usual way, in accordance with equation 1. If you remember where the veiled particles are (perhaps because of a recent peek), the entropy is zero, but stirring causes the entropy to grow.
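Concretely, if we treat the sites as statistically independent (an approximation I am making for illustration; the demo’s own bookkeeping may be more careful), the total entropy is just a sum of per-site terms, and the observed sites drop out automatically:

```python
import math

def site_entropy(p):
    """Binary entropy (in bits) of one site with occupation probability p.
    Zero at p = 0 and p = 1, i.e. whenever we know the site's state."""
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))

def total_entropy(probs):
    """Total S as a sum of per-site terms (an independence assumption).
    Sites above the veil already have p = 0 or 1, so they drop out."""
    return sum(site_entropy(p) for p in probs)

print(total_entropy([1.0, 0.0, 0.5, 0.5]))  # 2.0 bits: only the two
                                            # uncertain sites contribute
```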
Each stirring event has a source and a destination. These are chosen randomly, without regard to where the veil is.
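In code, one stirring event might look like this (my reconstruction of the rule just stated, not the demo’s actual source):

```python
import random

def stir_once(state):
    """One stirring event: random source, random destination, swap.
    The veil plays no role in the choice of sites."""
    src = random.randrange(len(state))
    dst = random.randrange(len(state))
    state[src], state[dst] = state[dst], state[src]
```

If the swap touches only exposed sites, you see the outcome and your probabilities stay pinned at 0 or 1. If it touches veiled sites, you know only that some swap happened somewhere below the veil, so your probability map smears out and, by equation 1, the entropy grows.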
When the grid portrays the occupation of sites below the veil (black, white, or shades of gray) that does not mean you are "seeing through the veil". This part of the grid is not telling you anything you did not already know ... it is just helping you keep track of what you remember from the last time you peeked. In contrast, the part of the grid above the veil constantly provides complete up-to-date information about the occupancy of the exposed sites.
It is amusing to hit the Purify button, and then hit the Peek button a few times while the system is still evolving, before it reaches its maximum-entropy state.
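Here is a toy version of that whole experiment. For simplicity I veil the entire grid between peeks, and I use a crude mean-field belief update of my own devising (each stir gives every site a small chance of having been the source or destination); the real demo’s bookkeeping surely differs in detail, but the qualitative behavior is the point: stirring ratchets the entropy up, and each peek collapses it back to zero.

```python
import math, random

N = 400                                    # grid size, as in the demo
state = [i < N // 2 for i in range(N)]     # Purify: a known, segregated state
belief = [1.0 if occ else 0.0 for occ in state]   # we know every site: S = 0

def site_entropy(p):
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))

def stir(state, belief):
    src, dst = random.randrange(N), random.randrange(N)
    state[src], state[dst] = state[dst], state[src]
    # Belief update (a mean-field modeling assumption, not the demo's exact
    # bookkeeping): we can't see which pair was swapped, so every site had a
    # ~2/N chance of being touched, nudging its P toward the mean occupancy f.
    f = sum(belief) / N
    for i in range(N):
        belief[i] = (1.0 - 2.0 / N) * belief[i] + (2.0 / N) * f

def peek(state, belief):
    for i, occ in enumerate(state):        # look through the veil: every P
        belief[i] = 1.0 if occ else 0.0    # collapses to 0 or 1

for step in range(1, 1001):
    stir(state, belief)
    if step % 250 == 0:                    # entropy climbs toward 400 bits
        print(step, round(sum(site_entropy(p) for p in belief), 1))
peek(state, belief)
print(sum(site_entropy(p) for p in belief))   # 0.0 -- peeking erased it
```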
As the saying goes, if you’ve seen one two-state system, you’ve seen them all. So far, we have interpreted our model system in terms of particles hopping around on a lattice, such that each site was occupied or unoccupied. Therefore it was appropriate to measure the molar entropy (lower-case s) on a per-particle basis. However, we could equally well re-interpret it as a lattice of spin-1/2 objects, such that each site could be spin-up or spin-down. (This is relevant to some interesting applications, such as adiabatic demagnetization refrigerators.) The mathematics is the same in both cases. The only difference is that in the latter case, it would be more conventional to measure the molar entropy on a per-site basis. This is denoted by s′ in the caption to the grid. The distinction is purely a matter of convention; the total entropy (capital S) is what really matters, and it is the same in either interpretation.
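The arithmetic behind the two conventions is nothing more than dividing by a different count; with made-up illustrative numbers (not the demo’s):

```python
S_total     = 150.0   # total entropy of the whole grid, in bits (made up)
n_sites     = 400     # lattice sites
n_particles = 200     # occupied sites, in the particle interpretation

s       = S_total / n_particles  # molar entropy per particle (lower-case s)
s_prime = S_total / n_sites      # molar entropy per site (s' in the caption)
print(s, s_prime)                # the conventions differ; S_total does not
```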
The main point of this exercise is to make clear the important distinction between mixing and probability. Entropy is defined in terms of probability. This definition is not going to change anytime soon. Mixing is not the same as probability. If you want a high-entropy state, it is usually necessary but never sufficient for the occupied sites and the vacant sites to be well mixed together. The crucial contribution to the entropy is not knowing where the particles are. If you know where they are, it doesn’t matter whether they are mixed or not ... as you can demonstrate with the "Peek" button.
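To see the same distinction in miniature, compare a perfectly mixed pattern whose occupancy you know exactly with the same grid after you have lost track of it (again a sketch of mine, using the per-site independence approximation):

```python
import math

def site_entropy(p):
    # binary entropy in bits; zero when p is 0 or 1
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return p * math.log2(1.0 / p) + (1.0 - p) * math.log2(1.0 / (1.0 - p))

# A checkerboard: occupied and vacant sites as well mixed as possible,
# but every P_i is 0 or 1 because we know exactly where everything is.
checkerboard = [float((r + c) % 2) for r in range(20) for c in range(20)]
print(sum(site_entropy(p) for p in checkerboard))   # 0.0 bits

# The same 400 sites after losing track of them: maximal entropy.
unknown = [0.5] * 400
print(sum(site_entropy(p) for p in unknown))        # 400.0 bits
```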
To say the same thing another way: Some people labor under the misconception that entropy must somehow be a highly dynamic, active process, requiring constant mixing and re-mixing. There is no experimental or theoretical basis for believing this, and counterexamples abound. For example, suppose I hand you a chunk of glass (or if you want to be fancy, a chunk of spin glass). It has a very considerable amount of entropy that is frozen in, completely non-dynamic. The point is, you don’t know what microstate it’s in, and you can’t easily figure it out, so it really doesn’t matter whether the microstate is changing or not.
You can’t pretend that "all entropy is the entropy of mixing", because peeking at a highly-mixed system changes the entropy but does not change the degree of mixing ... not according to the usual definition of mixing. The Peek button behaves differently from the Purify button.
Specialists note: In the context of quantum statistical mechanics, when talking about density matrices, there is a concept of "mixed state" as opposed to "pure state", which is more-or-less directly related to entropy. This is a somewhat specialized, technical definition. It makes perfect sense to experts, but is not suitable for introducing non-experts to the idea of entropy. For example: A density matrix consisting of one electron in Pasadena and another electron in Boston would (almost certainly) be a mixed state, whereas a density matrix for two electrons sharing an sp-hybrid orbital in a single atom would (almost certainly) be a pure state. This rather dramatically conflicts with intuitive notions of mixing ordinary fluids.
Ordinary mixing refers to the distribution of particles over the 400 sites in the two-dimensional grid. The "mixed state" of a density matrix involves things that live in a vastly larger space: with two states for each of the 400 sites, a 2^400-dimensional Hilbert space. (If you don’t know what this means, don’t worry about it.) Please don’t confuse mixing of particles with mixing of density matrices.
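For those who do know what it means, here is a minimal numerical illustration (mine, using numpy) of pure versus mixed, via the von Neumann entropy S = −Tr(ρ log₂ ρ), which plays the role of equation 1 for density matrices:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho log2 rho), the density-matrix analogue of equation 1."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]      # drop zeros: 0*log(0) -> 0, as usual
    return float(-(evals * np.log2(evals)).sum())

# A pure state: the superposition |psi> = (|0> + |1>)/sqrt(2).  It may look
# "mixed" intuitively, but its entropy is exactly zero.
psi = np.array([1.0, 1.0]) / np.sqrt(2.0)
print(von_neumann_entropy(np.outer(psi, psi)))   # 0.0

# A mixed state: a 50/50 classical mixture of |0> and |1>.  One full bit.
print(von_neumann_entropy(0.5 * np.eye(2)))      # 1.0
```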
Entropy is defined in terms of the ensemble average of log(1/P_i). Get used to it.