What's entropy?

I sat down with the fascinating crew at the Titanium Physicists podcast to serve as their special physics-ignoramus guest in an episode about entropy (MP3).

26 Responses to “What's entropy?”

  1. Boundegar says:

    Cats: entropy’s little helpers.

  2. spacedmonkey says:

    I think that a lot of the quasi-mystical crap that still surrounds entropy in our popular culture is just left over from the fact that we discovered it empirically before we derived it statistically. Before Boltzmann founded statistical physics, the empirically observed characteristics of entropy must have seemed pretty fucking strange. I didn't listen to the podcast, so I'm not sure if they covered it, but one of my favorite implications of it is that information (as defined by Shannon in terms of entropy) is actually a physical quantity. A couple of years ago, some folks actually physically measured the energy associated with erasing a bit of information, and found it in line with theoretical predictions.
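
    For scale, the theoretical prediction being referred to is presumably Landauer's bound (my gloss; the comment doesn't name it). At room temperature it works out to roughly

    E_{\min} = k_B T \ln 2 \approx (1.38\times10^{-23}\ \mathrm{J/K})(300\ \mathrm{K})(0.693) \approx 2.9\times10^{-21}\ \mathrm{J},

    a few zeptojoules per erased bit.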

  3. timquinn says:

    As I understand it entropy represents the tendency of all . . . Um . . .

  4. axlrosen says:

    I highly recommend "The Information: A History, A Theory, A Flood" by James Gleick. It's a very engaging book on all kinds of things related to "information", and maybe half of it is a reasonably accessible overview/history of information theory, including entropy (and how confusing it is).

    It doesn’t help that when it was first coined, it took a while for people to decide on the valence of this new term “entropy” – some used it to mean useful energy, and some the opposite. Maxwell started with one definition, and then flipped to the opposite, which we all use now.

  5. Dewgeist says:

    Thoroughly enjoyed that.  I’d love to have Ben and company over for dinner.  Maybe an annual toast to the heat death of the universe fete?

  6. pjcamp says:

    Turns out on entropy a lot of us physicists are physics ignoramuses too. The presentations we learn from are obscure and baffling. There's a lot of nattering on about "disorder" but it isn't clear how to measure disorder. You certainly can't check out a disorderometer from stores. And in any case, that totally fails to clarify how entropy changes can affect energy, which I do know how to measure.

    I am told you can compute the entropy from the log of the multiplicity, the multiplicity being a number that counts the number of ways that the energy quanta of a system can be distributed amongst the particles in a system. But surely they are only really distributed one way, and it baffled me for years why a measurable attribute of a system should depend on states that the system could be in but is not.

    In graduate school, we were taught that the entropy depends on the volume in phase space. Quantum mechanics rears its head here because (a) classically, the volume in phase space can neither increase nor decrease (Liouville's Theorem) and (b) Planck's constant appears out of nowhere because it can be thought of as defining a unit volume in phase space. These days, the Shannon information theory interpretation is fashionable and we compute a fairness function for energy distributions and try to maximize it.

    None of this ever made any sense to me, but I played along, until one day I had cause to figure out the units of entropy (the texts are strangely silent on that topic) and found it to be essentially a type of energy.  Then things started to click.

    So here’s a little entropy game you can play. Lay a whiteboard or large sheet of paper on the floor divided into squares. You don’t need a lot of them. A 4×4 or 5×5 grid is fine. Put a pile of dice on one square, a fair number of them that is not the same as the number of squares. The squares represent particles in your system. The dice represent quanta of energy that may be exchanged between particles as they interact. Roll each die one time. If it comes up 1, move the die one square to the left; if 2, one square to the right; and so on. If 5 or 6 do nothing. If a die reaches the edge of your grid either wait for it to randomly go back, or wrap around to the opposite side of the grid. You can get a procedure to compute the Shannon entropy here:

    http://en.wikipedia.org/wiki/Entropy_%28information_theory%29

    What you find is that the value starts off small, climbs to a maximum, and thereafter wavers up and down a bit without ever departing from that maximum significantly. If you watch the dice at the same time, you see that the maximum is reached as the dice become as uniformly spread out on the grid as it is possible for them to get.
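
    If it helps, here is a minimal sketch of that game as a simulation (my own illustration, not part of the comment; the grid size, number of dice, and step count are arbitrary choices):

    import math
    import random

    GRID = 5       # 5x5 grid of squares ("particles")
    N_DICE = 40    # dice ("energy quanta"); deliberately not equal to GRID * GRID
    STEPS = 200

    # die faces 1-4 move the die one square; 5 and 6 leave it where it is
    MOVES = {1: (-1, 0), 2: (1, 0), 3: (0, -1), 4: (0, 1), 5: (0, 0), 6: (0, 0)}

    def shannon_entropy(positions):
        """Shannon entropy (in bits) of the fraction of dice on each square."""
        counts = {}
        for p in positions:
            counts[p] = counts.get(p, 0) + 1
        h = 0.0
        for c in counts.values():
            frac = c / len(positions)
            h -= frac * math.log2(frac)
        return h

    # start with the whole pile of dice on one square
    positions = [(0, 0)] * N_DICE
    for step in range(STEPS):
        moved = []
        for x, y in positions:
            dx, dy = MOVES[random.randint(1, 6)]
            moved.append(((x + dx) % GRID, (y + dy) % GRID))  # wrap at the edges
        positions = moved
        if step % 20 == 0:
            print(f"step {step:3d}  entropy = {shannon_entropy(positions):.3f} bits")

    The printed value climbs quickly and then hovers a little below log2(25) ≈ 4.6 bits, the maximum possible for 25 squares, which is exactly the behaviour described above.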

    Now let's unpack this. There are a lot of energy functions in thermodynamics — the internal energy, the enthalpy, the free energy, the Gibbs free energy, and so on. Each measures that portion of the energy that is available to do work in a particular type of process. For example, if the volume of your system expands, some of the energy must be used to do work to move the surrounding medium out of the way, and that energy is not available for you to do anything else with, since it can't be used twice. The enthalpy tells you how much is left for you.
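
    For reference, the standard textbook definitions of the potentials listed above (my addition, not spelled out in the comment; the plain "free energy" here is presumably the Helmholtz function):

    \begin{aligned}
    H &= U + PV        && \text{(enthalpy)} \\
    F &= U - TS        && \text{(Helmholtz free energy)} \\
    G &= U + PV - TS   && \text{(Gibbs free energy)}
    \end{aligned}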

    But doing work means transferring energy. That is what work is. And to transfer energy, there must be a gradient, a “hill”, in the energy. Work is done as the energy in the “hill” part of the system moves down into the “valley” part of the system. Aha! As work is done, the energy becomes more uniformly distributed, and the entropy increases! So what entropy measures is indeed an energy. It is that portion of the energy that is evenly distributed through the system and so is not available for doing work in any process at all.

    It took me the better part of 20 years of intermittent thinking to figure this out.

    • niktemadur says:

      Absolutely brilliant, thank you. Allow me to expand your sentence a bit in bold text; please tell me if it's right or wrong:

      It (entropy) is that ever increasing portion of the energy that is evenly distributed through the system and so is not available for doing work in any process at all, ever again.

      • Energy is not entropy. Entropy is a measure of the distribution of energy in a system. It is not quite right to say that the energy is evenly distributed: it is more accurate to say it’s randomly distributed, and if you follow Einstein’s model from his 1907 paper, then at maximum entropy, the distribution follows a Poisson distribution.

        If you expand a system at maximum entropy by adding a “cold” component to it, then energy will flow from the warm part to the cold, and that flow can be used for work. The enlargement of the system increases the number of ways the energy can be distributed, so even though the energy density falls (immediately), the entropy increases (with time).

        If you have time to work through the Nuffield book on Change and Chance, it’s well worth it. It illustrates the difference very clearly, with experiments you can do at home, if you buy some lab equipment or write some software.

        • pjcamp says:

          Entropy is a form of energy. It is measured in units of joules per kelvin (J/K) and appears in the first law of thermodynamics (conservation of energy) thus:

          dU = deltaQ + deltaW

          where U is the total internal energy, deltaQ is the heat transfer and deltaW is the work done. If the system is such that temperature T is a well defined quantity, then deltaQ = TdS where S is the entropy. You can’t set an energy equal to a thing that isn’t an energy. Energy is not the same as entropy, but entropy (or more accurately TdS) is a form of energy. It is a portion of the energy that is defined by the way in which it is distributed. But the distribution is not itself the entropy. This is the sort of argument that makes entropy difficult to understand.
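
          Spelling that out with the usual quasi-static conventions (my restatement, and the sign convention for deltaW varies between textbooks):

          dU = \delta Q + \delta W, \qquad \delta Q = T\,dS, \qquad \delta W = -P\,dV
          \;\;\Rightarrow\;\; dU = T\,dS - P\,dV,

          so the T dS term is indeed an energy: kelvins times joules per kelvin gives joules.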

          • Saying J/K is a form of Energy is like saying m/s is a form of distance, or Force is a form of momentum. Really?

          • Let me add – there is a real difference between the understanding that comes from thermodynamics rooted in statistics and the classical understanding. When approached from the statistical viewpoint, the distribution is everything. Run your own program: see whether it evolves to the "evenly distributed" or "uniformly distributed" map your posts refer to.

          • pjcamp says:

            No, not really. Read the whole thing instead of cherry-picking. Changes in internal energy are related to changes in entropy and in work done. I suppose you assert that work is unrelated to energy as well?

            In point of fact, I have run my own program — the last time I taught Thermal Physics, last fall. When I did that, I drew on modern, research-based pedagogy from the Paradigms program at Oregon State, where I got the whole dice idea. When you "run the program", what do you see? You see the dice spreading out until they come as close to uniformly distributed as it is possible to come with that number of dice on that number of squares. That is exactly what happens to energy quanta in a real physical system as well.

            In the modern point of view, entropy is viewed as how much energy is spread out at a specific temperature. At equilibrium, there are many equivalent ways to maximally spread out the energy of the system, as the dice illustrate. Because the dice/energy quanta spread out over time, there are a greater number of microstates with equivalent energy in the final state than in the initial state. That's the whole point.
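
            To put a number on "a greater number of microstates", here is a small count (my sketch, which assumes the dice are distinguishable, the usual Boltzmann-style bookkeeping): the number of ways N distinguishable dice can realize a given occupancy pattern is the multinomial coefficient N!/(n1! n2! ...).

            import math

            def microstates(occupancy):
                """Ways distinguishable dice can produce this occupancy pattern."""
                ways = math.factorial(sum(occupancy))
                for n in occupancy:
                    ways //= math.factorial(n)
                return ways

            squares, dice = 25, 40
            clumped = [dice] + [0] * (squares - 1)  # all 40 dice piled on one square
            spread = [2] * 15 + [1] * 10            # 40 dice spread over all 25 squares
            print("clumped:", microstates(clumped))          # 1
            print("spread: ", f"{microstates(spread):.2e}")  # about 2.5e43

            The spread-out pattern corresponds to astronomically more microstates, which is the sense in which the dispersal of the quanta raises the entropy.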

            This point of view makes clear some previously very mysterious points, such as entropy of mixing, where two materials being mixed are at the same temperature and pressure so there can be no work done nor heat exchanged between them. Nevertheless, the entropy does increase, in this instance because of the literal spreading out of motional energy for each material in the larger volume.

            The energy spreading point of view has a long history in physics. Lord Kelvin wrote about it in 1852 (http://zapatopi.net/kelvin/papers/on_a_universal_tendency.html), and it has been a common view in physics and physical chemistry since the 1950s. Frank Lambert, of Occidental College, has written extensively on the energy spreading point of view (http://entropysite.oxy.edu/), and Dan Styer, of Oberlin College, has advocated a similar point of view (Styer D. F., 2000, Am. J. Phys. 68: 1090-96). In Lambert's words, "Entropy change measures the dispersal of energy: how much energy is spread out in a particular process, or how widely spread out it becomes." Which pretty much sounds like what I've been describing.

            The quantity TS can be shown to be the amount of mechanical energy that has been converted into thermal energy by system irreversibilities. So does this mean S/T is an energy measure? Yes, it does.

            You ask if m/s is a form of distance, by which I assume you are making an inarticulate reference to velocity, or if force is a form of momentum, by which I assume you're referring to the Newtonian F = dp/dt relation. Yes. Like any ratio, velocity can be interpreted physically along the lines of "how much of the thing on the top goes with one of the thing on the bottom." So velocity can be interpreted as "how much distance an object travels in each second." That sounds like a form of displacement to me. Force can be interpreted as "how much momentum is transferred in each second." I'll let you work out entropy.

            I'll also let you go pick nits somewhere else. I'm done here.

      • pjcamp says:

         That’s essentially the idea. Randomly distributed is as close to uniformly distributed as you can get when the number of energy quanta and the number of particles don’t match. The exact distribution varies a bit from moment to moment as particles exchange quanta, but being more or less evenly distributed there’s no particular preferred direction(s) to exchange so over time it averages out to a steady value at equilibrium.

        • pjcamp, do you mean "uniform distribution" as the technical term, as in http://en.wikipedia.org/wiki/Uniform_distribution_(continuous)? If so, then no. That isn't it at all.

        • pjcamp – I’ll use your model of a board with counters to show why energy is not evenly distributed at maximum entropy.

          The more likely a distribution pattern is, the greater the entropy. That’s what the entropy value is.

          Take a 5×5 board (crystalline solid) and 25 counters (quanta). There is exactly one way to put down all the counters so that every cell has one counter. Take one counter from anywhere on the board, and move it to another cell. There are 25 cells you can take the counter from, and 24 you can move it to, so there are 600 possible ways for this distribution (0 quanta in 1 cell, 1 quantum in 23 cells, 2 quanta in 1 cell) to arise. Repeat the process, and again look at the distribution. You may get 2 cells with no quanta, 22 cells with one, and 2 cells with 2: very probable, lots of combinations look like that. A few – 25 – will have one cell with three quanta; and only one will have the even rectangular distribution, which becomes increasingly improbable as the board evolves.

          The even, rectangular distribution still has only one way to arise, and it’s actually less probable than “every counter in one cell”, because there are 25 ways to do that.

          • pjcamp says:

            That’s true, and also not important. Any single configuration has a vanishingly small probability. The point is that scattering themselves across the board is highly likely whereas unscattering is not. So barring outside interactions, moving quickly toward a maximally scattered state, and then fluctuating between the various maximally scattered and nearly maximally scattered states, is by far the most likely outcome.

            I specifically excluded the case of having the number of quanta exactly match the number of particles for that very reason. That case is (a) very unlikely for a real physical system, and (b) excluding it rules out the even rectangular distribution and pretty much requires random fluctuations, which are an important part of the physics. So when you say "take a 5×5 board with 25 counters" you're not actually using my model. You're taking the very situation I deliberately excluded as an exemplar.

            Well, OK: based on that assumption your point is valid, but it is not relevant to the example I gave, and it also models a set of physical systems so unlikely that they pretty much never occur.

  7. pjcamp says:

     Oh, and also: “The more you put things together, the more they fall apart.” — Doctor Who.

  8. I took A-level physics in the UK 30 years ago, and my school followed a syllabus which delivered not only statistical thermodynamics but also rudimentary quantum mechanics. For many years the textbooks were unavailable, but in the past 5 years, with the emergence of "STEM teaching is important", they have become available again at http://www.nationalstemcentre.org.uk/elibrary/collection/711/nuffield-advanced-science-physics. Register and download them all.

    Part 4 of "Physics Student's Book and Teachers' Guide Unit 9 – Change and Chance" develops the model of a solid with fixed intervals between energy levels from Einstein's 1907 paper on Planck's radiation theory and specific heat. On page 121, the AP-level text arrives at this statement:

    “What is entropy? It is just the logarithm of the number W of distinct states a thing can be in, scaled down by the constant k [Boltzmann constant] to make the value more manageable.”
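
    For concreteness, here is that statement applied to the Einstein solid the book builds up (a sketch of mine, not from the text; the multiplicity W = (q + N - 1)!/(q!(N - 1)!) for q quanta shared among N oscillators is the standard result):

    import math

    k_B = 1.380649e-23  # Boltzmann constant, J/K

    def einstein_entropy(n_oscillators, n_quanta):
        """S = k ln W for q quanta shared among N equally spaced oscillators."""
        W = math.comb(n_quanta + n_oscillators - 1, n_quanta)
        return k_B * math.log(W)

    for q in (50, 100, 200):
        print(f"N = 100 oscillators, q = {q:3d} quanta:  S = {einstein_entropy(100, q):.2e} J/K")

    Adding quanta (energy) increases W, and hence S, because there are more distinct ways to share the quanta out.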

    My son is about to go to high school in the US, and he’s getting interested in the sciences. I was very surprised to find that AP physics was purely classical: it has Maxwell’s equations, but no wave mechanics to speak of, no measurement of the speed of light, and absolutely no quantum mechanics.
