Entropy? It's easy!

This post is a free translation of the answer that Mark Eichenlaub gave to the question "What's an intuitive way to understand entropy?" on Quora.

Entropy. It is perhaps one of the hardest concepts to grasp that you will meet in a physics course, at least in classical physics. Few physics graduates can explain what it is. Most of the trouble with understanding entropy, however, disappears once you grasp one thing: entropy is qualitatively different from other thermodynamic quantities such as pressure, volume, or internal energy, because it is a property not of the system itself but of how we look at the system. Unfortunately, in a thermodynamics course it is usually treated alongside the other thermodynamic functions, which only deepens the confusion.

So what is entropy?
In a nutshell:

Entropy is how much information about the system you do not know.

For example, if you ask me where I live and I answer "in Russia", my entropy for you is high, because Russia is a big country. If I then tell you my postal code, 603081, my entropy for you decreases, because you receive more information.


A postal code contains six digits, which means I have given you six characters of information. The entropy of your knowledge about me has dropped by roughly six characters. (Strictly speaking, not exactly six, because some codes correspond to more addresses and some to fewer, but we will neglect that.)
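To make the counting explicit (a small worked example, not from the original answer, assuming for simplicity that all six-digit codes are equally likely): there are 10⁶ possible six-digit codes, so learning the code removes

log₁₀ 10⁶ = 6

characters of uncertainty, in agreement with the estimate above.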


Or consider another example. Suppose I have ten six-sided dice, and after throwing them I tell you that their sum is 30. Knowing only that, you cannot say which specific number each die shows: you do not have enough information. In statistical physics, the specific numbers on the dice are called a microstate, and their total sum (30 in our case) a macrostate. There are 2,930,455 microstates corresponding to a sum of 30, so the entropy of this macrostate is about 6.5 characters (the half comes from the fact that if you number the microstates in order with seven digits, the leading digit takes only the values 0, 1, and 2, not all ten).

And what if I tell you that the sum is 59? This macrostate has only 10 possible microstates, so its entropy is just one character. As you can see, different macrostates have different entropies.

Now suppose I tell you that the sum of the first five dice is 13 and the sum of the remaining five is 17, so the total is again 30. In this case, however, you have more information, so the entropy of the system for you should drop. Indeed, five dice can sum to 13 in 420 different ways and to 17 in 780 ways, so the total number of microstates is only 420 × 780 = 327,600. The entropy is about one character smaller than in the first example.
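These microstate counts are easy to check on a computer. Below is a minimal Python sketch (not part of the original answer; the function name ways is just an illustrative choice) that counts the dice microstates by simple recursion and converts the counts into "characters" via log₁₀.

```python
from math import log10
from functools import lru_cache

@lru_cache(maxsize=None)
def ways(n_dice, total):
    """Number of ways n_dice six-sided dice can add up to `total`."""
    if n_dice == 0:
        return 1 if total == 0 else 0
    return sum(ways(n_dice - 1, total - face) for face in range(1, 7))

for s in (30, 59):
    omega = ways(10, s)
    print(f"sum {s}: {omega} microstates, entropy ~ {log10(omega):.2f} characters")

# Partial information: the first five dice add up to 13, the last five to 17.
omega_split = ways(5, 13) * ways(5, 17)
print(f"13 + 17: {omega_split} microstates, entropy ~ {log10(omega_split):.2f} characters")
```

Running it reproduces the numbers quoted above: 2,930,455 microstates (about 6.5 characters) for a sum of 30, 10 microstates (1 character) for 59, and 420 × 780 = 327,600 (about 5.5 characters) when the two partial sums are known.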

We measure entropy as the number of characters needed to write down the number of microstates. Mathematically, it is defined as the logarithm of that number, so denoting the entropy by S and the number of microstates by Ω, we can write:

S = log Ω

This is nothing but Boltzmann's formula for the entropy (up to a factor k, which depends on the chosen units). If a macrostate corresponds to exactly one microstate, its entropy by this formula is zero. If you have two systems, their total entropy is the sum of the entropies of each, because log(AB) = log A + log B.
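As a quick check of this additivity, using the dice numbers from above: the group of five dice summing to 13 has log₁₀ 420 ≈ 2.62 characters of entropy, the group summing to 17 has log₁₀ 780 ≈ 2.89, and their sum, about 5.52, is exactly log₁₀(420 × 780) = log₁₀ 327600, the entropy of the combined macrostate.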


From this description it becomes clear why one should not think of entropy as a property of the system itself. The system has a definite internal energy, momentum, and charge, but it has no definite entropy: the entropy of the ten dice depends on whether you know only their total sum, or also the partial sums of the two groups of five.

In other words, entropy is a feature of how we describe the system, and that distinguishes it sharply from the other quantities physics usually works with.

A physical example: gas under a piston
The classic system considered in physics is a gas confined in a vessel under a piston. A microstate of the gas is the position and momentum (velocity) of each of its molecules; it is the analogue of knowing the value shown by each die in the example above. A macrostate of the gas is described by quantities such as pressure, density, volume, and chemical composition; it is the analogue of the sum of the rolled dice.


The quantities describing a macrostate can be related to one another by a so-called "equation of state". This relation lets us predict, without knowing the microstates, what will happen to the system if we start heating it or moving the piston. For an ideal gas the equation of state has the simple form:

p = ρT

although you are probably more familiar with the Clapeyron-Mendeleev form pV = νRT; it is the same equation with a couple of constants added to confuse you. The more microstates correspond to a given macrostate (that is, the more particles our system contains), the better the equation of state describes it. For a gas, the typical number of particles is of the order of Avogadro's number, about 10²³.
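To see that the two forms really are the same equation (a short check, with ρ understood as the number of particles per unit volume): since νR = Nk, we have pV = νRT = NkT, so

p = (N/V) kT = ρkT,

and in units where the Boltzmann constant k equals 1 this is simply p = ρT.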

Quantities such as pressure, temperature, and density are called averaged, because they are averages over the rapidly succeeding microstates corresponding to a given macrostate (or rather, to macrostates close to it). To find out which microstate the system is in, we need a great deal of information: we need to know the position and velocity of every particle. The amount of this information is what we call the entropy.

How does the entropy change when the macrostate changes? It is easy to see. For example, if we heat the gas a little, the speeds of its particles increase, and so does the extent of our ignorance about those speeds; that is, the entropy grows. Or if we increase the volume of the gas by pulling the piston back, the extent of our ignorance about the positions of the particles grows, and the entropy grows as well.
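A standard textbook illustration of the second effect (not part of the original answer): if an ideal gas expands at constant temperature from volume V to 2V, each of its N molecules can now be in twice as many places, so the number of microstates grows by a factor of 2^N and the entropy grows by log 2^N = N log 2, that is, one bit (about 0.3 decimal characters) per molecule.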

Solids and potential energy
If instead of a gas we consider a solid, especially one with an ordered structure such as a crystal, for example a piece of metal, its entropy is small. Why? Because, knowing the position of one atom in such a structure, you also know the positions of all the others (they are arranged in a regular crystal lattice), and the speeds of the atoms are small, because they cannot fly away from their sites and only oscillate slightly around their equilibrium positions.

If the piece of metal is in a gravitational field (for example, raised above the surface of the Earth), the potential energy of each atom in it is approximately equal to that of the other atoms, so the entropy associated with this energy is low. In this the potential energy differs from the kinetic energy, which, due to thermal motion, can vary strongly from atom to atom.

If we release the piece of metal from some height, its potential energy turns into kinetic energy, but the entropy hardly grows, because all the atoms move in roughly the same way. But when the piece hits the ground, during the impact the atoms of the metal acquire random directions of motion, and the entropy rises sharply. The kinetic energy of directed motion becomes the kinetic energy of thermal motion. Before the impact we knew roughly how each atom was moving; now we have lost that information.
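For a sense of scale (a rough estimate, not from the original answer, assuming all of the kinetic energy turns into heat inside the metal and none is carried away by the ground or by sound): a piece of iron dropped from a height h = 10 m arrives with gh ≈ 98 J of kinetic energy per kilogram, and with a specific heat of about 450 J/(kg·K) this warms it by only

ΔT ≈ gh / c ≈ 98 / 450 ≈ 0.2 K,

a tiny temperature change, yet one that represents a huge increase in entropy, because the ordered motion of some 10²⁵ atoms has been randomized.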

Understanding the second law of thermodynamics
The second law of thermodynamics states that the entropy of a closed system always increases (more precisely, never decreases). We can now understand why: you cannot suddenly obtain more information about the microstates. Once you have lost some information about the microstate (as during the impact of the piece of metal on the ground), you cannot get it back.


Let us go back to the dice. Recall that the macrostate with a sum of 59 has a very low entropy, but it is also not easy to obtain. If you roll the dice over and over, the sums (macrostates) that come up will be those corresponding to a larger number of microstates; that is, macrostates with large entropy will be realized. The sum 35 has the largest entropy, and it will come up more often than any other. This is exactly what the second law of thermodynamics is about: any random (uncontrolled) interaction increases the entropy, at least until it reaches its maximum.
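The statistics behind this are easy to reproduce. Here is a small self-contained Python sketch (again, an illustration rather than part of the original answer) that builds the distribution of the sum of ten dice by repeated convolution and finds the most probable macrostate.

```python
from math import log10

# Distribution of the sum of ten six-sided dice, built die by die.
counts = {0: 1}
for _ in range(10):
    new_counts = {}
    for s, c in counts.items():
        for face in range(1, 7):
            new_counts[s + face] = new_counts.get(s + face, 0) + c
    counts = new_counts

top = max(counts, key=counts.get)
print(f"most probable sum: {top}, "
      f"{counts[top]} microstates, entropy ~ {log10(counts[top]):.2f} characters")
```

The most probable sum indeed turns out to be 35, the macrostate with the largest entropy, while rare macrostates like 59 almost never appear.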

Mixing gases
One more example to reinforce what has been said. Suppose we have a container with two gases separated by a partition in the middle. Let us call the molecules of one gas blue and those of the other red.

If we open the partition, the gases begin to mix, because the number of microstates in which the gases are mixed is much larger than the number of microstates in which they are separated, while all microstates are, naturally, equally probable. By opening the partition, we lost, for each molecule, the information about which side of the partition it is now on. If there were N molecules, N bits of information have been lost (bits or characters, in this context, are essentially the same thing and differ only by a constant factor).
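The count of lost bits is easy to see directly (a worked restatement of the claim above): before the partition is opened we know, for each molecule, which half of the container it is in, while afterwards each of the N molecules can be in either half, so the number of possible arrangements grows by roughly a factor of 2^N, and the entropy grows by log₂ 2^N = N bits.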

Sorting out Maxwell's demon
Finally, let us consider, within our framework, the resolution of the famous paradox of Maxwell's demon. Recall what it is. Suppose we have the mixed gases of blue and red molecules. Let us put the partition back and make a small hole in it, at which we station an imaginary demon. His job is to let only red molecules pass from left to right and only blue ones from right to left. Clearly, after some time the gases will be separated again: all the blue molecules will end up to the left of the partition and all the red ones to the right.


It turns out that our demon has lowered the entropy of the system. Nothing has happened to the demon itself, that is, its entropy has not changed, and our system was closed. So we seem to have found an example in which the second law of thermodynamics is violated! How is that possible?

The paradox, however, is resolved very simply. Entropy is a property not of the system but of our knowledge about the system. You and I know little about the system, which is why it seems to us that its entropy is decreasing. But the demon knows a great deal about the system: to separate the molecules, it must know the position and velocity of each of them (at least of those approaching it). If it knows everything about the molecules, then from its point of view the entropy of the system is effectively zero: there is no information about it that the demon lacks. In that case the entropy of the system was zero and remains zero, and the second law of thermodynamics is never violated.

But even if the demon does not know the full microstate of the system, it must at least know the color of each molecule flying toward it, in order to decide whether to let it through. And if the total number of molecules is N, the demon must possess N bits of information about the system, which is exactly as much information as we lost when we opened the partition. That is, the amount of information lost is exactly equal to the amount of information that must be obtained about the system in order to return it to its initial state. This sounds quite logical and, again, does not contradict the second law of thermodynamics.

Source: geektimes.ru/post/246406/