The meaning of the word "entropy". Internally and externally reversible processes

Entropy is a term used not only in the exact sciences but also in the humanities. In the most general case, it is a measure of the randomness, or disorder, of a system.

As is well known, mankind has always strived to shift as much work as possible onto machines and mechanisms while spending as few resources as possible on it. Mentions of a perpetual motion machine were first found in Arabic manuscripts of the 16th century. Since then, many designs for a potentially perpetual motion machine have been proposed. After many unsuccessful experiments, scientists came to understand certain features of nature that later determined the foundations of thermodynamics.

Drawing of a perpetual motion machine

The first law of thermodynamics states the following: to perform work, a thermodynamic system requires either the internal energy of the system or external energy from additional sources. This statement is the thermodynamic law of conservation of energy, and it forbids the existence of a perpetual motion machine of the first kind: a system that does work without expending energy. The mechanism of one such engine was based on the internal energy of a body, which can be converted into work, for example through expansion. But humanity knows of no bodies or systems that can expand indefinitely, which means that sooner or later their internal energy will be exhausted and the engine will stop.

Somewhat later, the so-called perpetual motion machine of the second kind appeared. It did not contradict the law of conservation of energy and was based on extracting the heat required for work from surrounding bodies. The ocean was taken as an example: by cooling it, one could presumably obtain an impressive supply of heat. However, in 1865 the German mathematician and physicist R. Clausius formulated the second law of thermodynamics: "a repeating process cannot exist if its only result is the transfer of heat from a less heated body to a more heated one, and nothing more." Later he introduced the concept of entropy: a function whose change is equal to the ratio of the amount of heat transferred to the temperature.
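
In modern textbook notation (a standard form, not a quotation from the sources above), the entropy change in a reversible process is written as

    dS = δQ / T,

and for a finite reversible process at constant temperature simply ΔS = Q / T, where Q is the heat received by the system and T is its absolute temperature.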

After that, the law of non-decreasing entropy became an alternative formulation of the second law of thermodynamics: "in a closed system, entropy does not decrease."

In simple words

Since entropy appears in a wide variety of areas of human activity, its definition is somewhat vague. Nevertheless, the simplest examples make the essence of this quantity clear. Entropy is the degree of disorder, in other words, of uncertainty or randomness. A system of paper scraps scattered on the street, periodically thrown up by the wind, has high entropy. A system of papers folded into a pile on a desk has minimal entropy. To lower the entropy of the system with paper scraps, you would have to spend a lot of time and energy gluing the scraps back into full sheets and folding them into a pile.

The case of a closed system is just as simple. For example, your things are in a closed closet. If you do not act on them from the outside, the things will seem to retain their entropy value for a long time. But sooner or later they will decompose: a woolen sock will take up to five years to decompose, while leather shoes will take about forty years. In this example, the closet is an isolated system, and the decomposition of the things in it is a transition from ordered structures to chaos.

Summing up, it should be noted that minimal entropy is observed for macroscopic objects (those that can be seen with the naked eye) that have a definite structure, and maximal entropy for a vacuum.

Entropy of the Universe

With the emergence of the concept of entropy, many other statements and physical definitions appeared that made it possible to describe the laws of nature in more detail. One of them is the notion of "reversible/irreversible processes". The former are processes in which the entropy of the system does not increase but remains constant. Irreversible processes are those in which the entropy of a closed system increases. A closed system cannot be returned to its state before such a process, because that would require the entropy to decrease.

According to Clausius, the existence of the Universe is an irreversible process, at the end of which the so-called "heat death" awaits it: the thermodynamic equilibrium characteristic of closed systems. That is, entropy will reach its maximum and all processes will simply die out. But, as it soon turned out, Rudolf Clausius did not take into account the forces of gravity, which are present everywhere in the Universe. Because of them, for example, the distribution of particles at maximum entropy does not have to be uniform.

Other shortcomings of the theory of the "heat death of the Universe" include the fact that we do not know whether the Universe is really finite and whether the concept of a "closed system" can be applied to it at all. It should also be kept in mind that the state of maximum entropy, like absolute vacuum itself, is a theoretical idealization, much like an ideal gas. This means that in reality entropy will not reach its maximum value, owing to various random deviations.

It is noteworthy that the visible Universe retains its entropy value within its volume. The reason for this is a phenomenon already familiar to many: the relic (cosmic microwave background) radiation of the Universe. This interesting coincidence once again shows that nothing in nature happens just like that. According to scientists, in order of magnitude the value of this entropy is equal to the number of existing photons.

  • The word "chaos" refers to the initial state of the Universe. At that moment, it was just a formless collection of space and matter.
  • According to the research of some scientists, the largest source of entropy is supermassive black holes. Others, however, believe that due to the powerful gravitational forces attracting everything to such a massive body, the measure of chaos is transferred to the surrounding space only in an insignificant amount.
  • It is interesting that human life and evolution are directed away from chaos. Scientists argue that this is possible because throughout its life a person, like any other living organism, takes in less entropy than it gives off to the environment.

Entropy. This is perhaps one of the most difficult concepts to grasp that you will meet in a physics course, at least in classical physics. Few physics graduates can explain what it is. Most of the problems with understanding entropy, however, can be resolved by understanding one thing: entropy is qualitatively different from other thermodynamic quantities such as pressure, volume or internal energy, because it is a property not of the system itself but of how we consider that system. Unfortunately, in a thermodynamics course it is usually treated on a par with the other thermodynamic functions, which only deepens the misunderstanding.

So what is entropy?

In a nutshell:

Entropy is how much information you don't know about a system.

For example, if you ask me where I live and I answer "in Russia", then my entropy for you will be high: after all, Russia is a big country. If I give you my zip code, 603081, my entropy for you will decrease, because you receive more information.


The zip code contains six digits, so I have given you six characters of information. The entropy of your knowledge about me has decreased by approximately six characters. (Actually, not quite, because some zip codes correspond to more addresses and some to fewer, but we will neglect this.)
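
The rough count behind "approximately six characters" (a back-of-the-envelope estimate, assuming all six-digit codes are possible and equally likely): there are 10⁶ six-digit zip codes, so learning the code removes

    log₁₀(10⁶) = 6

decimal characters of uncertainty.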

Or consider another example. Suppose I have ten six-sided dice, and after rolling them I tell you that their sum is 30. Knowing only this, you cannot say which specific number is on each die: you do not have enough information. In statistical physics, the specific numbers on the dice are called microstates, and the total sum (30 in our case) is called the macrostate. There are 2,930,455 microstates that add up to 30, so the entropy of this macrostate is approximately 6.5 characters (the half appears because, when the microstates are numbered in order, the seventh digit cannot take any value but only 0, 1 or 2).


What if I told you that the sum is 59? There are only 10 possible microstates for this macrostate, so its entropy is only about one character. As you can see, different macrostates have different entropies.

Now let me tell you that the sum of the first five dice is 13 and the sum of the other five is 17, so the total is again 30. You have more information in this case, so the entropy of the system should drop for you. Indeed, 13 on five dice can be obtained in 420 different ways, and 17 in 780, so the total number of microstates is only 420 × 780 = 327,600. The entropy of such a system is approximately one character less than in the first example.
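
The microstate counts quoted above are easy to check numerically. The sketch below is not from the original post; it is a minimal Python check that tabulates, with a simple dynamic-programming pass, how many ways n six-sided dice can produce each sum, and converts the count into decimal characters of entropy.

    from math import log10

    def microstate_counts(n_dice, sides=6):
        """Map each possible sum of n_dice dice to the number of microstates producing it."""
        counts = {0: 1}
        for _ in range(n_dice):
            new_counts = {}
            for total, ways in counts.items():
                for face in range(1, sides + 1):
                    new_counts[total + face] = new_counts.get(total + face, 0) + ways
            counts = new_counts
        return counts

    ten = microstate_counts(10)
    print(ten[30], round(log10(ten[30]), 2))     # 2930455 microstates, ~6.47 characters
    print(ten[59], round(log10(ten[59]), 2))     # 10 microstates, exactly 1 character

    five = microstate_counts(5)
    print(five[13], five[17])                    # 420 and 780
    print(round(log10(five[13] * five[17]), 2))  # ~5.52, about one character less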

We measure entropy as the number of characters required to write down the number of microstates. Mathematically, this number is defined as a logarithm, so, denoting the entropy by the symbol S and the number of microstates by the symbol Ω, we can write:

    S = log Ω

This is nothing but the Boltzmann formula for entropy (up to a factor k, which depends on the chosen units of measurement). If a macrostate corresponds to a single microstate, its entropy by this formula is zero. If you have two systems, their total entropy is the sum of the entropies of each of them, because log(AB) = log A + log B.

From the above description it becomes clear why one should not think of entropy as an intrinsic property of the system. The system has a certain internal energy, momentum and charge, but it does not have a certain entropy: the entropy of ten dice depends on whether you know only their total sum or also the partial sums of the two groups of five.

In other words, entropy is how we describe a system. And this makes it very different from other quantities with which it is customary to work in physics.

Physical example: gas under a piston

The classical system considered in physics is a gas in a vessel under a piston. The microstate of the gas is the position and momentum (velocity) of each of its molecules. This is equivalent to knowing the value rolled on each die in the example above. The macrostate of the gas is described by quantities such as pressure, density, volume and chemical composition. It is like the sum of the values rolled on the dice.


The quantities describing a macrostate can be related to each other through the so-called "equation of state". It is the existence of this relation that allows us, without knowing the microstates, to predict what will happen to our system if we start heating it or moving the piston. For an ideal gas the equation of state has a simple form:

    p = ρT

although you are probably more familiar with the Clapeyron-Mendeleev equation pV = νRT: it is the same equation, just with a couple of constants added to confuse you. The more microstates correspond to a given macrostate, that is, the more particles make up our system, the better the equation of state describes it. For a gas, the characteristic number of particles is of the order of Avogadro's number, that is, about 10²³.
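
For reference, here is how the two forms line up (a standard rearrangement, not part of the original text; here ρ is taken to be the number density N/V and k is the Boltzmann constant):

    p = ρkT = (N/V)kT,   hence   pV = NkT = νNₐkT = νRT,   since N = νNₐ and R = Nₐk.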

Quantities such as pressure, temperature and density are called averaged, since they are the averaged manifestation of the constantly changing microstates corresponding to a given macrostate (or, more precisely, to macrostates close to it). To find out which microstate the system is in, we would need a lot of information: the position and velocity of every particle. The amount of this missing information is called entropy.

How does entropy change when the macrostate changes? This is easy to understand. For example, if we heat the gas a little, the velocities of its particles increase, and so does our ignorance of those velocities; that is, the entropy grows. Or if we increase the volume of the gas by quickly pulling back the piston, our ignorance of the particles' positions increases, and the entropy grows as well.

Solids and Potential Energy

If instead of a gas we consider a solid body, especially one with an ordered structure, such as a crystal (for example, a piece of metal), its entropy will be low. Why? Because, knowing the position of one atom in such a structure, you know the positions of all the others (they are arranged in a regular crystal lattice), while the velocities of the atoms are small: they cannot fly far from their positions and only oscillate slightly around their equilibrium positions.


If a piece of metal is in a gravitational field (for example, lifted above the surface of the Earth), then the potential energy of each atom in the metal is approximately equal to the potential energy of other atoms, and the entropy associated with this energy is low. This distinguishes potential energy from kinetic energy, which for thermal motion can vary greatly from atom to atom.

If the raised piece of metal is released, its potential energy turns into kinetic energy, but the entropy hardly increases, because all the atoms move in approximately the same way. But when the piece hits the ground, during the impact its atoms acquire random directions of motion and the entropy increases sharply. The kinetic energy of directed motion turns into the kinetic energy of thermal motion. Before the impact we knew approximately how each atom was moving; now we have lost that information.

Understanding the second law of thermodynamics

The second law of thermodynamics states that the entropy of a closed system never decreases. Now we can understand why: because it is impossible to suddenly obtain more information about the microstates. Once you have lost some information about a microstate (as when the piece of metal hits the ground), you cannot get it back.


Let's get back to the dice. Recall that the macrostate with a sum of 59 has very low entropy, but it is not easy to obtain either. If you roll the dice over and over again, the sums (macrostates) corresponding to a larger number of microstates will come up most often; that is, macrostates with large entropy will be realized. The sum of 35 has the highest entropy, and it is precisely this sum that comes up more often than the others. This is exactly what the second law of thermodynamics says: any random (uncontrolled) interaction leads to an increase in entropy, at least until it reaches its maximum.
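
A quick simulation illustrates the same point. This is not part of the original post, just a small Python sketch: roll ten dice many times and see which sums come up most often.

    import random
    from collections import Counter

    random.seed(1)
    trials = 200_000
    sums = Counter(sum(random.randint(1, 6) for _ in range(10)) for _ in range(trials))

    # The sums with the most microstates dominate; 35 sits at the peak of the
    # distribution, while a sum of 59 (only 10 microstates) is vanishingly rare.
    print(sums.most_common(3))
    print(sums.get(59, 0))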

Gas mixing

And one more example to reinforce what has been said. Suppose we have a vessel with two gases separated by a partition in the middle. Let's call the molecules of one gas blue and those of the other red.

If you remove the partition, the gases begin to mix, because the number of microstates in which the gases are mixed is much greater than the number in which they are separated, and all microstates are, of course, equally probable. When we removed the partition, we lost, for each molecule, the information about which side of the partition it is now on. If there were N molecules, then N bits of information were lost (bits and characters, in this context, are essentially the same thing and differ only by a constant factor).
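
The lost information can also be counted directly (a standard back-of-the-envelope estimate, not part of the original text): once the partition is removed, each of the N molecules can be on either side of the vessel, so the number of accessible microstates grows by a factor of 2^N, and

    ΔS = log₂(2^N) = N bits,   or ΔS = Nk ln 2 in thermodynamic units.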

Dealing with Maxwell's Demon

Finally, let's consider, within our picture, the resolution of the famous paradox of Maxwell's demon. Let me remind you what it is. Suppose our blue and red molecules are mixed. Let's put the partition back, make a small hole in it, and place an imaginary demon at the hole. His task is to let only the red molecules pass from left to right and only the blue ones from right to left. Obviously, after some time the gases will be separated again: all the blue molecules will end up to the left of the partition and all the red ones to the right.

It turns out that our demon has lowered the entropy of the system. Nothing has happened to the demon, that is, its entropy has not changed, and our system was closed. So we have found an example in which the second law of thermodynamics is violated! How is that possible?

The paradox is resolved, however, very simply. Entropy is a property not of the system but of our knowledge about the system. You and I know little about the system, and that is why it seems to us that its entropy is decreasing. But our demon knows a great deal about the system: to separate the molecules, he must know the position and velocity of each of them (at least as it approaches him). If he knows everything about the molecules, then from his point of view the entropy of the system is effectively zero; there is simply no information about it that he is missing. In that case the entropy of the system was zero and remained zero, and the second law of thermodynamics is not violated anywhere.

But even if the demon does not know all the information about the microstate of the system, he at least needs to know the color of each molecule flying up to him in order to decide whether to let it through. And if the total number of molecules is N, then the demon must possess N bits of information about the system, and that is exactly how much information we lost when we removed the partition. That is, the amount of information lost is exactly equal to the amount of information that must be obtained about the system in order to return it to its original state, which sounds quite logical and, again, does not contradict the second law of thermodynamics.

This post is a loose translation of the answer that Mark Eichenlaub gave to the question What's an intuitive way to understand entropy?, asked on Quora.

Entropy is a measure of the complexity of a system. Not of disorder, but of complication and development. The greater the entropy, the harder it is to understand the logic of this particular system, situation or phenomenon. It is generally accepted that the more time passes, the less ordered the Universe becomes. The reason for this is the unequal rate of development of the Universe as a whole and of us as observers of entropy. We, as observers, are many orders of magnitude simpler than the Universe. That is why it seems excessively redundant to us; we are not able to grasp most of the cause-and-effect relationships that make it up. The psychological aspect also matters: it is hard for people to get used to the fact that they are not unique. The thesis that humans are the crown of evolution is not far removed from the earlier belief that the Earth is the center of the universe. It is pleasant for a person to believe in his own exclusiveness, and it is not surprising that we tend to see structures more complex than ourselves as disordered and chaotic.

There are some very good answers above explaining entropy within the current scientific paradigm. The respondents explain the phenomenon with simple examples: socks scattered around the room, broken glasses, monkeys playing chess, and so on. But if you look closely, you see that the order in these examples is expressed in a distinctly human way. The word "better" applies to a good half of them. Socks folded in the closet are better than socks scattered on the floor. A whole glass is better than a broken one. A notebook written in beautiful handwriting is better than a notebook full of blots. In human logic it is unclear what to do with entropy. The smoke coming out of a chimney is of no use. A book torn to pieces is useless. It is hard to extract even a minimum of information from the many-voiced conversation and noise of the subway. In this sense it is very interesting to return to the definition of entropy introduced by the physicist and mathematician Rudolf Clausius, who saw in this phenomenon a measure of the irreversible dissipation of energy. Dissipated from whom? For whom does it become harder to use? For man! Spilled water is very hard (if not impossible) to collect back into the glass drop by drop. To repair old clothes you need new material (fabric, thread, and so on). And this leaves out the meaning that such entropy may carry for something other than people. I will give an example in which the dissipation of energy for us carries the exact opposite meaning for another system:

You know that every second a huge amount of information flies from our planet into space, for example in the form of radio waves. To us this information seems completely lost. But if a sufficiently developed alien civilization lies in the path of those radio waves, its representatives can receive and decipher part of this energy that is lost to us: hear and understand our voices, watch our television and radio programs, even connect to our Internet traffic. In this case our entropy can be ordered by other intelligent beings, and the more energy is dissipated from our point of view, the more energy they can collect.

Both physicists and "lyricists" (as scientists and humanists are called in the Russian saying) operate with the concept of entropy. Translated from ancient Greek, the word "entropy" is associated with a turn, a transformation.

Representatives of the exact sciences (mathematics and physics) introduced this term into scientific use and extended it to computer science and chemistry. R. Clausius, L. Boltzmann, E. Jaynes, C. Shannon, C. Jung and M. Planck defined and investigated the phenomenon named above.

This article summarizes and systematizes the main approaches to the definition of entropy in various scientific fields.

Entropy in the exact and natural sciences

Starting with R. Clausius, a representative of the exact sciences, the term "entropy" has denoted a measure of:

  • irreversible energy dissipation in thermodynamics;
  • the probability of realization of a macroscopic process in statistical physics;
  • the uncertainty of a system in mathematics;
  • the information capacity of a system in computer science.

This measure is expressed by formulas and graphs.

Entropy as a concept in the humanities

C. Jung introduced the now familiar concept into psychoanalysis while studying the dynamics of personality. Researchers in psychology, and later in sociology, single out and define personality entropy or social entropy as a measure of:

  • the uncertainty of the state of a personality in psychology;
  • the psychic energy that cannot be used once it has been invested in the object of study in psychoanalysis;
  • the amount of energy unavailable for social change and social progress in sociology;
  • the dynamics of personality entropy.

The concept of entropy has proved to be in demand and convenient in theories of both the natural sciences and the humanities. In general, entropy is closely related to the measure, or degree, of uncertainty, chaos and disorder in any system.

ENTROPY

(from Greek entropia: turn, transformation)

the part of the internal energy of a closed system, or of the energy of the Universe as a whole, that cannot be used, in particular cannot be transferred or converted into mechanical work. The exact value of entropy is obtained by mathematical calculation. The effect of entropy is seen most clearly in thermodynamic processes: heat never transforms completely into mechanical work but is partly converted into other forms of energy. It is noteworthy that in reversible processes the entropy remains unchanged, while in irreversible processes, on the contrary, it steadily increases, and this increase occurs at the expense of mechanical energy. Consequently, all the irreversible processes occurring in nature are accompanied by a decrease of mechanical energy, which should eventually lead to a general standstill, in other words to "heat death". But this is valid only if the Universe is postulated as a closed empirical whole. Christian theologians, relying on entropy, have spoken of the finiteness of the world, using it as an argument for the existence of God.

Philosophical Encyclopedic Dictionary. 2010.

ENTROPY

(Greek ἐντροπία: turn, transformation) a function of the state of a thermodynamic system that characterizes the direction of spontaneous processes in the system and is a measure of their irreversibility. The concept of entropy was introduced in 1865 by R. Clausius to characterize processes of energy conversion; in 1877 L. Boltzmann gave it a statistical interpretation. With the help of the concept of entropy the second law of thermodynamics is formulated: the entropy of a thermally insulated system always increases; such a system, left to itself, tends toward thermal equilibrium, at which entropy is maximal. In statistical physics entropy expresses the uncertainty of the microscopic state of a system: the more microscopic states of the system correspond to a given macroscopic state, the higher the thermodynamic probability and the entropy of the latter. A system with an improbable structure, left to itself, develops toward the most probable structure, that is, in the direction of increasing entropy. This, however, applies only to closed systems, so entropy cannot be used to justify the heat death of the Universe. In information theory entropy is regarded as the lack of information in a system. In cybernetics, the concepts of entropy and negentropy (negative entropy) express the degree of organization of a system. Being valid for systems governed by statistical laws, this measure nevertheless requires great care when transferred to biological, linguistic and social systems.

Lit.: Shambadal P., Development and Applications of the Concept of Entropy [translation], Moscow, 1967; Pierce J., Symbols, Signals, Noise [translated from English], Moscow, 1967.

L. Fatkin. Moscow.

Philosophical Encyclopedia. In 5 volumes. Moscow: Soviet Encyclopedia. Edited by F. V. Konstantinov. 1960-1970.



See what "ENTROPY" is in other dictionaries:

    - (from the Greek entropia turn, transformation), a concept first introduced in thermodynamics to determine the measure of irreversible energy dissipation. E. is widely used in other areas of science: in statistical physics as a measure of the probability of the implementation of k. ... ... Physical Encyclopedia

    ENTROPY, an indicator of the randomness or disorder of the structure of a physical system. In THERMODYNAMICS, entropy expresses the amount of thermal energy available for doing work: the lower the energy, the higher the entropy. On the scale of the universe ... ... Scientific and technical encyclopedic dictionary

    A measure of the internal disorder of an information system. Entropy increases with a chaotic distribution of information resources and decreases with their ordering. In English: Entropy See also: Information Financial Dictionary Finam ... Financial vocabulary

    - [English] entropy Dictionary of foreign words of the Russian language

    Entropy- Entropy ♦ Entropie A property of the state of an isolated (or perceived as such) physical system, characterized by the amount of spontaneous change it is capable of. The entropy of a system reaches its maximum when it is completely... Philosophical Dictionary of Sponville

    - (from Greek entropia turn transformation) (usually denoted S), the state function of a thermodynamic system, the change in which dS in an equilibrium process is equal to the ratio of the amount of heat dQ communicated to the system or removed from it, to ... ... Big Encyclopedic Dictionary

    Disorder, discord Dictionary of Russian synonyms. entropy noun, number of synonyms: 2 disorder (127) … Synonym dictionary

    ENTROPY- (from the Greek en in, inside and trope turn, transformation), a value characterizing the measure of bound energy (D S), which cannot be converted into work in an isothermal process. It is determined by the logarithm of the thermodynamic probability and ... ... Ecological dictionary

    entropy- and, well. entropie f., German Entropie gr. en in, inside + trope turn, transformation. 1. A physical quantity that characterizes the thermal state of a body or system of bodies and possible changes in these states. Entropy calculation. ALS 1. ||… … Historical Dictionary of Gallicisms of the Russian Language

    ENTROPY- ENTROPY, a concept introduced in thermodynamics and which is, as it were, a measure of the irreversibility of a process, a measure of the transition of energy into such a form, from which it cannot spontaneously pass into other forms. All conceivable processes occurring in any system, ... ... Big Medical Encyclopedia

