Entropy. Physical meaning of entropy

Equation (44.7) or (44.12) can be interpreted differently. In the operation of reversible machines, heat $Q_1$ at temperature $T_1$ is "equivalent" to heat $Q_2$ at temperature $T_2$ if $Q_1/T_1 = Q_2/T_2$; after all, when one is absorbed, the other is released. If we now invent a special name for $Q/T$, we can say that in a reversible process as much $Q/T$ is absorbed as is released. In other words, $Q/T$ neither decreases nor increases. This quantity is called entropy, and we say that "over a reversible cycle, the change in entropy is zero." If $T = 1°$, then the entropy is $Q/1°$. We have already given entropy a special symbol: entropy is denoted throughout by the letter $S$, and numerically it is equal to the heat (which we denote $Q_S$) released in a one-degree reservoir (entropy is not just heat, it is heat divided by temperature, and it is measured in joules per degree).
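For example (with illustrative numbers, not taken from the text): a reversible engine absorbing $Q_1 = 100$ J from a reservoir at $T_1 = 400$ K carries entropy

$$S = \frac{Q_1}{T_1} = \frac{100\ \mathrm{J}}{400\ \mathrm{K}} = 0.25\ \mathrm{J/K},$$

so at $T_2 = 300$ K it must release $Q_2 = S\,T_2 = 75$ J, converting the remaining 25 J into work.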

It is interesting that, in addition to the pressure, which depends on temperature and volume, and the internal energy (a function of the same volume and temperature), we have found another quantity, the entropy of a substance, which is also a function of the state. Let us try to explain how to calculate entropy and what we mean by the words "function of state." Let us follow the behavior of the system under different conditions. We already know how to create different conditions experimentally; for example, we can force the system to expand adiabatically or isothermally. (Incidentally, a machine need not have only two reservoirs; there could be three or four different temperatures, and the machine would exchange heat with each of the reservoirs.) We can walk all over the $pV$ diagram, moving from one state to the next. In other words, we can carry a gas from some state $a$ to some other state $b$ and require that the transition from $a$ to $b$ be reversible. Now suppose that small reservoirs with different temperatures are placed along the path from $a$ to $b$. Then each short step will be accompanied by the removal of heat $dQ$ from the substance and its transfer to the reservoir at the temperature corresponding to the given point on the path. Let us connect all these reservoirs, by means of reversible heat engines, to a single reservoir of unit temperature. After we finish transferring the substance from state $a$ to state $b$, we will return all the reservoirs to their original condition. Each fraction of heat $dQ$ removed from the substance at temperature $T$ will then have been converted by a reversible machine, and at unit temperature an entropy equal to

$$dS = \frac{dQ}{T}$$

will have been delivered.

Let us calculate the total amount of entropy that has been delivered. The entropy difference, or the entropy needed to go from $a$ to $b$ by some reversible change, is the total entropy, that is, the entropy taken from the small reservoirs and released at unit temperature:

$$S_b - S_a = \int_a^b \frac{dQ}{T}.$$

The question is: does the entropy difference depend on the path taken in the $pV$ plane? There are many roads leading from $a$ to $b$. Recall that in the Carnot cycle we could move from point $a$ to point $c$ (see Fig. 44.6) in two ways: we could expand the gas first isothermally and then adiabatically, or start with adiabatic expansion and end with isothermal expansion. So we must find out whether the entropy change depends on the path from $a$ to $b$ (Fig. 44.10). It must not change, because if we complete a full cycle, leaving $a$ by one route and returning by another, the journey is equivalent to a full cycle of a reversible machine, and in such a cycle no heat is transferred to the one-degree reservoir.

Fig. 44.10. Entropy change during a reversible transition.

Since in a reversible cycle no heat can be taken from the one-degree reservoir, the same amount of entropy must be delivered on every journey from $a$ to $b$. This amount is independent of the path; only the end points matter. Thus we can speak of a certain function $S$, which we call the entropy of the substance. This function depends only on the state of the substance, i.e., only on the volume and the temperature.
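This path independence can be checked numerically. Below is a small sketch (my own illustration, assuming a monatomic ideal gas with $C_V = \tfrac{3}{2}Nk$): the entropy change $\int dQ/T$ from $(V_1, T_1)$ to $(V_2, T_2)$ is computed along two different reversible routes and comes out the same.

```python
import numpy as np

# Sketch: verify numerically that the entropy change of a monatomic ideal
# gas between two states does not depend on the reversible path taken.
Nk = 1.0          # N*k, arbitrary units
Cv = 1.5 * Nk     # monatomic ideal gas: C_V = (3/2) N k

def dS_isothermal(V1, V2, steps=200_000):
    # isothermal leg: dQ = p dV = (N k T / V) dV, so dQ/T = N k dV / V
    V = np.linspace(V1, V2, steps)
    return np.trapz(Nk / V, V)

def dS_isochoric(T1, T2, steps=200_000):
    # constant-volume leg: dQ = C_V dT, so dQ/T = C_V dT / T
    T = np.linspace(T1, T2, steps)
    return np.trapz(Cv / T, T)

V1, T1, V2, T2 = 1.0, 300.0, 3.0, 500.0

path_A = dS_isothermal(V1, V2) + dS_isochoric(T1, T2)  # expand, then heat
path_B = dS_isochoric(T1, T2) + dS_isothermal(V1, V2)  # heat, then expand

print(path_A, path_B)  # both ~ Nk ln 3 + Cv ln(5/3) ~ 1.865
```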

The function $S$ can be found as follows. We calculate the change in entropy for reversible changes of the substance by monitoring the heat delivered to the one-degree reservoir. But this change can also be expressed in terms of the heat $dQ$ removed from the substance at temperature $T$:

$$dS = \frac{dQ}{T}.$$

The total change in entropy is equal to the difference in entropy at the final and starting points of the path:

$$\Delta S = S_b - S_a = \int_a^b \frac{dQ}{T}. \qquad (44.18)$$

This expression does not completely define entropy. So far only the difference of entropy between two different states is known. It becomes possible to determine entropy absolutely only after we are able to calculate the entropy of one particular state.

For a very long time it was believed that absolute entropy was a meaningless concept. But eventually Nernst made a statement that he called the heat theorem (it is sometimes called the third law of thermodynamics). Its meaning is very simple. We will now state this theorem without explaining why it is true. Nernst's postulate simply states that the entropy of any body at absolute zero is equal to zero. Now we know $S$ at one particular $T$ and $V$ (namely, $S = 0$ at $T = 0$), and so we can calculate the entropy at any other point.

To illustrate this idea, let us calculate the entropy of an ideal gas. For an isothermal (and therefore reversible) expansion, $\int dQ/T$ is simply $Q/T$, because $T$ is constant. Thus, according to (44.4), the change in entropy is

$$(\Delta S)_T = \frac{Q}{T} = Nk\ln\frac{V_b}{V_a},$$

so $S$ is $Nk\ln V$ plus some function of the temperature alone. How does $S$ depend on $T$? We already know that in a reversible adiabatic expansion there is no heat transfer. Thus the entropy remains constant even though the volume $V$ changes, forcing $T$ to change with it (for a monatomic gas, so as to keep $TV^{2/3}$ constant). Is it clear to you after this that

$$S = Nk\left(\ln V + \tfrac{3}{2}\ln T\right) + a,$$

where $a$ is a constant that depends on neither $V$ nor $T$? [The constant $a$ is called the chemical constant. It depends on the properties of the gas and can be determined experimentally, in accordance with Nernst's theorem, by measuring the heat released by the gas as it is cooled and condensed until it turns into a solid at $0°\,$K (helium remains a liquid even at this temperature), and then finding the integral $\int dQ/T$. The constant $a$ can also be found theoretically; this requires Planck's constant and quantum mechanics, but we will not touch on that in our course.]
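As a numerical sketch of how this formula is used (my own illustration; the chemical constant $a$ cancels whenever we take differences):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def delta_S_monatomic(N, V1, T1, V2, T2):
    """Entropy change of N particles of a monatomic ideal gas,
    from S = N k (ln V + 3/2 ln T) + a; the constant a cancels."""
    return N * k * (math.log(V2 / V1) + 1.5 * math.log(T2 / T1))

# one mole doubling its volume at constant temperature:
N_A = 6.02214076e23
print(delta_S_monatomic(N_A, 1.0, 300.0, 2.0, 300.0))  # ~ 5.76 J/K (= R ln 2)
```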

Let us note some properties of entropy. First, remember that on the section of a reversible cycle between points $a$ and $b$ the entropy changes by $S_b - S_a$ (Fig. 44.11). Let us also remember that as we move along this path, the entropy (the heat delivered at unit temperature) increases in accordance with the rule $dS = dQ/T$, where $dQ$ is the heat removed from the substance at temperature $T$.

Fig. 44.11. Entropy change over a complete reversible cycle.

The total entropy change is zero.

We already know that after a reversible cycle the total entropy of everything involved in the process does not change. After all, the heat $Q_1$ absorbed at $T_1$ and the heat $Q_2$ delivered at $T_2$ make contributions to the entropy that are equal in magnitude and opposite in sign, so the net change in entropy is zero. Thus, in a reversible cycle the entropy of all the participants in the cycle, including the reservoirs, does not change. This rule may seem similar to the law of conservation of energy, but it is not: it applies only to reversible cycles. If we pass to irreversible cycles, there is no longer any law of conservation of entropy.

Let us give two examples. First, suppose that some machine with friction performs irreversible work, releasing heat $Q$ at temperature $T$. The entropy then increases by $Q/T$. The heat $Q$ is equal to the work expended, so when we do an amount of work $W$ by rubbing against some object whose temperature is $T$, the entropy increases by $W/T$.

Another example of irreversibility: if we bring two objects with different temperatures, say $T_1$ and $T_2$, into contact, a certain amount of heat will flow from one to the other by itself. Suppose, for example, that we throw a hot stone into cold water. By how much does the entropy of the stone change when it gives up heat $\Delta Q$ to the water at its temperature $T_1$? It decreases by $\Delta Q/T_1$. By how much does the entropy of the water change? It increases by $\Delta Q/T_2$. Heat, of course, can flow only from the higher temperature to the lower one, so if $T_1$ is greater than $T_2$, then $\Delta Q$ is positive. Thus the change in entropy is positive and is equal to the difference of two fractions:

$$\Delta S = \frac{\Delta Q}{T_2} - \frac{\Delta Q}{T_1}. \qquad (44.19)$$
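For example (numbers invented for illustration): if $\Delta Q = 4186$ J passes from a stone at 350 K into water at 290 K, the bookkeeping of (44.19) looks like this:

```python
dQ = 4186.0                      # heat transferred, J (illustrative value)
T_stone, T_water = 350.0, 290.0  # K

dS_stone = -dQ / T_stone   # entropy lost by the stone
dS_water = +dQ / T_water   # entropy gained by the water

print(dS_stone + dS_water)  # ~ +2.47 J/K: a net increase of entropy
```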

So the following theorem is true: in any irreversible process the entropy of the whole world increases. Only reversible processes keep entropy at the same level. And since absolutely reversible processes do not exist, entropy always increases at least a little. Reversible processes are idealizations in which the increase of entropy is minimal.

Unfortunately, we cannot go deeper into the field of thermodynamics here. Our goal is only to illustrate the main ideas of this science and to explain why it is possible to rely on such arguments. But in our course we will not often resort to thermodynamics. Thermodynamics is widely used in engineering and chemistry, so you will become practically familiar with it in a chemistry or engineering course. There is no point in duplicating that, so we limit ourselves to an overview of the nature of the theory and will not go into its special applications.

The two laws of thermodynamics are often formulated as follows:

First Law: The energy of the Universe is always constant.

Second Law: The entropy of the Universe always increases.

This is not a very good formulation of the second law. It says nothing, for example, about the fact that entropy does not change in a reversible cycle, nor does it clarify the very concept of entropy. It is just an easy-to-remember form of the two laws, but from it alone it is not easy to understand what they are actually about.

We have collected all the laws discussed so far in Table 44.1. In the next chapter we will use this set of laws to find the relationship between the heat generated by rubber when it is stretched and the additional tension the rubber exerts when it is heated.

Table 44.1 Laws of thermodynamics

First law

Heat supplied to the system + work done on the system = increase in internal energy of the system:

$$dU = dQ + dW$$

Entropy is a concept introduced in thermodynamics; it serves as a measure of the dissipation of energy. In any system there is a competition between thermal motion and the ordering action of force fields, and an increase in temperature lowers the degree of order. To quantify the degree of disorder, the quantity called entropy was introduced. It characterizes the exchange of energy flows in both closed and open systems.

In isolated systems the entropy grows together with the heat supplied. This measure of disorder reaches its maximum value in the state of thermodynamic equilibrium, the most chaotic state.

If the system is open and at the same time nonequilibrium, the entropy change can be in the direction of a decrease. In this case the change of this measure is the sum of two quantities:
- the flow of entropy arising from the exchange of heat and matter with the external environment;
- the change due to chaotic motion within the system, i.e. the entropy produced inside it.

Entropy changes occur in any medium where biological, chemical, or physical processes take place, and they occur at a certain rate. The change in entropy can be positive; in this case entropy flows into the system from the external environment. There are also cases when the entropy change carries a minus sign; this numerical value indicates an outflow of entropy. The system may then be in a stationary state: the entropy produced within the system is compensated by its outflow. An example of such a situation is the state of a living organism, which is nonequilibrium but stationary. Any organism pumps entropy of negative value from its environment; the amount of disorder it expels may even exceed the amount it receives.

Entropy production occurs in any complex system, and in the course of their evolution such systems exchange information. For example, when a liquid evaporates, information about the spatial arrangement of its molecules is lost, and the entropy increases. If the liquid freezes, the uncertainty in the molecular positions is reduced, and the entropy decreases. Cooling a liquid lowers its internal energy. However, when the temperature reaches a certain value, the temperature of the substance remains unchanged despite the continued removal of heat: this means that crystallization begins. The entropy change in an isothermal process of this kind is accompanied by a decrease in the measure of the system's chaos.
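As a rough numerical sketch (using the standard handbook value of the heat of fusion of ice, $\Delta H_{\text{fus}} \approx 6010$ J/mol): when one mole of water freezes at $T_m = 273$ K, its entropy decreases by

$$\Delta S = -\frac{\Delta H_{\text{fus}}}{T_m} \approx -\frac{6010\ \mathrm{J/mol}}{273\ \mathrm{K}} \approx -22\ \mathrm{J/(mol\cdot K)},$$

while the surroundings, receiving the same heat at a temperature no higher than $T_m$, gain at least as much entropy.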

A practical method of determining the heat of fusion of a substance is to construct its solidification diagram. In other words, based on the data obtained in the experiment, one draws a curve showing the dependence of the temperature of the substance on time, while the external conditions remain unchanged. The change in entropy can then be determined by processing this graphical representation of the experimental results. On such curves there is always a section where the line runs horizontally; the temperature corresponding to this segment is the solidification temperature.

A change in a substance accompanied by a transition from the solid to the liquid state (or vice versa) at a definite transition temperature is called a phase transition of the first kind. In such a transition the density of the system changes, and so does its entropy.

The second law of thermodynamics has several formulations. Clausius' formulation: a process whose sole result is the transfer of heat from a body with a lower temperature to a body with a higher temperature is impossible.

Thomson's formulation: a cyclic process is impossible whose sole result would be the performance of work at the expense of heat taken from one particular body.

This formulation imposes a limitation on the conversion of internal energy into mechanical energy: it is impossible to build a machine (a perpetual motion machine of the second kind) that would do work only by drawing heat from the environment. Boltzmann's formulation: entropy is an indicator of the disorder of a system; the higher the entropy, the more chaotic the motion of the material particles that make up the system. Let us see how this works using water as an example. In the liquid state water is a rather disordered structure, since the molecules move freely relative to one another and their spatial orientation can be arbitrary. Ice is another matter: in it the water molecules are ordered, being included in the crystal lattice. Boltzmann's formulation of the second law states, relatively speaking, that ice which has melted and turned into water (a process accompanied by a decrease in the degree of order and an increase in entropy) will never be reborn from water by itself. Entropy cannot decrease in closed systems, that is, in systems that receive no external supply of energy.

The third law of thermodynamics (Nernst's theorem) is a physical principle that determines the behavior of entropy as the temperature approaches absolute zero. It is one of the postulates of thermodynamics, accepted on the basis of a generalization of a considerable amount of experimental data.

The third law of thermodynamics can be formulated as follows:

“The increase in entropy as the temperature approaches absolute zero tends to a finite limit that does not depend on which equilibrium state the system is in”:

$$\lim_{T \to 0} \Delta S = 0,$$

where $\Delta S$ is the entropy change accompanying a change of $x$, and $x$ is any thermodynamic parameter.

The third law of thermodynamics applies only to equilibrium states.

Since, on the basis of the second law of thermodynamics, entropy can be determined only up to an arbitrary additive constant (that is, it is not the entropy itself that is determined, but only its change)

$$S = \int \frac{\delta Q}{T} + \text{const},$$

the third law of thermodynamics can be used for an exact determination of entropy. In this case the entropy of an equilibrium system at absolute zero temperature is taken equal to zero.
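Taken together with measured heat capacities, this fixes the absolute entropy. A sketch of the standard corollary (assuming $C_p(T)$ is known down to low temperatures and $S(0) = 0$):

$$S(T) = \int_0^T \frac{C_p(T')}{T'}\,dT' + \sum_i \frac{\Delta H_i}{T_i},$$

where the sum runs over the phase transitions occurring below $T$, each contributing its latent heat $\Delta H_i$ at its transition temperature $T_i$.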

Entropy of ideal gases

To obtain a working expression for the change in entropy of an ideal gas, we use the first law of thermodynamics with the heat expressed through the change in enthalpy, $\delta q = dh - v\,dp = c_p\,dT - v\,dp$. Dividing by $T$ and using $pv = RT$ gives

$$ds = \frac{\delta q}{T} = c_p\frac{dT}{T} - R\frac{dp}{p}. \qquad (4.59)$$

The difference between the entropies of an ideal gas in two specific states can be obtained by integrating expression (4.59):

$$s_2 - s_1 = c_p\ln\frac{T_2}{T_1} - R\ln\frac{p_2}{p_1}. \qquad (4.60)$$

To determine the absolute value of the entropy of an ideal gas, it is necessary to fix the origin of its reckoning by some pair of thermal state parameters. For example, taking $s_0 = 0$ at $T_0$ and $p_0$ and using equation (4.60), we obtain

$$s = c_p\ln\frac{T}{T_0} - R\ln\frac{p}{p_0}. \qquad (4.62)$$

Expression (4.62) shows that the entropy of an ideal gas is a state parameter, since it can be determined through any pair of state parameters. In turn, since entropy is itself a state parameter, it can be used together with any one independent state parameter to determine any other state parameter of the gas.
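A small numerical sketch (my own illustration, treating air as an ideal gas with assumed $c_p \approx 1005$ J/(kg·K) and $R \approx 287$ J/(kg·K)):

```python
import math

# Sketch: specific entropy of air relative to a reference state, per (4.62).
c_p = 1005.0              # J/(kg*K), specific heat at constant pressure
R   = 287.0               # J/(kg*K), specific gas constant of air
T0, p0 = 273.15, 101325.0 # reference state where s0 = 0

def s(T, p):
    return c_p * math.log(T / T0) - R * math.log(p / p0)

print(s(350.0, 2e5))  # entropy of air at 350 K and 2 bar, ~ 54 J/(kg*K)
```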

Entropy

A change in the enthalpy of a system cannot serve as the sole criterion for the spontaneous occurrence of a chemical reaction, since many endothermic processes proceed spontaneously. This is illustrated by the dissolution of some salts (for example, NH₄NO₃) in water, which is accompanied by a noticeable cooling of the solution. It is necessary to take into account one more factor, which determines the ability to pass spontaneously from a more ordered to a less ordered (more chaotic) state.

Entropy ($S$) is a thermodynamic function of state that serves as a measure of the disorder of a system. The possibility of endothermic processes is due to the change in entropy, because in isolated systems the entropy of a spontaneously occurring process increases, $\Delta S > 0$ (the second law of thermodynamics).

L. Boltzmann defined entropy through the thermodynamic probability of the state (the disorder) of a system, $W$. Since the number of particles in a system is large (Avogadro's number $N_A = 6.02\cdot10^{23}$), the entropy is proportional to the natural logarithm of the thermodynamic probability of the state of the system:

$$S = k\ln W.$$

The entropy of 1 mole of a substance has the same dimension as the gas constant $R$: J·mol⁻¹·K⁻¹. The change in entropy*) in irreversible and reversible processes is conveyed by the relations $\Delta S > Q/T$ and $\Delta S = Q/T$, respectively. For example, the entropy change on melting is equal to the heat (enthalpy) of melting divided by the melting temperature: $\Delta S_{\text{m}} = \Delta H_{\text{m}}/T_{\text{m}}$. For a chemical reaction the change in entropy is computed in the same way as the change in enthalpy:

$$\Delta S^{\circ} = \sum S^{\circ}_{\text{products}} - \sum S^{\circ}_{\text{reactants}}.$$

*) The term entropy was introduced by Clausius (1865) through the ratio Q/T (the reduced heat).

Here Δ S° corresponds to the entropy of the standard state. The standard entropies of simple substances are not equal to zero. Unlike other thermodynamic functions, the entropy of an ideally crystalline body at absolute zero is zero (Planck’s postulate), since W = 1.
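For instance, here is a sketch of such a calculation for the reaction N₂(g) + 3H₂(g) → 2NH₃(g), using approximate literature values of the standard entropies (the numbers are not from this text):

```python
# Sketch: entropy change of N2(g) + 3 H2(g) -> 2 NH3(g) from standard
# entropies (approximate literature values, J mol^-1 K^-1).
S0 = {"N2": 191.5, "H2": 130.6, "NH3": 192.5}

dS = 2 * S0["NH3"] - (S0["N2"] + 3 * S0["H2"])
print(dS)  # ~ -198 J/K: the number of gas moles drops, so entropy falls
```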

The entropy of a substance or system of bodies at a given temperature is thus an absolute quantity. Table 4.1 lists the standard entropies S° of some substances (the numerical values below are the usual reference values; the originals were lost in extraction).

Table 4.1. Standard entropies of some substances.

Compound            S° (J·mol⁻¹·K⁻¹)
C(s), diamond       ≈ 2.4
C(s), graphite      ≈ 5.7
iso-C₄H₁₀(g)        ≈ 294.6

From Table 4.1 it follows that entropy depends on:

  • The aggregate state of the substance. Entropy increases on passing from the solid to the liquid and especially to the gaseous state (water, ice, steam).
  • The isotopic composition (H₂O and D₂O).
  • The molecular weight of similar compounds (CH₄, C₂H₆, n-C₄H₁₀).
  • The structure of the molecule (n-C₄H₁₀ vs. iso-C₄H₁₀).
  • The crystal structure (allotropy): diamond, graphite.

Finally, Fig. 4.3 illustrates the dependence of entropy on temperature: the higher the temperature, the greater the system's tendency toward disorder. The product of the temperature and the entropy change of the system, $T\Delta S$, quantifies this tendency and is called the entropy factor.

9.9. Entropy. Physical meaning of entropy. Entropy and probability

Considering the efficiency of a heat engine operating on the Carnot cycle, one can note that the ratio of the refrigerator temperature to the heater temperature is equal to the ratio of the amount of heat given by the working fluid to the refrigerator to the amount of heat received from the heater. This means that for an ideal heat engine operating on the Carnot cycle the following relation holds:

$$\frac{Q_2}{Q_1} = \frac{T_2}{T_1}, \qquad \text{i.e.} \qquad \frac{Q_1}{T_1} = \frac{Q_2}{T_2}.$$

The ratio $Q/T$ Lorentz called the reduced heat. For an elementary process the reduced heat is equal to $\delta Q/T$. This means that when the Carnot cycle is realized (and it is a reversible cyclic process), the reduced heat remains unchanged and behaves as a function of state, whereas, as is well known, the amount of heat is a function of the process.
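A tiny numerical check of this relation (with made-up numbers):

```python
# Sketch: in a Carnot cycle the reduced heats at the two reservoirs match.
T1, T2 = 500.0, 300.0   # heater and refrigerator temperatures, K
Q1 = 1000.0             # heat taken from the heater, J

eta = 1 - T2 / T1       # Carnot efficiency
Q2 = Q1 * (1 - eta)     # heat given to the refrigerator

print(Q1 / T1, Q2 / T2)  # both 2.0 J/K: Q1/T1 == Q2/T2
```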

Using the first law of thermodynamics for reversible processes, $\delta Q = dU + p\,dV$, and dividing both sides of this equality by the temperature, we get:

$$\frac{\delta Q}{T} = \frac{dU}{T} + \frac{p}{T}\,dV \qquad (9\text{-}41)$$

Let us express $p/T$ from the Mendeleev–Clapeyron equation $pV = \dfrac{m}{\mu}RT$, substitute it into equation (9-41), and get:

$$\frac{\delta Q}{T} = \frac{dU}{T} + \frac{m}{\mu}R\frac{dV}{V} \qquad (9\text{-}42)$$

Let us take into account that $dU = \dfrac{m}{\mu}C_V\,dT$, and hence $\dfrac{dU}{T} = \dfrac{m}{\mu}C_V\dfrac{dT}{T}$; substituting into equation (9-42), we get:

$$\frac{\delta Q}{T} = \frac{m}{\mu}C_V\frac{dT}{T} + \frac{m}{\mu}R\frac{dV}{V} \qquad (9\text{-}43)$$

The right-hand side of this equality is a total differential; consequently, in reversible processes the reduced heat is also a total differential, which is the hallmark of a function of state.

The function of state whose differential is $\delta Q/T$ is called entropy and is denoted $S$. Thus entropy is a function of state. After the introduction of entropy, formula (9-43) takes the form

$$dS = \frac{\delta Q}{T}, \qquad (9\text{-}44)$$

where $dS$ is the entropy increment. Equality (9-44) is valid only for reversible processes and is convenient for calculating the change of entropy in finite processes:

$$\Delta S = S_2 - S_1 = \int_1^2 \frac{\delta Q}{T} \qquad (9\text{-}45)$$

If a system undergoes a circular process (a cycle) in a reversible way, then

$$\oint \frac{\delta Q}{T} = 0,$$

and therefore $\Delta S = 0$, i.e. $S = \text{const}$.
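A tiny numerical sketch of this statement (my own illustration for 1 mol of a monatomic ideal gas; only standard ideal-gas relations are assumed): going around a reversible rectangular cycle in the $(V, T)$ plane, the reduced heats of the four legs cancel.

```python
import math

# Sketch: the reduced heat summed around a reversible rectangular cycle
# in the (V, T) plane vanishes for 1 mol of a monatomic ideal gas.
R, Cv = 8.314, 1.5 * 8.314          # J/(mol*K)
T1, T2, V1, V2 = 300.0, 600.0, 1.0, 2.0

loop = (Cv * math.log(T2 / T1)      # isochoric heating at V1
        + R * math.log(V2 / V1)     # isothermal expansion at T2
        + Cv * math.log(T1 / T2)    # isochoric cooling at V2
        + R * math.log(V1 / V2))    # isothermal compression at T1

print(loop)  # 0.0 (up to rounding): the total dQ/T around the cycle vanishes
```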

Expressing the amount of heat through the entropy increment for an elementary process, $\delta Q = T\,dS$, and substituting it into the equation of the first law of thermodynamics, we obtain a new form of that equation, usually called the basic thermodynamic identity:

$$T\,dS = dU + p\,dV \qquad (9\text{-}46)$$

Thus, to calculate the change of entropy in reversible processes it is convenient to use the reduced heat.

In the case of irreversible nonequilibrium processes $dS > \delta Q/T$, and for irreversible circular processes the Clausius inequality holds:

$$\oint \frac{\delta Q}{T} < 0 \qquad (9\text{-}47)$$

Let us consider what happens to entropy in an isolated thermodynamic system.

In an isolated thermodynamic system, with any reversible change of state, its entropy does not change. Mathematically, this can be written as $\Delta S = 0$, i.e. $S = \text{const}$.

Let us consider what happens to the entropy of a thermodynamic system during an irreversible process. Suppose that the transition from state 1 to state 2 along path $L_1$ is reversible, while that from state 2 back to state 1 along path $L_2$ is irreversible (Fig. 9.13).

Then the Clausius inequality (9-47) is valid. Let us write out the circulation integral in it for our example:

$$\int_{1\,(L_1)}^{2} \frac{\delta Q}{T} + \int_{2\,(L_2)}^{1} \frac{\delta Q}{T} < 0.$$

The first term in this formula can be replaced by the entropy change $S_2 - S_1$, since that part of the process is reversible. Then the Clausius inequality can be written as:

$$S_2 - S_1 + \int_{2\,(L_2)}^{1} \frac{\delta Q}{T} < 0.$$

From here

$$\int_{2\,(L_2)}^{1} \frac{\delta Q}{T} < S_1 - S_2.$$

Because $S_1 - S_2$ is precisely the entropy change in the (irreversible) transition from state 2 to state 1, we can finally write, for any irreversible process:

$$\Delta S > \int \frac{\delta Q}{T}, \qquad (9\text{-}48)$$

where the integral of the reduced heat is taken along the actual irreversible path.

If the system is isolated, then $\delta Q = 0$, and inequality (9-48) takes the form

$$\Delta S > 0, \qquad (9\text{-}49)$$

that is, the entropy of an isolated system increases during an irreversible process. The growth of entropy does not continue indefinitely, but only up to a certain maximum value characteristic of the given system. This maximum value of entropy corresponds to the state of thermodynamic equilibrium. An increase of entropy during irreversible processes in an isolated system means that the energy possessed by the system becomes less available for conversion into mechanical work. In the state of equilibrium, when entropy has reached its maximum value, the energy of the system cannot be converted into mechanical work at all.

If the system is not isolated, then its entropy can either decrease or increase, depending on the direction of heat transfer.

Entropy as a function of the state of the system can serve as a state parameter on the same footing as temperature, pressure, or volume. By depicting a particular process on a $(T, S)$ diagram, one can give a graphical interpretation of the amount of heat as the area of the figure under the curve depicting the process. Figure 9.14 shows such a diagram for an isothermal process in temperature–entropy coordinates.

Entropy can be expressed through the parameters of the state of the gas: temperature, pressure, volume. To do this, we express the entropy increment from the basic thermodynamic identity (9-46):

$$dS = \frac{dU}{T} + \frac{p}{T}\,dV = \frac{m}{\mu}C_V\frac{dT}{T} + \frac{m}{\mu}R\frac{dV}{V}.$$

Integrating this expression, we get:

$$\Delta S = S_2 - S_1 = \frac{m}{\mu}\left(C_V\ln\frac{T_2}{T_1} + R\ln\frac{V_2}{V_1}\right) \qquad (9\text{-}50)$$

The change in entropy can also be expressed through another pair of state parameters, pressure and volume. To do this, one must express the temperatures of the initial and final states from the ideal-gas equation of state through pressure and volume and substitute them into (9-50); using $C_p = C_V + R$, this gives

$$\Delta S = \frac{m}{\mu}\left(C_V\ln\frac{p_2}{p_1} + C_p\ln\frac{V_2}{V_1}\right) \qquad (9\text{-}51)$$

In the isothermal expansion of a gas into a vacuum $T_1 = T_2$, so the first term in formula (9-50) vanishes and the change of entropy is determined by the second term alone:

$$\Delta S = \frac{m}{\mu}R\ln\frac{V_2}{V_1} \qquad (9\text{-}52)$$
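A quick numeric check of (9-52) (illustrative numbers: one mole doubling its volume):

```python
import math

R = 8.314                 # J/(mol*K)
m_over_mu = 1.0           # one mole

dS = m_over_mu * R * math.log(2.0)   # (9-52) with V2/V1 = 2
print(dS)                            # ~ 5.76 J/K
```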

Although in many cases it is convenient to use the reduced heat to calculate the change of entropy, it must be clear that reduced heat and entropy are different, not identical, concepts.

Let us find out the physical meaning of entropy. To do this, we use formula (9-52) for an isothermal process, in which the internal energy does not change and all possible changes of the characteristics are due only to the change of volume. Let us consider the relationship between the volume occupied by a gas in an equilibrium state and the number of spatial microstates of the gas particles. The number of microstates of the gas particles by means of which a given macrostate of the gas as a thermodynamic system is realized can be counted as follows. Let us divide the whole volume into elementary cubic cells with side $d \sim 10^{-10}$ m (of the order of the effective diameter of a molecule). The volume of such a cell is $d^3$. In the first state the gas occupies volume $V_1$; hence the number of elementary cells, that is, the number of places $N_1$ that the molecules can occupy in this state, is $N_1 = V_1/d^3$. Similarly, for the second state with volume $V_2$ we obtain $N_2 = V_2/d^3$. It should be noted that a change in the positions of the molecules corresponds to a new microstate, but not every change of microstate leads to a change of macrostate. Suppose the molecules can occupy $N_1$ places; then swapping any molecules among these $N_1$ cells does not produce a new macrostate. However, the transition of molecules into other cells does lead to a change of the macrostate of the system. The number of microstates of the gas corresponding to a given macrostate can be calculated by determining the number of ways of placing the particles of the gas in the elementary cells. To simplify the calculation, consider 1 mole of an ideal gas, for which formula (9-52) takes the form

$$\Delta S = R\ln\frac{V_2}{V_1} \qquad (9\text{-}53)$$

The number of microstates of the system occupying volume $V_1$ we denote by $\Gamma_1$; it is determined by counting the number of placements of the $N_A$ (Avogadro's number) molecules contained in 1 mole of gas over the $N_1$ cells (places):

$$\Gamma_1 = \frac{N_1!}{(N_1 - N_A)!}.$$

Similarly, the number of microstates $\Gamma_2$ of the system occupying volume $V_2$ is

$$\Gamma_2 = \frac{N_2!}{(N_2 - N_A)!}.$$

The number of microstates $\Gamma_i$ by means of which the $i$-th macrostate can be realized is called the thermodynamic probability of this macrostate. The thermodynamic probability satisfies $\Gamma \ge 1$.

Let us find the ratio $\Gamma_2/\Gamma_1$:

$$\frac{\Gamma_2}{\Gamma_1} = \frac{N_2!/(N_2 - N_A)!}{N_1!/(N_1 - N_A)!}.$$

For ideal gases the number of free places is much greater than the number of molecules, that is, $N_1 \gg N_A$ and $N_2 \gg N_A$. Then, taking into account the expression of the numbers $N_1$ and $N_2$ through the corresponding volumes, we obtain:

$$\frac{\Gamma_2}{\Gamma_1} \approx \left(\frac{N_2}{N_1}\right)^{N_A} = \left(\frac{V_2}{V_1}\right)^{N_A}.$$

From here we can express the ratio of the volumes through the ratio of the thermodynamic probabilities of the corresponding states:

$$\frac{V_2}{V_1} = \left(\frac{\Gamma_2}{\Gamma_1}\right)^{1/N_A} \qquad (9\text{-}54)$$

Substituting (9-54) into (9-53), we get

$$\Delta S = R\ln\left(\frac{\Gamma_2}{\Gamma_1}\right)^{1/N_A} = \frac{R}{N_A}\ln\frac{\Gamma_2}{\Gamma_1}.$$

Taking into account that the ratio of the molar gas constant to Avogadro's number is Boltzmann's constant $k$, and that the logarithm of a ratio of two quantities equals the difference of their logarithms, we obtain

$$\Delta S = k\ln\Gamma_2 - k\ln\Gamma_1.$$

From this we can conclude that the entropy of a state is determined by the logarithm of the number of microstates through which the given macrostate is realized:

$$S_i = k\ln\Gamma_i \qquad (9\text{-}55)$$

Formula (9-55) is called Boltzmann's formula, after the scientist who first obtained it and understood the statistical meaning of entropy as a function of disorder. Boltzmann's formula has a more general meaning than formula (9-53): it can be used not only for ideal gases, and it reveals the physical meaning of entropy. The more ordered the system, the smaller the number of microstates by which the given macrostate is realized, and the lower the entropy of the system. The increase of entropy in an isolated system where irreversible processes occur means the motion of the system toward the most probable state, which is the state of equilibrium. One can say that entropy is a measure of the disorder of a system: the more disorder, the higher the entropy. This is the physical meaning of entropy.
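As a closing sketch (a toy illustration of my own, not from the text): for a small number of distinguishable "molecules" in a grid of cells, counting the placements directly and applying $S = k\ln\Gamma$ reproduces, in miniature, the $R\ln(V_2/V_1)$ growth of (9-53).

```python
import math

k = 1.380649e-23   # J/K

def ln_placements(n_cells, n_particles):
    # ln Γ for Γ = n_cells! / (n_cells - n_particles)!
    return math.lgamma(n_cells + 1) - math.lgamma(n_cells - n_particles + 1)

n = 10                      # a toy "mole" of 10 molecules
N1, N2 = 10**6, 2 * 10**6   # cell counts before/after doubling the volume

dS = k * (ln_placements(N2, n) - ln_placements(N1, n))
print(dS, n * k * math.log(2))  # ~ n k ln 2, the miniature analog of (9-53)
```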
