Second Law of Thermodynamics and Entropy

From Physics Book

Main Idea

Entropy quantitatively represents the unavailability of a system's Thermal Energy for conversion into mechanical work. The entropy of a system can also be described as the system's level of disorder or randomness: the more entropy a system has, the more disordered it is. The Second Law of Thermodynamics states that the entropy of the Universe always increases over time; equivalently, the change in the entropy of the Universe is never negative. Through the discovery that Heat cannot spontaneously flow from a colder body to a hotter body, physicists also discovered that it is impossible to complete any physical process without the loss of some usable energy. As time goes on, randomness can only increase.

Mathematical Model

In Boltzmann's description, entropy measures the number of different possible micro-states consistent with a macro-state (specified Temperature, Pressure, Volume, etc.). This leads to the popular idea that greater entropy means greater disorder in a system. In its general (Gibbs) form, the entropy is:

[math]\displaystyle{ S = -k_{B}\sum_{i} p_{i}\ \text{log}(p_{i}) }[/math], where
[math]\displaystyle{ S = }[/math] the entropy of the macro-state
[math]\displaystyle{ k_{B} = 1.38065 \times 10^{−23} \ \frac{J}{K} }[/math] (Boltzmann's constant)
[math]\displaystyle{ p_{i} = }[/math] the probability that the system is in the [math]\displaystyle{ i\text{-th} }[/math] micro-state
[math]\displaystyle{ \sum_{i} = }[/math] the summation over all possible micro-states

It is usually assumed, however, that each micro-state is equally probable. Let:

[math]\displaystyle{ \Omega = }[/math] the number of different, possible micro-states

Under this assumption:

[math]\displaystyle{ S = -k_{B}\sum_{i} p_{i}\ \text{ln}(p_{i}) = -k_{B}\sum_{i}^{\Omega} \frac{1}{\Omega} \ \text{ln}\left(\frac{1}{\Omega}\right) = k_{B} \ \text{ln}(\Omega) }[/math]
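As a quick numerical sketch (not part of the original derivation; the number of states below is illustrative), the general sum reduces to k_B ln(Ω) when all micro-states are equally probable:

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def gibbs_entropy(probs):
    """S = -k_B * sum(p_i * ln(p_i)), skipping zero-probability states."""
    return -K_B * sum(p * math.log(p) for p in probs if p > 0)

omega = 1000                       # number of equally probable micro-states
uniform = [1 / omega] * omega      # p_i = 1/omega for every state
s_gibbs = gibbs_entropy(uniform)
s_boltzmann = K_B * math.log(omega)  # S = k_B ln(Omega)
print(math.isclose(s_gibbs, s_boltzmann))  # the two forms agree
```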

The Second Law of Thermodynamics also states that the entropy of an isolated system will stay the same (if a reversible process occurs) or increase (if an irreversible process occurs) as the system tends towards thermodynamic equilibrium.

In an idealized closed system, where an infinitesimal change in entropy occurs due to an infinitesimal transfer of Heat, the following is true:
[math]\displaystyle{ dS = \frac{\delta Q}{T} }[/math], where
[math]\displaystyle{ dS = }[/math] an infinitesimal change in entropy of the system
[math]\displaystyle{ \delta Q = }[/math] an infinitesimal transfer of Heat to or from the system
[math]\displaystyle{ T = }[/math] the Temperature of the system in equilibrium and the surroundings which supply or receive the Heat
In a more realistic setting, a closed system could go through an irreversible process. If so, the following is true:
[math]\displaystyle{ dS \gt \frac{\delta Q}{T_{surr}} }[/math], where
[math]\displaystyle{ T_{surr} = }[/math] the Temperature of the surroundings
In general then, for any system in the Universe, including the Universe itself, the following holds true:
[math]\displaystyle{ dS \ge \frac{\delta Q}{T} }[/math]

Clausius' Theorem states "Heat can never pass from a colder to a warmer body without some other change, connected therewith, occurring at the same time." In his theorem he describes entropy mathematically as:

[math]\displaystyle{ \Delta S \ge \int \frac{\delta Q}{T} }[/math]

Note the equality only holds if the process is reversible.

Keep in mind, a system's entropy can decrease. However, for that to happen, the surroundings' increase in entropy must be greater than or equal to the system's loss of entropy, so that the net change in entropy of the system and the surroundings together is greater than or equal to 0. Here is an example, involving an ice cube tray in a freezer, in the Universe:

System 1: Ice Cube Tray

In this system, we find that energy is leaving the system. In fact, it appears that energy is flowing from cold to hot, meaning that the freezer (the surroundings) must be doing work on the system to extract that Heat. Looking at the equation from The Second Law of Thermodynamics, we can deduce that [math]\displaystyle{ \Delta Q }[/math] is negative, since energy is leaving the system. This signifies that the change in entropy is negative, as [math]\displaystyle{ T }[/math] is always positive (in Kelvin). With The Second Law of Thermodynamics not satisfied, we find that this open system must be expanded in order to satisfy these fundamental principles.

System 2: Ice Cube Tray and Freezer

When the freezer is included in the system, the net transfer of thermal energy becomes 0: all the thermal energy flows from the ice cube tray into the freezer, so the ice cube tray loses thermal energy while the freezer gains it. For this case, the freezer is assumed to be perfectly insulated, so no energy or matter can pass into the freezer from the surroundings. Thus, since there is no loss of energy to the surroundings ([math]\displaystyle{ \Delta Q = 0 }[/math]), we find that the entropy stays the same. However, the system can once again be expanded to satisfy The Second Law of Thermodynamics, since in reality energy would flow from the surroundings into the freezer.

System 3: The Universe

Consider the Universe as the system. Energy flows from the rest of the Universe into the freezer (despite the insulation, the Temperature of the freezer will inevitably increase). Therefore, [math]\displaystyle{ \Delta Q }[/math] is positive, meaning that the entropy must increase. Consequently, we find that the energy principle is still satisfied, as the energy is transformed, not created nor destroyed.

In general:

[math]\displaystyle{ \Delta S_{system} \lt 0 \rightarrow \Delta S_{surroundings} \gt 0 \quad \text{and} \quad \Delta S_{system} + \Delta S_{surroundings} \ge 0 }[/math]
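This bookkeeping can be illustrated numerically (the heat and temperatures below are invented for the sketch): when heat leaves a hot system and enters colder surroundings, the system's entropy drops, but the surroundings gain more entropy than the system lost.

```python
# Heat Q leaves a "system" at T_hot and enters "surroundings" at T_cold.
Q = 500.0        # J of heat transferred (illustrative)
T_hot = 350.0    # K, temperature of the system losing heat
T_cold = 280.0   # K, temperature of the surroundings gaining heat

dS_system = -Q / T_hot          # system loses entropy
dS_surroundings = Q / T_cold    # surroundings gain entropy
dS_total = dS_system + dS_surroundings
print(dS_system < 0, dS_surroundings > 0, dS_total >= 0)  # True True True
```

Because T_cold < T_hot, the gain Q/T_cold always exceeds the loss Q/T_hot, so the total entropy change is positive, exactly as the inequality above requires.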

Computational Model

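A simple computational model (a sketch, under the assumption that a macro-state is "how many of N particles sit in the left half of a box") counts micro-states directly with the binomial coefficient and applies S = k_B ln(Ω):

```python
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def entropy(n_left, n_total):
    """Boltzmann entropy of the macro-state 'n_left particles in the
    left half of the box', using Omega = C(n_total, n_left)."""
    omega = math.comb(n_total, n_left)
    return K_B * math.log(omega)

N = 100  # particles
# Entropy of each macro-state: all-left is perfectly ordered (Omega = 1),
# while a 50/50 split has the most micro-states and hence the most entropy.
entropies = [entropy(n, N) for n in range(N + 1)]
print(entropies[0] == 0.0)                             # Omega = 1 gives S = 0
print(max(range(N + 1), key=lambda n: entropies[n]))   # most probable: n = 50
```

The model shows why gases spread out: the evenly mixed macro-state simply contains vastly more micro-states than any ordered one.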

Examples

Simple

In a set of experiments involving different setups of gases in sealed containers (closed systems), an inexperienced fellow collects the following data (assume the change in entropy is equal to [math]\displaystyle{ \frac{Q}{T} }[/math], and no work is done on any of the gases):

Gas Container Data
Gas Container [math]\displaystyle{ \frac{Q}{T} \ \left(\frac{J}{K}\right) }[/math] [math]\displaystyle{ \Delta S_{gas} }[/math] Reversible or Irreversible
[math]\displaystyle{ 1 }[/math] [math]\displaystyle{ 27.2 }[/math] [math]\displaystyle{ + }[/math] Irreversible
[math]\displaystyle{ 2 }[/math] [math]\displaystyle{ 0 }[/math] [math]\displaystyle{ + }[/math] Reversible
[math]\displaystyle{ 3 }[/math] [math]\displaystyle{ -21 }[/math] [math]\displaystyle{ 0 }[/math] Reversible
[math]\displaystyle{ 4 }[/math] [math]\displaystyle{ 0 }[/math] [math]\displaystyle{ - }[/math] Irreversible


a) Examine the data. For each gas container, decide whether the data makes sense. If not, explain why.
We will make use of:
[math]\displaystyle{ \Delta S = \int \frac{\delta Q}{T} }[/math]
In this case, it is assumed:
[math]\displaystyle{ \Delta S = \frac{Q}{T} }[/math]
From this, we see that if the change in entropy is positive, then the value of the fraction [math]\displaystyle{ \frac{Q}{T} }[/math] in the table must be positive, and vice versa. We will use this fact later.
Gas Container 1:
In this container, we see all the numerical values are consistent, since both the change in entropy and the fraction are positive. This corresponds to Heat being added to the gas, increasing its entropy. Also, since the change in entropy is positive, the process would indeed be irreversible.
Gas Container 2:
In this container, we see that the data is inconsistent: the change in entropy cannot be positive while the value of the fraction is 0. The system cannot become more disordered with no exposure to Heat, since no Work is done on the gas.
Gas Container 3:
In this container, the data is inconsistent, due to the fact that the fraction is negative, while the change in entropy is 0. For this to occur, the gas would have to give off Heat but stay just as ordered as it previously was.
Gas Container 4:
In this container, the data is inconsistent, due to the fact that the fraction is 0, while the change in entropy is negative. For this to occur, the surroundings would have to receive Heat from the gas and/or do Work on the gas.
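The sign check used in all four cases can be captured in a short helper (a sketch; `consistent` and the row list are names invented here, not part of the original):

```python
def consistent(q_over_t, ds_sign):
    """Under the assumption dS = Q/T, the sign of the reported entropy
    change must match the sign of the measured Q/T."""
    if q_over_t > 0:
        return ds_sign == "+"
    if q_over_t < 0:
        return ds_sign == "-"
    return ds_sign == "0"

# (Q/T, reported dS sign) for gas containers 1 through 4
rows = [(27.2, "+"), (0, "+"), (-21, "0"), (0, "-")]
print([consistent(q, s) for q, s in rows])  # only container 1 checks out
```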

Middling

A small glass jar contains [math]\displaystyle{ 18.06 \times 10^{23} }[/math] molecules of a gas, or 3 moles of the gas. Each micro-state is equally probable, and the molar entropy of the gas is [math]\displaystyle{ 112 \ \frac{J}{mol \cdot K} }[/math] initially.

a) How many micro-states are there?
Since each micro-state is equally probable, we can use Boltzmann's simplified equation for entropy to solve for the number of micro-states:
[math]\displaystyle{ nS_{mole}A = k_{B} \ \text{ln}(\Omega) }[/math], where [math]\displaystyle{ n = 3 }[/math], [math]\displaystyle{ A = \frac{1 \ mol}{6.02 \times 10^{23} \ \text{molecules}} }[/math], and [math]\displaystyle{ S_{mole} = 112 \ \frac{J}{mole \cdot K} }[/math]
[math]\displaystyle{ \frac{nS_{mole}A}{k_{B}} = \text{ln}(\Omega) }[/math]
[math]\displaystyle{ \Omega = e^{\frac{nS_{mole}A}{k_{B}}} = e^{P} }[/math]
[math]\displaystyle{ P = \frac{nS_{mole}A}{k_{B}} = \frac{3 \times 112 \times \frac{1}{6.02 \times 10^{23}}}{1.38 \times 10^{-23}} = \frac{3 \times 112}{(1.38 \times 10^{-23}) \times (6.02 \times 10^{23})} = \frac{3 \times 112}{1.38 \times 6.02} = 40.4449 }[/math]
[math]\displaystyle{ \Omega = e^P = e^{40.4449} = 3.673 \times 10^{17} }[/math] micro-states
b) Now, the number of micro-states is reduced to [math]\displaystyle{ \Omega = 1.54 \times 10^{11} }[/math]. What is the change in entropy of the system?
The initial entropy can be calculated as:
[math]\displaystyle{ S_{i} = k_{B} \ \text{ln}(\Omega_{i}) = (1.38 \times 10^{-23})\ \text{ln}(3.673 \times 10^{17}) = 5.581 \times 10^{-22} \ \frac{J}{K} }[/math]
The final entropy can be calculated as:
[math]\displaystyle{ S_{f} = k_{B} \ \text{ln}(\Omega_{f}) = (1.38 \times 10^{-23}) \ \text{ln}(1.54 \times 10^{11}) = 3.555 \times 10^{-22} \ \frac{J}{K} }[/math]
Therefore, the change in entropy is:
[math]\displaystyle{ \Delta S = S_f - S_i = 3.555 \times 10^{-22} - 5.581 \times 10^{-22} = -2.026 \times 10^{-22} \ \frac{J}{K} }[/math]
c) Would the change in entropy calculated in the previous part normally be possible in a natural system?
No, a negative change in entropy corresponds to an increase in order of the system. This would not normally happen in a natural system.
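The arithmetic in parts a) and b) can be double-checked numerically, using the same rounded constants as the worked solution:

```python
import math

K_B = 1.38e-23       # J/K, rounded as in the worked example
n = 3                # moles of gas
S_mole = 112.0       # J/(mol*K), initial molar entropy
N_A = 6.02e23        # molecules per mole

# Part a: the exponent P = n*S_mole / (k_B * N_A), then Omega_i = e^P
P = n * S_mole / (K_B * N_A)
omega_i = math.exp(P)        # about 3.67e17 micro-states

# Part b: entropy change when Omega drops to 1.54e11
omega_f = 1.54e11
dS = K_B * (math.log(omega_f) - math.log(omega_i))  # negative: more order
print(P, omega_i, dS)
```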

Difficult

Insert Difficult Example here

Connectedness

How is this topic connected to something that you are interested in?

  • For someone who loves to cook as much as I do, entropy plays a surprisingly large role. For example, entropy is the reason that Crystal Light powder dissolves in water. Over time, the particles spread and spread because the more spaced out they are, the more disorderly they are. And as we know, disorder must increase with time.

How is it connected to your major?

  • As an Industrial and Systems Engineering major, a lot of the work I do will be designing production plants to be as efficient as they can possibly be. Understanding the concept of entropy and that no physical process can be completed without the loss of energy will help me to be able to account for that energy loss in my plans so that I can minimize the amount of inefficiency that occurs. To understand that disorder is inherently more probable than order can help me be aware of more possible outcomes.

Is there an interesting industrial application?

  • The Second Law of Thermodynamics is especially used in any industry making insulated products. Well, it is not "used" so much as it is fought against. Using the specifics of The Second Law of Thermodynamics, companies who make thermoses or lunch boxes or insulation need to understand entropy and The Second Law to learn how to best fight Heat dissipation so that they can keep hot things hot and cold things cold.

History

Contributors in the Development of The Second Law of Thermodynamics:

Nicolas Sadi Carnot
  • Nicolas Léonard Sadi Carnot (1 June 1796 – 24 August 1832)
    • Considered to be the father of Thermodynamics
    • Major Scientific Contributions:
      • Carnot Heat Engine
      • Carnot Theorem
      • Carnot Efficiency
    • His research was centered around learning if the work available from a Heat source was limited, and whether the efficiency of a Heat engine could be improved upon by replacing steam with a different substance
Rudolf Clausius
  • Rudolf Clausius ( 2 January 1822 – 24 August 1888)
    • German Physicist
    • Developed the Clausius Statement, which states that in general, Heat can not flow spontaneously from a lower temperature to a higher temperature
    • Wrote a famous paper titled "On the Moving Force of Heat and the Laws of Heat which May be Deduced Therefrom"
      • Pointed out differences between the concept of conservation of energy and the Caloric Theory's treatment of Heat
      • Stated that assumptions about the Caloric Theory were incorrect
    • Presented the idea of Entropy and named it as such
    • Was known for taking a mathematical approach to Physics
William Thomson
  • William Thomson (26 June 1824 – 17 December 1907)
    • Also known as First Baron Kelvin
    • Mathematical physicist and engineer
    • Formulated the Kelvin Statement, which states that there is no way to convert all of the energy in a given system into work without losing energy
    • Developed the Vortex Theory of the atom
    • In addition to his contributions to Thermodynamics, he also created the Kelvin scale
Constantin Caratheodory
  • Constantin Carathéodory (13 September 1873 – 2 February 1950)
    • German mathematician of Greek origin
    • Principle of Caratheodory
    • Took on Thermodynamics with a mathematical axiomatic foundation
    • Created his own version of The Second Law of Thermodynamics by stating that "In the neighborhood of any initial state, there are states which cannot be approached arbitrarily close through adiabatic changes of state."
    • Used differential equations and Pfaffian expressions to prove the existence of entropy
