Entropy

Entropy is a measure of the number of ways the energy of a system can be arranged; the most probable energy configurations are those with the highest entropy.

The Main Idea

Entropy is an important idea that is crucial to both physics and chemistry, but it is often hard to understand. The traditional definition of entropy is "the degree of disorder or randomness in the system" (Merriam). However, this definition can be lost on some people. A good way to visualize how entropy works is to think of it as a probability distribution over energy. Consider a sample space consisting of two systems sharing 8 quanta of energy, where each quantum can be assigned to either system. All 8 quanta could go to one system, or they could be evenly distributed. Since every microscopic arrangement of the quanta is equally probable, a probability distribution over the possible splits of energy can be formed. In this model, the split corresponding to equilibrium has the highest probability, while the scenarios in which all of the quanta are located in exclusively one of the two systems have the lowest. In this way the new definition of entropy becomes "a direct measure of the probability of each energy configuration."
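To make this concrete (assuming, purely for illustration, that each of the two systems consists of 3 one-dimensional oscillators): the even split of 4 quanta per system can be realized in 15 × 15 = 225 ways, while piling all 8 quanta into one system can be realized in only 45 × 1 = 45 ways, so the even split is five times as probable.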

A Mathematical Model

Here is the formula for the number of ways Ω to arrange q quanta among n one-dimensional oscillators:

Ω = (q + n − 1)! / (q! (n − 1)!)

From Ω you can directly calculate the entropy S:

S = k_B ln Ω

where k_B (the Boltzmann constant) = 1.38 × 10⁻²³ J/K.
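As a quick worked instance of these formulas (with illustrative numbers): q = 2 quanta among n = 3 oscillators gives Ω = 4! / (2! 2!) = 6 arrangements, so S = k_B ln 6 ≈ 2.5 × 10⁻²³ J/K.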

A Computational Model

How do we visualize or make predictions using this topic? One way is to count, for two blocks sharing a fixed number of quanta, how many microstates correspond to each possible split of the energy; the split with the most microstates is the most probable one and corresponds to thermal equilibrium. A few lines of Python in the style of VPython/GlowScript are enough to do this.
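Below is a minimal sketch in plain Python (the page suggests VPython/GlowScript, but counting microstates needs no visualization library). The block sizes n1 = n2 = 3 oscillators are an illustrative assumption; the text only fixes the total of 8 quanta.

from math import comb, log

K_B = 1.38e-23  # Boltzmann constant, J/K

def omega(q, n):
    """Number of ways to arrange q quanta among n one-dimensional oscillators."""
    return comb(q + n - 1, q)

def entropy(q, n):
    """Entropy S = k_B * ln(omega) for q quanta among n oscillators."""
    return K_B * log(omega(q, n))

# Two blocks of 3 oscillators each, sharing 8 quanta in total
# (the block sizes are illustrative; the text only specifies the 8 quanta).
n1, n2, q_total = 3, 3, 8
total_ways = omega(q_total, n1 + n2)  # every arrangement of the combined system

for q1 in range(q_total + 1):
    ways = omega(q1, n1) * omega(q_total - q1, n2)
    print(f"q1 = {q1}: {ways:4d} microstates, probability = {ways / total_ways:.3f}")

print("S of the 4-4 split:", entropy(4, n1) + entropy(4, n2), "J/K")

Running the sketch shows the probability peaking at the even 4-4 split (about 0.17) and bottoming out at the 8-0 and 0-8 splits (about 0.035), exactly the distribution described in The Main Idea.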

Examples

Be sure to show all steps in your solution and include diagrams whenever possible

Simple
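A minimal worked example (the numbers are chosen for illustration, not taken from the original page): a block modeled as n = 3 one-dimensional oscillators holds q = 4 quanta, and one more quantum is added. The number of microstates rises from Ω = 6! / (4! 2!) = 15 to Ω = 7! / (5! 2!) = 21, so the entropy change is ΔS = k_B ln(21/15) = (1.38 × 10⁻²³ J/K)(0.336) ≈ 4.6 × 10⁻²⁴ J/K.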

Middling

Difficult

Connectedness

  1. How is this topic connected to something that you are interested in?
  2. How is it connected to your major?
  3. Is there an interesting industrial application?

History

The first person to give entropy a name was Rudolf Clausius. In his work he investigated the amount of usable heat lost during a reaction, contrasting this with the earlier view, held by Sir Isaac Newton, that heat was a physical particle. Clausius picked the name entropy because in Greek en + tropē means "transformation content."

The concept of entropy was then expanded on mainly by Ludwig Boltzmann, who essentially modeled entropy as a matter of probability. Boltzmann offered a larger-scale visualization using an ideal gas in a container: he defined entropy as the constant he found (k_B) times the logarithm of the number of microstates the gas particles could inhabit.

In this way entropy grew from an idea expressed in the terms of classical thermodynamics into a concept of statistical thermodynamics, with many formulas and methods of calculation.

See also

Here is a list of great resources about entropy that make it easier to understand and also expound on the details of the topic.

Further reading

External links

A great TED-Ed video on the subject:

References