
Thursday, January 4, 2024

Entropy: Thinking Beyond Disorder

Every so often, I casually refer to terms like entropy or free energy and forget to really think about what those words fundamentally mean. So, with this post, I aim to focus on the meanings of these terms (i.e., keeping the math to a minimum), particularly entropy.

Entropy: More Than Just Disorder

Entropy is often described in terms of 'disorder' or 'randomness', but these terms, in my opinion, can be misleading and confusing. When we consider the epistemology of a system, that is, the study of what we can know about the system, entropy is meaningful only from a statistical perspective. This is because, scientifically, we understand that everyday matter1 is made up of microscopic constituents (i.e., atoms or molecules), yet it's impossible2 to know every detail about these constituents. Therefore, we rely on a statistical representation of their collective behavior. Entropy is fundamentally about the number of ways a system composed of matter (or information) can be arranged at the microscopic level while still appearing the same at the macroscopic level: it measures the diversity of microstates corresponding to a given macrostate.
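To make the microstate-counting picture concrete, here is a minimal Python sketch of a toy system I made up for illustration: N two-state particles (think coin flips or spins), where the macrostate is taken to be just the number of particles that are 'up'. The particle count and the choice of macrostate are my own assumptions, not anything prescribed by the physics above.

  import math

  # Toy model: N two-state particles (coin flips, spins, ...).
  # A macrostate records only how many particles are "up" (n_up);
  # a microstate records exactly which particles are up.
  N = 10

  for n_up in range(N + 1):
      omega = math.comb(N, n_up)  # microstates consistent with this macrostate
      # Boltzmann entropy S = k_B * ln(Omega), reported here in units of k_B
      print(f"n_up = {n_up:2d}: Omega = {omega:4d}, S/k_B = {math.log(omega):6.3f}")

Running this shows that the 'middling' macrostates (n_up near N/2) correspond to far more microstates, and hence higher entropy, than the extreme ones.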

As a result, I think it's best not to conceive of 'disorder' in the traditional sense, where objects are randomly spread out in a chaotic fashion, but rather to think of the numerous microstates a system can adopt while maintaining familiar macroscopic properties, such as temperature or pressure. Thus, entropy reflects a lack of specific information3 about the individual microstates of a system, while still enabling knowledge of macroscopic properties.

The 'randomness' in entropy relates to the probabilistic nature of microstates. At the microscopic level, the exact configuration of particles in a system follows the laws of probability. This aspect of randomness is key to understanding why macroscopic properties of systems emerge as averages over many microstates. Simply put, we can never truly know which microstate the system is in, only the probability of each microstate and, from that, the expectation values of observables averaged over all microstates.
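As a small numerical illustration of that probabilistic view (the four microstate energies and the temperature below are made-up values, chosen only for the sketch), macroscopic observables come out as averages over the microstate probabilities, while the entropy measures how spread out those probabilities are:

  import math

  # Purely illustrative microstate energies (arbitrary units) and temperature.
  energies = [0.0, 1.0, 1.0, 2.0]
  kT = 1.0

  # Boltzmann weights give the probability of each microstate.
  weights = [math.exp(-E / kT) for E in energies]
  Z = sum(weights)                  # partition function (normalization)
  probs = [w / Z for w in weights]

  # A macroscopic observable (here, the energy) is an average over microstates...
  U = sum(p * E for p, E in zip(probs, energies))
  # ...while the Gibbs/Shannon entropy, S/k_B = -sum_i p_i ln p_i,
  # measures how spread out the distribution over microstates is.
  S_over_kB = -sum(p * math.log(p) for p in probs)

  print(f"<E> = {U:.3f}, S/k_B = {S_over_kB:.3f}")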

Nature's Tendency To Maximize

What we empirically observe in nature is that matter at the microscopic and mesoscopic scales tends towards the largest possible configurational space of microscopic states. This tendency to maximize the number of accessible microstates drives the natural progression towards states of higher entropy and thermodynamic equilibrium. Recall that entropy relates the microstates to the macrostate, and thus nature settles into the equilibrium macrostate, the one with the greatest number of corresponding microstates.
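One way to see why this maximization dominates what we observe is a toy counting exercise (the particle numbers and the 5% window below are arbitrary choices of mine): for N particles that can each sit in the left or right half of a box, count what fraction of the 2^N equally likely arrangements have a near-even split. As N grows, essentially every microstate belongs to the equilibrium-looking macrostate.

  import math

  # N particles, each independently in the left or right half of a box.
  # Count the fraction of the 2**N equally likely microstates whose
  # left/right split is within 5% of perfectly even (i.e., looks equilibrated).
  for N in (10, 100, 1000):
      window = max(1, int(0.05 * N))
      near_even = sum(math.comb(N, n)
                      for n in range(N // 2 - window, N // 2 + window + 1))
      print(f"N = {N:5d}: fraction near even split = {near_even / 2**N:.4f}")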

To recap, entropy, often encapsulated in terms like 'disorder' and 'randomness', is better thought of as a measure of the probabilistic distribution of microstates in a system. Understanding entropy in this way helps us better appreciate the intricate relation between microscopic thermodynamics (i.e., statistical mechanics) and macroscopic thermodynamics.

Bonus: Free Energy

Free energy is another term worth digesting. It's best understood as the energy available in a system to do useful work, hence the name free energy. This concept ties together the energy content of a system and its capacity to perform work. Under the laws of thermodynamics, energy cannot be created or destroyed, only transferred or transformed; yet not all of a system's energy can be converted into work, because a portion is tied up in its entropy at a given temperature. Free energy is what remains after subtracting that unavailable portion, making it a conceptual tool for understanding how much useful work a system can exchange with its surroundings.
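Even in a post that keeps math to a minimum, a tiny numerical sketch may help here (the two energy levels and the temperature are assumptions of mine, with k_B set to 1 for simplicity): the Helmholtz free energy F = U - TS combines the energy content U with the entropy term TS that is unavailable for work, and it agrees with the standard statistical-mechanics expression F = -k_B T ln Z.

  import math

  # Illustrative two-level system, with k_B = 1 and arbitrary energy units.
  energies = [0.0, 1.0]
  T = 0.5

  weights = [math.exp(-E / T) for E in energies]
  Z = sum(weights)                  # partition function
  probs = [w / Z for w in weights]

  U = sum(p * E for p, E in zip(probs, energies))  # internal energy
  S = -sum(p * math.log(p) for p in probs)         # entropy (in units of k_B)
  F = U - T * S                                    # energy actually available for work
  F_check = -T * math.log(Z)                       # same result via the partition function

  print(f"U = {U:.4f}, T*S = {T * S:.4f}, F = {F:.4f} (check: {F_check:.4f})")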

Footnotes


  1. A famous thought experiment that attempts to get around the impossibility of knowing what every constituent particle is doing is Maxwell's Demon. It was devised as a way to seemingly violate the second law of thermodynamics for an adiabatic system, ΔS ≥ 0.

  2. Our best scientific theories, confirmed by experiment, indicate that the fundamental constituents of matter are particles more elementary than atoms and molecules. However, for the most part, our interactions with the physical world at the smallest scales can be well described by thinking in terms of electrons, atoms, molecules, and the like.

  3. In information theory, entropy is used to quantify information content: rather than the microscopic states of matter, one thinks about the number of ways a message can be represented. This was one of the seminal results put forth by Claude Shannon and, I believe, was inspired by a suggestion from John von Neumann.


Reuse and Attribution
License: CC-BY
Bringuier, S., Entropy: Thinking Beyond Disorder, Dirac's Student, (2024). Retrieved from https://www.diracs-student.blog/2024/01/entropy-thinking-beyond-disorder.html.

  @misc{Bringuier_4JAN2024,
  title        = {Entropy: Thinking Beyond Disorder},
  author       = {Bringuier, Stefan},
  year         = 2024,
  month        = jan,
  url          = {https://www.diracs-student.blog/2024/01/entropy-thinking-beyond-disorder.html},
  note         = {Accessed: 2025-04-16},
  howpublished = {Dirac's Student [Blog]},
  }
