Tubetotext

    Reversing Entropy with Maxwell's Demon

    Entropy is a Measure of Disorder or Randomness

    Entropy is sometimes described as a measure of disorder or randomness. The second law of thermodynamics, the law that entropy must on average increase, has been interpreted as the inevitability of the decay of structure. This is misleading: as we saw in our episode on the physics of life, structure can develop in one region even as the entropy of the universe rises. Ultimately, entropy is a measure of the availability of free energy, of energy that isn't hopelessly mixed in thermal equilibrium. Pump energy into a small system and complexity can thrive. But entropy is connected to disorder and randomness in a very real way.
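    To make the free-energy connection concrete, here is the standard thermodynamic relation (a reference sketch we're adding, not an equation quoted from the episode), using the Helmholtz free energy:

        % Helmholtz free energy: the work extractable from a system held
        % at temperature T, given internal energy U and entropy S.
        F = U - TS
        % Second law for an isolated system:
        \frac{dS}{dt} \ge 0
        % So as S climbs toward its equilibrium maximum, F shrinks, and
        % with it the energy available to drive structure and complexity.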

    00:01

    Thermodynamic Entropy vs Shannon Entropy

    Thermodynamic entropy is related to the amount of hidden information based on thermodynamic knowledge only. It defines how far a system is from thermal equilibrium, and it also defines the availability of free energy: energy that can be extracted as the system moves back to equilibrium. Shannon entropy, on the other hand, is a measure of the hidden information in any system, not just thermodynamic systems.
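    As a minimal sketch of the Shannon side (the code and example distributions are ours, not the episode's), Shannon entropy counts the bits of information hidden in a probability distribution; the thermodynamic Gibbs entropy has exactly the same functional form, just taken in natural logs and multiplied by Boltzmann's constant:

        import math

        def shannon_entropy(probs):
            """Shannon entropy in bits: H = -sum(p * log2(p)) over nonzero p."""
            return -sum(p * math.log2(p) for p in probs if p > 0)

        # A fair coin hides exactly one bit of information...
        print(shannon_entropy([0.5, 0.5]))    # 1.0
        # ...a heavily biased coin hides far less...
        print(shannon_entropy([0.99, 0.01]))  # ~0.081
        # ...and a certain outcome hides nothing at all.
        print(shannon_entropy([1.0]))         # 0.0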

    04:10

    Quantum Entropy

    Quantum entropy, also known as von Neumann entropy, describes the hidden information in quantum systems. In fact, the evolution of quantum entanglement may be the ultimate source of entropy, the second law, the limits of information processing, and even the arrow of time.
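    Here is a small numerical sketch we've added (standard textbook states, computed with NumPy) of how entanglement generates entropy: a Bell pair as a whole is a pure state hiding no information, yet either half on its own is maximally mixed:

        import numpy as np

        def von_neumann_entropy(rho):
            """S(rho) = -Tr(rho ln rho), computed from rho's eigenvalues."""
            evals = np.linalg.eigvalsh(rho)
            evals = evals[evals > 1e-12]      # drop zeros; 0 ln 0 -> 0
            return float(-np.sum(evals * np.log(evals)))

        # Bell state |Phi+> = (|00> + |11>) / sqrt(2): pure, so S = 0.
        psi = np.zeros(4)
        psi[0] = psi[3] = 1 / np.sqrt(2)
        rho = np.outer(psi, psi)
        print(von_neumann_entropy(rho))       # 0.0

        # Trace out the second qubit: what remains is maximally mixed,
        # with the largest entropy a single qubit can have, ln 2.
        rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)
        print(von_neumann_entropy(rho_A))     # ~0.693 = ln 2

    Hidden information appears locally even though the global state hides nothing, which is the sense in which entanglement can act as a source of entropy.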

    11:10

    Criticisms of the Ergodic Hypothesis

    Iago Silva criticizes our assumption of the ergodic hypothesis. For the non-physicists out there, the ergodic hypothesis is basically the assumption that all microstates are equally probable over long periods of time. Iago's criticism is totally fair: starting with this assumption gets you to the Boltzmann equation, and it's a nice, relatively simple way to understand entropy. It's valid for idealized situations but isn't necessarily general.
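    For reference, the equal-probability assumption is what lets you write entropy in the simple Boltzmann form (assuming that formula is the "Boltzmann equation" meant here); dropping the assumption gives the more general Gibbs form:

        % Boltzmann's entropy formula: with all W microstates of a
        % macrostate equally probable,
        S = k_B \ln W
        % Without the equal-probability assumption, the Gibbs form
        % applies, with p_i the probability of microstate i:
        S = -k_B \sum_i p_i \ln p_i
        % The Gibbs form reduces to the Boltzmann form when p_i = 1/W.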

    11:46