jwmerrill
You can reach me at jwmerrill@gmail.com
Joined in 2012
1,598 Karma
274 posts
(Replying to PARENT post)
In thermodynamics, there often isn't really one "best" choice of two coordinate functions among the many possibilities (pressure, temperature, volume, energy, entropy... these are the most common, but you could use arbitrarily many others in principle), and it's natural to switch between these coordinates even within a single problem.
Coming back to the more familiar x, y, r, and θ, you can visualize these 4 coordinate functions by plotting iso-contours for each of them in the plane. Holding one of these coordinate functions constant picks out a curve (its iso-contour) through a given point. Derivatives taken with that coordinate held constant are ratios of changes in the other coordinates along this iso-contour.
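Here's a quick matplotlib sketch of that picture (my own illustration, not part of the original comment): iso-contours of x are vertical lines, of y horizontal lines, of r circles, and of θ rays from the origin.

    import numpy as np
    import matplotlib.pyplot as plt

    # Sample the first quadrant on a grid (staying away from the origin,
    # where theta is ill-defined).
    x, y = np.meshgrid(np.linspace(0.05, 2, 200), np.linspace(0.05, 2, 200))
    r = np.hypot(x, y)
    theta = np.arctan2(y, x)

    # One family of iso-contours per coordinate function.
    fig, ax = plt.subplots()
    ax.contour(x, y, x, colors="C0")      # vertical lines
    ax.contour(x, y, y, colors="C1")      # horizontal lines
    ax.contour(x, y, r, colors="C2")      # circles
    ax.contour(x, y, theta, colors="C3")  # rays
    ax.set_aspect("equal")
    plt.show()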
For example, you can think of evaluating dr/dx along a curve of constant y or along a curve of constant θ, and these are different.
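To make that concrete (my computation, using the standard definitions r = sqrt(x^2 + y^2) and tan θ = y/x):

    \left(\frac{\partial r}{\partial x}\right)_y = \frac{x}{\sqrt{x^2 + y^2}} = \cos\theta,
    \qquad
    \left(\frac{\partial r}{\partial x}\right)_\theta
      = \frac{\partial}{\partial x}\!\left(\frac{x}{\cos\theta}\right)
      = \frac{1}{\cos\theta}

One is cos θ, the other sec θ; they agree only on the x-axis, where the curves of constant y and constant θ coincide.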
I first really understood this way of thinking from an unpublished book chapter of Jaynes [1]. Gibbs's "Graphical Methods in the Thermodynamics of Fluids" [2] is also a very interesting discussion of different ways of representing thermodynamic processes by diagrams in the plane. His companion paper, "A method of geometrical representation of the thermodynamic properties of substances by means of surfaces," describes an alternative representation as a surface embedded in a larger space, and these two pictures are complementary and both very useful.
(Replying to PARENT post)
This is effectively what OP does, but it is phrased there in terms of properties of the Li function, which makes it seem a little more exotic than thinking just in terms of differentiating power functions.
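For reference (my gloss, not OP's exact wording), the Li property in play is just term-by-term differentiation of the defining power series:

    \mathrm{Li}_s(x) = \sum_{k=1}^{\infty} \frac{x^k}{k^s},
    \qquad
    x \frac{d}{dx}\,\mathrm{Li}_s(x) = \sum_{k=1}^{\infty} \frac{x^k}{k^{s-1}} = \mathrm{Li}_{s-1}(x)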
(Replying to PARENT post)
> Mind that all of this does not impose how we actually scale temperature.
> How we scale temperature comes from practical applications such as thermal expansion being linear with temperature on small scales.
An absolute scale for temperature is determined (up to proportionality) by the maximal efficiency of a heat engine operating between two reservoirs: e = 1 - T2/T1.
This might seem like a practical application, but intellectually, it's an important abstraction away from the properties of any particular system to a constraint on all possible physical systems. This was an important step on the historical path to a modern conception of entropy and the second law of thermodynamics [2].
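To put one number on it (example mine): reservoirs at T1 = 600 K and T2 = 300 K bound any engine's efficiency by

    e = 1 - \frac{T_2}{T_1} = 1 - \frac{300}{600} = \frac{1}{2}

and that bound holds no matter what working substance the engine uses.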
(Replying to PARENT post)
I believe MathJax has a similar capability.
(Replying to PARENT post)
Two probability distributions with different entropy can both assign nonzero probability density to the same state, so an increase in entropy does not preclude the possibility of the system returning to its initial state.
A great deal of confusion about entropy arises from imagining it as a function of the microstate of a system (in classical mechanics, a point in phase space) when it is actually a function of a probability distribution over possible states of a system.
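Concretely, the standard Gibbs/Shannon form makes this explicit: entropy is a functional of the distribution, with no preferred microstate appearing anywhere,

    S = -\sum_i p_i \ln p_i
    \qquad \text{or} \qquad
    S = -\int \rho(q, p) \, \ln \rho(q, p) \, dq \, dp

for a discrete distribution or a classical phase-space density respectively.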
A further wrinkle: Liouville's Theorem [0] shows that evolution under classical mechanics is _entropy preserving_ (because the evolution preserves local phase space density, and entropy is a function of this density). An analogous result applies to quantum mechanics. However, a simple probability distribution parametrized by a few macroscopic parameters rapidly becomes very complex as it evolves in time. When we imagine the entropy of an isolated classical system increasing over time, the meaning is that if we want to model the (very complicated) evolved probability distribution with a simple probability distribution (describable in terms of a few macroscopic parameters), the simple distribution must have entropy greater than or equal to that of the complex evolved distribution, whose entropy in turn equals the entropy of the original distribution before evolution.
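A toy numerical illustration of that last point (my own sketch, not from anything above): evolve a cloud of points under Arnold's cat map, which is area-preserving in the same sense as Hamiltonian flow, and watch the entropy of a fixed coarse-graining climb toward its maximum even though the fine-grained dynamics preserve phase-space volume exactly.

    import numpy as np

    rng = np.random.default_rng(0)

    # A "simple" low-entropy initial distribution: points uniform in a
    # small square on the unit torus.
    pts = rng.uniform(0.0, 0.1, size=(100_000, 2))

    def cat_map(p):
        # Arnold's cat map: a chaotic, area-preserving map on the torus.
        x, y = p[:, 0], p[:, 1]
        return np.stack([(2 * x + y) % 1.0, (x + y) % 1.0], axis=1)

    def coarse_entropy(p, bins=32):
        # Shannon entropy of the distribution coarse-grained onto a
        # fixed bins-by-bins grid.
        h, _, _ = np.histogram2d(p[:, 0], p[:, 1], bins=bins,
                                 range=[[0, 1], [0, 1]])
        q = h.ravel() / h.sum()
        q = q[q > 0]
        return -np.sum(q * np.log(q))

    for step in range(8):
        print(f"step {step}: coarse-grained entropy = {coarse_entropy(pts):.3f}")
        pts = cat_map(pts)

    # The printed entropy rises toward log(32 * 32) ~ 6.93, the maximum
    # for this grid, even though each cat_map step is volume preserving.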
It's difficult to reconcile the idea that entropy is a function of a probability distribution (not a function of a system's microstate) with the idea that thermodynamic entropy is an experimentally measurable (kind of...) property of a system. Jaynes' "The Evolution of Carnot's Principle" [1] is the clearest description I've seen of the relationship between thermodynamic entropy and statistical-mechanical/information-theoretic entropy. Many of Jaynes' other papers [2] on this topic are also illuminating.
[0] https://en.wikipedia.org/wiki/Liouville's_theorem_(Hamiltoni...
(Replying to PARENT post)
I think this is a pretty user-friendly compromise.
(Replying to PARENT post)
"Amazon confirms 14,000 job losses," is not an example of the passive voice.
"14,000 workers were fired by Amazon," is an example of the passive voice.
There is not a 1:1 relationship between being vague about agency and using the passive voice.