chilehed Posted January 25, 2007

I’ve always been under the impression that, although the math of information entropy and thermodynamic entropy is virtually identical, the two are completely independent of each other because they have entirely different physical significances. Kind of like how the math of F = m*a is the same as V = I*R, but voltage and mechanical force are very different things. Over in another thread I’ve been gently told that I don’t know what I’m talking about, which is fine, but I want to understand why. Unfortunately I have no background in statistical mechanics; I’m just a BSME and, although I did well in thermo, StatMech gives me a migraine. But maybe someone can explain it so I can understand.

My working definitions:

Entropy is a state property of matter, expressed in units of joules per kilogram-kelvin (J/(kg·K)). It is indicative of the availability of the heat energy of the substance to do work. The entropy of a substance can go up or down during any real process.

Total entropy is a property of thermodynamic systems, expressed in units of joules per kelvin (J/K). It is indicative of the reduction in the availability of the energy of the system to do work as the result of a process. No real thermodynamic process can result in a negative change in total entropy. Changes in total entropy are also referred to as entropy generation.

Someone said this on the other thread:

"To put that information there or maintain it takes energy, hence a system elsewhere shows a corresponding increase in entropy... Just in case you're still listening Chile... :confused: .... (the math is) the exact same thing..."

I thought that any increase in thermodynamic total entropy due to the act of putting the information into its medium comes from those processes, which are outside the system boundary of the medium itself. Let’s say that we’ve agreed that if I put a brick up on a shelf, it means one thing, but if I leave it on the floor it means another. The work done in raising the brick up to the shelf is completely reversible, so there’s no entropy generation between the brick on the floor and the brick on the shelf even though they encode very different information. If I lift the brick and put it on the shelf, or alternatively lift the brick but let it drop back to the floor in such a manner that all of the irreversible losses are identical to putting it on the shelf, the total entropy is identical even though the encoded information is very different. A similar comparison between a brick left on the floor and one that was raised and dropped back into position results in a difference in total entropy but no change in the encoded information.

So it seems to me that information and thermodynamic entropy have nothing to do with each other, aside from the fact that the mathematics is identical.

I’ve been under the impression that there are two ways to quantify “information”. One is due to the amount of it, as in how many bits it takes to encode: given four colors you need a 2-bit binary code, but then each color requires 2 bits to encode and so each carries the same amount of information. The other is due to the meaning of the information, but that’s rather subjective, which again means that it and thermo are independent of each other.
This is a good summary:

"The point is that information 'entropy' in all of its myriad nonphysicochemical forms as a measure of information or abstract communication has no relevance to the evaluation of thermodynamic entropy change in the movement of macro objects because such information 'entropy' does not deal with microparticles whose perturbations are related to temperature." (http://www.entropysite.com/shuffled_cards.html)

So what am I missing?
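A minimal sketch, assuming Python, of the bit-counting definition mentioned above: Shannon's measure H = -Σ p·log2(p) gives exactly 2 bits per symbol for four equally likely colors, and less for a skewed distribution.

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over outcomes with p > 0."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Four equally likely colors: each one takes 2 bits to encode, so H = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))   # 2.0

# A heavily skewed distribution carries less information per symbol.
print(shannon_entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.24
```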
Buffy Posted January 25, 2007

"I’ve always been under the impression that, although the math of information entropy and thermodynamic entropy is virtually identical, the two are completely independent of each other because they have entirely different physical significances."

The wiki article on Information Entropy says:

"In the view of Jaynes (1957), thermodynamics should be seen as an application of Shannon's information theory: the thermodynamic entropy is interpreted as being an estimate of the amount of further Shannon information (needed to define the detailed microscopic state of the system) that remains uncommunicated by a description solely in terms of the macroscopic variables of classical thermodynamics. For example, adding heat to a system increases its thermodynamic entropy because it increases the number of possible microscopic states that it could be in, thus making any complete state description longer."

This is obviously an interpretation open to argument (which we could do here), but it forms the basis of the argument that the linkage is stronger than a mere coincidence of mathematics.

The most important thing we have to clear up in this thread, though, is:

"I’ve been under the impression that there are two ways to quantify 'information'. One is due to the amount of it, as in how many bits it takes to encode.... The other is due to the meaning of the information, but that’s rather subjective..."

Information theory really only refers to the former, quantity-based definition of information, and this is what really trips up the non-experts when they hear about this stuff. Meaning and semantics are all about finding equivalences between instances of data--to try to put it in information theory terms.

Minimizing bits,
Buffy
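One way to read the Jaynes interpretation quoted above: a thermodynamic entropy change in J/K can be converted into an equivalent number of "missing" Shannon bits by dividing by k_B·ln 2. A minimal sketch, assuming Python; the melting-ice numbers are only an illustrative example, not from the thread.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def entropy_change_in_bits(delta_s_joules_per_kelvin):
    """Convert a thermodynamic entropy change (J/K) into the equivalent number
    of extra Shannon bits needed to specify the microstate (divide by k_B * ln 2)."""
    return delta_s_joules_per_kelvin / (K_B * math.log(2))

# Illustrative only: melting about 1 g of ice at 273 K (latent heat ~334 J)
# gives delta_S = q / T, roughly 1.22 J/K.
delta_s = 334.0 / 273.0
print(f"{entropy_change_in_bits(delta_s):.2e} additional bits to pin down the microstate")
```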
sanctus Posted January 26, 2007

The way I best like to understand entropy is the statistical mechanical way, as written in the wiki article posted by Buffy; all the other definitions derive from there (for me!). The entropy is just the logarithm of the number of states a system can have (with probably some constant I don't remember). For one spin-1/2 particle the entropy would be proportional to ln 2, because there are two possible states (up and down). Now generalize this to n particles and you have a very intuitive description of what entropy is.
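A minimal sketch of the counting described above, assuming Python; the constant sanctus can't recall is Boltzmann's k_B, so S = k_B·ln Ω for Ω equally likely microstates.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_microstates):
    """S = k_B * ln(Omega) for Omega equally likely microstates."""
    return K_B * math.log(n_microstates)

# One spin-1/2 particle: two states (up, down), so S is proportional to ln 2.
print(boltzmann_entropy(2))

# n independent spin-1/2 particles: Omega = 2**n, so S = n * k_B * ln 2,
# which is exactly n bits' worth of missing information.
n = 10
print(boltzmann_entropy(2 ** n), n * K_B * math.log(2))
```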
Qfwfq Posted January 26, 2007

Chile, you quote me replacing my word "they're" with "the math is". Thermodynamic entropy turned out to be one instance of the statistical (or info theory) concept, which can be applied very much in general.

"The point is that information 'entropy' in all of its myriad nonphysicochemical forms as a measure of information or abstract communication has no relevance to the evaluation of thermodynamic entropy change in the movement of macro objects because such information 'entropy' does not deal with microparticles whose perturbations are related to temperature."

Info theory deals with whatever it likes. Suppose you specify that a certain brick (exactly that one) is at a uniform temperature T. How precisely are you specifying the state of that brick? How much information would be necessary to give the exact state of it? How does this compare with another state of the same brick, one at a lower uniform temperature?
chilehed (Author) Posted January 26, 2007

"Chile, you quote me replacing my word 'they're' with 'the math is'." :D

I understood "the math" to be what "they're" referred to; I certainly didn't intend to misrepresent what you said.

"Suppose you specify that a certain brick (exactly that one) is at a uniform temperature T. How precisely are you specifying the state of that brick? How much information would be necessary to give the exact state of it? How does this compare with another state of the same brick, one at a lower uniform temperature?"

I'm assuming that we are able to define the macroscopic state of the brick with absolute precision. I'd say that however much information is necessary to quantify the exact state at a temperature T1, the same amount of information would be required to quantify its state at a temperature T2.
chilehed (Author) Posted January 26, 2007

"Information theory really only refers to the former, quantity-based definition of information, and this is what really trips up the non-experts when they hear about this stuff. Meaning and semantics are all about finding equivalences between instances of data--to try to put it in information theory terms. Minimizing bits, Buffy"

Hmmm... My lack of background is a real problem here; I feel like A. Square must have felt in his conversations with A. Sphere. But what you say here has me thinking - I'll have to mull this over for a while. I see that my example above hinges on what the position of the brick means. I'll be back after I collect my thoughts.
Qfwfq Posted January 29, 2007

Buffy is saying that "meaning" and "semantics" can be reduced to maps of the combinations of those bits onto whatever set you may call the "meaning". Therefore meaning is irrelevant to information theory. You can apply information theory to thermodynamics as much as to data compression, cryptography, etc.

"I'm assuming that we are able to define the macroscopic state of the brick with absolute precision. I'd say that however much information is necessary to quantify the exact state at a temperature T1, the same amount of information would be required to quantify its state at a temperature T2."

That's true for the macrostate, but what could be known about the state at each instant of time? How absolutely true is it that the hot body heats the cold one? How absolutely true is it that the two gases mix and don't unmix? Just reverse the momenta... :)