coldcreation Posted November 18, 2009

What is Entropy? This question seems to surface quite often, in various forms and in various places, and it always energizes considerable debate. Here, it is stated:

The equation for entropy, or the amount of disorder in a system, was formulated by the German physicist Rudolf Clausius in 1850. Some would define entropy as a thermodynamic property, a measure (or degree) of disorder in a system. Entropy is a property defined for all states of a system. It can be assigned absolute values that are nonnegative. Entropy cannot be destroyed. It is a property of all matter. Depending on the type of system, entropy either remains the same or increases with time. In Greek, entropy means transformation. (see Second law of thermodynamics, and/or Entropy)

So far so good. But it is not at all clear what exactly entropy is, especially to those unfamiliar with thermodynamics (which includes almost everyone :)). Frankly, even those who are familiar with thermodynamics seem to disagree on the question of what entropy is. Here are some thoughts and questions that may arise on the topic of entropy:

What role does entropy play in the evolution of open thermodynamic systems?

The entropy problem: Entropy is a thermodynamic property of matter, defined by the degree of disorder (randomness) of a system. It is known to increase with time. When applied to the universe as a whole, the entropy should have been very low to begin with, and increased with time. By deduction, the universe must have been a very ordered system in its very early stages. Why? Yet entropy is essentially a conserved quantity in an expanding universe, so the conclusion must be that the entropy of the universe has always been huge. The standard models do not explain why (Pagels 1985).

Entropy seems related to irreversibility. The problem of irreversibility goes to the very foundation of thermodynamic systems in the evolutionary description of nature, connected with the increase of entropy with time, a phenomenon described by the second law of thermodynamics.

Has the problem of Bekenstein-Hawking entropy of black holes been resolved?

What is the state of least entropy, i.e., is there an absolute minimum (analogous to absolute zero kelvin) of the entropy of a system?

What do you believe entropy is (or is not), and why?

Coldcreation
Qfwfq Posted November 18, 2009

Let's take a cold shower. Ready?

Consider a system describable by a random variable with a number N of values. These values might correspond to the state of motion of the many, many atoms a material is made of, but to keep it simple let's imagine the variable has a finite set of values and a probability can be assigned to each value (for a material it would really be a probability distribution and this would require fancy integrals). Consider the system in one of the possible exact states and call [imath]p_i[/imath] the probability of the variable's [imath]i[/imath]-th value. Define entropy as:

[math]S=\sum_{i=1}^N p_i\ln\frac{1}{p_i}[/math]

Now this is a gross simplification for thermodynamics where, however, a formally similar definition can be used. Entropy has to do with information content. When you zip a file, you are putting the same info into a smaller number of bits: you make a file whose bits have a higher entropy.
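As a minimal sketch of what that sum computes (Python, with made-up example distributions; the function name and the numbers are purely illustrative):

[code]
import math

def entropy(probs):
    """Shannon entropy S = sum_i p_i * ln(1/p_i), in nats."""
    # Values with p_i = 0 contribute nothing, since p*ln(1/p) -> 0 as p -> 0.
    return sum(p * math.log(1.0 / p) for p in probs if p > 0)

# Sharply peaked distribution: the value is almost certain, so S is small.
print(entropy([0.97, 0.01, 0.01, 0.01]))   # ~0.17 nats

# Uniform distribution over 4 values: maximal uncertainty, S = ln(4).
print(entropy([0.25, 0.25, 0.25, 0.25]))   # ~1.39 nats
[/code]

The uniform case gives the largest S, which matches the idea that entropy measures how little is pinned down about the exact value (or microstate) of the variable.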
coldcreation (Author) Posted November 18, 2009

Let's take a cold shower. Ready? Consider a system describable by a random variable with a number N of values. These values might correspond to the state of motion of the many, many atoms a material is made of, but to keep it simple let's imagine the variable has a finite set of values and a probability can be assigned to each value (for a material it would really be a probability distribution and this would require fancy integrals). Consider the system in one of the possible exact states and call [imath]p_i[/imath] the probability of the variable's [imath]i[/imath]-th value. Define entropy as: [math]S=\sum_{i=1}^N p_i\ln\frac{1}{p_i}[/math] Now this is a gross simplification for thermodynamics where, however, a formally similar definition can be used. Entropy has to do with information content. When you zip a file, you are putting the same info into a smaller number of bits: you make a file whose bits have a higher entropy.

Interesting. So I imagine the introduction of Boltzmann's constant k would be included in a version of the equation that would not be a gross simplification. Is that correct?

I think one of the main issues people have is that we see the organization of things, complexity, the formation of complex structures in the universe, which at first glance appears to be in opposition to the second law of thermodynamics.

Let's take a concrete example: the solar system. From an original cloud of rotating gas (a proto-planetary nebula) planets form and remain in quasi-stable orbits (albeit somewhat chaotic). Is this not like a shuffled deck of cards ending up ordered by suit, where Ace is high (the Sun)? Or would that actually be considered an increase of entropy from the formerly ordered gas nebula?

If entropy is considered a degree of disorder of a system, there's something non-intuitive about a gas cloud being 'ordered' (with its molecules all interacting, moving in different directions) and ending up as a 'disordered' grouping of planets, gravitationally bound in a quasi-stable equilibrium configuration.

Like the box experiment where half is filled with gas, the other half empty. Once the separation is removed (or an opening is made between the two halves) the gas fills the entire space, increasing the entropy inside the system. It seems that the solar system now is partitioned in a way similar to when the gas occupied one section of the box (a state of low entropy), and that the proto-planetary disc is represented by the gas once it fills the entire space (a state of higher entropy). Where is the conceptual error?

Maybe order and disorder are not the best words to use when entropy (a potential for disorder) is considered. It's strange. Sometimes I feel I understand entropy, and at other times I don't. I guess it's because I don't see a gas as orderly, yet I do see the solar system as orderly. Or maybe my problem is with gravity and its relation to the concept of entropy. Entropy tends to increase irreversibly but gravity tends to counter the trend (e.g., celestial mechanics is said to be time reversible), or something like that. Something seems to be amiss.

CC
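For reference, the answer to the Boltzmann-constant question is yes: with [imath]p_i[/imath] the probability of microstate [imath]i[/imath], the Gibbs entropy of thermodynamics is

[math]S = -k_B \sum_i p_i \ln p_i[/math]

which is the same sum as above (since [imath]p_i\ln\frac{1}{p_i}=-p_i\ln p_i[/imath]), just scaled by [imath]k_B[/imath] so that S comes out in joules per kelvin.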
Qfwfq Posted November 18, 2009

Don't neglect the fact that in coalescing they become very hot by compression.
Larv Posted November 18, 2009

Entropy is a measure of the number of internal states a system can have without looking any different to an outside observer.
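Larv's one-liner is essentially Boltzmann's formula in words: if [imath]\Omega[/imath] counts the microstates compatible with what the outside observer sees (the macrostate), then

[math]S = k_B \ln \Omega[/math]

More compatible microstates means higher entropy, with no direct appeal to 'messiness' at all.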
JMJones0424 Posted November 18, 2009

I'm glad to know I'm not the only one that has been questioning my understanding of entropy. The last time I really delved into the subject was when reading Murray Gell-Mann's The Quark and the Jaguar: Adventures in the Simple and the Complex (http://www.amazon.com/Quark-Jaguar-Adventures-Simple-Complex/dp/0716727250). I think it included the best description I have read to date, but as it was a library book that I read, I don't have the text in front of me.

While searching online for excerpts, I came across Entropy Demystified on Scribd. In it, the author claims to explain the second law in layman's terms. It is 254 pages, so it will be a while until I know for sure how well the author lives up to his goal, but I'll give it a shot.

I almost hesitate to include a link to "the other thread" as it has devolved beyond its original topic, but a statement there has forced me to entirely re-think what I thought I knew about entropy. I won't ask any more questions until after doing some more reading. I find myself at an extreme disadvantage at times; lacking formal training, it takes me longer to digest some of what the rest of you say than the average hypographer.
coldcreation (Author) Posted November 19, 2009

Don't neglect the fact that in coalescing they become very hot by compression.

Ok, but what about the classical box experiment where half is filled with gas, the other half empty? Once the separation is removed, the gas fills the entire space, increasing the entropy inside the system. It seems that the proto-planetary disc (of the early solar system) is represented by the gas once it fills the entire space (a state of higher entropy). Now, the solar system looks partitioned in a way similar to when the gas occupied one section of the box (a state of low entropy). Where is the conceptual error?
coldcreation (Author) Posted November 19, 2009

Entropy is a measure of the number of internal states a system can have without looking any different to an outside observer.

I don't know about this. The cleaning lady comes on Friday. The place is spotless (a state of low entropy). By the time next Friday rolls around, the place is a mess (a state of higher entropy). That is totally observable. I've heard a similar analogy used by a thermodynamicist to describe the second law.
coldcreation (Author) Posted November 19, 2009

I'm glad to know I'm not the only one that has been questioning my understanding of entropy. The last time I really delved into the subject was when reading Murray Gell-Mann's The Quark and the Jaguar...

I'm glad to know, too, that I'm not the only one questioning my understanding of entropy. Sounds like a good book; I've seen it before somewhere. The title is a good one. I think the whole problem with understanding entropy revolves around the relations between order and disorder, the simple and the complex, and the non-decrease of entropy.
Qfwfq Posted November 19, 2009

Entropy is a measure of the number of internal states a system can have without looking any different to an outside observer.

I was meaning to expand on what I already posted with a similar remark, differing in the detail, and I hope it clarifies CC's query about the above. Entropy is a measure of the number of unique states a system can have which share some common attribute or, if you like, meaning. In thermodynamics, the former are often called the microstates and the latter the macrostates: the state of motion of each particle vs. a macroscopic quantification such as temperature-pressure-volume.

Ok, but what about the classical box experiment where half is filled with gas, the other half empty? Once the separation is removed, the gas fills the entire space, increasing the entropy inside the system.

This is called adiabatic expansion. The glaring question of course is: how can it correspond to an increase in entropy, defined as [imath]\frac{dQ}{T}[/imath], when it is adiabatic? The answer is: if a hot boilerplate and a pot of cold water are placed inside an excellent thermos, the whole thing is fairly adiabatic as well. If this isn't clear, the point is that in either case the system is not in thermal equilibrium. Likewise, planet formation did not occur in thermal equilibrium and, further, once the matter is not all gaseous, you have macroscopic kinetic energy which thermalizes with inelastic collisions. So, not all the [imath]\frac{dQ}{T}[/imath] must come in from outside, and this goes for things such as chemical reactions too.
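A worked number may help with the apparent paradox in the box example. The expansion into the vacuum is irreversible, so the Clausius expression [imath]\frac{dQ}{T}[/imath] has to be evaluated along a reversible path between the same initial and final states (say, a slow isothermal expansion), not along the actual process. For one mole of ideal gas doubling in volume at temperature T,

[math]\Delta S = \int \frac{\delta Q_{rev}}{T} = R\ln\frac{V_2}{V_1} = R\ln 2 \approx 5.76 \; \mathrm{J/K}[/math]

even though no heat actually crossed the boundary during the free expansion itself.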
lemit Posted November 19, 2009

I have a question which, since ColdCreation said something seems amiss, I'll feel emboldened to ask. If we accept conservation of mass (i.e. energy), wouldn't there be a continuum between potential and kinetic energy but no actual loss of energy?

And I have a follow-up question. That other question's pretty stupid, isn't it?

Thanks.

--lemit
Qfwfq Posted November 19, 2009

If we accept conservation of mass (i.e. energy), wouldn't there be a continuum between potential and kinetic energy but no actual loss of energy?

Entropy is related to thermal energy but isn't energy. Energy is conserved, it just changes from form to form and mass is --in a sense-- just one form of it. Entropy has to do with information or, as I said earlier, some kind of meaning.
JMJones0424 Posted November 19, 2009

The Scribd document I previously referenced was too basic and too lengthy to hold my attention. I have found another site that addresses the problems with defining entropy as a description of order: Disorder - A Cracked Crutch for Supporting Entropy Discussions, from the Journal of Chemical Education, vol. 79, pp. 187-192. The entire site seems to be a good source for review of the concept of entropy.
Pyrotex Posted November 19, 2009

Ok, but what about the classical box experiment where half is filled with gas, the other half empty? ... Where is the conceptual error?

The conceptual error is where you pick a certain class of system (say the Solar System) and then assume it is representative of the Universe, or of all systems.

Here goes an analogy. Let's make up a Law. Call it the Law of the Ocean: "All water in the ocean will tend to accumulate at or below Sea Level." Sounds true. It is true. However, let us zero in on a WAVE, which was created by a STORM. The water in the Wave is above Sea Level. OMG!!! Isn't this a "violation" of the Law of the Ocean? :phones: Local conditions (storm winds) can easily create local regions where water is building up in waves above Sea Level. That does not mean that some global condition can make the ENTIRE OCEAN go above Sea Level.

Gravity is one of those natural forces that can locally decrease entropy. A random cloud of cold gas becomes concentrated in a highly ordered star at high temperature, emitting free energy, and surrounded by planets in nice regular orbits. That is a whopping big decrease of entropy -- in a small local region. That available free energy can fall on a water planet and, over geological time, create highly ordered systems that we call "Life". Again, a lowering of local entropy. However, if you take the total entropy of both Sun and Earth, the total is increasing: the Sun is losing free energy at a rate much higher than the Earth is collecting it with dissipative structures (Life).

For any given system you wish to analyze, you should first take a good look at your starting point, at your "boundary conditions", to see if you are already "assuming" some kind of transition or special case. Then double check to see that your conclusions stay true to THAT ONE system, rather than misapply them to some more general case.
coldcreation (Author) Posted November 21, 2009

The conceptual error is where you pick a certain class of system (say the Solar System) and then assume it is representative of the Universe, or of all systems.

Actually that is not at all what I was doing with the example of the solar system. I merely ask whether the formation and evolution of the solar system, from its original protoplanetary disc (say) to its actual configuration, represents an increase in entropy, in accord with the second law. Or whether this is not a reflection of entropy, but a solely gravitational phenomenon. I think Q answered that question pretty well, albeit I did not understand exactly what he wrote.

Here goes an analogy. Let's make up a Law. Call it the Law of the Ocean: "All water in the ocean will tend to accumulate at or below Sea Level."

So far so good. :singer:

Sounds true. It is true. However, let us zero in on a WAVE, which was created by a STORM. The water in the Wave is above Sea Level. OMG!!! Isn't this a "violation" of the Law of the Ocean? :umno:

No. Agreed. :shrug:

Local conditions (storm winds) can easily create local regions where water is building up in waves above Sea Level. That does not mean that some global condition can make the ENTIRE OCEAN go above Sea Level.

Right. That phenomenon would be reserved for the moon, and the sun to some extent. :eek_big:

Gravity is one of those natural forces that can locally decrease entropy.

Continue, you've got my attention. :eek_big:

A random cloud of cold gas becomes concentrated in a highly ordered star at high temperature, emitting free energy, and surrounded by planets in nice regular orbits. That is a whopping big decrease of entropy -- in a small local region. That available free energy can fall on a water planet and, over geological time, create highly ordered systems that we call "Life". Again, a lowering of local entropy.

I'm not sure I agree with this assessment. I would consider life a reflection of the second law (not a violation of it), and so too the formation of the solar system would be in accord with the 2nd law. This latter point is the one I try, without apparent success, to understand.

However, if you take the total entropy of both Sun and Earth, the total is increasing: the Sun is losing free energy at a rate much higher than the Earth is collecting it with dissipative structures (Life).

I see where you're coming from, and where you're going. Nice. Edit: but my question concerned the evolution of the solar system from its original gaseous/diffuse material state to the one which you describe above.

For any given system you wish to analyze, you should first take a good look at your starting point, at your "boundary conditions", to see if you are already "assuming" some kind of transition or special case. Then double check to see that your conclusions stay true to THAT ONE system, rather than misapply them to some more general case.

We've discussed these "boundary conditions" before. I would say there are no boundary conditions relative to the solar system. You say there are, but that they can be arbitrarily chosen. If the former is true, then the solar system is an open system, where entropy is still a nondecreasing property. I wouldn't look at this claim from the perspective of a misapplication of one system to a general case scenario. Though I strongly believe that the second law must be operational universally.
In other words, I wouldn't expect the solar system (or similar systems) to be an exception to the 2nd law. Quite the contrary.

CC
lemit Posted November 21, 2009

I hope we don't get into the argument from that other thread, but wouldn't an object the size of, say, Earth possess enough potential energy to generate--even without an outside energy source--a number of fairly dynamic systems? (Nudge Wink)

And on a larger scale, might not the universe itself be, as it were, a throbbing mass of energy on a continuum between potential and kinetic states, so that the Big Bang could have resulted from an unsupportable accumulation of potential energy?

That is much too simple, much too easy. What am I missing? (Please limit your answer to 140 characters or less. It would be really nice if you could do it in the 17-syllable haiku format.)

Thanks.

--lemit
coldcreation (Author) Posted November 21, 2009

I hope we don't get into the argument from that other thread, but wouldn't an object the size of, say, Earth possess enough potential energy to generate--even without an outside energy source--a number of fairly dynamic systems? (Nudge Wink)

I think someone (perhaps CraigD or Freezy) mentioned the volcanic activity deep beneath the ocean surface (where tectonic activity transpires) that provides sufficient energy to sustain life, without an outside energy source like the sun. There are a number of dynamic systems down there. So, yes.

And on a larger scale, might not the universe itself be, as it were, a throbbing mass of energy on a continuum between potential and kinetic states, so that the Big Bang could have resulted from an unsupportable accumulation of potential energy?

Sounds good to me. It would have been the outcome of a type of phase transition that would have occurred after a particular threshold was reached in the potential energy. But I don't get the connection with entropy.

:eek_big: Where's ~modest when you need him?

CC