Science Forums


Posted

Does anyone have a succinct explanation for the difference between thermodynamic entropy and information entropy?

 

One uses joules per kelvin (J/K), the other dimensionless bits? Can't seem to get a good answer to this one from some sites. There seems to be some confusion in the, er, 'science' community. Biologists seem especially unsure about it.

 

A guy who claims to be a teacher told me "entropy is not = change". What?

Posted
A guy who claims to be a teacher told me "entropy is not = change". What?

Sure... However, it was more than one guy claiming to be a teacher who told you this. It was a graduate level physics professor, 2 people who work in nuclear plants, and pretty much every other member of the forum.

 

I'll give you the same definition I gave you on the other forum where you were just banned.

 

Just because you plug your ears and close your eyes does not mean it's not the accepted and correct definition.

 

 

Entropy is a thermodynamic quantity used to measure the disorder of a system.

 

define: entropy - Google Search

 

 

 

You said explicitly that entropy = change.

 

If this logic is accurate, it also means, explicitly, that change = entropy.

 

This is due to the understanding that equality is reflexive, symmetric, and transitive.

 

Equality implies equivalence.

 

 

To prove your assertion wrong, one simply must come up with an example of change that is not entropy, since your calling them equal implies their equivalence.

 

 

 

Anyone wish to offer any examples to WiggleTroll of a change that is not entropy?

 

 

Be sure to differentiate between closed systems and overall universe, because (troll that he is) WiggleTrip will play semantic games with you to pretend you're wrong even when your example is understood in context by 99% of all readers.

Posted
To prove your assertion wrong, one simply must come up with an example of change that is not entropy, since your calling them equal implies their equivalence.

Yes, folks. This is all you have to do.

equality is reflexive, symmetric, and transitive.

 

This would imply that your opinion of 'entropy', which simply cannot be based on any scientific notion, is equal to ignorance of what change actually is, or a measurement.

 

Begone! Your art is not the science of change.

Posted

Entropy is a measure of the randomness of a system; it is not equal to change,

and is given by Boltzmann's equation:

 

S = k log W

 

where S = the entropy of the system,

W = the number of microstates available to the system,

and k is Boltzmann's constant.
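
A quick worked illustration of that formula (added here for reference, not taken from the post), with the logarithm taken as natural: for N independent two-state units, say N fair coins, the number of microstates is W = 2^N, so

[math]S = k \ln W = N\,k \ln 2 \approx N \times 9.6 \times 10^{-24}\ \mathrm{J/K}[/math]

so each "bit" of missing information corresponds to about k ln 2 of thermodynamic entropy, which is one way to connect the dimensionless bits in the opening question to joules per kelvin.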

 

I hope this helps

 

Peace

:)

Posted

Entropy is disorder, while change can lead to either order or disorder. For example, if we change water from liquid to solid, there is a change that lowers the entropy relative to the liquid state.

 

Relative to communication and entropy, communication that increases entropy results in ideas or beliefs becoming more subject to disorder. The mind will attempt to seek order and will lower entropy based on our belief systems. This will cause the entropy to lower into something logical.

 

Conversation is based on entropy and the lowering of entropy. When entropy increases it will absorb energy. The lowering of entropy releases energy. This is the verbal output. This energy output causes the entropy to increase in the other person, since it provides an energy source for the entropy. As the idea lowers entropy and releases energy, this is more verbal output.

 

How much entropy is created, and how much energy is released to lower the entropy, will determine the heat of the conversation. If nobody is listening, such that there is little entropy change, the conversation is cool. If two people are listening and consider the other, the entropy increases and the energy release is higher, making the conversation warmer. If the two people are yelling at each other, such as in a political debate, both are trying to break the wall of the other, increasing their entropy. As the wall reassembles, the high entropy induction releases high energy for a heated debate.

 

An interesting form of communication entropy is romance. In this case, the information exchange is subject to the subjectivities of romance. The data output is not exactly objective but is molded by the infatuation. The result is a dual entropy exchange that is not able to reach a lowest energy state for very long. One's mind and heart are in the spin of tactics and infatuation. A good analogy is trying to form the perfect crystal of minimal entropy. But each time it crystallizes out there are defects due to subjectivity. The result is the need to reheat the crystal again and again, with heat and warmth being released as the crystal tries to reform into perfection.

Posted
Entropy is the measure of randomness of a system it is not equal to change

Another one; another 'scientist' who can only spell entropy, and give us Boltzmann's eqn. Get outta here, dude.

Depending how much entropy is created and how much energy is released

Created?? Created where? Where the Fk is it? Another person who thinks entropy goes backwards too??

Isn't entropy a measurement? If it is, why should anyone think that measuring it means it must exist (ontologically)? We measure all sorts of things that don't have a real existence, we do this to explain the world.

Weight, for example, is a projection our mind makes because of mass and gravity. We can do something with a bit of matter, but what can we do with a measurement except remember it?

 

Entropy seems to be a measurement that confuses all sorts of people. But it's 'just' a measure of mass dispersal, which we attribute to energy, which we otherwise call heat. It's also a measure of information flow. You know, information...

 

Can any person who thinks they're a scientist tell me what Shannon entropy is and why it looks like Boltzmann's version (like it's an equivalent or something)? How about von Neumann entropy? Or conditional or equivocal entropy?

Maybe we can discuss this thing.

Posted

When entropy increases it absorbs energy which is why an expanding gas will get cooler. The thermal energy goes into the entropy so there is less heat. When we reverse or lower entropy, heat or energy is given off. One only has to compress the same gas. If you look at the entropy in a crystal, these are the defects. These high entropy zones contain potential energy relative to the perfect parts of the crystal with no entropy defects.

 

Because of the energy connection to entropy, i.e., entropy increase absorbs energy, if we add energy, we can increase entropy in a system that will not change entropy on its own. If I heat ice and it turns into liquid, I have used energy to increase entropy. Without the heat, the entropy would have stayed the same. If we take away heat and the ice reforms, we take away energy such that the entropy decreases. By cooling a system one can lower the entropy in a system that has steady state entropy.
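
As a concrete number for the ice example (a standard textbook figure, not something from the post): melting ice at its melting point absorbs a latent heat of about 334 J per gram, so each gram gains roughly

[math]\Delta S = \frac{Q}{T} \approx \frac{334\ \mathrm{J}}{273\ \mathrm{K}} \approx 1.2\ \mathrm{J/K}[/math]

and the same amount is given back to the surroundings when that gram refreezes.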

 

Since entropy is disorder, entropy provides confusion in thinking. It can add defects, such as new data, that alter the way we see things. Since it contains potential energy, when order forms again and entropy begins to decrease, there will be an energy output, i.e., eureka! The emotions are a good barometer for the level of energy being released as the mind tries to make sense of things, restore a sense of order, and lower entropy.

 

Although entropy in communication means disorder, confusion and defects, this doesn't have to be bad. Sometimes new ideas create disorder, confusion and defects in older thinking. It sort of cracks, shatters or melts the old crystal by adding entropy. This type of entropy gives us another opportunity for a better crystallization, with a net lowering in entropy. Science sort of works this way, with each new discovery adding entropy, followed by an evolving understanding that turns the entropy into a state of order.

 

An interesting twist is order that appears to be entropy. For example, when Copernicus confirmed the earth was not at the center, this was order or natural truth that acted like entropy, with respect to the old way. This unique situation was not an entropy defect but was analogous to a seed crystal added to the melt, on which science was born.

Posted
Another one; another 'scientist' who can only spell entropy, and give us Boltzmann's eqn. Get outta here, dude.

 

Created?? Created where? Where the Fk is it? Another person who thinks entropy goes backwards too??

Isn't entropy a measurement? If it is, why should anyone think that measuring it means it must exist (ontologically)? We measure all sorts of things that don't have a real existence, we do this to explain the world.

Weight, for example, is a projection our mind makes because of mass and gravity. We can do something with a bit of matter, but what can we do with a measurement except remember it?

 

Entropy seems to be a measurement that confuses all sorts of people. But it's 'just' a measure of mass dispersal, which we attribute to energy, which we otherwise call heat. It's also a measure of information flow. You know, information...

 

Can any person who thinks they're a scientist tell me what Shannon entropy is and why it looks like Boltzmann's version (like it's an equivalent or something)? How about von Neumann entropy? Or conditional or equivocal entropy?

Maybe we can discuss this thing.

 

Troll.

Posted
Since it contains potential energy, when order forms again and entropy begins to decrease, these will be an energy output, i.e., eureka!! The emotions are a good barometer for the level of energy being released as the mind tries to make sense and restore a sense of order and lowers entropy.

Entropy does not decrease. Sorry, but you would not get marks for a reply like that in a Physics exam.

You're talking about a reversible process.

How about von Neumann entropy? Or conditional or equivocal entropy?

Maybe we can discuss this thing.

Or maybe not.

Posted
Entropy does not decrease. Sorry, but you would not get marks for a reply like that in a Physics exam.

 

Entropy for a small system can both increase and decrease. UNIVERSAL entropy cannot decrease. There is a difference.

 

Entropy has nothing to do with change, but rather is a specific measure of the randomness of a system. It is NOT a measurement.

 

Von Neumann entropy is a specific type of entropy for quantum mechanical systems; it is essentially the straightforward application of Boltzmann's ideas to quantum systems using density matrix approaches.

 

Shannon entropy stems from a realization about entropy that can best be understood by looking at Maxwell's demon, i.e. that information itself carries entropy (if information were completely ordered, it would be easy to compress data; not so: the denser the data, the more random the storage device). In proper units (Boltzmann's constant = 1, i.e. we measure temperature in units of energy), it is a straightforward extension of Boltzmann entropy.
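
A tiny sketch of that correspondence (my own illustration; the function name and the numbers are not from the post), computing Shannon entropy in bits and converting it to conventional thermodynamic units at k_B ln 2 per bit:

[code]
import math

K_B = 1.380649e-23  # Boltzmann's constant, J/K

def shannon_entropy_bits(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is one bit of missing information.
h = shannon_entropy_bits([0.5, 0.5])          # 1.0 bit

# With k_B = 1 and natural logs, that bit is ln 2 "nats";
# in conventional units it corresponds to k_B * ln 2 of entropy.
s_joules_per_kelvin = h * K_B * math.log(2)   # ~9.57e-24 J/K

print(h, s_joules_per_kelvin)
[/code]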

 

I really don't understand if you've asked a question here; you have made a few statements and denigrated people who try to answer. What's the point?

-Will

Posted
Entropy has nothing to do with change

Seriously? Entropy isn't a number we use to measure change?

My opinion of that statement is: no marks for you.

I really don't understand if you've asked a question here- you have made a few statements and denigrated people who try to answer

Like I'm denigrating you, perhaps, for saying something like:

..a realization about entropy that can best be understood by looking at Maxwell's demon, i.e. that information itself carries entropy (if information were completely ordered, it would be easy to compress data; not so: the denser the data, the more random the storage device). In proper units (Boltzmann's constant = 1, i.e. we measure temperature in units of energy), it is a straightforward extension of Boltzmann entropy.

I know information "has" entropy -at least it's called Shannon entropy.

 

What does heat have to do with information, or is there no relation, no symmetry? Information is something static, like DNA is?

 

Actually Shannon, von Neumann, and others who have looked at the info-theoretic model, call it "conditional entropy", and talk about uncertainty and expectation. Applying thermodynamic principles doesn't work with bits, but there's an obvious symmetry.

Posted
Seriously? Entropy isn't a number we use to measure change?

My opinion of that statement is: no marks for you.

 

I don't understand where you could have gotten the idea that entropy measures change- it doesn't. How are you defining "change?" How are you quantifying "change?" Given that you ASKED the question, and I am answering, how are you in a position to give "marks?" We can talk about the change in entropy, just as we can talk about the change in position, temperature, color, etc. Does that mean position measures change? Color measures change? etc.

 

What does heat have to do with information, or is there no relation, no symmetry? Information is something static, like DNA is?

 

Thermodynamic entropy can be given a precise meaning by building it up from statistical mechanics. By doing this, and looking at Maxwell's demon we can show that information MUST carry "thermodynamic" entropy. See Charlie Bennett's 1982 paper on Maxwell's Demon.

 

Actually Shannon, von Neumann, and others who have looked at the info-theoretic model, call it "conditional entropy", and talk about uncertainty and expectation. Applying thermodynamic principles doesn't work with bits, but there's an obvious symmetry.

 

Conditional entropy is a much more specific thing; it's the entropy of a subsystem when you know something about the rest of the system.
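
For reference (the standard definition, not quoted from the post): for random variables X and Y,

[math]H(X \mid Y) = -\sum_{x,y} p(x,y)\,\log p(x \mid y)[/math]

which equals H(X) when X and Y are independent, and drops to zero when Y completely determines X.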

 

Uncertainty and expectation obviously come into play because entropy is a statistical quantity (it does NOT measure change). And, by following the line of reasoning above, you can show not only that you CAN apply thermodynamic quantities to bits, but that it's the only way to save the second law.

-Will

Posted
just as we can talk about the change in position, temperature, color, etc. Does that mean position measures change? Color measures change? etc.

Are you saying a change in distance (that we notice) isn't a measure of change?

If we see the colour of something change, what are we measuring? Something changing position -a bird flying-, what are we looking at when a bird does this?

This is where I got the idea of change and measurement.

...information MUST carry "thermodynamic" entropy.

Carry it? No, I think you might mean there is a thermodynamic equivalent. As in, to turn a 'message' into something useful, we abstract it, or remember it.

This abstraction (a process) requires thermodynamic energy. The information, in that sense, is projected by this process into a more stable (content-wise) form, which has lower (informational) entropy. The reduction in the entropy or uncertainty, is proportional to the thermodynamic entropy, or change, in energy expended by the abstraction 'process'.

I still can't quite see how you get to "entropy doesn't measure change".

Is entropy static? Do we only see a "snapshot" or something?

 

If I open a bottle of compressed gas, it will all come out. What difference will there be between the entropy before and after the bottle "empties"? Isn't it a measure of two different states of some system? This is what I learned, and in IT, communications looked at this idea (informational entropy), but it really isn't the same thing; it should be called uncertainty (or certainty), which is also something that changes.
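
For that bottle example, a standard result (not something stated in the thread): for an ideal gas expanding freely and isothermally from volume V1 to V2, the entropy difference between the two states is

[math]\Delta S = nR\,\ln\frac{V_2}{V_1}[/math]

so one mole doubling its volume gains about 8.314 × ln 2 ≈ 5.8 J/K; the "before" and "after" states differ by a definite, positive amount.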

Posted
Another one; another 'scientist' who can only spell entropy, and give us Boltzmann's eqn. Get outta here, dude.

 


 

 

I'm an engineer, not a scientist.

 

but just trying to help

 

Peace

:phones:

Posted
Seriously? Entropy isn't a number we use to measure change?

My opinion of that statement is: no marks for you.

No, you've got it backwards. No marks for you. :naughty: Quit being presumptuous.

 

In thermodynamics, entropy is a function of state. How does this match up with your opinion?

 

Are you saying a change in distance (that we notice) isn't a measure of change?
Of course it is, just as a change in entropy is a measure of change. This doesn't mean entropy is a measure of change, any more than distance is a measure of change.

 

Carry it? No, I think you might mean there is a thermodynamic equivalent.
He put it loosely, but you aren't doing much better by nitpicking this way. The link is statistical mechanics: it can be shown that [imath]\frac{\Delta Q}{T}[/imath] is equal to the variation of entropy according to the definition in terms of information. What's the need for such a fuss?
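
Spelling that link out a little (a standard statistical-mechanics statement, not the poster's own words): with the Gibbs/Shannon form of the entropy

[math]S = -k_B \sum_i p_i \ln p_i[/math]

evaluated over the microstate probabilities of the equilibrium ensemble, a reversible transfer of heat [imath]\Delta Q[/imath] at temperature T changes S by exactly [imath]\frac{\Delta Q}{T}[/imath], which is the classical thermodynamic definition of the entropy change.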
Posted

"Entropy isn't a number we use to measure change?"

Of course it is, just as a change in entropy is a measure of change. This doesn't mean entropy is a measure of change, any more than distance is a measure of change.
What is the change? What is it that is measured, or how is a change in entropy discriminated, if a "snapshot" of entropy isn't a measure of that change? A measurement isn't static; it takes time and effort (there is entropy involved in discriminating anything).

 

You seem to be saying it is a measure used to determine change, but this doesn't mean it's a measure of change...? Do you mean the different states -observed states- aren't a change in themselves, it's the observed difference between the states that's = the change?

 

Are you implying that a measurement is a fixed "snapshot"? A few I have tried to discuss this subject with seem to have this view, like the (completely illogical) view that there's a difference between an experiment and an observation; they're different words, but any observation is an experiment, and any scientific experiment (a method applied to make some expected observation) involves observation. I can't imagine many scientists doing experiments, and setting up equipment, with blindfolds on and their ears plugged tight.

How are you defining "change?" How are you quantifying "change?"

How would you describe change conceptually (maybe we can start with that)?

What measurements do we make, in order to discriminate any state (at some time) from any other (at some other time)?

Are distance, position, entropy, weight, the "state" of a system, all just measurements, so can't be said to be a change in themselves, just representative, or like pebbles we drop as we walk along, to mark or observe a previous position, and calculate how far we are from any of the previous places we've been? These pebbles don't change, as such, is that what you mean?

Also, are you saying entropy is a noun (= state of a system)?

Isn't it also a verb (= change of state of a system)?

 

P.S. Once more: thermodynamic units cannot be applied to dimensionless bits.

Posted

In chemical thermodynamics entropy is half of the free energy equation, with enthalpy the other half. Enthalpy is connected to the lowering of energy due to the forces of nature, particularly the EM force. For example, if we start with a cloud of hydrogen in space, the entropy is at a max. If the entropy continued to increase, it would disperse into space. But if gravity turns this into a star, it will cause the entropy to decrease, because of the enthalpy effect due to gravity. Entropy will not spontaneously decrease, but it can be manhandled by enthalpy, if this lowers the free energy.
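
The equation being alluded to is the Gibbs free energy relation, written out here for reference:

[math]\Delta G = \Delta H - T\,\Delta S[/math]

A process is spontaneous at constant temperature and pressure when ΔG < 0, so a local decrease in entropy (ΔS < 0) can still occur spontaneously if the enthalpy change is negative enough, with the excess entropy exported to the surroundings as heat.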

 

The idea that the entropy of the universe is increasing does not correlate to basic observations. There shouldn't be any stars or galaxies, since these have created order and have caused entropy to decrease. A black hole essentially eliminates entropy, except for a token amount.

 

Maybe the problem I am having is that the term entropy may have been redefined away from the original thermodynamics of free energy. Relative to communication and entropy, the topic confusion is adding entropy to the discussion, but everyone is releasing the free energy. This may be a type of psychology experiment, with intentional communication entropy.

Guest
This topic is now closed to further replies.