
Posted

I don’t think the 13678 thread made much progress on its original question:

Does anyone have a succinct explanation for the difference between thermodynamic entropy, and information entropy?
Other than demonstrating the already well-known phenomena of bickering, ad hominem attacks, and gross misuse of common terms in a web forum ;( that thread showed me that while many of us hypographers have an intuitive grasp of information and thermodynamic entropy, and a vague sense of how they’re connected, these concepts are far from well understood in detail. :confused:

 

I thought it would be helpful to start by calculating the information (Shannon) entropy of some intuitively obvious systems (that is, discrete random variables), such as a roulette wheel and some variations on it.

 

The entropy of a discrete random variable X with possible values [math]\{ x_1 ... x_n \}[/math] is [math]H(X) = - \sum_{i=1}^np(x_i)\log_2 p(x_i)[/math]

 

For a fair roulette wheel with the usual 38 pockets, [math]p(x_1) = \dots = p(x_{38}) = \frac{1}{38}[/math]

So [math]H(X_{\mbox{roulette}})= -38 \left( \frac{1}{38} \log_2 \frac{1}{38} \right) = \log_2 38 \dot= \, 5.2479275 \, \mbox{bits}[/math]
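For anyone who wants to check these numbers, here is a minimal Python sketch of the formula above (the helper name shannon_entropy is my own choice):

[code]
from math import log2

def shannon_entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution given as a list of probabilities."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Fair 38-pocket roulette wheel: every pocket has probability 1/38.
fair_wheel = [1 / 38] * 38
print(shannon_entropy(fair_wheel))  # ~5.2479275 bits, i.e. log2(38)
[/code]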

 

Doubling the number of pockets increases the entropy by 1.

Halving it decreases it by 1.

Altering the number of pockets to an integer power of 2 gives an integer entropy, e.g. [math]H(X_{\mbox{roulette with 32 pockets}}) = 5[/math]
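These three facts follow directly from the formula, since a fair wheel with n pockets has [math]H = \log_2 n[/math]; a quick standalone check:

[code]
from math import log2

# A fair n-pocket wheel has H = -n * (1/n) * log2(1/n) = log2(n),
# so doubling n adds exactly 1 bit, halving removes 1, and powers of 2 give integers.
for n in (19, 38, 76, 32):
    print(n, log2(n))  # ~4.2479, ~5.2479, ~6.2479, and exactly 5.0 bits
[/code]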

 

Making the wheel unfair lowers its entropy (a verification sketch follows this list) …

Having 1 cup that is 2 times as likely to get the ball reduces the entropy by about 0.0138073

Having 1 cup that is .5 times as likely reduces it by 0.0057755

Having 1 cup 6 times as likely to get the ball reduces it by 0.1823552

Having 5 cups that are 2 times as likely to get the ball as any of the remaining 33 cups reduces it by 0.0542209

Having 1 cup that is 100 times as likely reduces it by 2.9994255

 

Having the ball always land in the same cup – effectively, having a wheel with only one cup – reduces the entropy to 0.
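To reproduce the unfair-wheel figures, I assumed a biased cup simply gets the stated relative weight while the other cups keep weight 1, with all 38 weights renormalized so the probabilities sum to 1 (that is my reading of the setup; biased_wheel is a name of my own):

[code]
from math import log2

def shannon_entropy(probs):
    # Same helper as in the earlier sketch: Shannon entropy in bits.
    return -sum(p * log2(p) for p in probs if p > 0)

def biased_wheel(n_pockets=38, weights=None):
    """Pocket probabilities when some pockets carry a relative weight other than 1,
    renormalized so the probabilities sum to 1."""
    weights = weights or {}
    w = [weights.get(i, 1.0) for i in range(n_pockets)]
    total = sum(w)
    return [x / total for x in w]

fair_h = shannon_entropy([1 / 38] * 38)
cases = {
    "one cup 2x":   {0: 2},                    # reduction ~0.0138073
    "one cup 0.5x": {0: 0.5},                  # ~0.0057755
    "one cup 6x":   {0: 6},                    # ~0.1823552
    "five cups 2x": {i: 2 for i in range(5)},  # ~0.0542209
    "one cup 100x": {0: 100},                  # ~2.9994255
}
for label, wts in cases.items():
    print(label, fair_h - shannon_entropy(biased_wheel(weights=wts)))
[/code]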

 

Adding multiple balls to the wheel (and allowing a pocket to catch any number of them) greatly increases the wheel’s entropy (see the sketch below)…

For 2 balls, entropy increases by 4.2742433 to 9.5221708.

For 5 balls, it’s 19.5930529
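The multi-ball numbers come out if the balls are treated as indistinguishable and each lands independently and uniformly, so an outcome is the multiset of pockets holding balls (that assumption is mine; it isn’t spelled out above). A sketch:

[code]
from collections import Counter
from itertools import combinations_with_replacement
from math import factorial, log2

def multi_ball_entropy(n_pockets=38, n_balls=2):
    """Entropy, in bits, of the unordered outcome when n_balls land independently
    and uniformly among n_pockets, treating the balls as indistinguishable."""
    total_sequences = n_pockets ** n_balls
    h = 0.0
    for outcome in combinations_with_replacement(range(n_pockets), n_balls):
        # Number of ordered sequences that collapse to this multiset (a multinomial coefficient).
        orderings = factorial(n_balls)
        for count in Counter(outcome).values():
            orderings //= factorial(count)
        p = orderings / total_sequences
        h -= p * log2(p)
    return h

print(multi_ball_entropy(n_balls=2))  # ~9.5221708 bits
print(multi_ball_entropy(n_balls=5))  # ~19.5930529 bits (850,668 multisets, so it takes a moment)
[/code]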

 

The above math is pretty simple, though for some of the variations, it requires either mild ingenuity or inhumanly brutal computational power. Before trying to extend this experience into a comparison with thermodynamic entropy, does anyone see any flaw, or have questions or comments, about this example of calculating informational entropy?

Posted

I think there is a LaTeX syntax error on this page, Craig.

 

Interesting though, I must admit I don't really have a handle on 'information entropy'.

 

Entropy itself is quite a tricky concept.

 

But I wasn't bickering.

 

But thanks for the math and explanation of it, interesting.

Posted

Let me give it a try :rolleyes: Thermodynamic entropy is a measure of disorder in physical systems, say a gas in a vessel made up of billions of atoms/molecules. Information entropy, on the other hand, is the uncertainty in real-life situations where the number of options is much more limited, like the roulette wheel you have used as an example.

 

They appear to be complementary to each other! ;)

Posted
Interesting though, I must admit I don't really have a handle on 'information entropy'.

 

Entropy itself is quite a tricky concept.

 

In general, entropy is a measure of the disorder or randomness in a system. In thermodynamics it is usually a measure of disorder: the higher the disorder, the higher the entropy. With information it is a measure of randomness or uncertainty: the more random, the higher the entropy.

Posted
In general, entropy is a measure of the disorder or randomness in a system. In thermodynamics it is usually a measure of disorder: the higher the disorder, the higher the entropy. With information it is a measure of randomness or uncertainty: the more random, the higher the entropy.

 

 

Yes, thanks, I understand the basic concepts; for example, if you sent a message in a stream of data bits, the entropy involved would be the uncertainty of your success in completely transmitting the stream.

 

And I know that, in general, entropy is a measure of disorder.

But thanks anyway.

 

I'm just not so sure about some of the finer points and philosophical questions that arise. I have read Penrose and his ideas about entropy, including his view that entropy is not fundamental, and I know Boltzmann's equation from my engineering background, but thanks for your reply.

Posted
They appear to be complementary to each other! :)
I would say more like this: thermodynamic entropy is just one example of entropy. It's a matter of how many microstates belong to the same macrostate.
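To put that in formulas: a macrostate compatible with [math]W[/math] equally likely microstates has Boltzmann entropy [math]S = k_B \ln W[/math], which is just the Shannon entropy of the uniform distribution, [math]H = \log_2 W[/math] bits, rescaled by a factor of [math]k_B \ln 2[/math] per bit.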
Posted
I would say more like this: thermodynamic entropy is just one example of entropy. It's a matter of how many microstates belong to the same macrostate.

 

Interesting, Qfwfq. Could you define what you mean by microstates? Obviously you mean the small-scale states, but is there a mathematical explanation of what you just said? A pointer to further reading would also be helpful.

 

Thanks

 

Peace

:)

Posted

I would say more like this: thermodynamic entropy is just one example of entropy. It's a matter of how many microstates belong to the same macrostate.

I agree with your take on entropy, Qfwfq. Stephen Hawking, in his book The Universe in a Nutshell (2001, p. 63), describes the entropy of a black hole as:

 

The entropy is a measure of the number of internal states (ways it could be configured on the inside) that the black hole could have without looking any different to an outside observer, who can only observe its mass, rotation, and charge.

 

My imagination has always urged me (unwisely) to translate this into personal terms, such as: "My personal entropy is a measure of the number of internal states I have without looking any different to an outside observer." This is true because I've chosen to be a black hole to those who try to observe me, knowing that all they can ever see is my "mass, rotation, and charge" (translation unspecified). Thus, taking this definition of entropy from a thermodynamic context and placing it in a psychodynamic one, or a genodynamic one, or a chemodynamic one, my personal entropy approximates my personal complexity.

 

Or not.

 

—Larv

Posted

Rather than drag this too far off what I think the topic is (computing entropy in the context of information), I created a thread to discuss the direct correspondence between information and thermodynamic entropy. That thread is here.

-Will
