wigglieverse Posted December 18, 2007 Author Report Posted December 18, 2007

Entropy is a measure. Whether it's seen as a fixed or static metric, or as the difference between two or more such "static" observations, it still measures something. Time is something that also measures change (but of course, we do this measuring), because we know how to arrange things, or observe them, that behave regularly. Pendulums have a regular "behaviour", and can be used to "mark" the "passage" of time, but this is only possible because we "remember" past behaviour. Memories are like the pebbles I used before to illustrate distance.

I call that something change. Whether the change is in heat energy, or in certainty or expectation. It's a measure of change, and the expansion of the universe is change, therefore the universe has entropy, because entropy is (a measure of) change. Distance (spatiality) is a "result" of the change entropy "causes". Pretty straightforward. It's also language, and this is how English describes both change, and how it gets measured or observed.

The notion of equilibrium and stasis is nice to play around with, and we can imagine a stationary state (for some system), but this imagining isn't static (our brains are changing constantly), and there is no such thing as a static anything in the universe. We only conceive of an instantaneous (zero-time) observation, which is never actually a possibility.

So who's got the right idea here? Can anyone point out the bleedingly obvious misconception?

A black hole essentially eliminates entropy, except for a token amount. A black hole is a thing that has time and distance "swapped"; the surface area of a black hole is equal to the entropy of a black hole.
snoopy Posted December 18, 2007 Report Posted December 18, 2007

So who's got the right idea here? Can anyone point out the bleedingly obvious misconception? A black hole is a thing that has time and distance "swapped"; the surface area of a black hole is equal to the entropy of a black hole.

As far as I know the entropy of a black hole is expressed as

[math]S = \frac{A k c^3}{4 h G}[/math]

where A is the area of the black hole, k is Boltzmann's constant, c is the speed of light, h is Planck's constant and G is the gravitational constant.

So I don't know why you think the area of a black hole is equal to its entropy. Unless you are talking about Hawking Radiation, where the decreasing area of the black hole is equal to its entropy?? Clarification needed I think... hmm oh well

Peace:)
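[Editor's note: snoopy's formula is easy to check numerically. Below is a minimal sketch in Python, using SI constants from scipy.constants; note that the standard Bekenstein-Hawking result uses the reduced Planck constant ħ rather than h, so the sketch uses hbar. The solar-mass value is illustrative.]

[code]
# Bekenstein-Hawking entropy of a Schwarzschild black hole.
# Note: the textbook form uses the reduced Planck constant hbar,
# not h as written in the post above.
from scipy.constants import c, G, k, hbar, pi

M_sun = 1.989e30  # one solar mass, kg (illustrative input)

def schwarzschild_area(M):
    """Horizon area A = 4*pi*r_s^2, with r_s = 2GM/c^2."""
    r_s = 2 * G * M / c**2
    return 4 * pi * r_s**2

def bh_entropy(M):
    """S = k c^3 A / (4 hbar G), in J/K."""
    A = schwarzschild_area(M)
    return k * c**3 * A / (4 * hbar * G)

S = bh_entropy(M_sun)
print(f"Horizon area: {schwarzschild_area(M_sun):.3e} m^2")
print(f"Entropy: {S:.3e} J/K (~{S / k:.3e} in units of k)")
[/code]

[For one solar mass this gives a horizon area of roughly 1.1×10^8 m² and an entropy of order 10^54 J/K, vastly larger than the thermodynamic entropy of a star of the same mass; that disparity is behind Erasmus00's remark below that black holes have maximal entropy.]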
Erasmus00 Posted December 18, 2007 Report Posted December 18, 2007

Entropy is a measure.

While entropy is quantitative, it is NOT something you can measure directly. You are confusing "entropy" with "CHANGE in entropy," which is as elementary a mistake as can be made. Take any quantity, X, and stick it into the phrase "change in X"; this is now a measure of change. It's true regardless of what X is. HOWEVER, X itself can be fundamentally unrelated to change.

Whether it's seen as a fixed, or static metric, or as the difference between two or more such "static" observations, it still measures something.

Measuring SOMETHING and measuring CHANGE specifically are not the same thing.

entropy is (a measure of) change.

No, it categorically is not. To illustrate this point: two substances, one has an entropy of 5, one of 7×10^5. One of these two is about to undergo a phase transition (drastic change). Which? (Hint: there is no way to answer this question.)

Distance (spatiality) is a "result" of the change entropy "causes".

Entropy is statistical; as such, to talk of entropy "causing" things is, simply, an improper use of the concept.

A black hole essentially eliminates entropy, except for a token amount.

This isn't true: black holes have maximal entropy!

-Will
wigglieverse Posted December 18, 2007 Author Report Posted December 18, 2007

entropy is quantitative...

Yes, it's measured as the state of something, heat content, or flow from high to low temperature (another number we 'make up').

...it is NOT something you can measure directly.

How is it measured then, if it can't be done "directly"? How do scientists achieve this?

Measuring SOMETHING...

How do you, specifically, think you (an observer) are able to observe anything? Are you saying measurement itself can be instantaneous, performed with zero effort and 'done' in zero time?

...and measuring CHANGE specifically are not the same thing.

A measurement can be something that doesn't measure change? I don't agree.

Entropy is statistical

Right. How is this determined? There's a difference between taking two measurements (now you have two records, or two numbers) and taking the difference between two measurements. I understand this fairly basic concept. So it's wrong, obviously, to say that the distance between two pegs in the ground is equal to the two pegs. I think it's ok to say "entropy is change", the way distance is the "change" between the pegs. It's a linguistic shortcut, or an example of using the action (calculating a difference) as a noun. English does this all the time.

Sticking to an absolute definition of entropy as "the difference between two (static or fixed) observations" is being precious, isn't it? I know exactly what I mean when I say "change and entropy are the same thing". Of course observations are needed to actually calculate it; this is implicit in anything (change) we observe. But it's all change, isn't it? A measurement, or observation, or a calculation, are all the same kind of thing: a "mental" process.

P.S. If there's a system that is "about to" undergo a change in entropy, how would anyone know this unless they were observing it? We can't know anything about any system unless we observe it.
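[Editor's note: on the "how is it measured then" question above: in practice, absolute entropies are obtained indirectly by calorimetry, integrating the measured heat capacity over temperature, with the third law fixing S = 0 at absolute zero. A minimal sketch follows; the heat-capacity function in it is invented purely for illustration, not real data.]

[code]
# Calorimetric entropy: S(T2) - S(T1) = integral of Cp(T)/T dT.
# The heat capacity below is hypothetical, for illustration only.
import numpy as np

T = np.linspace(100.0, 300.0, 201)   # temperature grid, K
Cp = 20.0 + 0.05 * T                 # hypothetical Cp(T), J/(mol K)

integrand = Cp / T
# Trapezoidal rule, written out so it needs nothing beyond numpy.
delta_S = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(T))
print(f"Entropy change, 100 K to 300 K: {delta_S:.2f} J/(mol K)")
[/code]

[For this made-up Cp the analytic answer is 20·ln(3) + 0.05·200 ≈ 31.97 J/(mol K), which the numerical sum reproduces. The point is that entropy is inferred from measured heat flows, never read off an instrument directly.]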
wigglieverse Posted December 18, 2007 Author Report Posted December 18, 2007

Yes, thanks for that. A link to another example of people discussing what they think they know, but who seem incapable of saying much at all about "what they know", or even how they observe anything. Just my 50c worth.

P.S. What does 'deja-vu' have to do with entropy?

P.P.S. Here's what an MIT prof. reckons:

We define an irreversible process as one that cannot be reversed without some change to the surroundings (typically, work going to heat). Reversible processes are useful idealizations and we use them for comparison to measure how well we are doing with real (irreversible) processes. When we measure the entropy changes of both the system and the surroundings, the sum tells us how much irreversibility we have (how far from ideal we are)... An "entropy type" question would either be to describe what entropy is and why it is useful to us, or to calculate the entropy change of a system during a process... Entropy changes are related to how much work we can get out of a system (an energy transfer across the system boundary) compared to the maximum possible work for an ideal process... All real processes produce entropy changes (when summed up for the system and the surroundings)... Conservation of energy is not a sufficient condition for a process to be reversible. It is rather the other way around. All real processes are irreversible. In our world, physical processes like friction etc. lead to additional work being required to return systems to their initial state.
--ocw.mit.edu/ans7870/16/16.unified/thermoF03/mud/T12mud03.html

Sure...

However, it was more than one guy claiming to be a teacher who told you this. It was a graduate level physics professor, 2 people who work in nuclear plants, and pretty much every other member of the forum... Just because you plug your ears and close your eyes does not mean it's not the accepted and correct definition. Entropy is a thermodynamic quantity used to measure the disorder of a system. Entropy is disorder.

Complete BS; entropy is not disorder, so sorry. You might want (or not) to give this another try. Or are you busy looking for an example of change that isn't = entropy? Remember, you claimed that this was all that was needed to show everyone how wrong I am.
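[Editor's note: the MIT quote lends itself to a worked example. For heat flowing spontaneously from a hot reservoir to a cold one, the entropy changes of the two reservoirs sum to something positive, and that surplus is exactly the irreversibility the professor describes. A minimal sketch with illustrative numbers:]

[code]
# Entropy bookkeeping for irreversible heat flow between two reservoirs,
# illustrating the MIT quote: the sum of the system and surroundings
# entropy changes measures how irreversible the process is.
Q = 1000.0       # heat transferred, J (illustrative value)
T_hot = 400.0    # hot reservoir temperature, K
T_cold = 300.0   # cold reservoir temperature, K

dS_hot = -Q / T_hot    # hot reservoir loses entropy
dS_cold = Q / T_cold   # cold reservoir gains more than the hot one lost
dS_total = dS_hot + dS_cold

print(f"dS_hot = {dS_hot:.3f} J/K, dS_cold = {dS_cold:.3f} J/K")
print(f"dS_total = {dS_total:.3f} J/K (> 0: the process is irreversible)")
[/code]

[Here dS_total ≈ +0.833 J/K. It would be zero only in the reversible limit T_hot → T_cold, which is the "ideal process" the quote compares real processes against.]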
C1ay Posted December 18, 2007 Report Posted December 18, 2007

A link to another example of people discussing what they think they know; but who seem incapable of saying much at all about "what they know", or even how they observe anything.

Actually most of them seem to have a good handle on it except that Fred56 fellow (or gal). He/she even wanted to argue about the most widely used definition here. His/her antics reminded me of the direction of this thread, i.e. the deja-vu.
wigglieverse Posted December 18, 2007 Author Report Posted December 18, 2007

Actually most of them seem to have a good handle on it

I disagree. You saying this doesn't make it so. My opinion is that all of the posters who responded have only vague ideas of the subject. I have yet to see anything (including from yourself) in this thread that demonstrates I'm wrong. How about you? What sort of handle do you 'have on' it? I bet it doesn't concur with the definition at the end of that link you posted. I tend to ignore such things; if you can't be bothered to post your thoughts or ideas (i.e. what you think you understand), I can't be bothered either. I've probably read it before anyway.

So where are all the refutations of what's in the extract from ocw.mit.edu? Surely you can come up with something? Where's the example of 'change' that isn't entropy? Where are all the examples of entropy that aren't a measure of change? Answer: there are none, you made it up. Surely you're not going to say nothing and let me have the last word?

Entropy is (a measure of) change. Ipso facto. Entropy is not (a measure of) disorder or randomness.

Of course, you could perform your own antics: ignore this "discussion", and do nothing about what appears (to me at least) to be a collective misconception of what an observation or a measurement is, especially of that entropy stuff. Or you could be a big grown-up person and "ban" me. I'm sure you'll manage.

No responses? Maybe it's got something to do with this idea: "Something about doing the same thing and expecting different results?"

P.S. Looks like someone thinks they posted a response after this effort; doesn't seem to have much to say about entropy. Maybe he doesn't know what it is, so is unable to say anything relevant, just a lot of ranting. Now that's an accusation. Bit of a graffitist though. Posting opinion, how unscientific can you get? Still can't find an example of something that changes, but "isn't entropy", huh?

P.P.S. Your eventual logical collapse will be like a balloon deflating, but that will be the only connection to any science, Mr. I know it all. (I could have posted that in Latin, but I thought I'd make it easy for you.)
InfiniteNow Posted December 18, 2007 Report Posted December 18, 2007 Or you could be a big grown-up person and "ban" me. I'm sure you'll manage. Your eventual ban will have nothing to do with support of your claims... or, the lack thereof. :phones:
snoopy Posted December 18, 2007 Report Posted December 18, 2007

Complete BS; entropy is not disorder, so sorry. You might want (or not) to give this another try. Or are you busy looking for an example of change that isn't = entropy? Remember, you claimed that this was all that was needed to show everyone how wrong I am.

Entropy is disorder... Entropy changes are related to how much work we can get out of a system; this is true. Entropy is a measure of the unavailability of a system's energy to do work; this is also true. Entropy is a measure of how much randomness there is in any given system; this is also true.

What is not true is that entropy is equal to change. For what then are changes in entropy? What is the change in change? It's like saying there is space in space... It's bleedin obvious. What is useful in the real world is changes in entropy, not the entropy itself per se. Ice melting in a glass is the classic example of changes in entropy, which you are confusing with entropy itself, which is a measure of disorder given by

[math]S = k \log V[/math]

But you will probably tell me to get outta here or something charming like that so... Here I go again....

Peace:phones:
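[Editor's note: snoopy's ice-in-a-glass example is easy to put numbers on. Melting happens at constant temperature, so the entropy change is simply the heat absorbed over the melting temperature, dS = Q/T. A minimal sketch with standard textbook values:]

[code]
# Classic example from the post above: entropy change when ice melts.
# Melting is isothermal, so dS = Q / T with Q = m * L_f.
m = 1.0          # mass of ice, kg (illustrative)
L_f = 3.34e5     # latent heat of fusion of water, J/kg
T_melt = 273.15  # melting point, K

dS = m * L_f / T_melt
print(f"Entropy gained by {m} kg of melting ice: {dS:.1f} J/K")
[/code]

[This gives about 1223 J/K. It is a change in entropy computed from two states (ice, then water), which is snoopy's point: the useful, measurable quantity is the difference, not a direct reading of S itself.]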
snoopy Posted December 18, 2007 Report Posted December 18, 2007 [math]S= \frac{Akc^3}{4 hG}[/math] entropy of a black hole. Yay at last I got it to work !!
C1ay Posted December 18, 2007 Report Posted December 18, 2007

Entropy is (a measure of) change. Ipso facto. Entropy is not (a measure of) disorder or randomness.

Your claim, your burden to prove it. I'm not going to post the hundreds of available dictionary entries that refute your claim.

Of course, you could perform your own antics: ignore this "discussion", and do nothing about what appears (to me at least) to be a collective misconception of what an observation or a measurement is, especially of that entropy stuff. Or you could be a big grown-up person and "ban" me. I'm sure you'll manage.

You'll manage well enough. The infraction system will be rid of you soon enough. Until then I'd rather watch you demonstrate your misunderstanding of science. Maybe the criticism from your peers will help you learn something.
CraigD Posted December 19, 2007 Report Posted December 19, 2007 To explore the idea of information (Shannon) entropy, minus the insults and other unpleasantness into which this thread descended, I’ve started the thread 13716.
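[Editor's note: for readers following CraigD into the Shannon-entropy thread, the information-theoretic quantity is H = -Σ p·log₂(p) over a probability distribution, the formal cousin of the thermodynamic entropy debated above. A minimal sketch:]

[code]
# Shannon entropy H = -sum(p * log2(p)) of a discrete distribution,
# the information-theoretic counterpart of thermodynamic entropy.
from math import log2

def shannon_entropy(probs):
    """Entropy in bits of a probability distribution (zero terms skipped)."""
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(shannon_entropy([0.9, 0.1]))    # biased coin: ~0.469 bits
print(shannon_entropy([0.25] * 4))    # uniform over 4 outcomes: 2.0 bits
[/code]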