

Posted
Even if we draw 1 and 2, the chance that the last number in the set will be 4,5 or 6 is higher (75%) than that the number will be a 3.
It’s true that the probability of a random set selected from {1,2,3,4,5,6} being {1,2,4} or {1,2,5} or {1,2,6} is three times greater than the probability of it being {1,2,3}. However, it’s also true that the probability of it being {1,2,3} or {1,2,4} or {1,2,5} is three times greater than the probability of it being {1,2,6}. The probability of it being one of any collection of three picks is three times greater than the probability of it being any single pick. No pattern of the members of the picked set has any significance.

 

Lawcat, though you’ve not directly answered my question,

For the sake of clarity, Lawcat, are you claiming that the probability of drawing the sequence 1-2-3-4-5-6 from a collection of 48 balls labeled with the numbers 1-48 is lower than the probability of drawing the sequence 9-14-17-26-41-43? I’m not asking for a defense of your claim, only a yes/no response to the preceding question.
, you appear to be persisting in believing that some pick from a uniquely labeled set has a different probability than some other pick with the same number of members. This simply isn’t true.

 

If none of the explanations presented in this thread are convincing to you, try examining the question experimentally: take 6 cards, bingo balls, etc., and randomly draw 3 of them many times, tallying how often a particular set like {1,2,3} occurs vs. another like {2,4,6}. You’ll find no difference in their observed occurrence, and thus none in their actual probability.

 

Since actually shuffling and drawing balls or cards many times is time-consuming and physically wearisome, it’s convenient to simulate it using a computer. Here are the quick results of 1 million trials of drawing 3 from {1,2,3,4,5,6}, sorted from most to least frequent occurrence:

1,4,6  .050395
1,4,5  .050328
4,5,6  .050225
2,5,6  .0502
1,3,6  .050165
1,2,3  .050146
1,2,6  .050139
3,4,6  .050094
1,5,6  .050081
3,4,5  .050079
1,3,4  .050046
1,3,5  .049947
2,3,5  .049933
2,4,6  .04989
2,3,4  .049807
1,2,4  .049788
2,4,5  .049771
1,2,5  .049684
3,5,6  .049642
2,3,6  .04964

The usual combinatorial formula for this, [math]p=\frac{(m-n)!n!}{m!}[/math], gives [math]\frac{(6-3)!3!}{6!}=\frac{1 \cdot 2 \cdot 3 \cdot 1 \cdot 2 \cdot 3}{1 \cdot 2 \cdot 3 \cdot 4 \cdot 5 \cdot 6}=\frac1{20} = 0.05[/math].

As one can see above, an actual trial very closely matches that expectation.
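If you’d like to reproduce the trial yourself, here’s a minimal sketch of that kind of simulation in Python. It’s an assumed reconstruction, not the program actually used for the figures above; the trial count and output format are illustrative.

[code]
import random
from collections import Counter

TRIALS = 1_000_000
tally = Counter()

for _ in range(TRIALS):
    # Draw 3 balls without replacement from {1,...,6}; sort so that
    # {1,2,3} and {3,1,2} count as the same set.
    pick = tuple(sorted(random.sample(range(1, 7), 3)))
    tally[pick] += 1

# Print relative frequencies, most common first; each of the 20 possible
# sets should come out close to 1/20 = 0.05.
for pick, count in tally.most_common():
    print(pick, count / TRIALS)
[/code]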

  • 2 years later...
Posted

This was the first time I posted on here. This is a great thread.

 

God bless math. I know it's telling me that the probability of drawing an adjacent number, like 3 after 2 or 24 after 25 is the same as any other number, but I just do not trust that. There has to be a better math. :smile:

Posted

This was the first time I posted on here. This is a great thread.

 

God bless math. I know it's telling me that the probability of drawing an adjacent number, like 3 after 2 or 24 after 25 is the same as any other number, but I just do not trust that. There has to be a better math. :smile:

 

:huh: erhm...you posted numerous times to this thread before.

 

taking the stock market as a random-system proxy for a lottery, my approach is/was technical analysis -charting- whereas craig, erasmus, & jayq's approach would be the efficient market hypothesis. :lightsaber2:

 

one thing for sure about lotteries: if you don't play, you won't win. :twocents:

Posted

Turtle, sorry to have confused you. I meant to say, I joined hypography to post in this thread. This was my first participation at hypography.

 

i see. well, welcome to the forum then... again. :wave2:

 

(presuming that) everyone is in agreement that the more tickets, i.e. combinations, one buys, the better the odds of winning, on up to the certainty of winning if one bought all the tickets. since many if not all lotteries forbid buying all the tickets, what do you suppose, or know, are the legal wranglings and tanglings concerning just how many lottery tickets one entity can buy "at a time"? beyond that matter, how must the entity buy them? only so many per outlet per person per time period? can there be some manner of "hot" terminal(s) for large purchase of specific combinations?

 

well, hope that gets some balls jumping. carry on. :juggle:

Posted

Cardinality, ordinality, and math skepticism

 

This was the first time I posted on here. This is a great thread.

This was a fun thread – and a good one, especially since it drew you into the hypoverse, lawcat. :)

 

God bless math. I know it's telling me that the probability of drawing an adjacent number, like 3 after 2 or 24 after 25 is the same as any other number, but I just do not trust that. There has to be a better math. :smile:

Discrete counting and probability is pretty near perfect mathematically. It’s finding the right illustrations and metaphors to make it click with our intuitions – winning our trust when the math contradicts our gut instinct (or, as I recall DrDick putting it, something like squirrel logic) – that can be the trick. It’s been decades since my livelihood depended on managing this trick – a couple years of teaching introductory math, and a couple more years as an actuary and a survey designer – but I’ll try a passing shot at it here.

 

The feeling that sequences of number-labeled lottery balls like 1-2-3-4 are more or less likely to be drawn than ones like 8-1-7-4 is due, I think, to our unconscious confusion of the mathematical concepts of cardinality (counting) and mapping, and the related concept of ordinality.

 

We use numbers to count the elements in sets. Intuitively, the operation of addition corresponds to the act of combining sets. For example, using C() as the counting function, + as both the “combine sets” (union, roughly) and addition operator, and the usual notation for the rest:

C({g,b,m} + {x}) = C({g,b,m}) + C({x}) = 3 + 1 = 4

 

The idea of “adjacent” cardinal numbers is related to the cardinality of special sets, those with a single member, like {x} in the example.

 

We also can use numbers to “label” the elements in sets. The term “label” here means “map to members of another set”, in this case, to some set of numbers. The natural numbers, with or without zero, are the most common set of numbers mapped to in this way. We can, though, map members of a set to members of a set other than a set of numbers, such as a set consisting of the names of some fruits and vegetables.

 

Unless some method of selecting members from a set mapped to natural numbers explicitly uses this mapping, however, the mapping has no effect on the selection.

 

I believe that the confusion of cardinality, in which the concept of “adjacency” has a real, special meaning, and ordinality, in which it doesn’t, leads to lawcat’s “can’t trust” rejection of this thread’s math.

 

Lottery balls are a physical realization of a set mapped to a finite subset of the natural numbers. The method of drawing them, which is effectively random, doesn’t use this mapping. So, the same balls would be drawn if they were labeled with fruits rather than numbers.
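To make that concrete, here’s a small sketch (the fruit names and trial count are illustrative assumptions) in which the drawing procedure never consults the labels at all: we draw physical positions, and only afterwards look up whatever label each ball happens to carry.

[code]
import random
from collections import Counter

number_labels = [1, 2, 3, 4, 5, 6]
fruit_labels = ["apple", "banana", "cherry", "date", "elderberry", "fig"]

def draw_three(labels, trials=100_000):
    tally = Counter()
    for _ in range(trials):
        positions = random.sample(range(6), 3)                # the physical draw
        tally[frozenset(labels[p] for p in positions)] += 1   # labels read afterwards
    return tally

# Whether the balls carry numbers or fruit names, every 3-ball set turns up
# about 1/20th of the time.
print(draw_three(number_labels).most_common(3))
print(draw_three(fruit_labels).most_common(3))
[/code]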

 

Considering that a mapping of “lottery balls” to fruit names is interchangeable with a mapping to numbers, does the idea that drawing a ball labeled with a number “adjacent” to the previously drawn one (which is interchangeable with drawing a ball labeled with some particular fruit) is no more or less likely than drawing one mapped to a number that isn’t adjacent seem more plausible to you, lawcat?

  • 4 months later...
Posted

Hi there. First post here. Nice to meet you.

 

Sorry to revive an old thread. I don't know if there is a more recent thread where you are discussing all this, but Google brought me here when searching about consecutive numbers and probabilities, and there's so much info in here that making a new thread didn't seem right.

 

Anyway, let me tell you what my question is first and then I'll walk you through why I became interested in it. My question is: could someone explain to me why 123456 is just as probable as any other random number sequence, in terms of the increasing entropy of a physical system?

 

Now the background of my question. I always thought that 123456 is just as likely as any other sequence, but lately I was reading Brian Greene's book "The Fabric of the Cosmos" and at one point he discusses entropy. In his book he had an example to show that an ordered state is less likely to happen than a disordered one if we let a physical system evolve on its own. He said that if one were to toss 100 pages of a book up in the air, the chance that they would fall down in numerical order from 1 to 100 (the perfectly ordered state) is far lower than the chance that the pages land in a totally random order.

 

Now when I read this everything was crystal clear to me, no problem picturing and understanding the thought experiment with the book. But then I started wondering if the same applies to the evolution of other physical systems. So I thought, let's imagine we are tossing 49 balls up in the air and we keep a record of the sequence in which they land. Using the same increasing-entropy principle, the tendency of nature towards disorder, we conclude that the chance of the balls landing in the highly ordered state of 123456 is lower than the chance of them falling in any other disordered, totally random sequence. But this is in contrast with the equal chances of 123456 and any other number sequence in a lotto drawing.

 

That's what drove me to Google and, after a few hits and misses, into this forum. So if anyone could provide me with a new way to think of this while considering entropy as well, I would be more than happy to read about it. One conclusion I've reached is that maybe the example Greene gives, in the attempt to explain how entropy works, is wrong too and that the chances with the book are all equal, but this doesn't feel right.

 

Now that I think of it, while writing all this, maybe the difference between the book and the balls is that the book has an order to start with, not in the sense of page numbers but in the way the writer wrote his story in order to make sense, whilst the numbers in the lotto do not have that order in a physical way; it is just a way we choose to refer to them. Could this be it?

 

Thank you in advance.

Posted
Could this be it?
Definitely not. It's all a pure matter of combinatorics.

 

What you might be finding confusing is the sound-alike statements about "any other combination" and "any of all the other combinations", in the sense of "any one" vs. "each one" of them: that is, the total number minus 1 combinations lumped together vs. a single one. You give no identity to each of all the other combinations (except for the one(s) you played), and hence bundle them all together, reckoning as if "each one" had the probability of "any one" of them. They are two distinct concepts and indeed very different.

 

If that sounds confusing too, consider some specific cases: (3, 9, 23, 26, 38, 42) and (4, 15, 17, 31, 41, 48). These have equal probabilities which, of course, also equal that of (1, 2, 3, 4, 5, 6), but, if you played a ticket with each of those two particular ones, your chances of winning are the probability of either of them coming out, which of course is twice the chance of each. If you have played some number of distinct combinations, your odds are that many times the probability of each single one coming out. Extend this to the idea of having played all of them except the one "ordered" sequence (surely an expensive thing to do) and you would be almost certain to win (perhaps even taking back a bit more than you spent). The difference in probabilities is obvious, and it's easy to confuse the two meanings when talking about some unspecified one of all the other combinations.
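Here's a quick check of that arithmetic in Python, assuming a 6-of-49 game for illustration (the thread mentions both 48- and 49-ball examples):

[code]
from math import comb

total = comb(49, 6)            # 13,983,816 equally likely combinations
print(1 / total)               # chance of any single ticket
print(2 / total)               # two distinct tickets: exactly twice as likely
print((total - 1) / total)     # every combination except one: near certainty
[/code]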

Posted

what he is specifically referring to is the 1 versus 1 and the 1 versus many idea.

with the example of the book, throwing 100 pages in the air, there are 100 factorial ways the pages could land, and out of those, only one result where the pages go precisely from 1 to 100. however, there is also precisely only one way that the pages could go in the order of 1,3,5,7... 2,4,6,8... or any other "order". so all particular states have the same probability, but the "random" state is more likely than the "ordered" state. (that is, you cannot predict beforehand what order the pages will land in.)
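For a sense of the numbers involved, here's a short sketch of that counting (assuming 100 distinguishable pages and a uniformly random landing order):

[code]
from math import factorial

orderings = factorial(100)        # every specific page order is one "microstate"
p_each = 1 / orderings            # 1-2-...-100 is exactly as likely as any other order
p_jumbled = (orderings - 1) / orderings   # "some jumbled order" lumps together
                                          # all the rest, hence near certainty
print(orderings, p_each, p_jumbled)
[/code]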

Posted

Thank you guys for your responses.

 

Qfwfq, I get what you are saying but to be honest, it didn't help me with my question. You see, I am not wondering what my chances of winning with the combination I choose are; I am wondering if the sequence 123456 has something special to it in contrast with any other random combination. I say it doesn't. Someone else would say "but of course it has something special to it. It has the requirement that the numbers are in a specific order", but does that mean anything in a physical sense? Because if it does, entropy says that the chances of it appearing are lower than those of any other combination, which has no requirements at all and is totally random.

 

So,

 

Definitely not. It's all a pure matter of combinatorics.

 

I am going to ask again. Are you sure that my conclusion above is definitely wrong? Because I've been thinking about it since then and as time passes it feels right. Entropy is about a physical system. Do the 6 balls out of 49 make a physical system? Of course they do, but how is one group different from another? It's only because we chose to label them with numbers. If we chose not to, we couldn't tell which ball is which. There is no specific order in the physical system of the balls. With the pages, on the other hand, even if we didn't have page numbers, there is a way we could tell if they were in the correct order: were they to make sense while reading them, then they would be. And there's only one combination of "absolute order", with the lowest possible entropy in it. So surely it has a lower chance of happening in the natural evolution of a physical system.

Posted
I am wondering if the sequence 123456 has something special to it in contrast with any other random combination. I say it doesn't.
This is correct, as far as probability goes. The only distinction between them is the identity you attribute to the "ordered" set, a property you ascribe to it. Note also that, since permutation doesn't count in the game, it would make more sense to ascribe "consecutivity" to it.

 

Because if it does, entropy says that the chances of it appearing are lower than those of any other combination, which has no requirements at all and is totally random.
Actually, this isn't what entropy means and:
Entropy is about a physical system.
:umno:

 

It was first defined in thermodynamics but eventually interpreted in statistical terms and defined in information theory. Entropy is a matter of how many single combinations belong to some category, as well as the probability of each; specifically, you are talking about the "ordered" ones and those that aren't. Of all sets of six numbers in the game, how many do you recognize as being "ordered" and how many not? In the game, each single one has the same probability; a consequence is that the entropy of each category is proportional to the logarithm of the number of combinations belonging to it. More generally, it would be proportional to:

 

[math]\sum_i p_i\ln\frac{1}{p_i}[/math] with: [math]\sum_i p_i=1[/math] for the category.

 

For the outcomes of the game, equiprobability means that a category of [imath]N[/imath] outcomes has entropy proportional to [imath]\ln N[/imath]. In thermodynamics, the entropy of a given macrostate is computable (in principle, at least!) according to all the microstates which belong to it, each with its probability.
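As a concrete illustration, suppose we call a combination "ordered" when its six numbers form a consecutive run (1-2-3-4-5-6, 2-3-4-5-6-7, and so on) in a 6-of-49 game; this definition is my own assumption for the sake of the example. Comparing [imath]\ln N[/imath] for the two categories shows how lopsided they are:

[code]
from math import comb, log

total = comb(49, 6)       # 13,983,816 equally likely outcomes
consecutive = 49 - 6 + 1  # 44 runs: 1..6, 2..7, ..., 44..49
other = total - consecutive

# Each single outcome is equally likely, so a category's entropy is
# proportional to the log of how many outcomes it contains.
print(log(consecutive))   # ~3.78  -- the "ordered" category
print(log(other))         # ~16.45 -- everything else: vastly higher entropy
[/code]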

Posted

I am aware of the use of entropy in information theory too. I don't know the details but I've read some things about it. In addition, in my search for lotto numbers and entropy I came upon the use of the term in situations where they were trying to analyze the randomness of the winning numbers and in another case where they were trying to apply the maximum entropy principle in order to see if there is a pattern in what numbers people usually pick. As interesting as these are, I was referring, as I said in my first post, to the entropy of a physical system. The thermodynamic version of it, that is.

Posted
I came upon the use of the term in situations where they were trying to analyze the randomness of the winning numbers and in another case where they were trying to apply the maximum entropy principle in order to see if there is a pattern in what numbers people usually pick.
I don't get the utility of the first case, at least in a lotto-type game where I would not expect significant deviations from the combinatorial estimation of probability. The second is of interest in games based on equiprobability where the jackpot could likely be split amongst ex-aequo winners. If you win when nobody else does, your take is better; if there is no difference in probability, it is better to play combinations that most folks avoid.

 

As interesting as these are, I was referring, as I said in my first post, to the entropy of a physical system.
It is definable in principle by the same method, but isn't relevant to this topic.
