Back in February, I responded to an article by philosopher Martin Smith—“Why Throwing 92 Heads in a Row Is Not Surprising”—in which he claims that it would not be surprising, or at least shouldn’t be seen as cause for surprise by a rational person, to get 92 (or more) Heads in a row from a fair coin.
The pith of his argument is that since it wouldn’t be surprising to get Heads on any one of 92 (independent) flips, it shouldn’t be surprising when those individually unsurprising results are tallied up to reveal that all 92 flips landed Heads.
My response to the argument takes the form of a lengthy blog post: “Why 92 Heads in a Row Is Not Not Surprising.” Here’s a shorter one. But I’m going to use 100 Heads in a row rather than 92, out of habit. The probability of getting 100 Heads in a row is minuscule: 2^−100, or about 1 out of 1.27 × 10^30.
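For concreteness, here’s that arithmetic as a quick Python check (nothing beyond the fair-coin assumption goes into it):

```python
from fractions import Fraction

# Probability of 100 Heads in a row from a fair coin: (1/2)^100
p = Fraction(1, 2) ** 100

print(p)         # 1/1267650600228229401496703205376
print(float(p))  # ~7.89e-31, i.e., about 1 in 1.27 x 10^30
```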
Now, let’s take 100 Heads in a row to be a single event. My claim is that you’d be justified in believing that this event won’t happen. In fact, we tend to make fun of or pity people who believe in, count on, or genuinely hope for things far, far, far more likely than this happening, such as winning the lottery, even if it’s in the form of a raffle (it’s generally guaranteed that someone wins a raffle, even if there are trillions of names in the hat; by the way, I’ve explored elsewhere [see below] how getting 100 Heads in a row is different from winning a lottery of any form).
Also, as Smith points out, were two people to randomly select a grain of sand on planet Earth, the likelihood of their selecting the same grain of sand is much higher than that of getting 100 Heads in a row (it’s actually closest to getting 63 Heads in a row, assuming an estimate of 7.5 × 10^18 grains of sand). Imagine what this means for the probability of landing a thousand or a trillion Heads in a row! Such events are trillions and trillions of times less likely than two people randomly picking the same grain of sand, and yet getting a trillion Heads in a row is just as subject to Smith’s no-surprises model as is, say, getting two Heads in a row.
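To see where that 63 comes from, here’s the back-of-envelope version (the 1/N matching probability is my simplification; it assumes both picks are uniform and independent):

```python
import math

# With N grains of sand, two independent uniform picks match with probability 1/N.
N = 7.5e18  # estimated grains of sand on Earth

# Find k such that (1/2)^k = 1/N, i.e., k = log2(N)
k = math.log2(N)
print(k)  # ~62.7, so roughly 63 Heads in a row
```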
All that in mind, imagine a jar containing one trillion green marbles and one orange marble. I claim that you’d be justified in believing that, on any given pull of a marble from that (well-mixed) jar, you will pull a green marble. (Even with a trillion green marbles, pulling the orange one is far, far, far more likely than getting 100 Heads in a row; I’m swapping in the marble event anyway because I find it easier on the imagination.)
Here’s my point. You are justified in behaving, before each pull, as though it is guaranteed that a green marble will come out. Suppose your friend has pulled a trillion marbles, always with replacement and always with the jar well mixed before each pull. Your friend tells you, “I’ve pulled a trillion green marbles, so now I’m due for the orange one. I’m about to put down a million-dollar bet with a bookie. I don’t have a million dollars, but that’s okay, because I know the chances are very high that I’ll win. After all, frequentist statistics tell me that for every orange marble I pull, I pull a trillion green ones.”
If you have common sense, you’ll advise your friend, “No, that’s not how this works. You’re no more likely to get orange on this pull than you were a trillion pulls ago.” If you’re a little more theoretically inclined, you might add, “Frequentism doesn’t say that you’re due orange. It says that in a theoretical model in which the number of pulls goes to infinity, the ratio of orange to green outcomes approaches 1 to 1 trillion.”
Then after the next pull, you’ll add, “Orange you glad you didn’t make that bet?”
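If you want that advice in numbers, here’s a simulation scaled down so that orange actually shows up in a reasonable number of pulls (the 1-in-1,000 jar and the 20-green “due” threshold are my own toy numbers, not anything from Smith’s article):

```python
import random

random.seed(0)
JAR_SIZE = 1_000   # 1 orange marble among 999 green, drawn with replacement
TRIALS = 1_000_000

hits = 0           # orange pulls that follow a long green streak
opportunities = 0  # pulls where the friend would say "I'm due"
green_streak = 0

for _ in range(TRIALS):
    pull_is_orange = random.randrange(JAR_SIZE) == 0
    if green_streak >= 20:  # the friend's "due for orange" situation
        opportunities += 1
        hits += pull_is_orange
    green_streak = 0 if pull_is_orange else green_streak + 1

# The conditional frequency matches the unconditional 1/JAR_SIZE = 0.001:
# past greens tell you nothing about the next pull.
print(hits / opportunities)
```

However long the green streak, the estimated chance of orange on the next pull stays pinned at 1/JAR_SIZE, which is the independence point in numbers.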
So what’s the point here? On any given pull, you’re justified in believing—i.e., in behaving as though it’s guaranteed—that you’ll see green. Therefore, string together as many pulls as you like and you are still justified in believing that you’ll see only green. This is intuitive if you reflect on the scenario one pull at a time. Even at just a trillion green marbles, it’s very easy to imagine that trillions of pulls turn up only green marbles. It’s even easier to imagine with quintillion × quintillion × quintillion green marbles.* As the number of green marbles increases, the event of seeing orange approaches undiluted impossibility.
[*We could work out how many pulls are needed to practically guarantee—at least theoretically—seeing the orange marble, but I won’t do that here. I do that sort of thing in the post where I also discuss stuff like the aforementioned lottery case: “Anthropic Bias (Ch 2, Part 2): Fine-Tuning in Cosmology & 100 Heads in a Row”.]
If pulling an orange marble is impossible—or is even reasonably or rationally treated as impossible—in a given pull, then it should be reasonably treated as impossible in a sequence of independent pulls. This seems to align perfectly with Smith’s claim that if the result of one coin flip isn’t surprising, then seeing that result indefinitely many times in a row isn’t surprising either. That is, “no surprise” + “no surprise” = “no surprise,” in the same way that 0 + 0 = 0.
By this same reasoning, it seems fair to say that “impossible” + “impossible” = “impossible.” Of course, the individual numerical probabilities in question are not themselves zero or “impossible”; rather, we’re dealing here with a qualitative, cognitive orientation towards the events in question, whose formation is meant to be guided by numerical probabilities, along with one’s theoretical and philosophical understanding of probability (and how the world works) overall.
I’m not necessarily committed to the “impossible” + “impossible” = “impossible” orientation, but if I’m right that its derivation is a fair application of Smith’s reasoning, then this poses a problem for that reasoning, as it instructs us to be unsurprised by 100 Heads in a row while simultaneously viewing that outcome as impossible.
