Ep. 436: Common Misconceptions in Probability


Human beings are bad at many things, but we’re particularly terrible at understanding probability in a rational way. We underestimate, overestimate and generally mess up probability. We’ll try to fix it here, but we’ll surely fail.
We usually record Astronomy Cast every Friday at 1:30 pm Pacific / 4:30 pm Eastern / 21:30 UTC (9:30 pm GMT). You can watch us live on AstronomyCast, or the AstronomyCast YouTube page.
If you would like to join the Weekly Space Hangout Crew, visit their site here and sign up. They’re a great team who can help you join our online discussions!
If you’d like to download the awesome book Fraser keeps mentioning, “101 Astronomical Events in 2017” by Dave Dickinson, go here to download in PDF and EPUB at Universe Today.
We are getting very excited for the AstronomyCast Solar Eclipse Escape, where you can meet Fraser and Pamela, plus WSH Crew and other fans. Right now we’re at capacity, but you can join the waiting list in case spaces open up by emailing us at astronomycast@gmail.com with Eclipse Waiting List in the subject line!

Download the show [MP3] | Jump to Shownotes | Jump to Transcript

Show Notes

Show notes here

Transcript

Transcription services provided by: GMR Transcription

Fraser: Astronomy Cast Episode 436: Common Misconceptions in Probability
Welcome to Astronomy Cast, our weekly facts-based journey through the cosmos, where we help you understand not only what we know but how we know what we know.
My name is Fraser Cain. I’m the publisher of Universe Today and with me is Dr. Pamela Gay – gotta… here we go – the director of Technology and Citizen Science at the Astronomical Society of the Pacific and the director of CosmoQuest.
Hey, Pamela. How are you doin’?
Pamela: I’m doin’ well. How are you?
Fraser: Good.
So this is, like, the first time – for the ten years that we’ve been doing this show – that I gave you a new title. Well, that’s not entirely true. People – astute listeners – noticed that over the last couple of months, we dropped the professor title and then just went with “director of CosmoQuest”, and now I just gave you a new title. I don’t know whether I’ll do it every week. It’s very long. It’s a mouthful. But –
Pamela: It – yes.
Fraser: So, you want to give people the one-minute version of this?
Pamela: So, the short version is, I’ve been at the same institution for ten years and I wanted –
Fraser: Now we’re at a new institution.
Pamela: Yeah, I wanted to be surrounded by other astronomers.
Fraser: Yeah.
Pamela: And now I’m surrounded by other astronomers. I’m at an organization that has a female director; that has a lot of people that I really respect, working there. It’s over a hundred years old and has been part of increasing scientific literacy through astronomy for those 100 years. And I like all of the other fields of study but it gets lonely.
Fraser: Mm-hmm.
Pamela: And now I have other people and access to journals and all sorts of things – and I don’t have to move! So –
Fraser: That’s right, you’re doing this virtually.
Pamela: Yeah.
Fraser: Yeah.
Pamela: So, me and Cory, we’re transitioning together. And this is our chance to start over in the same place, in a new place.
Fraser: Yeah. No, fantastic. And they’re just a wonderful group of people at the ASP. You know, I’ve worked with several of them in the past, on doing some of our Virtual Star Parties, and I am really excited. And they’re located in San Francisco, which is a great city, relatively near to where I am on Vancouver Island. So I hope to be able to participate in person more down there, which will be great.
Pamela: Yes.
Fraser: Second thing is I’m testing out a different microphone because people were complaining. So, if my microphone sounds a little weird today, that’s because I’m using my older microphone. So I apologize. I’m just troubleshooting.
Alright, let’s move on.
So, human beings are bad at so many things, but we’re particularly terrible at understanding probability in any rational way. We underestimate, overestimate and generally mess up probability. We’re going to try and fix it today but we will surely fail, because we’re just literally hardwired to mess this up. At least, that’s – I like our odds of screwing this up.
Pamela: I think we are hardwired to misunderstand probability because otherwise, we’d probably die. Because if you think about it, it’s in our best interests to be overly cautious and, in other cases, I think we’re programmed to not understand probability because we take stupid risks that ultimately benefit society, even if a fair number of us die along the way.
Fraser: So, tell me how this topic came up in your brain.
Pamela: So, I was at Arisia last week, which is a science fiction fantasy conference that’s held in Boston, my fair city. And I was on a panel on “And these are the ways we shall destroy the Earth” and it was actually, “These are the ways we will destroy civilization” so it was even – It went that extra step.
And someone in the audience made the comment that we didn’t need to bring up supervolcanoes or asteroids because every year that goes by without a supervolcano erupting or an asteroid striking lowers the probability of these things occurring. And since we haven’t experienced these things since – well, Chinese record-keeping days – we can just –
Fraser: We’re running out the clock.
Pamela: Yeah, yeah. And so, here we are – There’s a supervolcano under Naples that’s getting ready to go! And people think that because we haven’t had one go off in a long time, we don’t need to worry about it.
Fraser: But the chances are the same every year.
Pamela: And this is the problem. There is the probability of any given year having a particular event, and there’s the probability of a sequence of years. And so, the probability of flipping four heads in a row and the probability of any one of those flips being heads are not the same probability.
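To make that distinction concrete: for a fair coin, the chance of any single flip coming up heads is 1/2, but the chance of four heads in a row is (1/2) × (1/2) × (1/2) × (1/2) = 1/16. Once three heads have already landed, though, the chance that the next flip is heads is still just 1/2 – the flips that have already happened carry no more uncertainty.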
Fraser: Right, of course. So I’m losing my voice here.
Pamela: Yeah.
Fraser: So you –
Pamela: I need to deal with a dog. One moment. Sorry, YouTube video.
Fraser: Oh, do you – Are you going to deal with the dog? Whoa – sorry everybody.
Your video dropped off.
Pamela: That is – Okay, there we go.
Fraser: Alright.
Pamela: Okay, all is well. Sorry, Chad. Sorry, Chad.
Fraser: Somebody gave you an acronym for your new title.
Pamela: That’s about right, yeah.
Fraser: Okay. So the classic analogy of this, of course – right? – is that you flip a coin. And you flip the coin and it comes up heads or tails.
Pamela: Yes.
Fraser: Let’s say it comes up heads, and then you flip it again and it comes up heads, and then you think like, “What are the chances that I’m gonna flip this again and it’s gonna come up heads again?”
The reality, of course, is that it’s gonna – It’s a 50-50 chance every time, right?
Pamela: Yes.
Fraser: And this is the gambler’s fallacy.
There was actually a great podcast over on Freakonomics and they covered this a bit, about probabilities and sort of our ability to make decisions. And just that, you know, in so many fields, we will make this mistake, in terms of, like – You know, it’s happened one way so many times, so the chances are that it’s gonna switch now, right?
The gambler, you know, “I’ve lost so many times at the blackjack table so this has gotta be the one where I’m gonna win.” But of course, every time you go in to it, you’re dealing with the same kinds of odds.
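To put numbers on the gambler’s fallacy, here’s a minimal simulation sketch in Python – nothing from the show itself, just a fair coin flipped a million times – checking whether a streak of three heads changes the odds of the next flip:

```python
import random

# Minimal sketch of the gambler's fallacy: after a streak of three heads,
# is the next flip any less likely to be heads? (Spoiler: no.)
random.seed(42)

flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True = heads

next_after_streak = [
    flips[i + 3]
    for i in range(len(flips) - 3)
    if flips[i] and flips[i + 1] and flips[i + 2]  # three heads in a row
]

print(f"Flips following a three-heads streak: {len(next_after_streak)}")
print(f"Fraction that came up heads anyway:   {sum(next_after_streak) / len(next_after_streak):.3f}")
```

The second number comes out at roughly 0.5, streak or no streak.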
Pamela: And so, here’s where you have to ask: Am I asking, what is the probability of a sequence of events? Am I asking, what is the probability of a group of objects doing something? So, do I need to look at probabilities like you do with radioactive half-lives? Am I looking at the probability of a single uncoupled event? So, dependent versus independent variables. And then there’s that whole: Is it a group of things that are all dealing with this probability simultaneously, like with radioactive half-lives.
Fraser: So, let’s talk about the – Well, let’s go back and talk about that asteroid destruction. So, what are the chances that a gigantic asteroid is going to smash into the Earth and kill us all?
Pamela: So, the “kill us all” is basically 1 in 65 million, any given year.
Fraser: Right – which means that about every 65 million years, a gigantic asteroid smashes into the Earth and wipes out most life on the surface of the planet. Therefore –
Pamela: And disturbingly, this last happened about 65 million years ago.
Fraser: Right. And so you are then, you know, here on the surface of the Earth. You are experiencing a year on Earth. This year, you’ve got a 1-in-65-million chance – I just did a video about the Carrington Event, that you’ve got about a 1-in-500 chance of the –
Pamela: Yeah.
Fraser: You know, of a gigantic solar storm wiping out all our technology. They take about once every 500 years or so; so, each of these things.
So then, where is the fallacy coming in, right? Why are you thinking that the longer these things go without happening, the more likely it is that it’s going to happen? Or the less likely it is that it’s going to happen? What was – You know, what?
Pamela: So this is where you have to start looking at it in terms of, we don’t fully understand the statistics of asteroid impacts. So – and asteroids, comets – I’m using that term “asteroid” to refer to both.
And so, the open variables we don’t fully understand are: Is there something that’s disturbing the Kuiper belt on a regular basis, or disturbing the Oort cloud on a regular basis, that is causing a cyclic phenomenon of extra amounts of “objects come plunging in”?
So that’s one possibility, in which case we’re looking at something where you go through periods of decreased actual probability those years, and then increased probability, because it’s a cyclic event more like a plague of cicadas upon your house. You know –
Fraser: Right.
Pamela: Every seven years, there’s gonna be a bazillion cicadas if you live somewhere with 7-year cicada plagues – which I do. So there, Years 1 through 6 – probably not gonna have the plague of cicadas; Year 7, you’re doomed. Now, what you don’t know is exactly which day in which summer the plague of cicadas is going to fall upon your house.
Fraser: So, when we say that we are “overdue” for an asteroid impact, that’s not entirely true. It’s not like your cicada example, with these asteroids showing up on a schedule. It’s just that if you average it out over the history of the Earth, that’s how often you get them.
Pamela: And this is where I have the qualifier of “we don’t fully know”. Because it could be that there’s some higher-order effect, where every 200-and-something years, maybe, there is this increased, “Oh, dear. We’re all going to get thumped.” And we don’t know.
So, that’s one possibility. We don’t know.
Fraser: Right.
Pamela: Just to be clear – don’t know.
And then the other situation that we don’t know is: Is there a sufficient family of comets and asteroids with orbits that encounter the planet Earth that we’re looking more at a radioactive-decay situation, where on any given day you may not see something, but the probability is such that, 65 million years from now, it won’t be that half of them have hit the Earth, but it will be that one of them has hit the Earth.
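If impacts really did follow that radioactive-decay picture – a small, constant probability each year with no memory of the past – then a long quiet stretch would tell you nothing about next year’s odds. A rough simulation sketch of that memoryless property, using a made-up per-year probability:

```python
import random

# Sketch of the "memoryless" property of a constant per-year hazard:
# a long quiet stretch does not change next year's odds.
random.seed(1)
p = 0.01            # made-up per-year probability, for illustration only
trials = 200_000

quiet_50 = 0        # histories with no event in the first 50 years
hit_in_51 = 0       # ...which then have an event in year 51

for _ in range(trials):
    if all(random.random() >= p for _ in range(50)):
        quiet_50 += 1
        if random.random() < p:
            hit_in_51 += 1

print(f"P(event in year 51, given 50 quiet years) = {hit_in_51 / quiet_50:.4f}")
print(f"P(event in any single year)               = {p}")
```

The conditional probability comes out at essentially p: being “overdue” buys you nothing under this model.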
Fraser: Right, right. But, you know, with the cicada example, right? That you’ve got that – You’ve got this very specific timeframe that it’s happening; this, you know, no cicadas and then every seventh year, the cicadas show up. But we –
And, as you said, you know – there could be some kind of cycle where the probabilities increase because of perturbations from Jupiter or, you know, some – you know, other star – some kind of brown dwarf star that’s in close proximity. Or sometimes the sun passes above the plane of the Milky Way and then back in, and it increases the odds but still, you know, all you’re getting is an increase of odds. But –
Pamela: Yes.
Fraser: But you’re not –
Pamela: It doesn’t guarantee it will happen.
Fraser: You’re not getting a guarantee of this event happening, right.
Pamela: Now, then you have situations where the probability of something occurring over time increases. This is where we look suspiciously at the volcanoes.
So, right after a volcano has erupted – when its magma chamber is nice and empty and it doesn’t have a reason to go kabluey and fill the atmosphere with airplane-grounding ash – this is a situation where: immediately after, super-low probability. Yeah, weirdness happens; we don’t fully understand our planet.
But over time, pressure can grow and grow and grow underneath a volcano. The magma chamber fills up, the ground uplifts and, at a certain point, the outward pressure of that magma starts to unbalance the ability of the volcano not to go kabluey.
And we don’t have the full capacity to understand all the triggering events because we have to worry about things like – well, frictional effects. We have to worry about how frozen the ground is; how much ice and snow is on top of it if it’s Iceland. What are other seismic effects that might jiggle it loose? There are so many different factors that we can’t say precisely, “This volcano is going off in the next three days” until it starts giving off steam; in which case, we can say it’s probably going off in the next three days – but it might actually give us three minutes’ warning.
At the same time, we can say this volcano is definitely becoming higher probability and will go off, like they just said with the volcano in Naples. They elevated it from “It’s pretty darn quiet. You can pretty much ignore it” to “Scientists need to be watching this now.”
Fraser: Right.
A great – Oh, I forget the name. The Torino Scale, that’s right. So the –
Pamela: Yeah.
Fraser: So, the astronomers have studied asteroids and they’ve developed this idea called the Torino Scale. And it is all about probabilities, right? It’s all about looking at these various objects and determining if – you know, looking at their orbits and looking at how often they cross the Earth and how close they are and their mass and things like this – what are the chances that it’s going to impact the Earth at some point down into the future? And the higher that possibility comes, it enters the Torino Scale somewhere between 1 and 10 – 10 being “we’re doomed”; 1 being maybe, just maybe, in the far future, this object is going to get close enough that it could strike the Earth.
And it’s the same thing, right? It allows you to shift the probability for these individual objects around. And so, we used to think it was 1 in 100,000; and now we think it’s 1 in 10,000; and now we think there’s a 10-percent chance; and now we think there’s a 15-percent chance. But – And now we think there’s a one-in-a-million chance again, that, you know – for each one of these objects. But once again, it’s – You know, in some cases this stuff – as with your volcano, right? It’s either not going to hit or it’s hitting right now.
And, you know, the volcano; it’s either not – you know, we don’t know if it’s going to go off – or it’s going off right now. And anything in between is just us pulling numbers out of a hat.
Let’s talk about weather.
Pamela: Okay.
Fraser: Yeah. So, same thing – You know, I look at my cell phone app and it tells me that I have a – well, I always have 100 percent chance of rain here on the west coast. But no, it will tell me, you know, I have a – It’ll go: Tomorrow I have a 50-percent chance of showers. Does that mean it’s going to rain for half the day? Does that mean that I’m going to, you know – I have a 50-50 chance of – You know, flip a coin –
Pamela: So –
Fraser: – and I might have rain tomorrow?
Pamela: In general, what it means – and there’s a chance your app presents it in a different manner. I reserve the right for an app developer to have done something strange.
Usually, what that means is your friendly, neighborhood weatherman, woman, scientist, ran a gazillion computer models. And the computer models basically did the: If this happens, if that happens, given these conditions. How many of these scenarios have you being rained upon? And, based on all possible outcomes, they predict 50 percent of the time, with these scenarios; you’re getting rained upon because you live on Vancouver Island. I fully blame your location.
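A minimal sketch of that ensemble idea, with an invented toy “model” standing in for real atmospheric physics: perturb the starting conditions many times and report the fraction of runs that end in rain.

```python
import random

# Toy "ensemble forecast": perturb the initial humidity, run a trivial stand-in
# for a weather model, and report the fraction of runs that produce rain.
# Everything here is invented for illustration; real models are rather bigger.
random.seed(7)

def toy_model(humidity: float) -> bool:
    """Pretend physics: rain if humidity (plus some chaos) crosses a threshold."""
    return humidity + random.gauss(0.0, 0.1) > 0.7

runs = 10_000
measured_humidity = 0.65                    # today's observation
rainy_runs = sum(
    toy_model(measured_humidity + random.gauss(0.0, 0.05))  # observation error
    for _ in range(runs)
)

print(f"Chance of rain: {100 * rainy_runs / runs:.0f}%")
```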
Fraser: Right. But the weather prediction for tomorrow is going to be more accurate than the prediction for ten days from now – even though it will tell me at exactly what time during the day I’m going to see rain ten days out. I’m going to have a 60-percent chance of showers Saturday, January 28th. So, what that doesn’t include is error bars.
Pamela: And the reason that it gets more accurate as the day gets closer is because some of the variables get locked in on a value.
So, the way to think of this is: if you’ve ever played RoboRally – it’s a board game where you get to program your robot to try and get to a flag – you can set what each register’s going to be, but if you get shot, you end up losing the ability to change some of those registers. So, if you’re trying to figure out what someone else is going to do and you can’t see any of their variables, you have to run through the: Well, the most likely card they’ll get is this; the most likely card they’ll get is that. This is the range of things they can do, and it turns out it’s pretty extensive.
That’s our ten-day forecast. There are a whole lot of things that could happen ten days from now. But, if they’ve had an unfortunate game and they’ve been shot up pretty badly, and five of those registers are locked so that you know they’re going to turn right, then they’re going to go forward three, then they’re going to go backwards one; if you know what they’re going to do for five of those variables, where they’re going to end up is locked into a very small set of possibilities.
With the weather – when you’re 24 hours out – you know what the current temperature is, you know what the current humidity is, you know where the high pressure is, you know where the low pressure is. That makes it far easier because your registers are locked and the remaining places that things can go are much smaller.
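The “locked registers” point can be sketched the same way: the more of the inputs you pin down, the narrower the spread of possible outcomes. The outcome function and numbers below are invented purely for illustration.

```python
import random
import statistics

# Sketch: the more inputs are already known, the smaller the spread of outcomes.
random.seed(3)

def outcome(inputs):
    return sum(inputs)  # stand-in for "where the weather ends up"

def spread(n_locked: int, n_total: int = 10, trials: int = 20_000) -> float:
    locked = [0.5] * n_locked                      # values we already know
    results = [
        outcome(locked + [random.random() for _ in range(n_total - n_locked)])
        for _ in range(trials)
    ]
    return statistics.stdev(results)

print(f"Spread with 0 of 10 inputs known: {spread(0):.2f}")
print(f"Spread with 5 of 10 inputs known: {spread(5):.2f}")
print(f"Spread with 9 of 10 inputs known: {spread(9):.2f}")
```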
Fraser: And so – you know, back to sort of the mistakes that we make in probability. You know, do you see in research journals people making mistakes? Or do you see us, as journalists, messing up and not understanding how probability works? Is that – That’s more likely, isn’t it? He says.
Pamela: So, journalists do the most amazing, complete failure to pay any attention to probability – unless you’re Nate Silver or the guys at Princeton. There’s a –
Fraser: Well, I mean –
Pamela: Yeah.
Fraser: I’m gonna continue on the question in a second, but Nate Silver was a great example, right? Because –
Pamela: You’re right.
Fraser: You know, with predicting the outcome of the election, at some point he was predicting a 30-percent chance of one person winning and a 70-percent chance of the other person winning. And in the end, who did win was a bit of a surprise, but that is probability.
Pamela: Yes.
Fraser: That you’re going to get, sometimes, the 30 percent outcome, not the 70 percent outcome.
Pamela: Right, right.
Fraser: Yeah.
Pamela: And –
Fraser: Alright, so let’s go back to how journalists screw this up.
Pamela: So, a lot of times they’ll say, like, “(Such-and-such) gives you cancer because, in a rat study, (X) percent of the rats got cancer after being fed a diet entirely made of (this thing).” And they don’t take into consideration: Well, what’s the probability that some of those rats would have gotten cancer anyways? They don’t take into consideration: What is the probability that only feeding rats X –
Fraser: Right.
Pamela: – because they’re not getting Y and Z – causes cancer. “A lack of Y and Z causes cancer” is not what you hear.
Fraser: Right. So that example, right? If, you know – If it says, like, it doubles your chance of getting cancer – it gives you a 100-percent additional chance of getting cancer – but your chances of getting cancer of this variety are super-extremely low, then that additional 100 percent – or “increases cancer risk by 40 percent” – is actually not a lot.
Pamela: And it – The “it doubles your chance of” is one of those red-alert phrases because if your chance of getting struck by lightning – and I don’t know what your actual chance of being struck by lightning is. I’m making numbers up here. If your chance of being struck by lightning while walking across the field in the summer in Illinois on any random summer day is 1 in 100,000 – because we have a lot of lightning strikes here – and your chance is doubled if you put a giant pole on top of your head, you’re still not going to get struck by lightning.
Fraser: Right. Even though you have, indeed, doubled your chances – or quadrupled your chances, or multiplied your chances by 100 – you’re still not going to get hit by lightning because the –
Pamela: Now –
Fraser: The original probability of this event occurring is incredibly low and even –
Pamela: On a random summer day.
Fraser: And if you – right.
Pamela: So, like, if I told you to go out and walk across a field on July 17th wearing a giant pole on your head, you’re good.
Fraser: Right.
Pamela: Now, the thing is, this is also partially a deterministic probability because, if there’s already a thunder and lightning storm going on, don’t do it. I like you.
Fraser: Right. So the chances of you getting struck by lightning by walking out into a lightning storm with a lightning rod on your head, they have noticeably increased.
Pamela: Yes.
Fraser: Although, still probably very rare.
Pamela: Yes.
Fraser: Just don’t do it. Like, we’re still not recommending that you do it. Right?
Pamela: Right.
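Using Pamela’s admittedly made-up lightning numbers, the arithmetic behind “doubled, but still tiny” looks like this:

```python
# "Doubles your risk" vs. the absolute numbers, using the made-up lightning
# figures from the discussion (1 in 100,000 baseline, doubled by the pole).
baseline = 1 / 100_000
with_pole = 2 * baseline

print(f"Baseline risk:          {baseline:.5%}")
print(f"Risk with pole on head: {with_pole:.5%}")
print(f"Absolute increase:      {with_pole - baseline:.5%} "
      f"(1 extra strike per {1 / (with_pole - baseline):,.0f} walks)")
```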
Fraser: So then, what can we, as the general public, do when we are reading research reports, when we are looking at studies – how can we recognize this probability misunderstanding in ourselves and compensate for it?
Pamela: So, what we really want to do is ask: What is the chance in terms of, like, 1 in 1,000; 1 in 10,000. You don’t want the “your chance doubled” because it doubled compared to what? You don’t know.
You do want to know: What is the error on something because, if the chance of something happening is 1 in 5, plus or minus 1 in 5 – 20 percent –
Fraser: Mm-hmm.
Pamela: – then that starts to look kind of different.
Fraser: Right.
Pamela: If it’s –
Fraser: So I guess that’s the point, right? Is that if you see a study and they say that performing – you know, driving a vehicle increases your chances of something else by 100 percent, that is possibly meaningless. But if they say that if you are, you know, driving a vehicle while texting, it increases your 1 in 10,000 chance of dying in a motor vehicle accident to 1 in 1,000; that’s a significant number that you should take very seriously.
Pamela: Right.
Fraser: Right. And it’s that one – You know, it’s knowing that probability, it’s knowing that 1 in whatever; that’s the number that you’ve really got to focus in on, not necessarily the increase – because the increase is meaningless if you don’t know what the original percentage is. Okay?
Pamela: Yeah. Yep.
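And with the illustrative texting-while-driving figures from a moment ago (not real crash statistics), the same arithmetic shows why that increase does matter:

```python
# The flip side: the same "know the baseline" rule, but here the absolute
# change is large enough to take seriously. The 1-in-10,000 and 1-in-1,000
# figures are the illustrative ones from the conversation.
baseline = 1 / 10_000
while_texting = 1 / 1_000

print(f"Relative increase: {while_texting / baseline:.0f}x")
print(f"Absolute increase: {while_texting - baseline:.4%}, "
      f"i.e. about 9 more deaths per 10,000 drivers")
```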
Fraser: Let’s talk about the error bar.
So, how would error bars be described in – you know, if I’m reading some study or if I’m reading the newspaper?
Pamela: So, error bars come in a lot of different varieties and this causes reading to be required. There are, for instance, error bars that are one standard deviation – which means it’s a very, very narrow error bar. So, if it’s one standard deviation error bars, it means there’s actually a pretty good chance that something is going to fall outside of that. So those are overly optimistic error bars.
Fraser: Right.
Pamela: So you have to be careful.
In science, we often look at six sigma error bars which, when you start looking at that, you are looking at six times the standard deviation where, really, if you have a giant population, everything in your giant population should, for the most part, fall within your six sigma error bars.
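For reference, the coverage fractions being described follow directly from the normal distribution – assuming the errors really are Gaussian, which any particular study’s error bars may not be:

```python
import math

# Fraction of a Gaussian population that falls within k standard deviations
# of the mean: erf(k / sqrt(2)).
for k in (1, 2, 3, 5, 6):
    print(f"within {k} sigma: {math.erf(k / math.sqrt(2)):.7%}")
```

That works out to roughly 68 percent within one sigma, 95 percent within two, and essentially everything by five or six sigma.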
Fraser: Now, cuing this up, you want to talk a bit about radioactivity and how, sort of, we see probability and prediction with radioactivity.
Pamela: So, when we start looking at radioactivity – with radioactivity, it’s not looking at a single atom and saying, “I know exactly when this particular atom will decay.” It’s saying that, given a full population of particles, you’re going to expect half of them to decay – but you don’t know which half; it could be all of the ones on the left, it could be a random distribution of them – you don’t know which ones will have decayed. And it could be that they all decide to sit there and go, “No, I won’t decay”, but the greatest probability is that half of them will have decayed in one half-life.
Fraser: And because you’re dealing with so many particles all at the same time, you are getting this distribution – this average – and generally, you’ll end up with half the element, you know, as per the half-life.
Pamela: Right. And this is one of those things where how well it fits the equation is a direct function of having more things in your population.
So, if you have a population of four happy little nuclear isotopes sitting there, it’s one of these things where you’re not that likely to see exactly two of them left, then one of them, then zero of them – just because there’s only four, and that’s a lot to ask.
The more you have, the better the fit of what’s observed to the theoretical model is going to be.
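A minimal sketch of that population-size point: simulate one half-life for a handful of nuclei versus a large batch, where each nucleus independently has a 50-percent chance of surviving.

```python
import random

# Small populations scatter around the half-life prediction; large ones hug it.
random.seed(11)

def surviving(n_nuclei: int) -> float:
    """Fraction of n_nuclei still undecayed after one half-life."""
    return sum(random.random() < 0.5 for _ in range(n_nuclei)) / n_nuclei

for n in (4, 100, 1_000_000):
    print(f"{n:>9} nuclei: {surviving(n):.3f} surviving (expected 0.500)")
```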
Fraser: Do you think that there’s any way that we can get good at this? Or do you think that this is sort of one of those fundamental weaknesses of human beings and we should always be aware and alert in ourselves?
Pamela: As human beings, we’re wetware that comes with a whole lot of predetermined biases built in.
Fraser: Yeah.
Pamela: Features, not bugs, I’m told. And our inability to notice bias, our inability to notice how we preferentially view things as having a higher probability or a lower probability, isn’t going to get inherently better. That’s the code. I have to figure out how to monkey-patch the code by recompiling it with a new set of parameters. And that’s called “getting an education”.
Fraser: Monkey-patch, I love that. That is the new quote of the week, I think. That’s awesome.
Pamela: So, we’re biased creatures and we just have to figure out, through education, how to overcome all of our built-in bias.
Fraser: Yeah. Well, I think that example you gave of, like: If you see that something increases the probability of something by some number – 100 percent, 200 percent, 17,000 percent – you could literally feel free to ignore that until someone tells you –
Pamela: Yeah.
Fraser: – what the actual odds are. And compare that to other things that might happen.
Pamela: And seriously, if something says it’s within one standard deviation, just sort of raise – I can’t raise one eyebrow on cue.
Fraser: Yeah, ignore it. Yeah.
Pamela: Raise one eyebrow in a very questioning way at it.
Fraser: Yep.
Pamela: If it’s two sigma, you’re still only looking at about 95 percent of the population fitting in that. So it’s only when you start to get out to six sigma, where you’re at basically everything, that you can start to go, “Yeah, that’s a completely reasonable error bar.”
Fraser: Alright. Well, thanks Pamela. We’ll see you next week.
Pamela: Thank you.
Male Speaker: Thank you for listening to Astronomy Cast, a non-profit resource provided by Astrosphere New Media Association, Fraser Cain and Dr. Pamela Gay. You can find show notes and transcripts for every episode at astronomycast.com. You can email us at info@astronomycast.com. Tweet us @astronomycast. Like us on Facebook or circle us on Google Plus.
We record the show live on YouTube every Friday at 1:30 p.m. Pacific, 4:30 p.m. Eastern or 2030 GMT. If you missed the live event, you can always catch up over at cosmoquest.org or our YouTube page. To subscribe to the show, point your podcatching software at astronomycast.com/podcast.xml, or subscribe directly from iTunes. Our music is provided by Travis Serl and the show was edited by Chad Weber.
This episode of Astronomy Cast was made possible thanks to donations from people like you. Please give by going to astronomycast.org/donate.
[End of Audio]
Duration: 33 minutes

Download the show [MP3] | Jump to Shownotes | Jump to Transcript
