Ep. 547: Why Astronomy Still Needs Humans

Few sciences have been able to take advantage of the power of computers like astronomy. But with all this computing power, you might be surprised to learn how important a role humans still play in this science.

In this episode we mentioned donations. Click to learn more!

Download MP3 | Download Raw Show with Q&A | Show Notes | Jump to Transcript or Download

Show Notes

Transcript

Transcriptions provided by GMR Transcription Services

Fraser:                         Astronomy Cast, Episode 547. Why Astronomy Still Needs Humans. Welcome to Astronomy Cast, your weekly facts-based journey through the cosmos where we help you understand not only what we know but how we know what we know. I’m Fraser Cain, publisher of Universe Today. With me, as always, is Dr. Pamela Gay, Senior Scientist for the Planetary Science Institute and the Director of CosmoQuest. Hey, Pamela. How are you doing?

Pamela:                        I’m doing well. How are you doing, Fraser?

Fraser:                         Great. Back from more travels.

Pamela:                        I am. Last week I was out in San Francisco for QCon and I’m so grateful for the opportunity to attend that the hosts gave me. It was an amazing week. I feel so inspired. All I want to do is go and write software; unfortunately, I have paperwork I need to get completed first, and then I’m gonna write all the software. All of it. All of it will be written and I will enjoy it.

Fraser:                         I booked flights for me and Chad to attend the American Astronomical Society meeting in Oahu in January.

Pamela:                        Excellent.

Fraser:                         So, we’re in. We’ll be there. And so, if you’re listening to this and you are also attending AAS, find us. Let’s hang out.

Pamela:                        We will all be there together.

Fraser:                         We’ll all be there together, yeah. Few sciences have been able to take advantage of the power of computers like astronomy. But, with all of this computing power, you might be surprised to learn how important a role humans still play in all the science. Pamela, you’re telling me that humans haven’t been completely relegated to the feeding and servicing of computers?

Pamela:                        There are days when we think that would be the best of all possible options but, not yet. They still need us. It turns out there’s still stuff that machine learning and computer vision just can’t quite do.

Fraser:                         Yeah.

Pamela:                        And we’re needed. We’re actually needed.

Fraser:                         But it is surprising, I mean, this episode is gonna be about how humans are still needed. But it is pretty amazing how much of modern astronomy is completely due to and thanks to the power of computers and modern – I mean, when you think about what astronomers had to go through 100 years ago, when they had to take pictures, they had to – on photographic plates. They had to blink back and forth between those photographic plates, trying to find things like Pluto. Oh, the work. And now, beep boop, computers do it all for you.

Pamela:                        And what’s really amazing, to give an example, at the beginning of this, of the extreme opposite case. With the Gaia data sets, there are people who have prewritten possible papers such that when the new databases of data get downloaded, their software crunches and then outputs the appropriate blocks of text for what was found in the database and then submits the paper. So that people can essentially beat one another to publication.

And this auto-generation, Mad Libs-style research era that we’re now in is truly amazing. But then at the opposite end of that, we have the science we’re talking about today, which is where you still have researchers gathering around the table with a Wacom tablet and screen, hand drawing things on images, trying to figure out what it is they’re looking at.

Fraser:                         So, what’re the places then where human beings still have a role to play in the science of astronomy?

Pamela:                        So, we have four basic roles that we’re gonna hit on during this episode. The first is, we’re there to confirm that the computers didn’t go off the rails.

Fraser:                         Right.

Pamela:                        So, we’re there to essentially look at the potential supernovas and say yes, that actually is a supernova and not a cosmic ray. We are there to –

Fraser:                         So – sorry. So, just give like that as a specific example, right? How would computers be involved in trying to find potential supernova candidates and then, how would a computer go horribly, horribly wrong in attempting to identify this and where a human would be called in? So, what’s the process? So, that I can kind of understand.

Pamela:                        With supernovae in particular, and this works for a lot of different things, you can do a similar process with asteroids, for instance. You take two images of the exact same area of the sky. You then subtract these two images and where nothing moved and nothing changed in brightness, you should get nothing. Nothing is a good result. Now, the reality is that from one night to the next, sky conditions will change, stars will appear a little bit fuzzier, a little less fuzzy. And what you actually end up with is residual donuts.

And if you have an object that is significantly brighter in one image or is present in one image and not present in the other, you’ll end up with a brighter spot and a darker spot in the two partner images. Now, that’s if something is moving. So, if it moves, it ends up subtracted out of one image and added into the other image. If something stays constant, you end up with sky brightness. If it just brightens, you end up with it in one image and – it’s there. You have, hopefully, a supernova. Now, the problem is that that brightening doesn’t necessarily have to be the star that did the brightening.

You can end up with annoying things like a cosmic ray that coincidentally hit exactly where your star is. Or, in this case, where a fuzz of light in a distant galaxy is.
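For anyone who wants to see the shape of that subtraction step in code, here is a minimal sketch in Python with NumPy. Every name, number, and threshold is an illustrative assumption, not any survey’s actual pipeline; real difference imaging also registers the frames, matches their point-spread functions, and scales them photometrically before subtracting.

```python
import numpy as np

def find_transient_candidates(image_night1, image_night2, threshold_sigma=5.0):
    """Naive difference imaging: subtract two aligned exposures of the same
    field and flag pixels that brightened significantly between them.

    Assumes the images are already registered and on the same photometric
    scale; skipping the PSF-matching step is what leaves the residual donuts
    Pamela mentions."""
    diff = image_night2 - image_night1          # positive values = brightened
    noise = np.std(diff)                        # crude noise estimate
    candidates = np.argwhere(diff > threshold_sigma * noise)
    return diff, candidates

# Toy data: a blank patch of "sky", one source that brightens between nights
# (a pretend supernova) and one spike present only in the second frame
# (a pretend cosmic ray hit).
rng = np.random.default_rng(0)
night1 = rng.normal(100.0, 2.0, size=(100, 100))
night2 = night1 + rng.normal(0.0, 2.0, size=(100, 100))
night2[40, 60] += 500.0   # real brightening
night2[10, 10] += 800.0   # single-frame cosmic ray

diff, candidates = find_transient_candidates(night1, night2)
print(candidates)
```

Both spots get flagged by the same cut, and nothing in the difference image alone says which one is the supernova and which is the cosmic ray. That is exactly the false-positive problem the hosts go on to describe, and it is where the human comes in.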

Fraser:                         And a cosmic ray is essentially a blast of light on the detector. I mean, the detector sees it as this place where something was very bright.

Pamela:                        Yeah.

Fraser:                         And when you make that subtraction of the two images, you’re gonna see this bright object that’s there, when really, it wasn’t a supernova exploding.

Pamela:                        Right. Exactly. You can also get weird things happening with your software if a satellite crosses in the wrong place, the wrong part of your image, so that it’s coincident with something. And human beings need to be there to do the, huh, that was a really weird supernova – not supernova. That was a really weird cosmic ray, our software can’t handle it. There was a coincident satellite streak, our software couldn’t quite remove it. All these little things end up with false positives and sometimes false negatives, which are way worse.

When something real happened and your software is like nope, I’m gonna ignore that. So, human beings have to constantly be in there monitoring the things that change that the software flags to go yes, that’s real. No, that’s not.

Fraser:                         The false – I mean, you mentioned the false negative problem, that is scary. That you are trusting a computer to hammer through 100,000 photographs that’ve been taken of the sky looking for differences, and there might be a supernova in there, and for whatever reason, the way you have defined the characteristics of your software, it’s gonna go nope, not good enough. And it’s gone. And you’re not gonna double check a couple hundred thousand, right?

Pamela:                        No, and in terms of supernovae, it’s annoying but it’s not actually scary. Where it actually is scary is in the sky surveys that are out there looking for the asteroid that’s gonna hit the planet Earth.

Fraser:                         Yes.

Pamela:                        And the most dangerous ones are the ones that we get very little warning on, the ones that are coming at us out of the direction on the sky that corresponds to the Sun. And we might not get a lot of warning. So, you don’t want those to be ignored by your software.

Fraser:                         There was a really dramatic example of it just this summer. There was one of the closest calls that Earth has had in a long time and it did get picked up in the automated sky surveys, but it also happened to be a time when I think there was like a full moon nearby. And so, the full moon, matched with the direction that the asteroid was coming from, matched with the small amount of time that they had to find it – they, you know, they missed it.

Pamela:                        Yeah.

Fraser:                         And then it went right past the Earth and everybody was like, what?

Pamela:                        Yes. So, yeah. The sky is a dangerous thing sometimes, it throws rocks at us.

Fraser:                         Yeah. So, this is one classification that you were mentioning. What else is there?

Pamela:                        So, beyond being there to basically check whether the software is off its rocker, there are also going to be a lot of times, and this comes up more in astronomy than it does in planetary science, where we can train our software now to go through and say this is a potential this kind of thing, this is a potential that kind of thing. And it does this via machine learning. Now, one of the things that doesn’t require machine learning, that we’ve been able to do for my entire career, is go through an image and find every source of light in the image.

Identify where all the stars are, find those disparate globular-looking things. And we’re finally at the point now where machine learning can go galaxy, galaxy, planetary nebula maybe. But there’s a lot of weird looking stuff out there, especially the hydrogen emission regions, those beautiful red nebulae that we love to look at. Supernova remnants, these things all like to come in entirely their own shape. Even many planetary nebulae aren’t that nice classic round shape; they take on their own structures. So, what the software can do is say, there’s something here, look, look, please look.

And we have to essentially do that next step for our software and say, ah ha, yes. Thank you, software. Let me tell you what this is. And that’s still kind of awesome, that the universe has certain classes of object that we haven’t seen enough times yet to train our machine learning on. Now, when it comes to the future of the Large Synoptic Survey Telescope, we will hopefully begin to get enough data that it turns out all those square planetary nebulae are old school and we can just let the software do it, but we’re not there yet.
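As context for the non-machine-learning step Pamela mentions, the classic way to find every source of light in an image is to estimate the background, threshold it, and label the connected bright blobs (SExtractor is the long-standing tool for this). The sketch below shows that idea with NumPy and SciPy; the image, the 5-sigma cut, and the blob positions are all made up for illustration.

```python
import numpy as np
from scipy import ndimage

def detect_sources(image, nsigma=5.0):
    """Find everything brighter than the sky background.

    Estimate a background level, threshold the image, and label the
    connected bright regions. Deciding *what* each blob is (galaxy,
    planetary nebula, HII region...) is the part that, as discussed,
    still often falls to a human."""
    background = np.median(image)
    noise = np.std(image)                                  # crude noise estimate
    mask = image > background + nsigma * noise
    labels, nsources = ndimage.label(mask)                 # connected components
    positions = ndimage.center_of_mass(image, labels, range(1, nsources + 1))
    return nsources, positions

# Hypothetical frame: noisy background plus two bright blobs.
rng = np.random.default_rng(1)
img = rng.normal(10.0, 1.0, size=(200, 200))
img[50:55, 50:55] += 30.0
img[120:130, 140:150] += 15.0
print(detect_sources(img))   # finds the 2 blobs and their centroids
```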

Fraser:                         So, again, provide me with an example of what that feels like as a professional astronomer. How do you know that that thing is a nebula or it’s not, if the computer can’t tell? The computer has, again, gone through the 100,000 images, it’s spat out all the weird stuff and then you look at the picture. What clues would you use as an astronomer to figure out what this thing is?

Pamela:                        Well, quite often the machine learning isn’t smart enough to generalize between a bunch of different things. So, when it keys in on trying to identify what a thing is, it may look at what the distribution of light in the image is. And as far as I know, we don’t yet have anything that’s smart enough to use multiple catalogues, to be able to cross-correlate.

But I as an astronomer can go, this dead bug shaped blob of stars and gas and multiple different colors, this is a star forming region. This large field of red light that has large swaths of blackness through it, this is a hydrogen cloud with dark dusty bands in front of it. And this is because my brain has the ability to key in on all the individual features and understand that this set of things –

Fraser:                         Right.

Pamela:                        This is what star forming regions look like. This sort of thing, this is what emit – hydrogen gas emission areas look like.

Fraser:                         Right. And so, it’s like you’re putting the thing that you see in context of the larger structure that you’re looking at, at the same time and human beings are very good at that.

Pamela:                        And we don’t care what kind of dead bug a star forming region looks like. We’re able to interpolate across all these possible different shapes on the sky and say all of these things. That yeah, they all have different boundaries. They all have different patterns of light and dark. These are actually all the exact same thing.

Fraser:                         Right. What else?

Pamela:                        Well, over in planetary science, we have the surfaces of worlds and it turns out that software and the surfaces of worlds do not get along. We are to the point that some of the best algorithms out there are doing better than 90 percent in mapping out simple things like sharp-edged craters, when the training set and the area the software is working on are the same kind of surface. So, you’re sticking to the same kind of regolith. You’re sticking to the same kind of maria on the Moon and you’re not doing anything crazy.

But 90 percent, 95 percent even, isn’t good enough if you’re trying to land a spacecraft because, let’s face it, the Moon and Mars already like to eat spacecraft when we know exactly what we’re doing. Let’s give our little rovers and landers every possible chance they can have and, well, to do that, human beings need to be mapping these landing areas. And this gives us a job; we have to fill in for what machine learning isn’t clever enough to do yet, even with these objects that occur millions of times on a given surface. So, all those boulders on Bennu?

Fraser:                         Right.

Pamela:                        That’s on us. That’s on us.

Fraser:                         Now, is part of that though, like there’s a photograph of say, the surface of Ben – Bennu is a good example, right?

Pamela:                        Yes.

Fraser:                         And of course, you’re – maybe all you dream of is Bennu at this point, with the amount of rocks that you’ve identified on the surface of that collection of rocks. But is part of it, though, just the quality of the data that you have so far? Like, if you had an aircraft on Mars that flew over a potential landing zone very carefully with a laser altimeter to measure a 3D reconstruction of every single object, that would be easier to feed into a machine learning program than one photograph taken from orbit around Mars. It’s those first images where it’s vague. And so, the human really plays a role, and later on, once the science and once the data gets better, then the machines can step in and figure this out with more accuracy.

Pamela:                        I wish it was that simple. What we’re running into with Mars and the Moon is we have lidar data of both of these surfaces. But the lidar data isn’t of sufficient resolution, and there’s enough scatter and noise in it that it doesn’t get us down to being able to see the same kinds of details that we can see with the cameras. HiRISE on Mars Reconnaissance Orbiter, the narrow angle camera on Lunar Reconnaissance Orbiter. These are recording things at amazing resolutions. With the Moon, we’re down to tens of centimeters per pixel with the highest quality images, and this means that if you were to lie down on the Moon and assume the snow angel position, we could see you on the Moon.
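To put that pixel scale in perspective, here is the back-of-the-envelope arithmetic. The half-metre-per-pixel figure is roughly the Lunar Reconnaissance Orbiter narrow-angle camera at its sharpest; treat the exact numbers as illustrative assumptions.

```python
# Rough pixel-scale check: how many pixels would a person span at ~0.5 m/pixel?
pixel_scale_m = 0.5        # metres per pixel (illustrative LRO NAC value)
person_height_m = 1.8      # lying down in the "snow angel" pose
print(person_height_m / pixel_scale_m)   # ~3.6 pixels: a smudge, but a visible one
```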

And, with these resolutions, we should be able to just say go forth, software, and at least tell me everything that’s 10 meters and bigger. What we run into is that the Sun has this super annoying habit of moving through the sky and the shadows change. Okay, fine. We can deal with that. Well, soil comes in different colors and textures. Well, software can’t generalize across that. And then, you just have weird stuff. So, on the Moon we run into things where we have these magnetic anomalies that cause grains of material of radically different colors to land in these patterns that look like, well, somebody’s been sand painting swirls on the Moon.

Fraser:                         Yeah.

Pamela:                        And how do you get your software to understand that a crater in one area of soil is going to be filled with white and spray white material everywhere? A crater somewhere else is going to be black on the inside and spray black material everywhere. Sometimes it’s a little of both. Sometimes it’s gray all over. Sometimes we can have patterns of color that have nothing to do with topography. And as the Sun moves, everything changes.

Fraser:                         Right, right, right. And so, in theory you could take a whole bunch of really powerful computers, feed them lots and lots of examples and teach them carefully to learn how to identify these things, but maybe the quickest thing is to have a human being take a quick look. Teach a four-year-old to identify craters on the Moon or rocks on Bennu.

Pamela:                        They don’t – the four-year-old doesn’t have the physical hand-eye coordination. The four-year-old, I love you, you can’t do this. Yeah.

[Crosstalk]

Fraser:                         Fine, and then that 10-year-old, you know. Yeah.

Pamela:                        Yeah, fine. That – yes. And the other problem that we run into is, especially for objects like Bennu, which is a half kilometer across, you could sit it down on a large city block. It’s so tiny that we could try and train a machine learning algorithm, but the data set would be the entire surface of Bennu. And if you have to train your software with your entire data set, you’ve now written software that has no purpose.

So, with the Moon, what I’m really hoping is we’ll figure out a sampling set, such that if we hand mark 10 percent of this kind of geology, it will be able to go off and handle the rest. But we’re still gonna have to do 10 percent of every different kind of surface combination of color, texture, sun angle. We’re not there yet and it’s super frustrating to not be there yet.
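The hand-mark-10-percent idea Pamela describes is, in machine learning terms, an ordinary supervised training split: label a fraction by hand, train on it, and let the model classify the rest. Here is a minimal sketch with scikit-learn on made-up feature vectors; the features, labels, and resulting accuracy are placeholders, not the actual CosmoQuest or mission pipelines.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Fake data: each row stands in for an image patch reduced to a feature vector,
# labelled True if a human marked a crater (or boulder) there, False otherwise.
rng = np.random.default_rng(42)
features = rng.normal(size=(5000, 16))
labels = (features[:, 0] + 0.5 * features[:, 3]
          + rng.normal(scale=0.5, size=5000)) > 0

# Train on a hand-marked 10 percent, then see how the model does on the rest.
X_train, X_test, y_train, y_test = train_test_split(
    features, labels, train_size=0.10, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(f"Accuracy on the remaining 90 percent: {model.score(X_test, y_test):.2f}")
```

The catch, as the discussion makes clear, is that the labelled 10 percent has to cover every combination of terrain, texture, and sun angle the model will later see, which is exactly what is hard to guarantee on real planetary surfaces.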

Fraser:                         Right. When the kind of science that you’re trying to do is like, where’s every crater on the Moon, and then let’s figure out how old different parts of the Moon are; or where’s every rock on Bennu, so that we can figure out which is the safest place to land; or where’s every potential Kuiper Belt object that New Horizons could visit, etcetera, etcetera. Like, with all of this, it’s beyond just, I want to find a specific example of a thing, and it’s moving to, I wanna understand it all so that I can make some larger understandings about what’s going on.

Pamela:                        It’s super annoying. We have humans –

[Crosstalk]

Fraser:                         So, what was the – Yeah, what was the fourth one?

Pamela:                        So, then the fourth one is the things that we can’t even begin to tell computers how to do. And this is actually what inspired me to do this entire podcast. A couple weeks ago, the New Horizons team released detailed geologic maps of the far side of Pluto. What we’ve come to call the near side is the side that was nearest the New Horizons spacecraft when it did its closest approach. The far side is the side that we’ve only seen in the lower resolution data that was taken when the spacecraft was further away.

Fraser:                         Like the Moon, it has no dark side.

Pamela:                        Exactly. And looking at these maps, they’re super cool and they have these beautiful curved continuous edges going from one geologic feature to another geologic feature. They trace out all of these different stripy bits, light plains, craters, and all of it was hand drawn. All of this is science that’s generated, most often, by people sitting around a Wacom tablet at a table, staring at a monitor together, and as a group deciding this line should go over here.

No, let’s shift it a pixel; let’s erase that and extend this a bit. And this unification of art and geology being the only way to get at geologic feature maps is something that I find absolutely remarkable. Truly beautiful to look at and super frustrating, because it means there’s no clear mathematical definition of what belongs to Sputnik Planum and what belongs to the stuff beside it.

Fraser:                         And so, those maps, and I think we reported on that on Universe Today as well, that it got images of Pluto when – of the other side of Pluto when it was so far away, and it was just this teeny tiny dot. And so, it’s amazing, right? You’ve got these astronomers that are sitting there and free form attempting to draw what they see. Large impact craters, more regions that connect to Sputnik Planum, etcetera. It’s quite an amazing piece of work.

Pamela:                        And they get really good results where it’s all by eye. The human mind can look at Pluto and say, based on the shadow, the texture and the color that I see here, these are mountains. And then combining spectral information, these are water mountains. They can look at the color variation across Sputnik Planum and see where there’s been convective cells that have changed the surface ever so slightly from one point to another.

And this is all subtle variations that are specific world to world. Specific place to place and these are things that because machine learning can only do what it’s been told and what it’s seen examples of, every new world is gonna have to be done, at this point in our technological evolution, by individual scientists working by hand.

Fraser:                         I love this idea that they had these images of Pluto, I mean we even had images from the Hubble Space Telescope and I’ve even seen people do that, they’ve gone back now and recreated what Hubble was seeing.

Pamela:                        Yeah.

Fraser:                         Now that we know what it actually looks like. And so, they – we got the close-up images, high resolution and then in turn a human can interpret what that looked like and look at low resolution images and do a better job of showing what’s there. And then you can take an even lower resolution and do a better job of figuring what that is. And probably, if you have more distant – and then you compare what was seen with, and I forget the name now, MU69, Arrokoth? I forget the way to say it.

You’ve got different surface features on that and now with two different versions you could, you know, show someone a low-resolution image of a third object and people could take a crack at attempting to see what’s on the surface. Map out the surface features of that world without necessarily having to get as close as New Horizons did. So, it’s a very powerful process and, as you say, couldn’t be done without humans. I love those old sketches that people used to do of the Moon and Jupiter and –

Pamela:                        Mars.

Fraser:                         Galaxies and Mars and things like that. I mean, okay. So, maybe some of them thought they saw canals but, you know. Most of the time, they’re absolutely fascinating and there’s a lot of astronomers, amateur astronomers, even professional astronomers that still sketch –

Pamela:                        Yes.

Fraser:                         To help understand what they’re looking at.

Pamela:                        And there’s psychological research that shows that the connection when you’re physically writing and drawing is significantly greater than when you’re pounding away on a keyboard. I don’t fully understand it, I just know that this is what the research is showing. And it’s getting to the point, however, with all of our devices that allow styluses to be used, especially with the Apple Pencil, that you can bring all of that hand-eye cognition into what we’re doing with our scientific images.

And what I’m also really loving along these lines is seeing people then repeat the cycle as they’re hand drawing their notes while listening to conference talks. So, you’re going from digital image off the spacecraft, to scientists around a table drawing the map, to scientists listening to the talk recreating the drawing in their notebooks. And art is really at the core of planetary science. And this is something I would never have guessed at when I was a graduate student.

Fraser:                         So, I think we still would both agree that learning to be a computer programmer is one of the most valuable skills that you’ll need for becoming an astronomer. And yet, don’t throw away your drawing pencils as well.

Pamela:                        Exactly.

Fraser:                         Thanks, Pamela. Do you have some names to read out this week?

Pamela:                        I do. As always, our show would not be possible without the generous contributions of folks over on Patreon.com/AstronomyCast. You keep our show going, you pay Suzie. You’re gonna be helping out with our AAS travel. So, thank you to Frederick Hognick Von Jensen. Thank you to Phillip Walker, to Enod Avelon, to Cooper, to Chris Shirehoffer. Sorry about that, Chris. Thank you to Paul Disney, thank you to Dave Lackey, thank you to G4184. Thank you to Steven Ludking, thank you to Sarah Turnball. Thank you to Dana Norey. All of you are our heroes this week. And we’ll have new heroes next week.

Fraser:                         Right on. Thanks, Pamela. We’ll see you next week.

Pamela:                        Buh bye.

Female Speaker:          Thank you for listening to Astronomy Cast. A non-profit resource provided by The Planetary Science Institute, Fraser Cain and Dr. Pamela Gay. You can find show notes and transcripts for every episode at Astronomy Cast. You can email us at info@astronomycast.com. Tweet us @AstronomyCast. Like us on Facebook and watch us on YouTube. We record our show live on YouTube, every Friday at 3:00 PM Eastern, 12:00 PM Pacific or 19:00 UTC. Our intro music was provided by David Joseph Wesley. The outro music is by Travis Surrel. And the show was edited by Suzie Merv.

[End of Audio]

Duration: 29 minutes
