Astronomers rely on the optics of their instruments, and there are some basic limits that you just can’t avoid. Whatever we look at is distorted by the optics; in fact, a basic property of light means that we’ll never get perfect optics. Here’s why we can’t “magnify and enhance” forever.
This episode is sponsored by: 8th Light
Transcription services provided by: GMR Transcription
Female Speaker: This episode of Astronomy Cast is brought to you by Swinburne Astronomy Online, the world’s longest running online astronomy degree program. Visit www.astronomy.swin.edu.au for more information.
Fraser Cain: Astronomy Cast episode 380, The Limits of Optics. Welcome to Astronomy Cast, our weekly facts-based journey through the cosmos. We’ll help you understand not only what we know, but how we know what we know. My name is Fraser Cain. I’m the publisher of Universe Today. And with me is Dr. Pamela Gay, a professor at Southern Illinois University Edwardsville and the director of CosmoQuest. Hey Pamela, how you doing?
Pamela Gay: I’m doing well. How are you doing, Fraser?
Fraser Cain: Doing great. So we didn’t really queue up anything to talk about. But is there anything new and interesting happening over on CosmoQuest? Anything that people should go and do? Anything they should analyze?
Pamela Gay: We are doing a Summer of Science. We are encouraging everybody to get online, participate in doing a certain amount of science every day. Scholastic Summer Reading Program encourages an hour of reading. We’ll encourage an hour of science. But because reading’s also important, we have a blog post up on perfect pairings of science and science fiction.
So when you’re analyzing the moon, listen to The Moon is a Harsh Mistress. If you’re working on Mars, The Martian is there for you. So check out our blog for those perfect pairings of audiobooks and science that you can do.
Fraser Cain: That is great. I plan to inflict this on the children.
Female Speaker: This episode of Astronomy Cast is brought to you by 8th Light, Inc. 8th Light is an agile software development company. They craft beautiful applications that are durable and reliable. 8th Light provides disciplined software leadership on demand and shares its expertise to make your project better. For more information, visit them online at www.8thlight.com. Just remember, that’s www dot, the digit eight, T-H, L-I-G-H-T, dot com. Drop them a note. 8th Light: software is their craft.
Fraser Cain: So astronomers rely on the optics of their instruments. And there are some basic limits that you just can’t avoid. Whatever we look at is distorted by the optics. In fact, a basic property of light means that we’ll never get perfect optics. So here’s why we can’t magnify and enhance forever.
So that term, right? Magnify and enhance. Magnify and enhance. That drives you bonkers, doesn’t it?
Pamela Gay: It’s not so much that it drives me bonkers as it’s just sort of like, I wish. That’s a good thought.
Fraser Cain: That would be great. I could – Nobel – so many Nobel prizes could have been –
Pamela Gay: Right.
Fraser Cain: – achieved. Yeah.
Pamela Gay: It’s just – I will rage against the television during certain spy shows when they do the whole, “We have an image from outer space!” And they show the raw image. And in the raw image, everything is kinda pixelated. And then they do magic! And when they do magic, suddenly you can, like, read license plates and stuff. And it’s just sort of, like, if the raw image had that as two pixels across, it’s still two pixels across.
Fraser Cain: So I guess let’s go back to, like, the spy movie or whatever, right? And you just sort of described this really well. They’ve got this kind of grainy, blurry image from really far away, and they zoom in. And then someone says, “Zoom in and enhance!” And they pick some little part and then they blow it up. And then a computer does some kind of pixel interpolation, and you get a little bit more resolution. And then, “No, no! Just that area on the license plate!” And then it zooms in again. You get a little … and then boom, the license plate comes into view. And then they’ve got their lead. Where are they going wrong there?
Pamela Gay: So the fundamental issue is, you get data with a digital detector such that you have little tiny light buckets called pixels that are simply going, “Light!” or “No light.” And they do this usually in several thousand different increments of light or no light. Those pixels, you can only do so much to figure out, well, if you have light in this pixel and no light in this pixel, what must be going on between these two pixels?
And with these spy shows, what they’re doing is essentially saying, “We have six pixels, and we’re going to somehow take those six pixels” – or usually it’s something like 3 × 6 pixels – “and we’re going to resolve those 3 × 6 pixels into 300 × 60 pixels!” And you can’t get information out of nothing. It’s sort of like saying your text on your screen is suddenly capable of allowing you to read an entire page of a novel, when there’s only five pixels by ten pixels on the page or something.
Fraser Cain: Right. And so if you actually blew up those pixels to the larger resolution, you would just get –
Pamela Gay: Bigger pixels.
Fraser Cain: – five big, big pixels, five big squares. And you wouldn’t get any additional information.
So I think that’s pretty obvious to most of us. But I think where people are going wrong now is that when they’re seeing these photographs and such of Ceres and Pluto, and then they’re starting to kind of hypothesize what we could and couldn’t do with our current telescopes. There are not only sort of the capabilities of the instruments, but also just basic laws of physics that we can’t overcome, right?
Pamela Gay: And there are some deceptions that end up occurring, whether we want them to or not. So first of all, there’s the basic problem that if you open up your picture of Pluto, picture of Ceres, picture of whatever it is that you took a picture of that is three pixels across, you can blow it up in Photoshop. And say you use bicubic interpolation or some other form of interpolation. And it will happily go, okay, so when I make this bigger, the center of this pixel has this value; the center of this pixel has this value.
I’ve made each of those pixels now 20 × 20, and I’m going to do some mathematical smoothing between those two points to make shit up. And that’s what Photoshop is doing. It’s making up a mathematical way of getting from the center of one pixel to the center of another.
So people get deceived into the, “But that picture I see on www.space.com is, like, clearly 100 pixels across.” Well, no. But the image that we got back to earth was three. Chill.
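What that interpolation actually computes can be sketched in a few lines of Python – bilinear blending here for simplicity rather than Photoshop’s bicubic, with a made-up two-pixel “scene” as input:

```python
def bilinear_upscale(img, factor):
    """Upscale a 2-D grayscale image (nested lists, at least 2x2)
    by bilinear blending. Every output value is a weighted mix of
    the four nearest input pixels -- smooth, but it contains no
    information the input lacked."""
    h, w = len(img), len(img[0])
    out_h, out_w = h * factor, w * factor
    out = []
    for i in range(out_h):
        y = i * (h - 1) / (out_h - 1)      # position in input coordinates
        y0 = min(int(y), h - 2)
        fy = y - y0
        row = []
        for j in range(out_w):
            x = j * (w - 1) / (out_w - 1)
            x0 = min(int(x), w - 2)
            fx = x - x0
            # Blend the four surrounding input pixels.
            top = img[y0][x0] * (1 - fx) + img[y0][x0 + 1] * fx
            bot = img[y0 + 1][x0] * (1 - fx) + img[y0 + 1][x0 + 1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

# A two-pixel-wide "license plate": light next to dark.
tiny = [[1.0, 0.0],
        [1.0, 0.0]]
big = bilinear_upscale(tiny, 4)
# The corners still equal the original pixels; everything in
# between is just a blend -- no new detail has appeared.
```

However large you make `factor`, the output never contains anything but weighted averages of the original pixel values.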
Fraser Cain: Yeah. We get that a bit with – I don’t know if you ever have people, like, finding creatures on Mars or strange – they think they’re seeing buildings and structures on the surface of Mars. And what they’re really seeing is artifacts from Photoshop attempting to scale up things. You scale up what could be a tiny little object in the image, and it’s gonna take on rectangular features because the pixels themselves are rectangular.
Pamela Gay: And then the next problem we end up with is, any time you look at something through an optical system, it’s going to end up getting an artificial blurring, no matter how clear the picture, no matter how perfect the lack of atmosphere you’re looking through. And this is due to a physical phenomenon called the formation of an Airy disk. It’s what happens when you have light getting focused down to a point. Well, the light getting focused down to a point is going to have a ring around it, which is gonna have a ring around it, which is gonna have another ring around it.
And how small that central point in the Airy disk is is defined by your optical system. And it’s always gonna be bigger than zero. So if you’re looking at a point source that should have, like, all the light in the absolute tiniest point possible, that tiniest point actually is gonna end up getting an artificial size to it due to the interference inherent in light waves.
Fraser Cain: Whoa.
Pamela Gay: Yeah.
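The size of that smallest possible point follows directly from the aperture. A quick back-of-the-envelope sketch using the Rayleigh criterion, with Hubble’s 2.4 m mirror and a hypothetical 10 cm backyard scope as examples:

```python
import math

def diffraction_limit_arcsec(wavelength_m, aperture_m):
    """Rayleigh criterion: the angular radius of the Airy disk's
    first dark ring is theta = 1.22 * lambda / D (radians)."""
    theta_rad = 1.22 * wavelength_m / aperture_m
    return math.degrees(theta_rad) * 3600  # radians -> arcseconds

# Hubble's 2.4 m mirror at 550 nm (green light): about 0.06 arcsec.
hst = diffraction_limit_arcsec(550e-9, 2.4)

# A hypothetical 10 cm backyard refractor at the same wavelength:
backyard = diffraction_limit_arcsec(550e-9, 0.10)
```

No amount of magnification gets you below this limit; it is set by the wave nature of light and the diameter of the aperture alone.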
Fraser Cain: So okay. So let’s give some practical examples of this, right? Like in actual – like, so let’s say we’re gonna point the Hubble Space Telescope at Pluto, and we’re gonna take a photograph of it. It’s gonna be itty bitty teeny tiny, even on the Hubble Space Telescope’s massive optics. And this is why all we have are very blurry images of Pluto.
And so you’re saying that interference – the light itself is interfering with itself?
Pamela Gay: Yep.
Fraser Cain: And is causing, I guess, like a probability function where it appears in your images, and that you literally can’t see where it actually is and what it actually is because of this interference?
Pamela Gay: Exactly. Using Hubble as an example, it actually complicates the picture even more because when you’re dealing with Hubble, now you’re dealing with a complicated optics system that is going to add in its own distortions.
So even if we had, like, a perfect single-lens system, single-mirror system that focused the light down onto a detector of some sort, and we’re looking at a distant star so there is no angular size to what we’re looking at, it is a point source, even that absolute point source looked at through a vacuum because of the nature of the collimated light coming off of it – this is where all the light waves get lined up – you’re gonna end up with the photons interfering with the photons and creating this set of rings that creates a disk.
Fraser Cain: And I think I kind of understand this sort of – I’m thinking about in my mind, right? Like, you can imagine, like, what a lens is, what an optic system is is it’s a funnel. And so you’re taking the light rays that would be parallel to each other, and then you’re squooshing them down so that more of them, a bigger collecting area, is falling on a smaller CCD. And so it’s like you’re packing the photons together, and that’s causing them to interfere with each other. So they’re not where they would have normally been. They’re now in a much tighter area, interfering with each other. Is that kind of right?
Pamela Gay: Not quite. It’s simply the nature of lenses and mirrors. When you play with a lens, you can actually – if you do this just right – you can set up a set of rings by pointing a laser through a lens. Or you can set up a series of lines against a wall if you reflect a laser off of a CD that maybe you got in the mail with something or a DVD you have hanging around in the house.
Light just likes to interfere with itself. It doesn’t have to be crammed down. It simply has to be put in a situation where the rays go from traveling parallel to one another to being put in a position where they can interact. So instead of focusing them down, you can for instance focus them through a slit. And when you focus them through a slit, they spread outwards. So it’s any time the light rays are given a chance to go from being parallel to one another to being not parallel with one another, and they can interfere.
Fraser Cain: Right. And you will literally lose information. So how do astronomers – I’m guessing that when – professional astronomers, they spend some time dealing with learning all this. And then it’s, like, it’s the uncertainty in their results, right? That as they look at images, they know what the …
Pamela Gay: What the resolution of the system is going to be. And it’s actually a good thing that we have this point spread function in a way. Because if you think about it, if you have the light from a star spread out over a few different pixels, that allows you to do better analysis on it, to do things like correct for cosmic rays and things like that, that you couldn’t do if the light was a square star, if you had that single pixel of light. So when you do have telescopes that have really big pixels and really large fields of view, you do have to sometimes knock them a little bit out of focus to get the star light spread out over more than one pixel. So it actually works in our favor to have this point spread function.
Fraser Cain: But it’s different than, like, when you say the actual CCDs where the photons will spill out of one bucket and then into others, right? And then your image actually gets bigger because there’s just too much light falling in any one area.
Pamela Gay: That’s where you saturate your image.
Fraser Cain: Right.
Pamela Gay: So the way to think about a CCD is it’s that ultimate set of detectors that are collecting the star light that’s raining down from the sky. And if you get too much star light in one bucket, it’s gonna flood out of the bucket. And it has the potential where the buckets touch on the sides and the top to spill from one bucket to another.
Now, it turns out the way the optics usually work that you end up with preferential spreading along one axis, which is where you end up with spikes that are often mis-termed diffraction spikes. No, they’re just saturation spikes. But this is one of the things we have to deal with.
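That bucket-spilling along one column can be sketched as a toy model, with a made-up full-well capacity:

```python
FULL_WELL = 100.0  # hypothetical full-well capacity, arbitrary units

def bloom_column(column):
    """Toy model of CCD blooming along one column: any charge above
    the full-well capacity spills into the next pixel down."""
    out = list(column)
    for i in range(len(out) - 1):
        if out[i] > FULL_WELL:
            out[i + 1] += out[i] - FULL_WELL  # excess spills over
            out[i] = FULL_WELL
    return out

# A bright star dumps 350 units of charge into one pixel:
spiked = bloom_column([0.0, 350.0, 0.0, 0.0, 0.0])
# The overload cascades down the column into a saturation spike:
# [0.0, 100.0, 100.0, 100.0, 50.0]
```

The total charge is conserved, but it has smeared along the column – which is why a saturated star shows a bright streak rather than just a bright dot.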
Now, it’s not just the Airy disk that ruins your images to a certain point, or in the case of trying to do photometry, makes what you do a little bit more understandable. You also have to worry about the optics systems smearing the light that you’re looking at in different ways. So this is where I said looking at things with the Hubble Space Telescope’s an actually more complicated problem because you end up with the way the light focuses in the center in the image is different from the way the light focuses on the edge of the image. And this is a convolution of optical issues or tracking issues. Here on the ground, sometimes you just have the, someone bumped their head on the telescope issues.
Fraser Cain: And of course the atmosphere.
Pamela Gay: And the atmosphere. And all of this works against you. And people do try and deal with this by doing fancy deconvolutions. But you can never mathematically say this fancy deconvolution I did is actually representative of what the light would have done had you not bumped your head on the telescope. But it’s the best you can do.
Fraser Cain: And so each iteration, each time you move through a piece of optics system, each time you have to go through an instrument, as you come through the atmosphere, you’re gonna lose a certain amount of information that can never be restored, that can never be brought back.
Pamela Gay: Exactly.
Fraser Cain: And no amount of trying to fill in the gaps is gonna work. And so it just adds uncertainty to uncertainty to uncertainty each step that you go.
Pamela Gay: And this means that we do have to do complicated things when we’re doing our science. For instance, if you’re trying to accurately measure the light from stars, you have to take into account the fact that your stars may actually look like teardrops and count the light within a teardrop-shaped region – what we call the annulus, the shape where we count the light – and say, “This is star light,” and then a teardrop-shaped annulus around that which is where we say, “This is what the sky’s light is,” to get at that sky subtraction from the star to get at pure star light.
It means when we’re trying to separate stars from galaxies at the edge of our ability to separate those two, we have to be careful to realize that our stars may be ellipsoids because of tracking issues. And so you have to say, “All ellipsoids that are shaped in this one exact way, that’s probably a star.” We have ways to take this into account. But it does, as you point out, add uncertainty to what we’re doing.
Fraser Cain: So how do you as an astronomer, when you’re planning out – for example, you’re gonna do an observing run, and you’re trying to get a certain kind of data. How do you account for all of those issues? Does an observatory provide you with the various issues along the way, and then you have to then compute the uncertainty into what you’re doing?
Pamela Gay: It all depends on the system you’re using. So for instance, a lot of the research I did was with a wide field telescope that had a one-degree field of view. And it was on Earth, and it was dealing with wind and it was dealing with tracking issues and all of the things that are fundamental to using a telescope that is as old as I am or older.
I knew that every night, I had to for each different image calculate a new point spread function. I had to figure out, for this image, if I look across all the stars in the field, I can mathematically determine the point spread function up here is like this, down here is like this, and it graduates between those two shapes across the field. So it was image by image.
If you’re dealing with the Hubble Space Telescope, they know the optics-based point spread function. So as long as you’re not doing anything weird that might induce some sort of a tracking error because you’re looking at a moving object that is moving at a rate that maybe you miscalculated, as long as you’re dealing with perfectly normal science, they know the optics of Hubble perfectly, and they can tell you, “This is the point spread function.” And you just mathematically build that in. And the software does it for you once you figure it out or once you’re given it.
Fraser Cain: And so can you come up with a result, but because of the, I guess the limits of the optics along the way, you have to say that the uncertainty falls out – it’s kind of like, your result falls too far into the uncertainty, and so it’s not a result?
Pamela Gay: Well, where you end up having to deal with this the most is that issue of star/galaxy separation. If you’re trying to figure out these faint little things that I’m looking at that are basically the same size as the star, you have to figure out, how much am I interested in the galaxies such that I’d rather accept more stars in my sample, versus how much am I interested in a different science problem so it’s more important that I miss some galaxies and end up with only galaxies? So you have to make choices as you go.
When you’re dealing with photometry, when you’re just trying to figure out how much light is coming off the subject, you quite often end up having to run through a variety of different, “I’m going to try –” One of the things that we look at is what’s called the full width at half maximum. This is where you plot how much light is coming off of an object from the center outwards, and find the width at which that profile drops to half of its peak value. You try different multiples of that width, trying to figure out what is the best solution for the sky conditions you’re dealing with on a given night.
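For a star blurred into a Gaussian profile, the relation between the profile’s width and its full width at half maximum can be sketched directly (the 2.355 factor is a standard Gaussian identity; the peak and sigma values here are made up):

```python
import math

def gaussian_profile(r, peak, sigma):
    """Radial brightness profile of a star blurred into a Gaussian."""
    return peak * math.exp(-r * r / (2 * sigma * sigma))

def fwhm_from_sigma(sigma):
    """For a Gaussian, full width at half maximum =
    2 * sqrt(2 * ln 2) * sigma, roughly 2.355 * sigma."""
    return 2 * math.sqrt(2 * math.log(2)) * sigma

sigma = 1.5            # pixels; a hypothetical seeing-blurred star
fwhm = fwhm_from_sigma(sigma)
# At a radius of half the FWHM, the profile sits at half its peak:
half = gaussian_profile(fwhm / 2, peak=1000.0, sigma=sigma)
```

Photometry packages then try apertures of, say, 1×, 1.5×, or 2× this FWHM to balance capturing star light against letting in sky noise.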
It’s complicated. And there’s entire, basically, observing books dedicated to the simple problems of trying to figure out very precisely, how bright is this thing that I’m trying to measure? And –
Fraser Cain: Right. If you study variable stars, it’s important for you to be able to know if the star is varying in brightness.
Pamela Gay: Or if you’re trying to study many other things. So you can look at the flickerings of a quasar in the distance, if you’re looking at certain types of objects that are basically standard candles. So supernovae, it becomes very important. There’s lots of times when it’s very important to know precisely how bright what you’re looking at is.
Fraser Cain: Right. I can just imagine standard candle with a supernova. That’s a great point, right? Because you could be off by hundreds of millions of light years if you get that wrong. And so for example, results like discovering dark energy depended on them getting the brightness of those Type Ia supernovae perfectly. And so if you get that wrong, then you may not detect dark energy and things like that.
Pamela Gay: And dark energy is another one of those cases where the point spread function became very important because they were looking at the variations in the average shape of galaxies at the edge of what was easy to see, which means you have to know what distortions are due to dark matter – not dark energy, sorry. Dark matter’s another one of those things where it’s very important to understand your point spread function because trying to understand the microlensing and all the other lensing effects that come in due to that dark matter requires you to know, very precisely, my point spread function is causing this distortion, and dark matter is creating this other distortion.
Fraser Cain: Right. And you can only be certain about the part that is outside of the point spread part. So …
Pamela Gay: Yes.
Fraser Cain: Okay. So what – are there some tricks that astronomers can use? I’m guessing build a bigger telescope.
Pamela Gay: Build the bigger telescope is definitely one situation. There’s also the combined multiple telescope so that you have a greater edge-to-edge distance. This is something the Very Large Telescope does in the infrared. It’s something that ALMA does in the millimeter and radio. Interferometry is kind of the ultimate get a greater distance: left, right, forward, backwards, north, south, east, west; pick an axis.
Fraser Cain: And so specifically, right, the baseline of your telescope is what you’re fighting with on this.
Pamela Gay: For resolution, not for how faint or what you’re looking at.
Fraser Cain: Right. Right.
Pamela Gay: For faintness, you want bigger.
Fraser Cain: Yeah.
Pamela Gay: But for resolution, it’s you need, how many wavelengths fits from one edge to the other edge? And the more wavelengths you can fit, the higher the resolution of your telescope.
And we do play some digital games. There’s this evil thing called an unsharp mask, which is the reason that many of the pre-Pluto encounter images of Pluto make Pluto look like it’s lumpy, when it’s probably a real sphere.
Fraser Cain: Right. Now, what about time? Because, like, I know with a lot of amateur astrophotographers, when they take images of Jupiter and such for example, instead of just taking a photograph, they will record a video. And then they’ll stack up all the frames of the video where bits and pieces of it are clear. And it creates a very stunningly clear image that looks a lot better than what you get with just one frame from the telescope.
And I know that NASA actually has developed technology that’s sort of similar for that, that they can – instead of looking at a photograph, they actually can enhance, zoom in and enhance –
Pamela Gay: It’s called drizzle.
Fraser Cain: Yeah.
Pamela Gay: So the idea is that if you take just one image, all the pixels are going to be distorted due to the light getting shifted over time as it comes through. So to give you a specific, the light from the red spot might be three pixels by three pixels, but then also wander an additional three pixels in any given direction over time. So that three pixel by three pixel red spot, over a five-minute image, might blur out to a nine pixel by nine pixel smudge.
Now, if instead of taking that long exposure where the atmosphere is being whimsical, you can instead take a whole bunch of high-speed images, you can stack those high-speed images and align on the specific features so that you can essentially erase the movement that’s put in by the atmosphere.
Now with Hubble, you don’t have to worry about the atmosphere, which is why we put Hubble in space. What they’ve learned with Hubble is if you nudge your telescope around ever so slightly from image to image, you can go from having a feature dead center on a pixel to being half on/half off a pixel. And as you move things around, you’re shifting what the centers of the pixel are looking at. By then drizzling the images together, where you add them in this higher resolution space, you can essentially take advantage of that change in what’s at the center of the pixel to get a higher resolution – sort of. It’s a fake. You can fake the higher resolution image in a fairly valid kind of way that we understand.
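A much-simplified one-dimensional sketch of that dithering idea – the real Drizzle algorithm maps each pixel’s footprint onto a fine output grid with weights, but interleaving two half-pixel-shifted exposures shows the principle:

```python
def downsample(signal, offset):
    """Observe a fine 1-D scene with coarse pixels (two fine samples
    per pixel), after nudging the pointing by `offset` fine samples."""
    shifted = signal[offset:] + signal[:offset]
    return [(shifted[i] + shifted[i + 1]) / 2
            for i in range(0, len(shifted) - 1, 2)]

def interleave(frame_a, frame_b):
    """Lay two half-pixel-dithered frames onto a grid with twice the
    sampling: each frame fills the sample centers the other missed."""
    out = []
    for a, b in zip(frame_a, frame_b):
        out.extend([a, b])
    return out

scene = [0, 0, 10, 10, 0, 0, 0, 0]   # the "true" fine-grained scene
f0 = downsample(scene, 0)            # pointing centered on the pixels
f1 = downsample(scene, 1)            # pointing nudged half a pixel
combined = interleave(f0, f1)        # twice the samples of either frame
```

Neither coarse frame alone pins down where the bright feature’s edges fall, but together the two pointings constrain the gradient between pixel centers – the “fairly valid fake” higher resolution.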
Fraser Cain: Right. Because a half pixel is half as good as a regular pixel, but it’s still better than no pixel.
Pamela Gay: Well, it’s giving you an authentic understanding of what’s going on when you’re pointed at this one place, versus when you’re pointed at this other place. So you can actually say, yes, the average between these two positions is actually half of their two values. Or oh, no, it’s not actually half of their two values. It’s actually a third. So it gives you a sense of what is the gradient between two different positions as you move the telescope around.
Fraser Cain: That’s pretty cool. So they’ll actually just, like, just gently drift the Hubble around a little bit, just to try and get – moving, shifting the images onto different pixels. And then they’ll build it back on computer to rebuild it.
Pamela Gay: And we do this with ground-based images as well, and with space-based. You have to worry about things like cosmic rays, hot pixels, variability and sensitivity. And so in general, even if you don’t have the advantages of drizzle, by moving the image around ever so slightly with where it hits on the CCD, you can take care of some of the blemishes, take care of some of these aberrations.
Fraser Cain: That is really cool. So are there any other tricks? I mean, you talked about interferometry. I know that’s tougher for visible-light telescopes.
Pamela Gay: The wavelengths are just too small to feasibly add together with current technology on a large scale. There’s some testing units that have gotten it to work. But the real thing we have to be careful of is how we use our technology. As I said, one of the things that gets abused is the idea of an unsharp mask. And this is something that we all have the button in any of the software we use that says, “Sharpen.” And when you click that little button that says, “Sharpen,” what it’s actually doing is mathematically going through and trying to figure out, where are there features that are above a certain threshold?
So it starts out often by blurring the image out, subtracting off the blur, looking for what’s left, and then strengthening the signal from those things that stand out above the blur. This is a great way to get rid of things like .jpeg artifacts, where it didn’t do a good job figuring out how to smooth the color in your background.
But if you over apply the “sharpen,” it starts making up data that’s not actually there, essentially. And this is where we’re running into so many problems with Pluto data right now. It’s six weeks out. When you unsharp mask an expanded image, where you’ve taken your three pixels, turned it into 20, and then unsharp masked it a bunch of times, you end up with something deformed – with texture. We don’t know if Pluto has texture; it’s probably not deformed. This is all artifacts of hitting that “sharpen” button too many times.
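A minimal one-dimensional sketch of what the “sharpen” button does, and how repeating it invents overshoot “texture” at a perfectly clean edge:

```python
def box_blur(signal, radius=1):
    """Simple 1-D box blur with edge clamping."""
    n = len(signal)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def unsharp_mask(signal, amount=1.0):
    """Sharpen by boosting whatever survives subtracting a blur:
    result = original + amount * (original - blurred)."""
    blurred = box_blur(signal)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

edge = [0.0, 0.0, 0.0, 10.0, 10.0, 10.0]   # a clean, flat edge
once = unsharp_mask(edge)
# Values now overshoot below 0 and above 10 around the edge --
# ringing that was never in the scene. Apply it again and the
# artifacts grow, never the real detail:
twice = unsharp_mask(once)
```

Each pass amplifies the contrast at whatever boundaries already exist, including boundaries that are themselves interpolation artifacts – which is exactly how an upscaled three-pixel Pluto acquires fictitious lumps.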
Fraser Cain: And so we talked about build a bigger telescope. But I guess with New Horizons, the solution is get closer.
Pamela Gay: That works, too. We’ve all done that with our camera. You can only zoom so far before it goes into the software zoom, and software zoom bad. Don’t use software zoom. So you walk closer to your subject. While it’s – you can’t walk closer to Pluto, but you sure can fly a little tiny spacecraft there.
Fraser Cain: And so I guess that’s it. Let’s launch more spacecraft. Let’s build bigger instruments and put more stuff into space to get away from the blurring effect of atmosphere.
Pamela Gay: And hey, while we’re giving Pluto a shout out, don’t forget Ceres. We have the Dawn mission that is shrinking its orbit down, getting closer and closer. And Ceres is a former planet, too. It was on that classic list of worlds. And the Dawn mission is gonna hopefully let us see awesome features like geysers and help us figure out what the heck these shiny spots are that are cropping up on its surface. So we have multiple missions, and Dawn is sure gonna send us back a whole lot more gigabytes than New Horizons is of this other former planet.
Fraser Cain: Awesome. All right, well, thanks Pamela.
Pamela Gay: Thank you.
Fraser Cain: Thanks for listening to Astronomy Cast, a non-profit resource provided by Astrosphere New Media Association, Fraser Cain, and Dr. Pamela Gay.
You can find show notes and transcripts for every episode at www.astronomycast.com. You can e-mail us at firstname.lastname@example.org. Tweet us at Astronomy Cast, like us on Facebook, or circle us on Google+.
We record our show live on Google+ every Monday at 12:00 p.m. pacific, 3:00 p.m. eastern, or 20:00 Greenwich Mean Time. If you miss the live event, you can always catch up over at www.cosmoquest.org.
If you enjoy Astronomy Cast, why not give us a donation? It helps us pay for bandwidth, transcripts, and show notes. Just click the “donate” link on the website. All donations are tax deductible for U.S. residents. You can support the show for free, too. Write a review or recommend us to your friends. Every little bit helps. Click “support the show” on our website to see some suggestions.
To subscribe to this show, point your podcatching software at www.astronomycast.com/podcast.xml. Or subscribe directly from iTunes.
Our music is provided by Travis Serrel, and the show is edited by Preston Gibson.