The dark art of calibrating your astrophotography - light frames, dark frames, flat frames and bias frames.

All my photos are bad in different ways!

Astrophotographers are terrible "pixel peepers". We rarely take single photos. Deep sky people like me will take upwards of 50 images at a time, and call them "light frames", or perhaps "subframes". Planetary or lunar photographers are even worse. They'll take literally thousands.

Why? If you take 50 lights, they're all the same, right?

Yes, that's nine running chickens


Look closer: they're all slightly different. Examine them carefully and you'll see all manner of problems with each image.

In one, you might have had poor focus. In another there might have been a cloud, or atmospheric disturbance. In another a satellite or an aeroplane zipped through the frame. In yet another, your tracking was off.

Arrgh! Did I waste my time?

No, you didn't. Where the problems differ from image to image, we can use mathematics to get rid of most of them. We call it "image stacking".

Part 1: stacking your images

If we take a number of images with the same camera, the same scope and under the same conditions, we can compare each image against the group. That tells us how good each individual image is.

Clearly, I'm not going to go into the maths, but in general, we figure out which images (or parts of images) are similar, and which ones are different to the group.

Let's say we take 100 images and look at the corresponding pixel in each one. If 98 pixels are pretty much the same, but two are white, it's a good bet you've been photobombed by Elon Musk. We can instruct the computer to toss those two pixels out and average the other 98. Not only does this get rid of artifacts like satellites, clouds and bad focus, it also removes a good amount of noise.
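
One common approach is "sigma clipping": reject any pixel value that sits too far from the group, then average what's left. Here's a minimal NumPy sketch of the idea - real stackers such as DeepSkyStacker or Siril align the frames first and are far cleverer about the statistics:

```python
import numpy as np

def sigma_clip_stack(frames, sigma=3.0):
    """Average a stack of already-aligned frames, rejecting outliers.

    frames: sequence of 2-D arrays, all the same shape.
    For each pixel position, values more than `sigma` standard
    deviations from the per-pixel median (satellites, planes,
    cosmic rays) are thrown out; the survivors are averaged.
    """
    stack = np.stack(frames).astype(np.float64)
    median = np.median(stack, axis=0)               # typical value per pixel
    spread = np.std(stack, axis=0)                  # variation per pixel
    outliers = np.abs(stack - median) > sigma * spread
    survivors = np.ma.masked_array(stack, mask=outliers)
    return survivors.mean(axis=0).filled(0)         # average the rest
```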

And what's great, a lot of the software to do this is free!

We end up with a single image that - at a distance anyway - doesn't look a whole lot different from the originals. The extra benefit is buried deep within the photo: a far higher signal-to-noise ratio, meaning each pixel in the final image is much closer to the true value. This lets us dig deep into the darkness and brighten up those nebulas without getting a snowstorm.
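
If you like numbers: for random, uncorrelated noise, the improvement goes as the square root of the number of frames you stack. This is the standard averaging result from statistics, not anything specific to astrophotography software:

```latex
% Stacking N frames improves signal-to-noise by roughly sqrt(N):
\mathrm{SNR}_{\text{stack}} \approx \sqrt{N} \times \mathrm{SNR}_{\text{single}}
% e.g. N = 100 frames gives roughly 10 times less noise than one frame
```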

But remember that astrophotographers are never happy. You'll also notice there are other problems with your light frames, and these problems show up in all of your images, so you can't "stack them out".

For example: all your photos have donut-shaped shadows on them (and if you use filters, perhaps only with one of them). All your light frames have a few white dots in the same places. All your photos are dark in the corners. Worse, there's a horrible streak radiating from somewhere outside the frame.

Arrgh! Are your photos still ruined?

Worry not!

If the problems are across all frames, we can probably correct them. If it's consistent, it's predictable. We can know our enemy's weakness!

All we need to do is find out what's going on. That's hard when you're photographing a star field, but easy if you're photographing a blank field. Enter the "calibration frame".

Have a look at this photo. It's black and white, and a bit exaggerated, I admit, but it shows what you can get out of an image stacker.

Stacked light frames with no calibration

You can see that it's still a right old mess. There are lots of problems that the stacking software wasn't able to get rid of. These can be classified into two main groups: problems caused by your hardware, and problems caused by an imperfect light path (yes, admit it, your telescope still isn't perfect).

Part 2: correcting for problems caused by things other than light

In the photo above, you can probably see that there's a horrible bunch of what looks like light rays coming from the right side of the frame. 

This is "amp glow" from the electronics inside the camera. It causes rays, either radiating from a point or straight across the frame. During long exposures, this starts low, but it gets worse over time as the electronics warms up. For a given exposure length, the amount of amp glow building up is remarkably predictable - and predictable is what we want.

Also in the photo above, if you're very sharp-eyed, you might be able to spot a bunch of single white pixels. They don't quite look like stars: they're too sharp and small. Real stars normally cover more than one pixel.

These appear when some of your sensor's pixels are out of whack: too bright or too dark. It mightn't actually matter: if a pixel misbehaves consistently, the computer can correct for it.

Oh, and don't freak out - every sensor has pixels that are poorly calibrated. A "hot" pixel is one that is brighter than it should be (including one that's white all the time), and a "cold" pixel is one that is dimmer than it ought to be (including black).

To see all the problems that are caused by things other than light, we simply turn the light off by - you guessed it - putting the cap over the objective lens while we're still out at night. Without the light coming down the tube complicating things, we can see all the imperfections that the sensor or the electronics (or other things) have caused.

And here it is. Astrophotographers call it a "dark" frame.

A dark frame


To get a "dark" frame, it's important to keep as many other variables the same as you can, so you can really compare the dark and light frames. Ideally, put the lens cap on your telescope, and take the same photo (same exposure, same ISO or gain, same temperature) outside at night at the same time as your "light" frames. That way the only difference is that there's no light from the stars.

You can clearly see the amp glow, and you can probably also spot the occasional tiny white or grey dot. What you can't see are the pixels that are darker than their surroundings, but they're there too.

Now we simply ask the computer to "subtract" the dark frame from the light frame. It's basic maths, but devilishly clever.

Where the dark frame found no errors (the pixel wasn't affected by amp glow and it was not too bright or dark), there won't be any change. But where the pixel was too light, subtracting the dark frame will bring it back down. And here it is:

Stacked light frames with dark frame subtracted
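
If you're curious what "subtract" means in practice, it really is per-pixel arithmetic. A minimal sketch, assuming your frames are already loaded as NumPy arrays (real software does this on the raw data, before stacking):

```python
import numpy as np

def subtract_master_dark(light_frames, dark_frames):
    """Subtract a master dark from each light frame.

    Both arguments are sequences of 2-D arrays taken at the same
    exposure, gain and temperature. Using the median across the
    darks keeps the odd cosmic-ray hit out of the master.
    """
    master_dark = np.median(np.stack(dark_frames), axis=0)
    # Work in float, and clip at zero so no pixel goes negative.
    return [np.clip(light.astype(np.float64) - master_dark, 0.0, None)
            for light in light_frames]
```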

Incidentally, if you've got a cooled camera, and you always use the same gain and exposure, you can build a "library" of dark frames and only spend time refreshing it every year or so. I only expose for 30 seconds, 120 seconds or 300 seconds, and always use gain 70, so I have a bunch of dark frames on my computer for those settings. That saves hours at night when I could be taking light frames.

Part 3: correcting for problems caused by light imperfections

But you'll notice that there are still some problems with our latest photo. There's a circular shadow and a couple of worm-shaped ones too. On top of this, the photo is brighter in the middle than it is at the edges.

The shadows on the photo are caused by dust in the imaging train. Typically, they're round or donut-shaped, but tiny hairs can cause worm shapes as well. Dust can settle pretty much anywhere: on a lens, a mirror, a filter, or on the sensor itself. Again, don't freak out - it's normal.

The darkening towards the edges of the frame is called "vignetting", and it has a couple of causes. First, it's inherent to optical systems: it's always brighter in the middle, where the light has a clearer path through the system. This type of vignetting gets gradually darker the further you get from the centre of the frame.

It can also be caused by a physical restriction in your imaging train (say, a filter that's a little too small or too far from the sensor). This type of vignetting normally has a sharper drop-off.

To see the effects that your equipment has on the image, we give it some nice even, predictable light. That is, we take a photo of a plain grey field. Here it is, and it's called a "flat" frame:

A flat frame


I think that flat frames are the most effective calibration you can get. Certainly, they give you the most visible improvement in your final result. They record vignetting and dust imperfections on your lenses or sensor.

There are a few ways to record a flat frame. I won't go into details, but you can use the "tee-shirt" method: stretch a piece of white fabric over the front of the telescope and photograph an even, dull sky. The alternative is a light source that glows evenly, such as the light boxes artists use for tracing, placed across the front of the telescope.

Because flats record dust, which can move around from day to day, you need to take new flats every time you shoot light frames. The exposure is much shorter, of course, but the gain (ISO) and temperature have to be the same as your light frames.

So...

During processing, you start with your "lights", averaging them and rejecting outliers to make a "light stack" with no satellites and low noise. Then the computer averages your darks and flats into "master" frames, subtracts the master dark and divides by the master flat, leaving a final image without dust blobs, vignetting, hot pixels or amp glow.
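
Here's the whole recipe as a toy sketch, again assuming NumPy arrays. It also shows why the flat is normalised to an average of 1 before the division - that way it only corrects the shading, without changing the overall brightness:

```python
import numpy as np

def calibrate_and_stack(lights, darks, flats, dark_flats=None):
    """Toy pipeline: subtract the master dark, divide by the master
    flat, then stack. Real software (DeepSkyStacker, Siril, PixInsight)
    also aligns the frames and rejects outliers much more carefully.
    """
    master_dark = np.median(np.stack(darks), axis=0)

    # Calibrate the flats too, if dark flats were supplied (see Part 4).
    flat_stack = np.stack(flats).astype(np.float64)
    if dark_flats is not None:
        flat_stack -= np.median(np.stack(dark_flats), axis=0)
    master_flat = np.median(flat_stack, axis=0)
    master_flat /= master_flat.mean()    # normalise: average value = 1

    # Dark is subtracted; flat is divided (vignetted areas, where the
    # flat is below 1, get boosted back up).
    calibrated = [(l.astype(np.float64) - master_dark) / master_flat
                  for l in lights]
    return np.median(np.stack(calibrated), axis=0)
```

In real workflows each light is calibrated before alignment and stacking, which is why the software asks for all the frame types up front.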

Stacked light frames with dark and flat frames applied


The wonders of mathematics and digital images!

Part 4: expert level - bias frames and dark flats

To round out this discussion, I really should talk about the other calibration frames you can use: "bias" and "dark flat" frames.

Remember that your sensor has some noise right at the start of the exposure, which builds up (more or less consistently) over time?

Bias frames are taken with the shortest exposure your camera allows, and record the noise that's present at the very start of an exposure. Your dark frame measures the noise at the end of one. As long as the noise builds up at a constant rate, these two measurements can tell your computer how much to subtract for any exposure length in between.
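
In other words, the computer can interpolate. A sketch of the idea, assuming the thermal signal grows linearly with exposure time (which is roughly true for most sensors):

```python
def scale_dark(master_dark, master_bias, dark_exposure, light_exposure):
    """Estimate a dark frame for an exposure length you never shot.

    master_dark and master_bias are 2-D arrays; the bias is the
    signal at time zero, and everything above it is assumed to
    accumulate linearly with time.
    """
    thermal = master_dark - master_bias   # signal built up over dark_exposure
    return master_bias + thermal * (light_exposure / dark_exposure)
```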

But I only use a few exposure times (30s, 120s and 300s), always set my temperature to -15°C (I have a cooled camera) and use the same gain (or ISO, if you're using a DSLR). Because I'm not using all manner of exposures, my dark frames give me all the calibration information I need.

I do use bias frames, but they're very flat (see the image below), so they don't add much to my processing. My camera has nice low read noise.

A bias frame. Super boring.

Dark flats enable you to remove problems like amp glow from your flat calibration frames. Calibrating your calibration sounds a bit meta.

But unlike my darks, my flats are new every night - I take a fresh set for each filter. Because my dark flats are taken at the same exposure time, gain and temperature as the flats themselves, I don't need the computer to calculate how much noise to subtract - it's right there in the frame.
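
So calibrating the flats is the simplest sum of the lot - a straight subtraction, with no scaling needed. A sketch, once more assuming NumPy arrays:

```python
import numpy as np

def calibrate_flats(flats, dark_flats):
    """Subtract a master dark-flat from each flat frame.

    The dark flats match the flats' own exposure, gain and
    temperature, so the noise they record is exactly the noise
    sitting in the flats - plain subtraction removes it.
    """
    master_dark_flat = np.median(np.stack(dark_flats), axis=0)
    return [f.astype(np.float64) - master_dark_flat for f in flats]
```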

This is my finished Running Chicken (I've shown it before). It combines Hydrogen-alpha and Oxygen frames with the Sulphur frames you've been seeing. Getting back to the start of this article: I'm a pixel peeper, and I know pretty much every error and problem in this shot. I'm not saying what they are, but I've got bigger problems here than bias frames and dark flats can help me with!

Finished product. IC2944, the Running Chicken Nebula

