Magnification with a camera - why doesn't it really mean much?

What's the deal with magnification? When using a telescope for visual observations, it's a simple matter to calculate how much bigger everything looks. But with a camera it's more complicated. Here, two concepts replace magnification: your field of view and your camera's resolution.

One of the most common questions people ask me is about the magnification on telescopes. For visual telescopes, it's actually pretty simple. The magnification is just the ratio of the focal length of the main mirror or lens to the focal length of the eyepiece you're using.

So, for example, if you've got a telescope with a 900mm focal length, and you use a 20mm eyepiece, the magnification is 900/20, or 45 times. Swap to a 10mm eyepiece and the magnification jumps to 90 times. This is why shorter eyepieces are more powerful.
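That arithmetic is simple enough to put in a couple of lines of Python (a sketch; the function name is mine):

```python
def magnification(telescope_fl_mm, eyepiece_fl_mm):
    """Visual magnification: telescope focal length divided by eyepiece focal length."""
    return telescope_fl_mm / eyepiece_fl_mm

# The examples from the text: a 900mm telescope with 20mm and 10mm eyepieces
print(magnification(900, 20))  # 45.0
print(magnification(900, 10))  # 90.0
```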

But people also ask me about magnification when you're using a camera. There's no eyepiece, so it's more complicated.

It's about field of view and resolution. Both of these are partly determined by focal length, but there are other things involved too.

Image size is determined by focal length

When there's no eyepiece between the lens (or the mirror) and your camera (what we call prime focus imaging), there is only one factor that determines the size of the image that falls onto the sensor: the focal length of the telescope.

You can see this with your own eyes if you hold up a white piece of paper at the focal point of your telescope (with no eyepiece) while it's pointed at the Moon. You can see the image projected there, and its size never changes. (Of course, you can move the paper backwards to get a larger image, but it goes out of focus.)

A telescope with a long focal length will give you a larger image than a telescope with a shorter focal length. You can think of this as higher magnification, but when there's a camera involved, it's more complicated.

Field of view (FOV) is determined by focal length and sensor size

Field of view (FOV) is the area of the sky you actually get in your photo. My setup currently has an FOV of about 1.8° by 1.3°, which means I can easily get the Horsehead and Flame nebulas in the one image, but I can't quite fit the whole of the Rosette nebula in.

That's because, as in the example with the Moon, the physical size of the Rosette nebula image is set by the focal length of my telescope. Unfortunately, that size is just a little larger than the physical size of the sensor on my camera.

If I wanted to get the Rosette nebula into a single image, I could do it in two ways. I could increase the size of the sensor by getting a new camera, or I could decrease the size of the image by getting a new telescope (with a shorter focal length).
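You can put a number on this. One FOV dimension is the angle the matching sensor dimension subtends at the telescope's focal point, which a small bit of trigonometry gives you. Here's a sketch, using the IMX183's sensor dimensions and a 560mm focal length as example inputs:

```python
import math

def fov_degrees(sensor_mm, focal_length_mm):
    """Angular field of view along one sensor dimension, in degrees."""
    return math.degrees(2 * math.atan(sensor_mm / (2 * focal_length_mm)))

# A 13.2 x 8.8 mm sensor on a 560 mm telescope: roughly 1.35 x 0.90 degrees
print(fov_degrees(13.2, 560))
print(fov_degrees(8.8, 560))
```

Longer focal length or smaller sensor shrinks the result, which is exactly the trade-off described above.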

What's my FOV?

You can calculate your FOV and show how it compares to targets by using any one of a number of websites or programs, such as this one.

Here I've used the free program Stellarium to simulate a couple of FOVs. Both are for the same telescope pointed at the Jewel Box, NGC 4755. I've shown two FOVs, for a couple of different Sony sensors. In Stellarium, the FOV is the red rectangle in the centre of the star field.

The IMX183 (which is the sensor you'll find in the QHY183 and the ASI183 cameras) has a larger sensor (13.2x8.8mm) than the IMX462 (which is inside the QHY462 and the ASI462, and measures 5.6x3.2mm), and so its FOV is larger. Of course, this makes the Jewel Box appear smaller on the IMX183, because the image of the cluster takes up less room on the larger sensor.

So a camera with a large sensor will have a much wider FOV than one with a small sensor. This might look like less magnification, but it's not that simple.

But this is about magnification. Can't I just blow up my image?

After you've taken the photo, you can crop the image and display what's left on your computer. Blowing it up like this gives you more apparent magnification. This is what I've done with a shot of M7, Ptolemy's Cluster. The first shot is not cropped much. The second one is cropped but displayed at the same size. It looks bigger.

But you can't keep blowing it up forever because at some point, the image begins to pixellate. At this point, you've reached the limit that your camera can sustain. You can see this in the third shot, where I've cropped to a ridiculous extent, but displayed the image at the same size. It'll depend on your computer screen, but you can probably see that the stars are starting to pixellate. (I'm not too keen on those colours, either. Not sure what's going on there.)

What can help (a little) here is if I use a camera with smaller, more densely packed pixels. Because it packs more pixels into the image, the point at which the image starts to pixellate is further down the line. Of course, you still get to that point sooner or later.
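To see why smaller pixels push that limit back, count how many pixels a detail of a given physical size on the sensor spans. A quick sketch, taking the 2.9µm and 2.4µm pixel sizes of the two Sony sensors as example values:

```python
def pixels_across(detail_size_mm, pixel_size_um):
    """Number of pixels a detail of a given physical size on the sensor spans."""
    return detail_size_mm * 1000 / pixel_size_um

# The same 1 mm wide detail, projected onto two different pixel sizes
print(round(pixels_across(1.0, 2.9)))  # larger pixels: ~345 pixels across
print(round(pixels_across(1.0, 2.4)))  # smaller pixels: ~417 pixels across
```

More pixels across the same detail means you can crop harder before individual pixels become visible.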

So it's not just about FOV, and it's not just about how many pixels you can fit into that field. It's a combination of both. How can you measure this?

Resolution is determined by focal length and pixel size

There is an objective way to measure something close to magnification for your camera. We can figure out how much of the sky a single pixel can cover. It's measured in arc seconds per pixel.

This is generally known as the camera's resolution, or pixel scale, and it's not affected by processing, because you can't blow up or crop a single pixel!

The pixel scale is your camera's pixel size (in microns, or µm) divided by the telescope's focal length (in millimetres) multiplied by a constant: 206.265.

Going back to the example of the two sensors, the IMX462 has a pixel size of 2.9µm - I found that in the manual. My refractor has an effective focal length of 560mm (with its reducer). That means my pixel scale is 2.9/560x206.265=1.07 arc seconds per pixel.

If I swap to the IMX183, the FOV is way larger due to the sensor size, but the pixels are smaller as well, at 2.4µm. The pixel scale drops to 0.88 arc seconds per pixel, which means it has a higher resolution.
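The same formula in Python, checked against both sensors (the 2.9µm, 2.4µm and 560mm figures are the ones from the text):

```python
def pixel_scale(pixel_size_um, focal_length_mm):
    """Pixel scale in arc seconds per pixel: pixel size / focal length x 206.265."""
    return pixel_size_um / focal_length_mm * 206.265

print(round(pixel_scale(2.9, 560), 2))  # IMX462 at 560mm: 1.07 arcsec/pixel
print(round(pixel_scale(2.4, 560), 2))  # IMX183 at 560mm: 0.88 arcsec/pixel
```

A smaller number means each pixel covers less sky, i.e. higher resolution.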

So the IMX183 covers more sky than the IMX462, and it does it to a finer resolution. Does this mean that the IMX183 is a better sensor than the IMX462? No, the 462 has other advantages (that I'm not going to go off on a tangent about).

You need to know both your field of view and your resolution

This post was about magnification. But the point I'm trying to make is that in astrophotography, magnification matters less than your field of view and the detail you can capture within that field.

But they're independent. Resolution on its own doesn't tell you anything about your FOV, and vice-versa: you need to know both. But once you know these, you can figure out (and simulate on a computer) pretty much everything about your image. This is invaluable when planning your astrophotography.

Knowing your FOV and your pixel scale will tell you definitively what you can photograph and to what resolution.