Introduction to Image Calibration

What is Image Calibration?


Astrophotos are not simple point-and-click images; they require considerable processing to bring out the variety of beautiful detail they contain. The first step is called calibration, which involves several operations to remove unwanted artifacts. No optical train is perfectly clean and free of small imperfections. One of the most common is dust somewhere in the light path, which can cast unusual and sometimes large shadows on the images. The good news is that with the proper software, these shadows can be removed relatively easily. Below is an example of a typical calibration image called a flat field.

What are Flat Frames?


Typical Flat Field

Flat fields are taken with even illumination across the entire image. Not all pixels are equally sensitive to light, and not every portion of the sensor receives exactly the same amount of light; flat fields are intended to compensate for both issues. There are a number of obvious problems in the image above: small circular shadows caused by dust, and a noticeable darkening toward the corners and edges. That darkening, called vignetting, is caused by the falloff of light as you move farther from the optical axis. Field flatteners can minimize it somewhat, but unless the image circle is very large, some falloff will remain.

What is the purpose of taking flat fields? Calibration software divides each image by a normalized flat field, evening out the variations in pixel sensitivity as well as the unevenness in illumination caused by light falloff at the edges of the sensor or by dust in the light path. Any increase in pixel value beyond this baseline is treated as genuine light signal. The end result is an image whose detail reflects the light actually captured, not differences in pixel sensitivity or illumination. Dust shadows and vignetting are removed.
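As a rough illustration, here is a minimal sketch of flat-field division in Python using numpy and astropy. The file names are placeholders, and in a real workflow both the light frame and the flat would already have been dark-subtracted:

```python
import numpy as np
from astropy.io import fits  # assumes astropy is installed

# Placeholder file names; substitute your own frames.
light = fits.getdata("light.fits").astype(np.float64)
flat = fits.getdata("master_flat.fits").astype(np.float64)

# Normalize the flat so its mean is 1.0, then divide it out.
# Pixels that received less light (vignetting, dust shadows)
# have flat values below 1 and are boosted; overly sensitive
# pixels have values above 1 and are dimmed.
flat_norm = flat / np.mean(flat)
calibrated = light / flat_norm

fits.writeto("calibrated.fits", calibrated, overwrite=True)
```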

What are Dark Frames?

Image sensors are powered by electrical current. That current, together with the sensor's own heat, causes some pixels to accumulate electrons even when no light hits them. Over the course of an exposure these electrons build up, and pixels that accumulate them unusually quickly become what are called "hot pixels". Their falsely elevated values can be mistaken for stars or other genuine signal. To remove this thermal signal, dark frames are taken with the sensor covered and subtracted from each image, leaving pixel values caused only by the light that hit them and not by the underlying electrical activity. Ideally, dark frames are taken at the same exposure length, sensor temperature, and gain as the images being calibrated. Below is an example of what a typical dark frame looks like.


Typical Dark Field
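Here is a similar sketch of building a master dark and subtracting it, again with placeholder file names; it assumes the darks match the light frame's exposure length, temperature, and gain:

```python
import numpy as np
from astropy.io import fits  # assumes astropy is installed

# Placeholder file names for a set of ten dark frames.
light = fits.getdata("light.fits").astype(np.float64)
darks = [fits.getdata(f"dark_{i}.fits").astype(np.float64)
         for i in range(10)]

# Median-combine the darks into a master dark; the median
# rejects outliers better than a simple mean.
master_dark = np.median(darks, axis=0)

# Subtract the thermal signal, clipping at zero so no pixel
# value goes negative.
dark_subtracted = np.clip(light - master_dark, 0, None)

fits.writeto("dark_subtracted.fits", dark_subtracted, overwrite=True)
```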

What are Bias Frames?

Bias frames are images taken with a camera's shutter speed set to the fastest (shortest) possible exposure time and with the camera's sensor covered to prevent any light from entering. These frames capture the camera's baseline electronic signal, a fixed offset plus the readout noise, which is present even when no light is hitting the sensor.


Bias Frame

A bias frame looks black, but its pixel values are not zero
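A master bias can be built the same way as a master dark. This sketch assumes a stack of bias frames saved as FITS files, with placeholder names:

```python
import numpy as np
from astropy.io import fits  # assumes astropy is installed

# Placeholder names for fifty shortest-possible exposures
# taken with the sensor covered.
biases = [fits.getdata(f"bias_{i}.fits").astype(np.float64)
          for i in range(50)]

# Median-combine into a master bias that records the sensor's
# fixed electronic offset and readout noise pattern.
master_bias = np.median(biases, axis=0)

# The master bias is typically subtracted from flats (and from
# darks when dark scaling is used) before they are applied.
fits.writeto("master_bias.fits", master_bias, overwrite=True)
```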


Image Integration

Once the flat fields and dark frames have been applied to the individual images, it is time to stack the images in order to reduce noise and improve the signal-to-noise ratio. By averaging each pixel value over multiple images, you get a better estimate of the "real" pixel value; random noise falls roughly as the square root of the number of frames averaged, so stacking sixteen images cuts the noise to about a quarter of a single frame's. This results in a "smoother" image with more detail.
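Here is a simplified sketch of stacking with basic sigma clipping, a common rejection technique that discards outlier pixel values (such as satellite trails or cosmic ray hits) before averaging. The file names are placeholders, and real stacking software also aligns the frames first, which this sketch assumes has already been done:

```python
import numpy as np
from astropy.io import fits  # assumes astropy is installed

# Placeholder names for sixteen calibrated, aligned sub-exposures.
subs = np.stack([fits.getdata(f"sub_{i}.fits").astype(np.float64)
                 for i in range(16)])

# Sigma clipping: mask any pixel value more than 3 standard
# deviations from that pixel's mean across the stack, then
# average only the surviving values.
mean = subs.mean(axis=0)
std = subs.std(axis=0)
keep = np.abs(subs - mean) <= 3 * std
stacked = (np.where(keep, subs, 0).sum(axis=0)
           / np.maximum(keep.sum(axis=0), 1))

fits.writeto("stacked.fits", stacked, overwrite=True)
```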

You may ask why not just take one very long exposure instead of multiple short ones. This is a very good question. Longer exposures do reveal more detail, particularly in very dim areas. But many astronomical objects are surrounded by bright stars, and a very long exposure can saturate those stars, causing electrons to bleed into surrounding pixels. This can make the stars look like big blobs that may even overlap areas of interest. There is an even more compelling reason not to take really long (over 15 minute) exposures. Over the course of a few minutes, several things can happen to ruin an exposure: a strong gust of wind can blur the image, or a satellite or plane can cross the field of view and spoil that frame. If you are taking a series of short 3-minute exposures, you can simply discard the bad image, having "wasted" only 3 minutes or less. With very long exposures, say 30 minutes, discarding one image costs 30 minutes or more of precious imaging time. Imaging is often limited to periods of clear sky or a cloudless area of sky, which can be fleeting during a night; if you only have 2 or 3 hours of such periods, losing even one or two long exposures could consume a large fraction of your available imaging time.

Conclusion

Getting good results from imaging is more than just point and click. There are several steps you need to take to make the most of the images you acquire. Image calibration is absolutely necessary to correct the technical and visual imperfections that occur in digital astrophotography. Once these steps are done, the artistic work of bringing out the desired detail and colors can begin.
