Lunar imaging F.A.Q.

Summary

This page attempts to answer questions that come up frequently about my lunar imaging techniques.

How far away from the city lights does one have to travel in order to create color photographs of the Moon?

I take my photos from the balcony of my apartment in Piekary Śląskie, Poland. I live in a heavily light-polluted area, with approximately 2 million inhabitants within a radius of 40 kilometres. This pretty much rules out any serious photography of deep-sky objects such as galaxies and nebulae. Luckily, this is not the case for the major Solar System objects, including the Moon, which are bright enough to be essentially unaffected by light pollution - so for that type of astrophotography, there's no need to travel at all.

What's the required equipment?

I use a Celestron C9.25 Schmidt-Cassegrain telescope, which is, photographically speaking, the equivalent of a 2350 mm lens with a focal ratio of f/10. The telescope sits on a Sky-Watcher HEQ5 equatorial mount, which tracks celestial objects as they move across the sky due to Earth's rotation. Several accessories are mounted at the rear of the tube: a high-precision William Optics DDG focuser, which makes it easy to bring the image into perfect sharpness; a manual filter wheel, which allows quick switching between filters; and finally a ZWO ASI174MM camera, connected to the computer via a USB 3.0 interface.

Of course, smaller equipment will also be able to reveal a lot of features on the Moon. The maximum level of detail depends on the diameter of the telescope. A fast camera is the most important element in lunar astrophotography.

Above: Photographing the 98% waning gibbous Moon on 31 August 2015.

Why is a two-megapixel monochrome industrial camera preferred over a 20+ megapixel DSLR camera?

The key advantage is speed. My camera can record up to 165 uncompressed frames per second. The practical number of exposures per second is usually lower, as it depends primarily on the brightness of the target object. Other advantages include the ability to use custom filtering for astrophotography, such as infrared.
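To give a rough feel for that trade-off, here's a tiny Python sketch (the exposure times are made-up examples, not measurements from my sessions) estimating the achievable frame rate from the exposure time a given target brightness demands:

```python
# Rough sketch: the achievable frame rate is capped both by the camera's
# maximum readout speed and by the exposure time the target's brightness
# requires. The exposure times below are illustrative, not measured values.

CAMERA_MAX_FPS = 165.0  # maximum uncompressed frames per second of the camera

def achievable_fps(exposure_ms: float, camera_max_fps: float = CAMERA_MAX_FPS) -> float:
    """Return the practical frame rate for a given exposure time."""
    fps_limit_from_exposure = 1000.0 / exposure_ms
    return min(camera_max_fps, fps_limit_from_exposure)

# A bright target might allow ~2 ms exposures; a dimmer one might need ~15 ms.
print(achievable_fps(2.0))   # 165.0  -> limited by the camera itself
print(achievable_fps(15.0))  # ~66.7  -> limited by the exposure time
```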

It should also be noted that the sensor sizes of modern DSLR cameras are not well matched to the optical properties of a classical Schmidt-Cassegrain telescope design, which has a rather small diffraction-limited field of view. This means that aberrations such as coma are readily visible at the edges of a DSLR sensor. This effect does not occur with the camera I'm using - thanks to its smaller sensor, image quality is excellent across the entire frame.

What's the purpose of taking thousands of images of the same region?

The largest obstacle in high-resolution Solar System imaging is the instability of our planet's atmosphere. Fluctuations in the air's density constantly distort the appearance of celestial bodies; this effect is known as astronomical seeing. An effective method of countering seeing is the lucky imaging technique. The small-scale air movement is essentially random, so when we average out (or stack) a large number of short exposures, we get clean image data containing a lot of features which can be brought out by the proper application of image sharpening algorithms.

Another positive effect of image stacking is a huge boost in the signal-to-noise ratio, which allows us to reveal extremely subtle brightness and hue differences on the lunar surface.
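In practice the stacking is done by dedicated software, but the core idea of lucky imaging can be sketched in a few lines of Python. The sharpness metric and the alignment-free averaging below are deliberate simplifications, not what the real tools do:

```python
import numpy as np

def sharpness(frame: np.ndarray) -> float:
    """Crude quality metric: variance of a simple Laplacian-like response.
    Real stacking software uses far more sophisticated quality estimators."""
    lap = (np.roll(frame, 1, 0) + np.roll(frame, -1, 0) +
           np.roll(frame, 1, 1) + np.roll(frame, -1, 1) - 4.0 * frame)
    return float(lap.var())

def lucky_stack(frames: list[np.ndarray], keep: int = 400) -> np.ndarray:
    """Rank the short exposures by estimated sharpness, keep the best ones,
    and average them. Averaging N frames suppresses random noise roughly by
    a factor of sqrt(N) - the signal-to-noise boost mentioned above.
    (Real software also aligns each frame before stacking; omitted here.)"""
    ranked = sorted(frames, key=sharpness, reverse=True)
    best = np.stack(ranked[:keep]).astype(np.float64)
    return best.mean(axis=0)

# Usage: stacked = lucky_stack(list_of_2000_frames, keep=400)
```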

Above: (1) a single frame (of fairly low quality) out of 2000 frames of raw footage; (2) a raw stack of the 400 best frames; (3) a sharpened version of the stack.

Where do these colors come from? Are they real?

The matter of the colors is probably the trickiest one to explain satisfactorily, as the exact answer greatly depends on what exactly you expect to be real. Compared to what the human eye can see, the colors are obviously grossly exaggerated. Yet our eyes - limited to detecting a very narrow range of the electromagnetic spectrum - are a rather weak benchmark when it comes to telling what's real and what's not.

From a technical standpoint, the answer is fairly simple. Digital color images are composed of three channels: red, green and blue. All it takes to create a color image from monochrome data is to assign at least two source images taken through different filters to separate image channels. The colors can then be made more prominent by increasing the pixel value differences between the individual channels (or saturating the image).
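Here's a minimal numpy sketch of both steps - assigning filtered monochrome stacks to color channels and then boosting the saturation. The function names and the saturation factor are illustrative, not taken from my actual workflow:

```python
import numpy as np

def compose_rgb(red_stack, green_stack, blue_stack):
    """Assign monochrome stacks taken through different filters to the
    R, G and B channels of a single color image (pixel values in 0..1)."""
    return np.stack([red_stack, green_stack, blue_stack], axis=-1)

def boost_saturation(rgb: np.ndarray, factor: float = 3.0) -> np.ndarray:
    """Exaggerate the per-pixel differences between channels, i.e. push
    each channel away from the pixel's mean brightness."""
    luma = rgb.mean(axis=-1, keepdims=True)       # per-pixel "gray" value
    saturated = luma + factor * (rgb - luma)      # amplify channel differences
    return np.clip(saturated, 0.0, 1.0)

# Usage:
# rgb = compose_rgb(stack_through_red, stack_through_green, stack_through_blue)
# vivid = boost_saturation(rgb, factor=3.0)
```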

Above: Before and after boosting the saturation.

To sum up, the color differences on the Moon are most definitely real and correspond to variations in the chemical composition of our satellite's surface. Blue areas are covered by titanium-rich soil, whereas brown/orange areas indicate an abundance of iron.

How long does it take to create one photograph?

The image acquisition itself usually takes around 10 to 30 minutes; the exact duration depends on the surface brightness of our satellite (which dictates the required exposure times) and on a combination of the apparent size and the percentage of illumination of the lunar disk. A full or nearly full Moon that is close to Earth may require recording 12-13 mosaic panes in order to cover the entire visible surface, whereas a thin crescent needs as few as 4 separate panes.

But obviously, recording the raw material is just part of the whole process. What follows is a multi-step post-processing phase, or data reduction as astronomers like to call it. Image stacking is fairly resource-demanding and may take up to five hours on a mid-range laptop. Applying the image sharpening algorithms and assembling the final mosaic in Photoshop usually adds up to two hours of further work.
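The sharpening itself happens in dedicated tools, but the general idea - pulling fine detail out of an already clean stack - can be illustrated with a simple unsharp mask, a much cruder method than what those tools actually use:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(stacked: np.ndarray, radius: float = 2.0, amount: float = 1.5) -> np.ndarray:
    """Very crude stand-in for the real sharpening step: subtract a blurred
    copy of the stacked image to isolate fine detail, then add that detail
    back with some gain. This only works well because stacking has already
    suppressed most of the noise that sharpening would otherwise amplify."""
    blurred = gaussian_filter(stacked, sigma=radius)
    detail = stacked - blurred
    return np.clip(stacked + amount * detail, 0.0, 1.0)
```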

Of course, much of this is automated. Launching the stacking process takes just a few minutes, so it's not as if I have to sit in front of the monitor clicking the same buttons over and over for several hours. Only the final adjustments require active intervention on the photographer's part: every processed data set comes out a little different, so things like color balance need to be applied manually.

