Light Pollution Subtraction
Posted: Thu Jan 09, 2020 8:09 pm
I have been using SharpCap 3.2 Pro for a couple of years to do imaging in a Bortle 8-9 urban sky. I typically use live stacking with real-time dark frame and flat field calibration. The results have been surprisingly good given how bright the light-polluted sky is.
Over the past couple of years I have been using SharpCap during public star parties in the urban areas around Los Angeles, California. I usually live stack images while members of the public view the same object through a parallel telescope. Using SharpCap's live histogram stretching feature, I can show the object in color on the computer screen and suppress the light pollution by raising the black point. Unfortunately, this tactic introduces a severe color bias in the deep sky object image: the light pollution has a strong blue-green color, and a black point set high enough to hide it clips the red component much more than the blue and green components, so the object does not look its true color.
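To make the clipping problem concrete, here is a minimal numpy sketch with made-up channel values (the numbers are illustrative, not from any real capture): a single black point set high enough to hide blue-green sky glow also wipes out a faint red target, while subtracting the glow per channel first preserves its color.

```python
import numpy as np

# Hypothetical 3-channel pixel: faint reddish nebula over blue-green sky glow.
sky_glow = np.array([0.10, 0.25, 0.30])       # R, G, B light pollution
object_signal = np.array([0.08, 0.03, 0.02])  # faint red-dominated target
pixel = sky_glow + object_signal

# One black point, chosen high enough to hide the brightest (blue) glow.
black_point = 0.30
stretched = np.clip(pixel - black_point, 0.0, None)
print(stretched)   # [0.   0.   0.02] -- red object signal clipped away

# Subtracting the sky-glow estimate per channel instead keeps the colour.
subtracted = np.clip(pixel - sky_glow, 0.0, None)
print(subtracted)  # [0.08 0.03 0.02] -- true object colour survives
```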
If I take the live-stacked 32-bit FITS image into PixInsight and use the DynamicBackgroundExtraction (DBE) tool to remove the light pollution, I get a fairly nice-looking true-color image of the object. I post these images for my astronomy club on my Instagram account: @vctyree. The problem is that these images are not immediately available, and most of the visiting public never bother to look at the processed images the day after the star party.
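For anyone curious what DBE is doing conceptually, the sketch below fits a smooth low-order polynomial surface per channel to hand-picked background sample points and evaluates it as a synthetic background image. DBE's actual algorithm is PixInsight's own and considerably more sophisticated, so treat this only as an illustration of the idea; fit_background and its parameters are names I made up.

```python
import numpy as np

def fit_background(image, samples, order=2):
    """Fit a smooth 2D polynomial background per channel.

    image   : (H, W, 3) float array (e.g. a live-stacked frame)
    samples : list of (y, x) positions assumed free of stars and
              nebulosity (like DBE's sample boxes)
    Returns an (H, W, 3) synthetic background image.
    """
    h, w, _ = image.shape
    rows = [s[0] for s in samples]
    cols = [s[1] for s in samples]
    ys = np.array(rows) / h   # normalise coordinates to [0, 1]
    xs = np.array(cols) / w

    # Design matrix of polynomial terms x^i * y^j with i + j <= order.
    terms = [(i, j) for i in range(order + 1)
                    for j in range(order + 1) if i + j <= order]
    A = np.stack([xs**i * ys**j for (i, j) in terms], axis=1)

    yy, xx = np.mgrid[0:h, 0:w]
    yy, xx = yy / h, xx / w
    model = np.empty_like(image)
    for c in range(3):
        vals = image[rows, cols, c]
        coef, *_ = np.linalg.lstsq(A, vals, rcond=None)
        model[..., c] = sum(co * xx**i * yy**j
                            for co, (i, j) in zip(coef, terms))
    return model

# Usage: background = fit_background(stack, samples)
#        flattened  = stack - background
```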
Since SharpCap 3.2 already has live preprocessing for dark frame and flat frame calibration, I started to think that I could also remove light pollution from each sub-exposure by subtracting a synthetic light pollution image generated by the PixInsight DBE tool. Before the star party, I could live stack a region of sky near the objects I plan to show, build a light pollution image from that stack with DBE, and then live stack with light pollution removal to show people the object in its true colors. I can also imagine this technique being used for wide-field views of meteor showers from deeply light-polluted urban areas. Would it be possible to include a third preprocessing function that takes a saved light pollution image and subtracts it from the live image after dark frame and flat field calibration?
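Since SharpCap's internals are not public, I can only sketch the arithmetic of the proposed third step. Assuming the DBE background model is exported as a FITS file, the per-sub calibration chain would look something like this (the file names and the calibrate helper are placeholders of mine, not SharpCap's API):

```python
import numpy as np
from astropy.io import fits

# Placeholder calibration files; the light pollution frame would be the
# DBE background model saved as FITS from PixInsight.
dark = fits.getdata("master_dark.fits").astype(np.float32)
flat = fits.getdata("master_flat.fits").astype(np.float32)
lp   = fits.getdata("light_pollution_model.fits").astype(np.float32)

def calibrate(sub):
    """Dark-subtract and flat-field a sub-exposure, then remove sky glow."""
    frame = (sub.astype(np.float32) - dark) / (flat / flat.mean())
    return np.clip(frame - lp, 0.0, None)  # the proposed third step
```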