Here's what a typical video frame might look like when capturing Jupiter:
And this is what you might get if you let SharpCap stack about 1000 similar frames and then apply sharpening and a bit of a saturation boost (all done in SharpCap):
Getting Started
Getting started is really simple - once you have your planet in view, activate the new 'Live Planetary Stacking/Enhancement' tool. The important things to adjust are:
- Camera ROI - to keep processing fast, put the camera into ROI mode with not much more than the planet inside the ROI area. Alternatively, you can use the Selection Area tool in the toolbar to draw a selection box around the part of a larger image that you want to process
- The 'Target Stack Length' - this is how many frames SharpCap will try to stack together - obviously the stack will start much shorter and be noisy, but the noise will drop as more frames are added
- The Wavelet Sharpening sliders and colour adjustments - increasing wavelet levels 1, 2 and 3 is a good start for sharpening the image.
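If you're curious what the wavelet sliders are doing conceptually, multi-level sharpening splits the image into detail layers of increasing scale and boosts the finer ones. The sketch below only illustrates that idea - it approximates the levels with Gaussian blurs of increasing radius and uses per-level gains in place of the sliders; it is not SharpCap's actual implementation:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def wavelet_sharpen(img, gains=(1.6, 1.4, 1.2), base_sigma=1.0):
    """Rough multi-scale sharpening in the spirit of wavelet level sliders.

    img   : 2-D float array (e.g. a stacked, noise-reduced frame)
    gains : boost applied to detail levels 1, 2, 3 (1.0 = unchanged)
    """
    img = img.astype(np.float64)
    levels = []
    current = img
    for i in range(len(gains)):
        # Each detail level is what is lost when blurring a little more
        blurred = gaussian_filter(current, sigma=base_sigma * (2 ** i))
        levels.append(current - blurred)
        current = blurred
    # Residual (large-scale structure) plus the boosted detail levels
    out = current
    for gain, detail in zip(gains, levels):
        out += gain * detail
    return np.clip(out, 0.0, img.max())
```

Boosting the finer levels amplifies genuine detail and noise alike, which is why a longer, less noisy stack will tolerate stronger sharpening.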
You can save the current processed image as a PNG file (there's also a timelapse option to save a processed frame to a video file every second). There are other possible uses of this tool too, from outreach events to making fine focus adjustments while watching the stacked, sharpened image to see what effect they are having. There is also a basic filtering option that only includes the best 25% of frames received from the camera in the stack.
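For anyone wondering how a 'best 25% of frames' filter can work on a live stream: each incoming frame can only be judged against frames that have already arrived, so comparing its quality score against a rolling history is the natural approach. Here's a rough sketch of that idea - the class name, history length and Laplacian-variance quality metric are my own illustrative choices, not SharpCap's internals:

```python
import numpy as np
from collections import deque
from scipy.ndimage import laplace

class LiveQualityFilter:
    """Accept roughly the best `keep_fraction` of frames, judged against
    a rolling history of recent quality scores."""

    def __init__(self, keep_fraction=0.25, history_len=200):
        self.keep_fraction = keep_fraction
        self.history = deque(maxlen=history_len)

    def quality(self, frame):
        # Variance of the Laplacian: higher = more fine detail (sharper)
        return float(np.var(laplace(frame.astype(np.float64))))

    def accept(self, frame):
        q = self.quality(frame)
        self.history.append(q)
        if len(self.history) < 20:      # too little history yet - accept
            return True
        threshold = np.percentile(self.history,
                                  100 * (1 - self.keep_fraction))
        return q >= threshold
```

Because a live filter never sees the whole capture at once, it can only approximate the selection an offline stacker would make - which is one of the limitations mentioned below.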
Testing on existing saved SER files
If the weather is not co-operating for trying this feature out, you can use 'Test Camera 2 (High Speed)' and then use the 'Browse' button in the 'Testing Controls' area to choose a sample image. As well as choosing still images, you can select an existing SER file to make SharpCap load and play that file (on repeat) in the test camera. Any existing SER video of one of the planets should be good for testing.
Will it be as good as AutoStakkert!/Registax/PlanetarySystemStacker?
The obvious question to ask is 'Is this going to be as good as stacking a video in AS!3, PlanetarySystemStacker or Registax?' The simple answer is no - it's not going to be quite as good as a dedicated stacking program, for reasons including:
- Only a single alignment point is used to stabilize the video, rather than the multiple alignment points used by most stacking applications
- Filtering on frame quality is less effective when working frame by frame than when all the frames can be read first and the best ones picked afterwards
- Some calculations have been designed for speed rather than for the very best final quality, so that the process can run on a live video stream
- The technique uses a simplified (exponential decay) stacking system to avoid holding hundreds or thousands of frames in memory
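A minimal sketch of exponential decay stacking, assuming the frames have already been aligned - the point is that only the running stack and the newest frame ever need to be held in memory. The exact weighting SharpCap uses isn't spelled out here; a 1/N blend is shown because it reproduces the contribution figures given further down:

```python
import numpy as np

def update_stack(stack, frame, target_length):
    """Exponential-decay ('running average') stacking: blend each new frame
    into the stack with weight 1/target_length, so only the current stack
    and the latest frame need to be kept in memory."""
    frame = frame.astype(np.float64)
    if stack is None:                     # the first frame starts the stack
        return frame
    alpha = 1.0 / target_length
    return (1.0 - alpha) * stack + alpha * frame
```

This would be called once per aligned frame, e.g. stack = update_stack(stack, frame, 900); the trade-off is that frame weights fade gradually rather than being equal, which is what produces the contribution figures quoted below.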
Other things to know...
- You will need a fairly fast PC to get the best results from this tool - performance is better when working on smaller images, so use an ROI or selection area too
- This is a SharpCap Pro feature - if you don't have a SharpCap Pro license then you can try it out without being able to save (you will also get a watermark across the on-screen image)
- A longer stack length will give less noise and allow stronger sharpening to be used. However, longer stack lengths will respond less quickly to changes (focus adjustments, variations in seeing, planetary rotation).
- The exponential decay stacking being used means that older frames make a weaker contribution to the stack than newer ones. A 900-frame stack at 30fps will have about 63% of the data coming from the last 30 seconds, about 23% from between 30s and 60s ago, about 8.5% from between 60s and 90s ago and about 5% from longer ago than 90s.
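Those percentages follow directly from a 1/N weighting (as sketched above): after k further updates, the frames older than that point retain a fraction (1 - 1/N)^k of the total weight. A quick check:

```python
N = 900                                   # target stack length (30s at 30fps)

def older_than(k):
    """Fraction of total weight held by frames more than k updates old."""
    return (1 - 1 / N) ** k

bands = [("last 30s", 0, 900), ("30-60s ago", 900, 1800), ("60-90s ago", 1800, 2700)]
for label, k0, k1 in bands:
    print(f"{label}: {100 * (older_than(k0) - older_than(k1)):.1f}%")
print(f"over 90s ago: {100 * older_than(2700):.1f}%")
# roughly 63%, 23%, 8.5% and 5% - matching the figures above
```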
Finally, I need to say a massive thank you to Mike (@Borodog), who persuaded me to give this a try and who has been chief tester and feedback generator over the last week or so.
cheers,
Robin