Bravo Dr Glover I am on target with outstanding data

Discussions of Electronically Assisted Astronomy using the Live Stacking feature.
DiligentSkies
Posts: 52
Joined: Wed Dec 02, 2020 10:46 pm

Bravo Dr Glover I am on target with outstanding data

#1

Post by DiligentSkies »

Dear Dr Glover,

Bravo!

Absolutely outstanding.

I've been putting your Signal to Noise Smart Histogram best practices to work.
It has been a long haul up the learning curve, a point I reached late last year. However, the seeing sucked last year. Massively.
Hopefully this year's weather pattern in Southern Florida is setting up to allow me to collect exceptional, world-class data on several targets: Messier 2, the Helix Nebula, the Great Orion Nebula and the Horsehead Nebula.
That is my goal.

Right now my target is M2, and I am collecting some exceptional data.

When I say Bravo, I mean that your Signal to Noise best practices have, IMHO, turned deep sky imaging into a lucky-shot prospect.
For my ~f/4.9 OTA and my measured sky brightness, the Smart Histogram analysis provides absolute confidence to shoot in the 12 to 18 second sub range.
[Attachment: blue.png (63.39 KiB)]
When you start to see unstretched SharpCap live stacking results like this at 200% zoom after 200 frames:
[Attachment: lucky shots.png (6.57 KiB)]
The underlying data is going to be rock solid.

By rock solid, I mean that any given sub frame stretched up out of the basement produces this:
[Attachment: An on the fly stretch.png (35.11 KiB)]
Therefore, the underlying data collection is naturally going to produce a great number of sub frames for Deep Sky Stacker score ranking.
Meaning integration time now becomes a function not of the totality of sub integration time, but of how many of the highest-ranking subs make up an equivalent integration.

Do you think a 240 second sub frame of data knows the difference from a 16 second sub frame that produces a greater Deep Sky Stacker score ranking?

At what point does total integration time become meaningless as a measure, compared with a statistical ranking of the sub frame data?

Again, does one sub frame know, compared with another, how long it took to collect the data?

My point is this: if four hours of integration time built on 180 second sub frames produces 80 sub frames of working stacking data, how is that any different from 80 frames of 16 second subs picked out of many hundreds over the same period of time?
Thus my "lucky shot" conjecture!

So, in measuring this apples-to-oranges paradigm shift: under any traditional measure, if I stack the 80 best-ranking 16 second sub frames as scored by Deep Sky Stacker, how is that any different from stacking 80 frames of much longer exposures?
Essentially it boils down to a numerical statistical analysis of at what point the total number of sub frames hits the limit of diminishing returns!
Talk about turning the hobby on its head.
Thank you, Dr. Glover.

At this point I am going to make an exceptional statement.
The data I am collecting is going to produce a best-in-class, world-class amateur photo of Messier 2, hands down, for the class of my wide-field OTA.

Put it this way, and I think you will have to agree given your CMOS signal-to-noise sensor paradigm: deep sky CMOS astrophotography is no longer bound by the old-school conjecture that longer sub frame exposures are needed.
It is a function of settled solid-state physics.
Robin, that is a massive shout out to you.

Sincerely Mark
User avatar
admin
Site Admin
Posts: 13344
Joined: Sat Feb 11, 2017 3:52 pm
Location: Vale of the White Horse, UK
Contact:

Re: Bravo Dr Glover I am on target with outstanding data

#2

Post by admin »

Hi Mark,

glad it's all working out so well for you!

One thing that you have to remember is that in the end it's all about collecting photons, and they arrive randomly.

Let's suppose that there's an area in your image that has a particular brightness, and that brightness means that on average 10 electrons per second are created by light from the target on each pixel of the sensor.

If you have a 16 second frame then you will collect 160 electrons per pixel (on average), but the photons that create the electrons are random like raindrops - not every pixel in that region collects exactly 160 electrons, some get 159, 158, 155, 150 or less, others get 161, 162, 163, 168, 175 or more. The typical variability is easy to work out - it's just the square root of the number, so about 12.6 (look up the Poisson distribution for further info). So the pixels in that region will all have different values due to this random variability, which we see as noise. In this case, the typical noise is about 8% (12.6/160) of the pixel value.
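The raindrop analogy is easy to simulate with nothing but the Python standard library. The 160 e-/pixel figure is the one from the paragraph above; the time-slice approximation of the random photon arrivals is my own sketch, not anything SharpCap does internally:

```python
import random
import statistics

random.seed(42)

# Approximate random photon arrivals: split the 16 s exposure into many
# tiny time slices, each with a small chance of delivering one photon.
# (A binomial with many trials and small p approaches a Poisson process.)
def simulate_pixel(mean_electrons, slices=10_000):
    p = mean_electrons / slices
    return sum(1 for _ in range(slices) if random.random() < p)

# A patch of 300 pixels, each expecting 160 e- on average
pixels = [simulate_pixel(160) for _ in range(300)]

mean = statistics.mean(pixels)    # close to 160 e-
noise = statistics.stdev(pixels)  # close to sqrt(160) ~ 12.6 e-
print(f"mean {mean:.1f} e-, noise {noise:.1f} e-, "
      f"relative {noise / mean:.1%}")
```

The measured scatter comes out near the square root of the mean, which is the ~8% relative noise described above.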

Now put 100 of those 16 second frames together. Instead of collecting 160 electrons per pixel, you are now collecting a total of 16000 electrons. The noise level on that is ~126, which seems higher, but 126 is only 0.8% of 16000, so relative to the actual image brightness the noise has gone down by a factor of 10.

If you decide to keep only the 10 best of those 100 frames then you may see a boost in sharpness (by choosing those with best seeing), but the noise will be more of a problem - with only 10 frames you get 1600 e/pixel, a noise of 40, which is a relative level of 2.5%.
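The three cases above (one sub, 100 subs stacked, best 10 kept) all reduce to the same one-line formula. A quick check using the same numbers from the post:

```python
import math

electrons_per_frame = 160  # one 16 s sub at 10 e-/pixel/s

def relative_noise(n_frames):
    # Stacking sums the signal linearly, but shot noise only
    # grows as the square root of the total electron count.
    total = n_frames * electrons_per_frame
    return math.sqrt(total) / total

print(f"1 frame:    {relative_noise(1):.1%}")    # ~7.9%
print(f"100 frames: {relative_noise(100):.1%}")  # ~0.8%
print(f"best 10:    {relative_noise(10):.1%}")   # ~2.5%
```

Each factor of 100 in total exposure cuts the relative noise by a factor of 10, which is why discarding 90 of 100 frames costs roughly a threefold increase in noise.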

Unfortunately this is a fundamental of imaging and there is no way around it - to bring the noise levels down you have to collect more photons, which means a longer total exposure time. SharpCap tries to point you towards using more short frames rather than fewer long frames (where appropriate) as that helps avoid issues with guiding/tracking, but you still need long total exposure times for the very best results. Of course, this is less significant on brighter targets or when the light pollution levels are very low.

cheers,

Robin
DiligentSkies

Re: Bravo Dr Glover I am on target with outstanding data

#3

Post by DiligentSkies »

Dr. Glover,
Woke up and did a stretch of my blue channel data.
Did not fuss too much about it.
However, in DSS I did play around with a custom rectangle and did a x3 drizzle.

I think that out of the 500-plus subs, the final 109 subs I chose to stack may have incorporated much of your reply and insight.
As an afterthought, I did see much of what you were saying, in a manner: some subs I did not elect to stack had better clarity (seeing) and better full-well ADUs than similarly ranked subs.

Of the subs that did get stacked, seeing clarity was excellent and the raw data had very consistent full-well ADUs across the subs.

Anyhow, here are the results of a quick GIMP stretch.
When I do go full on board with an LRGB stretch, my hunch is that many blue and red stars will pop out in the globular core.
[Attachment: Autosave.jpg (584.67 KiB)]
Shot with a William Optics GT81 APO, a x0.8 reducer and an Astronomik Deep-Sky Blue (CCD) filter.
ZWO ASI533 mono sensor cooled to 0 Celsius.
109 x 17.3 second subs, gain 105 and black level of 5.
[Attachment: blue.png (63.39 KiB)]
DiligentSkies

Re: Bravo Dr Glover I am on target with outstanding data

#4

Post by DiligentSkies »

[Attachment: Poisson Distributions.png (497.71 KiB)]
I am very well versed with statistics.