A little background… I have a scope with a long focal length of 2438mm. Even with a 0.62x focal reducer, my image scale is 0.47 arc-sec per pixel. The observatory is at 6,500 feet above sea level, 17 miles north of a large mountain range. Normal "seeing" averages out to about 0.50 arc-sec, but it can change a lot during an imaging session due to the turbulent air rolling off the mountains. I bin the camera (ZWO ASI2600MC Pro) 2x2 to bring my scale to about 0.94, to get as far above the seeing as possible. At full exposure times of 3 to 5 minutes I see HFRs averaging 1.9 to 2.4 throughout the session, and even from image to image. This is normal; I can't do anything about it, it's all up to Mother Nature.
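For context, the image scale above follows from the usual formula, scale ("/px) = 206.265 × pixel size (µm) / focal length (mm). A minimal sketch of that arithmetic; the 3.76 µm pixel size for the ASI2600 and treating the reducer as a simple 0.62x multiplier on focal length are my assumptions, so the result lands near, not exactly on, the quoted 0.47/0.94 (the effective focal length with a reducer depends on back-spacing):

```python
ARCSEC_FACTOR = 206.265  # arc-sec per pixel for a 1 um pixel at 1 mm focal length

def image_scale(pixel_um: float, focal_mm: float, binning: int = 1) -> float:
    """Arc-seconds per (binned) pixel."""
    return ARCSEC_FACTOR * pixel_um * binning / focal_mm

# Assumed 3.76 um pixels, 2438 mm scope, 0.62x reducer:
native = image_scale(3.76, 2438 * 0.62)     # ~0.51 "/px unbinned
binned = image_scale(3.76, 2438 * 0.62, 2)  # ~1.03 "/px at 2x2
```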
At first the image history graph shows the star count and HFR running parallel, with slight bumps representing the higher 2.4 HFRs but still tracking together. It continues to look like this over many images. Then suddenly, whether triggered by me looking at the sequence parameters or something else (I haven't discovered what sets it off yet), the vertical scale of the image history graph becomes far too sensitive. Where an HFR of 2.4 previously gave me a slight bump on the graph compared to an HFR of 1.9, that same difference now produces huge peaks and valleys.

Every time it happens I pause the sequence to see if I was dragging a cable or something equally drastic. But every time I pull up the image series, the HFRs are the same as they have been all along; nothing has changed and all the data is still good. From then on, for the rest of the night, the huge peaks and valleys remain. Even though I can confirm the HFRs are good via the image statistics and the image history series, it still freaks me out watching it.

Is there a trigger I can avoid to prevent the big change? Or maybe, down the road, could a user option be added to adjust the vertical sensitivity of the graph?
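For what it's worth, this looks like what happens when a chart's y-axis auto-fits the plotted values; that is purely my guess about how the graph works, not anything confirmed. Under auto-fitting, the visual height of a bump depends on the axis range rather than on the absolute HFR, so the same 1.9-to-2.4 spread can go from a slight bump to filling the whole plot once the axis tightens around the data. A minimal sketch of the effect (the function and numbers are hypothetical, not the application's code):

```python
def plotted_height(value: float, y_min: float, y_max: float, plot_px: float = 100) -> float:
    """Pixel height of a data point on an axis spanning [y_min, y_max]."""
    return (value - y_min) / (y_max - y_min) * plot_px

# Wide fixed axis, e.g. 0..5: a 1.9 -> 2.4 change moves the line only 10 px.
delta_fixed = plotted_height(2.4, 0, 5) - plotted_height(1.9, 0, 5)

# Auto-fitted axis hugging the data, e.g. 1.8..2.5: the same change moves ~71 px.
delta_auto = plotted_height(2.4, 1.8, 2.5) - plotted_height(1.9, 1.8, 2.5)
```

If something does reset the axis bounds mid-session, that would explain why the underlying HFR data is unchanged while the plot suddenly looks dramatic, and a user-settable fixed y-range would be exactly the fix.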