Hi,
I’ve been using 2.4.2.4 (will update to .5) and have noticed an issue with the temperature smoothing. It’s not a problem in SGPro itself, but it brings out problems with my focus controllers.
I use different focusers depending on my equipment profile (an Optec and a couple of USB Focus units), and on occasion they send bogus temperature values when something gets gummed up (busy doing something else, like moving the focuser): the USB Focus sends 100 C, and I think the Optec sends -273 C. This causes a problem when temperatures are averaged over a few minutes. On a few occasions, the last focus temp or the current temp has read a few degrees warmer than it should because a 100 C value was averaged in. (I’ve had sparse imaging time and am using the USB Focus right now.) This triggers a lot of focus runs when using the temperature delta as an autofocus trigger…almost every frame in one case when I was using 2-minute subs. Longer subs give the average time to flush out a bad value if it corrupts the current temp.
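To put a rough number on how much one bad sample hurts (a quick illustration, not taken from my logs; the window size and true ambient temperature are assumed):

```python
from statistics import mean

# Assume ten readings in the smoothing window, all at a true
# ambient of 15.0 C, except one bogus 100 C value from the focuser.
readings = [15.0] * 9 + [100.0]

print(mean(readings))  # 23.5 -- a single bad sample shifts the
                       # average by 8.5 C, far more than any sane
                       # delta-temp autofocus trigger.
```

That 8.5 C jump (and the matching drop once the bad sample ages out of the window) is what fires the autofocus trigger on nearly every frame.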
I’ve been poking through the SGPro logs and will check the focus controllers’ ASCOM logs (if there are any) to see whether the 100 C reading is being sent/used. But since the controllers send these values when there is no probe or a probe problem, fixing it at the ASCOM level may be difficult.
One ‘fix’ would be to limit the temps SGPro uses for the averages: only use values within a certain range and ignore the ones the focuser sends for no probe/noisy data etc., i.e. “Use temps between x1 and x2 for temp averaging”. In my case I could set those to, say, -20 C and 40 C, so anything outside that range would be ignored. It would be another option for the autofocus dialog, which is why I’m listing this as a new feature and not a bug…nothing is wrong with SGPro; it just shows what you didn’t see when there was no averaging.
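A minimal sketch of what that filter could look like (the function name, default bounds, and plain-mean smoothing are my assumptions for illustration, not SGPro’s actual implementation):

```python
from statistics import mean

def smoothed_temp(readings, low=-20.0, high=40.0):
    """Average only readings inside [low, high]; bogus no-probe
    values (e.g. 100 C or -273 C) are simply ignored."""
    valid = [t for t in readings if low <= t <= high]
    return mean(valid) if valid else None  # None: no usable data

print(smoothed_temp([15.0, 15.5, 100.0, 14.5]))  # 15.0
```

With the bounds set per-profile, the 100 C spikes would never enter the average, and the delta-temp trigger would only fire on real temperature changes.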
Thanks,
Frank Z…