Like many long-term CCD camera users, I had my doubts about 12-bit CMOS cameras. 12-bit data seems like a big loss of dynamic range – and it is for photometry. But for pretty pictures, not so much. Why? Well, your computer monitor is the great equalizer. The majority of HD computer monitors only support 8-bit data. Some expensive gaming monitors support 10-bit data, and a very few support 12-bit data.
So, what does this mean? When you display a 16-bit image on your computer screen, the device driver (or monitor) converts the data to 8-bit before it is displayed. When you display a 12-bit image, the data is likewise converted to 8-bit. This means that, for all practical purposes, your CCD images and CMOS images are going to look the same on your 8-bit computer screen.
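To see why the two end up looking alike, here is a minimal sketch of the conversion, assuming the simplest approach a display pipeline might take: keeping only the top 8 bits of each sample (the function name `to_8bit` and the sample values are my own illustration, not any particular driver's code).

```python
def to_8bit(value, source_bits):
    """Keep only the top 8 bits of a sample by discarding the low bits."""
    return value >> (source_bits - 8)

# The same relative brightness captured at two bit depths:
ccd_sample = 40000            # a 16-bit sample (0-65535)
cmos_sample = 40000 >> 4      # the equivalent 12-bit sample (0-4095), i.e. 2500

print(to_8bit(ccd_sample, 16))   # -> 156
print(to_8bit(cmos_sample, 12))  # -> 156
```

Both samples land on the same 8-bit value, which is the whole point: the extra precision in the 16-bit data never reaches the screen.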
If you are printing your images, that’s a different story. But for most of us, the computer screen is the only place our images ever appear.
I recently began imaging with a CMOS camera, and after learning how to use the various gains and offsets effectively, I have found that the increased sensitivity of the CMOS camera (perhaps 3x over my CCD camera) makes a big difference. Images taken at 300 seconds are comparable to my CCD images taken at 900 seconds. The images I am taking with my CMOS camera are just as good as (maybe better than) the images taken with my CCD camera.
So, this transition to CMOS cameras may not be the big negative that CCD imagers have feared. Just my 2 bits' worth.
Charlie