QHY600 Slow Download Speed

My image downloads have increased to about 15 s+ with this camera. Does anyone know a way to reduce the download time?

I have read that people using NINA get downloads of nearer 5 seconds, and that it’s something to do with the driver used, but I don’t know enough to comment further.

Will the 64bit version of SGP improve the situation?

Even with a QHY 247C at 24 MP, my download speeds through the ASCOM driver are very slow. I think this is one of the reasons that some developers are interfacing directly with the device through its API.

I’ll search the ASCOM group and see if there are any developments in the pipeline to improve things.


It’s possible, but QHY does not support all of their cameras with 64-bit drivers.


This is all beyond my knowledge, but from a bit of googling it seems the native QHY600L driver downloads images in 2-3 seconds rather than the 15-20 s I get with SGPro. Is there a way to use this in SGPro, or is it the ASCOM driver that is the problem? I’m just trying to work out how, or who, can solve the issue - is it the ASCOM team? It’s driving me mad at the moment with AF, and I’m trying to fix backfocus, which needs lots of Center Here commands that just take forever. Also, I want to take advantage of shorter exposures with CMOS, and the long downloads make this much less attractive.

Also see the last two posts in this thread, which note that MaxIm downloads are fast with the ASCOM driver.

Could the slowness that I and others see be down to something other than the driver?

Maybe this is the answer?
" I saw this thread mentioned elsewhere and thought I’d put my abstention from CN aside to add some insight to this issue, an issue that will become more prominent as more people use cameras that have vastly higher pixel counts than the typical astrophotography cameras did from even just 3 or so years ago.

The core of the issue is what is called data types and how ASCOM packages up the pixel data from the camera before passing it to the host application, the host application being SGP (in this case) or any other app that is running the camera’s ASCOM driver. If you’re a programmer, you might know what I mean. If you’re not a programmer and don’t have the slightest idea what a data type is, I’ll try to explain it in simple terms, and then tie that into the explanation about this issue.

Programmatically, the image data from a sensor is an array. You can think of an array as a box that’s filled with distinct items, each item corresponding to a pixel of the sensor. That means if you have a 20 MP sensor, this array - the container - has 20 million items in it. Now let’s talk about each item (pixel). We know that imaging sensors output a certain number of bits per pixel. This bitness describes the maximum range of values that can be encoded for each pixel. The more bits per pixel, the more resolution the ADC has to describe the charge of the pixel. An 8 bit pixel can describe the brightness in a range from 0 to 255, 14 bit pixels in a range from 0 to 16383, 16 bits from 0 to 65535, and so on. Obviously, for imaging, more resolution means being more tonally descriptive, which means things like higher dynamic range. The bitness in this case equates to the size of the items in the box (the array). The more bits, the more space each item takes up in the box, and that means the box must be large enough to hold them all. So if you have a 61 MP sensor like what’s in the QHY600/ASI6200, and it produces image data at 16 bits per pixel, that array is going to be 16 x 61,000,000 = 976,000,000 bits big, or 122,000,000 bytes (there are 8 bits to a byte). So the raw image is 122 megabytes. That’s a pretty big box compared to the one needed to hold the data for a 12 or 16 megapixel sensor that’s running at 8, 12, or 14 bits per pixel.

Also in programming, we have to deal with what are called data types. A data type describes what a thing is in memory and the size of it, in bits (just like the pixel bitness). Memory, after all, is also an array. So we have data types that are 8 bits, 16 bits, 32 bits, even 64 and 128 bits large - those last two can hold some large numbers. But the point is that the computer can deal with items only in these terms. Everything has a type associated with it, and with that type comes a size. When the camera spits its image data onto the wire, either the camera firmware or the camera’s driver on your computer must convert the data (if required) from its native sensor format (which might be 10, 12, or 14 bits for some sensors) to either 8 or 16 bits. So a pixel with 14 bits per pixel will get scaled to 16 bits - a data type a computer can use. There is a speed cost in this conversion, but it’s not really noticeable in the grand scheme of things.

Now ASCOM slides into the picture. Being an API that tries to present a consistent interface in which to programmatically interact with cameras, it demands that the image data be delivered in a generic data type that’s the same no matter what camera produced it. In ASCOM’s case, it specifies that the image data be presented as an Object. An object is kind of an amorphous data type - it’s a box, but a box where all the items in it have melted together into a large blob with no discernible separation or organization to them. That’s great, programmatically, because an object can hold any kind of data, and that data has no real form. The programmer has to bestow form on it by breaking that one large blob into many 8 bit things or 16 bit things - whichever data type is appropriate or desired.

If you’re keeping track, you might notice that data is being converted between types an awful lot here. 10, 12, or 14 bit sensor data is getting scaled and converted into 16 bit data. That 16 bit data is then being converted into a formless object, and then that object is being converted back to an array of 16 bit-sized items for it to be usable by the host application. The programmatic flexibility of an Object is great, but it comes at a cost. When low pixel count sensors were the norm, there wasn’t a whole lot of data to convert back and forth, and so the cost in time to do that was negligible or not really noticed. CCD cameras were also the norm, and are relatively pokey in reading out their sensors in the first place, so the additional time cost of conversions between data types and objects for 8 or 10 megapixel cameras was just part of the CCD life. Now we’re trying to convert 4 to 8 times that number of pixels from CMOS cameras that have astounding readout speeds, and this is where the conversion costs become quite noticeable. Type conversion in programming languages has never been a super speedy thing in the first place. Only under certain, well-prepared circumstances can it be relatively quick… but by and large it’s a lumbering process compared to everything else. Speeding it up is attained more or less by pure brute force - how quickly your CPU and its memory controller can shove blocks of memory about, on top of how well the programming language in question manages conversions.

What “native” camera drivers do is avoid the dismal speed penalty incurred by boxing and unboxing that intermediate ASCOM ImageArray object. If the camera or its vendor-provided SDK presents 16 bit data, that 16 bit data is consumed directly by the host application and generally stays that way throughout its use. There is no need to repackage or unpackage it from a generic data type such as an object. For example, in NINA, we take the image data array as it’s handed to us by the camera’s SDK, and we keep it as-is. One internal process copies it, wraps it in a FITS or XISF header, and writes it out as a file to disk. Another process takes a copy of it, runs it through image statistics and a midtone stretching algorithm and presents it on the screen… and that’s that. The memory is deallocated and life moves on to the next exposure. The downside to native drivers is that it means the application developer is in charge of interacting directly with the camera. Sometimes this is easy, sometimes it is complicated (looking at you, Canon.) There are pros and cons to it. But with sensor pixel counts growing, the pros are outweighing the cons by far. Instead of wasting 10 or 20 seconds each frame just for it to trickle into the application, we gain that time back which can be used for more light frame exposures over the course of a night."
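The size arithmetic and the type round trip described in the quote can be sketched in Python. This is a minimal sketch only: the pixel count in the round-trip demo is downscaled so it runs quickly, and NumPy’s generic `object` dtype stands in for ASCOM’s ImageArray Object.

```python
import numpy as np

# Raw frame size for a 61 MP, 16-bit sensor (QHY600/ASI6200 class)
pixels = 61_000_000
frame_bytes = pixels * 16 // 8
print(frame_bytes)  # 122,000,000 bytes, i.e. ~122 MB per raw frame

# Downscaled demo of the ASCOM-style round trip: typed pixel data is
# boxed into a generic object array, then unboxed back to uint16 by
# the host application before it can be used.
typed = (np.arange(1_000_000) % 65_536).astype(np.uint16)  # stand-in for sensor data
boxed = typed.astype(object)      # every pixel becomes a boxed Python object
unboxed = boxed.astype(np.uint16) # ...and is converted back to a typed array

assert (unboxed == typed).all()   # the data survives, but each step copies
                                  # and re-types 61 million pixels in real life
```

The round trip is lossless; the complaint in the quote is purely about the time and memory spent boxing and unboxing it.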

I just found on the QHY website that they recommend 64-bit software be used with the 600L, so it sounds like the driver must be 64-bit. However, if I go to 64-bit SGP, does that mean that any of my gadgets that only have 32-bit drivers won’t work?

Something else I noticed with the QHY600: after taking about 100 images (i.e. darks), SGPro crashed (the first time it has crashed).
I’m running the latest (non-beta) software, as I subscribed hoping that SGPro had a big improvement in the wings to justify the subscription, but I’ve since learned that’s not the case.
I think I’ll run NINA every so often and get used to it, as it is improving and uses native QHY drivers.
SGPro now has under a year to pick up its game, or I might as well risk the perhaps more buggy but faster-improving alternative and use the faster interface.
Sticking to ONLY generic ASCOM when faster drivers are available seems silly.

I agree, gregm. I just downloaded NINA to do a quick test, and the download is lightning fast compared to SGP, using the native driver. It is buggy though; I found a few bugs within a couple of hours of trying it - it looks to me like they do happy-path testing.

As things stand, one of the bugs is that it won’t connect to my SharpSky focuser in 64-bit, so unless I go 32-bit I can’t use it just yet. Trouble is, SGPro is not usable with current download speeds - I did some accurate timings yesterday and it’s taking 20 s. You can’t do AF with 20-second downloads!! So unless SGPro offers the native driver quickly, I’m going to have to go either NINA 32-bit (and upgrade to 64-bit later) or go with Voyager.

I dropped a note on the ASCOM developers’ forum. Their view was, in general, that API and ASCOM interfaces were similar in performance. Their suggestion was that some app authors were interfacing directly with the device, not even using the API, for speed. That, however, comes with its own headaches. Having said that, they were increasingly aware of the slowness of the interface and were working on some ideas to improve download speed.

That’s good to know; I hope they come up with something soon. I know 120 MB is bigger than your average image, but moving it to fast storage over USB3 should be pretty fast.
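For scale, some back-of-the-envelope numbers (nominal spec rates, not measurements from this thread): a ~122 MB frame over USB 3.0’s 5 Gbit/s signalling rate should move in a fraction of a second, so a 10-20 s download points at software overhead rather than the cable.

```python
frame_mb = 122                 # QHY600-class raw frame size in MB
usb3_nominal_mb_s = 5000 / 8   # 5 Gbit/s -> 625 MB/s raw signalling rate

print(frame_mb / usb3_nominal_mb_s)  # ~0.2 s per frame at the nominal rate

# Implied throughput of the downloads reported in this thread:
print(frame_mb / 10)  # 10 s download -> ~12 MB/s
print(frame_mb / 20)  # 20 s download -> ~6 MB/s
```

Real USB 3.0 transfers never hit the nominal rate, but even a heavily derated link is an order of magnitude faster than the throughput these download times imply.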

I had a similar problem with my QHY128C, and it sometimes didn’t even complete a download. But on the bench it worked just fine. So I purchased a long, buffered USB 3.0 cable to go directly between the computer and the camera, bypassing the USB hub that was being used, and now I get blazing download speeds.

That sounds like it’s worth a try - which cable did you get?

buffered USB 3.0 cable

That sounds like a good idea. My QHY download times have been increasing as well. Did that buffered USB 3.0 cable contain one of those little repeaters?


Yes. From Amazon. 16’

John R Carter Sr

There are thousands of cables on Amazon; it would be good to know specifically which one you have, as many others may not work :smirk:

I have done a bit of capture-time testing using NINA, QHY EZCAP, and SGPro 32-bit and 64-bit on three laptops: a fast i3, a slow i5 (10% faster than the i3), and a fast i7.

It’s not scientific - I counted the seconds (“one thousand, two thousand”, etc.) - but I have drawn some conclusions, rightly or wrongly.

  1. I didn’t see any significant difference between SGP 32-bit and 64-bit.

  2. The laptop speed made a difference - e.g. for a 0.01 s exposure using the ASCOM driver, the capture time was:
    i3 - 14 s
    i5 - 10 s
    i7 - 8 s
    I should point out that the laptops didn’t have anything else running, and capture times did sometimes vary by a second or two with the same settings. All used USB3 “blue” ports, but I think when I tried USB2 it didn’t make much difference.

  3. The ASCOM driver performed similarly in all software for a given laptop.

  4. The NINA native driver was faster than ASCOM: 6 s vs 10 s on the i5 with a 0.01 s capture.

  5. QHY EZCap was also 6 s - presumably the same native driver as used by NINA.

The interesting thing I noticed was the time between “beeps”. The capture beeps at the start, then again in the “middle” and again at the end.

Watching this through in NINA it appears that the time between the first 2 beeps is the exposure, and the time between the second two is the download.

I tested this with different exposure lengths and it seems that:

a) For all software/laptops/drivers, the time between beep 1 and beep 2 is 5 seconds + the exposure. Could most of this be the time to read the sensor?
b) The time between beep 2 and beep 3, the download, varies significantly. The native driver download is 1 s, but the ASCOM driver download is maybe 6 or 10 s depending on the laptop - the download time seems to depend mostly on processor speed.

So, it’s a rough-and-ready test - I wish I had structured it and recorded the results more accurately - but it seems a fast PC helps, no surprises there, and something in the ASCOM driver download is slow.
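The beep timings above suggest a simple model: capture time ≈ fixed readout overhead + exposure + download. A sketch using the figures reported in this thread (the 5 s overhead and 1 s native download are the poster’s observations; 6 s is one of the two ASCOM download values quoted):

```python
def capture_time(exposure_s, download_s, readout_overhead_s=5.0):
    # Beep model from the thread: fixed readout overhead + exposure + download
    return readout_overhead_s + exposure_s + download_s

# 0.01 s exposure, using the download times reported above:
native = capture_time(0.01, download_s=1.0)  # ~6 s, matching the NINA/EZCap totals
ascom = capture_time(0.01, download_s=6.0)   # ~11 s, in the ballpark of the ASCOM totals
print(native, ascom)
```

The model explains why short AF exposures feel so slow: the exposure itself is a tiny fraction of the total, and only the download term changes between drivers.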


Some follow-on observations. I recently acquired a QHY163M - a 16-megapixel CMOS camera. I am also a software developer using ASCOM interfaces to astro gear. With this camera connected to my development system using a powered USB3 hub, I initially would hear beeps during a 1x1 dark-frame download (about a 1.0 second download), accompanied by a loss of connection. I noticed in the small print of the camera manual that the 12 V DC power connection of the QHY camera is only used for the TEC. The camera electronics draw all of their power from the USB cable - that is, from the USB hub or the PC/laptop. I suspect it is pulling very close to the 0.500 amp USB limit. If the USB port being used can’t quite supply the power the camera needs, you will lose the USB connection and the download will fail, freeze up, etc.

I removed another device from the USB3 hub (an Arduino) that was also drawing power, and the QHY camera stopped losing the USB connection and showed reliable downloads. An overnight test run that downloaded 100 five-minute dark frames showed no errors.

So, if your hub or PC/laptop is not providing adequate power to the camera, I would recommend the use of a powered USB3 extension cable. These cables have their own power brick supplying power to the USB device they connect to. If your camera has a USB2 connector, these extension cables will still work. Similar to:

Powered USB Extension
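For reference on the power-budget point: the USB 2.0 spec allows a device to draw up to 500 mA at 5 V, and USB 3.0 raises that to 900 mA, so a camera pulling near 500 mA leaves little headroom on a shared unpowered hub. A quick sanity check (the camera’s draw here is the poster’s estimate, not a datasheet figure):

```python
def usb_power_w(amps, volts=5.0):
    # Power available from (or drawn at) a USB port
    return amps * volts

usb2_budget = usb_power_w(0.5)  # 2.5 W per USB 2.0 port
usb3_budget = usb_power_w(0.9)  # 4.5 W per USB 3.0 port
camera_draw = usb_power_w(0.5)  # ~2.5 W if the camera really pulls ~500 mA

print(usb2_budget, usb3_budget, camera_draw)
# On a USB 2.0 budget the camera alone consumes everything; a powered hub
# or powered extension cable supplies this current from its own brick.
```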

