#16, 01-05-2015, 04:02 PM
gregbradley
I guess it would have to be measured, as there may be differences, even if slight. If you have done testing, I am happy to accept that.

I also wonder whether read noise falls with deeper cooling. I see that the -110C Kingfisher V camera, which never made it to production, claimed a read noise of 1.5 electrons using the Sony ICX694 chip, a sensor rated at around 5 electrons read noise by other manufacturers.

So yes, the gain from deep cooling may be minimal for the Sony interline CCDs. It would be a different story if Sony made full-frame CCDs, where RBI (residual bulk image, seen as ghost images) can be a factor. Mitigating RBI requires several iterations of an infrared preflush, and some of that injected charge then leaks out during the exposure, which adds noise. Deep cooling reduces that leakage and the resulting noise if you use an RBI preflush (I don't, but some recommend it). This applies to full-frame sensors only, not interline sensors, which read out differently.
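As a back-of-the-envelope check on why the gain flattens out: dark-current shot noise adds in quadrature with read noise, and dark current roughly halves for every 6C or so of cooling (a common rule of thumb; the reference dark current, read noise and exposure figures in this sketch are illustrative assumptions, not measurements for any particular chip):

[code]
# Sketch: per-pixel noise vs. sensor temperature, ignoring sky and
# target signal. All reference values below are assumed, not measured.
import math

def dark_current(temp_c, d_ref=0.02, t_ref=-20.0, halving_c=6.0):
    """Dark current (e-/pixel/s) at temp_c, given a reference value
    d_ref at t_ref and a halving interval of halving_c degrees."""
    return d_ref * 2.0 ** ((temp_c - t_ref) / halving_c)

def pixel_noise(read_noise, temp_c, exposure_s):
    """Total per-pixel noise (e-): read noise plus dark-current shot
    noise, added in quadrature."""
    dark_e = dark_current(temp_c) * exposure_s
    return math.sqrt(read_noise ** 2 + dark_e)

for t in (-20, -40, -60):
    print(f"{t:4d} C: total noise {pixel_noise(5.0, t, 600):.2f} e-")
[/code]

With a 5-electron read noise chip and a 10-minute sub, the total barely moves past about -40C because read noise dominates, which fits the point about deep cooling mattering less on low dark current Sony interline chips.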

On the FLI Yahoo Group there are often posts about the importance of deep cooling, and I have enjoyed its benefits with Kodak chips. I take your point about it being less important with Sony sensors, but again I wonder about the effect of cooling on read noise, pattern noise and bias.

Some cameras show bias drift, hence the PixInsight Superbias feature. I am not sure where QSI stands on this, but I do recall Rick using that feature on one of his cameras; I think it was the Apogee 16803.
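As I understand it, Superbias suppresses the random noise left in a master bias while keeping the fixed pattern. Something similar in spirit, though much cruder, and resting on the assumption that the pattern is mostly column-oriented (the real tool is more sophisticated than this), would be:

[code]
# Sketch: a crude stand-in for a superbias, assuming the bias pattern
# is dominated by column structure. Not PixInsight's actual method.
import numpy as np

def super_bias(master):
    """Replace each column of a master bias with its median, keeping
    column-to-column pattern while suppressing random noise."""
    col = np.median(master, axis=0)            # per-column median
    return np.tile(col, (master.shape[0], 1))  # broadcast back to 2D
[/code]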

I wonder if bias drift is also worse at warmer temperatures. That may be harder to detect.
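One crude way to test it would be to shoot a set of bias frames per session (or per set point), median-combine each set, and difference the masters. The file names here are placeholders, and it assumes astropy for FITS loading:

[code]
# Sketch: compare master bias frames from two sessions (or two cooling
# set points) to look for bias drift. File names are placeholders.
import glob
import numpy as np
from astropy.io import fits  # assumes astropy is installed

def master_bias(pattern):
    """Median-combine all bias frames matching a glob pattern."""
    frames = [fits.getdata(path).astype(np.float32)
              for path in sorted(glob.glob(pattern))]
    return np.median(frames, axis=0)

old = master_bias("bias_session1_*.fits")
new = master_bias("bias_session2_*.fits")
diff = new - old
print(f"global offset (median): {np.median(diff):+.2f} ADU")
print(f"pattern change (std):   {np.std(diff):.2f} ADU")
[/code]

A shift in the median would show a global offset drift; growing structure in the standard deviation would suggest the pattern itself is moving, which an archived master bias will not calibrate out.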

Deeper cooling also reduces the number of hot pixels.
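That one at least is easy to quantify from darks taken at two set points. The 5-sigma threshold and file names below are just illustrative assumptions:

[code]
# Sketch: count hot pixels in a bias-subtracted dark frame at each
# temperature. Threshold and file names are illustrative assumptions.
import numpy as np
from astropy.io import fits

def hot_pixel_count(dark_path, bias_path, sigma=5.0):
    """Pixels whose bias-subtracted dark signal sits more than `sigma`
    standard deviations above the frame median. Crude: a robust
    (sigma-clipped) estimate would be better with many hot pixels."""
    dark = fits.getdata(dark_path).astype(np.float32)
    bias = fits.getdata(bias_path).astype(np.float32)
    signal = dark - bias
    cut = np.median(signal) + sigma * np.std(signal)
    return int(np.sum(signal > cut))

print("hot pixels at -20C:", hot_pixel_count("dark_m20.fits", "bias_m20.fits"))
print("hot pixels at -40C:", hot_pixel_count("dark_m40.fits", "bias_m40.fits"))
[/code]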

Little things perhaps, but when you go for optimum image quality they all add up.

Greg.