Folks,
I've read a few of the _really_ interesting threads about finding optimal exposure times for CCDs, but wondered if there was a way to similarly estimate for a DSLR?
Having taken a bunch of bias and flat frames on my 1100D, I used the DSLR settings PixInsight script to calculate the following values, with ISO 800 selected on the camera:
Gain = 0.36 e/ADU
ISO for unit gain = 288
Read noise = 3.92 e
Bias = 2048 ADU
Thermal noise = 3.73 e (using dark frames of 10 min at ~5C)
If I open an example light frame taken at the same temperature and ISO (a shot of the Grus 3/4 quartet with the Esprit) in PI and hover the pointer over the background, I see average pixel values of 0.032 - 0.035, with the small bright spots of galactic cores around 0.15.
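Out of curiosity, here's a rough back-of-envelope conversion of those normalized readouts into electrons using the measured values above. One assumption to flag: I'm guessing PI is normalizing the raw 14-bit data against the full 16-bit range (value = ADU / 65535), which would put the 2048 ADU bias offset at ~0.0312 — and would neatly explain why the background reads only just above that, at 0.032 - 0.035.

```python
# Sketch: convert PI's normalized background readout to sky signal in
# electrons, using the measured 1100D values quoted above.
# ASSUMPTION: normalized value = raw ADU / 65535 (14-bit data in a
# 16-bit container, no rescaling).

GAIN_E_PER_ADU = 0.36     # measured gain at ISO 800
READ_NOISE_E   = 3.92     # measured read noise, electrons
BIAS_ADU       = 2048     # measured bias offset, ADU
NORM_FULL      = 65535    # assumed normalization divisor

def sky_electrons(normalized_value):
    """Sky signal (bias-subtracted) in electrons for one normalized readout."""
    adu = normalized_value * NORM_FULL
    return (adu - BIAS_ADU) * GAIN_E_PER_ADU

lo = sky_electrons(0.032)   # roughly 18 e-
hi = sky_electrons(0.035)   # roughly 88 e-
print(f"sky signal: {lo:.0f}-{hi:.0f} e-")
print(f"sky shot noise: {lo**0.5:.1f}-{hi**0.5:.1f} e-"
      f" vs read noise {READ_NOISE_E} e-")
```

If that normalization guess is right, the sky's shot noise in those subs is only ~1-2x the read noise, which is relevant to the question below.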
My technique, for want of a better word (!), has been to expose so the histogram sits away from the left-hand edge, preferably about 1/4 of the way towards the right. That works well with larger objects like Andromeda, so maybe the sub I looked at is not a good... example... as there are numerous bright stars in the field with pixel values around 0.25, which I suspect dominate the luminance in the image. But at least there is some signal in the image exposing as intended (misguided or otherwise!)
In my example above, I figure I could easily have exposed for double the time and captured more signal. But since I'm using an uncooled DSLR, albeit in generally cool ambient temperatures, would the signal-to-noise actually be any better?
So... at what point can I be sure that the signal I'm trying to capture is comfortably above the bias/read noise and/or thermal noise? And where is the sweet spot?
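For what it's worth, the usual answer for CCDs carries over: a sub is "long enough" when the sky background's shot noise swamps the read noise, a common rule of thumb being sky signal >= k x (read noise)^2 electrons, with k around 5-10 (k = 10 keeps read noise's contribution to total noise under ~5%). A quick sketch with the measured numbers (the 300 s / 50 e- figures at the end are just illustrative, not from the post):

```python
# Sky-limited sub-exposure rule of thumb: sky signal >= k * RN^2 electrons.
# With sky = k*RN^2, total noise = sqrt(k*RN^2 + RN^2) = RN*sqrt(k+1),
# so at k=10 read noise inflates the total by only sqrt(11/10) ~ 4.9%.

READ_NOISE_E = 3.92   # measured read noise, electrons

def required_sky_electrons(read_noise_e, k=10):
    """Target sky background per pixel, in electrons."""
    return k * read_noise_e ** 2

def scale_exposure(current_exp_s, current_sky_e, read_noise_e, k=10):
    """Exposure needed for the same sky to hit the target (sky scales ~linearly)."""
    return current_exp_s * required_sky_electrons(read_noise_e, k) / current_sky_e

target = required_sky_electrons(READ_NOISE_E)   # ~154 e-
print(f"target sky signal: {target:.0f} e-")
# Hypothetical example: a 300 s sub showing ~50 e- of sky background
print(f"needed exposure: {scale_exposure(300, 50, READ_NOISE_E):.0f} s")
```

A fuller treatment would fold the measured thermal noise into the same comparison (add the dark current's shot noise in quadrature with the read noise), which matters more for an uncooled DSLR on warm nights; past that target, longer subs mostly just cost you dynamic range on the bright stars.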
Any insights gratefully received