I tend to standardise my exposures as much as possible. It reduces the size of the darks library and simplifies processing afterwards.
I think it's based on:
1. Darkness of your skies (you can go longer under darker skies). This is perhaps more of an issue with DSLRs, which register sky glow faster than dedicated CCDs.
2. Accuracy of your tracking (shorter exposures are easier to track than longer ones).
3. Pixel size and the resulting well capacity - on faster scopes that may mean shorter exposures, otherwise bright stars can saturate and bloat. Not really that relevant in modern CCDs, although the trend in Kodak's latest chips was towards very low well capacity. That is why full well capacity is one of the factors to evaluate when considering a particular chip. The other is blooming (not very common now), which would influence exposure time if you are shooting straight LRGB with a blooming camera and the field contains bright stars.
4. Amount of cloud cover or wind - if there is cloud around or it's windy, 5 minute subs are more likely to be OK than a 10 or 15 minute one, which could be spoiled by a rogue cloud or gust of wind.
5. Brightness of the object / noise level of the camera: Ha, OIII and SII need longer exposures to overcome the camera's noise and the lack of signal in dimmer objects. Bright objects like Omega Centauri would do better with short exposures than 10 minute ones. 20 minutes is a common sub length for Ha; 30 minutes would be fine if your tracking is spot on.
6. Cooling power of the camera. A colder camera allows shorter exposures, as the overall noise floor will be somewhat lower, letting a faint signal show its head above the noise.
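Points 1 and 6 can be folded into a rough "sky-limited" rule of thumb: make each sub long enough that sky-background shot noise swamps the camera's read noise. A minimal sketch of the arithmetic - the read noise, sky rates and the factor of 10 below are illustrative assumptions, not figures for any particular camera:

```python
# Rule-of-thumb sub-exposure length: expose until the sky background
# contributes ~10x the read-noise variance, so read noise adds little.
# All numbers used below are illustrative assumptions, not camera specs.

def min_sub_length(read_noise_e, sky_rate_e_per_s, swamp_factor=10):
    """Seconds per sub so sky electrons >= swamp_factor * read_noise^2."""
    return swamp_factor * read_noise_e ** 2 / sky_rate_e_per_s

# Example: 8 e- read noise under a bright suburban sky of 2 e-/pixel/s
print(min_sub_length(8, 2))    # 320 s, i.e. roughly 5 minutes

# The same camera under a darker sky (0.4 e-/pixel/s) wants longer subs
print(min_sub_length(8, 0.4))  # 1600 s - why dark skies let you go longer
```

The exact swamp factor is a matter of taste; the point is only that darker skies and noisier cameras both push the minimum sub length up.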
As a practical observation, the most common sub-exposure for an astro CCD is 10 minutes. Sometimes you see 5, but that really requires quite a low-noise camera.
Your aim is to get the signal level of the image above the noise floor of the camera. Until you do that, there will be nothing to see in your image.
Different cameras have different noise levels, even ones using the same chip!
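The trade-off can be sketched numerically: for a fixed total integration time, read noise is the only term that penalises shorter subs, which is why a low-noise camera gets away with 5 minute subs. The object rate, sky rate and read noise below are made-up illustrative values, not measurements:

```python
import math

# Stacked signal-to-noise ratio for n_subs exposures of sub_len seconds.
# Shot noise from object and sky scales with total time; read noise is
# paid once per sub. All input values here are illustrative assumptions.

def stack_snr(obj_rate, sky_rate, read_noise, sub_len, n_subs):
    signal = obj_rate * sub_len * n_subs          # object electrons, total
    variance = (signal
                + sky_rate * sub_len * n_subs     # sky shot noise
                + n_subs * read_noise ** 2)       # read noise, once per sub
    return signal / math.sqrt(variance)

# Same 3 hours total: 60 x 5 min vs 30 x 10 min on a faint target
print(stack_snr(0.05, 0.5, 8, 300, 60))   # ~7.7
print(stack_snr(0.05, 0.5, 8, 600, 30))   # ~8.3 - fewer, longer subs win
```

With a lower read noise figure the gap between the two schedules shrinks, which matches the observation above that 5 minute subs need a quieter camera.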
Greg.