New toys, new things to learn, and so little time.
Finally got some time over the last two nights. Not Snake Valley-style all-nighters, but enough for the sky to be lightening as I finished my flats and dark flats. I've been waiting for another chance at NGC 253 since Doug (hagar) had a play with my first attempt last month.
So here goes: Skywatcher ED120, Canon 1000D with IDAS LPS filter on an EQ6pro with EQMOD, guided by an ED80 with QHY5, and the capture controlled by Images Plus.
14 x 10 minute subs (the longest I've gone), 11 of which were used.
11 x Darks
8 x Flats
8 x Dark flats
8 x bias frames
All stacked and aligned in Images Plus and then a whole lot of work in CS3.
I'm working my way through layers, curves and levels, and using feathering to rejoin. I've also had a go at lifting the stars and galaxy off the image, sharpening them, and adjusting the black point.
I've attached three JPEGs: the first stacked and aligned in IP, the second is the first round of CS3, and the third is what I'm currently happy with. Although, right now, I'm so tired the bed is looking very good.
Indeed, John is correct. More information lurks within the data set you've presented, Darren. One way of working out how far you can stretch or push the data is careful analysis of the histogram. Ideally, when you see the data begin to comb (vertical lines through the histogram), you know you're close to its limit. Some combing can be dealt with; in fact, if you run a noise reduction algorithm while the histogram is combed, it will clean it up - pretty cool. But don't push your luck: better to back off the stretching than deal with the consequences of hard-pushed data. This can only be used as a general guide, as there are other constraints that may also limit how far the data can be pushed.
Below is what I mean by combing:
First attachment shows no combing.
Second attachment shows the effect.
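For anyone who wants to see combing numerically rather than by eye, here's a toy sketch of my own (not from Images Plus or CS3, and using made-up data). Stretching integer data spreads the populated levels apart, leaving regular empty bins - the vertical gaps in the histogram Jase describes.

```python
# Toy example: how a hard stretch of integer data produces histogram combing.
import numpy as np

def combing_fraction(img8):
    """Fraction of empty histogram bins between the darkest and
    brightest populated levels of an 8-bit image."""
    hist, _ = np.histogram(img8, bins=256, range=(0, 256))
    lo, hi = int(img8.min()), int(img8.max())
    core = hist[lo:hi + 1]
    return float(np.mean(core == 0))

rng = np.random.default_rng(0)
# Synthetic 8-bit sky background with noise spanning levels 20-59
img = rng.integers(20, 60, (512, 512)).astype(np.uint8)

# A hard x3 linear stretch: only every third output level is populated,
# which is exactly the comb pattern in the stretched histogram.
stretched = (img.astype(np.int32) * 3).clip(0, 255).astype(np.uint8)

print(f"empty bins before: {combing_fraction(img):.2f}, "
      f"after: {combing_fraction(stretched):.2f}")
```

The same idea holds for 16-bit data; it just takes a much harder stretch before the gaps appear.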
Quality data can be stretched further than weak data, hence it's worth acquiring more data where possible. Be careful you don't clip the highlights as you stretch. By the looks of the data, I'd say your effort in going long with the subs has paid off. Details are quite pronounced. Keep at it!
Thanks John, I'd say you're right - there seems to be more in there. Gives me something to do tonight.
Jase, thank you very much for the combing tip. I've often wondered why the histogram does that. When processing reasonably good data, is this really the limit I'm looking to stretch my images to? i.e. just prior to, or just a tad of, combing?
I'm working my way through a very large amount of resources that have luckily come my way, but boy, there is so much to processing. As it turns out, the capturing is becoming the easy bit!
In the first instance, I would recommend you stretch the data as hard as it will take. If you don't try this, you will not know what it's capable of (a baseline). This doesn't necessarily mean a hard-stretched image will be the end result every time, as aesthetics come into play. For example, with strong data you are likely to be able to stretch hard to bring out the fainter features of a galaxy's edge (some very cool stellar tidal streams or integrated flux, should they exist), but features in the nucleus will burn out/clip. Restoring the burnt-out features isn't a major issue, but you may end up struggling to keep the brightness and contrast relative to the edge of the galaxy, hence losing the aesthetics, i.e. a monotone feel lacking depth. Based on this, don't overdo it, and use the combing as one of the thresholds to gauge what can be achieved with the data. In some cases, such as LRGB processing, a heavily stretched luminance is going to wash out all colour, so it simply won't be possible to take it to its limits if you can't match it with the chrominance data.
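To make the trade-off concrete, here's a minimal sketch of the "stretch hard to find a baseline" idea, using a fake galaxy frame and a simple levels adjustment (black/white point) - my own illustration, not Jase's actual workflow. Pulling the white point in brightens the faint halo but burns out the bright nucleus.

```python
# Toy example: a hard levels stretch lifts the faint halo but clips the core.
import numpy as np

def levels(img, black, white):
    """Linear levels stretch: map [black, white] to [0, 1] and clip."""
    return np.clip((img - black) / (white - black), 0.0, 1.0)

def clipped_fraction(img, white_point=0.999):
    """Fraction of pixels burnt out at (or above) the white point."""
    return float(np.mean(img >= white_point))

rng = np.random.default_rng(1)
# Fake galaxy frame: faint outer halo noise plus a bright nucleus
data = np.clip(rng.normal(0.02, 0.005, (256, 256)), 0.0, 1.0)
data[120:136, 120:136] = 0.9

# The harder the stretch (lower white point), the more of the core clips
for white in (1.0, 0.5, 0.1):
    s = levels(data, 0.0, white)
    print(f"white point {white}: clipped {clipped_fraction(s):.3%}")
```

The clipped fraction is one simple number to watch alongside the combing when deciding how far a stretch can go.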
It's easy to get too tricky when processing, often for only a small gain. Provided you don't clip data, you can manipulate it as you please between the black and white points. Indeed, capturing data becomes routine - it's what you do with it that counts!