This is my second attempt (ever) at processing LRGB data. Man, it's harder than narrowband! Don't know how you guys do it so well. Would have liked more colour data, but the weather is the weather.
Data is:
R: 4x420
G: 6x420
B: 7x420
L: 20x420, plus a synthetic L created from the RGB.
The registered and calibrated RGB and "super L" data set is available here if anyone wants to play with it.
Basic process was:
Decon L
Noise reduction on L using ATrousWaveletTransform
Multiple HistogramTransformation passes on L
Using image statistics, get the median from the L, then use masked stretch on the R/G/B images with a target background matching the L's median
LRGBCombination using default settings
ColourCalibration
Couple of curves tweaks to the L "channel"
HDRWavelets to lightness with lightness mask
Couple of passes of LocalHistogramEqualization
Boost saturation using curves with a mask to protect the background
SCNR to get rid of the green stars.... green stars?
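For what it's worth, the median-matching step above (measure the stretched L's background median, then stretch each R/G/B channel to the same target) can be sketched in numpy terms. This is only an illustration of the idea behind MaskedStretch's target background, not PixInsight's actual implementation; `mtf` is the standard midtones transfer function and the images are random stand-ins:

```python
import numpy as np

def mtf(x, m):
    # Standard midtones transfer function: maps level x through
    # midtones balance m (m = 0.5 is the identity).
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

def solve_midtones(x, y):
    # Closed-form midtones balance m such that mtf(x, m) == y,
    # i.e. a stretch that moves background level x to target y.
    return (x * y - x) / (2.0 * x * y - x - y)

rng = np.random.default_rng(0)
L_stretched = rng.uniform(0.05, 0.35, (64, 64))  # stand-in for the stretched L
rgb_channel = rng.uniform(0.0, 0.02, (64, 64))   # stand-in for a linear R/G/B channel

target = np.median(L_stretched)                  # L's background median
m = solve_midtones(np.median(rgb_channel), target)
matched = mtf(rgb_channel, m)                    # channel stretched to the same background

print(round(float(np.median(matched)), 3), round(float(target), 3))
```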
All in all, not that happy with it yet so I'll probably redo it another 50 times :-)
Edit: Just tweaked the colour balance a bit more since it was too yellow.
Last edited by codemonkey; 20-07-2015 at 07:44 PM.
I think it's great for the beginnings in LRGB imaging. I remember when I went from a Qhy8 to an st8300m and filters a few years back I found the transition really really difficult to grasp. It took a few months to get my groove going with it. Once you find what works for you, then you start to see the benefits of the extra work in LRGB imaging.
The theory is that ColorCalibration should only be applied to linear data. Might be worth trying to see if it helps... Getting good LRGB colour is a challenge at the best of times
I believe it is best to get the colour the way you want it while it is linear. Don't ask me why it's better to do it linear than not, just what I've heard! I am at the same stage as you codemonkey :-)
Quote:
I think it's great for the beginnings in LRGB imaging. I remember when I went from a Qhy8 to an st8300m and filters a few years back I found the transition really really difficult to grasp. It took a few months to get my groove going with it. Once you find what works for you, then you start to see the benefits of the extra work in LRGB imaging.
Cheers Alex :-)
I didn't really spend that much time doing OSC with a dSLR before moving to a mono CCD. But it's been a long time with lots of equipment issues, so almost all of my (limited) stuff has been mono, because I always get L and then figure I'll get RGB later, but later never comes.
All good fun though, at the early point of a learning curve... best place to be!
Quote:
Originally Posted by RickS
Lee,
The theory is that ColorCalibration should only be applied to linear data. Might be worth trying to see if it helps... Getting good LRGB colour is a challenge at the best of times
Cheers,
Rick
Quote:
Originally Posted by Atmos
I believe it is best to get the colour the way you want it while it is linear. Don't ask me why it's better to do it linear than not, just what I've heard! I am at the same stage as you codemonkey :-)
Thanks Rick & Atmos! I did try to do colour calibration with the linear data but it went from having a strong yellow hue over everything to having a slightly more olive, but still mostly yellow, strong hue over everything... so I tried it with the non-linear data and that seemed to work better.
I'll have another crack at calibrating the colour when it's linear and see how I go... must be some setting I didn't have right.
Quote:
Originally Posted by ghsmith45
Just downloaded it now Lee. Data looks pretty good. Will have a go at it tomorrow since I have no more outstanding data of my own.
Geoff
Edit: Just had a quick run through it. I think it has lots of potential. Will do it more carefully tomorrow.
Thanks Geoff! Having you say "lots of potential" is a pretty big compliment given the work I've seen of yours on Astrobin. Looking forward to seeing what you can do with it.
I was able to pull out some good detail from the original L data, probably the best I've managed with a broadband subject... but trying to do it with LRGB has left me scratching my head.
The colour is extremely muted after I combine, and when I try to tweak the data it just washes out more... and then when I up the saturation it makes the data look bad and the colour look cartoonish. Lots to learn when it comes to processing colour... looking forward to the challenge.
Here's the mono version I did with the original L a while ago. Adding in the synthetic L dropped the noise by roughly 1/3 so there's a decent amount of detail in there I think, if you're actually competent in processing.
Just remember. Luminance is everything. Get it bright, deep, sharp and detailed. Colour needn't be too extreme. My best image with my 8300 was about 10hrs luminance and 3x5min of each colour.
A LRGBCombination of R/G/B followed by BackgroundNeutralization and ColorCalibration using a preview of the galaxy as the white reference (Structure Detection disabled) gave quite good colour for me...
I'll have a quick play and see what I can come up with...
Quote:
Originally Posted by RickS
A LRGBCombination of R/G/B followed by BackgroundNeutralization and ColorCalibration using a preview of the galaxy as the white reference (Structure Detection disabled) gave quite good colour for me...
I'll have a quick play and see what I can come up with...
Cheers,
Rick.
Ah, that's interesting. I was using the galaxy as a white reference, but with structure detection enabled. I was also using a preview for the background reference.
I noticed a subtle (to my eye anyway) gradient when viewing with STF and I extracted that using DBE... interestingly enough the colour balance was pretty damn good after that without even running colour calibration. Didn't expect that...
Quote:
Originally Posted by RickS
A quick process in PI...
Looking good! Thanks for that Rick, and I appreciate the detailed write-up too. I'll be going over that and picking up some things which I'll no doubt include in my workflow going forward :-)
Quote:
Originally Posted by codemonkey
Ah, that's interesting. I was using the galaxy as a white reference, but with structure detection enabled. I was also using a preview for the background reference.
Structure detection is appropriate if you're using star colour for the white reference but not for a galaxy where you want the whole object considered.
Quote:
Originally Posted by codemonkey
I noticed a subtle (to my eye anyway) gradient when viewing with STF and I extracted that using DBE... interestingly enough the colour balance was pretty damn good after that without even running colour calibration. Didn't expect that...
DBE can do a pretty effective background neutralization
OK Lee, here we go:
I didn't use your luminance because I didn't know how you'd done it. I generated a new synthetic L from your R, G and B and used that instead.
--Load R, G and B images. Do a LinearFit with G as the reference.
--ChannelCombination
--CloneStamp to remove stray red, green and blue pixels.
--DBE
--BackgroundNeutralization
--ColorCalibration with M83 as the white reference
--RGBWorkingSpace with luminance coefficients set to 1 and gamma set to 1. The reason for doing this is to extract a linear luminance image in which R, G and B all contribute equally.
--Image > Extract > Lightness (CIE L*). I used this as my synthetic luminance. Stretch with HT.
--Clone and clip synthetic L for use as a mask for the RGB image
--Back to RGB image.
--Stretch with HT
--Boost saturation with Curves, using the clipped synthetic L to mask the background.
--Noise reduction with ACDNR, using the clipped synthetic L to mask the bright areas. Go strong on the chrominance noise.
--SCNR to remove green cast.
--LRGBCombination
--Masked saturation boost with Curves
--SCNR
--Convert to sRGB for web display.
The image would probably benefit from a deconvolution as well as a light noise reduction on the final image.
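The luminance-extraction step above (RGBWorkingSpace with all coefficients at 1, then Image > Extract > Lightness) can be sketched roughly in numpy. This is an illustrative approximation using the standard CIE L* formula with equal channel weighting, not PixInsight's actual implementation:

```python
import numpy as np

def cie_lstar(r, g, b):
    # With the RGBWS luminance coefficients all set to 1, relative
    # luminance is just the plain average of the three channels.
    y = (r + g + b) / 3.0
    # Standard CIE L* transfer function (delta = 6/29), rescaled to [0, 1].
    delta = 6.0 / 29.0
    f = np.where(y > delta**3, np.cbrt(y), y / (3.0 * delta**2) + 4.0 / 29.0)
    return (116.0 * f - 16.0) / 100.0

r = np.array([[0.0, 1.0]])
g = np.array([[0.0, 1.0]])
b = np.array([[0.0, 1.0]])
print(cie_lstar(r, g, b))  # black -> 0, white -> 1
```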
Geoff
Most of the data I had was in that L; the RGB had limited time put into it, so you pulled out a pretty good image from the worst of the data.
In going over these processes, both yourself and Rick mentioned a clipped luminance mask... can either of you spell that out for me a bit?
I never knew about linear fit either, that's perfect... I used masked stretch in mine because I didn't know that existed.
One other thing I discovered is that the saturation slider on the LRGBCombination tool is backwards, and I've been able to get much better colour now using that tool rather than trying to adjust it after the fact with curves.
I've updated my original post with my newest processing revision... much better now I think, though it's still far from perfect.
Quote:
Originally Posted by codemonkey
In going over these processes, both yourself and Rick mentioned a clipped luminance mask... can either of you spell that out for me a bit?
I never knew about linear fit either, that's perfect... I used masked stretch in mine because I didn't know that existed.
One other thing I discovered is that the saturation slider on the LRGBCombination tool is backwards, and I've been able to get much better colour now using that tool rather than trying to adjust it after the fact with curves.
Lee,
A clipped luminance mask is used when you want to apply a process to the brighter parts of the image more strongly and not to the dim background at all. A couple of typical examples are when you are boosting saturation or sharpening. These are not things you want to do to the background!
To create a clipped Lum mask you need some Luminance captured with an L filter or extracted from an RGB image (CIE L* or CIE Y component will do, but do a RGBWS first to set the luminance coefficients all to 1.) If this data is linear then you'll need to stretch it. The final step is to black clip it with HistogramTransformation so that the background area is all black (use a dynamic preview to adjust the black point.)
Apply this mask to your image and it will protect the background areas and apply the process more strongly to the bright areas.
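In numpy terms, the black clip Rick describes amounts to something like the sketch below. This is purely illustrative (a hand-picked black point on a toy image, not PixInsight's HistogramTransformation dialog):

```python
import numpy as np

def clipped_lum_mask(lum_stretched, black_point):
    # Black-clip a stretched luminance image: everything at or below
    # black_point becomes 0 (background fully protected), the rest is
    # rescaled so bright areas let the processing through.
    return np.clip((lum_stretched - black_point) / (1.0 - black_point), 0.0, 1.0)

lum = np.array([[0.02, 0.05],
                [0.40, 0.90]])  # toy stretched luminance
mask = clipped_lum_mask(lum, black_point=0.10)
print(mask)  # background pixels -> 0, bright pixels survive
```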
LinearFit isn't a replacement for a stretch but it does allow you to scale an image to approximately match the brightness of another one. This is useful for matching narrowband masters and many other applications.
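At heart, LinearFit is a least-squares fit of one image to another. A rough numpy equivalent (illustrative only; PI's LinearFit also does outlier rejection, which is omitted here):

```python
import numpy as np

def linear_fit(source, reference):
    # Solve reference ~= a * source + b in the least-squares sense,
    # then rescale source to match the reference's brightness.
    a, b = np.polyfit(source.ravel(), reference.ravel(), 1)
    return a * source + b

rng = np.random.default_rng(1)
ref = rng.uniform(0.0, 1.0, 1000)
src = 0.25 * ref + 0.03 + rng.normal(0.0, 0.001, 1000)  # dimmer copy plus noise
fitted = linear_fit(src, ref)
print(round(float(np.mean(fitted - ref)), 4))  # residual mean is ~0 after fitting
```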
And, yes, the Lightness and Saturation functions in LRGBCombination seem to work the wrong way around if you don't understand that they are MTF functions like the midtone balance in HistogramTransformation (moving them to the left is a boost, moving to the right is a reduction).
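The MTF Rick mentions is the standard midtones transfer function; a quick sketch of why left-of-centre is a boost (illustrative numbers only):

```python
def mtf(x, m):
    # Midtones transfer function: m is the midtones balance setting.
    # m = 0.5 is the identity; m < 0.5 boosts, m > 0.5 reduces.
    return ((m - 1.0) * x) / ((2.0 * m - 1.0) * x - m)

pixel = 0.2
print(mtf(pixel, 0.50))  # identity: 0.2
print(mtf(pixel, 0.25))  # balance left of centre -> boosted (~0.43)
print(mtf(pixel, 0.75))  # balance right of centre -> reduced (~0.08)
```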
Hi Lee
I misunderstood your data. I thought you only had RGB and that the L you posted was a synthetic L. Didn't realise you had a real L, so what I did only came from your RGB data. Will have another go tomorrow using your L.
Geoff
Damn you PI freaks! I look so incompetent with CCDStack and Photoshop. Layers and masks, and plenty of them.
Colour combine in CCDStack. Used two RGB versions. Same colour balance for both but one with a high saturation. Applied while data is linear.
DDP stretched the Lum and colour masters. All exported as TIFFs into Photoshop. Conventional LRGB combine. While tempted to go into PI for some DBE, I opted to use Russ' GradientXTerminator plugin. Applied a blue photo filter as a mask to highlight the starburst regions. Highpass mask on luminance to tighten. Inverse mask for noise control; on reflection I should have gone harder, as the background is still a mess, but this was a quick and dirty process. Tempted to liquify some of the bloated stars to tighten but decided to save them from implosion.
I actually have an M83 data set from 16" ASA that I'm yet to process. Will look into it shortly.
Quote:
Originally Posted by RickS
A clipped luminance mask is used when you want to apply a process to the brighter parts of the image more strongly and not to the dim background at all. A couple of typical examples are when you are boosting saturation or sharpening. These are not things you want to do to the background!
To create a clipped Lum mask you need some Luminance captured with an L filter or extracted from an RGB image (CIE L* or CIE Y component will do, but do a RGBWS first to set the luminance coefficients all to 1.) If this data is linear then you'll need to stretch it. The final step is to black clip it with HistogramTransformation so that the background area is all black (use a dynamic preview to adjust the black point.)
Apply this mask to your image and it will protect the background areas and apply the process more strongly to the bright areas.
Aha, I see, thanks Rick! I was thinking it was a white-clipped rather than black-clipped mask and was wondering how you guys were creating them.
Quote:
Originally Posted by RickS
LinearFit isn't a replacement for a stretch but it does allow you to scale an image to approximately match the brightness of another one. This is useful for matching narrowband masters and many other applications.
Yeah, I used masked stretch for essentially that purpose: because you can set a target background value, I used it to normalise the RGB, even though I didn't really want to stretch it.
Quote:
Originally Posted by RickS
And, yes, the Lightness and Saturation functions in LRGBCombination seem to work the wrong way around if you don't understand that they are MTF functions like the midtone balance in HistogramTransformation (moving them to the left is a boost, moving to the right is a reduction).
Cheers,
Rick.
Eh, I dunno. Sliders have a convention where moving to the right increases the value/effect and moving to the left decreases it. It'd be simple for them to abstract that detail from the user and keep the UI consistent with the rest of the world.
Quote:
Originally Posted by ghsmith45
Hi Lee
I misunderstood your data. I thought you only had RGB and that the L you posted was a synthetic L. Didn't realise you had a real L, so what I did only came from your RGB data. Will have another go tomorrow using your L.
Geoff
Ah cool. The L data provided is a bit over 2hrs of pure L, but with the RGB added in; I simply used L+R+G+B in PixelMath and rescaled the result. Maybe there are better ways of doing that.
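The combination Lee describes can be sketched in numpy as a stand-in for the PixelMath expression (a straight sum with PixelMath's "rescale result" behaviour; weighting the channels by noise is one of the possibly "better ways", but this mirrors the simple approach):

```python
import numpy as np

def super_lum(l, r, g, b):
    # Straight sum of the four masters, rescaled back to [0, 1] --
    # roughly the PixelMath expression L + R + G + B with the
    # rescale-result option enabled.
    s = l + r + g + b
    lo, hi = s.min(), s.max()
    return (s - lo) / (hi - lo)

rng = np.random.default_rng(2)
chans = [rng.uniform(0.0, 0.25, (32, 32)) for _ in range(4)]  # toy L, R, G, B
sl = super_lum(*chans)
print(round(float(sl.min()), 3), round(float(sl.max()), 3))  # 0.0 1.0
```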
Quote:
Originally Posted by Rex
This is a great thread Lee and your data looks awesome. Sitting back watching and learning heaps.
Thanks Rex! My data is a bit soft but it's about the best I've managed so far.
Quote:
Originally Posted by jase
Damn you PI freaks! I look so incompetent with CCDStack and Photoshop. Layers and masks, and plenty of them.
Colour combine in CCDStack. Used two RGB versions. Same colour balance for both but one with a high saturation. Applied while data is linear.
DDP stretched the Lum and colour masters. All exported as TIFFs into Photoshop. Conventional LRGB combine. While tempted to go into PI for some DBE, I opted to use Russ' GradientXTerminator plugin. Applied a blue photo filter as a mask to highlight the starburst regions. Highpass mask on luminance to tighten. Inverse mask for noise control; on reflection I should have gone harder, as the background is still a mess, but this was a quick and dirty process. Tempted to liquify some of the bloated stars to tighten but decided to save them from implosion.
I actually have an M83 data set from 16" ASA that I'm yet to process. Will look into it shortly.
Nice one Jase, particularly for one of you incompetent Photoshop guys ;-) But hey, the right tool for the right job, and sometimes the right tool is the one you know how to use. Looking forward to seeing your M83 soon!
Here's my next version Lee. As I said above, I misread your post and assumed you only had a synthetic luminance from your RGB data, so my previous attempt was based only on your RGB data. I used pretty much the same processing in this one. And of course one of the good things about PI is that you can save your work as a project, so when you reload the next day all the stuff you did before (images, history) is still there to be further processed. No lost work.
Differences were:
--A deconvolution on your (real) luminance
--LRGBCombination with the real luminance rather than the synthetic one (which I still used as a mask)
--Some sharpening with ATrousWaveletTransform.
--Noise reduction with TGVDenoise
--Upped the saturation a bit with CurvesTransformation
--The odd Histogram tweak here and there.
All up this has been an interesting thread. I learned a couple of new tricks from PI Guru Rick's workflow above.
Geoff