Please don't take this the wrong way, but what's the point of the extra resolution if the image is noisy, clipped, and white balance off when viewed at full scale?
That is a good question.
Can you find an image this deep anywhere? I just do not know what colour to make it! I have no one to follow as I am so far in front! I must admit when I zoom in the colour is crap. As far as resolution is concerned, I may go to presenting my images as very blocky, low-resolution JPEGs that hide a myriad of faults. Then the assembled multitudes can go wow at a mediocre image that panders to their equally mediocre taste.
I just do not care what anyone thinks.
I did not take it the wrong way as I knew exactly what sort of sentiment your comment engendered.
So please do not take this the wrong way when I say you are lacking the knowledge to even pass judgment.
That is some really dim stuff, and it is difficult to record, let alone display, with any sort of relation to your run-of-the-mill bright targets. So cosmetically it will not look at all like the classic deep sky images.
Interesting response. You start off by acknowledging that it's a good question, then spend five paragraphs not answering it directly and descend into making assumptions about my intelligence. Clearly you have taken it the wrong way.
Quote:
Originally Posted by avandonk
That is a good question.
Can you find an image this deep anywhere?
There's a difference between "deep" and "over-stretched". Mr Ward hinted at this earlier in the thread.
Quote:
Originally Posted by avandonk
I just do not know what colour to make it!
I agree. It's all subjective and personal taste. Some don't like green in their astro-images, and others do.
Quote:
Originally Posted by avandonk
I have no one to follow as I am so far in front
This was either written tongue-in-cheek, or you're serious.
If serious, you could prove it by leading from the front, which sometimes means being a man and listening to constructive criticism instead of throwing it back in people's faces.
I seem to recall reading here somewhere that you're a (retired?) lecturer? Did you react the same way to your students' questions, i.e. it's your way or the highway? I would have thought a man of academic background would appreciate different views and being challenged.
Quote:
Originally Posted by avandonk
I must admit when I zoom in the colour is crap. As far as resolution is concerned, I may go to presenting my images as very blocky, low-resolution JPEGs that hide a myriad of faults. Then the assembled multitudes can go wow at a mediocre image that panders to their equally mediocre taste.
No need for the sarcasm. I'm sure that most images taken with the sort of gear you're wielding would look the same at 1:1.
Again, I suppose it's personal taste. Some post pictures that can be seen whole on one screen and look pleasing to the eye, while others prefer to delve deep and appreciate the noise up close.
Quote:
Originally Posted by avandonk
I just do not care what anyone thinks.
Me neither. That's why I dared to ask you an honest question instead of blindly saying "wonderful image."
Quote:
Originally Posted by avandonk
I did not take it the wrong way as I knew exactly what sort of sentiment your comment engendered.
Quote:
Originally Posted by avandonk
So please do not take this the wrong way when I say you are lacking the knowledge to even pass judgment.
You're entitled to your opinion. And I agree that I don't have the skill, knowledge, and experience of others here. Most here. I'm humble like that.
But where you're wrong is that I'm also entitled to my opinion.
And you're also confusing knowledge about these topics with intelligence.
Quote:
Originally Posted by avandonk
About fifty percent of humanity is below average intelligence. I wish they would stop proving it!
I'll have to check the actual numbers, but there's a certain percentage of grumpy old men that can't take criticism or even a question without over-reacting and berating. I wish they would stop proving it, but they just can't help themselves.
Troy, the technique does deliver extra resolution *if* you are undersampling, i.e. if the optics are delivering more resolution than the camera can resolve.
It really doesn't matter whether one likes Bert's image or not, for whatever reasons; the important thing to me is: does the technique work?
The answer is yes, and the technique is widely utilised in a myriad of signal processing applications.
The math is well known, and the theory is sound.
Anyone can try it themselves, and verify whether the technique produces usable extra resolution.
So to anyone wondering, give it a go. It works for me and resolves extra fine detail. Not jaw dropping but easy to see. Allan has good results and images to show it in the other thread I referenced.
I'm glad Bert shared the technique.
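For anyone who wants to give it a go, here's a minimal sketch of the general dither-and-stack idea in Python (numpy/scipy). To be clear, this is not Bert's exact pipeline; the upsampling factor, the interpolation order, and the assumption that the dither offsets have already been measured (e.g. from star registration) are all mine:

```python
# Minimal dither-and-stack sketch (not Bert's exact pipeline): upsample
# each sub-frame onto a finer grid, shift it back by its measured dither
# offset, then average. Assumes offsets are already known, in
# original-pixel units.
import numpy as np
from scipy.ndimage import shift, zoom

def superres_stack(frames, offsets_px, factor=2):
    """frames: list of 2-D arrays; offsets_px: per-frame (dy, dx) dither
    offsets; factor: how much finer the output grid is."""
    acc = None
    for frame, (dy, dx) in zip(frames, offsets_px):
        up = zoom(frame, factor, order=3)             # resample to fine grid
        up = shift(up, (-dy * factor, -dx * factor))  # undo the dither
        acc = up if acc is None else acc + up
    return acc / len(frames)
```

The random dithers are what make this work: each sub samples the scene at a slightly different sub-pixel phase, so the averaged fine grid carries information a single coarse frame cannot.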
Last edited by Poita; 01-02-2013 at 06:34 PM.
Reason: getting my overs and unders mixed up again!
Quote:
Originally Posted by Poita
Troy, the technique does deliver extra resolution *if* you are oversampling, i.e. if the optics are delivering more resolution than the camera can resolve.
Oversampling is the camera sampling at a higher resolution than the seeing/optics can resolve; undersampling is the optics delivering more resolution than the camera can resolve.
Brain-fart moment, I just came back from my 3hr math exam and am still a bit addled. Thanks for the catch, I've fixed it now.
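For anyone unsure which side of the line their own setup falls on, a quick back-of-envelope check (the rule of thumb being roughly two pixels across the FWHM the sky and optics actually deliver; the 9 µm / 600 mm figures below are my back-calculation from Bert's quoted 3.1 "/px, not confirmed specs):

```python
# Over/under-sampling rule of thumb: you want roughly 2 pixels across
# the FWHM the seeing and optics actually deliver.
def pixel_scale_arcsec(pixel_um, focal_mm):
    # 206.265 converts (micrometres / millimetres) to arcseconds
    return 206.265 * pixel_um / focal_mm

scale = pixel_scale_arcsec(pixel_um=9.0, focal_mm=600)  # ~3.1 "/px
delivered_fwhm = 2.0                                    # arcsec, example seeing
print("undersampled" if scale > delivered_fwhm / 2 else "oversampled")
```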
You certainly got a lot of detail there. Not sure what the total exposure time of this image is; more data could smooth it out and reveal even more. The RH200 is proving itself to be a real alternative to APOs, perhaps more so than other compound scopes. There's some smart people around.
A physicist has some washing that needs doing, and sees a sign in a shop window down the road: "Laundry done here."
He hauls the washing into the shop. The shop owner (by coincidence named Albert) is a mathematician. The physicist asks when he can have his washing done.
"Oh no," says Albert, "we just make signs!"
The point of my tale?
While the math works, I'd suggest it ignores the elephant in the room... atmospheric turbulence.
Turbulence spreading light beyond a pixel's location is why this works. With perfect seeing it would fail, and you'd need to move the sensor instead (i.e. Drizzle).
Improving the S/N ratio by stacking images (assuming random noise) and getting multiple samples of a stellar location (effectively averaging its location to a precision smaller than a single pixel) makes the difference...
...but I still fail to see how the application of a constant (i.e. make pixel X bigger by 1.5) improves anything with the floating point calculations that follow.
I suspect a 1.5 pixel Gaussian spread/blur applied prior to stacking would end up looking the same.
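Peter's blur-versus-dither question is actually easy to test numerically. One way to sketch it (entirely synthetic; the star field, seeing width, bin factor, and dither counts are all made up): build a fine-grid scene, blur it to simulate seeing, sample it at coarse pixels with random sub-pixel dithers, then compare a super-res stack of the subs against a single sub given the proposed 1.5-pixel Gaussian blur.

```python
# Numerical test of the dither-vs-Gaussian-blur question. Entirely
# synthetic: star field, seeing sigma, and dither pattern are made up.
import numpy as np
from scipy.ndimage import gaussian_filter, shift

rng = np.random.default_rng(0)
truth = np.zeros((256, 256))
truth[rng.integers(0, 256, 50), rng.integers(0, 256, 50)] = 1.0  # point "stars"
scene = gaussian_filter(truth, sigma=3.0)  # atmospheric seeing on a fine grid

def coarse_sub(dy, dx, binf=4):
    """One camera sub: dither the fine scene, then bin 4x4 to coarse pixels."""
    return shift(scene, (dy, dx)).reshape(64, binf, 64, binf).mean(axis=(1, 3))

dithers = rng.uniform(0, 4, size=(30, 2))       # sub-pixel dithers, fine units
subs = [coarse_sub(dy, dx) for dy, dx in dithers]
blurred = gaussian_filter(subs[0], 1.5)         # the proposed comparison
# Feed `subs` (with dithers divided by binf to get coarse-pixel offsets)
# into a super-res stack like the earlier sketch, and compare the
# recovered detail against `blurred`.
```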
Last edited by Peter Ward; 01-02-2013 at 08:05 PM.
Quote:
Originally Posted by Poita
Troy, the technique does deliver extra resolution *if* you are undersampling, i.e. if the optics are delivering more resolution than the camera can resolve.
It really doesn't matter whether one likes Bert's image or not, for whatever reasons; the important thing to me is: does the technique work?
The answer is yes, and the technique is widely utilised in a myriad of signal processing applications.
The math is well known, and the theory is sound.
Anyone can try it themselves, and verify whether the technique produces usable extra resolution.
So to anyone wondering, give it a go. It works for me and resolves extra fine detail. Not jaw dropping but easy to see. Allan has good results and images to show it in the other thread I referenced.
I'm glad Bert shared the technique.
I never once questioned the technique or theory. And my original question wasn't meant to imply that I did or didn't like the image. I was simply trying to understand why go to these lengths to get extra resolution when there are more fundamental things (mainly noise and clipping, as I agree that white balance is subjective and personal taste) that could be corrected in the image at normal resolution. Get that right first, then start worrying about resolution.
Again, just trying to understand the "why."
I shouldn't be surprised. I asked Bert here in another thread a while ago about why he was choosing to use his NII filter instead of Ha while he was still testing and trying to get flat fields. My reasoning for the question was innocent, just trying to understand.
His response was that many objects like the Helix Nebula emit strong NII, and that Ha filters wider than 3nm also pick up the NII spectrum. Correct statement, and I already knew that. But it didn't answer my question, and he wasn't imaging the Helix, he was imaging the Eta Carinae nebula or something that was strong in Ha, not NII.
I'm wondering if, had he used Ha instead of NII, he'd be getting stronger signal-to-noise and wouldn't have to stretch so much.
Again, I'm no expert. That's why I'm asking the questions: to understand, not antagonise.
I wonder, on a purely scientific level, if the NII band does indeed stop the HA light at f/3. The two lines are so close together it would not be impossible for the shift to move the line to incorporate part of the HA emission.
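This shift question can be roughed out with the standard interference-filter tilt formula. A sketch (the effective index n_eff ≈ 2.0 is a typical value for hard-coated filters and is my assumption here; the marginal ray of an f/3 beam arrives at about atan(1/6) ≈ 9.5°):

```python
# Tilt blue-shift of an interference filter's central wavelength:
#   cwl(theta) = cwl(0) * sqrt(1 - (sin(theta) / n_eff)**2)
# Marginal rays of an f/3 beam hit at about atan(1/6) ~ 9.5 degrees.
# n_eff = 2.0 is an assumed, typical effective index.
import math

def shifted_cwl(cwl_nm, theta_deg, n_eff=2.0):
    return cwl_nm * math.sqrt(1 - (math.sin(math.radians(theta_deg)) / n_eff) ** 2)

# A filter centred on the 658.35 nm NII line shifts to ~656.1 nm at the
# f/3 marginal ray, essentially landing on HA (656.28 nm).
print(round(shifted_cwl(658.35, 9.5), 2))
```

On this rough estimate the marginal-ray shift (~2.2 nm) is comparable to the HA-NII line separation (~2.1 nm), so the suspicion looks well founded.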
Quote:
Originally Posted by Peter Ward
A physicist has some washing that needs doing, and sees a sign in a shop window down the road: "Laundry done here."
He hauls the washing into the shop. The shop owner (by coincidence named Albert) is a mathematician. The physicist asks when he can have his washing done.
"Oh no," says Albert, "we just make signs!"
The point of my tale?
While the math works, I'd suggest it ignores the elephant in the room... atmospheric turbulence.
Turbulence spreading light beyond a pixel's location is why this works. With perfect seeing it would fail, and you'd need to move the sensor instead (i.e. Drizzle).
Improving the S/N ratio by stacking images (assuming random noise) and getting multiple samples of a stellar location (effectively averaging its location to a precision smaller than a single pixel) makes the difference...
...but I still fail to see how the application of a constant (i.e. make pixel X bigger by 1.5) improves anything with the floating point calculations that follow.
Peter, the images are randomly dithered, so all detail is sampled on a finer mesh than the pixel size of the sensor. It would be useful to rotate the sensor as well, if only to eliminate the factor of nearly root two along the diagonals of the pixels.
I just cannot see why you find this so difficult to understand.
If the seeing is perfect then this will give a resolution of about four seconds of arc rather than a bit over six seconds of arc. The best a single image with this optic and camera can do is 3.1 seconds of arc per pixel, which at two pixels per resolved element is really a resolution of 6.2 seconds of arc.
All I am claiming is that at best my resolution is four seconds of arc for all practical purposes.
It is even worse than this, even when seeing is as good as about 2 seconds of arc: long exposures and mount tracking errors of even 1 second of arc RMS will add to this blurring.
All this means that it is not just the undersampling of the sensor that is the limit of the final resolution. By using this method real practical gains can be made in resolution. After all the best that can be done is limited by the seeing and guiding errors.
The PMX rms guiding errors are less than one second of arc. So on a night of typical seeing say two seconds of arc my system can image to a resolution limited by seeing and guiding rather than inherent sensor resolution.
My choice of 600 mm focal length was quite calculated, as I am not really limited by seeing: even with the system working at its best, the seeing has to be worse than three seconds of arc to affect my imaging resolution.
Where this method of resolution enhancement is really useful is when binning 2x2 to get four times the sensitivity and well depth. Rather than doubling in size, your sensor 'pixels' only go up by a factor of root two.
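Bert's figures can be combined into a rough blur budget: treating seeing, guiding error, and the sampling limit as independent contributions that add roughly in quadrature. That quadrature rule is an approximation (real point spread functions don't combine exactly this way), and the numbers below are simply the ones quoted in the thread:

```python
# Rough blur budget from the thread's figures: 2" seeing, ~1" RMS guiding,
# 3.1 "/px sampling (~6.2" Nyquist-limited resolution for a single sub,
# ~4" claimed after the dither-and-stack enhancement). Quadrature addition
# is an approximation, not an exact PSF model.
import math

def total_fwhm_arcsec(*terms):
    return math.sqrt(sum(t * t for t in terms))

for label, sampling in (("single sub", 2 * 3.1), ("dithered stack", 4.0)):
    print(label, round(total_fwhm_arcsec(sampling, 2.0, 1.0), 1), "arcsec")
```

On those numbers the sampling term dominates either way (about 6.6" versus 4.6" total), which is consistent with Bert's point that seeing only starts to matter once it is worse than about three seconds of arc.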
Quote:
Originally Posted by Troy
I never once questioned the technique or theory. And my original question wasn't meant to imply that I did or didn't like the image. I was simply trying to understand why go to these lengths to get extra resolution when there are more fundamental things (mainly noise and clipping, as I agree that white balance is subjective and personal taste) that could be corrected in the image at normal resolution. Get that right first, then start worrying about resolution.
Again, just trying to understand the "why."
I shouldn't be surprised. I asked Bert here in another thread a while ago about why he was choosing to use his NII filter instead of Ha while he was still testing and trying to get flat fields. My reasoning for the question was innocent, just trying to understand.
His response was that many objects like the Helix Nebula emit strong NII, and that Ha filters wider than 3nm also pick up the NII spectrum. Correct statement, and I already knew that. But it didn't answer my question, and he wasn't imaging the Helix, he was imaging the Eta Carinae nebula or something that was strong in Ha, not NII.
I'm wondering if, had he used Ha instead of NII, he'd be getting stronger signal-to-noise and wouldn't have to stretch so much.
Again, I'm no expert. That's why I'm asking the questions: to understand, not antagonise.
Troy I am truly sorry for seeming to be impatient or abrupt. I must read genuine questions more carefully before responding.
Most emission objects are about 50:50 NII:HA. NII emission needs close hot blue stars. I am hoping that by making images in HA and NII I can find some differences.
Leakage due to angular spectral shifts is a very valid question. This can only be resolved by taking many images with the object/s in question at both the centre of the field and the edge at both NII and HA.
Thanks for the reply, Bert. I wasn't aware that it was 50/50 for emission objects. I've always read about the strong Ha, but NII is seldom mentioned other than planetary nebulae in the places I've seen.
Troy, what people have been calling HA for many years is really HA and NII lumped together. If their filter was 5 nm or greater in bandpass, they would be recording both.
Below is a spectral graph of their relative emissions, I assume for a 50:50 mixture of N and H. Both these spectral lines come from monatomic atoms, not molecules as H2 and N2 exist on Earth. It is a bit more complicated than just two emission lines.
It is only in the vast emptiness between stars that atoms exist without forming molecules; they really only interact with the odd passing electron, only to have a photon from a nearby star ruin the relationship much later. A new photon is always emitted when the electron falls back to the proton. I am sure you can see that a naked proton is more likely to interact with an electron than with another proton. To form even an H2 molecule you would need two protons and two electrons in the same place through multiple random collisions, which very rarely happens in deep space.
All the other atomic species, such as O, N and S, hang on to their electron/s and then undergo this same process as the naked proton (the Hydrogen atom) when the incoming photons have the energy to ionise them. It takes far more energy from the incident photon to reach states such as NII or OIII than to ionise a puny single proton, which is the Hydrogen atom. The wavelengths and quanta are nearly the same because the already accumulated electrons shield the larger nucleus of an O or N, so the outer electron has a similar quantised energy difference.
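The bandpass point is easy to check against the published line wavelengths (HA at 656.28 nm; the [NII] doublet at 654.80 nm and 658.35 nm). A small sketch, assuming an idealised top-hat filter centred on HA:

```python
# Which [NII] lines does an HA-centred filter of a given bandpass admit?
# Idealised top-hat filter: a line passes if it lies within bandpass/2
# of the centre. Wavelengths in nm.
HA, NII_LINES = 656.28, (654.80, 658.35)

def lines_passed(centre_nm, bandpass_nm):
    return [l for l in NII_LINES if abs(l - centre_nm) <= bandpass_nm / 2]

for bp in (3.0, 5.0):
    print(f"{bp} nm HA filter also passes [NII] at:", lines_passed(HA, bp))
```

This matches what is said above: a 5 nm filter records HA plus both [NII] lines, while a 3 nm filter only marginally clips the 654.80 nm line and excludes the 658.35 nm one.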
For dust and gas to form into stars and a planetary system, it all has to get very cold first so gravity can overcome the escape velocity of all the components, be they atoms, molecules or tiny bits of dust.
What fascinates me is that this is where we all came from! Uncountable atoms have interacted for billions of years so we can argue!!
I have not come very far from the ape at the Zoo that throws its own excrement at people he/I do not agree with!
For more see the FAQ at Astrodon.
Final note: when contemplating an image of a supernova remnant, you are looking at an image of your real ancient ancestors!
I know about the spectral lines and filters. I looked hard at this when I bought my Astrodons.
But that's got nothing to do with what the objects are actually emitting. What I'm asking is how you arrived at the objects emitting 50/50 Ha and NII, as opposed to what the filters can pass. From my reading it's more (some?) planetary nebulae than emission nebulae like Eta Carinae that emit NII strongly. It makes sense to me that star-forming regions would be stronger in H than N for the very reasons you mention.
I mean, just because you're wearing a baseball glove doesn't mean someone is throwing baseballs at you.
I have done some images in NII and HA and have generally found the relative intensities to be about the same. Just do a search on avandonk and NII and OIII.