ICEINSPACE
30-01-2013, 02:04 PM
Registered User
Join Date: Oct 2011
Location: U.S.A
Posts: 755
Quote:
Originally Posted by Poita
I'm assuming he is using a super-resolution technique utilising sub pixel shifts in multiple images.
We do it for feature film reconstruction when negatives are lost. I imagine you could do the same for astro images.
I'd like to know which software though.
http://users.soe.ucsc.edu/~milanfar/...al/SRfinal.pdf
I think that is a process called "drizzling", not to be confused with dithering.

30-01-2013, 08:04 PM
avandonk
Join Date: Aug 2005
Location: Melbourne
Posts: 4,786
It is simple, folks: just look up the sampling theorem and the Nyquist theorem. Shannon's theorem also helps.
This only works mathematically if your optic has inherently better resolution than your sensor. It has nothing to do with fractional pixel movements, but with redundancy in sampling. It is immaterial how big the dither steps are, as long as they are random.
Your resolution is not your pixel size but actually twice this at best. At forty-five degrees to your pixel separation it is actually worse, by a factor of nearly root two.
So, to make it simple: with nine micron pixels at an image scale of 3.1 seconds of arc per pixel, a single image has at best 6.2 seconds of arc resolution. With dithering and upsizing this can come back to about 4.4 seconds of arc resolution at best. To display this you need at least 2.2 seconds of arc per pixel! If you do the simple calculation, a 6000x6000 pixel image from my system is a tad over 2 seconds of arc per pixel.
Trying for more resolution with smaller pixels, say 6 micron, will at best give you about the same without dithering. Meanwhile you have given up well depth and dynamic range.
It is all about balancing the variables.
Another factor is that resolution is limited by diffraction, which is purely dependent on the F ratio of the optic. It is not solely dependent on focal length and/or aperture.
By the way, trying to see a difference in 'resolution' between eight-bit JPEGs is a futile exercise.
You have to compare the original 200MB 16-bit TIFFs for that.
Bert
Last edited by avandonk; 30-01-2013 at 09:26 PM.
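As a sanity check, Bert's figures follow from two relationships stated in the post: single-frame resolution is twice the pixel scale, and dither-plus-upsize recovers roughly a factor of root two. A quick sketch of the arithmetic (the relationships themselves are the post's claims, not established here):

```python
import math

pixel_scale = 3.1                        # arcsec per 9-micron pixel
single_frame = 2 * pixel_scale           # two-pixel limit: 6.2 arcsec
dithered = single_frame / math.sqrt(2)   # ~4.4 arcsec after dither + upsize
display_scale = dithered / 2             # need <= ~2.2 arcsec/pixel to show it

print(round(single_frame, 1), round(dithered, 1), round(display_scale, 1))
# -> 6.2 4.4 2.2
```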

30-01-2013, 08:56 PM
avandonk
Join Date: Aug 2005
Location: Melbourne
Posts: 4,786
Quote:
Originally Posted by Peter Ward
Bert, it's been great seeing your journey with fast wide field optics and CCD acreage but....
.... I'd suggest you don't push the data so hard... to the point it looks like Tech Pan 2415?
Martin (Pugh) does this sublimely....crafty bugger...but I have no idea how he does it.
anyway I digress... Smooth the data a little: shadows and highlights should show a gradual transition and structure rather than a stark contrast.
Just my 2 cents worth 
I am just a beginner, Peter! Of course I want to show the bling! Subtlety will come with age. At least that is what my mother told me fifty years ago.
I am getting results that are not quite what I am used to.
Bert

30-01-2013, 09:04 PM
Galaxy hitchhiking guide
Join Date: Dec 2007
Location: The Shire
Posts: 8,453
Quote:
Originally Posted by avandonk
So to make it simple with nine micron pixels at an image scale of 3.1 seconds of arc. A single image has at best 6.2 seconds of arc resolution. .....
Bert
OK, I'll bite.
Nyquist states you can *re*-construct an analog waveform by taking small discrete steps to approximate the original smooth wave, using approximately half the wavelength as your step size.
This is not the same as mathematically doubling your pixel size and saying 6 arc seconds is all you can resolve.
Two point sources in perfect seeing, 6 arc seconds apart, will fill not one but two pixels.
But by simple inspection of the image, you can confidently state, with a lone illuminated pixel, that nothing lies beyond its 3 arc-sec patch of sky.

30-01-2013, 09:43 PM
avandonk
Join Date: Aug 2005
Location: Melbourne
Posts: 4,786
Quote:
Originally Posted by Peter Ward
OK I'll bite.
Nyquist states you can *re*-construct an analog waveform by taking small discrete steps to approximate the original smooth wave, using approximately half the wavelength as your step size.
This is not the same as mathematically doubling your pixel size and saying 6 arc seconds is all you can resolve.
Two point sources in perfect seeing, 6 arc seconds apart, will fill not one but two pixels.
But by simple inspection of the image, you can confidently state, with a lone illuminated pixel, that nothing lies beyond its 3 arc-sec patch of sky.
It is not about biting anyone or being bitten. I only went up by a factor of root two. You are also missing the point, Peter: it is about sampling. This gives a finer mesh than nine micron pixels.
Do you really want me to invoke my old mate Fourier? I can give you a treatise on all of this. It would be a waste of time, as anyone who has studied advanced mathematics would be totally familiar with all of this.
Hand-waving statements about the state of a single pixel are not only simplistic but totally meaningless.
Bert

30-01-2013, 10:50 PM
Galaxy hitchhiking guide
Join Date: Dec 2007
Location: The Shire
Posts: 8,453
Quote:
Originally Posted by avandonk
It is not about biting anyone or being bitten. I only went up by a factor of root two. You are also missing the point, Peter: it is about sampling. This gives a finer mesh than nine micron pixels.
Do you really want me to invoke my old mate Fourier? I can give you a treatise on all of this. It would be a waste of time, as anyone who has studied advanced mathematics would be totally familiar with all of this.
Hand-waving statements about the state of a single pixel are not only simplistic but totally meaningless.
Bert
Bert... I did some math at Uni in my previous life... Are you seriously suggesting a 9 micron pixel @ 600mm can't resolve down to 3 arc sec?
Of course it can.
And, sure, in a vacuum (a la Hubble) drizzling can and does work.
The point I think you may be missing is: you can't (well, let's say it would be very difficult) model/predict the seeing/turbulence from shot to shot for sub-pixel sampling shifts to contribute any significant improvement to the resolution.
Sure, with thousands of frames and some serious super-computing power... maybe... but I'd suggest seeing would still swamp any pixel-shift terms.
Hence the two images you have posted look identical in every respect except scale... which is why I'm puzzled as to why you'd bother re-scaling the data.
Last edited by Peter Ward; 30-01-2013 at 11:30 PM.
Reason: typo

30-01-2013, 11:31 PM
avandonk
Join Date: Aug 2005
Location: Melbourne
Posts: 4,786
Peter, it was not about the stacked data sets. It was about the resolution of individual frames versus the resolution of the upsized stacked frames.
Bert

31-01-2013, 10:47 AM
Registered User
Join Date: Oct 2011
Location: U.S.A
Posts: 755
Quote:
Originally Posted by avandonk
Peter it was not about the stacked data sets. It was about the resolution of individual frames versus the resolution of the upsized stacked frames.
Bert
I don't see any resolution improvement with this technique, just a larger image. However, it has been written that star alignment can be improved in certain software packages by upsizing the individual sub-exposures by 2x prior to the combine, then downsizing to the native resolution after the combine.
j
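The upsize-combine-downsize workflow described above can be sketched in a few lines (this is not any particular package's implementation; the 2x nearest-neighbour upsample and block-mean downsample are simple stand-ins for whatever interpolation the software actually uses):

```python
import numpy as np

def upsample2(img):
    # Nearest-neighbour 2x upsample: each pixel becomes a 2x2 block.
    return np.kron(img, np.ones((2, 2)))

def downsample2(img):
    # 2x2 block mean back to native resolution.
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

rng = np.random.default_rng(0)
subs = [rng.random((4, 4)) for _ in range(5)]     # fake sub-exposures
stacked = np.mean([upsample2(s) for s in subs], axis=0)
final = downsample2(stacked)                      # back to 4x4
```

Notably, with pure nearest-neighbour resampling the round trip is lossless: `final` equals the plain stack of the subs. Any real gain has to come from sub-pixel alignment and interpolation during registration, not from the resize itself, which is consistent with the observation above that the result is otherwise "just a larger image".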

31-01-2013, 10:49 AM
Registered User
Join Date: Oct 2011
Location: U.S.A
Posts: 755
Hey, there was just a knock at the door. My super 3nm Ha filter from AstroDon has just arrived and it's clear tonight!!! They'll be wishing their mothers never met their fathers when I get RH300 results with this.
j

31-01-2013, 11:14 AM
Registered User
Join Date: Jun 2011
Location: NSW Country
Posts: 3,586
Bert, are you talking about introducing dithering to counter the effects of amplitude quantisation, to get finer-than-camera-pixel detail? (I think I'm on the wrong track here.)
I'm trying to understand exactly what the process is.

31-01-2013, 11:27 AM
Registered User
Join Date: Jul 2005
Location: Melbourne Australia
Posts: 957
My BS detector has gone off the scale on some of the information in this thread.
Lots of fancy mathematical terms being thrown about without any actual information being delivered.
I'd love to hear in plain English the process you undertake to deliver the outcome you are describing, i.e. what steps in which software package.
I certainly have used the technique of upsizing my final luminance 2x and then running deconvolution on this 2x-sized file. This works well on undersampled data and gives nice star shapes, as opposed to the diamond shape you can get running deconvolution on smaller stars. You then downsize again after this and combine with the RGB data.
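For illustration of the deconvolution step in that workflow, here is a toy 1-D Richardson-Lucy deconvolution (a common choice for star sharpening; the post does not say which algorithm or software is actually used):

```python
import numpy as np

def richardson_lucy_1d(blurred, psf, iters=30):
    # Minimal 1-D Richardson-Lucy deconvolution: iteratively refine an
    # estimate so that, blurred by the PSF, it matches the observed data.
    psf_flipped = psf[::-1]
    estimate = np.full_like(blurred, blurred.mean())
    for _ in range(iters):
        conv = np.convolve(estimate, psf, mode="same")
        ratio = blurred / np.maximum(conv, 1e-12)   # guard against /0
        estimate = estimate * np.convolve(ratio, psf_flipped, mode="same")
    return estimate

# Blur a single spike (a "star") with a small kernel, then restore it.
psf = np.array([0.25, 0.5, 0.25])
truth = np.zeros(21)
truth[10] = 1.0
blurred = np.convolve(truth, psf, mode="same")
restored = richardson_lucy_1d(blurred, psf)   # peak sharpens back toward 1
```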

31-01-2013, 11:45 AM
Registered User
Join Date: Jun 2011
Location: NSW Country
Posts: 3,586
Chris, I get the concept of upscaling images before processing them: you get a finer resolution for the processing to work within. It makes perfect sense and gives noticeably better results.
I'm also familiar with adding dither to an analogue signal to get better-than-A/D resolution out of it (i.e. improving on the effects of quantisation), but I am missing something with what Bert is doing here.
The bit I am missing is what the actual process is. Is the image being dithered, then processed in some way, and then returned to its original resolution?
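The analogue-dither idea mentioned above is easy to demonstrate: quantise a level that sits between two quantiser steps, with and without random noise added first, and compare the averages. A minimal sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
true_level = 0.3      # a signal level sitting between two quantiser steps
n = 10000

# Without dither, every sample quantises to the same wrong value (0).
plain = np.round(np.full(n, true_level))

# With +/- half a step of random dither added before quantising,
# the average of many samples recovers the sub-step level.
dithered = np.round(true_level + rng.uniform(-0.5, 0.5, n))

# plain.mean() is exactly 0.0; dithered.mean() comes out close to 0.3
```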

31-01-2013, 01:41 PM
Registered User
Join Date: Jun 2011
Location: NSW Country
Posts: 3,586
I just re-read the thread, Bert is upscaling and dithering before stacking. That makes perfect sense.

31-01-2013, 02:15 PM
Registered User
Join Date: Jul 2005
Location: Melbourne Australia
Posts: 957
Quote:
Originally Posted by Poita
I just re-read the thread, Bert is upscaling and dithering before stacking. That makes perfect sense.
How do you dither after the image is captured? Dithering is done between frames while capturing.

31-01-2013, 02:42 PM
Registered User
Join Date: Jun 2011
Location: NSW Country
Posts: 3,586
|
Quote:
Originally Posted by cventer
How do you dither after image is captured ? Dithering is done between frames while capturing.
I assume he is dithering by slight movement between captures, then upscaling the images, then stacking and processing, in that order.
Not capturing the image and then injecting noise.


31-01-2013, 03:28 PM
Registered User
Join Date: Jul 2005
Location: Melbourne Australia
Posts: 957
OK, I see the technique now. Not sure I am convinced, but I will try it myself, as I always dither by several pixels between sub-frames.
The only danger with this technique is that you need to reject hot pixels before you upsize, as upsizing will blur them into looking like stars.
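A minimal sketch of the hot-pixel rejection described above, assuming a simple median-of-neighbourhood test (real stacking packages offer more sophisticated rejection schemes):

```python
import numpy as np
from scipy.ndimage import median_filter

def reject_hot_pixels(frame, nsigma=5.0):
    # Replace pixels far above their 3x3 neighbourhood median with that
    # median, so lone hot pixels never get smeared into star-like blobs
    # by the later upsize.
    med = median_filter(frame, size=3)
    resid = frame - med
    return np.where(resid > nsigma * resid.std(), med, frame)

frame = np.ones((8, 8))
frame[3, 4] = 50.0                # one hot pixel in a flat field
clean = reject_hot_pixels(frame)  # hot pixel replaced by its neighbours
```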

31-01-2013, 04:36 PM
Highest Observatory in Oz
Join Date: May 2006
Location: Canberra
Posts: 17,631
Quote:
Originally Posted by cventer
My BS detector has gone off the scale on some of the information in this thread.

31-01-2013, 05:37 PM
PI cult recruiter
Join Date: Apr 2010
Location: Brisbane
Posts: 10,584
Last edited by RickS; 31-01-2013 at 05:39 PM.
Reason: Added additional link

01-02-2013, 01:44 PM
avandonk
Join Date: Aug 2005
Location: Melbourne
Posts: 4,786
It is very simple, folks. After collecting frames with dither:
1. Correct for darks and flats.
2. Use the filter thingy in ImagesPlus to get rid of column defects.
3. Upsize by a factor of 1.5. This is close enough to a factor of root two.
4. Stack these upsized images.
You will then find, as RickS's post has shown, that your resolution is far better than in any individual frame.
This only works if your optic has better resolution than your sensor can resolve.
There are only two rules in our Universe!
1. There is no free lunch.
2. If something sounds too good to be true, it is not.
This is not magic or sleight of hand. It is mathematically rigorous interpretation of the data. You all accept noise reduction by stacking multiple images.
Why is the concept of resolution enhancement so foreign?
Bert
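The four steps above can be sketched as follows. This is a toy illustration, not the ImagesPlus workflow: the column-defect filter of step 2 is package-specific and omitted, `scipy.ndimage.zoom` stands in for whatever resampler the software uses, and the sub-pixel registration of the dithered frames is not modelled.

```python
import numpy as np
from scipy.ndimage import zoom

def calibrate(light, dark, flat):
    # Step 1: dark subtraction and flat-field division.
    return (light - dark) / flat

def upsize_and_stack(frames, factor=1.5):
    # Steps 3 and 4: upsample each calibrated frame by ~root two
    # (bilinear), then mean-combine. Real software would first align
    # the dithered frames to sub-pixel accuracy; omitted here.
    return np.mean([zoom(f, factor, order=1) for f in frames], axis=0)

rng = np.random.default_rng(1)
dark = np.zeros((8, 8))
flat = np.ones((8, 8))
lights = [np.full((8, 8), 100.0) + rng.normal(size=(8, 8)) for _ in range(4)]
stacked = upsize_and_stack([calibrate(l, dark, flat) for l in lights])
# 8x8 frames come out as a single 12x12 stack with reduced noise
```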