View Full Version : More Resolution from Your Sensor
avandonk
09-04-2010, 10:52 AM
You can get more resolution from your sensor if you dither between exposures and upsize the images before stacking.
In the case of my Canon 5DH with a pixel size of 8.2 microns and a 300mm lens at f/3.5, the Airy disc is 4.5 microns in diameter at green wavelengths. So theoretically, if the lens were perfect, i.e. diffraction limited, the lens has more resolution than the sensor. The situation is actually worse than this because of the Bayer matrix and anti-aliasing filter.
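(For anyone who wants to check the numbers, here is the back-of-envelope calculation in Python, assuming roughly 530 nm for green light:)

# Airy disc (first-minimum) diameter vs pixel pitch - rough check
wavelength_um = 0.53        # green light in microns (assumed value)
f_ratio = 3.5               # 300mm lens at f/3.5
pixel_um = 8.2              # Canon 5DH pixel pitch in microns
airy_um = 2.44 * wavelength_um * f_ratio
print(airy_um, pixel_um)    # ~4.5 vs 8.2: the lens out-resolves the sensor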
For dim stars your optical system actually does better than the diffraction limit suggests, as the Airy disc is not uniform but peaked at the centre.
I upsized by a factor of 1.6 before stacking each exposure stack.
I further enhanced the final HDR image with slight star reduction and RL enhancement after enlarging further by a factor of 1.4. This is a total enlargement from the native pixel size of the sensor by a factor of 2.24 (1.6 x 1.4). In sensor terms it is the equivalent of going from 12.7MP to 64MP (12.7 x 2.24^2 is about 64)!
Here is the final image downsized to the native pixel dimensions. 4000x2655 pixels 6MB
http://d1355990.i49.quadrahosting.com.au/2010_04/carnehdrSN.jpg
Here is the full size enhanced image 8834x5864 pixels 18.7MB
http://d1355990.i49.quadrahosting.com.au/2010_04/carnehdrM.jpg
If you want to get a maximum quality version of the full size (for a jpg) here is a 40MB version
http://d1355990.i49.quadrahosting.com.au/2010_04/carnehdrL.jpg
You will have to zoom in using your favourite image viewer on the downloaded images to see the vast improvement.
I also used a new version of EasyHDR to redo the image. The controls are now better and easier to use.
Bert
Mighty_oz
09-04-2010, 04:58 PM
Fantastic, just what the Dr ordered :) I was just wondering what your high-res versions would look like, after just going over your Centaurus pair again :)
TYVM :)
Bassnut
09-04-2010, 06:03 PM
That's very interesting Bert. I often upsample in PS before processing to great effect, but haven't tried it before stacking. The last image is indeed a huge improvement.
How do you handle stacking large numbers of huge upsampled DSLR images though? Your RAM size would have to be obscene. Or do you stack with IP, i.e. off disk?
Despite this counter-intuitive process, i.e. "can't get detail that ain't there", I found processing "recovers" detail that is buried and is arguably retrieved by upsampling, as you have shown.
telecasterguru
09-04-2010, 06:14 PM
Bert,
A most interesting process and I am not sure I understand all that is happening.
I must admit though that the image is amazing.
Can you explain in lay terms exactly what your process is? EasyHDR works by stacking a number of different exposure lengths to extract all of the light possible from the images, doesn't it?
Frank
multiweb
09-04-2010, 06:41 PM
I understand upsampling prior to processing in PS can help with specific things such as sharpening or reducing stars, because you have more pixels to play with, but I'm not sure how you get more detail out of the data. I mean, isn't the deconvolution algorithm what's giving you more detail in the nebs, working its way inwards, besides tightening the stars too?
Bassnut
09-04-2010, 06:58 PM
Well, that's a good question Marc. I've often wondered and have experimented muchly. It is counter-intuitive, but I swear, empirically anyway, upsampling works and "appears" to increase detail (other than just having more pixels to play with, or I'm kidding myself and that's the only effect, I'll admit that).
I understand how decon works, it mathematically reverses a known blur process, which doesn't have much to do with upsampling, granted.
Perhaps the upsampling allows all other processing to proceed more accurately, and the "more pixels to play with" does actually increase detail, not just a visual effect.
Anyway, visual effect in the end is what we are after, so regardless, it works for me ;).
multiweb
09-04-2010, 07:07 PM
Actually, now I think about it, I always upscale my raws because when I debayer, CCDStack interpolates each channel to the native resolution. Green looks best because I have two G in the Bayer matrix, and R & B are slightly less sharp.
Bassnut
09-04-2010, 07:16 PM
And upsampling before stacking is a whole subject in itself. I imagine that would improve accuracy and improve noise reduction (with more pixels to play with). Bert's idea of dithering then upsampling makes sense.
In fact, and I'm not sure how to express this, if you could capture into a larger file size than the image being captured, then with dithering you are sampling at a much higher res, by "filling" the "gaps" in the larger grid with repeated exposures. Does that make sense? A bit like a mosaic, only at the pixel level.
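Something like this rough sketch of the idea, maybe (illustrative Python/numpy only, not how any actual stacking program does it):

import numpy as np

def stack_on_finer_grid(frames, offsets, scale=2):
    """Accumulate dithered subs onto a grid 'scale' times finer.

    frames  : list of 2D arrays, all the same shape
    offsets : list of (dy, dx) dither offsets in original pixels
    """
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)
    yy, xx = np.mgrid[0:h, 0:w]          # original pixel centres
    for frame, (dy, dx) in zip(frames, offsets):
        # where each original pixel lands on the fine grid for this dither
        fy = np.clip(np.round((yy + dy) * scale).astype(int), 0, h * scale - 1)
        fx = np.clip(np.round((xx + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (fy, fx), frame)
        np.add.at(weight, (fy, fx), 1.0)
    return acc / np.maximum(weight, 1)   # gaps stay at 0 where nothing landed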
multiweb
09-04-2010, 07:19 PM
Well actually you're right, because I always got better results doing noise rejection by upsampling the red in Ha rather than binning 2x2. And I always dither. I mean always. I make a point of it. It makes a hell of a difference for me.
avandonk
09-04-2010, 07:22 PM
I have 12GB of RAM with an intel i720. ImagesPlus is now 64 bit. Would you believe I can open 40 fits images from the 5DH and manipulate the lot.
I could go into a rave about the mathematics but I can assure you it is all real.
The image is the proof!
Bert
Bassnut
09-04-2010, 07:30 PM
Hehe, that's obscene, way over the top, just love it. I'm tempted to go 64-bit and 12GB RAM just for processing 100s of upsampled subs.
Yes, you sure have the proof Bert, nice work.
Bassnut
09-04-2010, 07:39 PM
Well, I found capturing bin2 then oversampling gave the signal and resolution increase required, with less noise.
When you say "rather than bin2", do you mean in-camera bin2 or software bin2? Software bin2 has no read-noise advantage; in-camera bin2 reduces read noise.
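Roughly, and ignoring everything except read noise (the 10 e- figure below is just an assumed example):

# 2x2 binning read-noise comparison (illustrative numbers only)
read_noise = 10.0                  # e- per read, assumed
cam_bin2 = read_noise              # charge summed on chip, read once
soft_bin2 = read_noise * 4 ** 0.5  # four separate reads add in quadrature
print(cam_bin2, soft_bin2)         # 10.0 vs 20.0 e-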
avandonk
09-04-2010, 07:40 PM
OK, there are a few factors at work here. By dithering you immediately improve your resolution by a factor of one over root two. This is due to oversampling: more frames, better definition.
By upsizing (always use bicubic) before stacking you will get better resolution merely because the arithmetic is better. Note that dithering when collecting is what makes this work.
The lens has far better resolution than the sensor, so we are only getting back something that already exists. You cannot make a lousy optic better.
I could go further but you get the idea. With star reduction algorithms and RL enhancement we have made the blocky 12.7MP sensor perform like a 64MP sensor.
All of this is mathematically valid.
It is the oversampling that is the secret. Not only does it increase signal to noise but resolution as well.
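If it helps, the basic idea looks something like this in Python (a sketch only, assuming the dither offsets are already measured; the real workflow here is ImagesPlus and DSS):

import numpy as np
from scipy import ndimage

def upsize_and_stack(frames, offsets, factor=1.6):
    """Bicubic-upsize each dithered frame, re-register, then median stack.

    frames  : list of 2D arrays (calibrated subs)
    offsets : list of (dy, dx) dither shifts in original pixels,
              measured relative to the first frame
    """
    stack = []
    for frame, (dy, dx) in zip(frames, offsets):
        big = ndimage.zoom(frame, factor, order=3)                 # bicubic upsize
        aligned = ndimage.shift(big, (-dy * factor, -dx * factor), order=3)
        stack.append(aligned)
    return np.median(np.stack(stack), axis=0)                      # simple median combine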
Bert
multiweb
09-04-2010, 07:45 PM
Cam bin2x2. I get very noisy pictures. Probably because I collect all of it from the Gs & B when I do Ha. So I upsample Ha(red) alone then reject and stack.
multiweb
09-04-2010, 07:47 PM
Cool - I wasn't aware dithering was so important for resolution as well. I did it mainly for better data rejection and hot-pixel removal when stacking.
Bassnut
09-04-2010, 08:05 PM
Yes, I would like the math Bert, even though I will struggle with it.
Dithering, as I understand, is primarily to eliminate sensor defects.
Even if you upsample, stacking will just align the subs (cancelling the dither) and produce the same result as a sub from a fault-free sensor.
I could see dithering increasing res if, say, a sub was captured at 4 times (or more) the image res into a fixed, non-dithered file, so that sub-pixels "filled up" a larger pixel space - which is not possible now (that I know of).
Not doubting your math Bert, just asking :thumbsup:
avandonk
09-04-2010, 08:24 PM
I will write a full mathematical explanation tomorrow, Fred.
The data for this image has been pushed.
Your interest is of course great, as you can improve your resolution by a factor of nearly two, I estimate, just by careful data acquisition and processing.
Fred your system is limited by seeing. I will have to think carefully how I can improve your already scary resolutions.
Bert
Hey Bert, nice info. I routinely upsample my raw planetary data before feeding it into Registax etc. I read somewhere that the sampling theorem lets you get something like 2x the native resolution of your sensor if you have a lot of frames and the target is moved a little on each one :-)
Upsampling before processing certainly helps me, good to see it here as well.
cheers, Bird
bojan
09-04-2010, 09:02 PM
So, that means bad tracking is actually a blessing in disguise ;) (just joking.. I see what you mean)
kinetic
10-04-2010, 09:17 AM
Bert, all,
that's why I always hold on to old data :)
I've been doing the upsample to my DSI II stuff pre-stack for a while now because it seems to work, but only on good nights with lots of subs, as has been mentioned.
I don't really understand the maths, but I know it works for the planetary guys as Bird says, so why shouldn't it work for LX?
The DSI software has the dither function as an option and explains a bit about star overlap over a pixel grid and interpolation.
Here is an example of my results when I upsample 2x with a B-spline algorithm before stacking.
Even though the input raws are done on a night of EXCELLENT seeing, it only works when there is some slight change between frames due to tracking and seeing in combination. I don't know if it is a combination of the focal length, sub length (5s in this case) and arcsec per pixel, but the result definitely improves fine detail.
Steve
avandonk
12-04-2010, 06:16 AM
The sensor samples the image at the pixel spacing along the sensor edges. At 45 degrees it is sampling at 1.414 x the pixel spacing. Dithering randomly then gives you sampling along the diagonals at the pixel spacing as well. With more frames the image is sampled uniformly at the pixel spacing, and then with stacking the resolution actually exceeds the pixel spacing. Note the real attainable resolution is twice the pixel spacing due to the Nyquist theorem. In reality it is a bit worse than this.
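(Putting rough numbers on that for the 8.2 micron pixels of the 5DH:)

pixel_um = 8.2
diagonal_um = pixel_um * 2 ** 0.5   # ~11.6 um between samples at 45 degrees
nyquist_um = 2 * pixel_um           # ~16.4 um: smallest resolvable period
print(diagonal_um, nyquist_um)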
In the case of the 300mm lens the stars are undersampled so any enhancement without dithering will result in lovely square stars. Upsizing does not help as the squares just get bigger.
Here is an animated gif showing the difference with dithered and not. 160k
http://d1355990.i49.quadrahosting.com.au/2010_04/card.gif
The effect is more dramatic than reality as the conditions are not identical. I have looked for data that was not dithered to compare to. The only definitive way to do it is to collect a set of exposures without dithering and with on the same night so all other conditions are the same.
Below is a crop of a single frame at native pixel size from the dithered set of corrected tiffs. Note how the stars are square or very blocky. The second image is what it looks like upsized X1.6 by cubic interpolation.
The third is twenty stacked upsized dithered frames. I think the improvement is obvious. The stars are far rounder in the stacked image especially the small dim ones.
These frames were all screen captures so that the pixels in the images did not change from what was actually there.
Bert
avandonk
12-04-2010, 06:19 AM
I use Guidemaster, which can offset the guide star between exposures randomly by a specified number of guide-camera pixels. It then guides at the new position of the guide star.
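The dither schedule itself is trivial; something along these lines (illustrative only - Guidemaster does this internally):

import random

def dither_offsets(n_frames, max_pixels=10):
    """Random guide-star offsets (in guide-camera pixels), one per exposure."""
    return [(random.uniform(-max_pixels, max_pixels),
             random.uniform(-max_pixels, max_pixels)) for _ in range(n_frames)]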
Here is an animated GIF of 200% crops of jpgs straight out of the camera, of the exact same area of sensor. 3.8MB
http://d1355990.i49.quadrahosting.com.au/2010_04/crop1.gif
You can see the hot pixels do not move.
Dithering also really helps with noise reduction.
Below is a stack of these ten crops. Notice the hot pixels are gone even without correcting with darks. And the same stack with levels adjusted. Note this was done with cruddy 8-bit jpgs.
By moving by many pixels it also smooths out any noise due to the sensor. It is pointless stacking the same noisy pixel on top of itself, as you just enhance the noise, or the 'hole' produced by over-subtraction of noise by temperature-mismatched darks.
Bert
avandonk
12-04-2010, 08:01 AM
This might be better to show what is happening.
First image is a crop of a single frame at native pixel size.
Second is a stack of twenty of these dithered frames, stacked in DSS.
Third image is the same crop of a single frame, upsized x1.6 by cubic interpolation.
Fourth is a stack of twenty of these dithered, upsized frames, stacked in DSS.
They are all screen captures to show pixel structure.
Bert
bojan
12-04-2010, 08:23 AM
Bert,
This fantastic algorithm should be "patented" :-)
Actually, it may be worth talking to IRIS and/or DSS developers... to incorporate this functionality into software (or to develop script), to automate the procedure.
Once more, fantastic !!
avandonk
12-04-2010, 09:07 AM
Bojan it is too late to patent even if I wanted to as it is now in the public domain. I would be interested to see how this works at shorter focal lengths for real wide fields. Too much to do and not enough clear skies. I am retired now and do not ever want to go through the patent process again. Even with a patent a big rich company can send you broke by eternal litigation. Unless you have deep pockets it is better not to patent. Just keep quiet and die with the secret.
My ethos is to put back in as much as I can into the astronomical community as I have taken far more than I can ever give back.
That last set of images really shows what is happening without invoking any mathematics. I always knew the 300mm lens was better than my sensor. In hindsight the solution is now obvious. It was just a matter of getting the correct tools and skills.
Bert
bojan
12-04-2010, 09:14 AM
Perhaps I should have put this word in parentheses.. :D It's been corrected now.
I think my 100mm and 50mm Canon FD SSC lenses are better than they look at first glance as well.
Perhaps when I find some time I will try your method on images taken with them (and first of all, I have to take them..)
Octane
12-04-2010, 09:18 AM
Isn't this the same as "drizzling" or am I on the wrong planet?
H
avandonk
12-04-2010, 09:30 AM
Yes, mathematically it is. It is a better practical method because it also corrects for the defects of one-shot colour sensors that are not cooled to -40C as CCDs are, given their poor performance anywhere near room temperature.
There is really nothing new here; it is just a method that does it all better.
Some people advocate sub-pixel dithering; for that, the ever-present drift is sufficient. For the reasons already stated, it is better to dither by many pixels.
I should note again that I correct for flats and darks with the data in fits files. I then also stretch in the fits files. Only then do I interpolate to the TIFF files. This gives data that is far less corrupted or riddled with artefacts.
Bert
Bassnut
12-04-2010, 11:26 AM
Excellent and very interesting Bert. Thanks for putting the time into this presentation.
Martin Pugh
12-04-2010, 01:11 PM
Hi Bert
excellent work and presentation.
I have just posted an image of the fox fur - 7.5 hours - so, let me try this procedure and post the result.
So, to get it right: My images are calibrated and aligned (they were dithered during capture).
I will upscale 2x each image (I suspect I can use Photoshop or CCDSTACK or Maxim to do this) and then stack.
I can then make a direct comparison with the image I processed tonight, without the upscaling.
cheers
Martin
allan gould
12-04-2010, 01:48 PM
Bert
Is there a program you use to batch-upsize your files before stacking etc?
Thanks
Allan
Martin Pugh
12-04-2010, 02:00 PM
Hi Bert
well, my initial test was unsuccessful.
The resultant FWHM of stars in the sum-combined image (made from images upscaled before the combination) was larger by 33%, and visibly obvious when blinking the images. (This is FSQ/STX data, 3.5 arcsec/pixel, so well undersampled.)
I used CCDSTACK and a quadratic B-spline algorithm to upscale, then I used the same procedure to downscale the image at the end. I have heard that a B-spline algorithm smears light in stars - which could explain it.
Thoughts?
cheers
Martin
avandonk
12-04-2010, 02:41 PM
What do you mean by sum combined, Martin? To get any sort of enhancement you need at least a median combine. How many images did you stack?
I use ImagesPlus to upsize by a factor of 1.6 using bicubic interpolation. I then stack with DSS using Median Kappa-Sigma.
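For anyone unfamiliar with it, kappa-sigma rejection works roughly like this sketch (DSS's own implementation will differ in detail):

import numpy as np

def kappa_sigma_median(frames, kappa=2.0, iterations=3):
    """Median combine with kappa-sigma rejection, pixel by pixel.

    frames : sequence of aligned 2D subs, all the same shape
    """
    data = np.array(frames, dtype=float)
    mask = np.ones_like(data, dtype=bool)             # True = pixel kept
    for _ in range(iterations):
        med = np.nanmedian(np.where(mask, data, np.nan), axis=0)
        std = np.nanstd(np.where(mask, data, np.nan), axis=0)
        mask &= np.abs(data - med) <= kappa * std     # reject outliers
    return np.nanmedian(np.where(mask, data, np.nan), axis=0)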
Is the BMP image you put up at native pixel size?
Bert
allan gould
12-04-2010, 09:21 PM
Bert
I tried your technique by resizing with Nebulosity before stacking in DSS (no drizzle in the DSS stacking - second photo) and compared it to stacking and drizzle resizing with DSS (first photo). There is definitely better retention of fine detail, which can be brought out by your technique, compared to what I was previously using. Many thanks Bert, as I'm now a convert.
Allan
avandonk
13-04-2010, 08:17 AM
Allan, I would be interested in seeing 100% crops. What was the focal length and pixel size? The real test is whether other people find that it works for them and it is not just a figment of my constant cloud-addled mind.
Bert
Terry B
13-04-2010, 09:13 AM
I assume that this technique is mostly useful with undersampled images, just like "drizzle".
It makes little difference if you have well- or over-sampled images.
allan gould
13-04-2010, 09:20 AM
Bert
Don't know if this answers your question, but the focal length of the imaging system was 1600mm and the pixel size 5.4 microns. The photos presented were 100% full frames.
Allan
bojan
13-04-2010, 09:38 AM
I think Bert wanted to see 100%-sized crops (so that every original pixel is visible) and not the 100% frame (which is originally 3888 x 2592, but in your presentation it is scaled down significantly).
Martin Pugh
13-04-2010, 10:38 AM
Bert
I sum-combined the images because I wanted to make a direct comparison with the image I posted yesterday, which was sum-combined.
Regardless, I can go back and repeat with a median combine.
I have 23 images in the stack (and even my 64-bit 8GB RAM machine nearly died in the process).
The image I posted yesterday is at the native resolution - but I did not post the result from my attempt at your technique last night.
I have DSS... so I will upscale using CCDSTACK at 1.6 times with bicubic, then use DSS and the algorithm you list to median combine.
let you know how it goes.
cheers
Martin
Martin Pugh
13-04-2010, 11:50 AM
Bert
I failed again!
I upscaled by 1.6x using CCDSTACK (bicubic), but I could not median combine in DSS; it crashed every time.
So I used sigma-clipping combine in Maxim, but the result was the same as posted earlier - i.e. bloated FWHM.
Still trying other methods for now
cheers
Martin
tonybarry
02-05-2010, 09:38 PM
I read this thread with interest. If I understand correctly, this is Drizzle stacking in essence - the optical system has a certain resolution, the CCD undersamples this resolution, so by moving the CCD slightly between exposures, the stacking process can take advantage of a higher (optical) resolution. For Drizzle, the subs were supposed to be progressively rotated for the maximum effect (field rotation does well here).
However, if I am in error, I would be pleased to have a correction.
Regards,
Tony Barry