Finally getting some planetary images I'm happy with
While the seeing wasn't great from my home in inner-western Brisbane last night (13 April), it was nice and dark and clear, so I set up my 200 mm LX90 (GoTo Alt-Az mount) with a ZWO ASI120MC (one-shot colour), fired off a few 30-second video captures with FireCapture, then stacked and sharpened in AutoStakkert.
Jupiter was nice and high - and the GRS came into view nicely later in the evening. Mars and Saturn were low in the sky, sitting in the glare of Brisbane City to the east (and directly over my roof), so I hope to get some better shots in a couple of months when they're a bit higher as well.
My best videos were taken without a Barlow - I just couldn't seem to get quite as sharp focus with my 2x Barlow. Maybe I need to try focussing with a Bahtinov mask on a nearby star before pointing at my target?
These are the best of last night's efforts - they may not rival Hubble (or other IIS members!), but I'm pretty happy with them.
As for the Barlow, which one are you using? Some of the lower-quality ones don't have good enough optics to pick up the fine details.
I have an Orion "Shorty" 2x Barlow, which seems to work fine for visual use. The more I think about it, I think the main problem was probably that I was trying to focus visually in less than optimum seeing conditions, so the image was wobbling on the screen quite a bit. Next time, I'll try getting focus on an adjacent star with my Bahtinov mask, and then go back to the planetary target and try again.
May I ask what your camera's resolution & frame rate were set at for these videos?
The camera is a ZWO ASI120MC, and I was capturing using FireCapture v 2.4. The camera's full-frame resolution is 1280 x 960, but I was using a smaller ROI (Region Of Interest) framed around the target for each capture video.
For each capture, I would start by loading up FireCapture's default profile for the planet in question (this pre-loads some default exposure settings, etc), and then I would adjust the exposure until it looked "about right" on my computer's screen - I don't know whether my exposure settings (Gain / Exposure / Gamma) are anywhere near optimum for my rig.
Jupiter was typically shot with 5.5 ms exposures with gain at 50%, giving about 59 frames per second (FPS) on average; Mars was shot at 2.8 ms (70 FPS); Saturn at 22.6 ms (43 FPS).
Actual capture frame-rate varied a bit between consecutive video captures - I think this is because this camera runs on a USB2.0 cable, and the data interface speed can be a limiting factor.
The Jupiter shots could use a little less green (or more red and blue), but they're nicely detailed.
Your exposure times don't make sense to me... 5.5ms works out at 181fps whereas 60fps should be 16.6ms.
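The arithmetic is simple to check: the theoretical frame-rate ceiling is just the reciprocal of the exposure time (ignoring sensor readout and transfer overheads):

```python
# Theoretical maximum frame rate from exposure time alone
# (ignores sensor readout and USB transfer overheads).
def max_fps(exposure_ms):
    return 1000.0 / exposure_ms

print(round(max_fps(5.5), 1))  # 5.5 ms -> ~181.8 fps ceiling
print(round(1000.0 / 60, 1))   # 60 fps needs <= ~16.7 ms exposure
```

So a 5.5 ms exposure permits up to ~181 fps; if only ~59 fps arrives, something other than the exposure time is the bottleneck.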
Thanks!
The colours are exactly as captured by the camera - I haven't attempted to do any colour balancing. (And the fact that I'm Red-Green Colour-blind means that if I do start adjusting colours, people might find the results a bit "interesting"! )
I think the USB2.0 interface limits the overall frame rate that can actually be sent between the camera and the computer - the "pipe" can only handle so many bps, and the available number of bits per second depends on the ROI size and the exposure time. The camera can actually go down to as short as 64 microsecond exposures, but it sure won't send 15,000 FPS to the computer!
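As a rough sanity check (the ~35 MB/s effective-throughput figure is an assumption — USB 2.0 is 480 Mbit/s raw, and real overheads vary), the bandwidth a raw stream needs scales with ROI area, bit depth, and frame rate:

```python
# Rough bandwidth estimate for a raw 8-bit video stream over USB 2.0.
# The 35 MB/s usable-throughput figure is an assumption, not a measurement.
USB2_EFFECTIVE_MB_PER_S = 35.0

def stream_mb_per_s(width, height, fps, bytes_per_pixel=1):
    return width * height * bytes_per_pixel * fps / 1e6

full_frame = stream_mb_per_s(1280, 960, 30)  # full ASI120MC sensor at 30 fps
small_roi = stream_mb_per_s(320, 240, 250)   # small ROI at high frame rate
print(full_frame, small_roi)  # ~36.9 vs ~19.2 MB/s
```

The full sensor at only 30 fps already sits right at the assumed USB 2.0 ceiling, while a small ROI leaves plenty of headroom for much higher frame rates — which is why shrinking the ROI helps so much.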
Yes, BPS are interesting too - one thing I've noticed and thought about a bit with DSLR terrestrial photos (and I assume the same would apply to any camera's captures) is that a photo of, say, a landscape full of colour and good light produces a significantly larger file than a low-light image with little colour - e.g. around 10 MB versus 5-6 MB respectively!
Hence, each pixel in the first photo has recorded more information - closer to 'full well' capacity - so there are more bits of info overall.
That said, if transferring data at fast rates, I'd expect a fairly bright Jupiter image taking up 1/3 to 1/2 of the chip to transfer more slowly than, say, Mars, which may only take up 1/8 or 1/10 of the chip?
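For what it's worth, the DSLR file-size effect is largely down to compression: a bright, detailed frame has more entropy, so it compresses less. A toy illustration, using zlib as a stand-in for whatever compression the file format applies:

```python
import os
import zlib

flat = bytes(10_000)         # uniform data, like a dark, featureless frame
noisy = os.urandom(10_000)   # high-entropy data, like a bright, detailed frame

print(len(zlib.compress(flat)))   # compresses to a few dozen bytes
print(len(zlib.compress(noisy)))  # ~10,000 bytes - barely compresses at all
```

Note that uncompressed capture formats produce fixed-size frames regardless of content, so for a raw stream over USB it's the ROI size and frame rate, not the brightness of the planet, that set the data rate.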
Registax has an RGB balance "auto" button, it usually does a pretty good job IMO. The cameras have a green push out of the gate because of the Bayer matrix giving 2 green pixels for every 1 red and blue.
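I don't know exactly what Registax's auto button does internally, but a simple "gray-world" balance - scale each channel so the channel means come out equal - captures the idea. A minimal sketch with made-up pixel values (the greenish sample data is hypothetical):

```python
# Toy gray-world white balance: scale R, G, B so each channel's mean
# equals the overall mean. The sample pixels (green-heavy cast) are made up.
pixels = [(120, 180, 110), (100, 160, 90), (140, 200, 130)]

means = [sum(p[c] for p in pixels) / len(pixels) for c in range(3)]
target = sum(means) / 3.0

balanced = [tuple(min(255, round(p[c] * target / means[c])) for c in range(3))
            for p in pixels]
print(balanced)  # green pulled down, red and blue pushed up
```

After the rescale, the per-channel means all sit at the same level, which neutralises the Bayer-matrix green push without touching overall brightness much.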
The USB 2.0 interface can handle quite a bit... I use my 224 on a USB 2.0-only netbook and can get ~250 fps at 320x240 with a 4 ms exposure time. Note those figures, and then re-read my previous post.