ICEINSPACE

10-01-2015, 01:33 PM
Highest Observatory in Oz
Join Date: May 2006
Location: Canberra
Posts: 17,691
Quote:
Originally Posted by Paul Haese
You have a few basic misunderstandings here Mike.
Ooooh, o-k  thanks for that Paul
Quote:
Everything in the end boils down to your individual ambitions and budget, but ultimately this is not a job but an adventure in producing art and showing the night sky as I see it.
Of course, and there are many ways to skin a cat too I guess, but to call all images that are not composed of days of exposure turds is still a little... absurd
Mike

10-01-2015, 01:50 PM
Registered User
Join Date: Jan 2009
Location: Adelaide
Posts: 9,991
Quote:
Originally Posted by strongmanmike
Of course, and there are many ways to skin a cat too I guess, but to call all images that are not composed of days of exposure turds is still a little... absurd
Mike
Actually I never said that. Those are your words Mike. I simply said a Turd that is rolled up in glitter is still a turd. That is not the same at all.

10-01-2015, 02:15 PM
PI rules
Join Date: Jul 2006
Location: Sydney
Posts: 2,631
Wow! I really started something here with what I thought was an obvious and uncontroversial statement.
Let me rephrase what I was getting at:
If you have instrumental defects (bad tracking, miscollimation, poor focusing etc.) you are better off fixing the hardware problem rather than trying to fix it with software.
If you have vignetting or dust motes you are better off correcting them with proper flats rather than resorting to software (DBE and/or cloning).
If your images are noisy you are better off collecting more data rather than using noise reduction, etc.
I was essentially talking about what I think is the optimum way to correct problems, not about how a skilful Processor A can produce a better image with bad data compared to what a newbie Processor B can produce with good data.
Geoff
Last edited by Geoff45; 10-01-2015 at 03:21 PM.
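Geoff's flats-versus-DBE point can be put into numbers. The sketch below is an illustrative NumPy mock-up, with invented frames and counts rather than anyone's real data (in practice you would load FITS frames, e.g. with astropy.io.fits): dividing by a normalised master flat removes vignetting exactly, which is why proper flats beat software gradient removal.

```python
import numpy as np

# Illustrative synthetic frames only: a uniform 1000-count sky seen through
# a vignetted optic. Real calibration would use actual FITS frames.
vignette = np.linspace(0.7, 1.0, 64)[None, :] * np.ones((64, 1))
dark = np.full((64, 64), 100.0)        # master dark (bias + dark current)
light = 1000.0 * vignette + dark       # what the camera records
flat = 20000.0 * vignette              # the master flat sees the same vignetting

master_flat = flat / np.median(flat)   # normalise the flat to unity
calibrated = (light - dark) / master_flat  # vignetting divides out

# The calibrated frame is uniform again - no DBE or cloning required.
```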

10-01-2015, 04:10 PM
Highest Observatory in Oz
Join Date: May 2006
Location: Canberra
Posts: 17,691
Quote:
Originally Posted by ghsmith45
Wow! I really started something here with what I thought was an obvious and uncontroversial statement.
Let me rephrase what i was getting at:
If you have instrumental defects (bad tracking, miscollimation, poor focusing etc) you are better off fixing the hardware problem rather than trying to fix it with software. If you have vignetting or dust motes you are better off correcting them with proper flats rather than resorting to software (DBE and/or cloning). If your images are noisy you are better off collecting more data rather than using noise reduction etc, etc. I was essentially talking about what I think is the optimum way to correct problems, not about how a skilful Processor A can produce a better image with bad data compared to what a newbie Processor B can produce with good data.
Geoff
Oh, and I got that Geoff, all good.
I was just suggesting that a lovely image can indeed be created from less than so-called optimal data, and especially without mega data.
Let's not get too serious and misinterpret posts; no malice, ill will or otherwise here, regardless of how it may seem.
Mike

10-01-2015, 05:56 PM
Registered User
Join Date: Jul 2013
Location: Cecil Plains QLD
Posts: 1,228
Wow, what a stunning image! Beautiful.
Cheers
Jo

10-01-2015, 07:33 PM
Narrowfield rules!
Join Date: Nov 2006
Location: Torquay
Posts: 5,065
Quote:
Originally Posted by ghsmith45
Wow! I really started something here with what I thought was an obvious and uncontroversial statement.
Let me rephrase what i was getting at:
If you have instrumental defects (bad tracking, miscollimation, poor focusing etc) you are better off fixing the hardware problem rather than trying to fix it with software. If you have vignetting or dust motes you are better off correcting them with proper flats rather than resorting to software (DBE and/or cloning). If your images are noisy you are better off collecting more data rather than using noise reduction etc, etc. I was essentially talking about what I think is the optimum way to correct problems, not about how a skilful Processor A can produce a better image with bad data compared to what a newbie Processor B can produce with good data.
Geoff
Well, I disagree again Geoff. I'm not getting cranky, it's just an interesting subject.
I think "competing" and "optimum" ways of producing a result (sometimes "any" result) is always a balance of hardware and software, and that balance always depends critically on circumstances.
Unless you already have gear perfectly set up and unlimited imaging time, there is always some sort of compromise on a given project that may absolutely give software the edge.
A/ Say you have 2 nights imaging at Wiruna or another dark site, and some things aren't quite right with the hardware, as you mention, so you sort them out the 1st night rather than risk taking bad data. The second night is cloudy, no result.
B/ Instead you image the 1st night with dodgy guiding, need to focus manually, and pointing is bad so you hunt down the object manually. You take 6 hrs of dodgy subs, spend 10 hrs on software repair Monday, and post a pretty good effort on IIS and feel satisfied.
I saw exactly this on my last visit to Wiruna. The astrophotographer in question had ridiculous hardware problems and just powered on (to my amazement) to produce, with software correction, a very pleasing result.
If you like, software "out-competed" hardware perfection in this situation, easily. Software was the "optimum" solution by far.
I've seen many hardware perfectionists on IIS spend months ironing out hardware issues before producing an image. Is that an optimum situation? Lost opportunity can be a dire cost of perfection.
Many times I have sacrificed hardware setup perfection by compensating with software correction just to get a friggen result, when circumstances demanded a result or else endless tedious tuning and no image at all. If I, and probably the majority of us, always considered hardware perfection the "optimum" path, then you would see a lot fewer images posted on these hallowed pages.
Last edited by Bassnut; 10-01-2015 at 07:58 PM.

10-01-2015, 08:00 PM
ze frogginator
Join Date: Oct 2007
Location: Sydney
Posts: 22,080
Quote:
Originally Posted by ghsmith45
Come on guys (Fred, Marc and Rick) are you really suggesting it doesn't matter how crappy your data is because you can hide all the crap with processing?
Crap data with good processing always looks better than good data with crap processing. Honing one's processing skills on average data will yield excellent results with good data.
Of course good data with excellent seeing always wins. Quality of the gear used also makes a big difference, although sometimes I look at the pics and gear specs and cringe a little... and that's primarily due to lack of processing skills.
TBH a few of the best shots I've seen were done with modest gear. Ray's stuff (Shiraz), for instance, produces pictures that are right up there because he has a good understanding of both processing techniques and tuning his equipment.

10-01-2015, 09:02 PM
PI rules
Join Date: Jul 2006
Location: Sydney
Posts: 2,631
Quote:
Originally Posted by Bassnut
Well, I disagree again Geoff. I'm not getting cranky, it's just an interesting subject.
I think "competing" and "optimum" ways of producing a result (sometimes "any" result) is always a balance of hardware and software, and that balance always depends critically on circumstances.
Unless you already have gear perfectly set up and unlimited imaging time, there is always some sort of compromise on a given project that may absolutely give software the edge.
A/ Say you have 2 nights imaging at Wiruna or another dark site, and some things aren't quite right with the hardware, as you mention, so you sort them out the 1st night rather than risk taking bad data. The second night is cloudy, no result.
B/ Instead you image the 1st night with dodgy guiding, need to focus manually, and pointing is bad so you hunt down the object manually. You take 6 hrs of dodgy subs, spend 10 hrs on software repair Monday, and post a pretty good effort on IIS and feel satisfied.
I saw exactly this on my last visit to Wiruna. The astrophotographer in question had ridiculous hardware problems and just powered on (to my amazement) to produce, with software correction, a very pleasing result.
If you like, software "out-competed" hardware perfection in this situation, easily. Software was the "optimum" solution by far.
I've seen many hardware perfectionists on IIS spend months ironing out hardware issues before producing an image. Is that an optimum situation? Lost opportunity can be a dire cost of perfection.
Many times I have sacrificed hardware setup perfection by compensating with software correction just to get a friggen result, when circumstances demanded a result or else endless tedious tuning and no image at all. If I, and probably the majority of us, always considered hardware perfection the "optimum" path, then you would see a lot fewer images posted on these hallowed pages.
Hi Fred. I think a simple summary of what both of us are saying is: "Do whatever you can to get the best data, but if your data is not so good then just bring on your processing skills to deal with it."
The trouble is that "optimum" means different things to different people. There are those who are happy producing one "perfect" image rather than 10 they see faults with, but others prefer to get 10 images with a few warts. Without naming names, there is one prominent astrophotographer who suggested spending a year perfecting your hardware before starting to image (PEC, polar alignment, feeler gauges to check squaring of the image train and the like). He gets lots of APODs.
So I guess we won't get consensus on what the "optimum" strategy is.
I think we've now done this to death, so I'm outa here.
Geoff
Last edited by Geoff45; 11-01-2015 at 09:26 AM.

11-01-2015, 10:04 AM
Highest Observatory in Oz
Join Date: May 2006
Location: Canberra
Posts: 17,691
Quote:
Originally Posted by ghsmith45
I think we've now done this to death--I'm outa here.
Geoff
No, no, don't...I think you should all continue to discuss the bleeding obvious

11-01-2015, 10:41 AM
Registered User
Join Date: Jan 2012
Location: Melbourne
Posts: 3,786
Quote:
Originally Posted by strongmanmike
No, no, don't...I think you should all continue to discuss the bleeding obvious

Hi Mike & others,
People are taking this all too seriously.
Just take one night - get as much data as you can of LRGB & maybe Ha.
Do your best with that data.
(except for those who want to use mega data)
cheers
Allan

11-01-2015, 07:01 PM
Registered User
Join Date: Jan 2009
Location: Adelaide
Posts: 9,991
Quote:
Originally Posted by alpal
Hi Mike & others,
People are taking this all too seriously.
Just take one night - get as much data as you can of LRGB & maybe Ha.
Do your best with that data.
( except for those who want to use mega data )
cheers
Allan
Yes, those who want to do both should be able to do that and not get criticised or poked at for it. Like I said, do what you can achieve. I prefer mega data myself.

11-01-2015, 07:06 PM
Highest Observatory in Oz
Join Date: May 2006
Location: Canberra
Posts: 17,691
Quote:
Originally Posted by alpal
Hi Mike & others,
People are taking this all too seriously.
Just take one night - get as much data as you can of LRGB & maybe Ha.
Do your best with that data.
( except for those who want to use mega data )
cheers
Allan
We need to imagine we are sitting around the esky at a star party discussing this sort of stuff, which I am sure we all have; then the spirit of the discussion is clearer and things are less likely to be taken too seriously or personally.

12-01-2015, 11:49 AM
Registered User
Join Date: Jan 2009
Location: Adelaide
Posts: 9,991
Quote:
Originally Posted by ghsmith45
Wow! I really started something here with what I thought was an obvious and uncontroversial statement.
.......
............Geoff
No, I don't think you did Geoff. There are certainly differences of opinion amongst imagers out there as to how to combat and beat noise; not everyone agrees. However, discussion is always interesting to see what people think about the subject, so long as it remains civil and not personal.
For me, mega data with long subs gives me the data to work with: I can choose which data to reject, and I always reject data. My processing skills these days (not to suggest there is not more to learn, as there always is) allow me to extract what I can from the data whilst producing my own artistic bent. Part of that bent is to produce an image without the obvious presence of noise. Our eyes in normal life don't see noise, and I am trying to produce what our eyes might see. To me an image with noise is an incomplete view of what is out there.
At one point or another one has to compromise on the level of noise one is going to accept. That is the part of the argument that is interesting, and everyone has their limit.

12-01-2015, 03:07 PM
Registered User
Join Date: Jan 2012
Location: Melbourne
Posts: 3,786
Quote:
Originally Posted by Paul Haese
No I don't think you did Geoff. There are certainly differences of opinion amongst imagers out there as to how to combat and beat noise. Not everyone agrees with each other. However, discussion is always interesting to see what people think about the subject so long as it remains civil and not personal. For me mega data with long subs allows me to have the data to work with and I can choose which data to reject and I always reject data. My processing skills these days (that is not to suggest there is not more to learn; as there always is) allow me to extract what I can from the data whilst producing my own artistic bent. Part of that bent is to produce an image without the obvious presence of noise. Our eyes in normal life don't see noise and I am trying to produce what our eyes might see. To me an image with noise is an incomplete view of what is out there. At one point or another one has to compromise about the level of noise one is going to accept. That is the part of the argument that is interesting and everyone has their limit.
Hi Paul,
Noise is a natural phenomenon.
If you have a dark area of space with no appreciable signal, then the noise will be above any signal. In that case you cannot get rid of it no matter how many sub frames you take.
e.g. this Hubble picture of galaxy NGC 1300:
http://upload.wikimedia.org/wikipedi...xy-NGC1300.jpg
There is plenty of noise in the background, but does it take anything away from the picture? I would say no, and it's far better to leave that noise there than to try to cover it up with processing or spend more time taking sub frames.
Knowing where the noise level is in an image is also valuable in scientific terms, i.e. it answers the question: where is the natural noise floor in the picture?
I think we should all leave a bit of natural noise in our pictures.
cheers
Allan
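Allan's noise-floor point connects to the standard stacking arithmetic: signal adds linearly with sub count while the noise variances add linearly too, so SNR only grows as the square root of the number of subs. A rough sketch with invented numbers (not any particular camera or sky), assuming simple Poisson sky/object noise plus read noise:

```python
import math

sky = 50.0         # sky background, electrons/pixel/sub (illustrative)
signal = 2.0       # faint target, electrons/pixel/sub - well under the sky
read_noise = 7.0   # read noise, electrons RMS per sub (illustrative)

def stack_snr(n_subs: int) -> float:
    # Signal sums linearly; Poisson and read-noise variances also sum
    # linearly, so noise grows as sqrt(n) and SNR grows as sqrt(n).
    total_signal = signal * n_subs
    total_noise = math.sqrt(n_subs * (signal + sky + read_noise ** 2))
    return total_signal / total_noise

# Quadrupling the subs only doubles the SNR - a faint signal buried under
# the sky floor climbs out very slowly, which is Allan's point.
```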

12-01-2015, 03:50 PM
Registered User
Join Date: Jan 2009
Location: Adelaide
Posts: 9,991
Quote:
Originally Posted by alpal
Hi Paul,
noise is a natural phenomena.
If you have a dark area of space with no appreciable signal then the
noise will be above any signal.
In that case you cannot get rid of it no matter how many sub
frames you take.
e.g This Hubble picture of galaxy NGC 1300:
http://upload.wikimedia.org/wikipedi...xy-NGC1300.jpg
There is plenty of noise in the background but does it take anything away from the picture?
I would say no & it's far better to leave that noise there than to try
& cover it up with processing or spend more time taking sub frames.
Actually knowing where the noise level is in the image is also more valuable in scientific terms i.e. - it answers the question -
where is the natural noise floor in the picture?
I think we should all leave a bit of natural noise in our pictures.
cheers
Allan
Yes, noise is in the shadows. This is why I don't like stretching a lot, and hence why I do long subs. The use of median stacking does eliminate a lot of the noise that is visible in an image.
I don't like noise in my pretty pictures. I am not doing science, so I try to eliminate it from visible view wherever possible. I am looking for artistry in my images, and I look for that in others' images too. Images that I consider stunning incorporate colour control, detail and noise elimination.
For the most part, given enough frames, the noise floor is not visible. Only a few of my images this last year show visible noise in the shadows. I am not suggesting it is not there in any of my images; I am saying it is not visible to the eye. It does not require a processing technique other than stacking to remove it from view. My point of view of course, and no doubt many will disagree.
I respectfully disagree that there is always signal everywhere we look in space. I guess "appreciable" is a matter of degree though. I am yet to image to the point where I cannot see an improvement in the reduction of noise. I know that point is there; I have just not encountered it yet. Mind you, I have only done the occasional image of 40-odd hours. And it is a method that works for me, and for a few others too.
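The median stacking Paul mentions can be illustrated with a toy example. This is a sketch on synthetic sub frames, not anyone's actual pipeline: a median (or sigma-clipped mean) combine rejects single-frame outliers such as cosmic-ray hits or satellite trails that a plain mean smears into the result.

```python
import numpy as np

rng = np.random.default_rng(2)

true_value = 100.0
# 32 synthetic subs of a flat 100-count field with 10 e- Gaussian noise.
subs = true_value + rng.normal(0.0, 10.0, size=(32, 64, 64))
subs[5, 10, 10] += 5000.0  # one cosmic-ray hit in a single sub

mean_stack = subs.mean(axis=0)          # the outlier drags this pixel up
median_stack = np.median(subs, axis=0)  # the outlier is simply discarded

# The mean pixel is off by roughly 5000/32 counts; the median pixel is
# unaffected, which is why median combining "hides the noise pretty well".
```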

12-01-2015, 05:20 PM
Registered User
Join Date: Feb 2006
Location: Sydney
Posts: 18,185
Rob Gendler started the mega data images back 15 years ago, and his images were always the best. I noticed though that his later images tended to be shorter, but then he was hiring a remote 14.5 inch RC in WA for a lot of them, so cost comes into it. So yeah, mega data is a good thing. Martin worked with Rob when Martin was starting out, and no doubt Rob instilled some good imaging approaches in him. Martin would collect some Tak FS128 data and Rob would process it (they both would).
I have noticed getting good data always makes processing a piece of cake. More is almost always better. It's more productive to work on having the basics done well than to spend endless hours Photoshopping bad data; you can always improve a so-so data image, but it's never going to be top class. But I agree spending time getting your setup humming is the best time invested. I am not in the "software is 50% of the image" camp. It's definitely a big part, but perhaps overstated in importance. Overprocessed images always look substandard to my eye. Because PI is so powerful in its processing tools, it has tended to create a series of highly processed images with a look which some like, which is fine, but it's a bit like HDR photography - not generally a popular look for an image.
My problem is not so much automation, as my observatory is at home so I can image every night/all night on a target; it's more clear sky time. Hence a site with good weather and seeing is ideal.
The modern way of dealing with the losses in time from weather, work, hardware glitches and so on is to use larger apertures and faster F ratios to get the all-important SNR up. That certainly seems to have been the trend in the last 5 years: faster, wider-aperture scopes are becoming more common, and the ASA was one of the first. Fast F ratios have their own set of problems, though: flex, and a smaller critical focus zone.
So mega data would be different for different setups: different read noises of CCDs, different cooling power, different scopes of different apertures, different locations with better/worse seeing and clear weather, permanent observatory versus setting up and breaking down. So I don't think you can put a number of hours on it, but rather an SNR achieved, or a level of noise that is acceptable (and that obviously varies between astrophotographers and audiences). But lower noise is better than higher noise, obviously. More detail is better than a lack of detail. Round pinpoint stars are better than eggy blobs. Sharp detail is better than blurry fuzz. As for a total-hours target, it's more often than not limited by the weather.
I have done images over 30 hours that were not as good as images that were 10 hours. Seeing comes into it big time at longer focal lengths (not so much for widefield).
Sometimes weather interrupts an imaging sequence and puts it on hold until next year.
One advantage of using a similar setup over many years is being able to add to data from earlier years and build it up that way.
But mega data of 20 hours plus, I think, assumes weather good enough to allow it. Sometimes Sydney gets many clear nights in a dry spell; other times they are rare. My dark site tends to be clear because even if there is cloud about it tends to go around the large hill/geography of the site. I am sure Paul's site is more often clear than the major cities.
On the other hand, this hobby is different things to different people, and getting out under the stars is part of the fun to me. Some travel and set up, some live in light-polluted cities, some live in rural dark sky areas and some have remote scopes in ideal sites. There are different levels of commitment to the hobby, each has its own reward, and it's fun at the end of the day. There is also the major factor of cost and budget, as it's an expensive hobby.
Sorry for the long post.
Greg.
Last edited by gregbradley; 12-01-2015 at 05:30 PM.
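Greg's aperture point can be put roughly into numbers. Under simplifying assumptions (sky-limited imaging, the same camera, QE and pixel scale on the sky), the photon rate scales with aperture area, so the time needed to reach a given SNR falls as 1/D². An illustrative sketch only, not a claim about any particular scope:

```python
def relative_exposure_time(aperture_mm: float, reference_mm: float = 100.0) -> float:
    # Photon collection scales with aperture area (D^2), so the exposure
    # time needed for a fixed SNR scales as (reference / aperture)^2.
    return (reference_mm / aperture_mm) ** 2

# Under these assumptions a 200 mm aperture needs a quarter of the time of
# a 100 mm one - which is why larger, faster scopes shorten "mega data".
```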

12-01-2015, 05:58 PM
Registered User
Join Date: Jan 2012
Location: Melbourne
Posts: 3,786
Paul,
Quote:
I respectfully disagree that there is always signal every where we look in space.
Hi Paul,
Isn't there such a thing as sky noise? The atmosphere glows.
There is thermal noise & shot noise etc.
If the signal is below that noise you won't see any signal.
What is the signal from a dark area of open space?
The Hubble Deep Field would tell us that at our resolution it is the sum of light from all those 1000s of galaxies, in which case we can't resolve them & they just turn into a smudge of background noise.
I believe the relevant term is signal-to-noise ratio.
cheers
Allan
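The noise sources Allan lists are conventionally combined in quadrature in a CCD noise budget. A sketch with invented numbers (no real sensor implied): Poisson sources each contribute the square root of their counts, while read noise is already an RMS figure.

```python
import math

object_e = 400.0   # target electrons in one sub (illustrative)
sky_e = 900.0      # sky-glow electrons (illustrative)
dark_e = 25.0      # thermal / dark-current electrons (illustrative)
read_noise = 9.0   # read noise, electrons RMS (illustrative)

# Poisson sources contribute sqrt(counts) of noise each; read noise is an
# RMS value, so its square enters the variance sum directly.
total_noise = math.sqrt(object_e + sky_e + dark_e + read_noise ** 2)
snr = object_e / total_noise

# Here sky_e dominates the budget: the exposure is "sky limited", so more
# subs (not lower read noise) is what improves the final SNR.
```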

12-01-2015, 06:33 PM
Registered User
Join Date: Jan 2009
Location: Adelaide
Posts: 9,991
Quote:
Originally Posted by gregbradley
Rob Gendler started the mega data images back 15 years ago and his images were always the best. I noticed though in his later images they tended to be shorter but then he was hiring a remote 14.5 inch RC in WA for a lot of them. So cost comes into it. So yeah mega data is a good thing. Martin worked with Rob when Martin was starting out and no doubt Rob instilled some good imaging approaches in him. Martin would collect some Tak FS128 data and Rob would process it (they both would).
I have noticed getting good data always makes processing a piece of cake. More is almost always better. Its more productive to work on having the basics done well than spending endless hours on Photoshopping bad data but you can always improve a so so data image but its never going to be top class. But I agree spending time getting your setup humming is the best time invested. I am not in the software is 50% of the image camp. Its definitely a big part but perhaps overstated in importance. Overprocessed images always look substandard to my eye. Because PI is so powerful in its processing tools it has tended to create a series of highly processed images that create a look which some like which is fine but its a bit like HDR photography - not generally a popular look to an image.
My problem is not so much automation, as my observatory is at home so I can image every night/all night on a target, its more clear sky time. Hence a site with good weather and seeing is ideal.
The modern way of dealing with the losses in time from weather, work, hardware glitches and so on is to use larger aperture and faster F ratios to get the all important SNR up. That certainly seems to have been the trend in the last 5 years. Faster wider aperture scopes are becoming more common. The ASA was one of the first. Fast F ratios have their own set of problems of flex, smaller critical focus zone.
So mega data would be different for different setups as different read noises of CCDs, different cooling power, different scopes of different apertures, different locations with better/worse seeing and clear weather times, permanent observatory versus setting up and breaking down. So I don't think you can put a number of hours on it but rather a SNR achieved or level of noise that is acceptable (that obviously varies between astrophotgrapher and the audiences). But lower noise is better than higher noise obviously. More detail is better than a lack of detail. Round pinpoint stars are better than eggy blobs. Sharp detail is better than blurry fuzz. As total hours target its more often than not limited by the weather.
I have done images over 30 hours that were not as good as images that were 10 hours. Seeing comes into it big time when doing longer focal length (not so much for widefield).
Sometimes weather interrupts an imaging sequence and puts it on hold until next year.
One advantage of using a similar setup over many years is being able to add to the data from earlier years and build it up that way.
But mega data of 20 hours plus I think assumes good weather that will often allow that. Sometimes Sydney gets many clear nights in a dry spell other times they are rare. My dark site tends to be clear because even if there is cloud about it tends to go around the large hill/geography of the site. I am sure Paul's site is more often clear than the major cities.
On the other hand this hobby is different things to different people and getting out under the stars is part of the fun to me. Some travel and setup, some live in light polluted cities, some live in rural dark sky areas and some have remote scopes in ideal sites. Different levels of commitment to the hobby and each has its own reward and its fun at the end of the day. There is also the major factor of cost and budget as its an expensive hobby.
Sorry for the long post.
Greg.
Some interesting points, Greg.
I have had the benefit of Martin's advice on a number of occasions, so I guess Martin is passing on the knowledge just as Rob did.
Equipment can differ quite a lot in the end, and that can determine outcomes; I quite agree with your comments here. There is no one set formula. Each system will require differing levels of data to achieve a desired result.
Automation has meant I can sleep while the system is collecting the data. I don't have to stay up all night, nor is that helpful to me anymore. I have been a council member of the Astronomical Society of South Australia for 4 years now, and more recently I am the instruments officer for the society, as well as managing the cladding of the new 8 metre turret we are building. Not to mention other duties like wife time and, for that matter, work, which seems to take a back seat a lot. All that, as you know, takes up time, so in the end I just want the data. Though I still spend time under the stars with mates around new moon if the skies are clear, and I enjoy that. Staying up night after night and being sleep deprived isn't much fun anymore.
I do get quite a few clear nights a year, though most seem to coincide with the moon being visible; hence the more NB images I am producing.
Certainly the hobby means different things to different people and each to their own.

12-01-2015, 07:32 PM
Registered User
Join Date: Jan 2009
Location: Adelaide
Posts: 9,991
Quote:
Originally Posted by alpal
Paul,
Hi Paul,
Isn't there such a thing as sky noise? - the atmosphere glows.
There is thermal noise & shot noise etc
If the signal is below that noise you won't see any signal.
What is the signal from a dark area of open space?
The Hubble Deep field would tell us that at our resolution
it is a the sum of light from all those 1000's of galaxys.
In which case we can't resolve them & they just turn into a smudge of background noise.
I believe the correct term is signal to noise ratio.
cheers
Allan
Point taken. Though noise being a random thing, stacking methods will hide it pretty well.