#1  
Old 17-03-2019, 02:16 PM
Merlin66 (Ken)
Registered User
 
Join Date: Oct 2005
Location: Junortoun Vic
Posts: 8,904
Wide-field multi-imaging systems

https://amostech.com/TechnicalPapers.../Ackermann.pdf

A very interesting read on the subject.
  #2  
Old 18-03-2019, 12:26 PM
billdan (Bill)
Registered User
 
Join Date: Mar 2012
Location: Narangba, SE QLD
Posts: 1,551
That was a good read, Ken. I learnt a lot about refractors and lenses, plus all the history behind them.
Thanks for providing the link.

Bill
  #3  
Old 18-03-2019, 01:09 PM
multiweb (Marc)
ze frogginator
 
Join Date: Oct 2007
Location: Sydney
Posts: 22,060
+1 Top article. Read only a bit last night. Lots to digest.
  #4  
Old 20-03-2019, 11:46 AM
Sunfish (Ray)
Registered User
 
Join Date: Mar 2018
Location: Wollongong
Posts: 1,909
A great amount to take in but very interesting.

A question I have is whether it is reasonable, or even possible, to combine visual-frequency data of the same target from an array of different optical systems, such as reflectors, camera lenses and refractors with different focal lengths. This seems to be what is happening with arrays of extremely narrow-band cameras and lenses.

One would think stacking stretches the whole image anyway.


Quote:
Originally Posted by multiweb
+1 Top article. Read only a bit last night. Lots to digest.
  #5  
Old 20-03-2019, 12:03 PM
RickS (Rick)
PI cult recruiter
 
Join Date: Apr 2010
Location: Brisbane
Posts: 10,584
Another vote of thanks from me, Ken. I have only skimmed the document but it looks very good. Perhaps an array would be an interesting retirement project some day.

Quote:
Originally Posted by Sunfish
A question I have is whether it is reasonable, or even possible, to combine visual-frequency data of the same target from an array of different optical systems, such as reflectors, camera lenses and refractors with different focal lengths. This seems to be what is happening with arrays of extremely narrow-band cameras and lenses.

One would think stacking stretches the whole image anyway.
It's not difficult to combine data from systems with different image scales, and it can be useful to do so... more photons in total means higher SNR. This makes it possible to stretch the data further, revealing fainter features.

Cheers,
Rick.
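
(Purely illustrative, not from Rick's post: a minimal numpy/scipy sketch of the idea, assuming the frames are already calibrated and registered on the same field centre, so that combining different image scales just means resampling everything onto a common pixel grid before a weighted average. The function names, pixel scales and weighting here are hypothetical; real processing would use plate-solved WCS reprojection and proper noise weighting.)

[CODE]
import numpy as np
from scipy.ndimage import zoom

def to_common_scale(img, pixel_scale, target_scale):
    # Resample a frame taken at pixel_scale (arcsec/pixel) onto target_scale.
    factor = pixel_scale / target_scale
    return zoom(img, factor, order=3)  # cubic interpolation

def combine(images, pixel_scales, weights=None):
    # Resample every frame to the finest scale present, then take a
    # weighted mean.  Assumes the frames are already aligned on the same
    # field centre; a real stack would also reject outliers.
    target = min(pixel_scales)
    resampled = [to_common_scale(im, s, target)
                 for im, s in zip(images, pixel_scales)]
    # Crop to the smallest common shape in case of rounding differences.
    ny = min(r.shape[0] for r in resampled)
    nx = min(r.shape[1] for r in resampled)
    stack = np.array([r[:ny, :nx] for r in resampled])
    w = np.ones(len(stack)) if weights is None else np.asarray(weights, float)
    return (stack * w[:, None, None]).sum(axis=0) / w.sum()

# e.g. combine([sct_frame, refractor_frame], pixel_scales=[0.8, 2.1])
[/CODE]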
  #6  
Old 20-03-2019, 02:40 PM
Sunfish (Ray)
Registered User
 
Join Date: Mar 2018
Location: Wollongong
Posts: 1,909
Thanks for that.

I tried a combination of SCT and refractor data in an earlier post and it seemed to work; however, I was not sure whether the various inherent defects simply multiplied.

Quote:
Originally Posted by RickS
Another vote of thanks from me, Ken. I have only skimmed the document but it looks very good. Perhaps an array would be an interesting retirement project some day.



It's not difficult to combine data from systems with different image scales, and it can be useful to do so... more photons in total means higher SNR. This makes it possible to stretch the data further, revealing fainter features.

Cheers,
Rick.
  #7  
Old 21-03-2019, 10:18 AM
sil (Steve)
Not even a speck of dust
 
Join Date: Jun 2012
Location: Canberra
Posts: 1,474
Sorry, I couldn't resist. The Homo-centric Wide field Cafe Array. Most useless photographic array invented. Also available in Grassed-Area Array and Pavement Array configurations, not any better.

But yes, nice article. A bit heavy for my brain. I think there's good potential for using multiple cameras in an array to produce an effective camera better than any single one. Tying them together would be the trick to the project. We can easily use subs from lots of different cameras to gain depth, but not resolution, and cameras like the L16 use the multi-camera approach but don't quite work yet. I would guess Canon cameras will be the path to success, as they are the most "hackable" for software/firmware. Coming up with a way to calibrate the optics of each lens/camera and adjust in software to a "standard" that all the cameras can contribute to would be great. But of course, why stop with a box of old cameras to make an array, why not a bunch of spare 16" Dobsonians? Maybe it'd be more practical to have observatories remotely located anywhere (our home setups?) that can be synced together?

Hmm, my mind is drifting off to an idea I had ages ago: a piece of software to go through all your source files (images/videos) and build a sub archive, making sure FITS information on location and time is embedded, along with plate-solved information etc., then let you search for an object or patch of sky and it'll pull all the subs you have ever taken that meet the search criteria, so you can examine or stack/process them. How many of us have unknowingly captured supernovae occurring in recent years? Or bolides, fireballs etc. that can be used to help recreate tracking information and landing locations/energies? I think an immense citizen-science pool is feasible like this, and the clever clods will figure out useful ways to use the data to make discoveries and refine measurements, and of course help us make pretty pictures.
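
(A minimal sketch of that sub-archive idea, purely illustrative and not from the thread: it assumes plate-solved subs with RA, DEC and DATE-OBS keywords in their FITS headers, and uses astropy to index them and run a simple cone search. The keyword names, units and folder layout are assumptions and will vary between capture programs.)

[CODE]
import glob
from astropy.io import fits
from astropy.coordinates import SkyCoord
import astropy.units as u

def build_index(root):
    # Walk a folder of subs and record each frame's field centre and date.
    index = []
    for path in glob.glob(root + "/**/*.fit*", recursive=True):
        try:
            hdr = fits.getheader(path)
            # Assumes RA in hours/sexagesimal and DEC in degrees;
            # adjust the units to match your own headers.
            centre = SkyCoord(hdr["RA"], hdr["DEC"],
                              unit=(u.hourangle, u.deg))
            index.append({"file": path,
                          "coord": centre,
                          "date": hdr.get("DATE-OBS")})
        except (KeyError, ValueError, OSError):
            continue  # skip files without a usable, solved header
    return index

def search(index, target, radius=1.0 * u.deg):
    # Return every sub whose field centre lies within 'radius' of target.
    return [e for e in index if e["coord"].separation(target) < radius]

# Example: find every sub you have ever taken around Eta Carinae.
# idx = build_index("/data/astro")
# hits = search(idx, SkyCoord.from_name("eta Carinae"))
[/CODE]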
Attached Thumbnails: untitled.png

Last edited by sil; 21-03-2019 at 10:44 AM.