View Full Version here: : Shoot now, focus later
Octane
23-06-2011, 07:16 PM
This is going to be nuts, if it works.
The company has had $50 million of investment pumped into it, and the product is slated to be released later this year.
http://www.lytro.com/picture_gallery
http://www.youtube.com/watch?v=7babcK2GH3I
H
DavidTrap
23-06-2011, 08:12 PM
Saw this on Facebook recently - this will be very interesting.
Is the technology these guys are using the "fly lens"?
DT
Hi H,
Thanks for the link.
Sounds really amazing.
They have a link to a PhD dissertation submitted to the Dept. of Computer
Science at Stanford by Ren Ng in July 2006 here -
http://www.lytro.com/renng-thesis.pdf
wavelandscott
23-06-2011, 10:50 PM
That was pretty neat stuff!
Maybe even I can become a photographer...
Steffen
23-06-2011, 11:21 PM
Fascinating! What I found most interesting (besides the focus party trick) is the potential to remove lens aberrations. This has great potential and could lead to the use of much faster yet much simpler lenses.
Cheers
Steffen.
All I can say is wow... intriguing to say the least. Who knows what the next 20, or even 10, years will bring in regards to imaging and optics.
multiweb
23-06-2011, 11:53 PM
I wonder if this technology will be adapted to get rid of field curvature, astigmatism or even spherical aberrations in telescopes at processing time.
Octane
24-06-2011, 12:07 AM
Marc,
I think that will eventually happen.
Slightly off topic, but Hasselblad have recently released a 200 megapixel camera. From memory, it's a 50 megapixel sensor, but it shifts the sensor one pixel up/down/left/right. Essentially, it takes six images to overcome the resolution issues inherent in the Bayer sensor design.
There are examples on their site and the results are mindboggling.
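For anyone curious about the basic idea behind multi-shot pixel shifting, here is a minimal numpy sketch. It is a toy illustration only, not Hasselblad's actual pipeline (which also handles Bayer colour sampling); the function name and the assumption of exactly four half-pixel-shifted monochrome captures are mine.

```python
import numpy as np

def interleave_half_pixel_shots(i00, i01, i10, i11):
    """Combine four captures taken with half-pixel sensor shifts
    into one image with twice the resolution in each axis.
    i00: no shift, i01: shifted half a pixel right,
    i10: shifted half a pixel down, i11: shifted both ways.
    (Toy model: monochrome, no demosaicing or registration.)"""
    h, w = i00.shape
    out = np.empty((2 * h, 2 * w), dtype=i00.dtype)
    out[0::2, 0::2] = i00  # each capture fills one phase of the finer grid
    out[0::2, 1::2] = i01
    out[1::2, 0::2] = i10
    out[1::2, 1::2] = i11
    return out
```

Each shifted exposure samples the scene at a different sub-pixel phase, so interleaving them genuinely recovers detail rather than just upscaling.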
H
Then combine that with a foveon sensor and see what you come up with :D
Octane
24-06-2011, 12:12 AM
lmao @ Foveon.
Have you seen/read about Sigma's latest business suicide/faux pas?
$10K for a 15 megapixel 23mm diagonal sensor, titled the SD1.
Haha, $10K for colour casts and jaggies and lack of features.
H
Steffen
24-06-2011, 12:18 AM
According to Ren's dissertation it will (or at least could).
Cheers
Steffen.
bartman
24-06-2011, 01:55 AM
Just thinking out of the box......
Could this technology be used in devices to help the near-blind?
I think I have seen devices that integrate the optic nerve with CCD cameras.
The CCD then sends impulses to the optic nerve, giving a very vague picture to the brain. Maybe some ingenuity on the software side of things could make these vague pictures 'sharper'.
This following article is not the one I was after, but my google skills are sometimes off....
http://www.scribd.com/doc/38207410/Arteficial-Eye#
And now just reading some of it ...I'm not sure of the validity.....then again I have no idea about biochemistry et al........
Anyway Just a thought...
Bartman
Blimey that's awesome.
Sometimes the advances in technology are astounding. :jawdrop:
tlgerdes
25-06-2011, 06:44 AM
Here is another article about who is financing it:
http://blogs.forbes.com/tomiogeron/2011/06/21/shoot-first-focus-later-with-lytros-new-camera-tech/?partner=yahootix
No fly by nighters here.
The night H posted the link (thanks again H), I stayed up late reading and
browsing through the PhD dissertation by Ren Ng.
What a beautiful piece of work. It certainly looks as if it has the potential to
be a multi-billion dollar idea.
Readers with electrical engineering or computer science/engineering backgrounds
will be familiar with the often used approach in those disciplines of transforming
particular classes of problems to some other "space", performing the required
manipulations there, and then transforming back to the required target "space".
Manipulations in complex number space or Fourier transforms are a couple of good
examples. In fact, a significant part of the formal study in those disciplines
consists of being instructed in the necessary wizardry.
The idea of going from 4D "Light Field" space to a 4D "Light Field Fourier
Spectrum" to 2D "Fourier Slices" to 2D "Inverse Fourier Transforms" as one
embodiment of the Light Field camera's refocusing approach is a beautiful
example of this "through the looking glass" magic.
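The spatial-domain counterpart of that Fourier Slice route is simple "shift-and-add" refocusing: shift each sub-aperture view of the 4D light field in proportion to its offset from the aperture centre, then average. Below is a toy numpy sketch of that counterpart, under my own assumptions (integer-pixel shifts, a small `L[u, v, s, t]` array, a hypothetical function name); it is an illustration of the principle, not Lytro's or Ren Ng's implementation.

```python
import numpy as np

def refocus_shift_and_add(lightfield, alpha):
    """Synthetically refocus a 4D light field L[u, v, s, t] by
    shifting each sub-aperture image and averaging (the
    spatial-domain counterpart of the Fourier Slice method).
    alpha selects the virtual focal plane; alpha = 1 reproduces
    the plane of the original capture."""
    U, V, S, T = lightfield.shape
    uc, vc = (U - 1) / 2.0, (V - 1) / 2.0
    out = np.zeros((S, T))
    for u in range(U):
        for v in range(V):
            # shift proportional to the sub-aperture's offset from centre
            du = int(round((1 - 1 / alpha) * (u - uc)))
            dv = int(round((1 - 1 / alpha) * (v - vc)))
            out += np.roll(lightfield[u, v], (du, dv), axis=(0, 1))
    return out / (U * V)
```

The appeal of the Fourier version is efficiency: after one 4D FFT, each new focal plane costs only a 2D slice extraction plus a 2D inverse FFT, instead of re-summing the whole 4D data set as above.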
What I found particularly profound was the further convergence of concepts
such as ray tracing - something we normally do just inside a computer -
with digital photography.
In the past, ray tracing has been commonly used both in the generation of computer
imagery and in the design of optics, particularly lenses.
The first use of ray tracing is attributed to Arthur Appel in 1968. In the
pursuit of photo-realism in computer-generated images, the algorithms were
improved upon over the subsequent decades. Concurrently, beginning with the first
commercial embodiments in the early 1980s, digital cameras went through multiple
generations of improvements. Then came the introduction of the Light Field
camera concept, where the abstraction of ray tracing algorithms is married
with digital cameras for photographing real-world images.
Likewise, and in particular at present in the motion picture world, we have the
use of high dynamic range imaging (HDRI) with formats such as Industrial Light
and Magic's OpenEXR because working in that "space" enables movie makers to
more easily marry real life images with computer generated images. For example, the
production of all the Harry Potter movies has employed this magic.
And when one considers HDR and radiosity techniques being the mainstay of many
modern ray tracing systems, one starts to think about their marriage with
the Light Field ray tracing concepts.
One thing is for certain, the "Photoshop"-like image programs of the future are
going to have a lot more controls compared to the relatively simple color spaces
of today.
For example, in the future, during post-processing, might one be able to
manipulate the warmth of light that had arrived just from one particular
direction? Such questions will undoubtedly already have occurred to the Light
Field developers.
Being situated in the heart of Silicon Valley is a major advantage for Lytro as
a company, as that part of the world has more experience with the necessary
fabrication techniques than anywhere else. As they will be trading pixels for
the ability to do this post processing magic, one would speculate that the
initial offerings would fall short in terms of resolution compared to commodity
digital cameras today. But the future certainly has the prospect of being very
exciting, and it could well end up being one of those applications that further
increases world demand for disk space and network bandwidth. :thumbsup:
wavelandscott
25-06-2011, 08:51 PM
Gary,
As usual, thanks for a useful and enlightening post (I would never have read the dissertation) and layman's overview. I think it is cool to use ideas like "magic" to explain the neat science. For those of us who are primitives in the wizardry of electronics the description is applicable, because it is "magic" to me. I also appreciate the reference to Industrial Light and Magic; it is nice symmetry.
Well Done and Thanks...
Omaroo
21-10-2011, 08:59 AM
Coming early new year! I want one. Definitely.
http://www.engadget.com/2011/10/19/lytro-camera-hands-on-video/
http://www.engadget.com/photos/lytro-camera-hands-on/
okiscopey
21-10-2011, 10:09 AM
Given the data set captured by the Lytro, surely it would only need a bit of additional software to create standard (e.g. JPEG) images displaying infinite depth of field. Think of the possibilities!
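One way that software could work: render a focal stack of refocused images from the light field, then, for each pixel, keep the slice with the highest local contrast. The sketch below is a hypothetical toy (my function name, a crude gradient-magnitude sharpness measure), not anything Lytro has announced.

```python
import numpy as np

def all_in_focus(focal_stack):
    """Toy extended-depth-of-field merge: for each pixel, pick the
    value from whichever refocused slice has the highest local
    contrast (gradient magnitude) there."""
    stack = np.asarray(focal_stack, dtype=float)        # (N, H, W)
    # absolute forward differences; prepend duplicates the edge row/col
    gy = np.abs(np.diff(stack, axis=1, prepend=stack[:, :1, :]))
    gx = np.abs(np.diff(stack, axis=2, prepend=stack[:, :, :1]))
    sharpness = gx + gy
    best = np.argmax(sharpness, axis=0)                 # (H, W)
    return np.take_along_axis(stack, best[None], axis=0)[0]
```

Real implementations would smooth the per-pixel choice to avoid seams, but the principle is the same: the light field already contains every focal plane, so "infinite" depth of field is just a selection problem.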
adman
21-10-2011, 10:36 AM
I would love to see how you apply this to macro shooting.
Thanks Chris,
Looks like it won't be long until interactive images from these cameras become
ubiquitous on the net.
There will be children born today that will grow up taking such capability for granted
as if "that's how things always worked".
vBulletin® v3.8.7, Copyright ©2000-2025, vBulletin Solutions, Inc.