  #56  
Old 03-04-2008, 02:09 AM
Suzy_A
Quote:
Originally Posted by Peter Ward
Susan...interesting analysis...but you might want to check your math. The sun is a broad spectral source and its approx 1Kw per sq m flux is spread over a very large spectral range.

Lasers are spectrally very pure, and sure, in their particular frequency, can rival old Sol...but in terms of total power are not very energetic....hence to get the same flux over a *very* narrow band they need to be rated in 10's (if not more) of watts rather than milliwatts....

Cheers
Peter
Hi Peter,

As you are probably aware, in SI units power (in watts) is energy (in joules) per unit time (seconds). Likewise, a watt per square metre, i.e. a joule per second per square metre, is power per unit area.

A beam of photons at 1250 watts per square metre carries the same power per unit area, and hence the same energy per unit time and area, irrespective of its wavelength or its spatial or spectral coherence.

What differs, of course, is how the energy is absorbed (or not) when it strikes a surface. At the far ends of the spectrum, gamma and radio, the photons are not well absorbed by tissue, so quite high flux levels can be tolerated - although with gamma radiation the small proportion of energy that is absorbed does cause severe damage: a whole-body dose of only about 3 joules per kilogram (roughly 200 joules in total) can kill someone by wiping out their bone marrow. Drinking a cup of hot coffee deposits far more energy than that... RF radiation tends to cause thermal heating, and the human body can absorb a megajoule or so of evenly distributed heat without too much damage.
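The arithmetic above can be sanity-checked in a few lines. This is just a back-of-envelope sketch; the 70 kg body mass, 250 mL cup and 30 K temperature drop are my assumed figures, not from the post.

```python
# Back-of-envelope check of the whole-body dose figures above.
# Assumptions (mine, not from the post): 70 kg body, 250 mL coffee cooling ~30 K.
body_mass_kg = 70
lethal_dose_j_per_kg = 3                     # acute whole-body gamma dose quoted above
total_lethal_j = lethal_dose_j_per_kg * body_mass_kg
print(total_lethal_j)                        # 210 J, matching the "~200 J total" figure

coffee_mass_g = 250
specific_heat_j_per_g_k = 4.18               # specific heat of water
delta_t_k = 30                               # coffee cooling from ~67 C to body temperature
coffee_energy_j = coffee_mass_g * specific_heat_j_per_g_k * delta_t_k
print(round(coffee_energy_j))                # 31350 J -- two orders of magnitude more energy
```

The point stands: the danger of ionising radiation is in *where* and *how* the energy is deposited, not the raw joule count.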

In the case of incoherent sunlight, the UV part of the 1000 W/m2 (about 50 W/m2, I think?) is certainly absorbed in the cornea and lens (causing, in extreme cases, short-term effects like snow blindness and, in the long term, cataracts etc.), and the IR part is also largely absorbed before it gets through the eye. Even so, something like 850 - 900 J/s/m2 can reach the retina when looking at the sun - and, apart from the UV and IR absorption I mentioned above, this is independent of coherence and wavelength. Obviously the total energy will be much less than that, as the aperture, i.e. the pupil, is only a few mm across. Also, chromatic, spherical and other aberrations in the eye, together with the angular size of the source (the sun) producing a fairly large image, mean the energy per unit area on the retina is relatively low.
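To put rough numbers on the pupil and image-size effects, here is a sketch under some assumed values: a 3 mm pupil, a ~17 mm effective eye focal length, the Sun's ~0.53 degree angular diameter, and the ~900 W/m2 figure above. Aberrations and intraocular absorption are ignored.

```python
import math

# Assumed figures (not from the post): 3 mm pupil, 17 mm eye focal length.
flux_w_per_m2 = 900                          # sunlight reaching the retina band, per above
pupil_d_m = 3e-3
pupil_area_m2 = math.pi * (pupil_d_m / 2) ** 2
power_entering_w = flux_w_per_m2 * pupil_area_m2
print(power_entering_w)                      # ~6.4 mW enters the eye

focal_length_m = 17e-3                       # approximate effective focal length of the eye
sun_angle_rad = math.radians(0.53)           # Sun's angular diameter
image_d_m = focal_length_m * sun_angle_rad   # ~0.16 mm solar image on the retina
image_area_m2 = math.pi * (image_d_m / 2) ** 2
retinal_irradiance = power_entering_w / image_area_m2
print(retinal_irradiance)                    # ~3e5 W/m^2, spread over the whole solar image
```

So the extended image of the Sun spreads a few milliwatts over a disc about 0.16 mm across, exactly the "fairly large image, lower energy per unit area" effect described above.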

In the case of a coherent (or nearly coherent) source like a diode laser, the energy per unit time per unit area for a 1 mW laser is, as I said, about the same as for the sun; but because aberrations are lower and the aperture is not a limiting factor, the energy per unit time per unit area on the retina will be somewhat higher - though still within an order of magnitude or so of looking at the sun for the same length of time.
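The "about the same as the sun" claim is easy to verify. Assuming a typical 1 mm exit-beam diameter (my assumption; pointers vary), a 1 mW beam works out very close to the ~1250 W/m2 solar figure quoted earlier:

```python
import math

# Irradiance of a 1 mW pointer, assuming a 1 mm exit beam diameter.
power_w = 1e-3
beam_d_m = 1e-3                              # assumed typical pointer beam diameter
beam_area_m2 = math.pi * (beam_d_m / 2) ** 2
irradiance = power_w / beam_area_m2
print(round(irradiance))                     # ~1273 W/m^2 -- same order as ~1250 W/m^2 sunlight
```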

In the case of a 20 mW laser pointed at a pilot, or indeed at anyone at a distance (probably anything more than a dozen metres or so), the time the beam spends in their eye would probably be too brief to cause permanent damage - not because of the blink reflex, but because it would be almost impossible to hold the laser steady enough and aim it well enough at the target, i.e. a 3 - 7 mm pupil. Also, at a few hundred metres the beam will have diverged to a few cm across, reducing the energy per unit area.
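The divergence figure can be sketched with the small-angle approximation. I have assumed a well-collimated 0.1 mrad beam here to match the "few cm" claim; a more typical 1 mrad pointer would spread closer to 30 cm at the same range.

```python
# Beam diameter vs distance, small-angle approximation.
# Assumed: 1 mm exit diameter, 0.1 mrad full divergence (well-collimated pointer).
def beam_diameter_m(exit_d_m, divergence_rad, distance_m):
    """Full beam diameter after travelling distance_m, small-angle approximation."""
    return exit_d_m + divergence_rad * distance_m

d = beam_diameter_m(1e-3, 0.1e-3, 300)
print(round(d * 100, 1))                     # 3.1 cm at 300 m, i.e. "a few cm across"
```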

Am I still coherent or has the red wine and late night got to me?