19-02-2018, 09:49 PM
markbakovic
Easily Confused

 
Join Date: May 2016
Location: Syds
Posts: 33
I'm by no means trying to pooh-pooh anyone's ideas; the analogue computing thing as described sounds interesting, and the overarching point of looking for different platform paradigms to enable new insights or cast off unconsciously accepted shackles has solid historical precedent.

but.

Light transmission of any kind is affected by the medium through which it travels. Optical fibres (which trump air by huge factors for efficiency) induce wavelength shifts and pulse broadening that depend heavily on transmission length, and they dictate careful signal control so that what is transmitted fits into broad "bins" a receiver can reliably differentiate. If you look at a research-grade R=80,000 spectrogram of an M dwarf star and think "there are ~10k spectral features I could use to fingerprint that star at (sometimes) 0.07 angstrom resolution, therefore I could get 64k 'bits' out of that single spectrum with a simple binary intensity threshold for each spectral element...", you have to remember that only that instrument, at that time, sees that spectrum. The corrections necessary even for a low-res spectrum, to account for e.g. which side of the Earth's orbit we were on (Doppler shift), what the sky was contributing (telescope elevation = atmospheric dispersion) and the time of night (OH glow), are substantial, and they require knowledge external to the signal.
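To put some quick numbers behind that, here's a back-of-envelope sketch in Python. The 5600 angstrom reference wavelength, the 4000-9000 angstrom band used to count spectral elements, and the ~30 km/s orbital speed are my own assumptions for illustration, not figures from any particular instrument:

[code]
import math

C = 2.998e8        # speed of light, m/s
V_ORBIT = 2.98e4   # Earth's orbital speed, m/s (~30 km/s), assumed

def resolution_element(wavelength_angstrom, resolving_power):
    """Width of one spectral resolution element: delta_lambda = lambda / R."""
    return wavelength_angstrom / resolving_power

def doppler_shift(wavelength_angstrom, velocity_ms):
    """Non-relativistic Doppler shift: delta_lambda = lambda * v / c."""
    return wavelength_angstrom * velocity_ms / C

if __name__ == "__main__":
    lam = 5600.0   # assumed reference wavelength, angstroms
    d_lam = resolution_element(lam, 80_000)
    print(f"R=80000 resolution element at {lam:.0f} A: {d_lam:.3f} A")  # ~0.07 A

    # One thresholded "bit" per resolution element across an assumed
    # 4000-9000 A optical band: N = R * ln(lam_max / lam_min).
    n_elements = int(80_000 * math.log(9000 / 4000))
    print(f"Resolution elements across the band: ~{n_elements}")        # ~65k

    # The Earth-orbit Doppler correction alone spans several elements,
    # and it is knowledge external to the received signal.
    shift = doppler_shift(lam, V_ORBIT)
    print(f"Orbital Doppler shift at {lam:.0f} A: {shift:.2f} A "
          f"(~{shift / d_lam:.0f} resolution elements)")
[/code]

Run it and you'll see the Doppler correction alone moves a line by roughly eight resolution elements, which is the whole point about needing external knowledge to read the "fingerprint".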

OK, so transmitting a single flash of colour and interpreting it doesn't have to be done to the 200th of a nanometre; you could use a coarser spectrum. But existing binary laser-diode/PIN-detector technology can *already* put 40 gigabits/s down the 1550 nm transmission sweet spot of fused silica in a "single" colour. Wavelength-division multiplexing (effectively what you're talking about: multiple colours simultaneously) can usually manage 20-ish simultaneous channels within that sweet spot in the lab, though the telco people talk about ~300 channels. The problem is signal to noise: if you want to split a pulse of light into multiple colour bins with an optical system that sends photons of different energies to spatially different detectors (e.g. pixels on a camera sensor), you're trying to detect a very small energy spike above each detector's background level, and your error rate skyrockets. So you can't use the same 0.05 ns photodiode rise time that, for brighter pulses, lets you detect your laser's maximum reliable rep rate (or the absence of a pulse within a given data frame).
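A toy sketch of that trade-off, if it helps. The per-channel rate and channel counts are the ones above; the pulse energy and the shot-noise-only detector model are my simplifications, so treat the numbers as illustrative rather than any real system:

[code]
import math

PHOTON_ENERGY_1550NM = 1.28e-19  # joules, h*c/lambda at 1550 nm

def aggregate_rate_gbps(per_channel_gbps, channels):
    """Ideal aggregate WDM capacity: channels simply multiply the line rate."""
    return per_channel_gbps * channels

def shot_noise_snr(pulse_energy_j, n_bins):
    """Shot-noise-limited SNR per detector when one pulse is split across
    n_bins colour bins: SNR ~ sqrt(photons landing on each detector)."""
    photons_per_bin = pulse_energy_j / PHOTON_ENERGY_1550NM / n_bins
    return math.sqrt(photons_per_bin)

if __name__ == "__main__":
    print(f"20-channel lab system:   {aggregate_rate_gbps(40, 20):.0f} Gbit/s")
    print(f"~300-channel telco talk: {aggregate_rate_gbps(40, 300) / 1000:.1f} Tbit/s")

    pulse = 1e-15  # joules per pulse, an assumed figure (~8000 photons)
    for bins in (1, 20, 300):
        print(f"pulse split into {bins:>3} colour bins -> SNR ~ "
              f"{shot_noise_snr(pulse, bins):.0f}")
[/code]

Even in this idealised model the SNR drops as the square root of the number of bins you split the pulse into, which is why the error rate climbs and the fast rise time stops being usable.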

There is promising research in photonic circuits, which might be related to what Tony describes. In such devices, direct photonic (and sometimes photon-mediated, e.g. optoacoustic or surface-physics) interactions are used to "gate" a signal, so you get, or don't get, light out of a particular waveguide depending on what's going into another one that is not the actual source of that output. These interactions obviously have a wavelength dependence (e.g. you can gate green with green, but not with red), and the waveguide geometry on the chip can be used to sort out or combine particular wavelengths as desired. The hope is that electronics-style miniaturisation and manufacturing improvements will lead to a several-orders-of-magnitude gain in processing capacity (above all with very little waste heat, the perennial bugbear of electronics). I'm talking about gates, which implies "still binary" (albeit multichannel), but theoretically the door is open to "less binary" computation... yet even the binary processing, even over very tightly controlled and short (of order 1 cm) transmission distances, is in its infancy.
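If the "light gates light" idea is hard to picture, here's a purely illustrative toy model of the behaviour: output appears on a waveguide only if a control beam of roughly the right colour and enough power is present. The wavelengths, tolerance and threshold are invented numbers, and real devices get this behaviour from nonlinear or optoacoustic interactions rather than an if-statement, so this is only a picture of the logic, not the physics:

[code]
from dataclasses import dataclass

@dataclass
class ControlBeam:
    wavelength_nm: float
    power_mw: float

@dataclass
class PhotonicGate:
    control_wavelength_nm: float    # only this colour can actuate the gate
    wavelength_tolerance_nm: float  # how far off-colour the control may be
    threshold_mw: float             # minimum control power to switch

    def transmits(self, signal_power_mw, control):
        """Signal power passed to the output waveguide (0 if the gate stays shut)."""
        on_colour = (abs(control.wavelength_nm - self.control_wavelength_nm)
                     <= self.wavelength_tolerance_nm)
        strong_enough = control.power_mw >= self.threshold_mw
        return signal_power_mw if (on_colour and strong_enough) else 0.0

if __name__ == "__main__":
    gate = PhotonicGate(control_wavelength_nm=532.0,   # "gate green with green"
                        wavelength_tolerance_nm=5.0,
                        threshold_mw=10.0)
    green = ControlBeam(wavelength_nm=532.0, power_mw=15.0)
    red = ControlBeam(wavelength_nm=650.0, power_mw=15.0)
    print(gate.transmits(1.0, green))  # 1.0 -> gate opens
    print(gate.transmits(1.0, red))    # 0.0 -> wrong colour, stays shut
[/code]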