Mathematically, I assume "sigma" here actually refers to a standard deviation (the symbol σ is conventionally used for the standard deviation). Since a standard deviation corresponds to a confidence level for a particular population (or to the certainty that a sample falls within a given interval), I assumed that "1-sigma" is equivalent to stating a confidence level of 1 − 0.341 = 0.659, i.e. a 65.9% confidence that the shadow will pass between the two red lines (0.341 being the fraction of a normal distribution lying between the mean and one standard deviation).
That doesn't sound correct to me, though, since an interval of ±1 standard deviation around the mean actually corresponds to about a 68.2% confidence level... can anyone clarify this?
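For reference, here is a minimal sketch of the numbers I'm comparing (this assumes a standard normal distribution; SciPy is just my choice of tool and isn't part of the original setup):

```python
# Quick numerical check of the 1-sigma figures for a standard normal
# distribution (SciPy used here purely for illustration).
from scipy.stats import norm

# Area between the mean and +1 standard deviation (one side only).
one_sided = norm.cdf(1) - norm.cdf(0)
print(f"mean to +1 sigma:     {one_sided:.3f}")    # ~0.341

# Area within +/-1 standard deviation of the mean (the usual 1-sigma band).
two_sided = norm.cdf(1) - norm.cdf(-1)
print(f"-1 sigma to +1 sigma: {two_sided:.3f}")    # ~0.683

# The figure I computed above, for comparison.
print(f"1 - 0.341 =           {1 - one_sided:.3f}")  # ~0.659
```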