#8  21-06-2010, 11:14 PM
OICURMT
Mathematically, I assume "sigma" refers to the standard deviation (the symbol σ is conventionally used for it). Since the standard deviation is related to a confidence level for a population (or the certainty that a sample falls within a given confidence interval), I assume that 1-sigma is equivalent to stating a confidence level of 1 - 0.341 = 0.659, i.e. a 65.9% confidence that the shadow will pass between the two red lines.

This doesn't sound correct to me, though, since one standard deviation should actually correspond to a 68.2% confidence level. Can anyone clarify this?
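For reference, the two figures can be checked numerically (a minimal sketch, assuming a normal distribution; the function name is mine): 0.341 is the one-sided area between the mean and +1σ, while the usual "1-sigma" band is two-sided, covering twice that.

```python
import math

def within_k_sigma(k):
    """Probability that a normally distributed value lies within
    k standard deviations of the mean (two-sided interval)."""
    return math.erf(k / math.sqrt(2))

# One-sided area between the mean and +1 sigma:
print(f"mean to +1 sigma:  {within_k_sigma(1) / 2:.4f}")  # 0.3413

# Two-sided +/-1 sigma coverage:
print(f"within +/-1 sigma: {within_k_sigma(1):.4f}")      # 0.6827
```

So subtracting the one-sided 0.341 from 1 mixes the two conventions; doubling it gives the familiar 68.3%.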