Quote:
Originally Posted by bojan
Accuracy and resolution are two different beasts.
Accuracy means how far away you are from the REAL value (compared to an etalon or standard).
There is a third parameter which should be listed but is not, and that is repeatability of the reading, because people mix it up with accuracy.
Basically, if repeatability is good (that is, you always get the same reading whenever you measure the same thing), the instrument can be calibrated if you have a standard; then accuracy becomes the same as resolution.
Thanks, bojan.
I am familiar with precision versus accuracy, which is what you refer to as the "third parameter", I think.
However, I guess I had this thought experiment in mind:
reading = 10.001
but the accuracy spec says the true value could be anywhere from 9.996 to 10.006. Therefore, a resolution of 0.001 is not really helpful. In this case I am equating accuracy with error, which may not be strictly correct.
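To convince myself, I threw together a rough Python sketch of that scenario (all numbers are invented for illustration): a meter with 0.001 resolution, an unknown offset inside a ±0.005 accuracy spec, tiny noise (i.e. good repeatability), and then a one-point calibration against a known standard, which I think is what bojan is describing.

# Rough simulation of the thought experiment. Invented numbers:
# 0.001 resolution, a fixed +0.004 systematic offset (inside a
# +/-0.005 accuracy spec), and tiny random noise (good repeatability).
import random

TRUE_VALUE = 10.000   # what the standard/etalon actually is
OFFSET     = 0.004    # unknown systematic error, within the +/-0.005 spec
NOISE_SD   = 0.0003   # repeatability: spread of repeated readings
RESOLUTION = 0.001    # smallest step the display can show

def read_meter(true_value):
    # one reading: true value + offset + noise, quantized to the 0.001 display resolution
    raw = true_value + OFFSET + random.gauss(0.0, NOISE_SD)
    return round(raw, 3)

random.seed(1)
readings = [read_meter(TRUE_VALUE) for _ in range(10)]
print("raw readings:", readings)         # cluster tightly near 10.004, not 10.000

# one-point calibration: measure the known standard and use the average
# error as a correction for subsequent readings
correction = TRUE_VALUE - sum(readings) / len(readings)
calibrated = [round(r + correction, 3) for r in readings]
print("after calibration:", calibrated)  # now centred on 10.000

Uncalibrated, the last digit is meaningless because the whole cluster sits about 0.004 away from the true value; after the correction, the 0.001 resolution actually tells you something, which seems to be exactly bojan's point about repeatability plus a standard.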