Precision and accuracy

Kurt Roeckx kurt at roeckx.be
Sat Jan 21 10:25:44 UTC 2017


On Fri, Jan 20, 2017 at 05:39:37PM -0800, Gary E. Miller wrote:
> Yo Kurt!
> 
> On Sat, 21 Jan 2017 01:23:47 +0100
> Kurt Roeckx <kurt at roeckx.be> wrote:
> 
> > On Fri, Jan 20, 2017 at 12:44:38PM -0800, Gary E. Miller wrote:
> > > Imagine in front of you are two handheld voltmeters, and a super
> > > precision voltage source.  
> > 
> > So let me correct all those terms to what people normally use them
> > for.
> 
> You use your terms, I'll use mine.  As long as we understand each other.
> 
> You left out the part where I started with:
> 
>     "Many will disagree with my terms, for those that do, just think
>     of the names I use as random variable names and try to follow the
>     concept."
> 
> I really did try to find some online definitions of these that are
> widely agreed on, and failed.  Everyone has their 'one true' dictionary,
> but they can never seem to find it.  If you can find one I'd like to see
> it.

https://en.wikipedia.org/wiki/Accuracy_and_precision

There are many places that describe it like that, and if you
really want I can find more references.

> > Suppose you have a reference voltage that has a voltage of
> > exactly 1.000000000 V. And you measure that voltage with your
> > voltmeter, and it shows you 0.995, 0.994, 0.993, 0.994, 0.992,
> > 0.994, 0.995, and so on. Let's assume the average value is 0.994
> > V, that it's a normal distribution and that the standard deviation
> > is 1 mV. People could say that that 1 mV is the precision, and 6
> > mV is the accuracy.
> 
> Ah, but now you have added in jitter.

I don't call that jitter. Jitter is only about variance in the
time base, which is why I also said that for time, jitter and
precision are closely related.
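To make the distinction concrete, here is a quick sketch in Python
using made-up figures matching the example above: readings averaging
0.994 V against a 1.000 V reference, with a 1 mV standard deviation:

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 1.000  # volts, from the reference source
OFFSET = -0.006     # systematic error: the meter reads around 0.994 V
SIGMA = 0.001       # 1 mV spread between individual readings

# Simulate repeated readings from the hypothetical meter.
readings = [TRUE_VALUE + OFFSET + random.gauss(0, SIGMA)
            for _ in range(10000)]

mean = statistics.fmean(readings)
stdev = statistics.stdev(readings)

# Precision: spread of the readings around their own mean (~1 mV).
# Systematic offset: distance of the mean from the true value (~6 mV).
print(f"precision (std dev):   {stdev * 1000:.2f} mV")
print(f"offset from reference: {abs(mean - TRUE_VALUE) * 1000:.2f} mV")
```

The point being that averaging more readings shrinks the random part,
but does nothing about the 6 mV offset.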

> I said nothing about jitter,
> which is not really a thing in real DVMs.  I have tested a lot of
> NoName DVMs and expensive DVMs and never seen them jitter.

I've seen it in many. For instance, my Fluke ScopeMeter in meter
mode will jump between at least 5 different values. (It also has
crappy specifications.)

> > Suppose you take a different voltmeter, and that one always shows
> > 1.002 V. People might claim the precision is 0, but a better
> > number would be half the resolution, so 0.5 mV. An even better
> > number could be 0.43 mV. That's sqrt(3) / 2 * half the resolution. It
> > has at least a rectangular distribution, and I think that's the
> > correct formula to match the coverage of 1 standard deviation.
> 
> Which, since standard deviation is useful only when you are already
> lost, and only for one particular type of noise, is not useful here.
> 
> Time and time again I have heard: you only use standard deviation
> when you are already lost.  I 100% agree with that.

This is really about having the same coverage as the standard
deviation so that you can add the errors.
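For reference, the usual GUM convention is to convert a rectangular
distribution of half-width a to a standard uncertainty of a/sqrt(3),
so that independent contributions are at the same coverage and can be
added in quadrature. A quick sketch, with made-up figures:

```python
import math

def rectangular_standard_uncertainty(half_width):
    """Standard uncertainty of a rectangular (uniform) distribution
    of half-width a is a / sqrt(3) (GUM convention)."""
    return half_width / math.sqrt(3)

def combine(*standard_uncertainties):
    """Independent standard uncertainties add in quadrature."""
    return math.sqrt(sum(u * u for u in standard_uncertainties))

# Hypothetical contributions: a 0.5 mV quantization half-width
# (half the 1 mV resolution) plus a 1.0 mV reading-noise std dev.
u_quant = rectangular_standard_uncertainty(0.5e-3)  # ~0.29 mV
u_noise = 1.0e-3
u_total = combine(u_quant, u_noise)
print(f"quantization: {u_quant * 1e3:.2f} mV, "
      f"combined: {u_total * 1e3:.2f} mV")
```

Once everything is expressed as a standard uncertainty, the
root-sum-of-squares combination is what "adding the errors" means.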

> > If you go and look at the calibration sheet of your reference
> > voltage, or one of those 8 digit volt meters, you're not going to
> > find a mention of the accuracy and precision. Instead they're
> > going to talk about the uncertainty, and say your reference voltage
> > gives a voltage of 1.000000001 +/- 3 nV, where the 3 nV is an
> > expanded uncertainty, maybe that it's a normal distribution,
> > and that it has a k factor of 2 to have a 95% coverage.
> 
> We must be buying from a different class of vendors.  I have only
> rarely seen normal or standard deviations in the real world.

At least in Europe, you should always find that on the calibration
information for a device that traces back to one of the standards
bodies. I doubt it's much different for devices that trace back to
NIST.
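As a sketch of how to read such a sheet (figures made up): divide the
expanded uncertainty by the coverage factor k to recover the standard
uncertainty; for a normal distribution, k = 2 corresponds to roughly
95% coverage:

```python
import math

U = 3e-9   # expanded uncertainty quoted on the sheet, in volts
k = 2      # coverage factor
u = U / k  # standard uncertainty: 1.5 nV

# For a normal distribution, coverage factor k gives a coverage
# probability of erf(k / sqrt(2)); k = 2 is about 95.4%.
coverage = math.erf(k / math.sqrt(2))
print(f"standard uncertainty: {u * 1e9:.1f} nV")
print(f"coverage at k={k}: {coverage:.1%}")
```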

> > > Now plug in the Fluke meter.  I feed in the same 1.000000000 Volts
> > > and I read on the meter 1.000 Volts.  How accurate, NIST traceable,
> > > would you say the Fluke meter is?  You guess 0.1%?  Maybe, maybe
> > > not.
> > > 
> > > Now I set the calibrator to 1.000490000 and the meter still reads
> > > 1.000.  I set the calibrator to 1.000500000 and the meter changes to
> > > read 1.001.  How accurate is the meter?  I say the meter is accurate
> > > to 0.001%  
> > 
> > That's some very nice Fluke meter you got there. The best I could
> > find claims an accuracy of 0.05% (I assume of reading, but could
> > also be of range) + 1 count, which would be at least 1.5 mV. All
> > the others are worse with most doing 0.5% + 2 counts.
> 
> Yes, John Fluke Mfg. Co., Inc. made damn fine stuff.

And I'm saying that the best you can buy now from Fluke in that
range of devices is 0.05% + 1 count. And that 1 count is just as
important.
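Working through such a "% of reading + counts" spec with the figures
above (0.05% of reading + 1 count, 1 mV resolution, reading 1 V):

```python
def spec_error(reading, pct_of_reading, counts, resolution):
    """Worst-case error for a '% of reading + counts' meter spec."""
    return reading * pct_of_reading / 100.0 + counts * resolution

# 0.05% + 1 count on a meter with 1 mV resolution, reading 1.000 V:
# 0.5 mV from the percentage term plus 1 mV from the count term.
err = spec_error(1.000, 0.05, 1, 0.001)
print(f"worst-case error: {err * 1000:.1f} mV")  # 1.5 mV
```

Which is where the "at least 1.5 mV" figure earlier in the thread
comes from: at a 1 mV resolution the single count already dominates
the percentage term.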


Kurt
