Resolution, precision, accuracy, repeatability


This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.


Accuracy is how far measurements are from their true value (which is sometimes a standard value).

How you know that true value itself to more accuracy is a good question, though it mostly amounts to more work.

Two related flavors:
* Relative accuracy is how close a measured value is to a standard value in relative terms, in other words independent of scale and translation.
* Absolute accuracy is how close a measured value is to a known absolute true value, usually given in known and agreed-on units such as meters, cm, mm, inches, or feet.
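A quick Python sketch of that distinction (the numbers are invented): a sensor with a consistent offset has poor absolute accuracy, but good relative accuracy once you allow removing scale and translation.

<syntaxhighlight lang="python">
# Sketch: relative vs. absolute accuracy (illustrative numbers only).
import statistics

reference = [0.0, 10.0, 20.0, 30.0, 40.0]  # trusted true values
measured = [2.1, 11.9, 21.8, 31.7, 41.6]   # same points, read ~1.6-2.1 high

# Absolute accuracy: compare directly, in real units.
abs_err = max(abs(m - r) for m, r in zip(measured, reference))
print(f"worst absolute error: {abs_err:.2f}")  # 2.10

# Relative accuracy: remove the best-fit translation (and, in general,
# scale) first, then see how much disagreement is left.
offset = statistics.mean(m - r for m, r in zip(measured, reference))
rel_err = max(abs(m - offset - r) for m, r in zip(measured, reference))
print(f"after removing {offset:+.2f} offset: worst residual {rel_err:.2f}")  # 0.28
</syntaxhighlight>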


Precision is largely about how consistent measurements are.

Measurements can be very precise but not at all accurate, e.g. a precision instrument that is consistently off to one side due to bad calibration.
It may then be that calibration is the only thing keeping it from being a lot more accurate, or things may be a lot messier and more complex.
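To make that concrete, a minimal sketch (all values invented) simulating a precise-but-inaccurate instrument next to an accurate-but-imprecise one. Bias captures the (in)accuracy, spread the (im)precision:

<syntaxhighlight lang="python">
# Sketch: precision vs. accuracy on simulated instruments.
import random
import statistics

TRUE_VALUE = 20.0  # pretend we know the true value exactly

def measure(bias, noise_sd, n=1000):
    """Simulate n readings from an instrument with a fixed calibration
    offset (bias) and Gaussian noise (noise_sd)."""
    return [TRUE_VALUE + bias + random.gauss(0, noise_sd) for _ in range(n)]

instruments = {
    # tight spread, consistently off to one side: precise, not accurate
    "precise but inaccurate": measure(bias=1.5, noise_sd=0.05),
    # centered on the true value, but noisy: accurate, not precise
    "accurate but imprecise": measure(bias=0.0, noise_sd=1.0),
}

for name, readings in instruments.items():
    bias = statistics.mean(readings) - TRUE_VALUE
    spread = statistics.stdev(readings)
    print(f"{name}: bias {bias:+.3f}, spread (1 sigma) {spread:.3f}")
</syntaxhighlight>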


Resolution is (usually) the units in which a value is reported (or controlled).

Resolution often hints at the order of accuracy and/or precision, but this can be misleading;
high resolution is also a great way to hint at more accuracy and/or precision than you really have.
E.g. does your 4-digit multimeter always show accurate digits? How would you tell?
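Both directions of that trap fit in a few lines, assuming a hypothetical readout with 1°C resolution: quantization hides real variation, and printing extra digits implies knowledge the instrument doesn't have.

<syntaxhighlight lang="python">
# Sketch: resolution vs. what the instrument actually knows.

# What is physically there, versus what a 1-degree-resolution sensor reports:
underlying = [19.3, 19.4, 19.6, 19.8, 20.1, 20.2]
reported = [round(t) for t in underlying]
print(reported)  # [19, 19, 20, 20, 20, 20] -- the real drift is mostly hidden

# The reverse trap: formatting a whole-degree reading with extra digits
# implies millidegree knowledge that was never there.
reading = 20
print(f"{reading:.3f}")  # "20.000"
</syntaxhighlight>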


Repeatability asks: when you later return to the same true value, how stable is your measurement of it?

This is much like precision, but focuses more on the tool or the machine than on the measurement.
If repeatability is contrasted with reproducibility, then
repeatability is often "can one person on one instrument get the same measurement again", and
reproducibility is often "if you have different operators and/or different instruments, do you get the same measurement again?"
Resolution and repeatability can also be about how well you can control something, which also makes things more complex because both the control and the measurement of the result may each have their own issues.
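For the repeatability/reproducibility contrast, a sketch (numbers invented) that treats repeatability as one instrument's spread over repeated readings of the same stable reference, and reproducibility as the spread between instruments:

<syntaxhighlight lang="python">
# Sketch: repeatability (within one instrument) vs. reproducibility
# (between instruments), on invented readings of one stable reference.
import statistics

readings = {
    "sensor A": [20.1, 20.1, 20.2, 20.1, 20.0],
    "sensor B": [20.5, 20.6, 20.5, 20.5, 20.6],
    "sensor C": [19.8, 19.9, 19.8, 19.8, 19.9],
}

for name, vals in readings.items():
    print(f"{name}: repeatability ~{statistics.stdev(vals):.2f} (1 sigma)")

means = [statistics.mean(v) for v in readings.values()]
print(f"between-sensor spread ~{statistics.stdev(means):.2f}")
# Each sensor repeats to ~0.05-0.1, yet they disagree by several times
# that, much like the temperature sensor notes further down.
</syntaxhighlight>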






If my multimeter shows me 1.153 volts, it makes it easy to assume its accuracy is three digits' worth (around 0.1%).

But generally, multimeters shouldn't be assumed to be better than 1%, both because more extreme values (e.g. many-megaohm resistances) are often harder to measure, and because cheap ones don't care as much about calibration. The cheap + extreme-value combination may be more like 5%. And those figures are ''not'' very easy to find. Note that even 1% of 1.153 V is about ±0.012 V, which already swallows that last displayed digit.


I have a cheap temperature sensor:
: It reports whole-degree (1°C) resolution.
: Its datasheet suggests 1°C repeatability,
: but the specs say don't assume better than ±2°C accuracy.
: Assuming that's the correct use of accuracy, it may have 1°C precision, but I wouldn't assume the same amount of repeatability.
: Adding your own calibration might mean you can get 1°C accuracy; see the sketch below.

I have a slightly less cheap temperature sensor (DS18B20):
* repeatability of a single sensor to ''maybe'' 0.2°
* differences between sensors can be 0.4° (I'm fairly sure these are fakes)

[[Humidity sensing]] is similarly interesting.
Cheap and good sensors tend not to claim better than ±5%RH, but some perform better,
in ways you need the above terms (and more) to describe.
https://www.kandrsmith.org/RJS/Misc/Hygrometers/calib_many.html
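As a sketch of what "adding your own calibration" can look like: a plain two-point linear correction against a trusted reference. The reference points are invented, and nothing here is specific to any particular sensor.

<syntaxhighlight lang="python">
# Sketch: two-point linear calibration against a trusted reference.
# The numbers are invented for illustration.

# (raw sensor reading, trusted reference) at two known points, e.g. an
# ice bath and a warm bath measured with a better thermometer:
low_raw, low_ref = 1.8, 0.0      # sensor says 1.8 in an ice bath (0.0)
high_raw, high_ref = 38.6, 37.0  # sensor says 38.6 at a 37.0 reference

# Solve reference = gain * raw + offset through the two points.
gain = (high_ref - low_ref) / (high_raw - low_raw)
offset = low_ref - gain * low_raw

def calibrated(raw):
    """Apply the linear correction to a raw sensor reading."""
    return gain * raw + offset

print(f"gain {gain:.4f}, offset {offset:+.2f}")
print(f"raw 21.0 -> {calibrated(21.0):.1f}")
# This only removes consistent (systematic) error; it can't fix noise,
# so precision and repeatability set the floor on what calibration buys.
</syntaxhighlight>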




Showing and/or sorta-implying precision in numbers

Does averaging give you more digits?
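The usual answer, sketched below with invented numbers (the framing is mine, not spelled out in these notes): averaging N samples shrinks random noise by roughly √N, so it can buy extra digits when readings are noise-limited, but not when they are quantization-limited (nothing left to average over) or bias-limited.

<syntaxhighlight lang="python">
# Sketch: when averaging buys you digits, and when it doesn't.
import random
import statistics

TRUE = 20.37  # true value, finer than our whole-unit resolution

def samples(noise_sd, n=10000):
    """n whole-unit-quantized readings of TRUE with Gaussian noise."""
    return [round(TRUE + random.gauss(0, noise_sd)) for _ in range(n)]

# Noise spanning several codes acts as dither, so the average recovers
# sub-resolution information:
print(f"noisy, averaged: {statistics.mean(samples(noise_sd=1.0)):.2f}")   # ~20.37

# With (almost) no noise, every sample quantizes to the same 20, and
# averaging ten thousand copies of 20 is still exactly 20:
print(f"quiet, averaged: {statistics.mean(samples(noise_sd=0.001)):.2f}") # 20.00

# And averaging never fixes a calibration offset: a biased instrument
# just averages to a very precise wrong number.
</syntaxhighlight>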