Resolution, precision, accuracy, repeatability
'''Accuracy''' is how far measurements are from their true value
: how you know that true value ''itself'' to more accuracy is a good question, though mostly amounts to more work until you figure it's good enough
'''Precision''' is largely about how consistent measurements are
: measurements can be very precise but not at all accurate
:: e.g. a precision instrument that is not calibrated may be consistently within a very narrow margin (so seemingly precise to that small amount -- but inaccurate to the true value)
:: it may then be that calibration is the only thing keeping it from being a lot more accurate -- OR things may be a lot messier and more complex
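One way to make that distinction concrete is a quick simulation (a sketch, not from this article: the sensors, offsets and noise levels below are invented): a miscalibrated-but-consistent instrument next to a calibrated-but-noisy one, each summarized by its bias (accuracy) and its spread (precision).

```python
import random
import statistics

random.seed(42)
TRUE_VALUE = 20.0  # hypothetical true temperature, degrees C

# Precise but inaccurate: tiny noise, constant calibration offset
precise_biased = [TRUE_VALUE + 1.5 + random.gauss(0, 0.02) for _ in range(1000)]

# Accurate but imprecise: no systematic offset, lots of noise
accurate_noisy = [TRUE_VALUE + random.gauss(0, 0.8) for _ in range(1000)]

def summarize(readings):
    bias = statistics.mean(readings) - TRUE_VALUE   # accuracy: distance from true value
    spread = statistics.stdev(readings)             # precision: consistency of readings
    return bias, spread

print(summarize(precise_biased))   # large bias, tiny spread
print(summarize(accurate_noisy))   # near-zero bias, large spread
```

The first sensor would look great on a stability test and still be wrong by 1.5℃ every single time -- which is the "precise but not at all accurate" case above.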
<!-- | |||
Accuracy is sometimes split into relative accuracy and absolute accuracy. Not all definitions are the same here, but usually:
: relative accuracy is about whether comparisons are accurate to a given reference
: absolute accuracy is about whether that reference matches reality.
For example, consider having a map of houses, and a map of roads. | |||
: Both can have high relative accuracy, in that the distances between objects are the distances in reality. | |||
: But they may not be georeferenced with the same assumptions, meaning they may be a few meters off of reality -- and thereby also probably a few meters offset from each other | |||
:: (and not necessarily off by a constant. You may need to creatively rubber-sheet things to get them to match well ''enough'') | |||
In other contexts, the same ideas lead to different details of importance, and different summaries. | |||
For example, measurement instrument specs might define:
: Relative accuracy: inherent instrument accuracy relative to calibration standards. It includes stability, temperature coefficient, linearity, repeatability, and calibration interpolation error.
: Absolute accuracy: the sum of the relative accuracy and the uncertainty of the calibration standards.
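A numeric sketch of the map example (all coordinates and offsets invented): each layer preserves distances between its own objects, yet every point sits meters away from ground truth because of the layer's own georeferencing offset.

```python
import math

# Hypothetical true positions of two houses (meters, ground-truth frame)
true_a = (100.0, 200.0)
true_b = (160.0, 280.0)

# The house layer is internally consistent, but georeferenced
# with its own (made-up) constant offset from reality
house_layer_offset = (3.0, -2.0)

def shift(p, off):
    return (p[0] + off[0], p[1] + off[1])

def dist(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

map_a = shift(true_a, house_layer_offset)
map_b = shift(true_b, house_layer_offset)

# Relative accuracy: distances within the layer match reality exactly
print(dist(map_a, map_b), dist(true_a, true_b))   # 100.0 and 100.0

# Absolute accuracy: each mapped point is a few meters off reality
print(dist(map_a, true_a))   # ~3.6 m
```

A constant offset keeps this simple; the rubber-sheeting remark above is exactly what you need when the offset is ''not'' constant.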
--> | |||
: It reports whole-degree (1℃) resolution
: the datasheet suggests 1℃ repeatability
: but the specs say don't assume better than ±2℃ accuracy.
: Assuming that's the correct use of accuracy, it may have 1℃ precision, but I wouldn't assume the same amount of repeatability.
Adding your own calibration might mean you can get 1℃ accuracy.
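What "your own calibration" could look like, as a minimal sketch: a two-point linear correction against a trusted reference. The calibration points and readings below are made up; real procedures use more points and check linearity.

```python
# Two-point linear calibration: map raw sensor readings onto a reference
# scale using two known points. All numbers here are hypothetical.

# (raw reading, reference reading) at two calibration points
low_raw, low_ref = 2.0, 0.0       # e.g. an ice bath
high_raw, high_ref = 99.0, 100.0  # e.g. boiling water at sea level

gain = (high_ref - low_ref) / (high_raw - low_raw)
offset = low_ref - gain * low_raw

def calibrated(raw):
    """Correct a raw reading for the sensor's gain and offset error."""
    return gain * raw + offset

print(calibrated(2.0))    # 0.0
print(calibrated(99.0))   # 100.0
print(calibrated(50.0))   # somewhere near, but not exactly, 50
```

Note what this does and doesn't buy you: it removes systematic gain/offset error (accuracy), but it cannot add repeatability the sensor doesn't have -- a point the section above is making.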
I have a slightly less cheap temperature sensor (DS18B20) | I have a slightly less cheap temperature sensor (DS18B20) | ||
[[Humidity sensing]] is similarly interesting.
Cheap and good sensors tend not to claim better than ±5%RH, but some perform better,
in ways you need the above terms (and more) to describe.
https://www.kandrsmith.org/RJS/Misc/Hygrometers/calib_many.html | |||
--> | |||
<!-- | |||
If my multimeter shows me 1.153 volts, it makes it easy to assume its accuracy is three digits' worth (around 0.1%).

But generally, multimeters shouldn't be assumed to be better than 1%,
both because more extreme values (e.g. many-megaohm) are often harder to measure,
and because cheap ones don't care as much about calibration.
The cheap + extreme-value combination may be more like 5%.

And those figures are ''not'' very easy to find.
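When those figures ''are'' published, they're commonly quoted as "±(percent of reading + some number of least-significant counts)". A sketch of that arithmetic, with a made-up spec, shows why the displayed digits overstate the accuracy:

```python
# Worst-case error for a multimeter spec quoted in the common
# "±(percent of reading + counts)" form. The spec values are invented.

def error_bound(reading, percent, counts, resolution):
    """Worst-case absolute error for a ±(percent% + counts) spec."""
    return (percent / 100.0) * reading + counts * resolution

# A 3.5-digit meter on a 2 V range: 0.001 V resolution,
# hypothetical ±(1% of reading + 2 counts) spec
reading = 1.153
err = error_bound(reading, percent=1.0, counts=2, resolution=0.001)
print(err)                      # about 0.0135 V
print(100 * err / reading)      # about 1.2% -- not the 0.1% the digits suggest
```

The displayed resolution (three decimal digits) is a property of the readout; the spec is what bounds the accuracy.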
Also remember that it is easier to be more precise while being no more accurate.
A statistical description can be seen as an estimator, and you may get better precision (consistent measurements), but not necessarily better accuracy (closeness to a true value).
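A quick sketch of that estimator point (all numbers invented): averaging repeated readings tightens the spread of the estimate, but a constant calibration bias passes straight through the average.

```python
import random
import statistics

random.seed(7)
TRUE_VALUE = 10.0
BIAS = 0.5    # hypothetical constant calibration error
NOISE = 0.4   # per-reading noise (standard deviation)

def estimate(n):
    """Mean of n noisy, biased readings."""
    return statistics.mean(TRUE_VALUE + BIAS + random.gauss(0, NOISE)
                           for _ in range(n))

# Repeat the whole experiment many times to see the estimator's own spread
singles = [estimate(1) for _ in range(500)]
averaged = [estimate(100) for _ in range(500)]

print(statistics.stdev(singles))               # ~0.4: the raw noise
print(statistics.stdev(averaged))              # ~0.04: much more precise
print(statistics.mean(averaged) - TRUE_VALUE)  # ~0.5: bias doesn't average away
```

More samples buy precision roughly as 1/sqrt(n); they buy no accuracy at all.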
Latest revision as of 15:54, 29 March 2024
✎ This article/section is a stub — some half-sorted notes, not necessarily checked, not necessarily correct. Feel free to ignore, or tell me about it.
'''Resolution''' is (usually) the units in which it is reported (or controlled)
: resolution often hints at the order of accuracy and/or precision, but this can be misleading
: yet high resolution is also a great way to hint at more accuracy and/or precision than you really have
:: e.g. does your 4-digit multimeter always show accurate digits? How would you tell?
'''Repeatability''' asks, when you later return to the same true value, how stable your measurement of it is
: this is much like precision, but focuses more on the tool or the machine than on the measurement.
: If repeatability is contrasted with '''reproducibility''', then
:: repeatability is often "can one person on one instrument get the same measurement again", and
:: reproducibility is often "if you have different operators, and/or different instruments, do they get the same measurement?"
: Resolution and repeatability are also words used when asking how well you can ''actuate/control'' something - which also makes things more complex, because both the control and the measurement of the result may each have their own precision/accuracy details.
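The repeatability/reproducibility split above can be sketched as a toy gauge study (all numbers invented): the same true value measured many times on one instrument, versus pooled across several instruments that each carry their own small offset.

```python
import random
import statistics

random.seed(1)
TRUE_VALUE = 50.0
WITHIN_NOISE = 0.1                       # noise of one instrument re-measuring
INSTRUMENT_OFFSETS = [0.0, 0.3, -0.25]   # hypothetical per-instrument biases

def measure(offset):
    return TRUE_VALUE + offset + random.gauss(0, WITHIN_NOISE)

# Repeatability: one instrument, many repeats -> only within-instrument noise
repeats = [measure(INSTRUMENT_OFFSETS[0]) for _ in range(200)]
print(statistics.stdev(repeats))    # ~0.1

# Reproducibility: pool across instruments -> the offsets widen the spread
pooled = [measure(off) for off in INSTRUMENT_OFFSETS for _ in range(200)]
print(statistics.stdev(pooled))     # noticeably larger than 0.1
```

Real gauge R&R studies separate these variance components formally (operators, parts, instruments); this only shows why the pooled spread is the larger number.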