Fix precision is not calculated from the rounded uncertainty (#2427)
* Calculate the precision from the rounded uncertainty

* Added doctest

* Make the doc-test more real-life

* Changelog
xispa authored Nov 19, 2023
1 parent 717e88b commit 3d07a87
Showing 3 changed files with 37 additions and 1 deletion.
1 change: 1 addition & 0 deletions CHANGES.rst
@@ -4,6 +4,7 @@ Changelog
2.5.0 (unreleased)
------------------

- #2427 Fix precision is not calculated from the rounded uncertainty
- #2426 Fix ±0 is displayed for results within a range without uncertainty set
- #2424 Fix sample in "registered" after creation when user cannot receive
- #2422 Fix Maximum number of Iterations Exceeded when no catalogs set for AT type
18 changes: 17 additions & 1 deletion src/bika/lims/content/abstractanalysis.py
@@ -953,7 +953,7 @@ def getPrecision(self, result=None):
in accordance with the manual uncertainty set.
- If Calculate Precision from Uncertainty is set in Analysis Service,
- calculates the precision in accordance with the uncertainty infered
+ calculates the precision in accordance with the uncertainty inferred
from uncertainties ranges.
- If neither Manual Uncertainty nor Calculate Precision from
@@ -980,7 +980,23 @@ def getPrecision(self, result=None):
strres = str(result)
numdecimals = strres[::-1].find('.')
return numdecimals

uncertainty = api.to_float(uncertainty)
# Get the 'raw' significant digits from uncertainty
sig_digits = get_significant_digits(uncertainty)
# Round the uncertainty to its significant digit.
# Needed because the precision for the result has to be based on
# the *rounded* uncertainty. Note the following for a given
# uncertainty value:
# >>> round(0.09404, 2)
# 0.09
# >>> round(0.09504, 2)
# 0.1
# The precision when the uncertainty is 0.09504 is not 2, but 1
uncertainty = abs(round(uncertainty, sig_digits))
# Return the significant digit to apply
return get_significant_digits(uncertainty)

return self.getField('Precision').get(self)

@security.public
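
As a side note, here is a minimal standalone sketch of the two-pass calculation added above (not part of the commit). The `significant_digit_position` function is a hypothetical stand-in for the `get_significant_digits` helper used in abstractanalysis.py, assumed here to return the decimal position of the uncertainty's first significant digit; the point being illustrated is only the order of operations, i.e. rounding the uncertainty before deriving the precision.

    import math

    def significant_digit_position(uncertainty):
        # Hypothetical stand-in for get_significant_digits: returns the
        # decimal position of the first significant digit
        # (0 for uncertainties >= 1).
        if not uncertainty or uncertainty <= 0:
            return 0
        return int(max(0, -math.floor(math.log10(abs(uncertainty)))))

    def precision_from_rounded_uncertainty(uncertainty):
        # Mirrors the patched logic: round the uncertainty to its own
        # significant digit first, then derive the precision from the
        # *rounded* value instead of the raw one.
        sig_digits = significant_digit_position(uncertainty)
        rounded = abs(round(uncertainty, sig_digits))
        return significant_digit_position(rounded)

    print(precision_from_rounded_uncertainty(0.09404))  # 2 (rounds to 0.09)
    print(precision_from_rounded_uncertainty(0.09504))  # 1 (rounds to 0.1)
    print(precision_from_rounded_uncertainty(0.9504))   # 0 (rounds to 1.0)
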
19 changes: 19 additions & 0 deletions src/senaite/core/tests/doctests/Uncertainties.rst
@@ -246,6 +246,25 @@ Check the precision of the range 10-20 (0.4):
>>> fe.getFormattedResult()
'10.3'

Check the precision is calculated based on the rounded uncertainty:

>>> uncertainties_2 = [
... {'intercept_min': '1.0', 'intercept_max': '100000', 'errorvalue': '9.9%'},
... {'intercept_min': '0.5', 'intercept_max': '0.9', 'errorvalue': '0'}
... ]
>>> fe.setUncertainties(uncertainties_2)
>>> fe.setResult("9.6")
>>> fe.getResult()
'9.6'

>>> fe.getUncertainty()
'0.9504'

>>> fe.getPrecision()
0

>>> fe.getFormattedResult()
'10'
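
The arithmetic behind the expected values above, as a quick sketch (assuming the '9.9%' errorvalue is applied as a percentage of the result):

    result = 9.6
    uncertainty = result * 9.9 / 100    # ~0.9504, matches getUncertainty()
    rounded = round(uncertainty, 1)     # 1.0: rounded to its first significant digit
    precision = 0                       # the significant digit of 1.0 sits at position 0
    print("%.*f" % (precision, result)) # prints 10, matches getFormattedResult()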

Test uncertainty for results above/below detection limits
.........................................................
