Implementation of Intensity Clipping Transform #7512
Comments
I'd be open to creating a PR. However, I'd propose adding a soft clipping option as described in this Medium article. The advantage is that it preserves the natural ordering of the intensity values, thus losing less contextual information than the hard-clipped version.
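A minimal NumPy sketch of the softplus-based soft clipping described in the article; the function name and parameters (`minv`, `maxv`, `sharpness`) are illustrative rather than the final MONAI API:

```python
from typing import Optional

import numpy as np


def soft_clip(x: np.ndarray, minv: Optional[float] = None, maxv: Optional[float] = None,
              sharpness: float = 1.0) -> np.ndarray:
    """Smooth, strictly increasing approximation of np.clip(x, minv, maxv)."""
    def softplus(t: np.ndarray) -> np.ndarray:
        # softplus(t) = log(1 + exp(t)), computed stably via logaddexp(0, t)
        return np.logaddexp(0.0, t)

    out = np.asarray(x, dtype=float)
    if minv is not None:
        # tends to minv for x far below minv and to the identity far above it
        out = minv + softplus(sharpness * (out - minv)) / sharpness
    if maxv is not None:
        # subtracting this term bends the curve smoothly towards maxv for large x
        out = out - softplus(sharpness * (out - maxv)) / sharpness
    return out
```

Because the curve stays strictly increasing, voxels keep their relative ordering; a larger `sharpness` brings the result closer to hard clipping.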
Hi @jmnolte, your proposal looks good to me, and we'd be delighted to welcome your contribution. It would indeed be better to offer both soft and hard clipping options for added versatility.
…roaches (#7535)

Fixes Issue #7512.

### Description

Addition of a transform allowing values above or below certain percentiles to be clipped. Clipping can be hard or soft. With soft clipping, the function remains differentiable and the ordering of the values is preserved, with smoother corners. The soft clipping function is based on this Medium article: https://medium.com/life-at-hopper/clip-it-clip-it-good-1f1bf711b291

Note that the bounds are expressed as percentiles and that `None` is accepted on either side, because soft clipping can be one-sided or two-sided: passing percentiles of 0 or 100 changes nothing for hard clipping, but it does for soft clipping since the function is still smoothed at that bound. Passing `None` avoids smoothing the function on that side.

To implement this, a `softplus` function was added to `monai.transforms.utils_pytorch_numpy_unification.py`. One issue is that `np.logaddexp` does not yield exactly the same outputs as `torch.logaddexp`. I've left it as is and slightly relaxed the test tolerance, but it would be possible to force conversion to NumPy and back to torch to ensure tighter unification between the frameworks. I've also added the `soft_clip` function in `monai.transforms.utils.py`, with the associated unit tests to ensure that the transform works properly.

### Types of changes

- [x] Non-breaking change (fix or new feature that would not break existing functionality).
- [ ] Breaking change (fix or new feature that would cause existing functionality to change).
- [x] New tests added to cover the changes.
- [ ] Integration tests passed locally by running `./runtests.sh -f -u --net --coverage`.
- [x] Quick tests passed locally by running `./runtests.sh --quick --unittests --disttests`.
- [x] In-line docstrings updated.
- [x] Documentation updated, tested `make html` command in the `docs/` folder.

---------

Signed-off-by: Lucas Robinet <robinet.lucas@iuct-oncopole.fr>
Co-authored-by: YunLiu <55491388+KumoLiu@users.noreply.github.com>
Co-authored-by: pre-commit-ci[bot] <66853113+pre-commit-ci[bot]@users.noreply.github.com>
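As a rough, simplified illustration of the unification point above (the helper actually added in `monai.transforms.utils_pytorch_numpy_unification.py` may be written differently), a softplus that stays in the input's framework could look like this; the small numerical gap between the two `logaddexp` implementations is what motivated relaxing the test tolerance:

```python
from typing import Union

import numpy as np
import torch


def softplus(x: Union[np.ndarray, torch.Tensor]) -> Union[np.ndarray, torch.Tensor]:
    """log(1 + exp(x)) for floating-point inputs, kept in the input's framework."""
    if isinstance(x, np.ndarray):
        # NumPy path: stable log-sum-exp against zero
        return np.logaddexp(np.zeros_like(x), x)
    # Torch path: same formula, but results can differ from NumPy in the last few bits
    return torch.logaddexp(torch.zeros_like(x), x)
```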
Based on the implementation of the `ScaleIntensityRangePercentiles` transform, we can add a new transform that only does intensity clipping.
MONAI/monai/transforms/intensity/array.py
Line 1205 in e9e2738
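For the hard-clipping side of the request, the core operation is just a percentile lookup followed by a clip. A hypothetical sketch (function and argument names are made up for illustration, not the merged transform's API):

```python
from typing import Optional

import numpy as np


def clip_intensity_percentiles(img: np.ndarray, lower: Optional[float], upper: Optional[float]) -> np.ndarray:
    """Hard-clip img to its [lower, upper] percentile range; None skips that side."""
    a_min = np.percentile(img, lower) if lower is not None else None
    a_max = np.percentile(img, upper) if upper is not None else None
    if a_min is None and a_max is None:
        return img  # no bounds given, nothing to clip
    return np.clip(img, a_min, a_max)


# e.g. clip everything outside the 5th-95th percentile range:
# clipped = clip_intensity_percentiles(img, lower=5, upper=95)
```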