DOC add quantile regression example #91

Merged · 2 commits · Jul 15, 2023
417 changes: 417 additions & 0 deletions docs/examples/quantile_regression.ipynb

Large diffs are not rendered by default.

1 change: 1 addition & 0 deletions mkdocs.yml
@@ -35,6 +35,7 @@ nav:
- About: index.md
- Examples:
- Regression on Workers' Compensation: examples/regression_on_workers_compensation.ipynb
- Quantile Regression: examples/quantile_regression.ipynb
- API Reference: reference/ # defer to gen-files + literate-nav
- Development: development.md
- Release Notes: https://github.com/lorentzenchr/model-diagnostics/releases
4 changes: 2 additions & 2 deletions src/model_diagnostics/calibration/identification.py
@@ -25,8 +25,8 @@ def identification_function(
) -> np.ndarray:
r"""Canonical identification function.

Identification functions act as generalised residuals. See Notes for further
details.
Identification functions act as generalised residuals. See [Notes](#notes) for
further details.

Parameters
----------
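For context on "generalised residuals": the canonical identification functions for the two most common functionals can be sketched in a few lines of NumPy. The function names and the sign convention below are illustrative assumptions for this sketch, not the library's actual API:

```python
import numpy as np

def identification_mean(y_obs, y_pred):
    # Canonical identification for the mean functional: V(y, z) = z - y,
    # i.e. the ordinary residual with the sign flipped (assumed convention).
    return np.asarray(y_pred, dtype=float) - np.asarray(y_obs, dtype=float)

def identification_quantile(y_obs, y_pred, level):
    # Canonical identification for the quantile at probability `level`:
    # V(y, z) = 1{z >= y} - level (assumed convention).
    y = np.asarray(y_obs, dtype=float)
    z = np.asarray(y_pred, dtype=float)
    return (z >= y).astype(float) - level
```

Averaging such values to roughly zero over subsets of the data is what makes them behave like residuals for calibration checks.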
2 changes: 1 addition & 1 deletion src/model_diagnostics/calibration/plots.py
@@ -42,7 +42,7 @@ def plot_reliability_diagram(
predictions `y_pred` (x-axis).
The conditional expectation is estimated via isotonic regression (PAV algorithm)
of `y_obs` on `y_pred`.
See Notes for further details.
See [Notes](#notes) for further details.

Parameters
----------
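As background on the PAV algorithm mentioned in this docstring, here is a minimal pure-Python sketch of isotonic regression (assumed unweighted and nondecreasing; this is an illustration, not the library's implementation):

```python
import numpy as np

def pav(y):
    """Pool Adjacent Violators: least-squares nondecreasing fit to y."""
    # Maintain a stack of blocks, each represented by (mean, point count).
    means, counts = [], []
    for v in y:
        means.append(float(v))
        counts.append(1)
        # Pool (merge) adjacent blocks while the monotonicity is violated.
        while len(means) > 1 and means[-2] > means[-1]:
            n = counts[-2] + counts[-1]
            m = (means[-2] * counts[-2] + means[-1] * counts[-1]) / n
            means.pop(); counts.pop()
            means[-1], counts[-1] = m, n
    # Expand the blocks back to one fitted value per input point.
    fit = []
    for m, n in zip(means, counts):
        fit.extend([m] * n)
    return np.array(fit)
```

Applied to `y_obs` sorted by `y_pred`, the fitted values estimate the conditional expectation plotted in the reliability diagram.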
4 changes: 2 additions & 2 deletions src/model_diagnostics/scoring/plots.py
@@ -32,7 +32,7 @@ def plot_murphy_diagram(
over a range of their free parameter `eta`. This shows whether one model dominates all
others over a wide class of scoring functions or whether the ranking strongly depends
on the choice of scoring function.
See Notes for further details.
See [Notes](#notes) for further details.

Parameters
----------
@@ -66,7 +66,7 @@ def plot_murphy_diagram(

Notes
-----
For details, refer to [Ehm2015].
For details, refer to `[Ehm2015]`.

References
----------
7 changes: 4 additions & 3 deletions src/model_diagnostics/scoring/scoring.py
@@ -1,4 +1,5 @@
"""The scoring module provides scoring functions and the score decomposition.
"""The scoring module provides scoring functions, also known as loss functions,
and a score decomposition.
Each scoring function is implemented as a class that needs to be instantiated
before calling the `__call__` method, e.g. `SquaredError()(y_obs=[1], y_pred=[2])`.
"""
@@ -576,7 +577,7 @@ class ElementaryScore(_BaseScoringFunction):

The elementary scoring function is consistent for the specified `functional` for
all values of `eta` and is the main ingredient for Murphy diagrams.
See Notes for further details.
See [Notes](#notes) for further details.

Parameters
----------
@@ -599,7 +600,7 @@ class ElementaryScore(_BaseScoringFunction):
The elementary scoring or loss function is given by

\[
S_\eta^h(y, z) = (\mathbf{1}\{\eta \le z\} - \mathbf{1}\{\eta \le y\})
S_\eta(y, z) = (\mathbf{1}\{\eta \le z\} - \mathbf{1}\{\eta \le y\})
V(y, \eta)
\]
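The displayed formula can be evaluated directly. Below is a hedged NumPy sketch for the mean functional, where `V(y, eta) = eta - y` is assumed as the identification function; the function name and signature are illustrative, not the library's API:

```python
import numpy as np

def elementary_score_mean(y_obs, y_pred, eta):
    # S_eta(y, z) = (1{eta <= z} - 1{eta <= y}) * V(y, eta),
    # with V(y, eta) = eta - y assumed for the mean functional.
    y = np.asarray(y_obs, dtype=float)
    z = np.asarray(y_pred, dtype=float)
    return ((eta <= z).astype(float) - (eta <= y).astype(float)) * (eta - y)
```

Note the score is zero whenever `eta` lies on the same side of both `y` and `z`, and nonnegative otherwise, which is what makes it a consistent scoring function for every `eta`.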

Expand Down