Metrics::f1 #166
Comments
This is indeed an issue in the Metrics package then.

Unfortunately {Metrics} has not seen a commit since March 2019, and it looks like it has been abandoned.

Hi, no I haven't opened an issue in the {Metrics} repository. Thankfully the feature importance interface allows for custom losses, so we implemented our own f1 measure for binary classification.
I created the issue in the {Metrics} repository. So I think the documentation for Feature Importance can be updated to state more precisely that a custom loss function should be used.
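As an illustration of the custom-loss workaround mentioned above, here is a minimal sketch, assuming iml's `FeatureImp` accepts a loss of the form `function(actual, predicted)`; the helper name `f1_binary` and the commented-out wiring are hypothetical, not taken from this thread. `FeatureImp` treats the loss as an error measure (lower is better), so the F1 score is flipped to 1 - F1:

```r
library(iml)

# Hypothetical custom loss (not from this thread): 1 - F1 for binary
# labels coded as 0/1, with 1 as the positive class. FeatureImp expects
# an error measure where lower is better, hence the flip.
f1_binary <- function(actual, predicted) {
  tp <- sum(actual == 1 & predicted == 1)  # true positives
  fp <- sum(actual == 0 & predicted == 1)  # false positives
  fn <- sum(actual == 1 & predicted == 0)  # false negatives
  if (tp == 0) return(1)                   # no positives recovered at all
  1 - 2 * tp / (2 * tp + fp + fn)
}

# Assuming `predictor` is an iml::Predictor wrapping a fitted model:
# imp <- FeatureImp$new(predictor, loss = f1_binary)
```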
Hello,

Thanks for the package.

I know that the `Metrics` package is not implemented by you but only exported by your package. But I encountered an issue trying to run `FeatureImp` using the `f1` metric. The `f1` metric from the `Metrics` package seems to be wrongly implemented. The first two lines are already an indication (`unique`), making the rest of the code also wrong.

Reproducible example:
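The original code block did not survive extraction; the following is a sketch under assumptions: the data vectors are hypothetical, chosen only so that the correct binary F1 matches the values quoted below.

```r
library(Metrics)

# Hypothetical data (the original vectors were lost): six observations,
# labels coded 0/1.
actual    <- c(1, 1, 1, 0, 0, 0)
predicted <- c(1, 1, 1, 1, 0, 0)

Metrics::f1(actual, predicted)
# f1() starts by calling unique() on both vectors, so any pair of binary
# vectors containing both labels collapses to the set {0, 1}; the
# set-based score then comes out as a perfect 1 here, regardless of how
# good the classification actually is.
```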
This is wrong: for binary classification with `1` as the positive label, one would expect an F1 score of `0.86`; if the positive label were `0`, the F1 score would be `0.8`.
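These expected values can be checked by hand with F1 = 2·TP / (2·TP + FP + FN), continuing the hypothetical vectors from the sketch above:

```r
# Same hypothetical vectors as in the sketch above.
actual    <- c(1, 1, 1, 0, 0, 0)
predicted <- c(1, 1, 1, 1, 0, 0)

# Plain binary F1 for a chosen positive label.
binary_f1 <- function(actual, predicted, positive) {
  tp <- sum(actual == positive & predicted == positive)
  fp <- sum(actual != positive & predicted == positive)
  fn <- sum(actual == positive & predicted != positive)
  2 * tp / (2 * tp + fp + fn)
}

binary_f1(actual, predicted, positive = 1)  # 6/7 ≈ 0.857, i.e. 0.86
binary_f1(actual, predicted, positive = 0)  # 4/5 = 0.8
```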