Proposed solution: two new training arguments, `--model_selection_score` and `--model_selection_label` (e.g. `'acc'` or `'f1'` as the model selection score). In `compute_metrics_fn`, check these arguments and set `one_score` based on their values:

- No arguments: macro F1 (current default)
- `'f1'` with no label provided: macro F1
- `'acc'` with no label provided: accuracy
- `'acc'` with a label provided: error?
- `'f1'` with a label provided: look up the label's index in the dataset and use that label's F1
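A minimal sketch of how the dispatch could look (the argument names, `compute_metrics_fn`, and `one_score` come from the proposal above; the factory function, the `label_list` parameter, and the scikit-learn metric calls are illustrative assumptions, not existing code):

```python
import numpy as np
from sklearn.metrics import accuracy_score, f1_score


def build_compute_metrics_fn(score=None, label=None, label_list=None):
    """Build a compute_metrics_fn that sets `one_score` from the values of
    --model_selection_score (`score`) and --model_selection_label (`label`).

    `label_list` is assumed to be the dataset's ordered list of label names,
    used to look up the index of a per-label F1 target.
    """
    if score == "acc" and label is not None:
        # Open question from the proposal: 'acc' with a label is likely an error.
        raise ValueError("--model_selection_label cannot be combined with 'acc'")

    def compute_metrics_fn(preds, labels):
        preds = np.asarray(preds)
        labels = np.asarray(labels)
        if score == "acc":
            one_score = accuracy_score(labels, preds)
        elif score == "f1" and label is not None:
            # Look up the label's index in the dataset and use that label's F1.
            idx = label_list.index(label)
            one_score = f1_score(labels, preds, labels=[idx], average=None)[0]
        else:
            # No arguments, or 'f1' with no label: macro F1 (current default).
            one_score = f1_score(labels, preds, average="macro")
        return {"one_score": one_score}

    return compute_metrics_fn
```

For example, `build_compute_metrics_fn(score="f1", label="positive", label_list=["negative", "positive"])` would select models by the F1 of the `positive` class only.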
Potential enhancements: