Labels
documentation (Improvements or additions to documentation)
Description
Which element of the documentation do you want to modify and why?
In the documentation of julearn.run_cross_validation, it states that only ScorerLike is accepted as the scoring parameter, but a list of strings, for example ["roc_auc", "balanced_accuracy"], is also accepted. The docs should make it clearer that multiple metrics can be passed to scoring, as this is a big feature of the library.
It would also be good to clearly state that this scoring is not used for hyperparameter tuning, and that it is a separate parameter.
Finally, shouldn't None be stated as the default value?
Anything else to say?
Current documentation:
```
scoring: ScorerLike, optional
    The scoring metric to use. See
    https://scikit-learn.org/stable/modules/model_evaluation.html for a
    comprehensive list of options. If None, use the model's default scorer.
```
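To illustrate the multi-metric behaviour described above: since julearn's docstring defers to scikit-learn's model-evaluation page, the list-of-strings pattern can be sketched with scikit-learn's own `cross_validate` (the estimator, dataset, and metric names here are arbitrary choices for the sketch, not julearn's API):

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_validate

X, y = load_iris(return_X_y=True)

# Passing a list of metric-name strings evaluates every metric in a
# single cross-validation run; the result dict gains one
# "test_<metric>" key per requested metric.
scores = cross_validate(
    LogisticRegression(max_iter=1000),
    X,
    y,
    cv=5,
    scoring=["accuracy", "balanced_accuracy"],
)

print(sorted(k for k in scores if k.startswith("test_")))
# → ['test_accuracy', 'test_balanced_accuracy']
```

Documenting that `scoring` accepts such a list (and that it only affects the reported scores, not hyperparameter tuning) would resolve the ambiguity.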