
Metrics for hyperparameter tuning #547

Closed
egillax wants to merge 3 commits into develop from hyperparam2

Conversation

egillax (Collaborator) commented Feb 21, 2025

No description provided.

jreps (Collaborator) commented Mar 10, 2025

This looks good. The only thing worth considering is whether, instead of passing a string 'evalmetric' into the model design, we want an evaluationMetricSetting created via a createEvaluationMetricSetting() that lets the user specify either a function or the string name of a function, plus settings such as maximize (boolean). That way the package is flexible enough to support custom evaluation metrics. For backwards compatibility the setting should default to 'AUC' with maximize. We will then need to add a table for this setting to the results tables; I'm happy to help with that part.
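To make the suggestion concrete, here is a minimal R sketch of what such a settings constructor could look like. This is not code from this PR: the function name createEvaluationMetricSetting(), its arguments, the class name, and the prediction columns used in the example are all assumptions based on the comment above.

```r
# Hypothetical settings constructor sketched from the suggestion above;
# names, defaults, and validation are assumptions, not the PR's code.
createEvaluationMetricSetting <- function(metric = "AUC", maximize = TRUE) {
  # metric may be the name of a built-in metric or a user-supplied function
  # that takes a prediction data frame and returns a single numeric value
  if (!is.character(metric) && !is.function(metric)) {
    stop("metric must be a string naming a metric or a function")
  }
  if (!is.logical(maximize) || length(maximize) != 1) {
    stop("maximize must be a single logical value")
  }
  setting <- list(metric = metric, maximize = maximize)
  class(setting) <- "evaluationMetricSetting"
  return(setting)
}

# Backwards-compatible default: behaves like the current 'AUC' string input
defaultSetting <- createEvaluationMetricSetting()

# Custom metric example: a user-defined function that is minimized rather
# than maximized (column names assumed for illustration only)
brierSetting <- createEvaluationMetricSetting(
  metric = function(prediction) {
    mean((prediction$value - prediction$outcomeCount)^2)
  },
  maximize = FALSE
)
```

The maximize flag lets hyperparameter search handle both "higher is better" metrics such as AUC and "lower is better" metrics such as the Brier score with a single code path.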

egillax (Collaborator, Author) commented Feb 17, 2026

Closing since this was merged in #618.

egillax closed this Feb 17, 2026