Multi-Objective Optimization of Performance and Interpretability by Jointly Optimizing Hyperparameters of a Learner and Group Structures of Features
mlr_tuner_eagga.Rd
Performs joint multi-objective optimization of hyperparameters, feature selection, interaction constraints, and monotonicity constraints for a suitable mlr3::Learner.
This tuner requires an appropriate mlr3::Learner that supports feature selection, interaction constraints, and monotonicity constraints.
Currently, only XGBoost learners (mlr3learners::LearnerRegrXgboost or mlr3learners::LearnerClassifXgboost) are supported.
Dictionary
This Tuner can be instantiated via the dictionary mlr_tuners or with the associated sugar function tnr():
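A minimal instantiation sketch, assuming the dictionary key is "eagga" (inferred from this page's name) and the package providing TunerEAGGA is attached:

```r
library(mlr3tuning)
library(eagga)

# Via the sugar function
tuner = tnr("eagga")

# Equivalent lookup in the tuner dictionary
tuner = mlr_tuners$get("eagga")
```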
Parameters
learner_id
(character) ID of the learner in the graph learner.
select_id
(character) ID of the parameter in the learner that controls feature selection.
interaction_id
(character) ID of the parameter in the learner that sets interaction constraints.
monotone_id
(character) ID of the parameter in the learner that sets monotonicity constraints.
mu
(integer) Population size.
lambda
(integer) Offspring size of each generation.
seed_calculate_proxy_measures
(integer) Random seed used when training models on the full mlr3::Task to compute the interpretability measures, making these results reproducible.
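To illustrate how these parameters fit together, a hedged configuration sketch; all *_id values below are placeholder assumptions that must match the actual parameter IDs of the graph learner wrapping XGBoost in your pipeline:

```r
# All IDs are illustrative and depend on how your GraphLearner is built;
# mu, lambda, and the seed are arbitrary example values.
tuner = tnr("eagga",
  learner_id = "classif.xgboost",
  select_id = "select.selector",
  interaction_id = "classif.xgboost.interaction_constraints",
  monotone_id = "classif.xgboost.monotone_constraints",
  mu = 100,
  lambda = 10,
  seed_calculate_proxy_measures = 1
)
```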
Progress Bars
$optimize()
supports progress bars via the package progressr
combined with a Terminator. Simply wrap the function in
progressr::with_progress()
to enable them. We recommend using the progress package as backend; enable it with progressr::handlers("progress").
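For example, assuming a tuner and a tuning instance have been constructed elsewhere:

```r
library(progressr)

handlers("progress")  # use the progress package as backend

with_progress({
  tuner$optimize(instance)  # instance: a previously created tuning instance
})
```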
Logging
All Tuners use a logger (as implemented in lgr) from package
bbotk.
Use lgr::get_logger("bbotk")
to access and control the logger.
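For example, to reduce log output during tuning:

```r
# Lower the threshold to suppress info-level messages from bbotk
logger = lgr::get_logger("bbotk")
logger$set_threshold("warn")
```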
References
Schneider, Lennart, Bischl, Bernd, Thomas, Janek (2023). “Multi-Objective Optimization of Performance and Interpretability of Tabular Supervised Machine Learning Models.” In Proceedings of the Genetic and Evolutionary Computation Conference, series GECCO '23, 538–547.
Super class
mlr3tuning::Tuner
-> TunerEAGGA