
Mysterious "unexpected keyword argument" error when using Optuna + LightGBM
I want to store the best parameters somehow.

TypeError                                 Traceback (most recent call last)
in
      9     verbose_eval = 50,
     10     best_params = best_params,
---> 11     tuning_history = tuning_history)

~\anaconda3\envs\py36\lib\site-packages\optuna\integration\_lightgbm_tuner\__init__.py in train(*args, **kwargs)
     31     _imports.check()
     32
---> 33     auto_booster = LightGBMTuner(*args, **kwargs)
     34     auto_booster.run()
     35     return auto_booster.get_best_booster()

TypeError: __init__() got an unexpected keyword argument 'best_params'

Python

Source code

best_params = {}
tuning_history = []

gbm = lgb.train(params,
                lgb_train,
                valid_sets=lgb_eval,
                num_boost_round=10000,
                early_stopping_rounds=100,
                verbose_eval=50,
                best_params=best_params,
                tuning_history=tuning_history)

What I tried

Passing dict() and list() instead made no difference.

Supplementary information (FW/tool version, etc.)

Please provide more detailed information here.

  • Answer # 1

Optuna's LightGBM Tuner has a slightly different, simplified interface compared with plain Optuna + LightGBM. It seems that `best_params` and `tuning_history` are not supported as arguments to `train`.

It is not well covered in the official manual, so the official sample is the best reference. In that sample, the best parameters are obtained after training with `best_params = model.params`.

The Qiita article "Tuning hyperparameters using LightGBM Tuner" is also helpful. Its comments show the parameters used when not using the Tuner, so you can see how it differs from ordinary Optuna + LightGBM. It does not explain how to handle `best_params` either, but it obtains them the same way as the official sample.