How to use the mlopt.sklearn_tune.AdaBoostClassifierOpt function in mlopt

To help you get started, we’ve selected a few mlopt examples based on popular ways the library is used in public projects.


Source: arnaudvl / ml-parameter-optimization, examples/ml_optimization.py (view on GitHub)
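The excerpt below is a fragment of a larger script: it assumes that X, y, params_cv and save_dir have already been defined earlier. A minimal, hypothetical preamble, assuming the tuning classes live in mlopt.sklearn_tune (which the page title confirms for AdaBoostClassifierOpt) and substituting synthetic data for the Porto Seguro dataset used in the original file, might look like this:

# hypothetical setup, not part of the original example
from sklearn.datasets import make_classification
# assumed import path; the page title confirms it for AdaBoostClassifierOpt
from mlopt.sklearn_tune import LogisticRegressionOpt, AdaBoostClassifierOpt, KNNOpt

# synthetic stand-in for the Porto Seguro data
X, y = make_classification(n_samples=50000, n_features=20, random_state=0)

# assumed GridSearchCV-style settings; check the MLOpt class for the exact keys it accepts
params_cv = {'cv': 5, 'scoring': 'roc_auc', 'n_jobs': -1, 'verbose': 1}

# directory where tuned models are saved
save_dir = './models'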
# optimize parameters for logistic regression using a grid search over a range of parameters
# note: no pre-processing steps will be done for the examples below
lr = LogisticRegressionOpt(X, y, params_cv=params_cv, model_name='lr_porto_seguro', save_dir=save_dir)
lr.tune_params()
print('Best model parameters:')
print(lr.best_model)
lr.save_model()

# we will reduce the size of the dataset for the rest of the examples
# to keep the training time reasonable
X_slice = X[:10000, :]
y_slice = y[:10000]

# AdaBoost
ada = AdaBoostClassifierOpt(X_slice, y_slice, params_cv=params_cv, model_name='ada_porto_seguro', save_dir=save_dir)
ada.tune_params()
print('Best model parameters:')
print(ada.best_model)
ada.save_model()

# we can also set the range of the tuned parameters manually
# more info about the parameters tuned for each algorithm
# can be found in the MLOpt class

# kNN
tune_range = {'n_neighbors': [10, 20, 50, 100]}
knn = KNNOpt(X_slice, y_slice, params_cv=params_cv, tune_range=tune_range, model_name='knn_porto_seguro', save_dir=save_dir)
knn.tune_params()
print('Best model parameters:')
print(knn.best_model)
knn.save_model()