Optuna: A hyperparameter optimization framework
Website | Docs | Install Guide | Tutorial
Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning. It features an imperative, define-by-run style user API. Thanks to our define-by-run API, the code written with Optuna enjoys high modularity, and the user of Optuna can dynamically construct the search spaces for the hyperparameters.
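To make the define-by-run idea concrete, here is a minimal sketch (not taken from the README) in which the search space is shaped by ordinary Python control flow: the value suggested for n_layers determines how many further hyperparameters the trial suggests.

```python
import optuna

def objective(trial):
    # The size of the search space depends on an earlier suggestion:
    # each trial suggests 'n_layers' first, then that many unit counts.
    n_layers = trial.suggest_int('n_layers', 1, 3)
    units = [trial.suggest_int('n_units_l{}'.format(i), 4, 128) for i in range(n_layers)]
    # Toy objective standing in for a real training loop: prefer small networks.
    return sum(units)

study = optuna.create_study()
study.optimize(objective, n_trials=20)
```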
Key Features
Optuna has modern functionalities as follows:
- Lightweight, versatile, and platform-agnostic architecture
- Parallel distributed optimization (see the sketch below)
- Pruning of unpromising trials
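Parallel distributed optimization works by pointing several workers at one shared study. A minimal sketch, assuming a local SQLite file as the shared storage (the study name and file name here are illustrative):

```python
import optuna

def objective(trial):
    x = trial.suggest_uniform('x', -10, 10)
    return (x - 2) ** 2

# Run this same script in several processes; because they share one storage
# URL and study name, Optuna distributes the trials across the workers.
study = optuna.create_study(
    study_name='example-study',      # illustrative name
    storage='sqlite:///example.db',  # illustrative local SQLite file
    load_if_exists=True,
)
study.optimize(objective, n_trials=100)
```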
Basic Concepts
We use the terms study and trial as follows:
- Study: optimization based on an objective function
- Trial: a single execution of the objective function
Please refer to the sample code below. The goal of a study is to find the optimal set of hyperparameter values (e.g., classifier and svr_c) through multiple trials (e.g., n_trials=100). Optuna is a framework designed to automate and accelerate such optimization studies.
```python
import optuna
import sklearn.datasets
import sklearn.ensemble
import sklearn.metrics
import sklearn.model_selection
import sklearn.svm

# Define an objective function to be minimized.
def objective(trial):
    # Invoke suggest methods of a Trial object to generate hyperparameters.
    regressor_name = trial.suggest_categorical('classifier', ['SVR', 'RandomForest'])
    if regressor_name == 'SVR':
        svr_c = trial.suggest_loguniform('svr_c', 1e-10, 1e10)
        regressor_obj = sklearn.svm.SVR(C=svr_c)
    else:
        rf_max_depth = trial.suggest_int('rf_max_depth', 2, 32)
        regressor_obj = sklearn.ensemble.RandomForestRegressor(max_depth=rf_max_depth)

    X, y = sklearn.datasets.load_boston(return_X_y=True)
    X_train, X_val, y_train, y_val = sklearn.model_selection.train_test_split(X, y, random_state=0)

    regressor_obj.fit(X_train, y_train)
    y_pred = regressor_obj.predict(X_val)

    error = sklearn.metrics.mean_squared_error(y_val, y_pred)

    return error  # An objective value linked with the Trial object.

study = optuna.create_study()  # Create a new study.
study.optimize(objective, n_trials=100)  # Invoke optimization of the objective function.
```
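When optimization finishes, the result can be read back from the Study object. A small usage sketch (best_params and best_value are standard Study attributes; the values shown in the comments are illustrative):

```python
print(study.best_params)  # e.g. {'classifier': 'SVR', 'svr_c': ...}
print(study.best_value)   # the lowest mean squared error observed
```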
Integrations
Integration modules, which allow pruning (early stopping) of unpromising trials, are available for the following libraries; a sketch of the underlying pruning API follows the list:
- XGBoost
- LightGBM
- Chainer
- Keras
- TensorFlow
- tf.keras
- MXNet
- PyTorch Ignite
- PyTorch Lightning
- FastAI
- AllenNLP
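These integration modules wire each library's callback mechanism into Optuna's core pruning API. A minimal sketch of that underlying API (not an integration module; the training loop is a stand-in for real per-epoch training):

```python
import optuna

def objective(trial):
    lr = trial.suggest_loguniform('lr', 1e-5, 1e-1)
    score = 0.0
    for step in range(100):
        score += lr * (1.0 - score)  # stand-in for one epoch of training
        trial.report(score, step)    # report an intermediate value
        if trial.should_prune():     # the pruner judges the trial mid-run
            raise optuna.exceptions.TrialPruned()
    return 1.0 - score  # minimize

study = optuna.create_study(pruner=optuna.pruners.MedianPruner())
study.optimize(objective, n_trials=30)
```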
Installation
Optuna is available at the Python Package Index and on Anaconda Cloud.
```bash
# PyPI
$ pip install optuna
```

```bash
# Anaconda Cloud
$ conda install -c conda-forge optuna
```
Optuna supports Python 3.5 or newer.
Communication
- GitHub Issues for bug reports, feature requests and questions.
- Gitter for interactive chat with developers.
- Stack Overflow for questions.
Contribution
Any contributions to Optuna are welcome! When you send a pull request, please follow the contribution guide.
License
MIT License (see LICENSE).
Reference
Takuya Akiba, Shotaro Sano, Toshihiko Yanase, Takeru Ohta, and Masanori Koyama. 2019. Optuna: A Next-generation Hyperparameter Optimization Framework. In KDD (arXiv).