AdaNet - Google's fast and flexible TensorFlow AutoML framework



AdaNet

AdaNet is a lightweight TensorFlow-based framework for automatically learning high-quality models with minimal expert intervention. AdaNet builds on recent AutoML efforts to be fast and flexible while providing learning guarantees. Importantly, AdaNet provides a general framework for not only learning a neural network architecture, but also for learning to ensemble to obtain even better models.

This project is based on the AdaNet algorithm, presented in “AdaNet: Adaptive Structural Learning of Artificial Neural Networks” at ICML 2017, for learning the structure of a neural network as an ensemble of subnetworks.

AdaNet has the following goals:

- Ease of use: Provide familiar APIs (e.g. Keras, Estimator) for training, evaluating, and serving models.
- Speed: Scale with available compute and quickly produce high-quality models.
- Flexibility: Allow researchers and practitioners to extend AdaNet to novel subnetwork architectures, search spaces, and tasks.
- Learning guarantees: Optimize an objective that offers theoretical learning guarantees.

The following animation shows AdaNet adaptively growing an ensemble of neural networks. At each iteration, it measures the ensemble loss for each candidate, and selects the best one to move on to the next iteration. At subsequent iterations, the blue subnetworks are frozen, and only yellow subnetworks are trained:
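The iterative growth described above can be sketched in a few lines of plain Python. This is a hypothetical, framework-free toy (the names `ensemble_loss`, `grow_ensemble`, and the fixed candidate functions are illustrative assumptions, not AdaNet's API): at each iteration the candidate that most reduces the ensemble loss is appended, and previously chosen members are never retrained or removed.

```python
# Hypothetical toy sketch of AdaNet-style adaptive ensemble growth.
# The real library grows trained neural subnetworks in TensorFlow;
# here the "subnetworks" are fixed functions, to show only the loop.

def ensemble_loss(ensemble, data):
    """Mean squared error of the averaged ensemble predictions."""
    total = 0.0
    for x, y in data:
        pred = sum(f(x) for f in ensemble) / len(ensemble)
        total += (pred - y) ** 2
    return total / len(data)

def grow_ensemble(candidates, data, iterations):
    """At each iteration, add the candidate that most reduces ensemble loss.

    Previously selected members are 'frozen': never retrained or removed,
    mirroring the blue (frozen) vs. yellow (trainable) subnetworks.
    """
    ensemble = []
    for _ in range(iterations):
        best = min(candidates, key=lambda f: ensemble_loss(ensemble + [f], data))
        ensemble.append(best)
    return ensemble

# Toy regression target y = 2x, with three fixed candidate functions.
data = [(x, 2.0 * x) for x in range(1, 6)]
candidates = [lambda x: x, lambda x: 2.0 * x, lambda x: 3.0 * x]
ensemble = grow_ensemble(candidates, data, iterations=2)
```

In the real algorithm, each candidate is itself trained before its ensemble loss is measured, and the objective includes a complexity penalty that backs the learning guarantees.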

AdaNet was first announced on the Google AI research blog: "Introducing AdaNet: Fast and Flexible AutoML with Learning Guarantees".

This is not an official Google product.

Features

AdaNet provides the following AutoML features:

- Adaptive neural architecture search and ensemble learning in a single train call.
- Regression, binary and multi-class classification, and multi-head task support.
- A tf.estimator.Estimator API for training, evaluation, prediction, and serving models.
- The adanet.AutoEnsembleEstimator for learning to ensemble user-defined tf.estimator.Estimators.
- The ability to define subnetworks that change structure over time using tf.layers via the adanet.subnetwork API.
- CPU, GPU, and TPU support.
- Distributed multi-server training.
- TensorBoard integration.
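The point that subnetworks can "change structure over time" can be illustrated with a framework-free sketch. This hypothetical `DNNBuilder` class is not the real adanet.subnetwork Builder API; it only shows the idea that a builder proposes a candidate whose architecture depends on the current iteration, so the search space can deepen as training progresses.

```python
# Hypothetical, framework-free sketch of the adanet.subnetwork Builder idea:
# a builder proposes a candidate whose structure depends on the iteration.

class DNNBuilder:
    """Proposes a feed-forward stack whose depth grows with the iteration."""

    def __init__(self, layer_width):
        self.layer_width = layer_width

    def name(self, iteration):
        return "dnn_depth_%d" % (iteration + 1)

    def build_subnetwork(self, iteration):
        # One more hidden layer each iteration: structure changes over time.
        return {"hidden_layers": [self.layer_width] * (iteration + 1)}

builder = DNNBuilder(layer_width=64)
candidates = [builder.build_subnetwork(i) for i in range(3)]
```

In the library itself, a builder also defines how its candidate is trained; the adanet.subnetwork documentation covers the full interface.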

Example

A simple example of learning to ensemble linear and neural network models:

import adanet
import tensorflow as tf

# Define the model head for computing loss and evaluation metrics.
head = tf.estimator.MultiClassHead(n_classes=10)

# Feature columns define how to process examples.
feature_columns = ...

# Learn to ensemble linear and neural network models.
estimator = adanet.AutoEnsembleEstimator(
    head=head,
    candidate_pool={
        "linear": tf.estimator.LinearEstimator(
            head=head,
            feature_columns=feature_columns,
            optimizer=...),
        "dnn": tf.estimator.DNNEstimator(
            head=head,
            feature_columns=feature_columns,
            optimizer=...,
            hidden_units=[1000, 500, 100])},
    max_iteration_steps=50)

estimator.train(input_fn=train_input_fn, steps=100)
metrics = estimator.evaluate(input_fn=eval_input_fn)
predictions = estimator.predict(input_fn=predict_input_fn)
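The example above assumes `train_input_fn`, `eval_input_fn`, and `predict_input_fn` are defined elsewhere. A real Estimator input_fn returns a tf.data.Dataset of (features, labels); the framework-free stand-in below (a hypothetical `make_input_fn` helper, not part of AdaNet) only illustrates the expected shape of the data: a dict of feature batches paired with a label batch.

```python
# Hypothetical sketch of the input functions the example above assumes.
# Real code would build a tf.data.Dataset; this just shows the data shape.

def make_input_fn(features, labels, batch_size):
    """Returns a zero-argument input_fn yielding (features, labels) batches."""
    def input_fn():
        batches = []
        for start in range(0, len(labels), batch_size):
            batch_features = {k: v[start:start + batch_size]
                              for k, v in features.items()}
            batch_labels = labels[start:start + batch_size]
            batches.append((batch_features, batch_labels))
        return batches
    return input_fn

features = {"x": [0.1, 0.2, 0.3, 0.4]}
labels = [0, 1, 0, 1]
train_input_fn = make_input_fn(features, labels, batch_size=2)
batches = train_input_fn()
```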

Getting Started

To get you started:

- API Documentation
- Tutorials: for understanding the AdaNet algorithm and learning to use this package

Requirements

Requires Python 2.7, 3.4, 3.5, 3.6, or 3.7.

adanet supports both TensorFlow 2.0 and TensorFlow >=1.14. It depends on bug fixes and enhancements not present in TensorFlow releases prior to 1.14. You must install or upgrade your TensorFlow package to at least 1.14:

$ pip install "tensorflow>=1.14,<2.1"

Installing with Pip

You can use the pip package manager to install the official adanet package from PyPI:

$ pip install adanet

Installing from Source

To install from source first you'll need to install bazel following their installation instructions.

Next clone the adanet repository:

$ git clone https://github.com/tensorflow/adanet
$ cd adanet

From the adanet root directory run the tests:

$ bazel build -c opt //...

# Copy the generated python proto module to core.
$ cp bazel-genfiles/adanet/core/report_pb2.py adanet/core

# Run tests with nosetests, but skip example tests.
$ NOSE_EXCLUDE='.*nasnet.*.py.*' python3 -m nose

Once you have verified that the tests have passed, install adanet from source as a pip package.

You are now ready to experiment with adanet.

import adanet

Citing this Work

If you use this AdaNet library for academic research, you are encouraged to cite the following paper from the ICML 2019 AutoML Workshop:

@misc{weill2019adanet,
  title={AdaNet: A Scalable and Flexible Framework for Automatically Learning Ensembles},
  author={Charles Weill and Javier Gonzalvo and Vitaly Kuznetsov and Scott Yang and Scott Yak and Hanna Mazzawi and Eugen Hotaj and Ghassen Jerfel and Vladimir Macko and Ben Adlam and Mehryar Mohri and Corinna Cortes},
  year={2019},
  eprint={1905.00080},
  archivePrefix={arXiv},
  primaryClass={cs.LG}
}

License

AdaNet is released under the Apache License 2.0.
