AdaNet is an extended implementation of AdaNet: Adaptive Structural Learning of Artificial Neural Networks [Cortes et al., ICML 2017], an algorithm for iteratively learning both the structure and weights of a neural network as an ensemble of subnetworks.

Ensembles of subnetworks

In AdaNet, ensembles are first-class objects: every model you train is some form of ensemble. An ensemble is composed of one or more subnetworks whose outputs are combined via an ensembler.

Figure: Terminology. An ensemble is composed of subnetworks whose outputs are combined via an ensembler.

Ensembles are model-agnostic, meaning a subnetwork can be as complex as a deep neural network or as simple as an if-statement. All that matters is that, for a given input tensor, the subnetworks’ outputs can be combined by the ensembler to form a single prediction.
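To make the model-agnostic point concrete, here is a minimal sketch in plain Python. The names `deep_subnetwork`, `rule_subnetwork`, and `average_ensembler` are hypothetical illustrations, not the adanet API: subnetworks are just callables, and the ensembler combines their outputs into one prediction.

```python
# Illustrative sketch of AdaNet's core idea (hypothetical names, not the
# adanet API): subnetworks are arbitrary callables mapping an input to an
# output, and an ensembler combines their outputs into a single prediction.

def deep_subnetwork(x):
    """Stand-in for a complex model, e.g. a deep neural network."""
    return 0.8 * x + 0.1  # pretend these coefficients came from learned layers

def rule_subnetwork(x):
    """A subnetwork can be as simple as an if-statement."""
    return 1.0 if x > 0.5 else 0.0

def average_ensembler(subnetworks, x):
    """Combine subnetwork outputs into one prediction by averaging them."""
    outputs = [subnetwork(x) for subnetwork in subnetworks]
    return sum(outputs) / len(outputs)

prediction = average_ensembler([deep_subnetwork, rule_subnetwork], x=0.75)
```

The real library combines TensorFlow output tensors rather than Python floats, but the contract is the same: the ensembler only needs each subnetwork's output, not its internals.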

Iteration lifecycle

Each AdaNet iteration has the following lifecycle:

Figure: The lifecycle of an AdaNet iteration.

Each of these concepts has an associated Python object:

  • Subnetwork Generator and Subnetwork are defined in the adanet.subnetwork package.
  • Ensemble Strategy, Ensembler, and Ensemble are defined in the adanet.ensemble package.
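The iteration lifecycle can be sketched in plain Python. This is a hypothetical, framework-free illustration of the control flow (the real classes live in the adanet.subnetwork and adanet.ensemble packages and operate on TensorFlow graphs; `generate_candidates` and `evaluate` below are stand-ins):

```python
# Hypothetical sketch of the AdaNet iteration lifecycle (not the adanet API):
# a subnetwork generator proposes candidate subnetworks, each candidate is
# ensembled with the previously selected subnetworks, and the ensemble that
# minimizes the objective is kept for the next iteration.

def generate_candidates(iteration):
    """Stand-in subnetwork generator: propose subnetworks of growing depth."""
    return [("dnn_depth_%d" % depth, depth)
            for depth in (iteration + 1, iteration + 2)]

def evaluate(ensemble):
    """Stand-in objective: trade off subnetwork depth against ensemble size,
    mimicking AdaNet's complexity-regularized loss."""
    return sum(depth for _, depth in ensemble) + len(ensemble)

ensemble = []
for iteration in range(3):
    candidates = generate_candidates(iteration)
    # Ensemble strategy: try extending the previous ensemble with each
    # candidate, and keep the extension with the best objective value.
    ensemble = min((ensemble + [c] for c in candidates), key=evaluate)

selected = [name for name, _ in ensemble]
```

In the real library, this loop runs inside TensorFlow rather than as Python control flow, and candidate evaluation involves actually training each candidate ensemble.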


AdaNet is designed to operate primarily inside TensorFlow’s computation graph. This allows it to efficiently utilize available resources, such as distributed compute, GPUs, and TPUs, using TensorFlow primitives.

AdaNet provides a unique adaptive computation graph, which supports building models that create and remove ops and variables over time while retaining the optimizations and scalability of TensorFlow’s graph mode. This adaptive graph enables users to develop progressively growing models (e.g., boosting-style), develop architecture search algorithms, and perform hyperparameter tuning without needing to manage an external for-loop.

Example ensembles

Below are a few more examples of ensembles you can obtain with AdaNet, depending on the search space you define. First, an ensemble composed of increasingly complex neural network subnetworks whose outputs are simply averaged:

Figure: Ensemble of subnetworks with different complexities.

Another common example is an ensemble learned on top of a shared embedding, which is useful when the majority of the model’s parameters are an embedding of a feature. Here the individual subnetworks’ predictions are combined using a learned linear combination:

Figure: Subnetworks sharing a common embedding.
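The two combination schemes above differ only in how subnetwork outputs are weighted. A minimal plain-Python sketch (hypothetical names, not the adanet API) contrasting the uniform average with a learned linear combination, which assigns one mixture weight per subnetwork:

```python
# Hypothetical sketch (not the adanet API) contrasting the two ensemblers
# described above: a uniform average of subnetwork outputs versus a learned
# linear combination with one mixture weight per subnetwork.

def average_combine(outputs):
    """Simple average: every subnetwork contributes equally."""
    return sum(outputs) / len(outputs)

def linear_combine(outputs, mixture_weights):
    """Learned linear combination: in AdaNet, the mixture weights would be
    fit during training alongside the subnetworks."""
    return sum(w * o for w, o in zip(mixture_weights, outputs))

subnetwork_outputs = [0.2, 0.6, 1.0]

avg_prediction = average_combine(subnetwork_outputs)
weighted_prediction = linear_combine(subnetwork_outputs,
                                     mixture_weights=[0.5, 0.3, 0.2])
```

The averaged ensemble is a special case of the linear combination where every mixture weight equals 1/n.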

Quick start

Now that you are familiar with AdaNet, you can explore our quick start guide.