Models module

This module provides models and building blocks to design continual learning architectures.

Dynamic Modules

Dynamic Modules are PyTorch modules that can be incrementally expanded to allow architectural modifications (multi-head classifiers, progressive networks, …).

DynamicModule()

Dynamic Modules are Avalanche modules that can be incrementally expanded to allow architectural modifications (multi-head classifiers, progressive networks, ...).

MultiTaskModule()

Base PyTorch module with support for task labels.

IncrementalClassifier(in_features[, ...])

Output layer that incrementally adds units whenever new classes are encountered.

MultiHeadClassifier(in_features[, ...])

Multi-head classifier with separate heads for each task.
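
A minimal usage sketch for these dynamic heads (the in_features value, the initial_out_features argument, and the batch shape are illustrative assumptions, not prescriptions):

    import torch
    from avalanche.models import IncrementalClassifier, MultiHeadClassifier

    # A single output layer that adds units when new classes are encountered;
    # Avalanche strategies expand it automatically during model adaptation.
    head = IncrementalClassifier(in_features=128, initial_out_features=2)

    # One separate, independently growing head per task label
    # (its forward additionally expects the task labels of the batch).
    multi_head = MultiHeadClassifier(in_features=128, initial_out_features=2)

    x = torch.randn(4, 128)
    logits = head(x)  # shape: (4, current number of output units)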

Models

Neural network architectures that can be used as backbones for CL experiments.

MLP(hidden_size[, last_activation])

Simple nn.Module to create a multi-layer perceptron with BatchNorm and ReLU activations.

make_icarl_net(num_classes[, n, c])

Create an IcarlNet network, the ResNet variant used in iCaRL.

IcarlNet(num_classes[, n, c])

The ResNet-like network used in iCaRL.

SimpleMLP_TinyImageNet([num_classes, ...])

Multi-layer Perceptron for TinyImageNet benchmark.

SimpleCNN([num_classes])

Convolutional Neural Network.

MTSimpleCNN()

Convolutional Neural Network with a multi-head classifier.

SimpleMLP([num_classes, input_size, ...])

Multi-Layer Perceptron with custom parameters.

MTSimpleMLP([input_size, hidden_size])

Multi-layer perceptron with a multi-head classifier.

SimpleSequenceClassifier(input_size, ...[, ...])

Simple classifier for sequence data.

MTSimpleSequenceClassifier(input_size, ...)

Multi-task variant of SimpleSequenceClassifier.

MobilenetV1([pretrained, latent_layer_num])

MobileNet v1 implementation.

NCMClassifier([class_mean])

Nearest Class Mean (NCM) classifier.

SLDAResNetModel([arch, output_layer_name, ...])

This is a model wrapper to reproduce the experiments from the original Deep Streaming Linear Discriminant Analysis paper, using a pretrained ResNet model.

MlpVAE(shape[, nhid, n_classes, device])

Variational autoencoder module: fully-connected and suited for any input shape and type.

LeNet5(n_classes, input_channels)

SlimResNet18(nclasses[, nf])

Slimmed ResNet18.

MTSlimResNet18(nclasses[, nf])

MultiTask Slimmed ResNet18.
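
A quick sketch of instantiating two of the backbones above. The parameter names follow the signatures listed; the class counts and input size are illustrative:

    from avalanche.models import SimpleMLP, SimpleCNN

    # Multi-layer perceptron for flattened 28x28 inputs (MNIST-like data).
    mlp = SimpleMLP(num_classes=10, input_size=28 * 28)

    # Small convolutional network; 10 output classes here is arbitrary.
    cnn = SimpleCNN(num_classes=10)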

Progressive Neural Networks

Modules that implement Progressive Neural Network models, layers, and adapters.

PNN([num_layers, in_features, ...])

Progressive Neural Network.

PNNLayer(in_features, out_features_per_column)

Progressive Neural Network layer.

PNNColumn(in_features, ...[, adapter])

Progressive Neural Network column.

LinearAdapter(in_features, ...)

Linear adapter for Progressive Neural Networks.

MLPAdapter(in_features, ...[, activation])

MLP adapter for Progressive Neural Networks.
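
A sketch of building a progressive network from these pieces. The hidden_features_per_column parameter name is an assumption inferred from the signatures above; layer and feature counts are illustrative:

    from avalanche.models import PNN

    # Progressive network: one column per task, with lateral connections
    # from previously learned (frozen) columns through adapters.
    pnn = PNN(num_layers=2, in_features=784, hidden_features_per_column=64)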

Model Wrappers and Utilities

Wrappers and functions that add utility support to your models.

TrainEvalModel(feature_extractor, ...)

Wrapper that pairs a shared feature extractor with separate train-time and eval-time classifiers.

FeatureExtractorBackbone(model, ...)

This PyTorch module allows us to extract features from a backbone network given a layer name.

BaseModel()

A base abstract class for models.

avalanche_forward(model, x, task_labels)

Forward pass that feeds task labels to multi-task modules and ignores them for plain modules.

as_multitask(model, classifier_name)

Wraps around a model to make it a multitask model.
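
A sketch combining as_multitask with avalanche_forward. The 'classifier' attribute name and the 3x32x32 input shape are assumptions for illustration:

    import torch
    from avalanche.models import SimpleCNN, as_multitask, avalanche_forward

    # Replace the model's single-head output layer (attribute 'classifier')
    # with a multi-head classifier, one head per task.
    model = as_multitask(SimpleCNN(num_classes=10), "classifier")

    x = torch.randn(2, 3, 32, 32)
    task_labels = torch.zeros(2, dtype=torch.long)  # all samples from task 0
    logits = avalanche_forward(model, x, task_labels)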

initialize_icarl_net(m)

Initialize the input network using kaiming_normal initialization (mode=fan_in) for Conv2d and Linear blocks.

pytorchcv_wrapper.get_model(name[, pretrained])

This is a direct wrapper around the model getter of pytorchcv.
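
For example (the model name below is one of pytorchcv's registered names; any architecture known to pytorchcv should work):

    from avalanche.models.pytorchcv_wrapper import get_model

    # Fetch a ResNet-20 defined for CIFAR-10 from the pytorchcv model zoo.
    model = get_model("resnet20_cifar10", pretrained=False)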

Dynamic optimizer utilities

Utilities to handle optimizer updates when using dynamic architectures. Dynamic Modules (e.g., multi-head classifiers) can change their parameters during training, which usually requires updating the optimizer to learn the new parameters or to freeze the old ones.

reset_optimizer(optimizer, model)

Reset the optimizer to update the list of learnable parameters.

update_optimizer(optimizer, old_params, ...)

Update the optimizer by substituting old_params with new_params.

add_new_params_to_optimizer(optimizer, ...)

Add new parameters to the trainable parameters.
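
A sketch of the typical pattern (the dynamic_optimizers import path is an assumption; in practice the expansion and the optimizer reset happen inside a strategy's adaptation step):

    from torch.optim import SGD
    from avalanche.models import IncrementalClassifier
    from avalanche.models.dynamic_optimizers import reset_optimizer

    model = IncrementalClassifier(in_features=32, initial_out_features=2)
    optimizer = SGD(model.parameters(), lr=0.01)

    # ... the model expands when new classes arrive; afterwards, rebuild the
    # optimizer's parameter list so the new units are actually trained:
    reset_optimizer(optimizer, model)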