Models module

This module provides models and building blocks to design continual learning architectures.

Dynamic Modules

Dynamic Modules are PyTorch modules that can be incrementally expanded to allow architectural modifications (multi-head classifiers, progressive networks, ...). A minimal usage sketch follows the list below.

DynamicModule([auto_adapt])

Dynamic Modules are Avalanche modules that can be incrementally expanded to allow architectural modifications (multi-head classifiers, progressive networks, ...).

MultiTaskModule(**kwargs)

Base PyTorch module with support for task labels.

IncrementalClassifier(in_features[, ...])

Output layer that incrementally adds units whenever new classes are encountered.

MultiHeadClassifier(in_features[, ...])

Multi-head classifier with separate heads for each task.

CosineIncrementalClassifier(in_features[, ...])

Equivalent to IncrementalClassifier but using the cosine layer described in "Learning a Unified Classifier Incrementally via Rebalancing" by Saihui Hou et al.
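
The sketch below is a minimal usage example, assuming a working Avalanche installation; the batch size and the 32-dimensional feature size are illustrative only:

    import torch
    from avalanche.models import IncrementalClassifier, MultiHeadClassifier

    features = torch.randn(8, 32)  # a batch of 8 feature vectors

    # IncrementalClassifier starts with a small output layer and grows new
    # units when new classes are encountered (growth is normally triggered
    # by the strategy's model adaptation step, not shown here).
    inc_head = IncrementalClassifier(in_features=32)
    logits = inc_head(features)

    # MultiHeadClassifier keeps a separate head per task and routes each
    # sample to the head matching its task label.
    mh_head = MultiHeadClassifier(in_features=32)
    task_labels = torch.zeros(8, dtype=torch.long)  # all samples from task 0
    logits = mh_head(features, task_labels)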

Progressive Neural Networks

Modules that implement Progressive Neural Network models, layers, and adapters. A construction sketch follows the list below.

PNN([num_layers, in_features, ...])

Progressive Neural Network.

PNNLayer(in_features, out_features_per_column)

Progressive Neural Network layer.

PNNColumn(in_features, ...[, adapter])

Progressive Neural Network column.

LinearAdapter(in_features, ...)

Linear adapter for Progressive Neural Networks.

MLPAdapter(in_features, ...[, activation])

MLP adapter for Progressive Neural Networks.
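
A PNN can be built directly; the sketch below is a minimal construction example, and the layer count and feature sizes are assumptions rather than recommended values:

    from avalanche.models import PNN

    # Two fully-connected layers; each new experience adds a column (with
    # lateral connections to earlier columns) during the strategy's model
    # adaptation step, so forward passes require task labels.
    model = PNN(num_layers=2, in_features=784, hidden_features_per_column=100)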

Models

Neural network architectures that can be used as backbones for CL experiments. A usage sketch follows the list below.

MLP(hidden_size[, last_activation])

Simple nn.Module to create a multi-layer perceptron with BatchNorm and ReLU activations.

make_icarl_net(num_classes[, n, c])

Create an IcarlNet, the ResNet variant used in iCaRL.

IcarlNet(num_classes[, n, c])

The ResNet-based network used in iCaRL (see make_icarl_net).

SimpleMLP_TinyImageNet([num_classes, ...])

Multi-layer Perceptron for TinyImageNet benchmark.

SimpleCNN([num_classes])

Convolutional neural network.

MTSimpleCNN()

Convolutional neural network with a multi-head classifier.

SimpleMLP([num_classes, input_size, ...])

Multi-Layer Perceptron with custom parameters.

MTSimpleMLP([input_size, hidden_size])

Multi-layer perceptron with a multi-head classifier.

SimpleSequenceClassifier(input_size, ...[, ...])

Simple sequence classifier.

MTSimpleSequenceClassifier(input_size, ...)

Multi-task version of the simple sequence classifier.

MobilenetV1([pretrained, latent_layer_num])

MobileNet v1 implementation.

NCMClassifier([normalize])

Nearest Class Mean (NCM) classifier.

SLDAResNetModel([arch, output_layer_name, ...])

Model wrapper for reproducing the experiments from the original Deep Streaming Linear Discriminant Analysis paper using a pretrained ResNet.

MlpVAE(shape[, nhid, n_classes, device])

Variational autoencoder module: fully-connected and suited for any input shape and type.

LeNet5(n_classes, input_channels)

LeNet-5 convolutional network.

SlimResNet18(nclasses[, nf])

Slimmed ResNet18.

MTSlimResNet18(nclasses[, nf])

MultiTask Slimmed ResNet18.

ExpertGate(shape, device[, arch, ...])

Overall parent module that holds the dictionary of expert autoencoders and expert classifiers.

packnet.PackNetModel(wrappee)

PackNet implements the PackNet algorithm for parameter isolation.

packnet.packnet_simple_mlp([num_classes, ...])

Convenience function for creating a PackNet-compatible SimpleMLP model.

FeCAMClassifier([tukey, shrinkage, shrink1, ...])

Similar to NCMClassifier, but uses the Mahalanobis distance instead of the L2 distance.

cosine_layer.CosineLinear(in_features, ...)

Cosine layer defined in "Learning a Unified Classifier Incrementally via Rebalancing" by Saihui Hou et al.

cosine_layer.SplitCosineLinear(in_features, ...)

This class keeps two Cosine Linear layers, without sigma scaling, and handles the sigma parameter that is common for the two of them.

packnet.WeightAndBiasPackNetModule(wrappee)

A PackNet module that has a weight and bias.
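
These backbones are plain PyTorch modules. A minimal sketch, where the MNIST- and CIFAR-style input sizes are assumptions for illustration:

    import torch
    from avalanche.models import SimpleMLP, SimpleCNN

    mlp = SimpleMLP(num_classes=10, input_size=28 * 28)
    logits = mlp(torch.randn(4, 28 * 28))    # -> shape (4, 10)

    cnn = SimpleCNN(num_classes=10)
    logits = cnn(torch.randn(4, 3, 32, 32))  # CIFAR-sized input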

Model Wrappers and Utilities

Wrappers and functions that add utility support to your models. A short sketch follows the list below.

TrainEvalModel(feature_extractor, ...)

Wraps a common feature extractor together with two classifiers: one used at training time and one used at evaluation time.

FeatureExtractorBackbone(model, ...)

This PyTorch module allows us to extract features from a backbone network given a layer name.

BaseModel()

A base abstract class for models.

avalanche_forward(model, x, task_labels)

Compute a forward pass, passing task labels to multi-task modules and ignoring them for ordinary modules.

as_multitask(model, classifier_name)

Wraps a model to make it a multi-task model.

initialize_icarl_net(m)

Initialize the given network with kaiming_normal (mode=fan_in) for its Conv2d and Linear blocks.

pytorchcv_wrapper.get_model(name[, pretrained])

A direct wrapper around the model getter of pytorchcv.
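
A minimal sketch combining these helpers; the "classifier" attribute name and the pytorchcv model name "resnet20_cifar10" are assumptions that depend on the chosen model and the installed pytorchcv version:

    import torch
    from avalanche.models import SimpleMLP, as_multitask, avalanche_forward
    from avalanche.models.pytorchcv_wrapper import get_model

    # Replace the final layer of a single-head model (attribute name
    # "classifier") with a multi-head classifier.
    model = as_multitask(SimpleMLP(num_classes=10, input_size=32), "classifier")

    x = torch.randn(4, 32)
    task_labels = torch.zeros(4, dtype=torch.long)
    # avalanche_forward passes task labels to multi-task modules and
    # ignores them for ordinary modules.
    out = avalanche_forward(model, x, task_labels)

    # Fetch a pytorchcv backbone by name.
    backbone = get_model("resnet20_cifar10", pretrained=False)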

Dynamic optimizer utilities

Utilities to handle optimizer updates when using dynamic architectures. Dynamic Modules (e.g., multi-head classifiers) can change their parameters dynamically during training, which usually requires updating the optimizer to learn the new parameters or freeze the old ones. A minimal sketch follows the list below.

reset_optimizer(optimizer, model)

Reset the optimizer to update the list of learnable parameters.

update_optimizer(optimizer, new_params[, ...])

Update the optimizer by adding newly created parameters and removing parameters that no longer exist, for instance after the model has been adapted to a new task.

add_new_params_to_optimizer(optimizer, ...)

Add new parameters to the trainable parameters.
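
A minimal sketch of the intended flow; reset_optimizer is typically called by the training strategy right after model adaptation, and the import path and the use of IncrementalClassifier as the dynamic module are assumptions for illustration:

    from torch.optim import SGD
    from avalanche.models import IncrementalClassifier
    from avalanche.models.dynamic_optimizers import reset_optimizer

    model = IncrementalClassifier(in_features=32)
    optimizer = SGD(model.parameters(), lr=0.01)

    # ...suppose the head is adapted to new classes and grows new output
    # units: the optimizer still references the old parameter tensors, so
    # its list of learnable parameters must be rebuilt.
    reset_optimizer(optimizer, model)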