Models module

This module provides models and building blocks to design continual learning architectures.


Dynamic Modules

Dynamic Modules are PyTorch modules that can be incrementally expanded to allow architectural modifications (multi-head classifiers, progressive networks, …).

MultiTaskModule()

Base PyTorch module with support for task labels.

IncrementalClassifier(in_features[, ...])

Output layer that incrementally adds units whenever new classes are encountered.
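
The behavior can be sketched in a few lines of plain PyTorch. This is an illustrative toy (names such as GrowingClassifier are made up here), not the Avalanche implementation:

    import torch
    import torch.nn as nn

    class GrowingClassifier(nn.Module):
        """Toy linear head that grows when new classes appear."""

        def __init__(self, in_features, initial_classes=2):
            super().__init__()
            self.classifier = nn.Linear(in_features, initial_classes)

        @torch.no_grad()
        def adapt(self, num_classes):
            old = self.classifier
            if num_classes <= old.out_features:
                return  # no new classes: nothing to expand
            new = nn.Linear(old.in_features, num_classes)
            # keep the weights already learned for the old units
            new.weight[: old.out_features] = old.weight
            new.bias[: old.out_features] = old.bias
            self.classifier = new

        def forward(self, x):
            return self.classifier(x)

IncrementalClassifier applies the same idea automatically, expanding whenever unseen class labels are encountered.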

MultiHeadClassifier(in_features[, ...])

Multi-head classifier with separate heads for each task.
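
The multi-head pattern can be approximated with a ModuleDict keyed by task label. A minimal sketch with hypothetical names, not the library code:

    import torch.nn as nn

    class ToyMultiHead(nn.Module):
        """Toy multi-head classifier: one independent head per task."""

        def __init__(self, in_features, classes_per_task):
            super().__init__()
            self.in_features = in_features
            self.classes_per_task = classes_per_task
            self.heads = nn.ModuleDict()  # keys must be strings

        def add_head(self, task_label):
            self.heads[str(task_label)] = nn.Linear(
                self.in_features, self.classes_per_task
            )

        def forward(self, x, task_label):
            return self.heads[str(task_label)](x)

Keeping one head per task prevents the output units of earlier tasks from being overwritten while new tasks are learned.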

TrainEvalModel(feature_extractor, ...)

Wraps a common feature extractor together with two classifiers: one used during training and a different one used at evaluation time.

Models

Neural network architectures that can be used as backbones for CL experiments.

PNN([num_layers, in_features, ...])

Progressive Neural Network.

MLP(hidden_size[, last_activation])

Simple nn.Module to create a multi-layer perceptron with BatchNorm and ReLU activations.

make_icarl_net(num_classes[, n, c])

Create an IcarlNet, the ResNet variant used in iCaRL.

IcarlNet(num_classes[, n, c])

The ResNet variant used by the iCaRL strategy.

SimpleMLP_TinyImageNet([num_classes, ...])

Multi-layer Perceptron for TinyImageNet benchmark.

SimpleCNN([num_classes])

Simple Convolutional Neural Network.

MTSimpleCNN()

Convolutional Neural Network with a multi-head classifier.

SimpleMLP([num_classes, input_size, ...])

Multi-Layer Perceptron with custom parameters.

MTSimpleMLP([input_size, hidden_size])

Multi-layer perceptron with a multi-head classifier.

MobilenetV1([pretrained, latent_layer_num])

MobileNet v1 implementation.

NCMClassifier([class_mean])

Nearest Class Mean (NCM) classifier.
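
The classification rule itself is simple: assign each sample to the class whose stored mean is closest in feature space. A self-contained sketch of the rule (not the library code):

    import torch

    def ncm_predict(features, class_means):
        """features: (batch, d); class_means: (num_classes, d)."""
        # pairwise Euclidean distances between samples and class means
        dists = torch.cdist(features, class_means)  # (batch, num_classes)
        return dists.argmin(dim=1)

    # toy usage with two well-separated class means
    means = torch.tensor([[0.0, 0.0], [10.0, 10.0]])
    x = torch.tensor([[0.5, -0.2], [9.0, 11.0]])
    print(ncm_predict(x, means))  # tensor([0, 1])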

SLDAResNetModel([arch, output_layer_name, ...])

Model wrapper to reproduce the experiments from the original Deep Streaming Linear Discriminant Analysis paper using a pretrained ResNet model.

pytorchcv_wrapper.get_model(name[, pretrained])

This is a direct wrapper to the model getter of pytorchcv.
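
For example, a pretrained backbone can be fetched by its pytorchcv name (the model identifier below is one of pytorchcv's published names):

    from avalanche.models import pytorchcv_wrapper

    # ResNet-20 trained on CIFAR-10, taken from pytorchcv's model zoo
    model = pytorchcv_wrapper.get_model("resnet20_cifar10", pretrained=True)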

Model Utilities

avalanche_forward(model, x, task_labels)

Forward helper that works with both plain and multi-task models: task labels are forwarded to MultiTaskModule instances and ignored otherwise.
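
Conceptually, the helper amounts to the following dispatch (a sketch, not the library source):

    from avalanche.models import MultiTaskModule

    def forward_sketch(model, x, task_labels):
        if isinstance(model, MultiTaskModule):
            return model(x, task_labels)  # multi-task model: route by task
        return model(x)  # plain nn.Module: task labels are ignored
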
as_multitask(model, classifier_name)

Wraps around a model to make it a multitask model.

initialize_icarl_net(m)

Initialize the given network using kaiming_normal with mode=fan_in for Conv2d and Linear blocks.
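
The same scheme can be written as a plain PyTorch init function (a sketch of the scheme described above, not the library function):

    import torch.nn as nn

    def kaiming_fan_in_init(m):
        """Apply kaiming_normal (mode='fan_in') to Conv2d and Linear layers."""
        if isinstance(m, (nn.Conv2d, nn.Linear)):
            nn.init.kaiming_normal_(m.weight, mode="fan_in")
            if m.bias is not None:
                nn.init.zeros_(m.bias)

    # apply recursively to every submodule:
    # model.apply(kaiming_fan_in_init)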

Dynamic optimizer utilities

Utilities to handle optimizer updates when using dynamic architectures. Dynamic Modules (e.g. multi-head classifiers) can change their parameters dynamically during training, which usually requires updating the optimizer to learn the new parameters or to freeze the old ones.
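
A minimal illustration of the problem in plain PyTorch (toy model, illustrative only): after the architecture grows, the optimizer still references the old parameter tensors and must be refreshed.

    import torch.nn as nn
    from torch.optim import SGD

    model = nn.Linear(32, 2)  # stand-in for a dynamic module
    optimizer = SGD(model.parameters(), lr=0.01)

    # new classes arrive and the head is replaced by a larger one
    model = nn.Linear(32, 4)

    # the optimizer still tracks the old tensors; rebuild it over the
    # current parameters (the utilities below automate variants of this)
    optimizer = SGD(model.parameters(), lr=0.01)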

reset_optimizer(optimizer, model)

Reset the optimizer to update the list of learnable parameters.

update_optimizer(optimizer, old_params, ...)

Update the optimizer by substituting old_params with new_params.

add_new_params_to_optimizer(optimizer, ...)

Add new parameters to the trainable parameters.