Models module
Dynamic Modules
Dynamic Modules are PyTorch modules, provided by Avalanche, that can be incrementally expanded to allow architectural modifications (multi-head classifiers, progressive networks, ...).

- Base PyTorch module with support for task labels.
- Output layer that incrementally adds units whenever new classes are encountered.
- Multi-head classifier with separate heads for each task.
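To make the "incrementally adds units" idea concrete, here is a minimal, hypothetical sketch of such an output layer in plain PyTorch. The class name `IncrementalLinearClassifier` and its `adapt` method are illustrative assumptions, not Avalanche's actual API, which also handles details such as masking unseen units.

```python
import torch
import torch.nn as nn


class IncrementalLinearClassifier(nn.Module):
    """Linear output layer that grows when new classes appear (sketch)."""

    def __init__(self, in_features: int, initial_classes: int = 2):
        super().__init__()
        self.classifier = nn.Linear(in_features, initial_classes)

    @torch.no_grad()
    def adapt(self, num_classes: int) -> None:
        """Expand the output layer, copying over the old weights."""
        old = self.classifier
        if num_classes <= old.out_features:
            return  # no new classes, nothing to do
        new = nn.Linear(old.in_features, num_classes)
        new.weight[: old.out_features] = old.weight
        new.bias[: old.out_features] = old.bias
        self.classifier = new

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(x)


head = IncrementalLinearClassifier(in_features=8, initial_classes=2)
head.adapt(5)  # three new classes were encountered
out = head(torch.randn(4, 8))
print(out.shape)  # torch.Size([4, 5])
```

The key design point is that expansion preserves the weights of already-learned classes, so previously acquired knowledge in the head is not reset when new classes arrive.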
Models
- Simple nn.Module to create a multi-layer perceptron with BatchNorm and ReLU activations.
- Multi-layer perceptron for the TinyImageNet benchmark.
- Convolutional neural network.
- Convolutional neural network with multi-head classifier.
- Multi-layer perceptron with custom parameters.
- Multi-layer perceptron with multi-head classifier.
- MobileNet v1 implementation.
- NCM (Nearest Class Mean) classifier.
- Model wrapper to reproduce the experiments from the original Deep Streaming Linear Discriminant Analysis paper using a pretrained ResNet model.
- Variational autoencoder module: fully connected and suited for any input shape and type.
- Slimmed ResNet18.
- Multi-task slimmed ResNet18.
Progressive Neural Networks
Modules that implement Progressive Neural Network models, layers, and adapters.

- Progressive Neural Network.
- Progressive Neural Network layer.
- Progressive Neural Network column.
- Linear adapter for Progressive Neural Networks.
- MLP adapter for Progressive Neural Networks.
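The defining feature of a Progressive Neural Network layer is the lateral connection: each previous (frozen) column feeds its activation at the same depth through a per-column adapter into the new column. The sketch below assumes linear adapters and the hypothetical name `PNNLinearLayer`; it is an illustration of the mechanism, not Avalanche's implementation.

```python
import torch
import torch.nn as nn


class PNNLinearLayer(nn.Module):
    """One layer of one PNN column with linear lateral adapters (sketch)."""

    def __init__(self, in_features: int, out_features: int, num_prev_columns: int):
        super().__init__()
        self.within = nn.Linear(in_features, out_features)  # this column's weights
        # one linear adapter per previous (frozen) column
        self.laterals = nn.ModuleList(
            nn.Linear(in_features, out_features) for _ in range(num_prev_columns)
        )

    def forward(self, x: torch.Tensor, prev_activations=()) -> torch.Tensor:
        h = self.within(x)
        for adapter, a in zip(self.laterals, prev_activations):
            h = h + adapter(a)  # sum in the lateral contribution
        return torch.relu(h)


# A layer in the third column: lateral inputs come from two frozen columns.
layer = PNNLinearLayer(16, 8, num_prev_columns=2)
x = torch.randn(4, 16)
prev = [torch.randn(4, 16), torch.randn(4, 16)]
print(layer(x, prev).shape)  # torch.Size([4, 8])
```

Because earlier columns are frozen, new tasks can reuse their features via the adapters without ever overwriting them, which is how PNNs avoid catastrophic forgetting at the cost of growing capacity per task.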
Model Wrappers and Utilities
Wrappers and functions that add utility support to your models.
- TrainEvalModel.
- PyTorch module that extracts features from a backbone network given a layer name.
- Base abstract class for models.
- Wrapper that turns a model into a multi-task model.
- Initialize the input network with kaiming_normal (mode=fan_in) for Conv2d and Linear blocks.
- Direct wrapper around the model getter of pytorchcv.
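The feature-extraction wrapper above can be understood through PyTorch forward hooks: register a hook on the named submodule, run a forward pass, and capture that submodule's output. The helper below (`extract_features`, a hypothetical name) is a minimal sketch of that idea; the actual wrapper class keeps the hook installed across calls.

```python
import torch
import torch.nn as nn


def extract_features(model: nn.Module, layer_name: str, x: torch.Tensor) -> torch.Tensor:
    """Capture the output of a named submodule via a forward hook (sketch)."""
    captured = {}

    def hook(_module, _inputs, output):
        captured["out"] = output

    # look up the submodule by its qualified name and attach the hook
    handle = dict(model.named_modules())[layer_name].register_forward_hook(hook)
    try:
        model(x)  # run a forward pass so the hook fires
    finally:
        handle.remove()  # always detach the hook, even on error
    return captured["out"]


backbone = nn.Sequential(  # toy backbone standing in for a real ResNet
    nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 5)
)
feats = extract_features(backbone, "1", torch.randn(3, 10))  # output of the ReLU
print(feats.shape)  # torch.Size([3, 32])
```

For a real pretrained network the `layer_name` would be a dotted path such as `"layer4"` from `model.named_modules()`.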
Dynamic optimizer utilities
Utilities to handle optimizer updates when using dynamic architectures. Dynamic Modules (e.g. multi-head classifiers) can change their parameters dynamically during training, which usually requires updating the optimizer to learn the new parameters or to freeze the old ones.

- Reset the optimizer to update the list of learnable parameters.
- Update the optimizer by substituting old_params with new_params.
- Add new parameters to the trainable parameters.
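As an illustration of why such utilities are needed: a `torch.optim` optimizer holds references to the parameter tensors it was built with, so parameters created after construction are silently ignored. The helper below (`reset_optimizer_params`, a hypothetical name) sketches the simplest strategy, rebuilding the optimizer's parameter list from scratch; note that this discards per-parameter state such as momentum buffers.

```python
import torch
import torch.nn as nn


def reset_optimizer_params(optimizer: torch.optim.Optimizer, model: nn.Module) -> None:
    """Re-register the model's current parameters with the optimizer (sketch)."""
    optimizer.param_groups.clear()  # drop the stale parameter groups
    optimizer.state.clear()         # drop stale per-parameter state (e.g. momentum)
    optimizer.add_param_group({"params": list(model.parameters())})


model = nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))
opt = torch.optim.SGD(model.parameters(), lr=0.1)

model[2] = nn.Linear(8, 5)  # the output head grew, e.g. new classes arrived
reset_optimizer_params(opt, model)

n_opt = sum(p.numel() for g in opt.param_groups for p in g["params"])
n_model = sum(p.numel() for p in model.parameters())
print(n_opt == n_model)  # True
```

A finer-grained alternative, closer in spirit to the substitution/addition utilities listed above, would swap or append only the affected parameters so that optimizer state for untouched parameters is preserved.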