Models module
Dynamic Modules
Dynamic Modules are PyTorch modules that can be incrementally expanded to allow architectural modifications (multi-head classifiers, progressive networks, ...).

Base PyTorch module with support for task labels.

Output layer that incrementally adds units whenever new classes are encountered.

Multi-head classifier with a separate head for each task.

Equivalent to IncrementalClassifier, but using the cosine layer described in "Learning a Unified Classifier Incrementally via Rebalancing" by Saihui Hou et al.
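To illustrate the idea behind these dynamic modules, the sketch below shows a minimal incremental output layer that grows when new classes appear, copying old weights into the expanded layer. `IncrementalLinear` and its `adapt` method are hypothetical names for illustration, not the library's actual API:

```python
import torch
import torch.nn as nn


class IncrementalLinear(nn.Module):
    """Output layer that grows whenever classes beyond its current size appear."""

    def __init__(self, in_features: int, initial_classes: int = 2):
        super().__init__()
        self.classifier = nn.Linear(in_features, initial_classes)

    @torch.no_grad()
    def adapt(self, new_num_classes: int) -> None:
        """Expand the output layer, preserving the already-trained weights."""
        old = self.classifier
        if new_num_classes <= old.out_features:
            return  # no new classes, nothing to do
        new = nn.Linear(old.in_features, new_num_classes)
        new.weight[: old.out_features] = old.weight
        new.bias[: old.out_features] = old.bias
        self.classifier = new

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(x)
```

In a continual-learning loop, `adapt` would typically be called before training on each new experience, once the set of classes it contains is known.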
Progressive Neural Networks
Modules that implement progressive neural networks models, layers, and adapters.
Progressive Neural Network.

Progressive Neural Network layer.

Progressive Neural Network column.

Linear adapter for Progressive Neural Networks.

MLP adapter for Progressive Neural Networks.
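A rough sketch of how a progressive-network column combines its own path with lateral connections from earlier (frozen) columns through linear adapters. `PNNColumnSketch` is a hypothetical, simplified illustration of the concept, not the library's implementation:

```python
import torch
import torch.nn as nn


class PNNColumnSketch(nn.Module):
    """One column of a progressive network: its own linear layer plus one
    lateral linear adapter per previously trained column."""

    def __init__(self, in_features: int, out_features: int, num_prev_columns: int):
        super().__init__()
        self.layer = nn.Linear(in_features, out_features)
        # lateral adapters read the activations of the frozen earlier columns
        self.adapters = nn.ModuleList(
            nn.Linear(in_features, out_features, bias=False)
            for _ in range(num_prev_columns)
        )

    def forward(self, x: torch.Tensor, prev_activations: list) -> torch.Tensor:
        h = self.layer(x)
        # add the lateral contributions before the nonlinearity
        for adapter, prev in zip(self.adapters, prev_activations):
            h = h + adapter(prev)
        return torch.relu(h)
```

When a new task arrives, a fresh column with one more adapter is added while the earlier columns stay frozen, which is what makes the architecture "progressive".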
Models
Simple nn.Module to create a multi-layer perceptron with BatchNorm and ReLU activations.

Multi-layer perceptron for the TinyImageNet benchmark.

Convolutional Neural Network.

Convolutional Neural Network with a multi-head classifier.

Multi-Layer Perceptron with custom parameters.

Multi-layer perceptron with a multi-head classifier.
MobileNet v1 implementation.

NCM Classifier.

Model wrapper to reproduce the experiments from the original Deep Streaming Linear Discriminant Analysis paper by using a pretrained ResNet model.

Variational autoencoder module: fully connected and suited for any input shape and type.

Slimmed ResNet18.

Multi-task slimmed ResNet18.

Overall parent module that holds the dictionary of expert autoencoders and expert classifiers.

PackNet implements the PackNet algorithm for parameter isolation.

Convenience function for creating a PackNet-compatible

Similar to NCM, but uses the Mahalanobis distance instead of the L2 distance.

Cosine layer defined in "Learning a Unified Classifier Incrementally via Rebalancing" by Saihui Hou et al.

This class keeps two cosine linear layers, without sigma scaling, and handles the sigma parameter that is shared between them.

A PackNet module that has a weight and bias.
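The NCM classifier listed above predicts the class whose feature mean is closest to the input; the Mahalanobis-based variant only swaps the distance function. A minimal sketch of the L2 version (`NCMSketch` is a hypothetical name, not the library's class):

```python
import torch


class NCMSketch:
    """Nearest Class Mean classifier: predict the class whose stored
    feature mean is closest (L2 distance) to the input features."""

    def __init__(self):
        self.class_means = {}  # class id -> mean feature vector

    def fit(self, features: torch.Tensor, labels: torch.Tensor) -> None:
        # one prototype per class: the mean of that class's features
        for c in labels.unique():
            self.class_means[int(c)] = features[labels == c].mean(dim=0)

    def predict(self, features: torch.Tensor) -> torch.Tensor:
        classes = sorted(self.class_means)
        means = torch.stack([self.class_means[c] for c in classes])  # (C, D)
        dists = torch.cdist(features, means)  # (N, C) pairwise L2 distances
        idx = dists.argmin(dim=1)
        return torch.tensor([classes[i] for i in idx])
```

Because the prototypes are just running class means, this kind of classifier can be updated from a stream without gradient-based training of the head.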
Model Wrappers and Utilities
Wrappers and functions that add utility support to your models.
TrainEvalModel.

This PyTorch module allows us to extract features from a backbone network given a layer name.

A base abstract class for models.

Wraps around a model to make it a multi-task model.

Initialize the input network with kaiming_normal initialization (mode=fan_in) for Conv2d and Linear blocks.

This is a direct wrapper to the model getter of pytorchcv.
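Extracting features from a backbone given a layer name, as described above, is commonly done with a temporary forward hook. A minimal sketch of the technique (the `extract_features` helper is illustrative, not the library's API):

```python
import torch
import torch.nn as nn


def extract_features(model: nn.Module, layer_name: str, x: torch.Tensor) -> torch.Tensor:
    """Run model(x) and capture the output of the named submodule
    via a temporary forward hook."""
    captured = {}
    layer = dict(model.named_modules())[layer_name]

    def hook(_module, _inputs, output):
        captured["features"] = output

    handle = layer.register_forward_hook(hook)
    try:
        model(x)
    finally:
        handle.remove()  # always detach the hook, even on error
    return captured["features"]
```

For example, on `nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2))` the name `"1"` selects the ReLU, so the captured features are the post-activation hidden representation.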
Dynamic optimizer utilities
Utilities to handle the optimizer's update when using dynamic architectures. Dynamic Modules (e.g. multi-head classifiers) can change their parameters dynamically during training, which usually requires updating the optimizer to learn the new parameters or freeze the old ones.
Reset the optimizer to update the list of learnable parameters.

Update the optimizer by adding newly created parameters and removing deleted ones, for instance after the model has been adapted to a new task.

Add new parameters to the trainable parameters.
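The simplest of these strategies is to rebuild the optimizer over the model's current parameters after an adaptation, so newly added parameters are tracked. A sketch under that assumption (`reset_optimizer_sketch` is a hypothetical helper; note that rebuilding discards optimizer state such as momentum buffers):

```python
import torch
import torch.nn as nn


def reset_optimizer_sketch(optimizer_cls, model: nn.Module, **optim_kwargs):
    """Rebuild the optimizer from scratch over the model's current
    parameters, so parameters added by a dynamic module are trained.
    This discards optimizer state (e.g. momentum buffers)."""
    return optimizer_cls(model.parameters(), **optim_kwargs)


model = nn.Linear(4, 2)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
# ... the model is adapted, e.g. a new head submodule is registered ...
model.extra_head = nn.Linear(4, 3)
# the old optimizer does not know about extra_head's parameters, so rebuild it
opt = reset_optimizer_sketch(torch.optim.SGD, model, lr=0.1)
```

A state-preserving alternative is `opt.add_param_group({"params": [...]})`, which registers only the new parameters and keeps the existing optimizer state intact.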