Models module
Dynamic Modules
Dynamic Modules are PyTorch modules that can be incrementally expanded to support architectural modifications (multi-head classifiers, progressive networks, ...).
Multi-task modules are `torch.nn.Module` subclasses for multi-task scenarios.

- Output layer that incrementally adds units whenever new classes are encountered.
- Multi-head classifier with a separate head for each task.
- TrainEvalModel.
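The incremental output layer described above can be sketched as follows. This is a minimal illustration of the idea, not the library's actual API; the class and method names here are assumptions.

```python
import torch
import torch.nn as nn


class IncrementalLinear(nn.Module):
    """Illustrative output layer that grows when new classes appear
    (hypothetical name, not the library's real class)."""

    def __init__(self, in_features, initial_classes=2):
        super().__init__()
        self.classifier = nn.Linear(in_features, initial_classes)

    @torch.no_grad()
    def adaptation(self, max_label):
        """Enlarge the layer if a label exceeds current capacity,
        copying the old weights into the new, wider layer."""
        old = self.classifier
        if max_label + 1 <= old.out_features:
            return  # capacity is already sufficient
        new = nn.Linear(old.in_features, max_label + 1)
        new.weight[: old.out_features] = old.weight
        new.bias[: old.out_features] = old.bias
        self.classifier = new

    def forward(self, x):
        return self.classifier(x)


head = IncrementalLinear(in_features=8, initial_classes=2)
head.adaptation(max_label=4)  # a batch containing class 4 arrives
print(head.classifier.out_features)  # 5
```

Copying the old rows into the enlarged layer preserves what was learned for previously seen classes while adding fresh units for the new ones.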
Models
- Progressive Neural Network.
- Create …
- Multi-layer Perceptron for the TinyImageNet benchmark.
- Convolutional Neural Network.
- Convolutional Neural Network with multi-head classifier.
- Multi-Layer Perceptron with custom parameters.
- Multi-layer Perceptron with multi-head classifier.
- MobileNet v1 implementation.
- NCM Classifier.
- Model wrapper that reproduces the experiments from the original Deep Streaming Linear Discriminant Analysis paper using a pretrained ResNet model.
- Direct wrapper around the model getter of pytorchcv.
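As an illustration of the configurable models listed above, a multi-layer perceptron with custom parameters might look like the following sketch; the helper name and default sizes are assumptions, not the library's API.

```python
import torch
import torch.nn as nn


def make_mlp(input_size=32 * 32 * 3, hidden_size=512, num_classes=10):
    """Hypothetical builder for an MLP with configurable sizes."""
    return nn.Sequential(
        nn.Flatten(),                          # accept image-shaped inputs
        nn.Linear(input_size, hidden_size),
        nn.ReLU(inplace=True),
        nn.Dropout(),
        nn.Linear(hidden_size, num_classes),   # class logits
    )


net = make_mlp(input_size=16, hidden_size=8, num_classes=3)
logits = net(torch.randn(2, 16))  # batch of 2 -> (2, 3) logits
```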
Model Utilities
- Wraps a model to make it a multi-task model.
- Initialize the input network with kaiming_normal (mode=fan_in) for Conv2d and Linear blocks.
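The kaiming_normal initialization described above can be sketched as a small helper built on standard PyTorch init functions; the function name is hypothetical.

```python
import torch
import torch.nn as nn


def init_kaiming(model):
    """Apply kaiming_normal with mode='fan_in' to every Conv2d and
    Linear block (illustrative helper, not the library's function)."""
    for m in model.modules():
        if isinstance(m, (nn.Conv2d, nn.Linear)):
            nn.init.kaiming_normal_(m.weight, mode="fan_in")
            if m.bias is not None:
                nn.init.zeros_(m.bias)
    return model


net = init_kaiming(nn.Sequential(nn.Linear(4, 8), nn.ReLU(), nn.Linear(8, 2)))
```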
Dynamic optimizer utilities
Utilities to update the optimizer when using dynamic architectures. Dynamic Modules (e.g. multi-head classifiers) can change their parameters during training, which usually requires updating the optimizer to learn the new parameters or to freeze the old ones.
- Reset the optimizer to update the list of learnable parameters.
- Update the optimizer by substituting old_params with new_params.
- Add new parameters to the trainable parameters.
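The utilities above map onto standard PyTorch optimizer operations. A minimal sketch of the two common strategies, resetting the optimizer versus registering only the new parameters:

```python
import torch
import torch.nn as nn

model = nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# After a dynamic module grows (e.g. a new head is added), the optimizer
# still tracks the old parameter list.  Two common fixes:

# 1. Reset: rebuild the optimizer over the current parameters
#    (discards optimizer state such as momentum buffers).
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# 2. Add only the new parameters as an extra param group,
#    keeping the accumulated state of the old ones.
new_head = nn.Linear(4, 3)  # hypothetical newly added head
optimizer.add_param_group({"params": new_head.parameters()})

print(len(optimizer.param_groups))  # 2
```

Which strategy is appropriate depends on whether optimizer state (momentum, adaptive learning rates) for the old parameters should survive the architectural change.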