avalanche.checkpointing.constructor_based_serialization

avalanche.checkpointing.constructor_based_serialization(pickler, obj: T, cls: Type[T], deduplicate: bool = False, args=None, kwargs=None)[source]

This utility is used to manage the pickling of an object by storing only its constructor parameters.

This will also register the function that will be used to unpickle the object.

Classes whose objects can be serialized by only providing the constructor parameters can be registered using this utility.
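The underlying idea (serialize only what is needed to call the constructor again, and rebuild everything else on load) can be sketched with the standard library alone. The `Point` class below is a hypothetical example, not part of Avalanche; it uses `pickle`'s `__reduce__` protocol, which is conceptually similar to what this utility registers via dill:

```python
import pickle

class Point:
    """Toy class (hypothetical) that can be rebuilt from its
    constructor arguments alone."""
    def __init__(self, x, y):
        self.x = x
        self.y = y
        # Derived state: recomputed in __init__, so it never
        # needs to be stored in the pickle stream.
        self.norm_sq = x * x + y * y

    def __reduce__(self):
        # Tell pickle to store only (constructor, args):
        # unpickling calls Point(x, y) again instead of
        # restoring the full __dict__.
        return (Point, (self.x, self.y))

p = pickle.loads(pickle.dumps(Point(3, 4)))
print(p.x, p.y, p.norm_sq)  # 3 4 25
```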

The standard way to register a class is to put the following function in the same script where the class is defined (in this example, the class is CIFAR100):

```python
@dill.register(CIFAR100)
def checkpoint_CIFAR100(pickler, obj: CIFAR100):
    constructor_based_serialization(
        pickler, obj, CIFAR100,
        deduplicate=True,  # check constructor_based_serialization for details on de-duplication
        kwargs=dict(
            root=obj.root,
            train=obj.train,
            transform=obj.transform,
            target_transform=obj.target_transform,
        )
    )
```

Note that alternative mechanisms exist, such as implementing custom `__getstate__` and `__setstate__` methods, or manually providing a custom `@dill.register` function. For the latter option, see: https://stackoverflow.com/questions/27351980/how-to-add-a-custom-type-to-dills-pickleable-types
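To illustrate the `__getstate__`/`__setstate__` alternative, here is a minimal, self-contained sketch using the standard `pickle` module. The `Wrapper` class and its `_cache` attribute are hypothetical; the pattern saves only the picklable state and rebuilds transient state on load:

```python
import pickle

class Wrapper:
    """Hypothetical example: transient state is dropped on pickling
    and rebuilt on unpickling."""
    def __init__(self, path):
        self.path = path
        self._cache = {}  # transient, not worth serializing

    def __getstate__(self):
        # Drop the transient cache from the pickled state.
        state = self.__dict__.copy()
        del state['_cache']
        return state

    def __setstate__(self, state):
        self.__dict__.update(state)
        self._cache = {}  # rebuilt empty after unpickling

w = Wrapper('/tmp/data')
w._cache['k'] = 1
restored = pickle.loads(pickle.dumps(w))
print(restored.path, restored._cache)  # /tmp/data {}
```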

This mechanism also supports de-duplicating unique objects, such as datasets. This is useful to avoid duplicating memory usage when loading a checkpoint that contains a large number of datasets (or when an already-checkpointed experiment is re-loaded multiple times).

If deduplicate is True, then the object is marked as eligible for de-duplication. It will be de-duplicated (i.e., not loaded from the checkpoint) if the user provides, via the unique_objects parameter of maybe_load_checkpoint, an object with identical constructor parameters. De-duplication should be activated for dataset objects.
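A simplified, hypothetical sketch of the de-duplication idea (not Avalanche's actual implementation): at load time, an eligible object is replaced by a user-provided "unique object" whose class and constructor parameters match, instead of being rebuilt from the checkpoint. The `dedup_key`, `load_or_reuse`, and `Dataset` names below are made up for illustration:

```python
def dedup_key(cls, kwargs):
    # Identity is the class plus its (sorted) constructor kwargs.
    return (cls.__name__, tuple(sorted(kwargs.items())))

def load_or_reuse(cls, kwargs, unique_objects):
    key = dedup_key(cls, kwargs)
    if key in unique_objects:
        return unique_objects[key]   # reuse: no duplicated memory
    return cls(**kwargs)             # fall back to reconstruction

class Dataset:
    """Stand-in for a large dataset object."""
    def __init__(self, root, train):
        self.root, self.train = root, train

existing = Dataset(root='/data', train=True)
unique = {dedup_key(Dataset, dict(root='/data', train=True)): existing}

same = load_or_reuse(Dataset, dict(root='/data', train=True), unique)
other = load_or_reuse(Dataset, dict(root='/data', train=False), unique)
print(same is existing, other is existing)  # True False
```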