From 52a018ac84dadf9c1229daf19b188fb0545d232e Mon Sep 17 00:00:00 2001 From: Louis Dupont Date: Sun, 20 Aug 2023 15:56:29 +0300 Subject: [PATCH 1/7] wip --- documentation/source/configuration_files.md | 77 +------------ documentation/source/factories.md | 113 ++++++++++++++++++++ mkdocs.yml | 1 + 3 files changed, 119 insertions(+), 72 deletions(-) create mode 100644 documentation/source/factories.md diff --git a/documentation/source/configuration_files.md b/documentation/source/configuration_files.md index a3264f1a39..3798c01f7a 100644 --- a/documentation/source/configuration_files.md +++ b/documentation/source/configuration_files.md @@ -178,79 +178,12 @@ third_of_list: "${getitem: ${my_list}, 2}" first_of_list: "${first: ${my_list}}" last_of_list: "${last: ${my_list}}" ``` +You can register any additional resolver you want by simply following the official [documentation](https://omegaconf.readthedocs.io/en/latest/usage.html#resolvers). -The more advanced resolvers will instantiate objects. In the following example we define a few transforms that -will be used to augment a dataset. -```yaml -train_dataset_params: - transforms: - # for more options see common.factories.transforms_factory.py - - SegColorJitter: - brightness: 0.1 - contrast: 0.1 - saturation: 0.1 - - - SegRandomFlip: - prob: 0.5 - - - SegRandomRescale: - scales: [ 0.4, 1.6 ] -``` -Each one of the keys (`SegColorJitter`, `SegRandomFlip`, `SegRandomRescale`) is mapped to a type, and the configuration parameters under that key will be passed -to the type constructor by name (as key word arguments). - -If you want to see where this magic is happening, you can look for the `@resolve_param` decorator in the code - -```python -class ImageNetDataset(torch_datasets.ImageFolder): - - @resolve_param("transforms", factory=TransformsFactory()) - def __init__(self, root: str, transforms: Union[list, dict] = [], *args, **kwargs): - ... - ... 
-```
-
-The `@resolve_param` wraps functions and resolves a string or a dictionary argument (in the example above "transforms") to an object.
-To do so, it uses a factory object that maps a string or a dictionary to a type. when `__init__(..)` will be called, the function will receive
-an object, and not a dictionary. The parameters under "transforms" in the YAML will be passed as
-arguments for instantiation the objects. We will learn how to add a new type of object into these mappings in the next sections.
-
-## Registering a new object
-To use a new object from your configuration file, you need to define the mapping of the string to a type.
-This is done using one of the many registration function supported by SG.
-```python
-register_model
-register_detection_module
-register_metric
-register_loss
-register_dataloader
-register_callback
-register_transform
-register_dataset
-```
-
-These decorator functions can be imported and used as follows:
-
-```python
-from super_gradients.common.registry import register_model
-
-@register_model(name="MyNet")
-class MyExampleNet(nn.Module):
-    def __init__(self, num_classes: int):
-        ....
-```
-
-This simple decorator, maps the name "MyNet" to the type `MyExampleNet`. Note that if your constructor
-include required arguments, you will be expected to provide them when using this string
-
-```yaml
-...
-architecture:
-  MyNet:
-    num_classes: 8
-...
-
-```
+## Factories
+Factories a similar to resolvers but were built specifically to instantiate SuperGradients objects within a recipe.
+This is a key feature of SuperGradients, used in all of our recipes, and we recommend you to
+go over the [documentation](factories.md).

## Required Hyper-Parameters
Most parameters can be defined by default when including `default_train_params` in you `defaults`.
diff --git a/documentation/source/factories.md b/documentation/source/factories.md new file mode 100644 index 0000000000..39e9588266 --- /dev/null +++ b/documentation/source/factories.md @@ -0,0 +1,113 @@ +# Working with Factories + +Factories in SuperGradients provide a powerful and concise way to instantiate objects in your configuration files. + +Prerequisites: +- [Training with Configuration Files](configuration_files.md) + +In this tutorial, we'll cover how to use existing factories, register new ones, and briefly explore the implementation details. + +## Utilizing Existing Factories + +Let's start by looking at how existing factories can be utilized to define a sequence of transforms for augmenting a dataset. + +```yaml +train_dataset_params: + transforms: + - SegColorJitter: + brightness: 0.1 + contrast: 0.1 + saturation: 0.1 + + - SegRandomFlip: + prob: 0.5 + + - SegRandomRescale: + scales: [0.4, 1.6] +``` + +In this example, SuperGradients will recognize the keys (`SegColorJitter`, `SegRandomFlip`, `SegRandomRescale`) +which refer to SuperGradient's classes. They will be instantiated and passed to the Dataset constructor. + +## Registering a Class + +To use a new object from your configuration file, you need to define the mapping of the string to a type. +This can be done using a registration functions. + +Here's an example of how you can register a new model called `MyNet`: + +```python +from super_gradients.common.registry import register_model + +@register_model(name="MyNet") +class MyExampleNet(nn.Module): + def __init__(self, num_classes: int): + .... +``` + +This simple decorator maps the name "MyNet" to the type `MyExampleNet`. If your constructor includes required arguments, +you will be expected to provide them in your YAML file: + +```yaml +architecture: + MyNet: + num_classes: 8 +``` + +Last step; make sure that you actually import the module including `MyExampleNet` into your script. 
+```python +from my_module import MyExampleNet # Importing the module is enough as it will trigger the register_model function + +@hydra.main(config_path=pkg_resources.resource_filename("super_gradients.recipes", ""), version_base="1.2") +def main(cfg: DictConfig) -> None: + Trainer.train_from_config(cfg) + +def run(): + init_trainer() + main() + +if __name__ == "__main__": + run() +``` + +## Under the Hood + +Now, let's briefly look at how factories used within SuperGradients. If you want to explore this magic, you can look for the `@resolve_param` decorator in the code. + +```python +class ImageNetDataset(torch_datasets.ImageFolder): + + @resolve_param("transforms", factory=TransformsFactory()) + def __init__(self, root: str, transforms: Union[list, dict] = [], *args, **kwargs): + ... + ... +``` + +The `@resolve_param` wraps functions and resolves a string or dictionary argument (in the example above "transforms") to an object. +When `__init__(..)` is called, the function will receive an object, not a dictionary. +The parameters under "transforms" in the YAML will be passed as arguments for instantiation. 
+
+## Supported Factory Tyoes
+Each of the type
+```
+ register_model
+ register_kd_model
+ register_detection_module
+ register_metric
+ register_loss
+ register_dataloader
+ register_callback
+ register_transform
+ register_dataset
+ register_pre_launch_callback
+ register_unet_backbone_stage
+ register_unet_up_block
+ register_target_generator
+ register_lr_scheduler
+ register_lr_warmup
+ register_sg_logger
+ register_collate_function
+ register_sampler
+ register_optimizer
+ register_processing
+```
diff --git a/mkdocs.yml b/mkdocs.yml
index 35c09e445d..79c640924f 100644
--- a/mkdocs.yml
+++ b/mkdocs.yml
@@ -27,6 +27,7 @@ nav:
     - Phase Callbacks: ./documentation/source/PhaseCallbacks.md
     - YAMLs and Recipes:
       - Configurations: ./documentation/source/configuration_files.md
+      - Factories: ./documentation/source/factories.md
       - Recipes: ./src/super_gradients/recipes/Training_Recipes.md
       - Checkpoints: ./documentation/source/Checkpoints.md
       - Docker: ./documentation/source/SGDocker.md

From ca51e01cf25257aa7d8e3960467c6d3be38c75fb Mon Sep 17 00:00:00 2001
From: Louis Dupont
Date: Sun, 20 Aug 2023 22:15:01 +0300
Subject: [PATCH 2/7] clean version + update register in init

Signed-off-by: Louis Dupont
---
 documentation/source/configuration_files.md |   1 +
 documentation/source/factories.md           | 167 ++++++++++++++----
 .../common/registry/__init__.py             |  47 ++++-
 3 files changed, 176 insertions(+), 39 deletions(-)

diff --git a/documentation/source/configuration_files.md b/documentation/source/configuration_files.md
index 3798c01f7a..2155b1b7b4 100644
--- a/documentation/source/configuration_files.md
+++ b/documentation/source/configuration_files.md
@@ -163,6 +163,7 @@ initial learning-rate. This feature is extremely usefully when experimenting wit
 Note that the arguments are referenced without the `--` prefix and that each parameter is referenced with its full path in the
 configuration tree, concatenated with a `.`.
+ ## Resolvers Resolvers are converting the strings from the YAML file into Python objects or values. The most basic resolvers are the Hydra native resolvers. Here are a few simple examples: diff --git a/documentation/source/factories.md b/documentation/source/factories.md index 39e9588266..ec87206d28 100644 --- a/documentation/source/factories.md +++ b/documentation/source/factories.md @@ -7,9 +7,11 @@ Prerequisites: In this tutorial, we'll cover how to use existing factories, register new ones, and briefly explore the implementation details. -## Utilizing Existing Factories +## Using Existing Factories -Let's start by looking at how existing factories can be utilized to define a sequence of transforms for augmenting a dataset. +If you had a look at the [recipes](https://github.com/Deci-AI/super-gradients/tree/master/src/super_gradients/recipes), you may have noticed that many objects are defined directly in the recipes. + +In the [Supervisely dataset recipe](https://github.com/Deci-AI/super-gradients/blob/master/src/super_gradients/recipes/dataset_params/supervisely_persons_dataset_params.yaml) you can see the following ```yaml train_dataset_params: @@ -18,78 +20,170 @@ train_dataset_params: brightness: 0.1 contrast: 0.1 saturation: 0.1 - - SegRandomFlip: prob: 0.5 - - SegRandomRescale: scales: [0.4, 1.6] ``` +If you load the `.yaml` recipe as is into a python dictionary, you would get the following +```python +{ + "train_dataset_params": { + "transforms": [ + { + "SegColorJitter": { + "brightness": 0.1, + "contrast": 0.1, + "saturation": 0.1 + } + }, + { + "SegRandomFlip": { + "prob": 0.5 + } + }, + { + "SegRandomRescale": { + "scales": [0.4, 1.6] + } + } + ] + } +} +``` + +This configuration alone is not very useful, as we need instances of the classes, not just their configurations. +So we would like to somehow instantiate these classes `SegColorJitter`, `SegRandomFlip` and `SegRandomRescale`. 
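Conceptually, turning such a dictionary into objects only needs a name-to-class mapping plus a tiny instantiation helper. The following is a minimal, hypothetical sketch of the idea — our own illustration of the mechanism, not SuperGradients' actual implementation:

```python
# Toy name-to-class registry; SuperGradients maintains real registries like this internally.
REGISTRY = {}

def register(name):
    def decorator(cls):
        REGISTRY[name] = cls
        return cls
    return decorator

@register("SegRandomFlip")
class SegRandomFlip:
    def __init__(self, prob):
        self.prob = prob

def instantiate(entry):
    # `entry` looks like one item of the loaded recipe dict, e.g. {"SegRandomFlip": {"prob": 0.5}}:
    # the single key is the registered name, the value holds the constructor kwargs.
    (name, params), = entry.items()
    return REGISTRY[name](**params)

transform = instantiate({"SegRandomFlip": {"prob": 0.5}})
print(type(transform).__name__, transform.prob)  # SegRandomFlip 0.5
```

The real factories add error reporting, nesting, and support for plain strings, but the core string-to-instance step is this lookup plus keyword-argument expansion.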
-In this example, SuperGradients will recognize the keys (`SegColorJitter`, `SegRandomFlip`, `SegRandomRescale`)
-which refer to SuperGradient's classes. They will be instantiated and passed to the Dataset constructor.
+Factories in SuperGradients come into play here! All these objects were registered beforehand in SuperGradients,
+so that when you write these names in the recipe, SuperGradients will detect and instantiate them.
 
 ## Registering a Class
 
-To use a new object from your configuration file, you need to define the mapping of the string to a type.
-This can be done using a registration functions.
+As explained above, only registered objects can be instantiated.
+This registration consists of mapping the object to the corresponding type.
 
-Here's an example of how you can register a new model called `MyNet`:
+In the example above, the string `"SegColorJitter"` was mapped to the class `SegColorJitter`. and
+this is how SuperGradients knows how to convert the string defined in the recipe, into an object.
+
+You can register the class using a name different from the actual class name.
+However, it's generally recommended to use the same name for consistency and clarity.
+
+### Example
 
 ```python
-from super_gradients.common.registry import register_model
+from super_gradients.common.registry import register_transform
 
-@register_model(name="MyNet")
-class MyExampleNet(nn.Module):
-    def __init__(self, num_classes: int):
-        ....
+@register_transform(name="MyTransformName")
+class MyTransform:
+    def __init__(self, prob: float):
+        ...
 ```
+In this simple example, we register a new transform.
+Note that here we registered (for the sake of the example) the class `MyTransform` to the name `MyTransformName`, which is different.
+We strongly recommend not doing this; instead, register each class under its own name.
 
-This simple decorator maps the name "MyNet" to the type `MyExampleNet`. If your constructor includes required arguments,
-you will be expected to provide them in your YAML file:
-
-```yaml
-architecture:
-  MyNet:
-    num_classes: 8
-```
+Once you have registered a class, you can use it in your recipe. Here, we add this transform to the original recipe:
+```yaml
+train_dataset_params:
+  transforms:
+    - SegColorJitter:
+        brightness: 0.1
+        contrast: 0.1
+        saturation: 0.1
+    - SegRandomFlip:
+        prob: 0.5
+    - SegRandomRescale:
+        scales: [0.4, 1.6]
+    - MyTransformName: # We use the registered name, which may differ from the class name
+        prob: 0.7
+```
 
-Last step; make sure that you actually import the module including `MyExampleNet` into your script.
+Final step: ensure that you import the module containing `MyTransformName` into your script.
+Doing so will trigger the registration function, allowing SuperGradients to recognize it.
+
+Here is an example (adapted from the [train_from_recipe script](https://github.com/Deci-AI/super-gradients/blob/master/src/super_gradients/train_from_recipe.py)).
+
```python
from .my_module import MyTransform # Importing the module is enough as it will trigger the register_model function

-@hydra.main(config_path=pkg_resources.resource_filename("super_gradients.recipes", ""), version_base="1.2")
-def main(cfg: DictConfig) -> None:
+# The code below is the same as the basic `train_from_recipe.py` script
+# See: https://github.com/Deci-AI/super-gradients/blob/master/src/super_gradients/train_from_recipe.py
+from omegaconf import DictConfig
+import hydra
+
+from super_gradients import Trainer, init_trainer
+
+
+@hydra.main(config_path="recipes", version_base="1.2")
+def _main(cfg: DictConfig) -> None:
     Trainer.train_from_config(cfg)

-def run():
-    init_trainer()
-    main()
+
+def main() -> None:
+    init_trainer()  # `init_trainer` needs to be called before `@hydra.main`
+    _main()
+
 if __name__ == "__main__":
-    run()
+    main()
+
```

## Under the Hood

-Now, let's briefly look at how factories used within SuperGradients. If you want to explore this magic, you can look for the `@resolve_param` decorator in the code.
+So far, we have seen how to use existing factories and how to register new ones.
+In some cases, you may want to create objects that would benefit from using the factories.
+
+### Basic
+The most basic way to use a factory is shown below.
+```python
+from super_gradients.common.factories import TransformsFactory
+factory = TransformsFactory()
+my_transform = factory.get({'MyTransformName': {'prob': 0.7}})
+```
+You may recognize that the input passed to `factory.get` is actually the dictionary that we get after loading the recipe
+(See [Utilizing Existing Factories](#utilizing-existing-factories))
+
+### Recommended
+Factories become even more powerful when used with the `@resolve_param` decorator.
+This feature allows functions to accept both instantiated objects and their dictionary representations.
+It means you can pass either the actual Python object or a dictionary that describes it straight from the recipe.

```python
class ImageNetDataset(torch_datasets.ImageFolder):

    @resolve_param("transforms", factory=TransformsFactory())
-    def __init__(self, root: str, transforms: Union[list, dict] = [], *args, **kwargs):
-        ...
+    def __init__(self, root: str, transform: Transform):
         ...
```

-The `@resolve_param` wraps functions and resolves a string or dictionary argument (in the example above "transforms") to an object.
-When `__init__(..)` is called, the function will receive an object, not a dictionary.
-The parameters under "transforms" in the YAML will be passed as arguments for instantiation.
+Now, `ImageNetDataset` can be passed both an instance of `MyTransform`:
+
+```python
+my_transform = MyTransform(prob=0.7)
+ImageNetDataset(root=..., transform=my_transform)
+```
-
-## Supported Factory Tyoes
-Each of the type
+and a dictionary representing the same object:
+```python
+my_transform = {'MyTransformName': {'prob': 0.7}}
+ImageNetDataset(root=..., transform=my_transform)
 ```
+
+This second way of instantiating the dataset combines perfectly with the concept of `.yaml` recipes.
+
+## Supported Factory Types
+So far, we have focused on a single type of factory, `TransformsFactory`,
+associated with the registration decorator `register_transform`.
+
+SuperGradients supports a wide range of factories used throughout the training process,
+each associated with its own registration decorator.
+
+```python
+from super_gradients.common.registry import (
 register_model
 register_kd_model
 register_detection_module
 register_metric
 register_loss
 register_dataloader
 register_callback
 register_transform
 register_dataset
 register_pre_launch_callback
 register_unet_backbone_stage
 register_unet_up_block
 register_target_generator
 register_lr_scheduler
 register_lr_warmup
 register_sg_logger
 register_collate_function
 register_sampler
 register_optimizer
 register_processing
+)
 ```
diff --git a/src/super_gradients/common/registry/__init__.py b/src/super_gradients/common/registry/__init__.py
index a72c6d3465..291afde3bf 100644
--- a/src/super_gradients/common/registry/__init__.py
+++ b/src/super_gradients/common/registry/__init__.py
@@ -1,4 +1,45 @@
-from super_gradients.common.registry.registry import register_model, register_metric, register_loss, register_detection_module, register_lr_scheduler
+from super_gradients.common.registry.registry import (
+    register_model,
+    register_kd_model,
+    register_detection_module,
+    register_metric,
+    register_loss,
+    register_dataloader,
+    register_callback,
+    register_transform,
+    register_dataset,
+    register_pre_launch_callback,
+    register_unet_backbone_stage,
+    register_unet_up_block,
+    register_target_generator,
+    register_lr_scheduler,
+    register_lr_warmup,
+    register_sg_logger,
+    register_collate_function,
+    register_sampler,
+    register_optimizer,
+    register_processing,
+)
-
-__all__ = ["register_model", "register_detection_module", "register_metric", "register_loss", "register_lr_scheduler"]
+__all__ = [
+    "register_model",
+    "register_kd_model",
+    "register_detection_module",
+    "register_metric",
+    "register_loss",
+    "register_dataloader",
+    "register_callback",
+    "register_transform",
+    "register_dataset",
+    "register_pre_launch_callback",
+    "register_unet_backbone_stage",
+    "register_unet_up_block",
+    "register_target_generator",
+    "register_lr_scheduler",
+    "register_lr_warmup",
+    "register_sg_logger",
+    "register_collate_function",
+    "register_sampler",
+    "register_optimizer",
+    "register_processing",
+]

From b1bb5f0783ef52688d0febb4a0445da714a5a298 Mon Sep 17 00:00:00 2001
From: Louis Dupont
Date: Sun, 20 Aug 2023 22:20:32 +0300
Subject: [PATCH 3/7] minor change

---
 documentation/source/factories.md | 7 +++----
 1 file changed, 3 insertions(+), 4 deletions(-)

diff --git a/documentation/source/factories.md b/documentation/source/factories.md
index ec87206d28..f5e2c062cd 100644
--- a/documentation/source/factories.md
+++ b/documentation/source/factories.md
@@ -56,15 +56,14 @@ This configuration alone is not very useful, as we need instances of the classes
 So we would like to somehow instantiate these classes `SegColorJitter`, `SegRandomFlip` and `SegRandomRescale`.
 
 Factories in SuperGradients come into play here! All these objects were registered beforehand in SuperGradients,
-so that when you write these names in the recipe, SuperGradients will detect and instantiate them.
+so that when you write these names in the recipe, SuperGradients will detect and instantiate them for you.
 
 ## Registering a Class
 
 As explained above, only registered objects can be instantiated.
-This registration consists of mapping the object to the corresponding type.
+This registration consists of mapping the object name to the corresponding class type.
 
-In the example above, the string `"SegColorJitter"` was mapped to the class `SegColorJitter`. and
-this is how SuperGradients knows how to convert the string defined in the recipe, into an object.
+In the example above, the string `"SegColorJitter"` was mapped to the class `SegColorJitter`, and this is how SuperGradients knows how to convert the string defined in the recipe into an object.
 
 You can register the class using a name different from the actual class name.
 However, it's generally recommended to use the same name for consistency and clarity.
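To make the naming recommendation above concrete, here is a small hypothetical sketch — our own illustration, not SG code — showing that lookups use the registered name, not the class name:

```python
# Toy registry standing in for SuperGradients' internal transform registry.
registry = {}

def register_transform(name):
    def decorator(cls):
        registry[name] = cls
        return cls
    return decorator

@register_transform(name="MyTransformName")
class MyTransform:
    def __init__(self, prob: float):
        self.prob = prob

# Recipes must use the registered name; the class name itself was never registered.
print("MyTransformName" in registry)  # True
print("MyTransform" in registry)      # False
```

Registering a class under its own name avoids exactly this mismatch between what the code says and what the recipe must say.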
From 74d562ae5b554e5d1403880f2b2cee607383210f Mon Sep 17 00:00:00 2001 From: Louis Dupont Date: Mon, 21 Aug 2023 10:10:45 +0300 Subject: [PATCH 4/7] utilizing using --- documentation/source/factories.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/documentation/source/factories.md b/documentation/source/factories.md index f5e2c062cd..3a89c1050f 100644 --- a/documentation/source/factories.md +++ b/documentation/source/factories.md @@ -142,7 +142,7 @@ factory = TransformsFactory() my_transform = factory.get({'MyTransformName': {'prob': 0.7}}) ``` You may recognize that the input passed to `factory.get` is actually the dictionary that we get after loading the recipe -(See [Utilizing Existing Factories](#utilizing-existing-factories)) +(See [Using Existing Factories](#using-existing-factories)) ### Recommended Factories become even more powerful when used with the `@resolve_param` decorator. From aaeae5297e2a8f7273775694e2ff48f6884296d5 Mon Sep 17 00:00:00 2001 From: Louis Dupont Date: Mon, 21 Aug 2023 10:13:57 +0300 Subject: [PATCH 5/7] fix typo --- documentation/source/configuration_files.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-) diff --git a/documentation/source/configuration_files.md b/documentation/source/configuration_files.md index 2155b1b7b4..5c8f3501b7 100644 --- a/documentation/source/configuration_files.md +++ b/documentation/source/configuration_files.md @@ -182,9 +182,9 @@ last_of_list: "${last: ${my_list}}" You can register any additional resolver you want by simply following the official [documentation](https://omegaconf.readthedocs.io/en/latest/usage.html#resolvers). ## Factories -Factories a similar to resolvers but were built specifically to instantiate SuperGradients objects within a recipe. +Factories are similar to resolvers but were built specifically to instantiate SuperGradients objects within a recipe. 
This is a key feature of SuperGradients, used in all of our recipes, and we recommend you to
-go over the [documentation](factories.md).
+go over this [introduction to Factories](factories.md).

## Required Hyper-Parameters
Most parameters can be defined by default when including `default_train_params` in you `defaults`.

From 1614acb085971f8789139c2971a2e33cabb801a2 Mon Sep 17 00:00:00 2001
From: Louis Dupont
Date: Mon, 21 Aug 2023 12:17:16 +0300
Subject: [PATCH 6/7] fix

---
 documentation/source/configuration_files.md | 10 ++++------
 1 file changed, 4 insertions(+), 6 deletions(-)

diff --git a/documentation/source/configuration_files.md b/documentation/source/configuration_files.md
index 5c8f3501b7..c85fe06f09 100644
--- a/documentation/source/configuration_files.md
+++ b/documentation/source/configuration_files.md
@@ -53,8 +53,8 @@ train_dataset_params:
     - RandomHorizontalFlip
     - ToTensor
     - Normalize:
-        mean: ${dataset_params.img_mean}
-        std: ${dataset_params.img_std}
+        mean: [0.485, 0.456, 0.406] # mean for normalization
+        std: [0.229, 0.224, 0.225] # std for normalization
 
 val_dataset_params:
   root: /data/Imagenet/val
@@ -65,8 +65,8 @@
         size: 224
     - ToTensor
     - Normalize:
-        mean: ${dataset_params.img_mean}
-        std: ${dataset_params.img_std}
+        mean: [0.485, 0.456, 0.406] # mean for normalization
+        std: [0.229, 0.224, 0.225] # std for normalization
 ```
 
 Configuration file can also help you track the exact settings used for each one of your experiments, tweak and tune these settings, and share them with others.
@@ -107,7 +107,6 @@
 python -m super_gradients.evaluate_from_recipe --config-name=cifar10_resnet
 
 that will run only the evaluation part of the recipe (without any training iterations)
-
 ## Hydra
 Hydra is an open-source Python framework that provides us with many useful functionalities for YAML management. You can learn about Hydra [here](https://hydra.cc/docs/intro).
We use Hydra to load YAML files and convert them into dictionaries, while @@ -130,7 +129,6 @@ in the first arg of the command line. In the experiment directory a `.hydra` subdirectory will be created. The configuration files related to this run will be saved by hydra to that subdirectory. --------- Two Hydra features worth mentioning are _YAML Composition_ and _Command-Line Overrides_. #### YAML Composition From 6b8e8ee8aee63068e1aee19cef7b314c06a44373 Mon Sep 17 00:00:00 2001 From: Louis Dupont Date: Tue, 22 Aug 2023 13:21:56 +0300 Subject: [PATCH 7/7] fix type, add ',' and explain diff between register and resolve_params --- documentation/source/factories.md | 48 +++++++++++++++++-------------- 1 file changed, 26 insertions(+), 22 deletions(-) diff --git a/documentation/source/factories.md b/documentation/source/factories.md index 3a89c1050f..f7d1036330 100644 --- a/documentation/source/factories.md +++ b/documentation/source/factories.md @@ -104,7 +104,7 @@ Doing so will trigger the registration function, allowing SuperGradients to reco Here is an example (adapted from the [train_from_recipe script](https://github.com/Deci-AI/super-gradients/blob/master/src/super_gradients/train_from_recipe.py)). ```python -from .my_module import MyTransform # Importing the module is enough as it will trigger the register_model function +from .my_module import MyTransform # Importing the module is enough as it will trigger the register_transform function # The code below is the same as the basic `train_from_recipe.py` script # See: https://github.com/Deci-AI/super-gradients/blob/master/src/super_gradients/train_from_recipe.py @@ -152,7 +152,7 @@ It means you can pass either the actual python object or a dictionary that descr ```python class ImageNetDataset(torch_datasets.ImageFolder): - @resolve_param("transforms", factory=TransformsFactory()) + @resolve_param("transform", factory=TransformsFactory()) def __init__(self, root: str, transform: Transform): ... 
```
@@ -172,6 +172,10 @@ ImageNetDataset(root=..., transform=my_transform)
 ```
 
 This second way of instantiating the dataset combines perfectly with the concept of `.yaml` recipes.
 
+**Difference between `register_transform` and `@resolve_param`**
+- `register_transform` is responsible for mapping a string to a class type.
+- `@resolve_param("transform", factory=TransformsFactory())` is responsible for converting a config into an object, using the mapping created by `register_transform`.
+
 ## Supported Factory Types
 So far, we have focused on a single type of factory, `TransformsFactory`,
 associated with the registration decorator `register_transform`.
@@ -183,25 +187,25 @@ SuperGradients supports a wide range of factories used throughout the training
 
 ```python
 from super_gradients.common.registry import (
-    register_model
-    register_kd_model
-    register_detection_module
-    register_metric
-    register_loss
-    register_dataloader
-    register_callback
-    register_transform
-    register_dataset
-    register_pre_launch_callback
-    register_unet_backbone_stage
-    register_unet_up_block
-    register_target_generator
-    register_lr_scheduler
-    register_lr_warmup
-    register_sg_logger
-    register_collate_function
-    register_sampler
-    register_optimizer
-    register_processing
+    register_model,
+    register_kd_model,
+    register_detection_module,
+    register_metric,
+    register_loss,
+    register_dataloader,
+    register_callback,
+    register_transform,
+    register_dataset,
+    register_pre_launch_callback,
+    register_unet_backbone_stage,
+    register_unet_up_block,
+    register_target_generator,
+    register_lr_scheduler,
+    register_lr_warmup,
+    register_sg_logger,
+    register_collate_function,
+    register_sampler,
+    register_optimizer,
+    register_processing,
 )
 ```
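As a closing illustration of the "Under the Hood" section in this series, here is a self-contained sketch of what a `@resolve_param`-style decorator does. This is a simplified stand-in under our own assumptions (names like `SimpleFactory` and `Flip` are invented for the example), not the actual SuperGradients decorator:

```python
import functools
import inspect

REGISTRY = {}  # name -> class; filled by register_* decorators in the real library

class SimpleFactory:
    """Toy factory: turn {'Name': {...kwargs}} into an instance; pass real objects through."""
    def get(self, conf):
        if isinstance(conf, dict):
            (name, params), = conf.items()
            return REGISTRY[name](**(params or {}))
        return conf  # already an instantiated object

def resolve_param(param_name, factory):
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            # Bind the call so the target argument can be replaced by name,
            # whether it was passed positionally or as a keyword.
            bound = inspect.signature(func).bind(*args, **kwargs)
            if param_name in bound.arguments:
                bound.arguments[param_name] = factory.get(bound.arguments[param_name])
            return func(*bound.args, **bound.kwargs)
        return wrapper
    return decorator

class Flip:
    def __init__(self, prob):
        self.prob = prob

REGISTRY["Flip"] = Flip

class Dataset:
    @resolve_param("transform", factory=SimpleFactory())
    def __init__(self, root, transform):
        self.transform = transform

# A config dict from a recipe and an actual instance both work:
ds = Dataset(root="/data", transform={"Flip": {"prob": 0.5}})
print(type(ds.transform).__name__, ds.transform.prob)  # Flip 0.5
```

The key design point mirrored here is that the decorated `__init__` always receives a ready object, so the dataset code never needs to know whether it was configured from YAML or from Python.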