
Support Enum Modules #726

Closed
Gadersd opened this issue Aug 29, 2023 · 7 comments
Assignees
laggui
Labels
feature The feature request

Comments

@Gadersd
Contributor

Gadersd commented Aug 29, 2023

Feature description

Support deriving Module for enums to enable code such as

#[derive(Module, Debug, Clone)]
enum Layer<B: Backend> {
    Slow(LayerBig<B>), 
    Fast(LayerSmall<B>), 
}

Feature motivation

This feature would make it easy to switch between layer variants and enable storing a vector of layers of different types.
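For illustration, a minimal sketch of how the derived enum could be used, assuming LayerBig and LayerSmall are existing modules with matching forward signatures (the tensor rank is a placeholder):

use burn::tensor::{backend::Backend, Tensor};

impl<B: Backend> Layer<B> {
    // Dispatch the forward pass to whichever variant this value holds.
    fn forward(&self, input: Tensor<B, 3>) -> Tensor<B, 3> {
        match self {
            Layer::Slow(layer) => layer.forward(input),
            Layer::Fast(layer) => layer.forward(input),
        }
    }
}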

@nathanielsimard
Member

Yesss !

@antimora
Collaborator

antimora commented Sep 8, 2023

Related ticket: #583. Someone shared how to use enum for sequential forward.
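The enum-for-sequential-forward pattern is presumably something along these lines (a rough sketch, assuming a Layer enum like the one above whose forward dispatches per variant, and that Module can be derived for it):

use burn::{
    module::Module,
    tensor::{backend::Backend, Tensor},
};

// A Sequential-like container holding heterogeneous layers behind one enum.
#[derive(Module, Debug)]
struct Sequential<B: Backend> {
    layers: Vec<Layer<B>>,
}

impl<B: Backend> Sequential<B> {
    fn forward(&self, input: Tensor<B, 3>) -> Tensor<B, 3> {
        // Thread the tensor through each layer in order.
        self.layers.iter().fold(input, |x, layer| layer.forward(x))
    }
}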

@antimora
Collaborator

Someone filed a ticket regarding this: #983

@finnkauski

finnkauski commented Feb 11, 2024

Hey folks,

My case is that I have a PyTorch model that uses Sequential. Without enum support in the Module derive, supporting a Vec<CustomEnum> would mean implementing Module by hand: implementing Module for the enum itself, then Record for the module's record struct, and Item for an item struct, if I understand correctly. I had a look, and none of this seems trivial.

It seems that loading something like the following is not straightforward at the moment.

import torch 

class CustomLayer(torch.nn.Module):
    def __init__(self, channels, val):
        super().__init__()
        self.something = torch.ones(1, channels, 1) * val 

    def forward(self, x):
        return x * self.something
        

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.block = torch.nn.Sequential(
            *[torch.nn.Conv1d(1, 32, 7), torch.nn.Tanh(), torch.nn.ConvTranspose1d(32, 22, 8), CustomLayer(22, 100)]
        )
    def forward(self, x):
        return self.block(x)


model = Net()
res = model(torch.ones(1,1,400))

torch.save(model.state_dict(), "dummy.pt")

I haven't tried whether the PyTorchRecorder would actually allow this if I implemented Module for an enum manually, because I haven't yet grasped how to do that, as mentioned above. I appreciate that my point here is focused partly on the PyTorchRecorder's capabilities, but it seems to depend on Module derive for enums being implemented just to test it.

Are there any examples in the codebase I'm missing that can be used to guide an implementation of something like this from scratch?

@nathanielsimard
Member

@finnkauski, based on your example, I believe it would be optimal to encapsulate each module within a struct. The usage of Sequential in your example seems to focus on automatically generating a forward pass rather than facilitating the addition of a dynamic number of different modules in a list. If you intend to load weights from a PyTorch recorder, a straightforward approach would be to remap block.seq.0 to block.conv1d, and so forth.
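A rough sketch of that encapsulation (module paths follow my reading of burn::nn::conv and burn::module and may differ from the current API; CustomLayer mirrors the Python module above, and the broadcasted multiply should be verified):

use burn::{
    module::{Module, Param},
    nn::conv::{Conv1d, ConvTranspose1d},
    tensor::{activation, backend::Backend, Tensor},
};

// Mirrors the Python CustomLayer: a [1, channels, 1] scaling tensor.
#[derive(Module, Debug)]
struct CustomLayer<B: Backend> {
    something: Param<Tensor<B, 3>>,
}

// One named field per entry of the PyTorch Sequential (Tanh holds no weights).
#[derive(Module, Debug)]
struct Net<B: Backend> {
    conv1d: Conv1d<B>,
    conv_transpose1d: ConvTranspose1d<B>,
    custom: CustomLayer<B>,
}

impl<B: Backend> Net<B> {
    fn forward(&self, x: Tensor<B, 3>) -> Tensor<B, 3> {
        let x = self.conv1d.forward(x);
        let x = activation::tanh(x);
        let x = self.conv_transpose1d.forward(x);
        // Broadcasted element-wise scaling, as in CustomLayer.forward.
        x * self.custom.something.val()
    }
}

The PyTorch state dict keys (block.0.*, block.2.*, block.3.something) would then need to be remapped onto these field names when importing the weights.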

@finnkauski

finnkauski commented Feb 12, 2024

> @finnkauski, based on your example, I believe it would be optimal to encapsulate each module within a struct. The usage of Sequential in your example seems to focus on automatically generating a forward pass rather than facilitating the addition of a dynamic number of different modules in a list. If you intend to load weights from a PyTorch recorder, a straightforward approach would be to remap block.seq.0 to block.conv1d, and so forth.

I think this is a toy example; in the real model there are dynamic factors, such as config values, that dictate the structure of the model and how many of certain layers it has.

@laggui laggui added this to Burn 🔥 Feb 16, 2024
@laggui laggui moved this to In Progress in Burn 🔥 Feb 16, 2024
@laggui laggui self-assigned this Feb 16, 2024
@laggui laggui mentioned this issue Feb 20, 2024
@laggui laggui moved this from In Progress to In Review in Burn 🔥 Feb 20, 2024
@laggui
Member

laggui commented Feb 22, 2024

Closed with #1337

If you're looking for named enum support, see #1343

@laggui laggui closed this as completed Feb 22, 2024
@github-project-automation github-project-automation bot moved this from In Review to Done in Burn 🔥 Feb 22, 2024