
DynUNet does not apply Dropout #4844

Closed

KreitnerL opened this issue Aug 5, 2022 · 2 comments

Comments

@KreitnerL

Describe the bug
Even though DynUNet accepts a dropout argument, the dropout layers are never created.

To Reproduce

import torch
from monai.networks.nets import DynUNet

device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

model = DynUNet(
    spatial_dims=2,
    in_channels=1,
    out_channels=1,
    kernel_size=(3, 3, 3),
    strides=(1, 2, 1),
    upsample_kernel_size=(1, 2, 1),
    dropout=0.2,
).to(device)
print(model)
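
A quick way to confirm the bug (a check added here for illustration, not part of the original report): count the dropout modules in the instantiated network. On the affected versions this prints 0.

import torch.nn as nn

# Count all dropout modules anywhere in the model.
n_dropout = sum(
    isinstance(m, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)) for m in model.modules()
)
print(n_dropout)  # prints 0 on affected versions, despite dropout=0.2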

Expected behavior
The dropout layers should be added after each norm layer; see UnetBasicBlock.

E.g. something like this:

from typing import Optional, Sequence, Tuple, Union

import torch.nn as nn

from monai.networks.blocks.dynunet_block import get_conv_layer
from monai.networks.layers.utils import get_act_layer, get_norm_layer


class UnetBasicBlock(nn.Module):
    """
    A CNN module that can be used for DynUNet, based on:
    `Automated Design of Deep Learning Methods for Biomedical Image Segmentation <https://arxiv.org/abs/1904.08128>`_.
    `nnU-Net: Self-adapting Framework for U-Net-Based Medical Image Segmentation <https://arxiv.org/abs/1809.10486>`_.

    Args:
        spatial_dims: number of spatial dimensions.
        in_channels: number of input channels.
        out_channels: number of output channels.
        kernel_size: convolution kernel size.
        stride: convolution stride.
        norm_name: feature normalization type and arguments.
        act_name: activation layer type and arguments.
        dropout: dropout probability.

    """

    def __init__(
        self,
        spatial_dims: int,
        in_channels: int,
        out_channels: int,
        kernel_size: Union[Sequence[int], int],
        stride: Union[Sequence[int], int],
        norm_name: Union[Tuple, str],
        act_name: Union[Tuple, str] = ("leakyrelu", {"inplace": True, "negative_slope": 0.01}),
        dropout: Optional[Union[Tuple, str, float]] = None,
    ):
        super().__init__()
        self.conv1 = get_conv_layer(
            spatial_dims,
            in_channels,
            out_channels,
            kernel_size=kernel_size,
            stride=stride,
            dropout=dropout,
            conv_only=True,
        )
        self.conv2 = get_conv_layer(
            spatial_dims, out_channels, out_channels, kernel_size=kernel_size, stride=1, dropout=dropout, conv_only=True
        )
        self.lrelu = get_act_layer(name=act_name)
        self.norm1 = get_norm_layer(name=norm_name, spatial_dims=spatial_dims, channels=out_channels)
        self.norm2 = get_norm_layer(name=norm_name, spatial_dims=spatial_dims, channels=out_channels)

        # CREATE DROPOUT LAYERS
        # (nn.Dropout2d assumes spatial_dims == 2; use nn.Dropout3d for 3D inputs)
        self.dropout1 = nn.Dropout2d(dropout if dropout is not None else 0)
        self.dropout2 = nn.Dropout2d(dropout if dropout is not None else 0)
    def forward(self, inp):
        out = self.conv1(inp)
        out = self.norm1(out)
        # USE HERE
        out = self.dropout1(out)
        out = self.lrelu(out)
        out = self.conv2(out)
        out = self.norm2(out)
        # AND HERE
        out = self.dropout2(out)
        out = self.lrelu(out)
        return out
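
One caveat with the sketch above: nn.Dropout2d only matches spatial_dims == 2. A dimension-aware variant (a hypothetical helper, not from MONAI or this issue) could select the layer type from spatial_dims:

from typing import Optional

import torch.nn as nn


def get_spatial_dropout(spatial_dims: int, dropout: Optional[float] = None) -> nn.Module:
    # Map the number of spatial dimensions to the matching dropout layer.
    dropout_types = {1: nn.Dropout, 2: nn.Dropout2d, 3: nn.Dropout3d}
    return dropout_types[spatial_dims](dropout if dropout is not None else 0.0)
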
@yiheng-wang-nv
Contributor

Hi @Linus4world, which MONAI version did you use? I think this issue has already been fixed; see also: #4590
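
For reference, a minimal sketch to verify the fix (an addition for illustration, assuming a MONAI release that includes the change referenced in #4590): the same constructor call should now produce dropout modules.

import torch.nn as nn
from monai.networks.nets import DynUNet

model = DynUNet(
    spatial_dims=2,
    in_channels=1,
    out_channels=1,
    kernel_size=(3, 3, 3),
    strides=(1, 2, 1),
    upsample_kernel_size=(1, 2, 1),
    dropout=0.2,
)
# Check whether any dropout module is present in the network.
has_dropout = any(
    isinstance(m, (nn.Dropout, nn.Dropout2d, nn.Dropout3d)) for m in model.modules()
)
print(has_dropout)  # expected: True on a release containing the fix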

@KreitnerL
Author

Ah, I hadn't seen the new version yet, thanks! Consider this fixed then ^^
