There seems to be an inconsistency in how different models handle backbone parameter freezing. Some models (like SwinTransformer) have a clear mechanism through the `frozen_stages` parameter and the `_freeze_stages()` method, while others (like FastSCNN) lack this functionality. This makes it difficult to apply a consistent transfer-learning approach across different architectures.
Current State

Some models implement freezing through:
- a `frozen_stages` config parameter
- a `_freeze_stages()` method
- built-in parameter freezing logic

Other models (like FastSCNN) don't have these mechanisms, making it unclear how to:
- freeze specific layers
- control freezing through configs
- implement transfer learning consistently
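For context, the `frozen_stages` / `_freeze_stages()` pattern mentioned above can be sketched roughly as follows. This is a minimal toy backbone, not the actual SwinTransformer code: the stage layout is invented, and here `frozen_stages=N` simply freezes the first N stages (real backbones differ in whether the index is inclusive and whether the patch embedding is also frozen).

```python
import torch.nn as nn

class ToyBackbone(nn.Module):
    """Simplified sketch of the frozen_stages pattern used by backbones
    such as SwinTransformer. Stage structure is illustrative only."""

    def __init__(self, frozen_stages=-1):
        super().__init__()
        self.stages = nn.ModuleList(
            nn.Conv2d(3 if i == 0 else 8, 8, 3, padding=1) for i in range(4)
        )
        self.frozen_stages = frozen_stages
        self._freeze_stages()

    def _freeze_stages(self):
        # Freeze the first `frozen_stages` stages: disable gradients and
        # switch the modules to eval() so normalization statistics stay fixed.
        for i in range(self.frozen_stages):
            stage = self.stages[i]
            stage.eval()
            for param in stage.parameters():
                param.requires_grad = False
```

Backbones that follow this pattern typically also re-apply `_freeze_stages()` in an overridden `train()` method, since calling `model.train()` would otherwise flip the frozen modules back into training mode.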
Question
What's the recommended way to freeze backbone parameters in FastSCNN? Should I manually set `requires_grad=False` on the parameters, or is there a config-based solution?
Example
Using FastSCNN as an example:
```python
# In SwinTransformer, we can do:
model = dict(
    backbone=dict(
        type='SwinTransformer',
        frozen_stages=2,
        ...
    )
)

# But in FastSCNN, there's no equivalent:
model = dict(
    backbone=dict(
        type='FastSCNN',
        # No frozen_stages parameter
        ...
    )
)
```
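In the meantime, one generic workaround is to freeze submodules by hand after the model is built. The helper below is only a sketch: it assumes an MMSegmentation-style model with a `backbone` attribute, and the FastSCNN submodule name in the usage comment is an assumption to be checked against the actual implementation.

```python
import torch.nn as nn

def freeze_backbone_modules(backbone: nn.Module, module_names):
    """Manually freeze named submodules of a backbone.

    Generic workaround sketch, not an official API: sets
    requires_grad=False on all parameters of each named submodule
    and switches it to eval() so BatchNorm running stats stay fixed.
    """
    for name in module_names:
        module = getattr(backbone, name)
        module.eval()
        for param in module.parameters():
            param.requires_grad = False

# Hypothetical usage (submodule name illustrative, check the source):
# freeze_backbone_modules(model.backbone, ['learning_to_downsample'])
```

Note that a later call to `model.train()` puts the frozen modules back into training mode (affecting BatchNorm statistics even with gradients disabled), so in a training loop this helper would need to be re-applied after each `train()` call.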