Remove duplicate addition of callbacks/make optional #384

Open
romeokienzler opened this issue Jan 27, 2025 · 2 comments
@romeokienzler (Collaborator)

In cli_tools.py:


    # Both checkpoint callbacks are always registered, with hard-coded
    # filename/monitor defaults and no way to opt out:
    parser.add_lightning_class_args(StateDictAwareModelCheckpoint, "ModelCheckpoint")
    parser.set_defaults({"ModelCheckpoint.filename": "{epoch}", "ModelCheckpoint.monitor": "val/loss"})

    parser.add_lightning_class_args(StateDictAwareModelCheckpoint, "StateDictModelCheckpoint")
    parser.set_defaults(
        {
            "StateDictModelCheckpoint.filename": "{epoch}_state_dict",
            "StateDictModelCheckpoint.save_weights_only": True,
            "StateDictModelCheckpoint.monitor": "val/loss",
        }
    )

We need to check whether it makes sense to add both callbacks at the same time, and we also need to skip (or remove) them if the config already provides them. One possible direction is sketched below.
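
Rough, untested sketch (not terratorch's current behaviour, and the dedup rule is illustrative): let the defaults be registered as today, then deduplicate checkpoint callbacks just before fitting, so that a user-supplied ModelCheckpoint wins over ours:

    from lightning.pytorch.cli import LightningCLI
    from lightning.pytorch.callbacks import ModelCheckpoint

    class DedupCheckpointCLI(LightningCLI):
        # LightningCLI calls before_<subcommand> hooks automatically.
        def before_fit(self):
            seen = set()
            kept = []
            # Walk the list in reverse so later-added instances win over
            # the defaults registered first.
            for cb in reversed(self.trainer.callbacks):
                if isinstance(cb, ModelCheckpoint):
                    key = (cb.monitor, cb.save_weights_only)
                    if key in seen:
                        continue  # duplicate checkpoint config, drop it
                    seen.add(key)
                kept.append(cb)
            self.trainer.callbacks = list(reversed(kept))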

@paolofraccaro (Collaborator) commented Feb 13, 2025

@romeokienzler We also have the issue here that these callbacks are added with no way of disabling them (as far as I can tell), and, more importantly, the 'mode' (min/max) is not something we can change from here. min (the default) is fine if we are monitoring a loss, but not if we are monitoring something else. We also assume that there will always be a "val/loss" metric to monitor. For object detection, for example, I am monitoring mAP, which has to go up.

    
    # e.g. for object detection, where mAP should be maximised:
    monitor = "val_map"
    mode = "max"

    parser.add_lightning_class_args(StateDictAwareModelCheckpoint, "ModelCheckpoint")
    parser.set_defaults({"ModelCheckpoint.filename": "{epoch}", "ModelCheckpoint.monitor": monitor, "ModelCheckpoint.mode": mode})

    parser.add_lightning_class_args(StateDictAwareModelCheckpoint, "StateDictModelCheckpoint")
    parser.set_defaults(
        {
            "StateDictModelCheckpoint.filename": "{epoch}_state_dict",
            "StateDictModelCheckpoint.save_weights_only": True,
            "StateDictModelCheckpoint.monitor": monitor,
            "StateDictModelCheckpoint.mode": mode,
        }
    )
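
One way to stop hard-coding these (sketch only; --checkpoint_monitor/--checkpoint_mode are made-up argument names, but parser.link_arguments is the standard jsonargparse mechanism) would be to expose monitor/mode once at the top level and link them into both callbacks:

    # Hypothetical top-level arguments; one config key then drives both callbacks.
    parser.add_argument("--checkpoint_monitor", type=str, default="val/loss")
    parser.add_argument("--checkpoint_mode", type=str, default="min")
    parser.link_arguments("checkpoint_monitor", "ModelCheckpoint.monitor")
    parser.link_arguments("checkpoint_mode", "ModelCheckpoint.mode")
    parser.link_arguments("checkpoint_monitor", "StateDictModelCheckpoint.monitor")
    parser.link_arguments("checkpoint_mode", "StateDictModelCheckpoint.mode")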

@blumenstiel (Collaborator)

@paolofraccaro @romeokienzler Currently you can overwrite them in the config at the highest level, not under Trainer. I am not sure if you can deactivate them (this is also quite annoying for Iterate).

    StateDictModelCheckpoint:
      filename: "..."
    ...

But I agree it is not intuitive and, as far as I can tell, not explained by terratorch. It might be better to define default callbacks in general (checkpoints, LR monitor, rich progress bar) and add them only if the user does not override them, e.g. roughly like the sketch below.
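
Sketch only (the merge rule and defaults are illustrative; LearningRateMonitor and RichProgressBar are the standard Lightning callbacks):

    from lightning.pytorch.callbacks import (
        LearningRateMonitor,
        ModelCheckpoint,
        RichProgressBar,
    )

    def default_callbacks():
        # Fresh instances on each call, since callbacks hold state.
        return [
            ModelCheckpoint(filename="{epoch}", monitor="val/loss"),
            LearningRateMonitor(logging_interval="epoch"),
            RichProgressBar(),
        ]

    def merge_callbacks(user_callbacks):
        # Add each default only if the user has not already supplied a
        # callback of the same type.
        user_types = {type(cb) for cb in user_callbacks}
        return list(user_callbacks) + [
            cb for cb in default_callbacks() if type(cb) not in user_types
        ]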
