2D FCMAE #71
Merged
Commits (80)
c6692f1 refactor data loading into its own module (ziw-liu)
3d8e7e2 update type annotations (ziw-liu)
fdcbf55 move the logging module out (ziw-liu)
a291381 move old logging into utils (ziw-liu)
3cf8fa2 rename tests to match module name (ziw-liu)
d4cd41d bump torch (ziw-liu)
e87d396 draft fcmae encoder (ziw-liu)
dccce5f add stem to the encoder (ziw-liu)
5508731 wip: masked stem layernorm (ziw-liu)
3eec48e wip: patchify masked features for linear (ziw-liu)
8c54feb use mlp from timm (ziw-liu)
83ecf4a hack: POC training script for FCMAE (ziw-liu)
2fffc99 fix mask for fitting (ziw-liu)
2a598b2 remove training script (ziw-liu)
b9b1880 default architecture (ziw-liu)
fd7700d fine-tuning options (ziw-liu)
054249f fix cli for finetuning (ziw-liu)
d867e10 draft combined data module (ziw-liu)
b06a300 fix import (ziw-liu)
39eafab manual validation loss reduction (ziw-liu)
9fbf7a5 update linting (ziw-liu)
e00f5f3 update development guide (ziw-liu)
9e345b6 update type hints (ziw-liu)
96deca5 bump iohub (ziw-liu)
e06aa57 draft ctmc v1 dataset (ziw-liu)
ea8b300 Merge branch 'main' into fcmae (ziw-liu)
72de113 update tests (ziw-liu)
13d0aa0 move test_data (ziw-liu)
78aed97 remove path conversion (ziw-liu)
74e7db3 configurable normalizations (#68) (edyoshikun)
9b3b032 fix ctmc dataloading (ziw-liu)
a356936 add example ctmc v1 loading script (ziw-liu)
bac26be changing the normalization and augmentations default from None to emp… (edyoshikun)
0b598c7 invert intensity transform (ziw-liu)
ddb30e9 concatenated data module (ziw-liu)
9504755 subsample videos (ziw-liu)
808e39c livecell dataset (ziw-liu)
43d641d all sample fields are optional (ziw-liu)
42f81cf fix multi-dataloader validation (ziw-liu)
4546fc7 lint (ziw-liu)
306f3ef fixing preprocessing for varying array shapes (i.e aics dataset) (edyoshikun)
1a0e3ce update loading scripts (ziw-liu)
d3ec94d fix CombineMode (ziw-liu)
02e6d0b always use untrainable head for FCMAE (ziw-liu)
e18d305 move log values to GPU before syncing (ziw-liu)
01c71cf custom head (ziw-liu)
dd64b31 ddp caching fixes (ziw-liu)
b3ea8d7 fix caching when using combined loader (ziw-liu)
d3db2bb compose normalizations for predict and test stages (ziw-liu)
d5a3fd6 Merge branch 'fcmae' into 2d-fcmae (ziw-liu)
a549d4e black (ziw-liu)
d74e731 Merge branch 'fcmae' into 2d-fcmae (ziw-liu)
a38da8b fix normalization in example config (ziw-liu)
af317c4 fix normalization in example config (ziw-liu)
96aac51 prefetch more in validation (ziw-liu)
d9a471d fix collate when multi-sample transform is not used (ziw-liu)
669ee83 ddp caching fixes (ziw-liu)
b2e23b8 fix caching when using combined loader (ziw-liu)
acdf362 Merge branch 'fcmae' into 2d-fcmae (ziw-liu)
8132b68 typing fixes (ziw-liu)
4c7a484 fix test dataset (ziw-liu)
7cfe403 fix invert transform (ziw-liu)
0b22f1a add ddp prepare flag for combined data module (ziw-liu)
ed01065 remove redundant operations (ziw-liu)
c12fbf7 filter empty detections (ziw-liu)
f226801 pass trainer to underlying data modules in concatenated (ziw-liu)
073acf4 hack: add test dataloader for LiveCell dataset (ziw-liu)
2771fdb test datasets for livecell and ctmc (ziw-liu)
1732974 Merge branch 'main' into 2d-fcmae (ziw-liu)
178df34 fix merge error (ziw-liu)
77149e0 fix merge error (ziw-liu)
3b1ff5c fix mAP default for over 100 detections (ziw-liu)
31522ae bump torchmetric (ziw-liu)
bf1b9d3 fix combined loader training for virtual staining task (ziw-liu)
d2a63c1 fix non-combined data loader training (ziw-liu)
bd29616 add fcmae to graph script (ziw-liu)
b98c34c fix type hint (ziw-liu)
464ae0c Merge branch 'main' into 2d-fcmae (ziw-liu)
8052189 format (ziw-liu)
bbf22fb add back convolutiuon option for fcmae head (ziw-liu)
Review comment: Since this is the end of validation, there is no backpropagation, so maybe there is no need to detach the tensor before logging here. Is that the common practice? I know detaching is more relevant and important in train_step and validation_step. If we don't detach here, does it affect anything? Just curious.
Reply: To log the loss value (through the Lightning logger), detaching is automatic. We only need to care about it when logging manually (e.g. images).
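To make the distinction concrete, here is a minimal sketch (not the code from this PR) of a PyTorch Lightning `validation_step`. The module name, batch keys (`source`/`target`), and the TensorBoard logger are assumptions for illustration: values passed to `self.log` are detached by Lightning internally, while tensors logged manually (such as images) should be detached and moved to CPU explicitly.

```python
# Hypothetical sketch, assuming Lightning 2.x with a TensorBoard logger.
import torch.nn.functional as F
from lightning.pytorch import LightningModule


class ExampleModule(LightningModule):
    """Illustrative module; batch keys and the forward call are assumptions."""

    def validation_step(self, batch, batch_idx):
        pred = self(batch["source"])
        loss = F.mse_loss(pred, batch["target"])
        # Scalar logging: Lightning detaches the logged value internally,
        # so passing the attached loss tensor is fine here.
        self.log("loss/validate", loss, sync_dist=True)
        # Manual logging (e.g. an image to TensorBoard): detach and move to
        # CPU ourselves before handing the tensor to the experiment writer.
        if batch_idx == 0:
            self.logger.experiment.add_image(
                "val/sample_prediction",
                pred[0, 0].detach().cpu(),
                global_step=self.current_epoch,
                dataformats="HW",
            )
        return loss
```

Detaching the manually logged image keeps the logger from holding onto the validation graph; the scalar case needs no such care because `self.log` already handles it.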