Apply data augmentation on MVTec #542
Anomalib models expect their inputs to be of shape [Batch x Channel x Height x Width]. It seems that one of your transforms has changed the order of the dimensions of the input tensor. Would you mind sharing your transform.json so we could have a closer look?
These are all the contents of the file. I applied only RGBShift, nothing else.
I see. When using a custom transform configuration, the transforms specified in the file completely replace Anomalib's default transforms. Looking at it now, I feel our design regarding custom transform configurations is probably not very intuitive and should be improved, or at least better documented. Ideally, Anomalib should automatically append mandatory transforms such as Normalize and ToTensorV2 when they are missing from a custom configuration. In your case, your problem would be solved by adding both the Normalize and ToTensorV2 transforms to your transform.json.
It works, thanks for the help!
I tried to apply transforms to the MVTec dataset by adding a transform_config entry to the main config.yaml file.
But I got an error:
RuntimeError: Given groups=1, weight of size [64, 3, 7, 7], expected input[32, 224, 224, 3] to have 3 channels, but got 224 channels instead
I'm not sure if I did it right. I created a transform pipeline using albumentations' serialization API, saved it as 'transform.json', and put the file into the anomalib folder.
Then I changed the 'null' value of transform_config in config.yaml to 'transform.json'.
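For reference, a sketch of what that edit might look like in the dataset section of config.yaml. This assumes the anomalib 0.x config layout; the exact key names and nesting vary between versions:

```yaml
dataset:
  # ... other dataset options ...
  transform_config: ./transform.json  # was: null
```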
I've tried PatchCore and CFlow; neither worked.
I assume that the channel layout of the images changed, but I'm not sure why this happened or how to fix it.
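The error message can be reproduced conceptually with plain numpy: PyTorch convolutions read dimension 1 of a batched input as the channel axis, and an albumentations pipeline without ToTensorV2 returns images in HWC layout, so the height (224) ends up in the channel slot. A minimal sketch:

```python
import numpy as np

# An RGB image as albumentations returns it without ToTensorV2: HWC layout.
hwc = np.zeros((224, 224, 3), dtype=np.float32)

# Batched, this has shape (1, 224, 224, 3); a Conv2d layer reads dim 1 as
# channels, so it sees 224 "channels" -- exactly the error in this issue.
batch_hwc = hwc[None, ...]
assert batch_hwc.shape[1] == 224

# ToTensorV2 effectively performs this transpose to CHW layout:
chw = np.transpose(hwc, (2, 0, 1))  # shape (3, 224, 224)
batch_chw = chw[None, ...]
assert batch_chw.shape[1] == 3  # the conv layer now sees 3 channels
```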