Sanity Check - Looking for a basic CIFAR10 hyperparameter set #331
Pinging, in particular, previous issue openers who reported FID scores/losses on CIFAR10 (@zzz313 @DavidXie03 @chengyiqiu1121). I'd be really grateful if you could take a look and see if there's something obviously wrong. Thank you!
Pinging @lucidrains as well: if there's something I obviously shouldn't be doing, you probably have the best shot at noticing it. Thanks!
hi, the
Here is the UNet config in my code. After training, using the DDIM sampler, the diffusion model gets FID 10.88:

```yaml
dataset_name: cifar10
lr: 2e-4
device: cuda:0
batch: 128
epoch: 700000
unet:
  dim: 128
  dim_mults: (1, 2, 2, 2)
  dropout: 0.1
```
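For anyone reusing this config, one gotcha worth noting: YAML has no tuple literal, so `dim_mults: (1, 2, 2, 2)` loads as a plain string, not a tuple, and needs converting before being passed to the `Unet` constructor. A minimal sketch (the `unet_cfg` dict stands in for the loaded config and is my own illustration, not code from the repo):

```python
import ast

# Stand-in for the `unet:` section after loading the YAML above;
# note dim_mults arrives as the string "(1, 2, 2, 2)".
unet_cfg = {"dim": 128, "dim_mults": "(1, 2, 2, 2)", "dropout": 0.1}

# ast.literal_eval safely turns the string back into a real tuple.
dim_mults = ast.literal_eval(unet_cfg["dim_mults"])
assert isinstance(dim_mults, tuple)
print(dim_mults)  # (1, 2, 2, 2)
```

An alternative is to write the value as a YAML list (`dim_mults: [1, 2, 2, 2]`) and call `tuple()` on it after loading.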
Thank you, I'll test it!
@samuelemarro I encountered the same problem as you, but I have found that this codebase's implementation differs from the official implementation in a few ways, such as the UNet structure (channel dims, multi-head vs. single-head attention) and the learning-rate warmup. I am following this repo to reproduce the results on CIFAR10. Hope it helps.
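On the warmup point: the official DDPM code warms the learning rate up linearly before holding it constant, which this repo does not do by default. A minimal sketch of such a schedule (the function name and the 5000-step warmup length are my own assumptions, modeled on the official CIFAR10 settings):

```python
def warmup_lr(step: int, base_lr: float = 2e-4, warmup_steps: int = 5000) -> float:
    # Linear warmup from 0 to base_lr over warmup_steps, then constant.
    # base_lr and warmup_steps here are assumptions, not values from this repo.
    return base_lr * min(step / warmup_steps, 1.0)

print(warmup_lr(2500))  # halfway through warmup: 0.5 * base_lr
```

With PyTorch this can be wired in via `torch.optim.lr_scheduler.LambdaLR(optimizer, lambda s: min(s / 5000, 1.0))`, which multiplies the optimizer's base LR by the same factor.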
I'm running the `denoising_diffusion_pytorch.py` script as-is on the CIFAR10 dataset; however, the FID quickly plateaus at ~90, which is a far cry from the scores reported in the DDIM/DDPM papers and even in other open issues (e.g. #326). Here are my hyperparameters:

No matter how I tune it, I can't seem to beat ~70. Am I going crazy? I feel like there's something obvious I'm missing, but I can't see what.