
Training on CIFAR-10 performs poorly, in both L2 loss and FID #326

chengyiqiu1121 opened this issue Jun 17, 2024 · 0 comments

Hi developers,

Sorry to bother you! I am training on CIFAR-10 with train_num_steps=500k, batch size 16, and U-Net dim 64, but the resulting FID is 21.57, which is not good. When I increase the training steps to 1000k, the FID gets worse, rising to 23. With the same 500k steps, training on GTSRB reaches FID 7.9 and on CelebA ($64\times 64$) reaches FID 13, so the CIFAR-10 result is really confusing.

I am now training the model from scratch again on CIFAR-10 with U-Net dim 128, but the L2 loss is decreasing very slowly (plateauing around 0.05). I would appreciate it if you could help me!
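For reference, here is a sketch of the training configuration described above. This assumes the `denoising_diffusion_pytorch` package API (`Unet`, `GaussianDiffusion`, `Trainer`); the dataset folder path is a placeholder:

```python
# Sketch of the reported setup, assuming the denoising_diffusion_pytorch API.
# The image folder path is a placeholder, not the actual path used.
from denoising_diffusion_pytorch import Unet, GaussianDiffusion, Trainer

model = Unet(
    dim=64,                 # base channel width (128 in the second run)
    dim_mults=(1, 2, 4, 8),
)

diffusion = GaussianDiffusion(
    model,
    image_size=32,          # CIFAR-10 resolution
    timesteps=1000,
)

trainer = Trainer(
    diffusion,
    './cifar10_images',     # placeholder: folder of CIFAR-10 training images
    train_batch_size=16,
    train_num_steps=500_000,  # 1_000_000 in the longer run
    calculate_fid=True,       # report FID during training
)

trainer.train()
```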
