More variant architectures of se_densenet to try and test #1
@John1231983 Hi, I found the redundant code you pointed out; I think it's my mistake. I have written some scripts today to test se_densenet on the CIFAR-10 dataset, and I will check whether removing the redundant code affects the train/test results. Let's wait a few days for the results; I will update the README within a week.
Good. Also consider removing the seblock after the transition block. We usually use a seblock after the dense block only; the transition block only serves to reduce the feature size.
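A minimal sketch of the placement suggested above, assuming a PyTorch implementation. The `SEBlock` module, the `interleave_se` helper, and all argument names are illustrative, not this repository's actual code:

```python
import torch.nn as nn

class SEBlock(nn.Module):
    """Squeeze-and-Excitation: squeeze with global average pooling, excite with a small MLP."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        b, c, _, _ = x.shape
        scale = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * scale  # channel-wise recalibration, shape unchanged

def interleave_se(dense_blocks, transitions, block_out_channels):
    """Place an SEBlock after each dense block only; transitions are left
    untouched, since they merely downsample and halve the channels."""
    layers = []
    for i, block in enumerate(dense_blocks):
        layers.append(block)
        layers.append(SEBlock(block_out_channels[i]))  # SE after the dense block
        if i < len(transitions):
            layers.append(transitions[i])              # no SE after the transition
    return nn.Sequential(*layers)
```

Because `SEBlock` preserves the tensor shape, it can be dropped between any two stages without touching the rest of the network, which is what makes these placement experiments cheap to run.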
@John1231983 Yes, thank you for your suggestions. I will take them and run more comparative experiments.
@John1231983 Hi John, I just updated my test results, please check them. Thank you very much.
Good job! But the results show that performance with and without the seblock is similar. In the full code, you have added the seblock in both the _Transition layer and the _DenseLayer. How about adding it in the loop that assembles the blocks and removing it from _Transition and _DenseLayer? I mean add it in the loop and remove it from the layers, as sketched below. Thanks
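If the suggestion is read that way, the refactor would look roughly like this. The sketch follows torchvision's DenseNet assembly loop and its private `_DenseBlock`/`_Transition` classes; the exact hook points in this repository may differ, and `SEBlock` is the module sketched earlier:

```python
import torch.nn as nn
from torchvision.models.densenet import _DenseBlock, _Transition  # private classes, but importable

def build_features(block_config=(6, 12, 24, 16), growth_rate=32,
                   num_init_features=64, bn_size=4, drop_rate=0.0):
    features = nn.Sequential()
    num_features = num_init_features
    for i, num_layers in enumerate(block_config):
        block = _DenseBlock(num_layers=num_layers,
                            num_input_features=num_features,
                            bn_size=bn_size,
                            growth_rate=growth_rate,
                            drop_rate=drop_rate)
        features.add_module('denseblock%d' % (i + 1), block)
        num_features += num_layers * growth_rate
        # SE applied once per block, here in the assembly loop, instead of
        # inside _DenseLayer / _Transition:
        features.add_module('seblock%d' % (i + 1), SEBlock(num_features))
        if i != len(block_config) - 1:
            # plain _Transition, with no SE inside it
            trans = _Transition(num_input_features=num_features,
                                num_output_features=num_features // 2)
            features.add_module('transition%d' % (i + 1), trans)
            num_features //= 2
    return features
```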
@John1231983 It's worth testing. I will post new results after the job is done; please keep watching. Thanks.
The new test results have been updated.
Good, that is what I expected. You can also try a few other placements.
Honestly, we do not know where the seblock is best placed in the DenseNet architecture. In my opinion, the first case may give the better result.
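The snippets this comment originally pointed at were not preserved in the thread. Purely as an illustration of the kind of variants one might benchmark, two more placements (hypothetical helper names, reusing the `SEBlock` sketched earlier):

```python
import torch.nn as nn

def se_before_each_block(dense_blocks, block_in_channels):
    """Variant A: recalibrate each dense block's input instead of its output."""
    layers = []
    for block, c_in in zip(dense_blocks, block_in_channels):
        layers += [SEBlock(c_in), block]
    return nn.Sequential(*layers)

def se_after_last_block(dense_blocks, final_channels):
    """Variant B: a single SE just before the classifier head."""
    return nn.Sequential(*dense_blocks, SEBlock(final_channels))
```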
Hi, thanks for sharing your experiment results. I checked the code and found that you may have some redundant code in _DenseLayer that adds the seblock around the convolutions: you added it both in the (for) loop and after the first convolution. Why do you add the seblock in _DenseLayer again? Thanks
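A guess at what the doubled-up placement may have looked like, modeled on torchvision's bottleneck `_DenseLayer` (a 1x1 convolution followed by a 3x3 convolution). This is a hypothetical reconstruction, not the repository's code, and `SEBlock` is the module sketched earlier in the thread:

```python
import torch
import torch.nn as nn

class _DenseLayerDoubleSE(nn.Module):
    """Dense layer with the seblock applied twice, as described in the issue."""
    def __init__(self, num_input_features, growth_rate, bn_size=4):
        super().__init__()
        inter = bn_size * growth_rate
        self.norm1 = nn.BatchNorm2d(num_input_features)
        self.conv1 = nn.Conv2d(num_input_features, inter, kernel_size=1, bias=False)
        self.se1 = SEBlock(inter)        # first SE: right after the 1x1 convolution
        self.norm2 = nn.BatchNorm2d(inter)
        self.conv2 = nn.Conv2d(inter, growth_rate, kernel_size=3, padding=1, bias=False)
        self.se2 = SEBlock(growth_rate)  # second SE: repeated for every layer the loop builds
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        out = self.se1(self.conv1(self.relu(self.norm1(x))))
        out = self.se2(self.conv2(self.relu(self.norm2(out))))
        return torch.cat([x, out], 1)    # dense connectivity: concat input with new features
```

Either of the two applications can be removed without changing any tensor shape, so the cleanup is safe to test in isolation.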