
More variant architecture of se_densnet need to try and test #1

Open
John1231983 opened this issue Nov 8, 2018 · 8 comments
Labels
enhancement New feature or request good suggestion issue is helpful

Comments

@John1231983

Hi, thanks for sharing your experiment results. I checked and found that you may have some redundant code in `_DenseLayer` that adds the SE block after a convolution. You added it both in the loop (`for`) and after the first convolution. Why do you add the SE block in `_DenseLayer` again? Thanks
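For reference, the squeeze-and-excitation operation under discussion can be sketched as follows. This is a minimal NumPy illustration; `se_block` and its weight arguments are hypothetical names, not the repository's actual implementation:

```python
import numpy as np

def se_block(x, w1, w2):
    """Minimal squeeze-and-excitation: global pool, FC-ReLU-FC, channel gating."""
    s = x.mean(axis=(2, 3))               # squeeze: (N, C, H, W) -> (N, C)
    z = np.maximum(s @ w1, 0.0) @ w2      # excitation: FC -> ReLU -> FC
    gate = 1.0 / (1.0 + np.exp(-z))       # sigmoid -> per-channel weights in (0, 1)
    return x * gate[:, :, None, None]     # rescale the input channels

# Applying se_block twice in a row, as the redundant code does, simply
# multiplies the features by two gates in (0, 1) instead of one.
```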

@ygean
Owner

ygean commented Nov 9, 2018

@John1231983 Hi, I found the redundant code you pointed out; I think it's my mistake. To test se_densenet, I wrote some scripts today that test it on the CIFAR-10 dataset, and I will check whether removing the redundant code influences the train/test result. Let's wait a few days to see the result; I will update the README within a week.

@John1231983
Author

Good. And consider removing the SE block after the transition block as well. We often use the SE block after the dense block only; the transition block only helps to reduce the feature size.

@ygean
Owner

ygean commented Nov 10, 2018

@John1231983 Yes, thank you for your suggestions; I will take them and run more comparative experiments.

@ygean
Owner

ygean commented Nov 15, 2018

@John1231983 Hi, John. I updated my test result just now, please check it. Thank you very much.

@ygean ygean added the good suggestion issue is helpful label Nov 15, 2018
@John1231983
Author

Good job. But the result shows that with and without the SE block the performance is similar. In the full code, you have added the SE block in both the transition layer and the dense layer. How about adding them in the loop function and removing them from `_Transition` and `_DenseLayer`? I mean, add at
https://github.com/zhouyuangan/SE_DenseNet/blob/e5bb2fda0e49d50c274e29330c0948d3aa4a4376/se_densenet_full.py#L209

and
https://github.com/zhouyuangan/SE_DenseNet/blob/e5bb2fda0e49d50c274e29330c0948d3aa4a4376/se_densenet_full.py#L219

And remove it in
https://github.com/zhouyuangan/SE_DenseNet/blob/e5bb2fda0e49d50c274e29330c0948d3aa4a4376/se_densenet_full.py#L167

https://github.com/zhouyuangan/SE_DenseNet/blob/e5bb2fda0e49d50c274e29330c0948d3aa4a4376/se_densenet_full.py#L137

Thanks
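If it helps, the proposed relocation can be sketched as follows. `dense_blocks`, `transitions`, and `se_block` are stand-in callables for illustration, not the repository's actual modules:

```python
def forward_features(x, dense_blocks, transitions, se_block):
    """SE applied once per stage in the outer loop, with the per-layer
    copies inside _DenseLayer and _Transition removed."""
    for i, block in enumerate(dense_blocks):
        x = block(x)
        x = se_block(x)            # once after each dense block
        if i < len(transitions):
            x = transitions[i](x)
            x = se_block(x)        # once after each transition
    return x
```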

@ygean
Owner

ygean commented Nov 15, 2018

@John1231983 It's worth making a test; I will update the new result after the job is done, please keep watching. Thanks.

@ygean ygean changed the title Something wrong? More variant architecture of se_densnet need to try and test Nov 15, 2018
@ygean ygean added the enhancement New feature or request label Nov 15, 2018
@ygean
Owner

ygean commented Nov 17, 2018

The new test result has been updated.
I will release train and test code in a few days.

@John1231983
Author

Good. That is what I expected. You can also try the following:

  1. SE block in the loop only after the dense block (remove the SE block in the transition)
  2. SE block in the loop only after the transition block (remove the SE block in the dense block)

Actually, we do not know where the SE block helps the DenseNet architecture most. In my opinion, the first case may get the better result.
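The two suggested ablations could be sketched with a pair of flags. Again, `dense_blocks`, `transitions`, and `se_block` are stand-in callables for illustration, not the repository's code:

```python
def forward_features_variant(x, dense_blocks, transitions, se_block,
                             se_after_dense=True, se_after_trans=False):
    """Variant 1: se_after_dense=True,  se_after_trans=False
       Variant 2: se_after_dense=False, se_after_trans=True"""
    for i, block in enumerate(dense_blocks):
        x = block(x)
        if se_after_dense:
            x = se_block(x)        # SE only after dense blocks
        if i < len(transitions):
            x = transitions[i](x)
            if se_after_trans:
                x = se_block(x)    # SE only after transitions
    return x
```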
