Global pooling layer #2316
Comments
It does.
Done in #1214. There is a global pooling switch in the
Thanks. I've tried using it and it seems really useful. However, would it be possible to add the L2-norm or std to the available pooling methods? The article in the original post has the L2-norm implemented. With mean, std/L2-norm, and max global pooling, the network should be able to extract enough information from the pooled data to generate good predictions, whereas with only mean and max pooling, important information about the variance is lost. This is particularly an issue when pooling over an unspecified, large amount of data, which is exactly where the global pooling layer could be most useful. Is it something I could implement fairly easily? I guess L2 could be created with only minor modifications to PoolingParameter_PoolMethod_AVE in pooling_layer.cpp.
Does Caffe have a global pooling layer such as the one mentioned in http://benanne.github.io/2014/08/05/spotify-cnns.html (search for "global temporal pooling layer")?
It would be very useful as Caffe expands to data types other than images, where position isn't relevant. From the article: "This layer pools across the entire time axis, effectively computing statistics of the learned features across time. I included three different pooling functions: the mean, the maximum and the L2-norm."
I guess that it isn't implemented. How difficult would it be for me to implement? Could you give me a few pointers on where to start?