add adaptive pool #1239

Merged · 16 commits · Jun 30, 2020
Conversation

@dnabanita7 (Contributor) commented Jun 19, 2020

I have added AdaptiveMaxPool and AdaptiveMeanPool so that we have an implementation similar to PyTorch's adaptive pooling. cc @darsnack
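For context, a minimal sketch of the adaptive-pooling idea (illustrative, not the exact merged code): the layer stores the target output size and derives the kernel and stride from the input's spatial size at call time, using NNlib's PoolDims and maxpool.

```julia
using NNlib: PoolDims, maxpool

# Illustrative sketch: an adaptive max pool stores the desired output size
# and works out kernel and stride per input, rather than being given them.
struct AdaptiveMaxPool{N}
  out::NTuple{N, Int}
end

function (a::AdaptiveMaxPool)(x::AbstractArray)
  insize = size(x)[1:end-2]              # spatial dims of a WHCN-style input
  stride = insize .÷ a.out               # integer division: PoolDims wants Ints
  k = insize .- (a.out .- 1) .* stride   # kernel size that yields exactly a.out
  pdims = PoolDims(x, k; padding = 0, stride = stride)
  return maxpool(x, pdims)
end

# e.g. AdaptiveMaxPool((7, 7)) on a 32×32×C×N input acts like a max pool
# with k = (8, 8) and stride = (4, 4), giving a 7×7 spatial output.
```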

PR Checklist

  • Tests are added
  • Entry in NEWS.md
  • Documentation, if applicable
  • Final review from @MikeInnes or @dhairyagandhi96 (for API changes).

Flux issue: Flux#1224
MLH issue: 0.3.x-projects#26

@darsnack (Member) commented

@dhairyagandhi96 do you know why no CI is being performed for this PR?

@CarloLucibello (Member) commented Jun 19, 2020

weird

@darsnack (Member) commented

The PR contains the CI YAML files as far as I can tell...

Could bors be used to trigger it?

@dnabanita7 (Contributor, author) commented

Shall I use GitHub Actions to add the CI workflow? I will create another PR for it.

@darsnack (Member) commented Jun 19, 2020

@dnabanita7 can you try forcing an empty commit with no changes?

The CI should already be set up; it works for all the other commits.
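(For reference, an empty commit can be created and pushed with `git commit --allow-empty -m "trigger CI" && git push`, which re-runs the checks without changing any files.)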

@dnabanita7 (Contributor, author) commented

It works now.

@darsnack (Member) commented

Yeah, thanks @CarloLucibello for restarting it!

@darsnack (Member) left a review comment

Since you need to make another commit anyway, can you change the naming from in_size to insize (and the same for other variables with underscores)?

@darsnack (Member) left a review comment

Since you are removing the comments, I would get rid of the blank lines in between. Personally, I would keep only a blank line before the return statement.

@dnabanita7 (Contributor, author) commented Jun 19, 2020

`PoolDims(::Tuple{Vararg{T,M}} where T, ...` or `PoolDims(::Union{Int64, Tuple{Vararg{Int64,L}}}; stride, padding, dilation) where {M, L}`: shall I change the dispatch in the struct constructor to match one of these?

@dnabanita7 (Contributor, author) commented

> Since you are removing the comments, I would get rid of the blank lines in between. Personally, I would keep only a blank line before the return statement.

The spaces between functions too? The other functions in that file don't have that.

@CarloLucibello (Member) commented

Flux uses 2-space indentation (unfortunately, in my opinion), so you should correct all the indents, except in docstrings (where you should keep 4).

@darsnack (Member) commented

> The spaces between functions too? The other functions in that file don't have that.

No, that comment is out of date. Please ignore it.

@darsnack (Member) commented

> `PoolDims(::Tuple{Vararg{T,M}} where T, ...` or `PoolDims(::Union{Int64, Tuple{Vararg{Int64,L}}}; stride, padding, dilation) where {M, L}`: shall I change the dispatch in the struct constructor to match one of these?

What exactly do you mean?

@dnabanita7 (Contributor, author) commented

The error shown is a method mismatch. Shall I change the constructor in the struct to the ones mentioned above?

@darsnack (Member) left a review comment

The error is from a constructor in NNlib.jl. You aren't dispatching to https://github.com/FluxML/NNlib.jl/blob/aa4bb07f35322498c9fabd391ab1ecd73d09afd4/src/dim_helpers/PoolDims.jl#L24 because the method signature says `k` needs to be a tuple of `Int`s (or a single `Int`), and you are passing a tuple of floats.
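To illustrate the mismatch (the sizes here are made up): elementwise `/` on integer tuples produces floats, while `÷` keeps them as `Int`s, which is what the `PoolDims` signature accepts.

```julia
insize = (32, 32)   # input spatial size
outsize = (7, 7)    # target adaptive output size

insize ./ outsize   # (4.571…, 4.571…) — NTuple{2, Float64}: no matching PoolDims method
insize .÷ outsize   # (4, 4)           — NTuple{2, Int}: dispatches as expected
```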

@dnabanita7 (Contributor, author) commented Jun 21, 2020

I guess it's good to get a final review from @DhairyaLGandhi or @MikeInnes.

@avik-pal (Member) commented

@dnabanita7 An entry in NEWS.md would be needed for this.

@dnabanita7 (Contributor, author) commented

cc @DhairyaLGandhi

@DhairyaLGandhi (Member) commented

Can we also add the tests to the GPU test suite? I know maxpool is already in the test suite, but it's good for making sure we are doing layer regression tests consistently.
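A sketch of the kind of GPU regression test being asked for, assuming the `AdaptiveMaxPool` API this PR adds and Flux's `gpu` helper (which needs a CUDA-capable setup):

```julia
using Flux, Test

x = rand(Float32, 32, 32, 3, 2)    # WHCN input
layer = AdaptiveMaxPool((7, 7))    # target 7×7 spatial output

@test size(layer(x)) == (7, 7, 3, 2)             # CPU forward pass
@test size(gpu(layer)(gpu(x))) == (7, 7, 3, 2)   # same shapes on the GPU
```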

@DhairyaLGandhi (Member) commented

There are a few other places that would need the `÷` (`\div`) treatment. Thanks for looking into it!

@CarloLucibello (Member) commented

Thanks for this contribution!

bors r+

@bors (bot) commented Jun 30, 2020

Build succeeded.

bors bot merged commit 4182b48 into FluxML:master on Jun 30, 2020.
@dnabanita7 deleted the pooling branch on Jul 1, 2020.
@avik-pal mentioned this pull request on Jul 1, 2020.