
Move conv to Convolutions.jl or ConvolutionsBase.jl? #615

Open
dlfivefifty opened this issue Dec 16, 2024 · 13 comments
@dlfivefifty

Downstream I have a DSP extension just to overload conv:

https://github.com/JuliaArrays/InfiniteArrays.jl/blob/master/ext/InfiniteArraysDSPExt.jl

I wonder if moving conv out would be better. This could take two approaches:

  1. Move out all convolutions code to Convolutions.jl. Potentially, FFTW could be a weak dependency but that may be hard.
  2. Make a package that just defines conv with no implementation called ConvolutionsBase.jl. This has the benefit that there is no FFTW.jl dependency.
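For reference, option 2 could be almost empty. A hypothetical sketch (module name, docstring, and export are assumptions, not an agreed API):

```julia
# Hypothetical sketch of a minimal ConvolutionsBase.jl: a documented generic
# function with no methods, which implementing packages (DSP.jl,
# InfiniteArrays.jl, ...) extend. No FFTW.jl dependency is involved.
module ConvolutionsBase

"""
    conv(u, v)

Convolution of `u` and `v`. This generic function has no methods here;
packages providing concrete implementations add methods to it.
"""
function conv end

export conv

end # module
```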
@martinholters
Member

Potentially, FFTW could be a weak dependency but that may be hard.

Even if we came up with a dispatch where an extension could hook in to do frequency-domain convolution, I wouldn't be happy to see the result of a convolution change depending on whether FFTW is loaded, even if it is within the numerical noise floor.

Make a package that just defines conv with no implementation called ConvolutionsBase.jl.

Sounds easy, but might not be, as it would then also have to define (by documentation) the exact API, probably including any dispatch hooks like conv_axis_with_offset. It would be too bad if one had to depend on DSP just because one wants to add a method not to conv itself, but to conv_axis_with_offset or conv_output_axes or whatever might become public API there.

So maybe it's better to start with a Convolutions.jl package that tries to fulfill the need of any downstream users and provide useful hooks. Once that has stabilized, one can then think about moving the definition of the API to a ConvolutionsBase.jl package.
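To make the hook idea concrete, here is a sketch of what such a split could look like. All names are hypothetical (loosely following the conv_output_axes suggestion above), and the fallback assumes 1-based indexing:

```julia
# Hypothetical hook: separate the output-shape question from the kernel
# computation, so exotic array types (infinite arrays, offset arrays, ...)
# can override just the part they need.
conv_output_axes(u::AbstractVector, v::AbstractVector) =
    (Base.OneTo(length(u) + length(v) - 1),)

# Generic direct (time-domain) fallback built on the hook, assuming 1-based
# indexing; an FFTW-backed method could dispatch on array/element types
# separately without touching this code.
function conv(u::AbstractVector, v::AbstractVector)
    w = zeros(promote_type(eltype(u), eltype(v)), conv_output_axes(u, v)...)
    for i in eachindex(u), j in eachindex(v)
        w[i + j - 1] += u[i] * v[j]
    end
    return w
end
```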

Maybe we should try to bring in more users of convolution to the discussion. E.g. multiplication of polynomials is basically convolution, but Polynomials.jl can't depend on DSP.jl because DSP.jl depends on Polynomials.jl. Would a Convolutions.jl package be helpful for Polynomials.jl, @jverzani? If so, what would you expect of the API?
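On the polynomial connection: the coefficients of p*q are exactly the full convolution of the two coefficient vectors, which can be checked with nothing but base Julia (evalpoly takes coefficients in ascending order):

```julia
# Polynomial product as convolution: coefficient n of p*q is the sum of
# a[k] * b[n-k+1] over the valid range of k.
a = [1, 2, 3]   # 1 + 2x + 3x^2
b = [4, 5]      # 4 + 5x
c = [sum(a[k] * b[n - k + 1] for k in max(1, n - length(b) + 1):min(length(a), n))
     for n in 1:length(a) + length(b) - 1]

# Sanity check at a point: (p*q)(x) == p(x) * q(x)
x = 2.0
evalpoly(x, c) == evalpoly(x, a) * evalpoly(x, b)  # true
```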

@wheeheee
Member

I think (among others) ApproxFunBase.jl and Korg.jl also use only conv. Don't know if the maintainers are ok with being pinged?

@dlfivefifty
Author

I can speak for ApproxFunBase. It's fine to ping me 😂

@jverzani
Contributor

Polynomials would likely benefit from this and could work with any API chosen, I would assume, as the requirements are pretty minimal (basically, convolution of two vectors or ntuples).

@dlfivefifty
Author

I think convolution of two Float64 vectors is the case where FFTW.jl is needed, but I think the current proposal is to leave FFTW as a hard dependency.

@wheeheee
Member

I can speak for ApproxFunBase. It's fine to ping me 😂

oh man, I saw jishnub's icon and thought that was it 😅

Polynomials would likely benefit from this and could work with any API chosen, I would assume, as the requirements are pretty minimal (basically, convolution of two vectors or ntuples).

Hm, currently we don't support convolution of ntuples. @martinholters would it be appropriate to add that?

@martinholters
Member

Hm, currently we don't support convolution of ntuples. @martinholters would it be appropriate to add that?

Sure, why not? Maybe not in DSP.jl, but in Convolutions.jl, that sounds reasonable. For future reference: for short homogenous tuples, this works quite well:

conv(a::Tuple, b::Tuple) = ntuple(n -> sum(a[k]*b[n-k+1] for k in max(1, n-length(b)+1):min(length(a), n)), length(a) + length(b) - 1)

We'd probably want to do some @generated magic, but the above might still be useful for an else part of an if @generated.
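Restating that one-liner in a more readable layout with a quick check of what it produces (the result is an NTuple of length length(a) + length(b) - 1):

```julia
# Full convolution of two tuples; element n sums a[k] * b[n-k+1] over the
# range of k where both indices are in bounds.
conv(a::Tuple, b::Tuple) =
    ntuple(n -> sum(a[k] * b[n - k + 1] for k in max(1, n - length(b) + 1):min(length(a), n)),
           length(a) + length(b) - 1)

conv((1, 2, 3), (1, 1))  # == (1, 3, 5, 3)
```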

@dlfivefifty
Author

I think there needs to be an obvious use case for Tuple. When would one want it instead of an SVector? Does a convolution make sense when the types are different?

@jverzani
Contributor

It comes up in Polynomials for tuple-backed polynomials. But that might be quite niche, so it needn't be in a more general package. I was just answering what was used, not necessarily what was expected in an external package.

@dlfivefifty
Author

When do polynomials have coefficients of different types? Or are you using it instead of an SVector to avoid a dependency?

@jverzani
Contributor

The different types might occur, but primarily we have an option for using an ntuple which speeds up some operations (it was originally borrowed from https://github.com/tkoolen/StaticUnivariatePolynomials.jl). As surmised, an SVector is avoided just because of the dependency. Perhaps we can revisit that decision.

@dlfivefifty
Author

I haven’t looked at how Polynomials.jl is implemented, but supporting tuples seems like it must require a bunch of special cases, e.g. addition. I suspect it would be simpler to only allow AbstractVector coefficients. It probably wouldn’t need to know anything about StaticArrays.jl; e.g., the special implementation of conv mentioned would just live in a ConvolutionsStaticArraysExt.jl.

@jverzani
Contributor

Fine with me. Thanks for the consideration.
