Expand numerical differentiation options #58
Comments
In particular, we discussed adding implementations of spline-based differentiation and a total variation (TV) regularized derivative method.
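As a sketch of the spline route (my choice of scipy API and parameters, not from the thread): fit a spline to the samples and differentiate it analytically.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

t = np.linspace(0, 2 * np.pi, 200)
x = np.sin(t)  # in practice, noisy measurements

# k=4 so the derivative is a cubic spline; for noisy data the smoothing
# factor s would be tuned (s=0 interpolates the samples exactly).
spline = UnivariateSpline(t, x, k=4, s=0)
dx = spline.derivative()(t)  # close to cos(t) on clean data
```

The appeal of this approach is that once the spline is fit, derivatives of any order (up to k) come for free from the piecewise polynomial.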
I think we should offer various differentiation methods in pysindy, but not add them to the main package as source code. Instead, it makes perfect sense to have this as a distinct package and import it into pysindy as a requirement. This will add a bit more overhead, but it will make the numerical differentiation part more open to other projects and also make pysindy itself more stable against changes in the numerical differentiation code. I have a crude draft of such a project here: https://github.com/Ohjeah/derivative.py Maybe we can also have a look at autograd tools, e.g. https://github.com/google/jax
I think separate is right. There might be other system discovery methods besides SINDy--separate packages make reuse easy. I also had autograd methods on my to-do list, but derivatives for noisy experimental data seem like a priority. Overall, both projects look similar in content and references, meaning the essential methods are:
I think the biggest differences are in 2 and 5. How should this develop from here?
I agree that numerical differentiation is a large enough project in and of itself that we should probably not try to tackle it ourselves within PySINDy. For us to be able to list an external repo as a requirement, I think it will need to be registered on PyPI. @andgoldschmidt's implementation looks a little more mature at the moment, but it's not quite ready to be used in PySINDy yet. @andgoldschmidt, would you be willing to set up your repo on PyPI? With regards to autograd methods, my understanding is that they require the form of the function being differentiated to be known (e.g.
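That limitation of automatic differentiation is worth illustrating: with only sampled, noisy measurements (no symbolic form of the function), autodiff does not apply, and naive finite differences amplify the noise by roughly 1/dt. A quick numpy sketch of my own, not from the thread:

```python
import numpy as np

# We only have noisy samples of x(t) = sin(t), not the function itself,
# so automatic differentiation cannot be used here.
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 500)
dt = t[1] - t[0]
x_noisy = np.sin(t) + rng.normal(scale=0.01, size=t.size)

# Central finite differences amplify the measurement noise ~ 1/dt.
dx_noisy = np.gradient(x_noisy, dt)
dx_clean = np.gradient(np.sin(t), dt)
err_noisy = np.std(dx_noisy - np.cos(t))  # dominated by amplified noise
err_clean = np.std(dx_clean - np.cos(t))  # small truncation error only
```

This is exactly why smoothing-based derivative methods matter for experimental data.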
Could this be used as an option? It provides smoothing but also the derivative.
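The linked option isn't visible in the thread, but one standard filter with exactly this property--smoothing and differentiation in a single pass--is the Savitzky-Golay filter; a scipy sketch with illustrative parameters:

```python
import numpy as np
from scipy.signal import savgol_filter

rng = np.random.default_rng(1)
t = np.linspace(0, 2 * np.pi, 200)
dt = t[1] - t[0]
x = np.sin(t) + rng.normal(scale=0.01, size=t.size)

# Fit a local cubic in a 15-sample sliding window; deriv=1 returns the
# derivative of each local fit, so smoothing and differentiation happen
# in one step. Window length and polynomial order would be tuned.
dx = savgol_filter(x, window_length=15, polyorder=3, deriv=1, delta=dt)

# Compare against raw finite differences on the same noisy signal.
dx_raw = np.gradient(x, dt)
```

The local polynomial fit suppresses the noise amplification that plain finite differences suffer from.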
Also, I found this code based on Chartrand's original MATLAB code: https://github.com/stur86/tvregdiff, but I think there may be issues with it.
Implemented in #85 |
@andgoldschmidt has implemented a number of numerical differentiation methods, some of which could easily be ported over to PySINDy. I'm creating this issue to create a discussion space for this project.