sciml_train output API documentation #702
Comments
I agree more documentation would be nice once the API settles, but the fields can already be inspected interactively:
julia> result = sciml_train(...)
julia> result |> typeof
SciMLBase.OptimizationSolution{...}
julia> result |> typeof |> fieldnames
(:u, :prob, :alg, :minimum, :retcode, :original)
julia> @doc result
No documentation found.
...
Summary
≡≡≡≡≡≡≡≡≡
struct SciMLBase.OptimizationSolution{...}
Fields
≡≡≡≡≡≡≡≡
u :: Vector{Float32}
prob :: SciMLBase.OptimizationProblem{false, SciMLBase.OptimizationFunction{false, GalacticOptim.AutoZygote, SciMLBase.OptimizationFunction{true, GalacticOptim.AutoZygote, DiffEqFlux.var"#84#89"{Main.anonymous.var"#49#50"}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, GalacticOptim.var"#278#288"{GalacticOptim.var"#277#287"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoZygote, DiffEqFlux.var"#84#89"{Main.anonymous.var"#49#50"}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#281#291"{GalacticOptim.var"#277#287"{SciMLBase.OptimizationFunction{true, GalacticOptim.AutoZygote, DiffEqFlux.var"#84#89"{Main.anonymous.var"#49#50"}, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Nothing}}, GalacticOptim.var"#286#296", Nothing, Nothing, Nothing}, Vector{Float32}, SciMLBase.NullParameters, Nothing, Nothing, Nothing, Nothing, Base.Pairs{Symbol, Integer, Tuple{Symbol, Symbol}, NamedTuple{(:iterations, :show_trace), Tuple{Int64, Bool}}}}
alg :: Optim.LBFGS{Nothing, LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Optim.var"#19#21"}
minimum :: Float32
retcode :: Symbol
original :: Optim.MultivariateOptimizationResults{Optim.LBFGS{Nothing, LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Optim.var"#19#21"}, Vector{Float32}, Float32, Float32, Vector{Optim.OptimizationState{Float32, Optim.LBFGS{Nothing, LineSearches.InitialStatic{Float64}, LineSearches.HagerZhang{Float64, Base.RefValue{Bool}}, Optim.var"#19#21"}}}, Bool, NamedTuple{(:f_limit_reached, :g_limit_reached, :h_limit_reached, :time_limit, :callback, :f_increased), NTuple{6, Bool}}}
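In other words, the fields listed above can be read off the returned solution directly. A minimal usage sketch, assuming result is the OptimizationSolution shown above:

result.u          # optimized parameters (a Vector{Float32} in this case)
result.minimum    # final loss value
result.retcode    # termination status, e.g. :Success
result.original   # the underlying Optim.jl results object, with its own diagnostics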
Thanks for the feedback @IlyaOrson. I agree that some information can be accessed in this way. In VSCode, just by adding a dot after the result, the available fields show up via autocompletion. And besides this, having good API documentation is much more welcoming to new users. In general I like reading good documentation with APIs, together with recommendations on how to properly use them and use cases.
Ah right. Since the properties are not members of a struct but come from a function call, it is not possible to get all the possibilities that way directly. This is a Julia interactive discoverability problem in general, imho.
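For illustration, a minimal toy sketch of that discoverability problem (not SciMLBase's actual definition): properties added through a getproperty overload are invisible to fieldnames, and only show up in propertynames/tab-completion if propertynames is also overloaded.

struct ToySolution
    u::Vector{Float64}
end

# Expose a computed alias (the way e.g. `minimizer` is exposed on solutions)
# without it being a real struct field.
function Base.getproperty(s::ToySolution, name::Symbol)
    name === :minimizer && return getfield(s, :u)
    return getfield(s, name)
end
Base.propertynames(::ToySolution) = (:u, :minimizer)

s = ToySolution([1.0, 2.0])
fieldnames(typeof(s))   # (:u,)            -> the alias does not show up here
propertynames(s)        # (:u, :minimizer) -> only because we overloaded it
s.minimizer             # [1.0, 2.0]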
Thanks for the clarification. In that case, I think my point about clear API documentation is even stronger 😄
Many thanks for raising this. I had the same issue and was browsing for around an hour before I stumbled upon this. My impression is that for some reason this is often the case with Julia code, which has been quite annoying at times. I would even be willing to help with the documentation myself if I knew how to find such things and how to add to the documentation.
It is annoying indeed, but fortunately the tab-completions will improve in the next release. The documentation depends on us, though. Here is a guide on how to patch/contribute to existing Julia packages.
It's because it's changing. There's nothing actionable left here other than the removal of sciml_train.
Thanks for the feedback @ChrisRackauckas. These look like major changes; are there any plans to ensure some backwards compatibility? If these are implemented, it will break an important fraction of the code for my research. I'm happy to see these changes happening, though, given the many issues we've discussed several times. Does it mean that the future of SciML is dropping Flux in favour of Lux? What about the name of DiffEqFlux.jl? I heard that Enzyme.jl will very soon handle BLAS calls. Is the end goal of DiffEqFlux.jl to use the DifferentialEquations.jl + Lux.jl + Enzyme.jl + GalacticOptim.jl combo?
There will be a deprecation period of about a year where it throws a warning telling you what to do. That will help users update. But for the most part, if you put your research code on a manifest, it should keep working indefinitely.
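For what it's worth, a minimal sketch of that pinning workflow with Pkg (the package names are just placeholders):

using Pkg
Pkg.activate(".")                              # use this directory's Project.toml / Manifest.toml
Pkg.add(["DiffEqFlux", "DiffEqSensitivity"])   # exact versions get recorded in Manifest.toml
# Later, or on another machine with the same Project.toml + Manifest.toml checked in:
Pkg.instantiate()                              # restores exactly those recorded versions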
Pretty much.
Yeah, IIUC @wsmoses has been working on it over the last week, so there's probably already some BLAS support at this point, just not in released versions. And there's a draft rule system: EnzymeAD/Enzyme.jl#177. SciML tools are set to shift over to using Enzyme ASAP (and already use it in many places).

The end goal is a little different from that; there are many end goals here, though. One is to make DiffEqSensitivity.jl a well-documented package in its own right. But to also simplify DiffEqSensitivity.jl, we want to get AbstractDifferentiation.jl up to snuff so it can be just as optimized in its vjp switching, in which case the vjp choice would just be an AbstractDifferentiation.jl choice. We hope that can then be the main interface that most users learn for autodiff, so "pass an AD.jl type to tell us what AD to use internally" should be an "obvious" kind of thing to most Julia users, rather than a DiffEq-specific option.

We really don't care what AD users use on the outside; that doesn't really affect the performance all too much. It's the internal AD that really matters. So externally it might still look like Zygote for a while, but it will transition to fully demonstrating AbstractDifferentiation.jl everywhere as a nice swappable front end, along with GalacticOptim.jl. GalacticOptim.jl is just becoming a standard Julia optimization package, in which case it becomes a well-documented workflow on its own and sciml_train becomes redundant.

Then, the tutorials should all be updated to use either Lux.jl for ML-like workflows, or SimpleChains.jl for smaller neural networks with more optimality. The fate of DiffEqFlux is then a bit up in the air. It will become a library of pre-defined layers for ML use cases, like continuous normalizing flows and the like, but pretty much no actual SciML use cases will need it. That means we'll need to figure out what to do with all of the tutorials, because tutorials on solving inverse problems and the like will still be useful even though DiffEqFlux as a library will be completely unnecessary for that. But we're reworking our documentation to be more "org-wide", i.e. I want one documentation site with buttons for the different packages/modules, in which case a tutorial section for inverse problems could be documented on its own, apart from DiffEqFlux.

But again, this has all been in the works since like 2020, with the grand vision of making DiffEqFlux essentially disappear due to all of the standard Julia tools being good enough to gobble up its domain. When that's all in place, there will be a deprecation period telling everyone what to do, the tutorials will be fully updated in that direction, and DiffEqFlux will live on as a layers-with-diffeq library.
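As a rough illustration of the "pass an AD.jl type" idea, this is what the AbstractDifferentiation.jl front end looks like today (backend names per that package; whether this becomes the DiffEqSensitivity-facing interface is the plan described above, not something that exists yet):

import AbstractDifferentiation as AD
using Zygote, ForwardDiff

f(x) = sum(abs2, x)
x = rand(3)

# The backend is just a value you pass in, so swapping the AD engine is a one-argument change.
AD.gradient(AD.ZygoteBackend(), f, x)       # reverse mode
AD.gradient(AD.ForwardDiffBackend(), f, x)  # forward mode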
This is the first major step of SciML/DiffEqFlux.jl#702, which is the start of solving #582
Details of the process are in SciML/SciMLSensitivity.jl#582. Major movements: SciML/SciMLSensitivity.jl#583
sciml_train has been deprecated and removed.
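For anyone landing here after the removal, a minimal sketch of the GalacticOptim.jl workflow that takes its place (the same OptimizationFunction / OptimizationProblem / solve pieces visible in the types printed above; the loss here is just a placeholder):

using GalacticOptim, Optim, Zygote

loss(θ, p) = sum(abs2, θ)            # stand-in for a real objective over parameters θ
θ0 = rand(Float32, 4)

optf = OptimizationFunction(loss, GalacticOptim.AutoZygote())
prob = OptimizationProblem(optf, θ0)
sol  = solve(prob, LBFGS())          # returns the OptimizationSolution type discussed above

sol.u, sol.minimum, sol.retcode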
I have been using DiffEqFlux.jl for some time, but something that has been bothering me is the lack of documentation in terms of APIs. Particularly, for such an important function as sciml_train, the API documentation only exists for the inputs, but not the outputs. I have been able to come across some fields of the output, such as minimizer, but I'd really like to be able to browse through the whole list of fields. I'm sure this would also be super useful to new users wanting to explore the full capabilities of the library.

So far I haven't been able to find the full list of fields (e.g. using fieldnames doesn't work). If someone could point me to the right way to display them, I'd like to help with a PR to include this in the docs. Also, some help in describing each one of them would be appreciated. Thanks!