Commit

readme update (#149)
ThummeTo committed Sep 11, 2024
1 parent 32ee3db commit 231fc30
Showing 2 changed files with 3 additions and 8 deletions.
9 changes: 2 additions & 7 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -52,15 +52,10 @@ You can evaluate FMUs inside of your loss function.
- Implicit solvers using `autodiff=true` is not supported (now), but you can use implicit solvers with `autodiff=false`.

- Sensitivity information over a state change at an event, $\partial x^{+} / \partial x^{-}$, cannot be accessed via FMI.
These sensitivities are therefore simplified based on one of the following assumptions (chosen by the user):
(1) the state after the event depends on nothing, so the sensitivities are zero, or
(2) the state after the event instance depends only on the same state before the event instance.
The second assumption is often correct, e.g. for mechanical contacts, but may lead to wrong gradients for arbitrary discontinuous systems.
However, even if the gradient is not 100% correct in every case, gradients are often still usable for optimization tasks.
These sensitivities are sampled if the FMU supports `fmiXGet/SetState`. If this feature is not available, wrong sensitivities are computed, which may influence your optimization (depending on the use case).
This issue is also part of the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.
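
The two simplification assumptions above correspond to two possible choices for the event Jacobian $\partial x^{+} / \partial x^{-}$. A minimal sketch (hypothetical helper names for illustration only, not part of the FMIFlux API):

```julia
# Sketch: the unknown event sensitivity ∂x⁺/∂x⁻ replaced by an assumed Jacobian.
# `n` is the state dimension; both choices are assumptions, not data from the FMU.

# Assumption (1): the state after the event depends on nothing,
# so the sensitivity over the event is zero.
event_jacobian_zero(n::Int) = zeros(n, n)

# Assumption (2): each state after the event depends only on the same
# state before the event, giving an identity Jacobian.
# Often reasonable for e.g. mechanical contacts.
event_jacobian_identity(n::Int) = [i == j ? 1.0 : 0.0 for i in 1:n, j in 1:n]
```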

- Discontinuous systems with implicit solvers use continuous adjoints instead of automatic differentiation through the ODE solver.
This might lead to issues, because FMUs are by design not capable of being simulated backwards in time.
- If continuous adjoints are applied instead of automatic differentiation through the ODE solver (discrete adjoint), this might lead to issues, because FMUs are by design not capable of being simulated backwards in time.
On the other hand, many FMUs are capable of doing so.
This issue is also part of the [*OpenScaling*](https://itea4.org/project/openscaling.html) research project.
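
The distinction can be illustrated on a toy problem: a discrete adjoint (or forward sensitivity, as sketched below) differentiates the solver's own time steps and only ever marches forward in time, while a continuous adjoint integrates an extra adjoint ODE backwards in time and therefore needs model evaluations in reverse order, which FMUs may not support. A minimal forward-sensitivity sketch for $\dot{x} = a x$ with explicit Euler (toy code, not the FMIFlux implementation):

```julia
# Toy sketch: propagate the sensitivity s = dx/da alongside a forward
# Euler solve of dx/dt = a*x. Everything runs forward in time, which is
# what makes differentiating through the solver FMU-friendly.
function euler_with_sensitivity(a, x0, h, N)
    x, s = x0, 0.0
    for _ in 1:N
        s = s + h * (a * s + x)  # differentiate the Euler step w.r.t. a (uses old x)
        x = x + h * a * x        # the Euler step itself
    end
    return x, s
end
```

For comparison, a continuous adjoint would instead integrate an adjoint variable from the final time back to the initial time, evaluating the right-hand side at earlier and earlier time points.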

Expand Down
2 changes: 1 addition & 1 deletion src/neural.jl
Original file line number Diff line number Diff line change
Expand Up @@ -152,7 +152,7 @@ mutable struct CS_NeuralFMU{F,C} <: NeuralFMU
tspan::Any

p::Union{AbstractArray{<:Real},Nothing}
re::Any # restrucure function
re::Any # restructure function

snapshots::Bool

Expand Down

2 comments on commit 231fc30

@ThummeTo
Owner Author


@JuliaRegistrator


Registration pull request created: JuliaRegistries/General/114985

Tip: Release Notes

Did you know you can add release notes too? Just add markdown-formatted text underneath the comment after the text
"Release notes:" and it will be added to the registry PR, and if TagBot is installed it will also be added to the
release that TagBot creates, e.g.:

@JuliaRegistrator register

Release notes:

## Breaking changes

- blah

To add them here just re-invoke and the PR will be updated.

Tagging

After the above pull request is merged, it is recommended that a tag is created on this repository for the registered package version.

This will be done automatically if the Julia TagBot GitHub Action is installed, or can be done manually through the github interface, or via:

git tag -a v0.13.0 -m "<description of version>" 231fc302a696dc5d0ebb6d3b17afc2966a682bc7
git push origin v0.13.0
