API · PEtab.jl

API

PEtab.PEtabModel - Type
PEtabModel

A Julia-compatible representation of a PEtab-specified problem.

Created by readPEtabModel, this object contains helper functions for setting up cost, gradient, and Hessian computations, as well as handling potential model events (callbacks).

Note1: Several of the functions in PEtabModel are not intended to be accessed by the user. For example, compute_h (and similar functions) require indices that are built in the background to efficiently map parameters between experimental (simulation) conditions. Rather, PEtabModel holds all information needed to create a PEtabODEProblem, and in the future, PEtabSDEProblem, etc.

Note2: ODEProblem.p refers to the parameters for the underlying DifferentialEquations.jl ODEProblem.

Fields

  • modelName: The model name extracted from the PEtab YAML file.
  • compute_h: Computes the observable h for a specific time point and simulation condition.
  • compute_u0!: Computes in-place initial values using ODEProblem.p for a simulation condition; compute_u0!(u0, p).
  • compute_u0: Computes initial values as above, but not in-place; u0 = compute_u0(p).
  • compute_σ: Computes the noise parameter σ for a specific time point and simulation condition.
  • compute_∂h∂u!: Computes the gradient of h with respect to ODEModel states (u) for a specific time point and simulation condition.
  • compute_∂σ∂u!: Computes the gradient of σ with respect to ODEModel states (u) for a specific time point and simulation condition.
  • compute_∂h∂p!: Computes the gradient of h with respect to ODEProblem.p.
  • compute_∂σ∂p!: Computes the gradient of σ with respect to ODEProblem.p.
  • computeTStops: Computes the event times in case the model has DiscreteCallbacks (events).
  • convertTspan::Bool: Tracks whether the time span should be converted to Dual numbers for ForwardDiff.jl gradients, in case the model has DiscreteCallbacks and the trigger time is a parameter set to be estimated.
  • dirModel: The directory where the model.xml and PEtab files are stored.
  • dirJulia: The directory where the Julia-model files created by parsing the PEtab files (e.g., SBML file) are stored.
  • odeSystem: A ModelingToolkit.jl ODE system obtained from parsing the model SBML file.
  • parameterMap: A ModelingToolkit.jl parameter map for the ODE system.
  • stateMap: A ModelingToolkit.jl state map for the ODE system describing how the initial values are computed, e.g., whether or not certain initial values are computed from parameters in the parameterMap.
  • parameterNames: The names of the parameters in the odeSystem.
  • stateNames: The names of the states in the odeSystem.
  • pathMeasurements: The path to the PEtab measurements file.
  • pathConditions: The path to the PEtab conditions file.
  • pathObservables: The path to the PEtab observables file.
  • pathParameters: The path to the PEtab parameters file.
  • pathSBML: The path to the PEtab SBML file.
  • pathYAML: The path to the PEtab YAML file.
  • modelCallbackSet: This stores potential model callbacks or events.
  • checkIfCallbackIsActive: Piecewise SBML statements are transformed to DiscreteCallbacks that are activated at a specific time point. The piecewise callback has a default value at t₀ and is only triggered when t_activation is reached. If t_activation ≤ 0 (never reached when solving the model), this function checks whether the callback should already be triggered before solving the model.
source
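As a quick orientation, here is a minimal sketch (the YAML path is a placeholder) of importing a problem and reading a few of the fields listed above:

using PEtab

petabModel = readPEtabModel("path/to/petab/problem.yaml")  # placeholder path
println(petabModel.modelName)       # model name from the PEtab YAML file
println(petabModel.stateNames)      # states of the underlying ODE system
println(petabModel.parameterNames)  # parameters of the underlying ODE system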
PEtab.readPEtabModel - Function
readPEtabModel(pathYAML::String;
                forceBuildJuliaFiles::Bool=false,
                verbose::Bool=true,
                ifElseToEvent::Bool=true,
                writeToFile::Bool=true,
                jlFilePath::String="")::PEtabModel

Parses a PEtab-specified problem, with its YAML file located at pathYAML, into a Julia-accessible format.

When parsing a PEtab problem, several things happen under the hood:

  1. The SBML file is translated into ModelingToolkit.jl format to allow for symbolic computations of the ODE-model Jacobian. Piecewise and model events are further written into DifferentialEquations.jl callbacks.
  2. The observable PEtab table is translated into a Julia file with functions for computing the observable (h), noise parameter (σ), and initial values (u0).
  3. To allow gradients via adjoint sensitivity analysis and/or forward sensitivity equations, the gradients of h and σ are computed symbolically with respect to the ODE model's states (u) and parameters (odeProblem.p).

All of this happens automatically, and the resulting files are stored under petabModel.dirJulia (assuming writeToFile=true). To save time, forceBuildJuliaFiles=false by default, which means that Julia files are not rebuilt if they already exist.

If a Julia model file is provided instead of an SBML file, provide the file path via jlFilePath.

Arguments

  • pathYAML::String: Path to the PEtab problem YAML file.
  • forceBuildJuliaFiles::Bool=false: If true, forces the creation of Julia files for the problem even if they already exist.
  • verbose::Bool=true: If true, displays verbose output during parsing.
  • ifElseToEvent::Bool=true: If true, rewrites if-else statements in the SBML model as event-based callbacks.
  • jlFilePath::String="": Path to an existing Julia file. Should only be provided if a Julia model file is available.

Example

petabModel = readPEtabModel("path/to/petab/problem.yaml")
source
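If the model is instead provided as a Julia file (see the jlFilePath argument above), the call takes the same form; this sketch reuses the paths from the Beer Julia-import tutorial later in these docs:

pathYaml = joinpath(@__DIR__, "Beer", "Beer_MolBioSystems2014.yaml")
pathJuliaFile = joinpath(@__DIR__, "Beer", "Julia_import_files", "Beer_Julia_Import.jl")
petabModel = readPEtabModel(pathYaml, verbose=true, jlFilePath=pathJuliaFile)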
PEtab.PEtabODEProblem - Type

PEtabODEProblem

Everything needed to set up an optimization problem (compute cost, gradient, Hessian, and parameter bounds) for a PEtab model.

Note

The parameter vector θ is always assumed to be on the parameter scale specified in the PEtab parameters file. If needed, θ is transformed to the linear scale inside the function call.

Fields

  • computeCost: For θ, computes the negative log-likelihood (the objective to minimize)
  • computeChi2: For θ, computes the χ² value
  • computeGradient!: For θ, computes the in-place gradient, computeGradient!(gradient, θ)
  • computeGradient: For θ, computes the out-of-place gradient, gradient = computeGradient(θ)
  • computeHessian!: For θ, computes the in-place Hessian (approximation), computeHessian!(hessian, θ)
  • computeHessian: For θ, computes the out-of-place Hessian (approximation), hessian = computeHessian(θ)
  • computeSimulatedValues: For θ, computes the model (simulated) values corresponding to the measurements, in the same order as in the PEtab measurements table
  • computeResiduals: For θ, computes the residuals (h_model - h_observed)^2 / σ^2 in the same order as in the PEtab measurements table
  • gradientMethod: The method used to compute the gradient (either :ForwardDiff, :ForwardEquations, :Adjoint, or :Zygote).
  • hessianMethod: The method used to compute or approximate the Hessian (either :ForwardDiff, :BlockForwardDiff, or :GaussNewton).
  • nParametersToEstimate: The number of parameters to estimate.
  • θ_estNames: The names of the parameters in θ.
  • θ_nominal: The nominal values of θ as specified in the PEtab parameters file.
  • θ_nominalT: The nominal values of θ on the parameter scale (e.g., log) as specified in the PEtab parameters file.
  • lowerBounds: The lower parameter bounds on the parameter scale for θ as specified in the PEtab parameters file.
  • upperBounds: The upper parameter bounds on the parameter scale for θ as specified in the PEtab parameters file.
  • petabModel: The PEtabModel used to construct the PEtabODEProblem.
  • odeSolverOptions: The options for the ODE solver specified when creating the PEtabODEProblem.
  • odeSolverGradientOptions: The options for the ODE solver gradient specified when creating the PEtabODEProblem.
source
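As a sketch, assuming petabProblem is a PEtabODEProblem created via createPEtabODEProblem (documented below), the fields above are used as plain Julia callables:

θ = petabProblem.θ_nominalT                 # nominal values on the estimation scale
cost = petabProblem.computeCost(θ)
gradient = zeros(length(θ))
petabProblem.computeGradient!(gradient, θ)
hessian = zeros(length(θ), length(θ))
petabProblem.computeHessian!(hessian, θ)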
PEtab.createPEtabODEProblem - Function
createPEtabODEProblem(petabModel::PEtabModel; <keyword arguments>)

Given a PEtabModel, creates a PEtabODEProblem with potential user-specified options.

The keyword arguments (described below) allow the user to choose cost, gradient, and Hessian methods, ODE solver options, and other tunable options that can potentially make computations more efficient for some "edge-case" models. Please refer to the documentation for guidance on selecting the most efficient options for different types of models.

If a keyword argument is not set, a suitable default option is chosen based on the number of model parameters.

Note

Every problem is unique, so even though the default settings often work well they might not be optimal.

Keyword arguments

  • odeSolverOptions::ODESolverOptions: Options for the ODE solver when computing the cost, such as solver and tolerances.
  • odeSolverGradientOptions::ODESolverOptions: Options for the ODE solver when computing the gradient, such as the ODE solver options used in adjoint sensitivity analysis. Defaults to odeSolverOptions if not set to nothing.
  • ssSolverOptions::SteadyStateSolverOptions: Options for finding steady-state for models with pre-equilibrium. Steady-state can be found via simulation or rootfinding, which can be set using SteadyStateSolverOptions (see documentation). If not set, defaults to simulation with wrms < 1 termination.
  • ssSolverGradientOptions::SteadyStateSolverOptions: Options for finding steady-state for models with pre-equilibrium when computing gradients. Defaults to ssSolverOptions value if not set.
  • costMethod::Symbol=:Standard: Method for computing the cost (objective). Two options are available: :Standard, which is the most efficient, and :Zygote, which is less efficient but compatible with the Zygote automatic differentiation library.
  • gradientMethod=nothing: Method for computing the gradient of the objective. Four options are available:
    • :ForwardDiff: Compute the gradient via forward-mode automatic differentiation using ForwardDiff.jl. Most efficient for models with ≤50 parameters. The number of chunks can be optionally set using chunkSize.
    • :ForwardEquations: Compute the gradient via the model sensitivities, where sensealg specifies how to solve for the sensitivities. Most efficient when the Hessian is approximated using the Gauss-Newton method and when the optimizer can reuse the sensitivities (reuseS) from gradient computations in Hessian computations (e.g., when the optimizer always computes the gradient before the Hessian).
    • :Adjoint: Compute the gradient via adjoint sensitivity analysis, where sensealg specifies which algorithm to use. Most efficient for large models (≥75 parameters).
    • :Zygote: Compute the gradient via the Zygote package, where sensealg specifies which sensitivity algorithm to use when solving the ODE model. This is the most inefficient option and not recommended.
  • hessianMethod=nothing: Method for computing the Hessian of the cost. There are three available options:
    • :ForwardDiff: Compute the Hessian via forward-mode automatic differentiation using ForwardDiff.jl. This is often only computationally feasible for models with ≤20 parameters but can greatly improve optimizer convergence.
    • :BlockForwardDiff: Compute the Hessian block approximation via forward-mode automatic differentiation using ForwardDiff.jl. The approximation consists of two block matrices: the first is the Hessian for only the dynamic parameters (parameter part of the ODE system), and the second is for the non-dynamic parameters (e.g., noise parameters). This is computationally feasible for models with ≤20 dynamic parameters and often performs better than BFGS methods.
    • :GaussNewton: Approximate the Hessian via the Gauss-Newton method, which often performs better than the BFGS method. If we can reuse the sensitivities from the gradient in the optimizer (see reuseS), this method is best paired with gradientMethod=:ForwardEquations.
  • sparseJacobian::Bool=false: When solving the ODE du/dt=f(u, p, t), whether implicit solvers use a sparse Jacobian. Sparse Jacobian often performs best for large models (≥100 states).
  • specializeLevel=SciMLBase.FullSpecialize: Specialization level when building the ODE problem. It is not recommended to change this parameter (see https://docs.sciml.ai/SciMLBase/stable/interfaces/Problems/).
  • sensealg: Sensitivity algorithm for gradient computations. The available options for each gradient method are:
    • :ForwardDiff: None (as ForwardDiff takes care of all computation steps).
    • :ForwardEquations: :ForwardDiff (uses ForwardDiff.jl and typically performs best) or ForwardDiffSensitivity() and ForwardSensitivity() from SciMLSensitivity.jl (https://github.com/SciML/SciMLSensitivity.jl).
    • :Adjoint: InterpolatingAdjoint() and QuadratureAdjoint() from SciMLSensitivity.jl.
    • :Zygote: All sensealg in SciMLSensitivity.jl.
  • sensealgSS=nothing: Sensitivity algorithm for adjoint gradient computations for steady-state simulations. The available options are SteadyStateAdjoint(), InterpolatingAdjoint(), and QuadratureAdjoint() from SciMLSensitivity.jl. SteadyStateAdjoint() is the most efficient, but it requires a non-singular Jacobian; in the case of a singular Jacobian, the code automatically switches to InterpolatingAdjoint().
  • chunkSize=nothing: Chunk-size for ForwardDiff.jl when computing the gradient and Hessian via forward-mode automatic differentiation. If nothing is provided, the default value is used. Tuning chunkSize is non-trivial, and we plan to add automatic functionality for this.
  • splitOverConditions::Bool=false: For gradient and Hessian via ForwardDiff.jl, whether or not to split calls to ForwardDiff across experimental (simulation) conditions. This parameter should only be set to true if the model has many parameters specific to an experimental condition; otherwise, the overhead from the calls will increase run time. See the Beer example for a case where this is needed.
  • reuseS::Bool=false : If set to true, reuse the sensitivities computed during gradient computations for the Gauss-Newton Hessian approximation. This option is only applicable when using hessianMethod=:GaussNewton and gradientMethod=:ForwardEquations. Note that it should only be used when the optimizer always computes the gradient before the Hessian.
  • verbose::Bool=true : If set to true, print progress messages while setting up the PEtabODEProblem.
source
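A hedged sketch of a typical call; the YAML path is a placeholder and the solver and method choices are illustrative picks from the recommendations above, not package defaults:

using PEtab, OrdinaryDiffEq

petabModel = readPEtabModel("path/to/petab/problem.yaml")  # placeholder path
petabProblem = createPEtabODEProblem(petabModel;
                                     odeSolverOptions=ODESolverOptions(Rodas5P(), abstol=1e-8, reltol=1e-8),
                                     gradientMethod=:ForwardDiff,
                                     hessianMethod=:ForwardDiff)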
PEtab.ODESolverOptions - Type
ODESolverOptions(solver, <keyword arguments>)

ODE-solver options (solver, tolerances, etc.) to use when computing the gradient/cost for a PEtabODEProblem.

More information about the available options and solvers can be found in the documentation for DifferentialEquations.jl (https://docs.sciml.ai/DiffEqDocs/stable/solvers/ode_solve/). Recommended settings for which solver and options to use for different problems can be found below and in the documentation.

Arguments

  • solver: Any of the ODE solvers in DifferentialEquations.jl. For small (≤20 states) mildly stiff models, composite solvers such as AutoVern7(Rodas5P()) perform well. For stiff small models, Rodas5P() performs well. For medium-sized models (≤75 states), QNDF(), FBDF(), and CVODE_BDF() perform well. CVODE_BDF() is not compatible with automatic differentiation and thus cannot be used if the gradient is computed via automatic differentiation or if the Gauss-Newton Hessian approximation is used. If the gradient is computed via adjoint sensitivity analysis, CVODE_BDF() is often the best choice as it is typically more reliable than QNDF() and FBDF() (fails less often).
  • abstol=1e-8: Absolute tolerance when solving the ODE system. Not recommended to increase above 1e-6 for gradients.
  • reltol=1e-8: Relative tolerance when solving the ODE system. Not recommended to increase above 1e-6 for gradients.
  • force_dtmin=false: Whether or not to force dtmin when solving the ODE system.
  • dtmin=nothing: Minimal acceptable step-size when solving the ODE system.
  • maxiters=10000: Maximum number of iterations when solving the ODE system. Increasing above the default value can cause the optimization to take substantial time.
source
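For example, following the solver recommendations above, options for a small stiff model might look like this (a sketch; the tolerances are the documented defaults):

using OrdinaryDiffEq

solverOptions = ODESolverOptions(Rodas5P(), abstol=1e-8, reltol=1e-8)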
PEtab.SteadyStateSolverOptions - Type
SteadyStateSolverOptions(method::Symbol;
                          howCheckSimulationReachedSteadyState::Symbol=:wrms,
                          rootfindingAlgorithm=nothing,
                          abstol=nothing,
                          reltol=nothing,
                          maxiters=nothing)

Set up options for finding the steady state via either method=:Rootfinding or method=:Simulate.

For method=:Rootfinding, the steady-state u* is found by solving the problem du = f(u, p, t) ≈ 0 with tolerances abstol and reltol, via an automatically chosen root-finding algorithm (rootfindingAlgorithm=nothing) or via any algorithm in NonlinearSolve.jl.

For method=:Simulate, the steady-state u* is found by simulating the ODE system until du = f(u, p, t) ≈ 0. Two options are available for howCheckSimulationReachedSteadyState:

  • :wrms : Weighted root-mean square √(∑((du ./ (reltol * u .+ abstol)).^2) / length(u)) < 1
  • :Newton : Terminates if the Newton step Δu is sufficiently small, √(∑((Δu ./ (reltol * u .+ abstol)).^2) / length(u)) < 1. Newton often performs better but requires an invertible Jacobian; if this requirement is not fulfilled, the code automatically switches to :wrms.

maxiters refers to either the maximum number of rootfinding steps or the maximum number of integration steps, depending on the chosen method.

source
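Two sketches that mirror the signature above (keyword spellings are taken verbatim from it; the tolerance values are illustrative):

ssOptionsSim = SteadyStateSolverOptions(:Simulate; howCheckSimulationReachedSteadyState=:wrms)
ssOptionsRoot = SteadyStateSolverOptions(:Rootfinding; abstol=1e-10, reltol=1e-10)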
PEtab.remakePEtabProblem - Function
remakePEtabProblem(petabProblem::PEtabODEProblem, parametersChange::Dict) :: PEtabODEProblem

Fix model parameters for a given PEtabODEProblem without recompiling the problem.

This function allows you to modify parameters without the need to recompile the underlying code, resulting in reduced latency. To fix the parameter k1, you can use parametersChange=Dict(:k1 => 1.0).

If model derivatives are computed using ForwardDiff.jl with a chunk-size of N, the new PEtabODEProblem will only evaluate the necessary number of chunks of size N to compute the full gradient for the remade problem.

source
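For example, a sketch of fixing k1 as in the paragraph above (petabProblem is assumed to be an existing PEtabODEProblem that estimates k1):

petabProblemFixed = remakePEtabProblem(petabProblem, Dict(:k1 => 1.0))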
PEtab.Fides - Type
Fides

Fides is a Newton trust-region optimizer, written in Python, for box-bounded optimization problems.

It is particularly effective when the full Hessian cannot be computed, but the Gauss-Newton Hessian approximation can be computed. If constructed with Fides(verbose=true), it prints optimization progress during the process.

source
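A sketch of constructing the optimizer with progress printing, as described above; the resulting object is passed to the calibration function below:

optimizer = Fides(verbose=true)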
PEtab.callibrateModel - Function
calibrateModel(petabProblem::PEtabODEProblem,
                optimizer;
               <keyword arguments>)

Perform multi-start local optimization for a given PEtabODEProblem and return (fmin, minimizer) for all runs.

Arguments

  • petabProblem::PEtabODEProblem: The PEtabODEProblem to be calibrated.
  • optimizer: The optimizer algorithm to be used. Currently, we support three different algorithms:
    1. Fides(): The Newton trust-region Fides optimizer from Python. Please refer to the documentation for setup examples. This optimizer performs well when computing the full Hessian is not possible, and the Gauss-Newton Hessian approximation can be used.
    2. IPNewton(): The interior-point Newton method from Optim.jl. This optimizer performs well when it is computationally feasible to compute the full Hessian.
    3. LBFGS() or BFGS() from Optim.jl: These optimizers are suitable when the computation of the Gauss-Newton Hessian approximation is too expensive, such as when adjoint sensitivity analysis is required for the gradient.
  • nOptimisationStarts::Int: Number of multi-starts to be performed. Defaults to 100.
  • samplingMethod: Method for generating start guesses. Any method from QuasiMonteCarlo.jl is supported, with LatinHypercube as the default.
  • options: Optimization options. For Optim.jl optimizers, it accepts an Optim.Options struct. For Fides, please refer to the Fides documentation and the PEtab.jl documentation for information on setting options.
source
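A hedged sketch of a multi-start calibration using the keyword arguments listed above; the destructured return value names follow the "(fmin, minimizer) for all runs" description and are assumptions:

using Optim, Printf, QuasiMonteCarlo

fvals, minimizers = calibrateModel(petabProblem, IPNewton();
                                   nOptimisationStarts=5,
                                   samplingMethod=QuasiMonteCarlo.LatinHypercubeSample())
@printf("Best found value = %.3f\n", minimum(fvals))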
PEtab.runPEtabSelect - Function
runPEtabSelect(pathYAML, optimizer; <keyword arguments>)

Given a PEtab-select YAML file, performs model selection with the algorithms specified in the YAML file.

Results are written to a YAML file in the same directory as the PEtab-select YAML file.

Each candidate model produced during the model selection undergoes parameter estimation using local multi-start optimization. Three optimizers are supported: optimizer=Fides() (Fides Newton trust-region), optimizer=IPNewton() from Optim.jl, and optimizer=LBFGS() from Optim.jl. Additional keyword arguments for the optimisation are nOptimisationStarts::Int, the number of multi-starts for parameter estimation (defaults to 100), and optimizationSamplingMethod, any sampling method from QuasiMonteCarlo.jl for generating start guesses (defaults to LatinHypercubeSample). See also calibrateModel above.

Simulation options can be set using any keyword argument accepted by the createPEtabODEProblem function. For example, setting gradientMethod=:ForwardDiff specifies the use of forward-mode automatic differentiation for gradient computation. If left blank, we automatically select appropriate options based on the size of the problem.

source
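A sketch with a placeholder path for the PEtab-select YAML file; the keyword names come from the paragraphs above, and treating the return value as the path to the results YAML is an assumption:

pathSave = runPEtabSelect("path/to/petab_select_problem.yaml", Fides();
                          nOptimisationStarts=100,
                          gradientMethod=:ForwardDiff)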
PEtab.solveSBMLModel - Function

solveSBMLModel(pathSBML, solver, timeSpan; abstol=1e-8, reltol=1e-8, saveat=Float64[], verbose=true)

Solve an ODE SBML model at the values reported in the SBML file over the specified time span (t0::Float, tend::Float).

Solvers from the OrdinaryDiffEq.jl package are supported. If you want to save the ODE solution at specific time-points, e.g., [1.0, 3.0], provide the saveat argument as saveat=[1.0, 3.0]. The output is provided in the format of OrdinaryDiffEq.jl. The Julia model files are saved in the same directory as the SBML file, in a subdirectory named "SBML".

Note

This function is primarily intended for testing the SBML importer.

source
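A sketch matching the signature above (the SBML path is a placeholder):

using OrdinaryDiffEq

sol = solveSBMLModel("path/to/model.xml", Rodas5P(), (0.0, 10.0); saveat=[1.0, 3.0])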
Bachmann · PEtab.jl

@printf("First element in the gradient = %.2e\n", gradient[1])
@printf("First element in the Gauss-Newton Hessian = %.2f\n", hessian[1, 1])
Cost for Bachmann = -418.41
 First element in the gradient = -1.85e-03
First element in the Gauss-Newton Hessian = 584.10
Beer · PEtab.jl

@printf("First element in the gradient = %.2e\n", gradient[1])
@printf("First element in the hessian = %.2f\n", hessian[1, 1])
Cost = -58622.91
 First element in the gradient = 7.17e-02
First element in the hessian = 755266.33
Beer_julia_import · PEtab.jl

pathYaml = joinpath(@__DIR__, "Beer", "Beer_MolBioSystems2014.yaml")
pathJuliaFile = joinpath(@__DIR__, "Beer", "Julia_import_files", "Beer_Julia_Import.jl")
petabModel = readPEtabModel(pathYaml, verbose=true, jlFilePath=pathJuliaFile)
PEtabModel for model Beer. ODE-system has 4 states and 9 parameters.
Generated Julia files are at ...

Moving forward, you can use the imported model just like any other model imported from an SBML file. To get an idea of how to use the petabModel to compute the cost, gradient, or Hessian for an ODE parameter estimation problem, please refer to the tutorial for the Beer model.

Best_options · PEtab.jl

                               odeSolverOptions=ODESolverOptions(CVODE_BDF(), abstol=1e-8, reltol=1e-8),
                               odeSolverGradientOptions=ODESolverOptions(CVODE_BDF(), abstol=1e-8, reltol=1e-8),
                               gradientMethod=:Adjoint,
                               sensealg=InterpolatingAdjoint())

Boehm · PEtab.jl

@printf("First element in the gradient = %.2e\n", gradient[1])
@printf("First element in the hessian = %.2f\n", hessian[1, 1])
Cost = 138.22
 First element in the gradient = 2.20e-02
First element in the hessian = 2199.49

Where to go from here

Next, we suggest you take a look at the Choosing best options for a PEtab problem guide. Additionally, we recommend exploring the Supported gradient and hessian methods section. If you want to provide your model as a Julia file instead of an SBML file, take a look at Providing a model as a Julia file instead of an SBML File.

Brannmark · PEtab.jl

petabProblem.computeGradient!(gradient, p)
@printf("Cost = %.2f\n", cost)
@printf("First element in the gradient = %.2e\n", gradient[1])
Cost = 141.89
First element in the gradient = 2.70e-03

Some useful notes regarding the steady-state solver:

Gradient_hessian_support · PEtab.jl

\[\begin{bmatrix}
H_{p} & \mathbf{0} \\
\mathbf{0} & \mathbf{H}_q
\end{bmatrix}\]

Model_selection · PEtab.jl

[ Info: Callibrating model M1_6
[ Info: Model selection round 4 with 1 candidates
[ Info: Callibrating model M1_7
[ Info: Saving results for best model at 0002/PEtab_select_forward_AIC.yaml

The YAML file storing the model selection results will be saved at pathSave.

To run the code, you will need the PEtab files, which you can find here. You can also find a fully runnable example of this tutorial here.

Parameter_estimation · PEtab.jl

                nOptimisationStarts=5,
                samplingMethod=QuasiMonteCarlo.LatinHypercubeSample(),
                options=py"{'maxiter' : 200}"o)
@printf("Best found value = %.3f\n", minimum(fvals))
Best found value = 147.544

Please note that since Fides is a Python package, when providing options, they must be in the form of a Python dictionary using the py"..." string.

+@printf("Best found value = %.3f\n", minimum(fvals))
Best found value = 147.544

Please note that since Fides is a Python package, when providing options, they must be in the form of a Python dictionary using the py"..." string.

Home · PEtab.jl

PEtab.jl

This is the documentation of PEtab.jl, a Julia package designed to import ODE parameter estimation problems specified in the PEtab format into Julia.

PEtab.jl uses Julia's DifferentialEquations.jl package for ODE solvers and ModelingToolkit.jl for symbolic model processing, which enables fast model simulations. This, combined with support for gradients via forward- and adjoint-sensitivity approaches and Hessians via both exact and approximate methods, allows for efficient parameter estimation for both small and large models. In an extensive benchmark study, PEtab.jl was found to be 2-4 times faster than the pyPESTO toolbox that leverages the AMICI interface to the Sundials suite.

This documentation includes:

  • A guide for getting started with PEtab.jl
  • Tutorials on medium-sized models, small models with several condition-specific parameters, models with pre-equilibration conditions (steady-state simulations), how to define and import a model written in Julia, how to perform parameter estimation, and how to perform model selection with PEtab-select.
  • Details about available hessian and gradient options.
  • Discussion of the best options for specific model types, including small, medium, and large models.

Installation

PEtab.jl can be installed via

julia> ] add PEtab

or alternatively via

julia> using Pkg; Pkg.add("PEtab")

Feature list

PEtab.jl provides a range of features to import and analyze ODE parameter estimation problems specified in the PEtab format. These include:

  • Importing ODE systems specified either by an SBML file or as a Julia file.
  • Model selection via PEtab Select.
  • Symbolic model pre-processing via ModelingToolkit.jl.
  • Support for all ODE solvers in DifferentialEquations.jl.
  • Gradient calculations using several approaches:
    • Forward-mode automatic differentiation with ForwardDiff.jl.
    • Forward sensitivity analysis with ForwardDiff.jl or SciMLSensitivity.jl.
    • Adjoint sensitivity analysis with any of the algorithms in SciMLSensitivity.jl.
    • Automatic differentiation via Zygote.jl.
  • Hessians computed via:
    • Forward-mode automatic differentiation with ForwardDiff.jl (exact).
    • Block approach with ForwardDiff.jl (approximate).
    • Gauss-Newton method (approximate and often more performant than (L)-BFGS).
  • Handling pre-equilibration and pre-simulation conditions.
  • Support for models with discrete events and logical operations.

Citation

We will soon publish a preprint you can cite if you find PEtab.jl helpful in your work.
