diff --git a/dev/.documenter-siteinfo.json b/dev/.documenter-siteinfo.json
index 0dbb9c41e7..c1a82e4321 100644
--- a/dev/.documenter-siteinfo.json
+++ b/dev/.documenter-siteinfo.json
@@ -1 +1 @@
-{"documenter":{"julia_version":"1.10.5","generation_timestamp":"2024-09-26T09:03:50","documenter_version":"1.7.0"}}
\ No newline at end of file
+{"documenter":{"julia_version":"1.10.5","generation_timestamp":"2024-09-26T16:19:18","documenter_version":"1.7.0"}}
\ No newline at end of file
diff --git a/dev/api/index.html b/dev/api/index.html
index d83584e388..3cbc91f2c5 100644
--- a/dev/api/index.html
+++ b/dev/api/index.html
@@ -1,5 +1,5 @@
-API reference · Enzyme.jl

API reference

Types and constants

Functions and macros

Documentation

Enzyme.@import_fruleMacro
import_frule(::fn, tys...)

Automatically import a ChainRulesCore.frule as a custom forward-mode EnzymeRule. When called in batch mode, this will end up calling the primal multiple times, which may result in incorrect behavior if the function mutates, and will always produce slow code. Importing the rule from ChainRules is also likely to be slower than writing your own rule, and may also be slower than not having a rule at all.

Use with caution.

Enzyme.@import_frule(typeof(Base.sort), Any);
+API reference · Enzyme.jl

API reference

Types and constants

Functions and macros

Documentation

Enzyme.@import_fruleMacro
import_frule(::fn, tys...)

Automatically import a ChainRulesCore.frule as a custom forward-mode EnzymeRule. When called in batch mode, this will end up calling the primal multiple times, which may result in incorrect behavior if the function mutates, and will always produce slow code. Importing the rule from ChainRules is also likely to be slower than writing your own rule, and may also be slower than not having a rule at all.

Use with caution.

Enzyme.@import_frule(typeof(Base.sort), Any);
 
 x=[1.0, 2.0, 0.0]; dx=[0.1, 0.2, 0.3]; ddx = [0.01, 0.02, 0.03];
 
@@ -14,7 +14,7 @@
 (var"1" = (var"1" = [0.3, 0.1, 0.2], var"2" = [0.03, 0.01, 0.02]),)
 (var"1" = [0.3, 0.1, 0.2],)
 (var"1" = [0.0, 1.0, 2.0], var"2" = [0.3, 0.1, 0.2])
-
source
Enzyme.@import_rruleMacro
import_rrule(::fn, tys...)

Automatically import a ChainRules.rrule as a custom reverse mode EnzymeRule. When called in batch mode, this will end up calling the primal multiple times, which results in slower code. This macro assumes that the underlying function to be imported is read-only, and returns a Duplicated or Const object. This macro also assumes that the inputs permit a .+= operation and that the output has a valid Enzyme.make_zero function defined. It also assumes that overwritten(x) accurately describes if there is any non-preserved data from forward to reverse, not just the outermost data structure being overwritten as provided by the specification.

Finally, this macro falls back to almost always caching all of the inputs, even if it may not be needed for the derivative computation.

As a result, this auto importer is also likely to be slower than writing your own rule, and may also be slower than not having a rule at all.

Use with caution.

Enzyme.@import_rrule(typeof(Base.sort), Any);
source
Enzyme.gradient!Method
gradient!(::ReverseMode, dx, f, x)

Compute the gradient of an array-input function f using reverse mode, storing the derivative result in an existing array dx. Both x and dx must be Arrays of the same type.

Example:

f(x) = x[1]*x[2]
+
source
Enzyme.@import_rruleMacro
import_rrule(::fn, tys...)

Automatically import a ChainRules.rrule as a custom reverse mode EnzymeRule. When called in batch mode, this will end up calling the primal multiple times, which results in slower code. This macro assumes that the underlying function to be imported is read-only, and returns a Duplicated or Const object. This macro also assumes that the inputs permit a .+= operation and that the output has a valid Enzyme.make_zero function defined. It also assumes that overwritten(x) accurately describes if there is any non-preserved data from forward to reverse, not just the outermost data structure being overwritten as provided by the specification.

Finally, this macro falls back to almost always caching all of the inputs, even if it may not be needed for the derivative computation.

As a result, this auto importer is also likely to be slower than writing your own rule, and may also be slower than not having a rule at all.

Use with caution.

Enzyme.@import_rrule(typeof(Base.sort), Any);
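As a hedged usage sketch (assuming the import above has been evaluated; not taken from the docstring), the imported rule is then used when reverse-mode differentiating through Base.sort:

x = [1.0, 2.0, 0.0]; dx = zero(x);

Enzyme.autodiff(Reverse, x -> sum(sort(x)), Active, Duplicated(x, dx))
# dx now holds the gradient of sum(sort(x)) with respect to x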
source
Enzyme.gradient!Method
gradient!(::ReverseMode, dx, f, x)

Compute the gradient of an array-input function f using reverse mode, storing the derivative result in an existing array dx. Both x and dx must be Arrays of the same type.

Example:

f(x) = x[1]*x[2]
 
 dx = [0.0, 0.0]
 gradient!(Reverse, dx, f, [2.0, 3.0])
@@ -24,7 +24,7 @@
 gradient!(ReverseWithPrimal, dx, f, [2.0, 3.0])
 
 # output
-(derivs = ([3.0, 2.0],), val = 6.0)
source
Enzyme.gradientMethod
gradient(::ForwardMode, f, x; shadows=onehot(x), chunk=nothing)

Compute the gradient of an array-input function f using forward mode. The optional keyword argument shadows is a vector of one-hot vectors of the same type as x, which are used to forward-propagate into the return. For performance reasons, this should be computed once, outside the call to gradient, rather than within this call.

Example:

f(x) = x[1]*x[2]
+(derivs = ([3.0, 2.0],), val = 6.0)
source
Enzyme.gradientMethod
gradient(::ForwardMode, f, x; shadows=onehot(x), chunk=nothing)

Compute the gradient of an array-input function f using forward mode. The optional keyword argument shadows is a vector of one-hot vectors of the same type as x, which are used to forward-propagate into the return. For performance reasons, this should be computed once, outside the call to gradient, rather than within this call.

Example:

f(x) = x[1]*x[2]
 
 gradient(Forward, f, [2.0, 3.0])
 
@@ -45,7 +45,7 @@
 grad = gradient(Forward, f, [2.0, 3.0, 4.0])
 
 # output
-([3.0 2.0 0.0; 0.0 1.0 1.0],)
source
Enzyme.gradientMethod
gradient(::ReverseMode, f, args...)

Compute the gradient of a real-valued function f using reverse mode. For each differentiable argument, this function will allocate and return a new derivative object, returning a tuple of derivatives for each argument. If an argument is not differentiable, the corresponding element of the returned tuple will be nothing.

In reverse mode (here), the derivatives will be the same type as the original argument.

This is a structure gradient. For a struct x it returns another instance of the same type, whose fields contain the components of the gradient. In the result, grad.a contains ∂f/∂x.a for any differentiable x.a, while grad.c == x.c for other types.

Examples:

f(x) = x[1]*x[2]
+([3.0 2.0 0.0; 0.0 1.0 1.0],)
source
Enzyme.gradientMethod
gradient(::ReverseMode, f, args...)

Compute the gradient of a real-valued function f using reverse mode. For each differentiable argument, this function will allocate and return a new derivative object, returning a tuple of derivatives for each argument. If an argument is not differentiable, the corresponding element of the returned tuple will be nothing.

In reverse mode (here), the derivatives will be the same type as the original argument.

This is a structure gradient. For a struct x it returns another instance of the same type, whose fields contain the components of the gradient. In the result, grad.a contains ∂f/∂x.a for any differentiable x.a, while grad.c == x.c for other types.

Examples:

f(x) = x[1]*x[2]
 
 grad = gradient(Reverse, f, [2.0, 3.0])
 
@@ -74,7 +74,7 @@
 (derivs = ([3.0], [2.0]), val = 6.0)
grad = gradient(ReverseWithPrimal, mul, [2.0], Const([3.0]))
 
 # output
-(derivs = ([3.0], nothing), val = 6.0)
source
Enzyme.hvp!Method
hvp!(res::X, f::F, x::X, v::X) where {F, X}

Compute an in-place Hessian-vector product of an array-input scalar-output function f, as evaluated at x times the vector v. The result will be stored into res. The function still allocates and zeros a buffer to store the intermediate gradient, which is not returned to the user.

In other words, compute res .= hessian(f)(x) * v

See hvp_and_gradient! for a function to compute both the hvp and the gradient in a single call.

Example:

f(x) = sin(x[1] * x[2])
+(derivs = ([3.0], nothing), val = 6.0)
source
Enzyme.hvp!Method
hvp!(res::X, f::F, x::X, v::X) where {F, X}

Compute an in-place Hessian-vector product of an array-input scalar-output function f, as evaluated at x times the vector v. The result will be stored into res. The function still allocates and zeros a buffer to store the intermediate gradient, which is not returned to the user.

In other words, compute res .= hessian(f)(x) * v

See hvp_and_gradient! for a function to compute both the hvp and the gradient in a single call.

Example:

f(x) = sin(x[1] * x[2])
 
 res = Vector{Float64}(undef, 2)
 hvp!(res, f, [2.0, 3.0], [5.0, 2.7])
@@ -83,14 +83,14 @@
 # output
 2-element Vector{Float64}:
  19.6926882637302
- 16.201003759768003
source
Enzyme.hvpMethod
hvp(f::F, x::X, v::X) where {F, X}

Compute the Hessian-vector product of an array-input scalar-output function f, as evaluated at x times the vector v.

In other words, compute hessian(f)(x) * v

See hvp! for a version which stores the result in an existing buffer and also hvp_and_gradient! for a function to compute both the hvp and the gradient in a single call.

Example:

f(x) = sin(x[1] * x[2])
+ 16.201003759768003
source
Enzyme.hvpMethod
hvp(f::F, x::X, v::X) where {F, X}

Compute the Hessian-vector product of an array-input scalar-output function f, as evaluated at x times the vector v.

In other words, compute hessian(f)(x) * v

See hvp! for a version which stores the result in an existing buffer and also hvp_and_gradient! for a function to compute both the hvp and the gradient in a single call.

Example:

f(x) = sin(x[1] * x[2])
 
 hvp(f, [2.0, 3.0], [5.0, 2.7])
 
 # output
 2-element Vector{Float64}:
  19.6926882637302
- 16.201003759768003
source
Enzyme.hvp_and_gradient!Method
hvp_and_gradient!(res::X, grad::X, f::F, x::X, v::X) where {F, X}

Compute an in-place Hessian-vector product of an array-input scalar-output function f, as evaluated at x times the vector v, as well as the gradient, storing the gradient into grad. Both the Hessian-vector product and the gradient can be computed together more efficiently than computing them separately.

The result will be stored into res. The gradient will be stored into grad.

In other words, compute res .= hessian(f)(x) * v and grad .= gradient(Reverse, f)(x)

Example:

f(x) = sin(x[1] * x[2])
+ 16.201003759768003
source
Enzyme.hvp_and_gradient!Method
hvp_and_gradient!(res::X, grad::X, f::F, x::X, v::X) where {F, X}

Compute an in-place Hessian-vector product of an array-input scalar-output function f, as evaluated at x times the vector v, as well as the gradient, storing the gradient into grad. Both the Hessian-vector product and the gradient can be computed together more efficiently than computing them separately.

The result will be stored into res. The gradient will be stored into grad.

In other words, compute res .= hessian(f)(x) * v and grad .= gradient(Reverse, f)(x)

Example:

f(x) = sin(x[1] * x[2])
 
 res = Vector{Float64}(undef, 2)
 grad = Vector{Float64}(undef, 2)
@@ -101,7 +101,7 @@
 # output
 2-element Vector{Float64}:
  2.880510859951098
- 1.920340573300732
source
Enzyme.jacobianMethod
jacobian(::ForwardMode, args...; kwargs...)

Equivalent to gradient(::ForwardMode, args...; kwargs...)

source
Enzyme.jacobianMethod
jacobian(::ReverseMode, f, x; n_outs=nothing, chunk=nothing)
+ 1.920340573300732
source
Enzyme.jacobianMethod
jacobian(::ForwardMode, args...; kwargs...)

Equivalent to gradient(::ForwardMode, args...; kwargs...)
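A brief sketch mirroring the forward-mode gradient example above (not part of the original docstring):

f(x) = [ x[1] * x[2], x[2] + x[3] ]

jacobian(Forward, f, [2.0, 3.0, 4.0])

# output

([3.0 2.0 0.0; 0.0 1.0 1.0],)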

source
Enzyme.jacobianMethod
jacobian(::ReverseMode, f, x; n_outs=nothing, chunk=nothing)
 jacobian(::ReverseMode, f, x)

Compute the Jacobian of an array-output function f using (potentially vector) reverse mode. The chunk argument optionally denotes the chunk size to use and n_outs optionally denotes the shape of the array returned by f (e.g. size(f(x))).

Example:

f(x) = [ x[1] * x[2], x[2] + x[3] ]
 
 jacobian(Reverse, f, [2.0, 3.0, 4.0])
@@ -122,7 +122,7 @@
 grad = jacobian(ReverseWithPrimal, f, [2.0, 3.0, 4.0], n_outs=Val((2,)))
 
 # output
-(derivs = ([3.0 2.0 0.0; 0.0 1.0 1.0],), val = [6.0, 7.0])

This function will return an AbstractArray whose shape is (size(output)..., size(input)...). No guarantees are presently made about the type of the AbstractArray returned by this function (which may or may not be the same as the input AbstractArray if provided).

In the future, when this function is extended to handle non-array return types, it will return an AbstractArray of shape size(output) of values of the input type.

source
Enzyme.typetreeFunction
function typetree(T, ctx, dl, seen=TypeTreeTable())

Construct an Enzyme typetree from a Julia type.

Warning

When using a memoized lookup by providing seen across multiple calls to typetree, the user must call copy on the returned value before mutating it.

source
Enzyme.unsafe_to_pointerMethod
unsafe_to_pointer
Warning

Assumes that val is globally rooted and a pointer to it can be leaked. Prefer pointer_from_objref. The only uses inside Enzyme.jl should be for Types.

source
EnzymeCore.autodiffMethod
autodiff(::ForwardMode, f, Activity, args::Annotation...)

Auto-differentiate function f at arguments args using forward mode.

args may be numbers, arrays, structs of numbers, structs of arrays, and so on. Enzyme will only differentiate with respect to arguments that are wrapped in a Duplicated or similar annotation. Unlike reverse-mode autodiff, Active arguments are not allowed here, since all derivative results of immutable objects are returned; use Duplicated or variants like DuplicatedNoNeed instead.

Activity is the Activity of the return value, it may be:

  • Const if the return is not to be differentiated with respect to
  • Duplicated, if the return is being differentiated with respect to
  • BatchDuplicated, like Duplicated, but computing multiple derivatives at once. All batch sizes must be the same for all arguments.

Example returning both original return and derivative:

f(x) = x*x
+(derivs = ([3.0 2.0 0.0; 0.0 1.0 1.0],), val = [6.0, 7.0])

This function will return an AbstractArray whose shape is (size(output)..., size(input)...). No guarantees are presently made about the type of the AbstractArray returned by this function (which may or may not be the same as the input AbstractArray if provided).

In the future, when this function is extended to handle non-array return types, it will return an AbstractArray of shape size(output) of values of the input type.

source
Enzyme.typetreeFunction
function typetree(T, ctx, dl, seen=TypeTreeTable())

Construct an Enzyme typetree from a Julia type.

Warning

When using a memoized lookup by providing seen across multiple calls to typetree, the user must call copy on the returned value before mutating it.

source
Enzyme.unsafe_to_pointerMethod
unsafe_to_pointer
Warning

Assumes that val is globally rooted and a pointer to it can be leaked. Prefer pointer_from_objref. The only uses inside Enzyme.jl should be for Types.

source
EnzymeCore.autodiffMethod
autodiff(::ForwardMode, f, Activity, args::Annotation...)

Auto-differentiate function f at arguments args using forward mode.

args may be numbers, arrays, structs of numbers, structs of arrays, and so on. Enzyme will only differentiate with respect to arguments that are wrapped in a Duplicated or similar annotation. Unlike reverse-mode autodiff, Active arguments are not allowed here, since all derivative results of immutable objects are returned; use Duplicated or variants like DuplicatedNoNeed instead.

Activity is the Activity of the return value, it may be:

  • Const if the return is not to be differentiated with respect to
  • Duplicated, if the return is being differentiated with respect to
  • BatchDuplicated, like Duplicated, but computing multiple derivatives at once. All batch sizes must be the same for all arguments.

Example returning both original return and derivative:

f(x) = x*x
 res, ∂f_∂x = autodiff(ForwardWithPrimal, f, Duplicated, Duplicated(3.14, 1.0))
 
 # output
@@ -132,7 +132,7 @@
 
 # output
 
-(6.28,)
source
EnzymeCore.autodiffMethod
autodiff(::ReverseMode, f, Activity, args::Annotation...)

Auto-differentiate function f at arguments args using reverse mode.

Limitations:

  • f may only return a Real (of a built-in/primitive type) or nothing, not an array, struct, BigFloat, etc. To handle vector-valued return types, use a mutating f! that returns nothing and stores its return value in one of the arguments, which must be wrapped in a Duplicated.

args may be numbers, arrays, structs of numbers, structs of arrays, and so on. Enzyme will only differentiate with respect to arguments that are wrapped in an Active (for arguments whose derivative result must be returned rather than mutated in place, such as primitive types and structs thereof) or Duplicated (for mutable arguments like arrays, Refs and structs thereof).

Activity is the Activity of the return value, it may be Const or Active.

Example:

a = 4.2
+(6.28,)
source
EnzymeCore.autodiffMethod
autodiff(::ReverseMode, f, Activity, args::Annotation...)

Auto-differentiate function f at arguments args using reverse mode.

Limitations:

  • f may only return a Real (of a built-in/primitive type) or nothing, not an array, struct, BigFloat, etc. To handle vector-valued return types, use a mutating f! that returns nothing and stores its return value in one of the arguments, which must be wrapped in a Duplicated.

args may be numbers, arrays, structs of numbers, structs of arrays, and so on. Enzyme will only differentiate with respect to arguments that are wrapped in an Active (for arguments whose derivative result must be returned rather than mutated in place, such as primitive types and structs thereof) or Duplicated (for mutable arguments like arrays, Refs and structs thereof).

Activity is the Activity of the return value, it may be Const or Active.

Example:

a = 4.2
 b = [2.2, 3.3]; ∂f_∂b = zero(b)
 c = 55; d = 9
 
@@ -145,13 +145,13 @@
 
 # output
 
-((6.0,), 9.0)
Note

Enzyme gradients with respect to integer values are zero. Active will automatically convert plain integers to floating point values, but cannot do so for integer values in tuples and structs.

source
EnzymeCore.autodiffMethod
autodiff(::Function, ::Mode, args...)

Specialization of autodiff to handle do argument closures.


+((6.0,), 9.0)
Note

Enzyme gradients with respect to integer values are zero. Active will automatically convert plain integers to floating point values, but cannot do so for integer values in tuples and structs.

source
EnzymeCore.autodiffMethod
autodiff(::Function, ::Mode, args...)

Specialization of autodiff to handle do argument closures.


 autodiff(Reverse, Active(3.1)) do x
   return x*x
 end
 
 # output
-((6.2,),)
source
EnzymeCore.autodiff_deferredMethod
autodiff_deferred(::ForwardMode, f, Activity, args::Annotation...)

Same as autodiff(::ForwardMode, f, Activity, args...) but uses deferred compilation to support usage in GPU code, as well as high-order differentiation.

source
EnzymeCore.autodiff_deferredMethod
autodiff_deferred(::ReverseMode, f, Activity, args::Annotation...)

Same as autodiff but uses deferred compilation to support usage in GPU code, as well as high-order differentiation.

source
EnzymeCore.autodiff_deferred_thunkMethod
autodiff_deferred_thunk(::ReverseModeSplit, ftype, Activity, argtypes::Type{<:Annotation}...)

Provide the split forward and reverse pass functions for annotated function type ftype when called with args of type argtypes when using reverse mode.

Activity is the Activity of the return value, it may be Const, Active, or Duplicated (or its variants DuplicatedNoNeed, BatchDuplicated, and BatchDuplicatedNoNeed).

The forward function will return a tape, the primal (or nothing if not requested), and the shadow (or nothing if not a Duplicated variant), and tapes the corresponding type arguments provided.

The reverse function will return the derivative of Active arguments, updating the Duplicated arguments in place. The same arguments to the forward pass should be provided, followed by the adjoint of the return (if the return is active), and finally the tape from the forward pass.

Example:


+((6.2,),)
source
EnzymeCore.autodiff_deferredMethod
autodiff_deferred(::ForwardMode, f, Activity, args::Annotation...)

Same as autodiff(::ForwardMode, f, Activity, args...) but uses deferred compilation to support usage in GPU code, as well as high-order differentiation.

source
EnzymeCore.autodiff_deferredMethod
autodiff_deferred(::ReverseMode, f, Activity, args::Annotation...)

Same as autodiff but uses deferred compilation to support usage in GPU code, as well as high-order differentiation.

source
EnzymeCore.autodiff_deferred_thunkMethod
autodiff_deferred_thunk(::ReverseModeSplit, ftype, Activity, argtypes::Type{<:Annotation}...)

Provide the split forward and reverse pass functions for annotated function type ftype when called with args of type argtypes when using reverse mode.

Activity is the Activity of the return value, it may be Const, Active, or Duplicated (or its variants DuplicatedNoNeed, BatchDuplicated, and BatchDuplicatedNoNeed).

The forward function will return a tape, the primal (or nothing if not requested), and the shadow (or nothing if not a Duplicated variant), and tapes the corresponding type arguments provided.

The reverse function will return the derivative of Active arguments, updating the Duplicated arguments in place. The same arguments to the forward pass should be provided, followed by the adjoint of the return (if the return is active), and finally the tape from the forward pass.

Example:


 A = [2.2]; ∂A = zero(A)
 v = 3.3
 
@@ -171,7 +171,7 @@
 
 # output
 
-(7.26, 2.2, [3.3])
source
EnzymeCore.autodiff_thunkMethod
autodiff_thunk(::ForwardMode, ftype, Activity, argtypes::Type{<:Annotation}...)

Provide the thunk forward mode function for annotated function type ftype when called with args of type argtypes.

Activity is the Activity of the return value; it may be Const or Duplicated (or its variants DuplicatedNoNeed, BatchDuplicated, and BatchDuplicatedNoNeed).

The forward function will return the primal (if requested) and the shadow (or nothing if not a Duplicated variant).

Example returning both the return derivative and original return:

a = 4.2
+(7.26, 2.2, [3.3])
source
EnzymeCore.autodiff_thunkMethod
autodiff_thunk(::ForwardMode, ftype, Activity, argtypes::Type{<:Annotation}...)

Provide the thunk forward mode function for annotated function type ftype when called with args of type argtypes.

Activity is the Activity of the return value; it may be Const or Duplicated (or its variants DuplicatedNoNeed, BatchDuplicated, and BatchDuplicatedNoNeed).

The forward function will return the primal (if requested) and the shadow (or nothing if not a Duplicated variant).

Example returning both the return derivative and original return:

a = 4.2
 b = [2.2, 3.3]; ∂f_∂b = zero(b)
 c = 55; d = 9
 
@@ -191,7 +191,7 @@
 
 # output
 
-(6.28,)
source
EnzymeCore.autodiff_thunkMethod
autodiff_thunk(::ReverseModeSplit, ftype, Activity, argtypes::Type{<:Annotation}...)

Provide the split forward and reverse pass functions for annotated function type ftype when called with args of type argtypes when using reverse mode.

Activity is the Activity of the return value, it may be Const, Active, or Duplicated (or its variants DuplicatedNoNeed, BatchDuplicated, and BatchDuplicatedNoNeed).

The forward function will return a tape, the primal (or nothing if not requested), and the shadow (or nothing if not a Duplicated variant), and tapes the corresponding type arguments provided.

The reverse function will return the derivative of Active arguments, updating the Duplicated arguments in place. The same arguments to the forward pass should be provided, followed by the adjoint of the return (if the return is active), and finally the tape from the forward pass.

Example:


+(6.28,)
source
EnzymeCore.autodiff_thunkMethod
autodiff_thunk(::ReverseModeSplit, ftype, Activity, argtypes::Type{<:Annotation}...)

Provide the split forward and reverse pass functions for annotated function type ftype when called with args of type argtypes when using reverse mode.

Activity is the Activity of the return value, it may be Const, Active, or Duplicated (or its variants DuplicatedNoNeed, BatchDuplicated, and BatchDuplicatedNoNeed).

The forward function will return a tape, the primal (or nothing if not requested), and the shadow (or nothing if not a Duplicated variant), and tapes the corresponding type arguments provided.

The reverse function will return the derivative of Active arguments, updating the Duplicated arguments in place. The same arguments to the forward pass should be provided, followed by the adjoint of the return (if the return is active), and finally the tape from the forward pass.

Example:


 A = [2.2]; ∂A = zero(A)
 v = 3.3
 
@@ -210,24 +210,24 @@
 
 # output
 
-(7.26, 2.2, [3.3])
source
EnzymeCore.ActiveType
Active(x)

Mark a function argument x of autodiff as active; Enzyme will auto-differentiate with respect to Active arguments.

Note

Enzyme gradients with respect to integer values are zero. Active will automatically convert plain integers to floating point values, but cannot do so for integer values in tuples and structs.

source
EnzymeCore.BatchDuplicatedType
BatchDuplicated(x, ∂f_∂xs)

Like Duplicated, except it contains several shadows to compute derivatives for all of them at once. Argument ∂f_∂xs should be a tuple of several shadow values of the same type as x.

source
EnzymeCore.ConstType
Const(x)

Mark a function argument x of autodiff as constant; Enzyme will not auto-differentiate with respect to Const arguments.

source
EnzymeCore.DuplicatedType
Duplicated(x, ∂f_∂x)

Mark a function argument x of autodiff as duplicated; Enzyme will auto-differentiate with respect to such arguments, with ∂f_∂x acting as an accumulator for gradients (so $\partial f / \partial x$ will be added to ∂f_∂x).

source
EnzymeCore.DuplicatedNoNeedType
DuplicatedNoNeed(x, ∂f_∂x)

Like Duplicated, except also specifies that Enzyme may avoid computing the original result and only compute the derivative values.

This should only be used if x is a write-only variable. Otherwise, if the differentiated function stores values in x and reads them back in subsequent computations, using DuplicatedNoNeed may result in incorrect derivatives. In particular, DuplicatedNoNeed should not be used for preallocated workspace, even if the user might not care about its final value, as marking a variable as NoNeed means that reads from the variable are now undefined.

source
EnzymeCore.FFIABIType
struct FFIABI <: ABI

Foreign function call ABI. JIT the differentiated function, then inttoptr call the address.

source
EnzymeCore.ForwardModeType
struct Forward{ReturnPrimal, ABI, ErrIfFuncWritten,RuntimeActivity} <: Mode{ABI, ErrIfFuncWritten, RuntimeActivity}

Forward mode differentiation

source
EnzymeCore.MixedDuplicatedType
MixedDuplicated(x, ∂f_∂x)

Like Duplicated, except x may contain both active [immutable] and duplicated [mutable] data which is differentiable. Only used within custom rules.

source
EnzymeCore.ModeType
abstract type Mode

Abstract type for what differentiation mode will be used.

source
EnzymeCore.ReverseModeType
struct ReverseMode{ReturnPrimal,RuntimeActivity,ABI,Holomorphic,ErrIfFuncWritten} <: Mode{ABI, ErrIfFuncWritten, RuntimeActivity}

Reverse mode differentiation.

  • ReturnPrimal: Should Enzyme return the primal return value from the augmented-forward.
  • RuntimeActivity: Should Enzyme enable runtime activity (default off)
  • ABI: What runtime ABI to use
  • Holomorphic: Whether the complex result function is holomorphic and we should compute d/dz
  • ErrIfFuncWritten: Should Enzyme err if the function differentiated is a closure and written to.
source
EnzymeCore.ReverseModeSplitType
struct ReverseModeSplit{ReturnPrimal,ReturnShadow,RuntimeActivity,Width,ModifiedBetween,ABI} <: Mode{ABI,ErrIfFuncWritten,RuntimeActivity}

Reverse mode differentiation.

  • ReturnPrimal: Should Enzyme return the primal return value from the augmented-forward.
  • ReturnShadow: Should Enzyme return the shadow return value from the augmented-forward.
  • RuntimeActivity: Should Enzyme differentiate with runtime activity on (default off).
  • Width: Batch Size (0 if to be automatically derived)
  • ModifiedBetween: Tuple of each argument's modified between state (true if to be automatically derived).
source
EnzymeCore.compiler_job_from_backendFunction
compiler_job_from_backend(::KernelAbstractions.Backend, F::Type, TT:Type)::GPUCompiler.CompilerJob

Returns a GPUCompiler CompilerJob from a backend as specified by the first argument to the function.

For example, in CUDA one would do:

function EnzymeCore.compiler_job_from_backend(::CUDABackend, @nospecialize(F::Type), @nospecialize(TT::Type))
+(7.26, 2.2, [3.3])
source
EnzymeCore.ActiveType
Active(x)

Mark a function argument x of autodiff as active; Enzyme will auto-differentiate with respect to Active arguments.

Note

Enzyme gradients with respect to integer values are zero. Active will automatically convert plain integers to floating point values, but cannot do so for integer values in tuples and structs.
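A minimal illustrative call (not from the original docstring):

autodiff(Reverse, x -> x * x, Active, Active(3.0))

# output

((6.0,),)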

source
EnzymeCore.BatchDuplicatedType
BatchDuplicated(x, ∂f_∂xs)

Like Duplicated, except it contains several shadows to compute derivatives for all of them at once. Argument ∂f_∂xs should be a tuple of several shadow values of the same type as x.
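A hedged sketch (the function g and the one-hot seeds are illustrative, not from the source):

g(x) = x[1] * x[2]

x = [2.0, 3.0]

autodiff(Forward, g, BatchDuplicated, BatchDuplicated(x, ([1.0, 0.0], [0.0, 1.0])))
# both shadows are propagated in a single call; each component of the batched result corresponds to one seed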

source
EnzymeCore.ConstType
Const(x)

Mark a function argument x of autodiff as constant; Enzyme will not auto-differentiate with respect to Const arguments.
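A small illustrative call (not from the original docstring); the Const argument is held fixed, so no derivative is produced for it:

autodiff(Reverse, *, Active, Active(2.0), Const(3.0))
# derivative 3.0 with respect to the first argument; nothing for the Const argument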

source
EnzymeCore.DuplicatedType
Duplicated(x, ∂f_∂x)

Mark a function argument x of autodiff as duplicated; Enzyme will auto-differentiate with respect to such arguments, with ∂f_∂x acting as an accumulator for gradients (so $\partial f / \partial x$ will be added to ∂f_∂x).
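A usage sketch (the function h is illustrative): the shadow array accumulates the gradient during the reverse pass:

h(x) = x[1] * x[2]

x = [2.0, 3.0]; dx = zero(x)

autodiff(Reverse, h, Active, Duplicated(x, dx))
# dx ≈ [3.0, 2.0] afterwards, since ∂f/∂x is added into the shadow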

source
EnzymeCore.DuplicatedNoNeedType
DuplicatedNoNeed(x, ∂f_∂x)

Like Duplicated, except also specifies that Enzyme may avoid computing the original result and only compute the derivative values.

This should only be used if x is a write-only variable. Otherwise, if the differentiated function stores values in x and reads them back in subsequent computations, using DuplicatedNoNeed may result in incorrect derivatives. In particular, DuplicatedNoNeed should not be used for preallocated workspace, even if the user might not care about its final value, as marking a variable as NoNeed means that reads from the variable are now undefined.

source
EnzymeCore.FFIABIType
struct FFIABI <: ABI

Foreign function call ABI. JIT the differentiated function, then inttoptr call the address.

source
EnzymeCore.ForwardModeType
struct Forward{ReturnPrimal, ABI, ErrIfFuncWritten,RuntimeActivity} <: Mode{ABI, ErrIfFuncWritten, RuntimeActivity}

Forward mode differentiation

source
EnzymeCore.MixedDuplicatedType
MixedDuplicated(x, ∂f_∂x)

Like Duplicated, except x may contain both active [immutable] and duplicated [mutable] data which is differentiable. Only used within custom rules.

source
EnzymeCore.ModeType
abstract type Mode

Abstract type for what differentiation mode will be used.

source
EnzymeCore.ReverseModeType
struct ReverseMode{ReturnPrimal,RuntimeActivity,ABI,Holomorphic,ErrIfFuncWritten} <: Mode{ABI, ErrIfFuncWritten, RuntimeActivity}

Reverse mode differentiation.

  • ReturnPrimal: Should Enzyme return the primal return value from the augmented-forward.
  • RuntimeActivity: Should Enzyme enable runtime activity (default off)
  • ABI: What runtime ABI to use
  • Holomorphic: Whether the complex result function is holomorphic and we should compute d/dz
  • ErrIfFuncWritten: Should Enzyme err if the function differentiated is a closure and written to.
source
EnzymeCore.ReverseModeSplitType
struct ReverseModeSplit{ReturnPrimal,ReturnShadow,RuntimeActivity,Width,ModifiedBetween,ABI} <: Mode{ABI,ErrIfFuncWritten,RuntimeActivity}

Reverse mode differentiation.

  • ReturnPrimal: Should Enzyme return the primal return value from the augmented-forward.
  • ReturnShadow: Should Enzyme return the shadow return value from the augmented-forward.
  • RuntimeActivity: Should Enzyme differentiate with runtime activity on (default off).
  • Width: Batch Size (0 if to be automatically derived)
  • ModifiedBetween: Tuple of each argument's modified between state (true if to be automatically derived).
source
EnzymeCore.compiler_job_from_backendFunction
compiler_job_from_backend(::KernelAbstractions.Backend, F::Type, TT:Type)::GPUCompiler.CompilerJob

Returns a GPUCompiler CompilerJob from a backend as specified by the first argument to the function.

For example, in CUDA one would do:

function EnzymeCore.compiler_job_from_backend(::CUDABackend, @nospecialize(F::Type), @nospecialize(TT::Type))
     mi = GPUCompiler.methodinstance(F, TT)
     return GPUCompiler.CompilerJob(mi, CUDA.compiler_config(CUDA.device()))
-end
source
EnzymeCore.make_zeroFunction
make_zero(::Type{T}, seen::IdDict, prev::T, ::Val{copy_if_inactive}=Val(false))::T
 
 Recursively make a zeroed copy of the value `prev` of type `T`. The argument `copy_if_inactive` specifies
-what to do if the type `T` is guaranteed to be inactive: use the primal (the default) or still copy the value.
source
EnzymeCore.make_zero!Function
make_zero!(val::T, seen::IdSet{Any}=IdSet())::Nothing
-
-Recursively set a variable's differentiable fields to zero. Only applicable to mutable types `T`.
source
EnzymeCore.EnzymeRules.AugmentedReturnType
AugmentedReturn(primal, shadow, tape)

Augment the primal return value of a function with its shadow, as well as any additional information needed to correctly compute the reverse pass, stored in tape.

Unless specified by the config that a variable is not overwritten, rules must assume any arrays/data structures/etc are overwritten between the forward and the reverse pass. Any floats or variables passed by value are always preserved as is (as are the arrays themselves, just not necessarily the values in the array).

See also augmented_primal.

source
EnzymeCore.EnzymeRules.FwdConfigType
FwdConfig{NeedsPrimal, NeedsShadow, Width, RuntimeActivity}
-FwdConfigWidth{Width} = FwdConfig{<:Any, <:Any, Width}

Configuration type to dispatch on in custom forward rules (see forward).

  • NeedsPrimal and NeedsShadow: boolean values specifying whether the primal and shadow (resp.) should be returned.
  • Width: an integer that specifies the number of adjoints/shadows simultaneously being propagated.
  • RuntimeActivity: whether runtime activity is enabled.

Getters for the type parameters are provided by needs_primal, needs_shadow, width and runtime_activity.

source
EnzymeCore.EnzymeRules.RevConfigType
RevConfig{NeedsPrimal, NeedsShadow, Width, Overwritten, RuntimeActivity}
-RevConfigWidth{Width} = RevConfig{<:Any, <:Any, Width}

Configuration type to dispatch on in custom reverse rules (see augmented_primal and reverse).

  • NeedsPrimal and NeedsShadow: boolean values specifying whether the primal and shadow (resp.) should be returned.
  • Width: an integer that specifies the number of adjoints/shadows simultaneously being propagated.
  • Overwritten: a tuple of booleans of whether each argument (including the function itself) is modified between the forward and reverse pass (true if potentially modified between).
  • RuntimeActivity: whether runtime activity is enabled.

Getters for the type parameters are provided by needs_primal, needs_shadow, width, overwritten, and runtime_activity.

source
EnzymeCore.EnzymeRules.augmented_primalFunction
augmented_primal(::RevConfig, func::Annotation{typeof(f)}, RT::Type{<:Annotation}, args::Annotation...)

Must return an AugmentedReturn type.

  • The primal must be the same type of the original return if needs_primal(config), otherwise nothing.
  • The shadow must be nothing if needs_shadow(config) is false. If width is 1, the shadow should be the same type of the original return. If the width is greater than 1, the shadow should be NTuple{original return, width}.
  • The tape can be any type (including Nothing) and is preserved for the reverse call.
source
EnzymeCore.EnzymeRules.forwardFunction
forward(fwdconfig, func::Annotation{typeof(f)}, RT::Type{<:Annotation}, args::Annotation...)

Calculate the forward derivative. The first argument is a FwdConfig object describing parameters of the differentiation. The second argument func is the callable to which the rule applies, wrapped either in a Const or, if it is a closure, in a Duplicated. The third argument is the return type annotation, and all other arguments are the annotated function arguments.

source
EnzymeCore.EnzymeRules.inactiveFunction
inactive(func::typeof(f), args...)

Mark a particular function as always being inactive in both its return result and the function call itself.

source
EnzymeCore.EnzymeRules.inactive_noinlFunction
inactive_noinl(func::typeof(f), args...)

Mark a particular function as always being inactive in both its return result and the function call itself, but do not prevent inlining of the function.

source
EnzymeCore.EnzymeRules.noaliasFunction
noalias(func::typeof(f), args...)

Mark a particular function as always being a fresh allocation which does not alias any other accessible memory.

source
EnzymeCore.EnzymeRules.primal_typeMethod
primal_type(::FwdConfig, ::Type{<:Annotation{RT}})
-primal_type(::RevConfig, ::Type{<:Annotation{RT}})

Compute the expected primal return type given a reverse mode config and return activity.

source
EnzymeCore.EnzymeRules.reverseFunction
reverse(::RevConfig, func::Annotation{typeof(f)}, dret::Active, tape, args::Annotation...)
-reverse(::RevConfig, func::Annotation{typeof(f)}, ::Type{<:Annotation}, tape, args::Annotation...)

Takes the gradient of the derivative, activity annotation, and tape. If there is an active return, dret is passed as Active{T} with the derivative of the active return val. Otherwise dret is passed as Type{Duplicated{T}}, etc.

source
EnzymeCore.EnzymeRules.shadow_typeMethod
shadow_type(::FwdConfig, ::Type{<:Annotation{RT}})
-shadow_type(::RevConfig, ::Type{<:Annotation{RT}})

Compute the expected shadow return type given a reverse mode config and return activity.

source
EnzymeTestUtils.@test_msgMacro
@test_msg msg condition kws...

This behaves like Test.@test condition kws..., except that if it fails it also prints the msg. If msg == "" then this is just like @test, and nothing extra is printed.

Examples

julia> @test_msg "It is required that the total is under 10" sum(1:1000) < 10;
+what to do if the type `T` is guaranteed to be inactive: use the primal (the default) or still copy the value.
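A hedged sketch (the nested value here is illustrative, not from the docstring):

v = (a = [1.0, 2.0], b = 3.0)

dv = Enzyme.make_zero(v)
# dv == (a = [0.0, 0.0], b = 0.0): a fresh, zeroed shadow with the same structure as v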
source
EnzymeCore.make_zero!Function
make_zero!(val::T, seen::IdSet{Any}=IdSet())::Nothing
+
+Recursively set a variable's differentiable fields to zero. Only applicable to mutable types `T`.
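For example (a minimal sketch):

w = [1.0, 2.0]

Enzyme.make_zero!(w)
# w == [0.0, 0.0]; the array is zeroed in place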
source
EnzymeCore.EnzymeRules.AugmentedReturnType
AugmentedReturn(primal, shadow, tape)

Augment the primal return value of a function with its shadow, as well as any additional information needed to correctly compute the reverse pass, stored in tape.

Unless specified by the config that a variable is not overwritten, rules must assume any arrays/data structures/etc are overwritten between the forward and the reverse pass. Any floats or variables passed by value are always preserved as is (as are the arrays themselves, just not necessarily the values in the array).

See also augmented_primal.

source
EnzymeCore.EnzymeRules.FwdConfigType
FwdConfig{NeedsPrimal, NeedsShadow, Width, RuntimeActivity}
+FwdConfigWidth{Width} = FwdConfig{<:Any, <:Any, Width}

Configuration type to dispatch on in custom forward rules (see forward).

  • NeedsPrimal and NeedsShadow: boolean values specifying whether the primal and shadow (resp.) should be returned.
  • Width: an integer that specifies the number of adjoints/shadows simultaneously being propagated.
  • RuntimeActivity: whether runtime activity is enabled.

Getters for the type parameters are provided by needs_primal, needs_shadow, width and runtime_activity.

source
EnzymeCore.EnzymeRules.RevConfigType
RevConfig{NeedsPrimal, NeedsShadow, Width, Overwritten, RuntimeActivity}
+RevConfigWidth{Width} = RevConfig{<:Any, <:Any, Width}

Configuration type to dispatch on in custom reverse rules (see augmented_primal and reverse).

  • NeedsPrimal and NeedsShadow: boolean values specifying whether the primal and shadow (resp.) should be returned.
  • Width: an integer that specifies the number of adjoints/shadows simultaneously being propagated.
  • Overwritten: a tuple of booleans of whether each argument (including the function itself) is modified between the forward and reverse pass (true if potentially modified between).
  • RuntimeActivity: whether runtime activity is enabled.

Getters for the type parameters are provided by needs_primal, needs_shadow, width, overwritten, and runtime_activity.

source
EnzymeCore.EnzymeRules.augmented_primalFunction
augmented_primal(::RevConfig, func::Annotation{typeof(f)}, RT::Type{<:Annotation}, args::Annotation...)

Must return an AugmentedReturn type.

  • The primal must be the same type of the original return if needs_primal(config), otherwise nothing.
  • The shadow must be nothing if needs_shadow(config) is false. If width is 1, the shadow should be the same type of the original return. If the width is greater than 1, the shadow should be NTuple{original return, width}.
  • The tape can be any type (including Nothing) and is preserved for the reverse call.
source
EnzymeCore.EnzymeRules.forwardFunction
forward(fwdconfig, func::Annotation{typeof(f)}, RT::Type{<:Annotation}, args::Annotation...)

Calculate the forward derivative. The first argument is a FwdConfig object describing parameters of the differentiation. The second argument func is the callable to which the rule applies, wrapped either in a Const or, if it is a closure, in a Duplicated. The third argument is the return type annotation, and all other arguments are the annotated function arguments.
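A minimal sketch of a rule following this signature (the function sq and its hand-written derivative are hypothetical, not from the source):

using Enzyme
import Enzyme.EnzymeRules: forward, FwdConfig, needs_primal, needs_shadow

sq(x) = x^2

function forward(config::FwdConfig, func::Const{typeof(sq)}, ::Type{<:Duplicated}, x::Duplicated)
    px = func.val(x.val)       # primal value
    dx = 2 * x.val * x.dval    # tangent of x^2
    if needs_primal(config) && needs_shadow(config)
        return Duplicated(px, dx)
    elseif needs_shadow(config)
        return dx
    elseif needs_primal(config)
        return px
    else
        return nothing
    end
end

# autodiff(Forward, sq, Duplicated, Duplicated(3.0, 1.0)) would then dispatch to this rule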

source
EnzymeCore.EnzymeRules.inactiveFunction
inactive(func::typeof(f), args...)

Mark a particular function as always being inactive in both its return result and the function call itself.

source
EnzymeCore.EnzymeRules.inactive_noinlFunction
inactive_noinl(func::typeof(f), args...)

Mark a particular function as always being inactive in both its return result and the function call itself, but do not prevent inlining of the function.

source
EnzymeCore.EnzymeRules.noaliasFunction
noalias(func::typeof(f), args...)

Mark a particular function as always being a fresh allocation which does not alias any other accessible memory.

source
EnzymeCore.EnzymeRules.primal_typeMethod
primal_type(::FwdConfig, ::Type{<:Annotation{RT}})
+primal_type(::RevConfig, ::Type{<:Annotation{RT}})

Compute the expected primal return type given a reverse mode config and return activity.

source
EnzymeCore.EnzymeRules.reverseFunction
reverse(::RevConfig, func::Annotation{typeof(f)}, dret::Active, tape, args::Annotation...)
+reverse(::RevConfig, func::Annotation{typeof(f)}, ::Type{<:Annotation}, tape, args::Annotation...)

Takes the gradient of the derivative, activity annotation, and tape. If there is an active return, dret is passed as Active{T} with the derivative of the active return val. Otherwise dret is passed as Type{Duplicated{T}}, etc.
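A minimal sketch of an augmented_primal/reverse pair following the documented signatures (the function cub and its tape layout are hypothetical, not from the source):

using Enzyme
import Enzyme.EnzymeRules: augmented_primal, reverse, AugmentedReturn, RevConfig, needs_primal

cub(x) = x^3

function augmented_primal(config::RevConfig, func::Const{typeof(cub)}, ::Type{<:Active}, x::Active)
    primal = needs_primal(config) ? func.val(x.val) : nothing
    # stash the input on the tape so the reverse pass can use it even if other data is overwritten
    return AugmentedReturn(primal, nothing, x.val)
end

function reverse(config::RevConfig, func::Const{typeof(cub)}, dret::Active, tape, x::Active)
    # d(x^3)/dx = 3x^2, scaled by the incoming adjoint
    return (3 * tape^2 * dret.val,)
end

# autodiff(Reverse, cub, Active, Active(2.0)) would then dispatch to this pair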

source
EnzymeCore.EnzymeRules.shadow_typeMethod
shadow_type(::FwdConfig, ::Type{<:Annotation{RT}})
+shadow_type(::RevConfig, ::Type{<:Annotation{RT}})

Compute the expected shadow return type given a reverse mode config and return activity.

source
EnzymeTestUtils.@test_msgMacro
@test_msg msg condition kws...

This behaves like Test.@test condition kws..., except that if it fails it also prints the msg. If msg == "" then this is just like @test, and nothing extra is printed.

Examples

julia> @test_msg "It is required that the total is under 10" sum(1:1000) < 10;
 Test Failed at REPL[1]:1
   Expression: sum(1:1000) < 10
   Problem: It is required that the total is under 10
@@ -249,7 +249,7 @@
   Test Failed at REPL[153]:1
     Expression: sum(1:1000) < 10
      Evaluated: 500500 < 10
-  ERROR: There was an error during testing
source
EnzymeTestUtils.test_forwardMethod
test_forward(f, Activity, args...; kwargs...)

Test Enzyme.autodiff of f in Forward-mode against finite differences.

f has all constraints of the same argument passed to Enzyme.autodiff, with additional constraints:

  • If it mutates one of its arguments, it must return that argument.

Arguments

  • Activity: the activity of the return value of f
  • args: Each entry is either an argument to f, an activity type accepted by autodiff, or a tuple of the form (arg, Activity), where Activity is the activity type of arg. If the activity type specified requires a tangent, a random tangent will be automatically generated.

Keywords

  • rng::AbstractRNG: The random number generator to use for generating random tangents.
  • fdm=FiniteDifferences.central_fdm(5, 1): The finite differences method to use.
  • fkwargs: Keyword arguments to pass to f.
  • rtol: Relative tolerance for isapprox.
  • atol: Absolute tolerance for isapprox.
  • testset_name: Name to use for a testset in which all tests are evaluated.

Examples

Here we test a rule for a function of scalars. Because we don't provide an activity annotation for y, it is assumed to be Const.

using Enzyme, EnzymeTestUtils
+  ERROR: There was an error during testing
source
EnzymeTestUtils.test_forwardMethod
test_forward(f, Activity, args...; kwargs...)

Test Enzyme.autodiff of f in Forward-mode against finite differences.

f has all constraints of the same argument passed to Enzyme.autodiff, with additional constraints:

  • If it mutates one of its arguments, it must return that argument.

Arguments

  • Activity: the activity of the return value of f
  • args: Each entry is either an argument to f, an activity type accepted by autodiff, or a tuple of the form (arg, Activity), where Activity is the activity type of arg. If the activity type specified requires a tangent, a random tangent will be automatically generated.

Keywords

  • rng::AbstractRNG: The random number generator to use for generating random tangents.
  • fdm=FiniteDifferences.central_fdm(5, 1): The finite differences method to use.
  • fkwargs: Keyword arguments to pass to f.
  • rtol: Relative tolerance for isapprox.
  • atol: Absolute tolerance for isapprox.
  • testset_name: Name to use for a testset in which all tests are evaluated.

Examples

Here we test a rule for a function of scalars. Because we don't provide an activity annotation for y, it is assumed to be Const.

using Enzyme, EnzymeTestUtils
 
 x, y = randn(2)
 for Tret in (Const, Duplicated, DuplicatedNoNeed), Tx in (Const, Duplicated)
@@ -261,7 +261,7 @@
     Ty in (Const, BatchDuplicated)
 
     test_forward(*, Tret, (x, Tx), (y, Ty))
-end
source
EnzymeTestUtils.test_reverseMethod
test_reverse(f, Activity, args...; kwargs...)

Test Enzyme.autodiff_thunk of f in ReverseSplitWithPrimal-mode against finite differences.

f has all constraints of the same argument passed to Enzyme.autodiff_thunk, with additional constraints:

  • If an Array{<:AbstractFloat} appears in the input/output, then a reshaped version of it may not also appear in the input/output.

Arguments

  • Activity: the activity of the return value of f.
  • args: Each entry is either an argument to f, an activity type accepted by autodiff, or a tuple of the form (arg, Activity), where Activity is the activity type of arg. If the activity type specified requires a shadow, one will be automatically generated.

Keywords

  • rng::AbstractRNG: The random number generator to use for generating random tangents.
  • fdm=FiniteDifferences.central_fdm(5, 1): The finite differences method to use.
  • fkwargs: Keyword arguments to pass to f.
  • rtol: Relative tolerance for isapprox.
  • atol: Absolute tolerance for isapprox.
  • testset_name: Name to use for a testset in which all tests are evaluated.

Examples

Here we test a rule for a function of scalars. Because we don't provide an activity annotation for y, it is assumed to be Const.

using Enzyme, EnzymeTestUtils
+end
source
EnzymeTestUtils.test_reverseMethod
test_reverse(f, Activity, args...; kwargs...)

Test Enzyme.autodiff_thunk of f in ReverseSplitWithPrimal-mode against finite differences.

f has all constraints of the same argument passed to Enzyme.autodiff_thunk, with additional constraints:

  • If an Array{<:AbstractFloat} appears in the input/output, then a reshaped version of it may not also appear in the input/output.

Arguments

  • Activity: the activity of the return value of f.
  • args: Each entry is either an argument to f, an activity type accepted by autodiff, or a tuple of the form (arg, Activity), where Activity is the activity type of arg. If the activity type specified requires a shadow, one will be automatically generated.

Keywords

  • rng::AbstractRNG: The random number generator to use for generating random tangents.
  • fdm=FiniteDifferences.central_fdm(5, 1): The finite differences method to use.
  • fkwargs: Keyword arguments to pass to f.
  • rtol: Relative tolerance for isapprox.
  • atol: Absolute tolerance for isapprox.
  • testset_name: Name to use for a testset in which all tests are evaluated.

Examples

Here we test a rule for a function of scalars. Because we don't provide an activity annotation for y, it is assumed to be Const.

using Enzyme, EnzymeTestUtils
 
 x = randn()
 y = randn()
@@ -270,4 +270,4 @@
 end

Here we test a rule for a function of an array in batch reverse-mode:

x = randn(3)
 for Tret in (Const, Active), Tx in (Const, BatchDuplicated)
     test_reverse(prod, Tret, (x, Tx))
-end
source
Enzyme.API.inlineall!Method
inlineall!(val::Bool)

Whether to inline all (non-recursive) functions generated by Julia within a single compilation unit. This may improve Enzyme's ability to successfully differentiate code and improve performance of the original and generated derivative program. It often, however, comes with an increase in compile time. This is off by default.

source
Enzyme.API.instname!Method
instname!(val::Bool)

Whether to add a name to all LLVM values. This may be helpful for debugging generated programs, both primal and derivative. Off by default.

source
Enzyme.API.looseTypeAnalysis!Method
looseTypeAnalysis!(val::Bool)

Enzyme runs a type analysis to deduce the corresponding types of all values being differentiated. This is necessary to compute correct derivatives of various values. For example, a copy of Float32's requires a different derivative than a memcpy of Float64's, Ptr's, etc. In some cases Enzyme may not be able to deduce all the types necessary and throw an unknown type error. If this is the case, open an issue. One can silence these issues by setting looseTypeAnalysis!(true) which tells Enzyme to make its best guess. This will remove the error and allow differentiation to continue, however, it may produce incorrect results. Alternatively one can consider increasing the space of the evaluated type lattice which gives Enzyme more time to run a more thorough analysis through the use of maxtypeoffset!

source
Enzyme.API.maxtypedepth!Method
maxtypedepth!(val::Int)

Enzyme runs a type analysis to deduce the corresponding types of all values being differentiated. This is necessary to compute correct derivatives of various values. To ensure this analysis terminates, it operates on a finite lattice of possible states. This function sets the maximum depth into a type that Enzyme will consider. A smaller value will cause type analysis to run faster, but may result in some necessary types not being found and result in unknown type errors. A larger value may result in unknown type errors being resolved by searching a larger space, but may run longer. The default setting is 6.

source
Enzyme.API.maxtypeoffset!Method
maxtypeoffset!(val::Int)

Enzyme runs a type analysis to deduce the corresponding types of all values being differentiated. This is necessary to compute correct derivatives of various values. To ensure this analysis terminates, it operates on a finite lattice of possible states. This function sets the maximum offset into a type that Enzyme will consider. A smaller value will cause type analysis to run faster, but may result in some necessary types not being found and result in unknown type errors. A larger value may result in unknown type errors being resolved by searching a larger space, but may run longer. The default setting is 512.

source
Enzyme.API.printactivity!Method
printactivity!(val::Bool)

A debugging option for developers of Enzyme. If one sets this flag prior to the first differentiation of a function, Enzyme will print (to stderr) a log of all decisions made during Activity Analysis (the analysis which determines what values/instructions are differentiated). This may be useful for debugging MixedActivity errors, correctness issues, and performance issues. Off by default.

source
Enzyme.API.printall!Method
printall!(val::Bool)

A debugging option for developers of Enzyme. If one sets this flag prior to the first differentiation of a function, Enzyme will print (to stderr) the LLVM function being differentiated, as well as all generated derivatives immediately after running Enzyme (but prior to any other optimizations). Off by default.

source
Enzyme.API.printdiffuse!Method
printdiffuse!(val::Bool)

A debugging option for developers of Enzyme. If one sets this flag prior to the first differentiation of a function, Enzyme will print (to stderr) information about each LLVM value – specifically whether it and its shadow are required for computing the derivative. In contrast to printunnecessary!, this flag prints the debug log for the analysis which determines, for each value and shadow value, whether it can find a user which would require it to be kept around (rather than being deleted). This runs prior to any cache optimizations and is a debug log of Differential Use Analysis. This may be helpful for debugging caching, phi node deletion, performance, and other errors. Off by default.

source
Enzyme.API.printperf!Method
printperf!(val::Bool)

A debugging option for developers of Enzyme. If one sets this flag prior to the first differentiation of a function, Enzyme will print (to stderr) performance information about generated derivative programs. It will provide debug information that explains why particular values are cached for the reverse pass, and thus require additional computation/storage. This is particularly helpful for debugging derivatives which OOM or otherwise run slowly. Off by default.

source
Enzyme.API.printtype!Method
printtype!(val::Bool)

A debugging option for developers of Enzyme. If one sets this flag prior to the first differentiation of a function, Enzyme will print (to stderr) a log of all decisions made during Type Analysis (the analysis by which Enzyme determines the type of all values in the program). This may be useful for debugging correctness errors, illegal type analysis errors, insufficient type information errors, and performance issues. Off by default.

source
Enzyme.API.printunnecessary!Method
printunnecessary!(val::Bool)

A debugging option for developers of Enzyme. If one sets this flag prior to the first differentiation of a function, Enzyme will print (to stderr) information about each LLVM value – specifically whether it and its shadow are required for computing the derivative. In contrast to printdiffuse!, this flag prints the final results after running cache optimizations such as minCut (see Recompute vs Cache Heuristics from this paper, and slides 31-33 from this presentation, for a description of the caching algorithm). This may be helpful for debugging caching, phi node deletion, performance, and other errors. Off by default.

source
Enzyme.API.strictAliasing!Method
strictAliasing!(val::Bool)

Whether Enzyme's type analysis will assume strict aliasing semantics. When strict aliasing semantics are on (the default), Enzyme can propagate type information up through conditional branches. This may lead to illegal type errors when analyzing code with unions. Disabling strict aliasing will enable these union types to be correctly analyzed. However, it may lead to some cases where sufficient type information cannot be deduced. One can turn these insufficient type information errors into warnings by calling looseTypeAnalysis!(true), which tells Enzyme to use its best guess in such scenarios.
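For instance (illustrative):

Enzyme.API.strictAliasing!(false)   # relax strict aliasing before differentiating union-heavy code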

source
Enzyme.API.strong_zero!Method
strong_zero!(val::Bool)

Whether multiplication by zero should be enforced to produce a zero result, even when multiplying against a NaN or infinity. Necessary for some programs in which a value has a zero derivative because it is unused, even though it otherwise has an infinite or NaN derivative.

source
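As an illustrative sketch of the kind of program this helps (the exact behavior depends on the generated code), consider a branch whose unselected arm has an infinite derivative:

using Enzyme

# ifelse evaluates both arms, so at x = 0 the derivative of sqrt(x) (which is
# Inf) gets multiplied by the zero contribution of the unselected branch.
# Without strong-zero semantics that 0 * Inf can surface as NaN.
f(x) = ifelse(x > 0, sqrt(x), zero(x))

Enzyme.API.strong_zero!(true)   # set before the first differentiation
Enzyme.autodiff(Reverse, f, Active, Active(0.0))  # expected ((0.0,),) rather than ((NaN,),)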
Enzyme.API.typeWarning!Method
typeWarning!(val::Bool)

Whether to print a warning when Type Analysis learns information about a value's type which cannot be represented in the current size of the lattice. See maxtypeoffset! for more information. Off by default.

source
Enzyme.API.inlineall!Method
inlineall!(val::Bool)

Whether to inline all (non-recursive) functions generated by Julia within a single compilation unit. This may improve Enzyme's ability to successfully differentiate code and improve performance of the original and generated derivative program. It often, however, comes with an increase in compile time. This is off by default.

source
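A minimal sketch of enabling this option; helper and f below are hypothetical stand-ins for code that Enzyme otherwise struggles to differentiate across call boundaries:

using Enzyme

# Must be set before the first differentiation; trades longer compile times
# for a better chance of successful differentiation and faster derivatives.
Enzyme.API.inlineall!(true)

helper(x) = sin(x) + x^2
f(x) = helper(x) * helper(2x)

Enzyme.autodiff(Reverse, f, Active, Active(1.5))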
Enzyme.API.instname!Method
instname!(val::Bool)

Whether to add a name to all LLVM values. This may be helpful for debugging generated programs, both primal and derivative. Off by default.

source
Enzyme.API.looseTypeAnalysis!Method
looseTypeAnalysis!(val::Bool)

Enzyme runs a type analysis to deduce the corresponding types of all values being differentiated. This is necessary to compute correct derivatives of various values. For example, a copy of Float32s requires a different derivative than a memcpy of Float64s, Ptrs, etc. In some cases Enzyme may not be able to deduce all the types necessary and will throw an unknown type error. If this is the case, open an issue. One can silence these errors by setting looseTypeAnalysis!(true), which tells Enzyme to make its best guess. This will remove the error and allow differentiation to continue; however, it may produce incorrect results. Alternatively, one can consider increasing the size of the evaluated type lattice, which gives Enzyme the ability to run a more thorough analysis, through the use of maxtypeoffset!.

source
Enzyme.API.maxtypedepth!Method
maxtypedepth!(val::Int)

Enzyme runs a type analysis to deduce the corresponding types of all values being differentiated. This is necessary to compute correct derivatives of various values. To ensure this analysis terminates, it operates on a finite lattice of possible states. This function sets the maximum depth into a type that Enzyme will consider. A smaller value will cause type analysis to run faster, but may result in some necessary types not being found, resulting in unknown type errors. A larger value may resolve such unknown type errors by searching a larger space, but may make the analysis run longer. The default setting is 6.

source
Enzyme.API.maxtypeoffset!Method
maxtypeoffset!(val::Int)

Enzyme runs a type analysis to deduce the corresponding types of all values being differentiated. This is necessary to compute correct derivatives of various values. To ensure this analysis terminates, it operates on a finite lattice of possible states. This function sets the maximum offset into a type that Enzyme will consider. A smaller value will cause type analysis to run faster, but may result in some necessary types not being found, resulting in unknown type errors. A larger value may resolve such unknown type errors by searching a larger space, but may make the analysis run longer. The default setting is 512.

source
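A hedged sketch of raising both lattice bounds (together with typeWarning!) before differentiating; the example function itself is trivial and only demonstrates that any subsequent differentiation picks up the new limits:

using Enzyme

Enzyme.API.maxtypeoffset!(1024)   # default 512: largest byte offset tracked within a value
Enzyme.API.maxtypedepth!(8)       # default 6: deepest nesting level tracked
Enzyme.API.typeWarning!(true)     # report when the lattice is still too coarse

f(x) = x[1] * x[2] + x[3]^2
x, dx = [1.0, 2.0, 3.0], zeros(3)
Enzyme.autodiff(Reverse, f, Active, Duplicated(x, dx))
dx   # gradient accumulated into the shadow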
diff --git a/dev/dev_docs/index.html b/dev/dev_docs/index.html index f4323601c0..35d77ace76 100644 --- a/dev/dev_docs/index.html +++ b/dev/dev_docs/index.html @@ -14,4 +14,4 @@ julia -e "using Pkg; pkg\"add LLVM_full_jll@${LLVM_MAJOR_VER}\"" LLVM_DIR=`julia -e "using LLVM_full_jll; print(LLVM_full_jll.artifact_dir)"` echo "LLVM_DIR=$LLVM_DIR" -cmake ../enzyme/ -G Ninja -DENZYME_EXTERNAL_SHARED_LIB=ON -DLLVM_DIR=${LLVM_DIR} -DLLVM_EXTERNAL_LIT=${LLVM_DIR}/tools/lit/lit.py

Manual build of Julia

cmake ../enzyme/ -G Ninja -DENZYME_EXTERNAL_SHARED_LIB=ON -DLLVM_DIR=${PATH_TO_BUILDDIR_OF_JULIA}/usr/lib/cmake/llvm/
diff --git a/dev/faq/index.html b/dev/faq/index.html index 85abf77159..43b157f972 100644 --- a/dev/faq/index.html +++ b/dev/faq/index.html @@ -279,4 +279,4 @@ # output -ERROR: Type of ghost or constant type Duplicated{Val{1.0}} is marked as differentiable. +ERROR: Type of ghost or constant type Duplicated{Val{1.0}} is marked as differentiable. diff --git a/dev/generated/autodiff/index.html b/dev/generated/autodiff/index.html index 5d05dc3f72..31ff9d5ab7 100644 --- a/dev/generated/autodiff/index.html +++ b/dev/generated/autodiff/index.html @@ -66,4 +66,4 @@ hess[1][2] == 1.0
true

as well as the second row/column

hess[2][1] == 1.0
 
-hess[2][2] == 0.0
true

This page was generated using Literate.jl.


diff --git a/dev/generated/box/index.html b/dev/generated/box/index.html index 100de3a6cd..8c84dea3f8 100644 --- a/dev/generated/box/index.html +++ b/dev/generated/box/index.html @@ -362,4 +362,4 @@ 9.732210923954578e-5 0.0026401658789532625 0.015152571729925521 - 0.03212933056407103

and we get down to a percent difference on the order of $10^{-5}$, showing that Enzyme calculated the correct derivative. Success!


This page was generated using Literate.jl.


diff --git a/dev/generated/custom_rule/index.html b/dev/generated/custom_rule/index.html index f0d8ebc637..97ec2763c1 100644 --- a/dev/generated/custom_rule/index.html +++ b/dev/generated/custom_rule/index.html @@ -167,4 +167,4 @@ test_reverse(fun, RT, (x, Tx), (y, Ty)) end end -end
Test.DefaultTestSet("f rules", Any[Test.DefaultTestSet("forward", Any[Test.DefaultTestSet("RT = Const, Tx = Const, Ty = Const", Any[Test.DefaultTestSet("test_forward: g with return activity Const on (::Vector{Float64}, Const), (::Vector{Float64}, Const)", Any[], 6, false, false, true, 1.727341401847138e9, 1.727341403544662e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.72734139931139e9, 1.727341403544671e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = Const, Tx = Const, Ty = Duplicated", Any[Test.DefaultTestSet("test_forward: g with return activity Const on (::Vector{Float64}, Const), (::Vector{Float64}, Duplicated)", Any[], 6, false, false, true, 1.727341403545931e9, 1.727341404379789e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341403544714e9, 1.727341404379793e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = Const, Tx = Duplicated, Ty = Const", Any[Test.DefaultTestSet("test_forward: g with return activity Const on (::Vector{Float64}, Duplicated), (::Vector{Float64}, Const)", Any[], 6, false, false, true, 1.727341404381058e9, 1.727341405821925e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341404379833e9, 1.727341405821929e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = Const, Tx = Duplicated, Ty = Duplicated", Any[Test.DefaultTestSet("test_forward: g with return activity Const on (::Vector{Float64}, Duplicated), (::Vector{Float64}, Duplicated)", Any[], 6, false, false, true, 1.727341405823152e9, 1.727341407229172e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341405821965e9, 1.727341407229194e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = DuplicatedNoNeed, Tx = Const, Ty = Const", Any[Test.DefaultTestSet("test_forward: g with return activity DuplicatedNoNeed on (::Vector{Float64}, Const), (::Vector{Float64}, Const)", Any[], 6, false, false, true, 1.727341407230424e9, 1.727341408661007e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341407229239e9, 1.72734140866101e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = DuplicatedNoNeed, Tx = Const, Ty = Duplicated", Any[Test.DefaultTestSet("test_forward: g with return activity DuplicatedNoNeed on (::Vector{Float64}, Const), (::Vector{Float64}, Duplicated)", Any[], 6, false, false, true, 1.727341408662232e9, 1.727341410888727e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341408661049e9, 1.727341410888731e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = DuplicatedNoNeed, Tx = Duplicated, Ty = Const", Any[Test.DefaultTestSet("test_forward: g with return activity DuplicatedNoNeed on (::Vector{Float64}, Duplicated), (::Vector{Float64}, Const)", Any[], 6, false, false, true, 1.727341410890005e9, 1.72734141237566e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341410888763e9, 1.727341412375663e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = DuplicatedNoNeed, Tx = Duplicated, Ty = Duplicated", Any[Test.DefaultTestSet("test_forward: g with return activity DuplicatedNoNeed on (::Vector{Float64}, Duplicated), (::Vector{Float64}, Duplicated)", Any[], 6, false, false, true, 1.727341412376947e9, 
1.727341414121063e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341412375691e9, 1.727341414121067e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = Duplicated, Tx = Const, Ty = Const", Any[Test.DefaultTestSet("test_forward: g with return activity Duplicated on (::Vector{Float64}, Const), (::Vector{Float64}, Const)", Any[], 7, false, false, true, 1.727341414122308e9, 1.727341415199234e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341414121095e9, 1.727341415199238e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = Duplicated, Tx = Const, Ty = Duplicated", Any[Test.DefaultTestSet("test_forward: g with return activity Duplicated on (::Vector{Float64}, Const), (::Vector{Float64}, Duplicated)", Any[], 7, false, false, true, 1.727341415200479e9, 1.727341416368174e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341415199268e9, 1.727341416368195e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = Duplicated, Tx = Duplicated, Ty = Const", Any[Test.DefaultTestSet("test_forward: g with return activity Duplicated on (::Vector{Float64}, Duplicated), (::Vector{Float64}, Const)", Any[], 7, false, false, true, 1.727341416369421e9, 1.727341417560924e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341416368229e9, 1.727341417560927e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = Duplicated, Tx = Duplicated, Ty = Duplicated", Any[Test.DefaultTestSet("test_forward: g with return activity Duplicated on (::Vector{Float64}, Duplicated), (::Vector{Float64}, Duplicated)", Any[], 7, false, false, true, 1.727341417562155e9, 1.72734141881892e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_forward.jl")], 0, false, false, true, 1.727341417560953e9, 1.727341418818923e9, false, "custom_rule.md")], 0, false, false, true, 1.727341399311343e9, 1.727341418818924e9, false, "custom_rule.md"), Test.DefaultTestSet("reverse", Any[Test.DefaultTestSet("RT = Active, Tx = Duplicated, Ty = Duplicated, fun = g", Any[Test.DefaultTestSet("test_reverse: g with return activity Active on (::Vector{Float64}, Duplicated), (::Vector{Float64}, Duplicated)", Any[], 11, false, false, true, 1.727341418945894e9, 1.727341421602819e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_reverse.jl")], 0, false, false, true, 1.727341418818988e9, 1.727341421602825e9, false, "custom_rule.md"), Test.DefaultTestSet("RT = Active, Tx = Duplicated, Ty = Duplicated, fun = h", Any[Test.DefaultTestSet("test_reverse: h with return activity Active on (::Vector{Float64}, Duplicated), (::Vector{Float64}, Duplicated)", Any[], 11, false, false, true, 1.727341421725913e9, 1.727341425231602e9, false, "/home/runner/work/Enzyme.jl/Enzyme.jl/lib/EnzymeTestUtils/src/test_reverse.jl")], 0, false, false, true, 1.727341421602855e9, 1.727341425231605e9, false, "custom_rule.md")], 0, false, false, true, 1.727341418818945e9, 1.727341425231606e9, false, "custom_rule.md")], 0, false, false, true, 1.727341399311295e9, 1.727341425231607e9, false, "custom_rule.md")

In any package that implements Enzyme rules using EnzymeRules, it is recommended to add EnzymeTestUtils as a test dependency to test the rules.
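As a sketch of that recommendation (the package, file, and function names here are hypothetical), a package shipping custom rules for a function myfun might include a test file along these lines:

# test/enzyme_rules.jl in a hypothetical package defining EnzymeRules for myfun
using Test
using Enzyme
using EnzymeTestUtils

myfun(x) = sum(abs2, x)   # stand-in for the function with a custom rule

@testset "EnzymeRules for myfun" begin
    x = randn(3)
    # Compare the custom forward- and reverse-mode rules against finite differences.
    test_forward(myfun, Duplicated, (x, Duplicated))
    test_reverse(myfun, Active, (x, Duplicated))
end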


This page was generated using Literate.jl.


diff --git a/dev/index.html b/dev/index.html index b5e550601a..b0b8f1ed8c 100644 --- a/dev/index.html +++ b/dev/index.html @@ -134,4 +134,4 @@ julia> grad 2-element Vector{Float64}: 2.880510859951098 - 1.920340573300732 + 1.920340573300732 diff --git a/dev/internal_api/index.html b/dev/internal_api/index.html index 77de480e76..2ea3566296 100644 --- a/dev/internal_api/index.html +++ b/dev/internal_api/index.html @@ -1,2 +1,2 @@ -Internal API · Enzyme.jl

Internal API

Note

This is the documentation of Enzyme's internal API. The internal API is not subject to semantic versioning and may change at any time and without deprecation.
