API Manual
ExaModels.ExaModels — Module
ExaModels
An algebraic modeling and automatic differentiation tool in Julia Language, specialized for SIMD abstraction of nonlinear programs.
For more information, please visit https://github.com/exanauts/ExaModels.jl
ExaModels.AdjointNode1 — Type
AdjointNode1{F, T, I}
A node with one child for the first-order forward pass tree.
Fields:
x::T: function value
y::T: first-order sensitivity
inner::I: children
ExaModels.AdjointNode2 — Type
AdjointNode2{F, T, I1, I2}
A node with two children for the first-order forward pass tree.
Fields:
x::T: function value
y1::T: first-order sensitivity w.r.t. first argument
y2::T: first-order sensitivity w.r.t. second argument
inner1::I1: children #1
inner2::I2: children #2
ExaModels.AdjointNodeSource — Type
AdjointNodeSource{VT}
A source of AdjointNode. adjoint_node_source[i] returns an AdjointNodeVar at index i.
Fields:
inner::VT: variable vector
ExaModels.AdjointNodeVar — Type
AdjointNodeVar{I, T}
A variable node for the first-order forward pass tree.
Fields:
i::I: index
x::T: value
ExaModels.AdjointNull — Type
AdjointNull
A null node.
ExaModels.Compressor — Type
Compressor{I}
Data structure for the sparse index.
Fields:
inner::I: stores the sparse index in tuple form
ExaModels.ExaCore — Type
ExaCore([array_eltype::Type; backend = backend, minimize = true])
Returns an intermediate data object ExaCore, which can later be used for creating an ExaModel.
Example
julia> using ExaModels
julia> c = ExaCore()
An ExaCore
(...)
Backend: ......................... CUDA.CUDAKernels.CUDABackend
number of objective patterns: .... 0
number of constraint patterns: ... 0
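The element type and backend are chosen when the core is constructed, and the Backend field shown above reflects that choice. A minimal construction sketch (the commented CUDA lines assume CUDA.jl is installed and a GPU is available):

using ExaModels

c1 = ExaCore()          # default: Float64 data on the CPU
c2 = ExaCore(Float32)   # single-precision model data

# GPU-resident model data (assumes CUDA.jl is installed)
# using CUDA
# c3 = ExaCore(Float64; backend = CUDABackend())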
ExaModels.ExaModel — Method
ExaModel(core)
Returns an ExaModel object, which can be solved by nonlinear optimization solvers within the JuliaSmoothOptimizers ecosystem, such as NLPModelsIpopt or MadNLP.
Example
julia> using ExaModels
julia> c = ExaCore(); # create an ExaCore object
(...)
julia> result = ipopt(m; print_level=0) # solve the problem
"Execution stats: first-order stationary"
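The doctest above is abridged; the sketch below is a self-contained variant of the same workflow on a made-up quadratic test problem (the problem data and variable names are illustrative, not from the original example):

using ExaModels, NLPModelsIpopt

c = ExaCore()                                   # intermediate representation
x = variable(c, 10; start = 0.5)                # 10 decision variables
objective(c, (x[i] - 1)^2 for i = 1:10)         # summed objective terms
constraint(c, (x[i] + x[i+1] for i = 1:9); lcon = 0, ucon = 2)
m = ExaModel(c)                                 # finalize into an NLPModel
result = ipopt(m; print_level = 0)              # solve with Ipopt
sol = solution(result, x)                       # extract the primal solution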
ExaModels.Node1 — Type
Node1{F, I}
A node with one child for the symbolic expression tree.
Fields:
inner::I: children
ExaModels.Node2 — Type
Node2{F, I1, I2}
A node with two children for the symbolic expression tree.
Fields:
inner1::I1: children #1
inner2::I2: children #2
ExaModels.Null — Type
Null
A null node.
ExaModels.ParIndexed — Type
ParIndexed{I, J}
A parameterized data node.
Fields:
inner::I: parameter for the data
ExaModels.ParSource — Type
ParSource
A source of parameterized data.
ExaModels.SIMDFunction — Type
SIMDFunction(gen::Base.Generator, o0 = 0, o1 = 0, o2 = 0)
Returns a SIMDFunction using gen.
Arguments:
gen: an iterable function specified in Base.Generator format
o0: offset for the function evaluation
o1: offset for the derivative evaluation
o2: offset for the second-order derivative evaluation
ExaModels.SecondAdjointNode1 — Type
SecondAdjointNode1{F, T, I}
A node with one child for the second-order forward pass tree.
Fields:
x::T: function value
y::T: first-order sensitivity
h::T: second-order sensitivity
inner::I: children
ExaModels.SecondAdjointNode2 — Type
SecondAdjointNode2{F, T, I1, I2}
A node with two children for the second-order forward pass tree.
Fields:
x::T: function value
y1::T: first-order sensitivity w.r.t. first argument
y2::T: first-order sensitivity w.r.t. second argument
h11::T: second-order sensitivity w.r.t. first argument
h12::T: second-order sensitivity w.r.t. first and second arguments
h22::T: second-order sensitivity w.r.t. second argument
inner1::I1: children #1
inner2::I2: children #2
ExaModels.SecondAdjointNodeSource — Type
SecondAdjointNodeSource{VT}
A source of SecondAdjointNode. adjoint_node_source[i] returns a SecondAdjointNodeVar at index i.
Fields:
inner::VT: variable vector
ExaModels.SecondAdjointNodeVar — Type
SecondAdjointNodeVar{I, T}
A variable node for the second-order forward pass tree.
Fields:
i::I: index
x::T: value
ExaModels.SecondAdjointNull — Type
SecondAdjointNull
A null node.
ExaModels.Var — Type
Var{I}
A variable node for the symbolic expression tree.
Fields:
i::I: (parameterized) index
ExaModels.VarSource — Type
VarSource
A source of variable nodes.
ExaModels.WrapperNLPModel — Method
WrapperNLPModel(VT, m)
Returns a WrapperModel{T, VT} wrapping m <: AbstractNLPModel{T}.
ExaModels.WrapperNLPModel — Method
WrapperNLPModel(m)
Returns a WrapperModel{Float64, Vector{Float64}} wrapping m.
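A hedged usage sketch: per the docstring above, the wrapper presents an existing model through Float64 / Vector{Float64} buffers, which can be convenient when a solver expects double-precision host arrays (the wrapped model below is illustrative):

using ExaModels

c = ExaCore(Float32)                     # model data stored in Float32
x = variable(c, 2)
objective(c, (x[i] - 1)^2 for i = 1:2)
m = ExaModel(c)

wm = WrapperNLPModel(m)                  # viewed as WrapperModel{Float64, Vector{Float64}}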
ExaModels.constraint — Method
constraint(core, n; start = 0, lcon = 0, ucon = 0)
Adds empty constraints of dimension n, so that the terms can later be added with constraint!.
ExaModels.constraint — Method
constraint(core, generator; start = 0, lcon = 0, ucon = 0)
Adds constraints specified by a generator to core, and returns a Constraint object.
Keyword Arguments
start: The initial guess of the solution. Can either be Number, AbstractArray, or Generator.
lcon: The constraint lower bound. Can either be Number, AbstractArray, or Generator.
ucon: The constraint upper bound. Can either be Number, AbstractArray, or Generator.
Example
julia> using ExaModels
julia> c = ExaCore();
(...)
s.t. (...)
g♭ ≤ [g(x,p)]_{p ∈ P} ≤ g♯
where |P| = 9
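As the keyword list above states, the bounds may also vary per constraint row, given as arrays or generators; a small illustrative sketch:

using ExaModels

c = ExaCore()
x = variable(c, 10)

cons = constraint(
    c,
    (x[p]^2 + x[p+1] for p = 1:9);
    lcon = (1 / p for p = 1:9),   # row-dependent lower bounds via a generator
    ucon = fill(10.0, 9),         # upper bounds via an array
)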
ExaModels.constraint — Method
constraint(core, expr [, pars]; start = 0, lcon = 0, ucon = 0)
Adds constraints specified by expr and pars to core, and returns a Constraint object.
ExaModels.drpass — Method
drpass(d::D, y, adj)
Performs dense gradient evaluation via the reverse pass on the computation (sub)graph formed by the forward pass.
Arguments:
d: first-order computation (sub)graph
y: result vector
adj: adjoint propagated up to the current node
ExaModels.gradient! — Method
gradient!(y, f, x, adj)
Performs dense gradient evaluation.
Arguments:
y: result vector
f: the function to be differentiated, in SIMDFunction format
x: variable vector
adj: initial adjoint
ExaModels.grpass — Method
grpass(d::D, comp, y, o1, cnt, adj)
Performs sparse gradient evaluation via the reverse pass on the computation (sub)graph formed by the forward pass.
Arguments:
d: first-order computation (sub)graph
comp: a Compressor, which helps map the counter to the sparse vector index
y: result vector
o1: index offset
cnt: counter
adj: adjoint propagated up to the current node
ExaModels.hdrpass — Method
hdrpass(t1::T1, t2::T2, comp, y1, y2, o2, cnt, adj)
Performs sparse Hessian evaluation (the (df1/dx)(df2/dx)' portion) via the reverse pass on the computation (sub)graph formed by the second-order forward pass.
Arguments:
t1: second-order computation (sub)graph regarding f1
t2: second-order computation (sub)graph regarding f2
comp: a Compressor, which helps map the counter to the sparse vector index
y1: result vector #1
y2: result vector #2 (only used when evaluating sparsity)
o2: index offset
cnt: counter
adj: second adjoint propagated up to the current node
ExaModels.jrpass — Method
jrpass(d::D, comp, i, y1, y2, o1, cnt, adj)
Performs sparse Jacobian evaluation via the reverse pass on the computation (sub)graph formed by the forward pass.
Arguments:
d: first-order computation (sub)graph
comp: a Compressor, which helps map the counter to the sparse vector index
i: constraint index (this is the i-th constraint)
y1: result vector #1
y2: result vector #2 (only used when evaluating sparsity)
o1: index offset
cnt: counter
adj: adjoint propagated up to the current node
ExaModels.multipliers — Method
multipliers(result, y)
Returns the multipliers for constraints y associated with result, obtained by solving the model.
Example
julia> using ExaModels, NLPModelsIpopt
julia> c = ExaCore();
(...)
julia> val[1] ≈ 0.81933930
true
ExaModels.multipliers_L — Method
multipliers_L(result, x)
Returns the multipliers_L (lower-bound multipliers) for variable x associated with result, obtained by solving the model.
Example
julia> using ExaModels, NLPModelsIpopt
julia> c = ExaCore();
(...)
julia> val = multipliers_L(result, x);
julia> isapprox(val, fill(0, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
ExaModels.multipliers_U — Method
multipliers_U(result, x)
Returns the multipliers_U (upper-bound multipliers) for variable x associated with result, obtained by solving the model.
Example
julia> using ExaModels, NLPModelsIpopt
julia> c = ExaCore();
(...)
julia> val = multipliers_U(result, x);
julia> isapprox(val, fill(2, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
ExaModels.objective — Method
objective(core::ExaCore, generator)
Adds objective terms specified by a generator to core, and returns an Objective object. Note: it is assumed that the terms are summed.
Example
julia> using ExaModels
julia> c = ExaCore();
(...)
min (...) + ∑_{p ∈ P} f(x,p)
where |P| = 10
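As the note above says, the generator's terms are summed, and the "(...) +" in the printed objective indicates that repeated calls keep accumulating terms; a short illustrative sketch:

using ExaModels

c = ExaCore()
x = variable(c, 10)

objective(c, (x[i] - 1)^2 for i = 1:10)   # adds Σ_{i=1}^{10} (x[i] - 1)^2
objective(c, 0.1 * x[i]^2 for i = 1:10)   # further calls contribute additional summed terms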
ExaModels.objective — Method
objective(core::ExaCore, expr [, pars])
Adds objective terms specified by expr and pars to core, and returns an Objective object.
ExaModels.sgradient! — Method
sgradient!(y, f, x, adj)
Performs sparse gradient evaluation.
Arguments:
y: result vector
f: the function to be differentiated, in SIMDFunction format
x: variable vector
adj: initial adjoint
ExaModels.shessian! — Method
shessian!(y1, y2, f, x, adj1, adj2)
Performs sparse Hessian evaluation.
Arguments:
y1: result vector #1
y2: result vector #2 (only used when evaluating sparsity)
f: the function to be differentiated, in SIMDFunction format
x: variable vector
adj1: initial first adjoint
adj2: initial second adjoint
ExaModels.sjacobian! — Method
sjacobian!(y1, y2, f, x, adj)
Performs sparse Jacobian evaluation.
Arguments:
y1: result vector #1
y2: result vector #2 (only used when evaluating sparsity)
f: the function to be differentiated, in SIMDFunction format
x: variable vector
adj: initial adjoint
ExaModels.solution — Method
solution(result, x)
Returns the solution for variable x associated with result, obtained by solving the model.
Example
julia> using ExaModels, NLPModelsIpopt
julia> c = ExaCore();
(...)
julia> val = solution(result, x);
julia> isapprox(val, fill(1, 10), atol=sqrt(eps(Float64)), rtol=Inf)
true
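The doctests for solution, multipliers, multipliers_L, and multipliers_U are abridged above; the sketch below shows all four accessors on one illustrative model:

using ExaModels, NLPModelsIpopt

c = ExaCore()
x = variable(c, 10; lvar = -1, uvar = 2)
cons = constraint(c, (x[i] + x[i+1] for i = 1:9); lcon = 0)
objective(c, (x[i] - 1)^2 for i = 1:10)
result = ipopt(ExaModel(c); print_level = 0)

xval = solution(result, x)         # primal values of x
yval = multipliers(result, cons)   # multipliers of the constraints in cons
zl = multipliers_L(result, x)      # multipliers of the lower bounds on x
zu = multipliers_U(result, x)      # multipliers of the upper bounds on x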
ExaModels.variable — Method
variable(core, dims...; start = 0, lvar = -Inf, uvar = Inf)
Adds variables with dimensions specified by dims to core, and returns a Variable object. dims can be either Integer or UnitRange.
Keyword Arguments
start: The initial guess of the solution. Can either be Number, AbstractArray, or Generator.
lvar: The variable lower bound. Can either be Number, AbstractArray, or Generator.
uvar: The variable upper bound. Can either be Number, AbstractArray, or Generator.
Example
julia> using ExaModels
julia> c = ExaCore();
(...)
Variable
x ∈ R^{9 × 3}
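Since dims accepts one or more sizes given as Integers or UnitRanges, and the keywords accept numbers, arrays, or generators, a few illustrative constructions:

using ExaModels

c = ExaCore()

x = variable(c, 9, 3)                        # a 9 × 3 block of variables
y = variable(c, 0:8; start = 1.0)            # indices given as a UnitRange
z = variable(c, 5; lvar = zeros(5), uvar = (2i for i = 1:5))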
ExaModels.@register_bivariate — Macro
@register_bivariate(f, df1, df2, ddf11, ddf12, ddf22)
Registers a bivariate function f with ExaModels, so that it can be used within objective and constraint expressions.
Arguments:
f: function
df1: derivative function (w.r.t. first argument)
df2: derivative function (w.r.t. second argument)
ddf11: second-order derivative function (w.r.t. first argument)
ddf12: second-order derivative function (w.r.t. first and second arguments)
ddf22: second-order derivative function (w.r.t. second argument)
Example
julia> using ExaModels
julia> relu23(x, y) = (x > 0 || y > 0) ? (x + y)^3 : zero(x)
relu23 (generic function with 1 method)
(...)
julia> ddrelu2322(x, y) = (x > 0 || y > 0) ? 6 * (x + y) : zero(x)
ddrelu2322 (generic function with 1 method)
julia> @register_bivariate(relu23, drelu231, drelu232, ddrelu2311, ddrelu2312, ddrelu2322)
ExaModels.@register_univariate — Macro
@register_univariate(f, df, ddf)
Registers a univariate function f with ExaModels, so that it can be used within objective and constraint expressions.
Arguments:
f: function
df: derivative function
ddf: second-order derivative function
Example
julia> using ExaModels
julia> relu3(x) = x > 0 ? x^3 : zero(x)
relu3 (generic function with 1 method)
(...)
julia> ddrelu3(x) = x > 0 ? 6*x : zero(x)
ddrelu3 (generic function with 1 method)
julia> @register_univariate(relu3, drelu3, ddrelu3)
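Once registered, the function can appear inside objective and constraint expressions like any built-in operator. A small sketch with hypothetical names (square_plus and its derivatives are made up for illustration):

using ExaModels

square_plus(x) = x^2 + one(x)        # f(x) = x² + 1
d_square_plus(x) = 2x                # f'(x)
dd_square_plus(x) = 2 * one(x)       # f''(x)

@register_univariate(square_plus, d_square_plus, dd_square_plus)

c = ExaCore()
x = variable(c, 3)
objective(c, square_plus(x[i]) for i = 1:3)   # the registered function inside an objective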