
Update #711: checking if OptimizationFunction is used for derivative-based optimizers (#715)

Merged: 33 commits, Apr 7, 2024
01f4479
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 5, 2024
404f4f3
Merge pull request #1 from ParasPuneetSingh/ParasPuneetSingh-patch-1
ParasPuneetSingh Mar 5, 2024
b2c044d
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 9, 2024
ef4e4a5
Update OptimizationBBO.jl
ParasPuneetSingh Mar 9, 2024
ed73535
Update OptimizationCMAEvolutionStrategy.jl
ParasPuneetSingh Mar 9, 2024
134d7fc
Update OptimizationEvolutionary.jl
ParasPuneetSingh Mar 9, 2024
702e095
Update OptimizationFlux.jl
ParasPuneetSingh Mar 9, 2024
6e2afe3
Update OptimizationGCMAES.jl
ParasPuneetSingh Mar 9, 2024
d1c0111
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 12, 2024
1ba53fb
Update OptimizationNLopt.jl
ParasPuneetSingh Mar 12, 2024
1767142
Update OptimizationOptimisers.jl
ParasPuneetSingh Mar 12, 2024
da5955b
Update OptimizationPRIMA.jl
ParasPuneetSingh Mar 13, 2024
64e16ec
Update OptimizationPolyalgorithms.jl
ParasPuneetSingh Mar 13, 2024
3ae278f
Update OptimizationSpeedMapping.jl
ParasPuneetSingh Mar 13, 2024
4887714
Update OptimizationBBO.jl
ParasPuneetSingh Mar 13, 2024
a1348e3
Update OptimizationNLopt.jl
ParasPuneetSingh Mar 13, 2024
0b6f600
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 13, 2024
6ea74cf
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 13, 2024
5132aea
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 13, 2024
785c218
Update OptimizationMOI.jl
ParasPuneetSingh Mar 13, 2024
8f3e381
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 26, 2024
a5652d7
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 26, 2024
6379379
Update OptimizationOptimJL.jl
ParasPuneetSingh Mar 26, 2024
6fb808b
Update lib/OptimizationOptimJL/src/OptimizationOptimJL.jl
Vaibhavdixit02 Apr 1, 2024
ea6d522
Update OptimizationNLopt.jl
ParasPuneetSingh Apr 3, 2024
73cd120
Update OptimizationOptimJL.jl
ParasPuneetSingh Apr 3, 2024
bae6741
Update OptimizationOptimJL.jl
ParasPuneetSingh Apr 3, 2024
b9854f5
Update OptimizationOptimJL.jl
ParasPuneetSingh Apr 3, 2024
15a5fa0
Update OptimizationPolyalgorithms.jl
ParasPuneetSingh Apr 4, 2024
0857a62
Update Project.toml
ParasPuneetSingh Apr 4, 2024
bb82850
Update Project.toml
ParasPuneetSingh Apr 5, 2024
2d6b7c4
Merge branch 'master' into master
Vaibhavdixit02 Apr 7, 2024
e0610cd
Update Project.toml
Vaibhavdixit02 Apr 7, 2024
2 changes: 1 addition & 1 deletion docs/Project.toml
@@ -64,7 +64,7 @@ OptimizationPolyalgorithms = "0.1, 0.2"
OptimizationSpeedMapping = "0.1, 0.2"
OrdinaryDiffEq = "6"
ReverseDiff = ">= 1.9.0"
SciMLBase = "2"
Review comment:

Not just in docs: the Project.toml in the base directory of the repo should also be updated.

SciMLBase = "2.30.0"
SciMLSensitivity = "7"
Tracker = ">= 0.2"
Zygote = ">= 0.5"
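For context on the compat bump requested above: in a Julia `[compat]` section, a bare version string is a caret specifier, so the same entry belongs in both the docs and root Project.toml. A sketch of the semantics:

```toml
[compat]
# Caret (default) specifier: "2.30.0" allows any version v with
# 2.30.0 <= v < 3.0.0, i.e. SciMLBase 2.30+ is required but 3.x is excluded.
SciMLBase = "2.30.0"
```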
2 changes: 2 additions & 0 deletions lib/OptimizationBBO/src/OptimizationBBO.jl
@@ -10,6 +10,8 @@ SciMLBase.requiresbounds(::BBO) = true
SciMLBase.allowsbounds(::BBO) = true
SciMLBase.supports_opt_cache_interface(opt::BBO) = true



for j in string.(BlackBoxOptim.SingleObjectiveMethodNames)
eval(Meta.parse("Base.@kwdef struct BBO_" * j * " <: BBO method=:" * j * " end"))
eval(Meta.parse("export BBO_" * j))
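The `eval(Meta.parse(...))` loop in the BBO diff above generates one exported struct per BlackBoxOptim single-objective method name. For a representative name such as `adaptive_de_rand_1_bin`, one iteration expands to the equivalent of:

```julia
# Equivalent hand-written form of one loop iteration (the method name is a
# representative entry of BlackBoxOptim.SingleObjectiveMethodNames):
Base.@kwdef struct BBO_adaptive_de_rand_1_bin <: BBO
    method = :adaptive_de_rand_1_bin
end
export BBO_adaptive_de_rand_1_bin
```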
@@ -10,6 +10,10 @@

SciMLBase.allowsbounds(::CMAEvolutionStrategyOpt) = true
SciMLBase.supports_opt_cache_interface(opt::CMAEvolutionStrategyOpt) = true
SciMLBase.requiresgradient(::CMAEvolutionStrategyOpt) = false
SciMLBase.requireshessian(::CMAEvolutionStrategyOpt) = false
SciMLBase.requiresconsjac(::CMAEvolutionStrategyOpt) = false
SciMLBase.requiresconshess(::CMAEvolutionStrategyOpt) = false

(Codecov check: added lines #L13-L16 in lib/OptimizationCMAEvolutionStrategy/src/OptimizationCMAEvolutionStrategy.jl were not covered by tests.)
function __map_optimizer_args(prob::OptimizationCache, opt::CMAEvolutionStrategyOpt;
callback = nothing,
4 changes: 4 additions & 0 deletions lib/OptimizationEvolutionary/src/OptimizationEvolutionary.jl
@@ -7,6 +7,10 @@
SciMLBase.allowsbounds(opt::Evolutionary.AbstractOptimizer) = true
SciMLBase.allowsconstraints(opt::Evolutionary.AbstractOptimizer) = true
SciMLBase.supports_opt_cache_interface(opt::Evolutionary.AbstractOptimizer) = true
SciMLBase.requiresgradient(opt::Evolutionary.AbstractOptimizer) = false
SciMLBase.requireshessian(opt::Evolutionary.AbstractOptimizer) = false
SciMLBase.requiresconsjac(opt::Evolutionary.AbstractOptimizer) = false
SciMLBase.requiresconshess(opt::Evolutionary.AbstractOptimizer) = false

decompose_trace(trace::Evolutionary.OptimizationTrace) = last(trace)
decompose_trace(trace::Evolutionary.OptimizationTraceRecord) = trace
4 changes: 4 additions & 0 deletions lib/OptimizationFlux/src/OptimizationFlux.jl
@@ -5,6 +5,10 @@
using Optimization.SciMLBase

SciMLBase.supports_opt_cache_interface(opt::Flux.Optimise.AbstractOptimiser) = true
SciMLBase.requiresgradient(opt::Flux.Optimise.AbstractOptimiser) = true
SciMLBase.requireshessian(opt::Flux.Optimise.AbstractOptimiser) = false
SciMLBase.requiresconsjac(opt::Flux.Optimise.AbstractOptimiser) = false
SciMLBase.requiresconshess(opt::Flux.Optimise.AbstractOptimiser) = false

function SciMLBase.__init(prob::SciMLBase.OptimizationProblem,
opt::Flux.Optimise.AbstractOptimiser,
5 changes: 5 additions & 0 deletions lib/OptimizationGCMAES/src/OptimizationGCMAES.jl
@@ -12,9 +12,14 @@
SciMLBase.allowsbounds(::GCMAESOpt) = true
SciMLBase.allowscallback(::GCMAESOpt) = false
SciMLBase.supports_opt_cache_interface(opt::GCMAESOpt) = true
SciMLBase.requiresgradient(::GCMAESOpt) = true
SciMLBase.requireshessian(::GCMAESOpt) = false
SciMLBase.requiresconsjac(::GCMAESOpt) = false
SciMLBase.requiresconshess(::GCMAESOpt) = false


function __map_optimizer_args(cache::OptimizationCache, opt::GCMAESOpt;
callback = nothing,
maxiters::Union{Number, Nothing} = nothing,
maxtime::Union{Number, Nothing} = nothing,
abstol::Union{Number, Nothing} = nothing,
5 changes: 5 additions & 0 deletions lib/OptimizationMOI/src/OptimizationMOI.jl
@@ -16,6 +16,11 @@ const MOI = MathOptInterface

const DenseOrSparse{T} = Union{Matrix{T}, SparseMatrixCSC{T}}

SciMLBase.requiresgradient(opt::Union{MOI.AbstractOptimizer, MOI.OptimizerWithAttributes}) = true
SciMLBase.requireshessian(opt::Union{MOI.AbstractOptimizer, MOI.OptimizerWithAttributes}) = true
SciMLBase.requiresconsjac(opt::Union{MOI.AbstractOptimizer, MOI.OptimizerWithAttributes}) = true
SciMLBase.requiresconshess(opt::Union{MOI.AbstractOptimizer, MOI.OptimizerWithAttributes}) = true

function SciMLBase.allowsbounds(opt::Union{MOI.AbstractOptimizer,
MOI.OptimizerWithAttributes})
true
37 changes: 37 additions & 0 deletions lib/OptimizationNLopt/src/OptimizationNLopt.jl
@@ -9,6 +9,43 @@
SciMLBase.allowsbounds(opt::Union{NLopt.Algorithm, NLopt.Opt}) = true
SciMLBase.supports_opt_cache_interface(opt::Union{NLopt.Algorithm, NLopt.Opt}) = true

function SciMLBase.requiresgradient(opt::NLopt.Algorithm) # https://github.com/JuliaOpt/NLopt.jl/blob/master/src/NLopt.jl#L18C7-L18C16
    str_opt = string(opt)
    # NLopt algorithm names encode derivative use in the second character:
    # 'D' = derivative-based, 'N' = derivative-free. Compare against a Char,
    # not a String, since indexing a String yields a Char.
    return str_opt[2] == 'D'
end
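The trait above relies on NLopt's algorithm-naming convention; a quick sketch of that convention, using a few of NLopt's exported algorithm names:

```julia
# NLopt algorithm names follow [G|L][D|N]_NAME: the first letter marks
# global vs. local, the second marks 'D' (derivative-based) vs. 'N'
# (derivative-free).
for (name, needs_grad) in (("LD_LBFGS", true),      # local, derivative-based
                           ("GN_DIRECT", false),    # global, derivative-free
                           ("LN_NELDERMEAD", false)) # local, derivative-free
    @assert (name[2] == 'D') == needs_grad
end
```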

function SciMLBase.requireshessian(opt::NLopt.Algorithm) # https://github.com/JuliaOpt/NLopt.jl/blob/master/src/NLopt.jl#L18C7-L18C16
    str_opt = string(opt)
    # Char comparisons, not String: indexing a String yields a Char.
    return str_opt[2] == 'D' && str_opt[4] == 'N'
end

function SciMLBase.requiresconsjac(opt::NLopt.Algorithm) # https://github.com/JuliaOpt/NLopt.jl/blob/master/src/NLopt.jl#L18C7-L18C16
    str_opt = string(opt)
    # Char comparisons, not String: indexing a String yields a Char.
    return str_opt[3] == 'O' || str_opt[3] == 'I' || str_opt[5] == 'G'
end

function __map_optimizer_args!(cache::OptimizationCache, opt::NLopt.Opt;
callback = nothing,
maxiters::Union{Number, Nothing} = nothing,
8 changes: 7 additions & 1 deletion lib/OptimizationOptimJL/src/OptimizationOptimJL.jl
@@ -14,6 +14,13 @@
SciMLBase.supports_opt_cache_interface(opt::Optim.AbstractOptimizer) = true
SciMLBase.supports_opt_cache_interface(opt::Union{Optim.Fminbox, Optim.SAMIN}) = true
SciMLBase.supports_opt_cache_interface(opt::Optim.ConstrainedOptimizer) = true
SciMLBase.requiresgradient(opt::Optim.AbstractOptimizer) = !(opt isa Optim.ZerothOrderOptimizer)
SciMLBase.requiresgradient(::IPNewton) = true
SciMLBase.requireshessian(::IPNewton) = true
SciMLBase.requiresconsjac(::IPNewton) = true
SciMLBase.requireshessian(opt::Optim.NewtonTrustRegion) = true
SciMLBase.requireshessian(opt::Optim.Newton) = true
SciMLBase.requiresgradient(opt::Optim.Fminbox) = true

function __map_optimizer_args(cache::OptimizationCache,
opt::Union{Optim.AbstractOptimizer, Optim.Fminbox,
@@ -128,7 +135,6 @@
local x, cur, state

cur, state = iterate(cache.data)

!(cache.opt isa Optim.ZerothOrderOptimizer) && cache.f.grad === nothing &&
error("Use OptimizationFunction to pass the derivatives or automatically generate them with one of the autodiff backends")
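The runtime check above is what the new traits generalize across solver wrappers. A hedged sketch of trait-driven validation (the helper name is assumed for illustration, not part of this PR):

```julia
# Sketch: validate an OptimizationFunction `f` against a solver's declared
# derivative requirements before solving. `opt` is any optimizer; `grad`
# and `hess` are the standard OptimizationFunction fields.
function validate_derivatives(f, opt)
    if SciMLBase.requiresgradient(opt) && f.grad === nothing
        error("Use OptimizationFunction to pass the derivatives or automatically generate them with one of the autodiff backends")
    end
    if SciMLBase.requireshessian(opt) && f.hess === nothing
        error("This optimizer requires a Hessian; pass `hess` or use an autodiff backend")
    end
    return true
end
```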

1 change: 1 addition & 0 deletions lib/OptimizationOptimisers/src/OptimizationOptimisers.jl
@@ -5,7 +5,8 @@
using Optimization.SciMLBase

SciMLBase.supports_opt_cache_interface(opt::AbstractRule) = true
SciMLBase.requiresgradient(opt::AbstractRule) = true
include("sophia.jl")

function SciMLBase.__init(prob::SciMLBase.OptimizationProblem, opt::AbstractRule,
data = Optimization.DEFAULT_DATA; save_best = true,
3 changes: 3 additions & 0 deletions lib/OptimizationPRIMA/src/OptimizationPRIMA.jl
@@ -15,6 +15,9 @@ SciMLBase.supports_opt_cache_interface(::PRIMASolvers) = true
SciMLBase.allowsconstraints(::Union{LINCOA, COBYLA}) = true
SciMLBase.allowsbounds(opt::Union{BOBYQA, LINCOA, COBYLA}) = true
SciMLBase.requiresconstraints(opt::COBYLA) = true
SciMLBase.requiresgradient(opt::Union{BOBYQA, LINCOA, COBYLA}) = true
SciMLBase.requiresconsjac(opt::Union{LINCOA, COBYLA}) = true


function Optimization.OptimizationCache(prob::SciMLBase.OptimizationProblem,
opt::PRIMASolvers, data;
@@ -6,6 +6,8 @@ using Optimization.SciMLBase, OptimizationOptimJL, OptimizationOptimisers

struct PolyOpt end
Review comment:

PolyOpt is defined after the trait call; switch the order around.


SciMLBase.requiresgradient(opt::PolyOpt) = true

function SciMLBase.__solve(prob::OptimizationProblem,
opt::PolyOpt,
args...;
@@ -11,6 +11,7 @@
SciMLBase.allowsbounds(::SpeedMappingOpt) = true
SciMLBase.allowscallback(::SpeedMappingOpt) = false
SciMLBase.supports_opt_cache_interface(opt::SpeedMappingOpt) = true
SciMLBase.requiresgradient(opt::SpeedMappingOpt) = true

function __map_optimizer_args(cache::OptimizationCache, opt::SpeedMappingOpt;
callback = nothing,