
Update #711 checking if OptimizationFunction is used for derivative based optimizers #711 #715

Merged
merged 33 commits into SciML:master on Apr 7, 2024

Conversation

@ParasPuneetSingh (Collaborator) commented Mar 5, 2024

Added checking if OptimizationFunction is used for derivative based optimizers #711

Adds a trait marking optimizers that require derivatives, and throws an error telling the user to use an OptimizationFunction.
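A minimal sketch of the mechanism this describes, assuming SciMLBase ≥ 2.30 (where requiresgradient and NoAD are defined); the helper name is hypothetical and not the PR's code:

using SciMLBase

# If the solver's trait says it needs gradients but the user's
# OptimizationFunction carries neither an AD backend nor a hand-written
# gradient, fail early and point the user to OptimizationFunction.
function check_requires_gradient(opt, f::SciMLBase.OptimizationFunction)
    if SciMLBase.requiresgradient(opt) && f.grad === nothing && f.adtype isa SciMLBase.NoAD
        throw(ArgumentError("$(typeof(opt)) requires gradients; construct the " *
                            "OptimizationFunction with an AD backend or supply `grad`."))
    end
    return nothing
end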
@ParasPuneetSingh changed the title from "Update #711 Added a new trait for checking if OptimizationFunction is used for derivative based optimizers #711" to "Update #711 checking if OptimizationFunction is used for derivative based optimizers #711" on Mar 5, 2024

codecov bot commented Mar 6, 2024

Codecov Report

Attention: Patch coverage is 0%, with 50 lines in your changes missing coverage. Please review.

Project coverage is 11.37%. Comparing base (80f2155) to head (e0610cd).
Report is 2 commits behind head on master.

Files Patch % Lines
lib/OptimizationNLopt/src/OptimizationNLopt.jl 0.00% 20 Missing ⚠️
lib/OptimizationOptimJL/src/OptimizationOptimJL.jl 0.00% 7 Missing ⚠️
...onStrategy/src/OptimizationCMAEvolutionStrategy.jl 0.00% 4 Missing ⚠️
...zationEvolutionary/src/OptimizationEvolutionary.jl 0.00% 4 Missing ⚠️
lib/OptimizationFlux/src/OptimizationFlux.jl 0.00% 4 Missing ⚠️
lib/OptimizationGCMAES/src/OptimizationGCMAES.jl 0.00% 4 Missing ⚠️
lib/OptimizationMOI/src/OptimizationMOI.jl 0.00% 4 Missing ⚠️
...timizationOptimisers/src/OptimizationOptimisers.jl 0.00% 1 Missing ⚠️
...onPolyalgorithms/src/OptimizationPolyalgorithms.jl 0.00% 1 Missing ⚠️
...zationSpeedMapping/src/OptimizationSpeedMapping.jl 0.00% 1 Missing ⚠️
Additional details and impacted files
@@             Coverage Diff             @@
##           master     #715       +/-   ##
===========================================
- Coverage   66.01%   11.37%   -54.64%     
===========================================
  Files          22       22               
  Lines        1477     1512       +35     
===========================================
- Hits          975      172      -803     
- Misses        502     1340      +838     


@Vaibhavdixit02 (Member) left a comment:

This should be set for all the packages in lib/ (i.e. the solver wrappers)

Updated the trait requiresderivative to requiresgradient.
Added requiresgradient, requireshessian, requiresconshess and requiresconsjac traits.
Added new traits: requiresgradient, requireshessian, requiresconsjac and requiresconshess.
Added the traits requiresgradient, requireshessian, requiresconsjac and requiresconshess.
Added traits requireshessian, requiresgradient, requiresconshess and requiresconsjac.
Added traits requireshessian, requiresgradient, requiresconsjac and requiresconshess.
@@ -14,6 +14,7 @@ SciMLBase.requiresbounds(opt::Optim.SAMIN) = true
SciMLBase.supports_opt_cache_interface(opt::Optim.AbstractOptimizer) = true
SciMLBase.supports_opt_cache_interface(opt::Union{Optim.Fminbox, Optim.SAMIN}) = true
SciMLBase.supports_opt_cache_interface(opt::Optim.ConstrainedOptimizer) = true
SciMLBase.requiresgradient(Optim.AbstractOptimizer) = !(opt isa Optim.ZerothOrderOptimizer)
Member:

Looks good, but this wrapper will need more of the traits set to true (Newton, NewtonTrustRegion and IPNewton)
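A hedged sketch of what that would look like; the union signatures here are illustrative rather than copied from the PR (IPNewton, being a constrained interior-point method, also gets the constraint-derivative traits per the later commits in this thread):

using Optim, SciMLBase

# Second-order methods need Hessians as well as gradients:
SciMLBase.requiresgradient(opt::Union{Optim.Newton, Optim.NewtonTrustRegion, Optim.IPNewton}) = true
SciMLBase.requireshessian(opt::Union{Optim.Newton, Optim.NewtonTrustRegion, Optim.IPNewton}) = true

# IPNewton additionally needs constraint Jacobians and Hessians:
SciMLBase.requiresconsjac(opt::Optim.IPNewton) = true
SciMLBase.requiresconshess(opt::Optim.IPNewton) = true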

Member:

And not all of them follow the abstract type; in particular, check the SAMIN and Fminbox algorithms.
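To illustrate the point with a sketch (the Fminbox value matches a later commit in this thread; SAMIN, a simulated-annealing method, is derivative-free, so the false default already covers it):

# Fminbox wraps an inner (by default gradient-based) optimizer, so it
# needs its own explicit trait method:
SciMLBase.requiresgradient(opt::Optim.Fminbox) = true
# SAMIN needs no method at all: the default `requiresgradient` is false.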

@@ -9,6 +9,11 @@ abstract type BBO end
SciMLBase.requiresbounds(::BBO) = true
SciMLBase.allowsbounds(::BBO) = true
SciMLBase.supports_opt_cache_interface(opt::BBO) = true
SciMLBase.requiresgradient(::BBO) = false
Member:

Since the default value is false, you only need to declare the ones you want set to true here. For example, you would not need to define anything here in OptimizationBBO, since it's all false.
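For example, with a hypothetical wrapper type (shown only to illustrate the false default):

using SciMLBase

struct SomeGradientOptimizer end   # hypothetical solver wrapper

# Only deviations from the `false` default need a method; a derivative-free
# wrapper like OptimizationBBO declares nothing at all.
SciMLBase.requiresgradient(opt::SomeGradientOptimizer) = true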

Added traits for IPNewton, Newton, and NewtonTrustRegion.
Added requiresgradient trait to NLOPT.jl
Set trait requiresgradient to true.
@@ -8,6 +8,7 @@ using Optimization.SciMLBase

SciMLBase.allowsbounds(opt::Union{NLopt.Algorithm, NLopt.Opt}) = true
SciMLBase.supports_opt_cache_interface(opt::Union{NLopt.Algorithm, NLopt.Opt}) = true
SciMLBase.requiresgradient(opt::Union{NLopt.Algorithm, NLopt.Opt}) = true
Member:

It would be useful to check whether NLopt has a type hierarchy we can use here to do this better; since it has all kinds of algorithms, any blanket value we put here will be wrong for some of them. In that case I would want to set the hessian trait to true as well.

ParasPuneetSingh (Collaborator, Author):

How about I set the traits requiresgradient and requireshessian to true only for the gradient-based algorithms (local and global)? Only the derivative-free local optimizers would then have these traits as false.

Member:

That's what I'm hoping we can do, yeah. But you should also be able to set requireshessian to true only for second-order methods by looking at the NLopt docs. For derivative-free vs. the others, do you plan to use the prefix in the method names?
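For context, NLopt's algorithm names do encode this: the first letter is G/L (global/local) and the second is D/N (derivative-based/derivative-free). A name-based classification along the lines discussed might look like this sketch (uses_derivatives is a hypothetical helper, not the PR's code):

# :LD_LBFGS → local, derivative-based; :LN_NELDERMEAD → local, derivative-free
uses_derivatives(alg::Symbol) = string(alg)[2] == 'D'

uses_derivatives(:LD_LBFGS)  # true
uses_derivatives(:GN_DIRECT) # false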

@@ -128,7 +136,8 @@ function SciMLBase.__solve(cache::OptimizationCache{
local x, cur, state

cur, state = iterate(cache.data)

(requiresgradient(Optim.AbstractOptimizer))
Member:

This should be done in the solve defined in SciMLBase; I'll point you to it once this PR is done. You can remove this error from here.
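In other words, something along these lines at the SciMLBase entry point, reusing the check sketched near the top of this thread (solve_with_validation is a hypothetical stand-in for SciMLBase's own solve wrapper):

# One central validation in solve, instead of each wrapper throwing its own error:
function solve_with_validation(prob::SciMLBase.OptimizationProblem, alg; kwargs...)
    check_requires_gradient(alg, prob.f)           # trait-gated error, sketched above
    return SciMLBase.solve(prob, alg; kwargs...)   # hand off to the wrapper's solve
end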

Set traits requiresgradient and requiresconsjac to true for certain algorithms.
Set trait requiresgradient to true.
Set trait requiresgradient to true.
Removed redundant traits declared.
Set traits requiresgradient, requireshessian and requiresconsjac to true based on the algorithm.
Updated traits for Newton and NewtonTrustRegion: as second-order methods, they automatically require the calculation of Hessians.
Set requiresgradient for Fminbox to true, as the gradient is required for the objective function evaluation and constraint handling.
Set traits requiresgradient, requireshessian, requiresconshess and requiresconsjac to true.
Removed redundant check for requiresgradient trait.
@Vaibhavdixit02 (Member):

@ParasPuneetSingh there are a few errors here, please fix them; from the CI run it seems like syntax issues.

@ParasPuneetSingh (Collaborator, Author):

I checked the CI run and corrected the syntax errors in OptimizationNLopt.jl, but now it says that "Optim.AbstractOptimizer" is not a valid function argument name. It has been used for other traits as well, so I don't understand this syntax error.
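For the record, the error is about the method definition's left-hand side: a qualified type name cannot serve as an argument name, and without a named argument, opt is undefined on the right-hand side. The fix, inferred from the error message and matching the other trait definitions, would be:

# Broken: `opt` is never bound, and `Optim.AbstractOptimizer` is not a valid argument name.
# SciMLBase.requiresgradient(Optim.AbstractOptimizer) = !(opt isa Optim.ZerothOrderOptimizer)

# Fixed: name and type the argument, as the other trait methods do.
SciMLBase.requiresgradient(opt::Optim.AbstractOptimizer) = !(opt isa Optim.ZerothOrderOptimizer)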

@@ -4,6 +4,8 @@ using Reexport
@reexport using Optimization
using Optimization.SciMLBase, OptimizationOptimJL, OptimizationOptimisers

SciMLBase.requiresgradient(opt::PolyOpt) = true

struct PolyOpt end
Member:

PolyOpt is defined after the trait call; switch the order around.

Interchanged the struct definition and trait checking.
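That is, a sketch of the corrected ordering:

struct PolyOpt end

# The trait method can only reference PolyOpt once the struct exists:
SciMLBase.requiresgradient(opt::PolyOpt) = true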
@Vaibhavdixit02 (Member) left a comment:

Change the compat bound on SciMLBase to 2.30.0 then this should be ready to merge

Updated compat for SciMLBase
@@ -64,7 +64,7 @@ OptimizationPolyalgorithms = "0.1, 0.2"
OptimizationSpeedMapping = "0.1, 0.2"
OrdinaryDiffEq = "6"
ReverseDiff = ">= 1.9.0"
SciMLBase = "2"
Member:

Not just in docs; also in the Project.toml in the base directory of the repo.
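So presumably the same one-line bump lands in both docs/Project.toml and the root Project.toml, mirroring the hunk above:

-SciMLBase = "2"
+SciMLBase = "2.30.0"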

Project.toml (review thread outdated, resolved)
@Vaibhavdixit02 Vaibhavdixit02 merged commit 8320ff3 into SciML:master Apr 7, 2024
38 of 43 checks passed