Fix zygote constraint bug and update rosenbrock doc
Vaibhavdixit02 committed Oct 22, 2023
1 parent e8a682e commit 3e33079
Showing 2 changed files with 41 additions and 15 deletions.
50 changes: 38 additions & 12 deletions docs/src/examples/rosenbrock.md
@@ -22,37 +22,45 @@ _p = [1.0, 100.0]
f = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
l1 = rosenbrock(x0, _p)
prob = OptimizationProblem(f, x0, _p)
```
## Optim.jl Solvers

-using OptimizationOptimJL
-# Start with some derivative-free optimizers
+### Start with some derivative-free optimizers

```@example rosenbrock
+using OptimizationOptimJL
sol = solve(prob, SimulatedAnnealing())
prob = OptimizationProblem(f, x0, _p, lb = [-1.0, -1.0], ub = [0.8, 0.8])
sol = solve(prob, SAMIN())
l1 = rosenbrock(x0, _p)
prob = OptimizationProblem(rosenbrock, x0, _p)
sol = solve(prob, NelderMead())
```
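(An aside, not part of the committed file: `SAMIN()` is Optim.jl's bounded simulated-annealing routine and requires the finite `lb`/`ub` box, which is why the problem is rebuilt with bounds before that call; `NelderMead()` is then run on a freshly rebuilt unconstrained problem.)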

-# Now a gradient-based optimizer with forward-mode automatic differentiation
+### Now a gradient-based optimizer with forward-mode automatic differentiation

```@example rosenbrock
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff())
prob = OptimizationProblem(optf, x0, _p)
sol = solve(prob, BFGS())
```
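As a side note beyond the committed text, the object returned by `solve` follows the SciML solution interface; a minimal sketch of the fields one would typically inspect after the BFGS solve above:

```julia
sol.u          # the minimizer, ≈ [1.0, 1.0] for Rosenbrock with p = [1.0, 100.0]
sol.objective  # the objective value at the minimizer, ≈ 0.0
sol.retcode    # a ReturnCode explaining why the solver stopped
```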

-# Now a second order optimizer using Hessians generated by forward-mode automatic differentiation
+### Now a second order optimizer using Hessians generated by forward-mode automatic differentiation

```@example rosenbrock
sol = solve(prob, Newton())
```

-# Now a second order Hessian-free optimizer
+### Now a second order Hessian-free optimizer

```@example rosenbrock
sol = solve(prob, Optim.KrylovTrustRegion())
```
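"Hessian-free" here means the Krylov trust-region method only needs Hessian-vector products `H(x)*v`, never the assembled Hessian. This is not Optim.jl's internal implementation, just a minimal sketch of the underlying trick using ForwardDiff:

```julia
using ForwardDiff

# H(x)*v as the directional derivative of the gradient along v,
# i.e. d/dt ∇f(x + t*v) evaluated at t = 0 — no n×n matrix is materialized.
hvp(f, x, v) = ForwardDiff.derivative(t -> ForwardDiff.gradient(f, x .+ t .* v), 0.0)

rosen(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2
hvp(rosen, [0.5, 0.5], [1.0, 0.0])  # first column of the Hessian at [0.5, 0.5]
```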

-# Now derivative-based optimizers with various constraints
+### Now derivative-based optimizers with various constraints

```@example rosenbrock
cons = (res, x, p) -> res .= [x[1]^2 + x[2]^2]
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = cons)
@@ -68,24 +76,32 @@ sol = solve(prob, IPNewton())
prob = OptimizationProblem(optf, x0, _p, lcons = [0.5], ucons = [0.5],
lb = [-500.0, -500.0], ub = [50.0, 50.0])
-sol = solve(prob, IPNewton()) # Notice now that x[1]^2 + x[2]^2 ≈ 0.5:
-# cons(sol.u, _p) = 0.49999999999999994
+sol = solve(prob, IPNewton())
+# Notice now that x[1]^2 + x[2]^2 ≈ 0.5:
+cons(sol.u, _p)
```
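Because `cons` has the in-place signature `(res, x, p)`, evaluating the constraint at the solution needs a result buffer. A sketch of the check, outside the committed example:

```julia
res = zeros(1)
cons(res, sol.u, _p)
res[1]  # ≈ 0.5 — with lcons == ucons == [0.5] the constraint is an equality
```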

```@example rosenbrock
function con_c(res, x, p)
res .= [x[1]^2 + x[2]^2]
end
optf = OptimizationFunction(rosenbrock, Optimization.AutoForwardDiff(); cons = con_c)
prob = OptimizationProblem(optf, x0, _p, lcons = [-Inf], ucons = [0.25^2])
sol = solve(prob, IPNewton()) # -Inf < cons_circ(sol.u, _p) = 0.25^2
```
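The bound pair `lcons = [-Inf]`, `ucons = [0.25^2]` encodes the one-sided inequality `x[1]^2 + x[2]^2 ≤ 0.0625`, i.e. the feasible set is a disk of radius 0.25. Again as a sketch outside the committed text:

```julia
res = zeros(1)
con_c(res, sol.u, _p)
res[1] <= 0.25^2  # true once IPNewton has converged to the constrained optimum
```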

## Evolutionary.jl Solvers

```@example rosenbrock
using OptimizationEvolutionary
sol = solve(prob, CMAES(μ = 40, λ = 100), abstol = 1e-15) # -Inf < cons_circ(sol.u, _p) = 0.25^2
```

## IPOPT through OptimizationMOI

```@example rosenbrock
using OptimizationMOI, Ipopt
function con2_c(res, x, p)
@@ -95,36 +111,46 @@ end
optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote(); cons = con2_c)
prob = OptimizationProblem(optf, x0, _p, lcons = [-Inf, -Inf], ucons = [Inf, Inf])
sol = solve(prob, Ipopt.Optimizer())
```

-# Now let's switch over to OptimizationOptimisers with reverse-mode AD
+## Now let's switch over to OptimizationOptimisers with reverse-mode AD

```@example rosenbrock
using OptimizationOptimisers
optf = OptimizationFunction(rosenbrock, Optimization.AutoZygote())
prob = OptimizationProblem(optf, x0, _p)
sol = solve(prob, Adam(0.05), maxiters = 1000, progress = false)
```

## Try out CMAEvolutionStrategy.jl's evolutionary methods

```@example rosenbrock
using OptimizationCMAEvolutionStrategy
sol = solve(prob, CMAEvolutionStrategyOpt())
```

## Now try a few NLopt.jl solvers with symbolic differentiation via ModelingToolkit.jl

```@example rosenbrock
using OptimizationNLopt, ModelingToolkit
optf = OptimizationFunction(rosenbrock, Optimization.AutoModelingToolkit())
prob = OptimizationProblem(optf, x0, _p)
sol = solve(prob, Opt(:LN_BOBYQA, 2))
sol = solve(prob, Opt(:LD_LBFGS, 2))
```
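A note beyond the commit: in NLopt's naming scheme the symbol prefix encodes the algorithm class, and the second argument of `Opt` is the number of decision variables. Hypothetical standalone construction:

```julia
opt_free = Opt(:LN_BOBYQA, 2)  # LN_* = local, derivative-free; 2-dimensional
opt_grad = Opt(:LD_LBFGS, 2)   # LD_* = local, derivative-based (uses the AD gradient)
```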

-## Add some box constraints and solve with a few NLopt.jl methods
+### Add some box constraints and solve with a few NLopt.jl methods

```@example rosenbrock
prob = OptimizationProblem(optf, x0, _p, lb = [-1.0, -1.0], ub = [0.8, 0.8])
sol = solve(prob, Opt(:LD_LBFGS, 2))
sol = solve(prob, Opt(:G_MLSL_LDS, 2), local_method = Opt(:LD_LBFGS, 2), maxiters = 10000) #a global optimizer with random starts of local optimization
```
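(Also outside the committed text: `:G_MLSL_LDS` is a multistart scheme — it launches the supplied `local_method` from low-discrepancy starting points inside the `lb`/`ub` box, so finite bounds and a local optimizer are both required, and `maxiters` caps the global phase.)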

## BlackBoxOptim.jl Solvers

```@example rosenbrock
using OptimizationBBO
prob = Optimization.OptimizationProblem(rosenbrock, x0, _p, lb = [-1.0, 0.2],
ub = [0.8, 0.43])
6 changes: 3 additions & 3 deletions ext/OptimizationZygoteExt.jl
@@ -123,13 +123,13 @@ function Optimization.instantiate_function(f, cache::Optimization.ReInitCache,
if f.cons === nothing
cons = nothing
else
cons = (res, θ) -> f.cons(res, θ, cache.p)
-cons_oop = (x) -> (_res = zeros(eltype(x), num_cons); cons(_res, x); _res)
+cons_oop = (x) -> (_res = Zygote.Buffer(x, num_cons); cons(_res, x); copy(_res))

end

if cons !== nothing && f.cons_j === nothing
cons_j = function (J, θ)
-J .= Zygote.jacobian(cons_oop, θ)
+J .= first(Zygote.jacobian(cons_oop, θ))

end
else
cons_j = (J, θ) -> f.cons_j(J, θ, cache.p)
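This hunk is the bug fix named in the commit title. Plain arrays created with `zeros` cannot be mutated inside code Zygote differentiates, whereas `Zygote.Buffer` is Zygote's mutation-friendly container that is `copy`-ed back to an ordinary array at the end; separately, `Zygote.jacobian` returns a tuple of Jacobians (one per argument), hence the added `first`. A standalone sketch of both points, not taken from the repository:

```julia
using Zygote

num_cons = 1
cons = (res, x) -> res .= [x[1]^2 + x[2]^2]

# Old pattern: mutating a plain array fails in the reverse pass with
# "Mutating arrays is not supported".
cons_oop_bad(x) = (_res = zeros(eltype(x), num_cons); cons(_res, x); _res)
# Zygote.jacobian(cons_oop_bad, [1.0, 2.0])  # would throw

# Fixed pattern, mirroring the committed change: write into a Buffer, then copy.
cons_oop(x) = (_res = Zygote.Buffer(x, num_cons); cons(_res, x); copy(_res))
J = first(Zygote.jacobian(cons_oop, [1.0, 2.0]))  # 1×2 matrix [2.0 4.0]
```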
