
Remove lb and ub from kwargs before passing to solver function in lbfgsb #780

Merged 1 commit into master on Jun 18, 2024

Conversation

Vaibhavdixit02
Member

Checklist

  • Appropriate tests were added
  • Any code changes were done in a way that does not break public API
  • All documentation related to code changes were updated
  • The new code follows the
    contributor guidelines, in particular the SciML Style Guide and
    COLPRAC.
  • Any new documentation only uses public API


@Vaibhavdixit02 Vaibhavdixit02 mentioned this pull request Jun 18, 2024
@Vaibhavdixit02 Vaibhavdixit02 merged commit 78ddd51 into master Jun 18, 2024
23 of 42 checks passed
@Vaibhavdixit02 Vaibhavdixit02 deleted the lbfgsbkwargs branch June 18, 2024 11:14
@cefitzg

cefitzg commented Jun 18, 2024

@Vaibhavdixit02, I Pkg.rm'd Optimization 3.26.0 and re-added it this morning (still 3.26.0). I reran the same code and I'm still getting the same error. Is there a new version that includes the fix?

ERROR: LoadError: MethodError: no method matching (::LBFGSB.L_BFGS_B)(::Optimization.var"#22#34"{OptimizationCache{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", OptimizationForwardDiffExt.var"#38#56"{ForwardDiff.GradientConfig{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}}}, OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}}, OptimizationForwardDiffExt.var"#41#59"{ForwardDiff.HessianConfig{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, 
Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}, 11}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}}}, OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}}, OptimizationForwardDiffExt.var"#44#62", Nothing, 
OptimizationForwardDiffExt.var"#48#66"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Nothing, Nothing, OptimizationForwardDiffExt.var"#53#71"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Optimization.LBFGS, Base.Iterators.Cycle{Tuple{OptimizationBase.NullData}}, Bool, Optimization.var"#11#13"}}, ::OptimizationForwardDiffExt.var"#38#56"{ForwardDiff.GradientConfig{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, 
Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}}}, OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}}, ::Vector{Float64}, ::Matrix{Float64}; m::Int64, lb::Vector{Float64}, ub::Vector{Float64}, maxiter::Int64)

Closest candidates are:
  (::LBFGSB.L_BFGS_B)(::Any, ::Any, ::AbstractVector, ::AbstractMatrix; m, factr, pgtol, iprint, maxfun, maxiter) got unsupported keyword arguments "lb", "ub"
   @ LBFGSB ~/.julia/packages/LBFGSB/UZibA/src/wrapper.jl:31
  (::LBFGSB.L_BFGS_B)(::Any, ::AbstractVector, ::AbstractMatrix; m, factr, pgtol, iprint, maxfun, maxiter) got unsupported keyword arguments "lb", "ub"
   @ LBFGSB ~/.julia/packages/LBFGSB/UZibA/src/wrapper.jl:75

Stacktrace:
 [1] kwerr(::@NamedTuple{m::Int64, lb::Vector{Float64}, ub::Vector{Float64}, maxiter::Int64}, ::LBFGSB.L_BFGS_B, ::Function, ::Function, ::Vector{Float64}, ::Matrix{Float64})
   @ Base ./error.jl:165
 [2] __solve(cache::OptimizationCache{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", OptimizationForwardDiffExt.var"#38#56"{ForwardDiff.GradientConfig{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}}}, OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}}, OptimizationForwardDiffExt.var"#41#59"{ForwardDiff.HessianConfig{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, 
SciMLBase.NullParameters}}, Float64}, Float64, 11, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}, 11}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}}}, OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}}, OptimizationForwardDiffExt.var"#44#62", Nothing, OptimizationForwardDiffExt.var"#48#66"{OptimizationFunction{true, 
AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Nothing, Nothing, OptimizationForwardDiffExt.var"#53#71"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Optimization.LBFGS, Base.Iterators.Cycle{Tuple{OptimizationBase.NullData}}, Bool, Optimization.var"#11#13"})
   @ Optimization ~/.julia/packages/Optimization/fnasG/src/lbfgsb.jl:234
 [3] solve!(cache::OptimizationCache{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", OptimizationForwardDiffExt.var"#38#56"{ForwardDiff.GradientConfig{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}}}, OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}}, OptimizationForwardDiffExt.var"#41#59"{ForwardDiff.HessianConfig{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, 
SciMLBase.NullParameters}}, Float64}, Float64, 11, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}, 11}}, Vector{ForwardDiff.Dual{ForwardDiff.Tag{OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Float64}, Float64, 11}}}, OptimizationForwardDiffExt.var"#37#55"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}}, OptimizationForwardDiffExt.var"#44#62", Nothing, OptimizationForwardDiffExt.var"#48#66"{OptimizationFunction{true, 
AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Nothing, Nothing, OptimizationForwardDiffExt.var"#53#71"{OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}}, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, OptimizationBase.ReInitCache{Vector{Float64}, SciMLBase.NullParameters}, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Optimization.LBFGS, Base.Iterators.Cycle{Tuple{OptimizationBase.NullData}}, Bool, Optimization.var"#11#13"})
   @ SciMLBase ~/.julia/packages/SciMLBase/JUp1I/src/solve.jl:188
 [4] solve(::OptimizationProblem{true, OptimizationFunction{true, AutoForwardDiff{nothing, Nothing}, var"#1#2", Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, typeof(SciMLBase.DEFAULT_OBSERVED_NO_TIME), Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing, Nothing}, Vector{Float64}, SciMLBase.NullParameters, Vector{Float64}, Vector{Float64}, Nothing, Nothing, Nothing, Nothing, @Kwargs{}}, ::Optimization.LBFGS; kwargs::@Kwargs{maxiters::Int64})
   @ SciMLBase ~/.julia/packages/SciMLBase/JUp1I/src/solve.jl:96
 [5] macro expansion
   @ ~/Desktop/lbfgsb_amm_quest_6_7/cycler.jl:99 [inlined]
 [6] macro expansion
   @ ./timing.jl:279 [inlined]
 [7] top-level scope
   @ ~/Desktop/lbfgsb_amm_quest_6_7/cycler.jl:269
in expression starting at /Users/cefitzg/Desktop/lbfgsb_amm_quest_6_7/cycler.jl:98

@Vaibhavdixit02
Member Author

Yup, you need to update to 3.26.1.

@cefitzg

cefitzg commented Jun 18, 2024

@Vaibhavdixit02, I think you need to release it. Pkg.add("Optimization") installs 3.26.0, and Pkg.add(Pkg.PackageSpec(;name="Optimization", version="3.26.1")) leads to:

ERROR: Unsatisfiable requirements detected for package Optimization [7f7a1694]:
 Optimization [7f7a1694] log:
 ├─possible versions are: 3.5.0-3.26.0 or uninstalled
 └─restricted to versions 3.26.1 by an explicit requirement — no versions left
Stacktrace:
  [1] check_constraints(graph::Pkg.Resolve.Graph)
    @ Pkg.Resolve ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Resolve/graphtype.jl:998
  [2] Pkg.Resolve.Graph(compat::Dict{…}, compat_weak::Dict{…}, uuid_to_name::Dict{…}, reqs::Dict{…}, fixed::Dict{…}, verbose::Bool, julia_version::VersionNumber)
    @ Pkg.Resolve ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Resolve/graphtype.jl:345
  [3] deps_graph(env::Pkg.Types.EnvCache, registries::Vector{…}, uuid_to_name::Dict{…}, reqs::Dict{…}, fixed::Dict{…}, julia_version::VersionNumber, installed_only::Bool)
    @ Pkg.Operations ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Operations.jl:587
  [4] resolve_versions!(env::Pkg.Types.EnvCache, registries::Vector{…}, pkgs::Vector{…}, julia_version::VersionNumber, installed_only::Bool)
    @ Pkg.Operations ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Operations.jl:407
  [5] targeted_resolve(env::Pkg.Types.EnvCache, registries::Vector{…}, pkgs::Vector{…}, preserve::Pkg.Types.PreserveLevel, julia_version::VersionNumber)
    @ Pkg.Operations ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Operations.jl:1361
  [6] tiered_resolve(env::Pkg.Types.EnvCache, registries::Vector{…}, pkgs::Vector{…}, julia_version::VersionNumber, try_all_installed::Bool)
    @ Pkg.Operations ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Operations.jl:1350
  [7] _resolve(io::Base.TTY, env::Pkg.Types.EnvCache, registries::Vector{…}, pkgs::Vector{…}, preserve::Pkg.Types.PreserveLevel, julia_version::VersionNumber)
    @ Pkg.Operations ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Operations.jl:1371
  [8] add(ctx::Pkg.Types.Context, pkgs::Vector{…}, new_git::Set{…}; preserve::Pkg.Types.PreserveLevel, platform::Base.BinaryPlatforms.Platform)
    @ Pkg.Operations ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Operations.jl:1388
  [9] add
    @ ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/Operations.jl:1377 [inlined]
 [10] add(ctx::Pkg.Types.Context, pkgs::Vector{…}; preserve::Pkg.Types.PreserveLevel, platform::Base.BinaryPlatforms.Platform, kwargs::@Kwargs{…})
    @ Pkg.API ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/API.jl:278
 [11] add(pkgs::Vector{Pkg.Types.PackageSpec}; io::Base.TTY, kwargs::@Kwargs{})
    @ Pkg.API ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/API.jl:159
 [12] add(pkgs::Vector{Pkg.Types.PackageSpec})
    @ Pkg.API ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/API.jl:148
 [13] add(pkg::Pkg.Types.PackageSpec)
    @ Pkg.API ~/.julia/juliaup/julia-1.10.4+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Pkg/src/API.jl:146
 [14] top-level scope
    @ REPL[4]:1
Some type information was truncated. Use `show(err)` to see complete types.

@Vaibhavdixit02
Member Author

It was released an hour ago; your registry is probably not updated. Do `]up` and then try `]add` again.
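For reference, the update-then-add sequence looks like this in the REPL (standard Pkg commands; the `@v1.10` prompt is just illustrative):

```julia
# In the Julia REPL, press ] to enter Pkg mode, then:
#   (@v1.10) pkg> up
#   (@v1.10) pkg> add Optimization@3.26.1
# Or equivalently, via the Pkg API:
using Pkg
Pkg.Registry.update()                                # refresh the General registry
Pkg.add(name = "Optimization", version = "3.26.1")   # then pin/add the new release
```

Updating the registry first matters here: `add` resolves against the locally cached registry, so a release cut an hour ago won't be visible until after `up`.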

@cefitzg

cefitzg commented Jun 18, 2024

@Vaibhavdixit02 thanks for your help!

@cefitzg

cefitzg commented Jun 18, 2024

@Vaibhavdixit02, I am getting a retcode of failure across many seeds, but when I look at the warning, it says:

Unrecognized stop reason: CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH 

I think that should actually be coded as success in STOP_REASON_MAP.
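A minimal sketch of the kind of mapping this would need. This is a hypothetical reconstruction: the actual STOP_REASON_MAP keys and lookup logic in Optimization.jl may differ, and the entries below are illustrative, not the package's real table.

```julia
using SciMLBase: ReturnCode

# Hypothetical sketch of a stop-reason -> return-code table for L-BFGS-B.
# Note the escaped `*`: the raw stop reason contains `FACTR*EPSMCH`,
# and `*` is a regex metacharacter.
const STOP_REASON_MAP = Dict(
    r"CONVERGENCE: NORM OF PROJECTED GRADIENT" => ReturnCode.Success,
    r"CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR\*EPSMCH" => ReturnCode.Success,
    r"ABNORMAL_TERMINATION_IN_LNSRCH" => ReturnCode.Failure,
)

function deduce_retcode(stop_reason::AbstractString)
    for (pat, code) in STOP_REASON_MAP
        occursin(pat, stop_reason) && return code
    end
    @warn "Unrecognized stop reason: $stop_reason. Defaulting to ReturnCode.Failure."
    return ReturnCode.Failure
end
```

With an entry like the second one present, the `REL_REDUCTION_OF_F` termination would map to `ReturnCode.Success` instead of falling through to the warning.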

@Vaibhavdixit02
Member Author

Yes, can you do a PR?

@Vaibhavdixit02
Member Author

#781. I'll merge and release it tomorrow morning.

@cefitzg

cefitzg commented Jun 19, 2024

Thanks for handling it! Didn't get to it yesterday.

@cefitzg

cefitzg commented Jun 24, 2024

@Vaibhavdixit02, I see you updated STOP_REASON_MAP with the entry r"Unrecognized stop reason: CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH" => ReturnCode.Success. I did ]up and reinstalled Optimization.jl v3.26.1. When I try this:

adtype = Optimization.AutoForwardDiff()
optf = Optimization.OptimizationFunction((x, theta) -> model_loss(x), adtype)
optprob = Optimization.OptimizationProblem(optf, p0, lb = lb, ub = ub)
@time begin
    sol = solve(optprob, Optimization.LBFGS(), maxiters = 1000)
end

println("this is objective ", sol.objective)
println("this is u ", sol.u)
println("this is retcode", sol.retcode)
println("this is original ", sol.original)
println("this is stats ", sol.stats)
println("this is algo ", sol.alg)

I get

this is objective 27.911580355237593
this is u [1.0e-5, 2.0, 1.0e-5, 101.0, 5.790442804916359, 97.87923987663744, 0.04824302458295835, 0.16774889888027866, 4.379844932471519, 189.60744251129844, 111.92673502133884, 5.295335167975459, 10001.0, 2256.6765792978426, 101.26539066865509, 1.0e-5, 1.0e-5, 3.757626573457553]
this is retcodeFailure
this is original nothing
this is stats SciMLBase.OptimizationStats(1000, 10.29965591430664, 1000, 1000, 0)
this is algo Optimization.LBFGS(10)

I can't tell what the issue is.

  [336ed68f] CSV v0.10.14
  [a93c6f00] DataFrames v1.6.1
  [0c46a032] DifferentialEquations v7.13.0
  [9aa1b823] FastClosures v0.3.2
  [f6369f11] ForwardDiff v0.10.36
  [c27321d9] Glob v1.3.1
  [a5e1c1ea] LatinHypercubeSampling v1.9.0
  [961ee093] ModelingToolkit v9.19.0
  [7f7a1694] Optimization v3.26.1
  [91a5bcdd] Plots v1.40.4
  [1ed8b502] SciMLSensitivity v7.61.1
  [90137ffa] StaticArrays v1.9.5
  [e88e6eb3] Zygote v0.6.70
  [37e2e46d] LinearAlgebra
  [9a3f8284] Random

For the p0 I selected in the snippet above, the warning is triggered:

┌ Warning: Unrecognized stop reason: CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH             . Defaulting to ReturnCode.Failure.
└ @ Optimization ~/.julia/packages/Optimization/ucp7G/src/utils.jl:107

@Vaibhavdixit02
Member Author

There was no release after this PR; I have triggered it now, so you should be able to see it in a few hours.

@cefitzg

cefitzg commented Jun 24, 2024

thanks!

@cefitzg

cefitzg commented Jun 25, 2024

@Vaibhavdixit02, I gave it a whirl this morning and I don't believe the release went through, as I got the same result. Will try again tomorrow or later in the day.

@Vaibhavdixit02
Member Author

Your registry might be outdated; can you try updating and restarting Julia?

https://github.com/SciML/Optimization.jl/releases/tag/v3.26.2

@cefitzg

cefitzg commented Jun 25, 2024

@Vaibhavdixit02 something is still off here. I'll try it again tomorrow. I'm not sure of the issue.

@cefitzg

cefitzg commented Jun 26, 2024

@Vaibhavdixit02, I was able to get the new release:

  [336ed68f] CSV v0.10.14
  [a93c6f00] DataFrames v1.6.1
  [0c46a032] DifferentialEquations v7.13.0
  [9aa1b823] FastClosures v0.3.2
  [f6369f11] ForwardDiff v0.10.36
  [c27321d9] Glob v1.3.1
  [a5e1c1ea] LatinHypercubeSampling v1.9.0
  [961ee093] ModelingToolkit v9.19.0
  [7f7a1694] Optimization v3.26.2
  [1ed8b502] SciMLSensitivity v7.61.1
  [90137ffa] StaticArrays v1.9.5
  [e88e6eb3] Zygote v0.6.70
  [37e2e46d] LinearAlgebra
  [9a3f8284] Random

and all my seeds are converging with the message

┌ Warning: Unrecognized stop reason: CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH             . Defaulting to ReturnCode.Failure.

And yet this is still giving me retcode failure when I run:

println("this is retcode", sol.retcode)

this is retcodeFailure

I am not sure what the issue is, since you updated STOP_REASON_MAP. I don't believe there's a typo, as I compared the entry in STOP_REASON_MAP:

r"Unrecognized stop reason: CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH" => ReturnCode.Success

matches the warning

  Unrecognized stop reason: CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH 

Let me know if you have ideas on what could be the trouble.
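One possibility (an assumption on my part, not confirmed above): the warning text shows a run of trailing blanks after EPSMCH, which suggests the Fortran backend returns a blank-padded task string. An exact comparison against the map key would then fail even though a substring match succeeds. A small illustration:

```julia
# Hypothetical illustration: suppose the Fortran task string is blank-padded,
# as the trailing spaces in the warning suggest.
reason = "CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH             "
key    = "CONVERGENCE: REL_REDUCTION_OF_F_<=_FACTR*EPSMCH"

reason == key                 # false: trailing blanks defeat exact equality
occursin(key, reason)         # true: substring match ignores the padding
occursin(key, strip(reason))  # also true after stripping whitespace
```

If the lookup compares whole strings (or anchors a regex at the end), stripping the stop reason before matching would be one way to make the mapping fire.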

@Vaibhavdixit02
Member Author

You were absolutely right, sorry for the delay. Created a PR that should fix it (for real!) #782

@cefitzg

cefitzg commented Jun 26, 2024

@Vaibhavdixit02, thanks for your help! I'll look for 3.26.3!
