multipathfinder crashes if PosDefException is thrown #181

Open
sebapersson opened this issue May 14, 2024 · 5 comments

@sebapersson

Thanks for this package; it's great that Julia has a Pathfinder implementation!

Regarding the issue: when running multipathfinder for a hard-to-integrate ODE model, a PosDefException is thrown for one of the initial points, which causes the entire multipathfinder call to fail.

I think having an entire, potentially long, multipathfinder run fail because of a single bad start guess is not ideal. So maybe there should be some exception handling in multipathfinder that allows a bad initial point to be discarded? This exception handling should cover both the inverse-Hessian computation and the optimization run (I have seen that Optim.LBFGS can sometimes throw an error as well). To alert the user, a warning could be emitted when this happens.
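For illustration, a user-side stopgap could look roughly like the sketch below. It is only a sketch: it assumes pathfinder accepts a single initial point via an init keyword, and it skips the importance-resampling step that multipathfinder performs at the end.

# Rough user-side stopgap (not a Pathfinder feature): run single-path
# pathfinder once per start guess and discard the runs that throw, e.g. with
# a PosDefException, so one bad start guess does not abort everything.
# Assumes `pathfinder` accepts the initial point via an `init` keyword.
single_results = []
for (i, x0) in enumerate(initial_points)
    try
        push!(single_results, pathfinder(target; init = copy(x0)))
    catch err
        err isa InterruptException && rethrow()
        @warn "Discarding start guess $i" exception = err
    end
end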

MVE (it can take a bit of time to run):

using LogDensityProblems, LogDensityProblemsAD, MCMCChains, Bijectors,
      Distributions, ModelingToolkit, OrdinaryDiffEq, DataFrames,
      PEtab, Pathfinder

@parameters d a ω x1_0 x2_0
@variables t x1(t) x2(t)
D = Differential(t)
eqs = [
    D(x1) ~ x2,
    D(x2) ~ -d*(x1^2 - 1)*x2 - x1 + a*cos(2*ω*t)
]
specie_map = [x1 => 0, x2 => 0.0]
parameter_map = [a => 5, d => 5, ω => 2.464]
@named sys = ODESystem(eqs, defaults=Dict(x1 => x1_0, x2 => x2_0))

import StableRNGs
rng = StableRNGs.StableRNG(123)
# Simulate the model
oprob = ODEProblem(sys, specie_map, (0.0, 200.0), parameter_map)
tsave = collect(range(0.0, 200.0, 101))
dist1 = Normal(0.0, 0.2)
dist2 = Normal(0.0, 0.8)
_sol = solve(oprob, Rodas4(), abstol=1e-12, reltol=1e-12, saveat=tsave, tstops=tsave)
obs1 = _sol[:x1] .+ rand(rng, dist1, length(tsave))
obs2 = _sol[:x2] .+ rand(rng, dist2, length(tsave))

## Setup the parameter estimation problem
@parameters sigma1 sigma2
obs_x1 = PEtabObservable(x1, sigma1)
obs_x2 = PEtabObservable(x2, sigma2)
observables = Dict("obs_x1" => obs_x1, "obs_x2" => obs_x2)

_a = PEtabParameter(a, value=5.0, lb=2.0, ub=8.0, scale=:lin)
_d = PEtabParameter(d, value=5.0, lb=2.0, ub=8.0, scale=:lin)
_ω = PEtabParameter(ω, value=2.464, lb=2.0, ub=8.0, scale=:lin)
_x1_0 = PEtabParameter(x1_0, value=0.0, lb=-1.0, ub=3.0, scale=:lin)
_x2_0 = PEtabParameter(x2_0, value=0.0, lb=-1.0, ub=3.0, scale=:lin)
_sigma1 = PEtabParameter(sigma1, value=0.2, lb=1e-2, ub=2.0, scale=:lin)
_sigma2 = PEtabParameter(sigma2, value=0.8, lb=1e-2, ub=2.0, scale=:lin)
pest = [_a, _d, _ω, _x1_0, _x2_0, _sigma1, _sigma2]
_measurements1 = DataFrame(
        obs_id="obs_x1",
        time=_sol.t,
        measurement=obs1)
_measurements2 = DataFrame(
        obs_id="obs_x2",
        time=_sol.t,
        measurement=obs2)
measurements = vcat(_measurements1, _measurements2)

petab_model = PEtabModel(sys, observables, measurements, pest)
prob = PEtabODEProblem(petab_model,
                       ode_solver=ODESolver(AutoVern7(Rodas5()), abstol=1e-6, reltol=1e-6, maxiters=Int64(1e5)))
@unpack lower_bounds, upper_bounds = prob

target = PEtabLogDensity(prob)
nstartguesses = 10000
rng = StableRNGs.StableRNG(12)
# Uniformly sampled initial points ~ U(lb, ub)
_initial_points = [rand(rng, 7) .* (upper_bounds - lower_bounds) + lower_bounds for i in 1:nstartguesses]
initial_points = [target.inference_info.bijectors(x) for x in _initial_points]

pathfinder_res = multipathfinder(target, nstartguesses*2; init = initial_points)

With output:

ERROR: PosDefException: matrix is not positive definite; Cholesky factorization failed.
Stacktrace:
  [1] checkpositivedefinite
    @ ~/julia/usr/share/julia/stdlib/v1.10/LinearAlgebra/src/factorization.jl:67 [inlined]
  [2] cholesky!(A::LinearAlgebra.Symmetric{Float64, Matrix{Float64}}, ::LinearAlgebra.NoPivot; check::Bool)
    @ LinearAlgebra ~/julia/usr/share/julia/stdlib/v1.10/LinearAlgebra/src/cholesky.jl:269
  [3] cholesky! (repeats 2 times)
    @ ~/julia/usr/share/julia/stdlib/v1.10/LinearAlgebra/src/cholesky.jl:267 [inlined]
  [4] cholesky(A::LinearAlgebra.Symmetric{Float64, Matrix{Float64}}, ::LinearAlgebra.NoPivot; check::Bool)
    @ LinearAlgebra ~/julia/usr/share/julia/stdlib/v1.10/LinearAlgebra/src/cholesky.jl:401
  [5] cholesky (repeats 2 times)
    @ ~/julia/usr/share/julia/stdlib/v1.10/LinearAlgebra/src/cholesky.jl:401 [inlined]
  [6] pdfactorize(A::LinearAlgebra.Diagonal{Float64, Vector{Float64}}, B::Matrix{Float64}, D::Matrix{Float64})
    @ Pathfinder ~/.julia/packages/Pathfinder/WNKEJ/src/woodbury.jl:205
  [7] WoodburyPDMat
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/woodbury.jl:262 [inlined]
  [8] lbfgs_inverse_hessian(H₀::LinearAlgebra.Diagonal{…}, S0::Matrix{…}, Y0::Matrix{…}, history_ind::Int64, history_length::Int64)
    @ Pathfinder ~/.julia/packages/Pathfinder/WNKEJ/src/inverse_hessian.jl:133
  [9] lbfgs_inverse_hessians(θs::Vector{…}, ∇logpθs::Vector{…}; Hinit::typeof(Pathfinder.gilbert_init), history_length::Int64, ϵ::Float64)
    @ Pathfinder ~/.julia/packages/Pathfinder/WNKEJ/src/inverse_hessian.jl:61
 [10] lbfgs_inverse_hessians
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/inverse_hessian.jl:25 [inlined]
 [11] #fit_mvnormals#11
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/mvnormal.jl:15 [inlined]
 [12] fit_mvnormals
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/mvnormal.jl:14 [inlined]
 [13] _pathfinder(rng::Random._GLOBAL_RNG, prob::OptimizationProblem{…}, logp::Pathfinder.var"#logp#23"{}; history_length::Int64, optimizer::Optim.LBFGS{…}, ndraws_elbo::Int64, executor::Transducers.SequentialEx{…}, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:293
 [14] _pathfinder
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:277 [inlined]
 [15] _pathfinder_try_until_succeed(rng::Random._GLOBAL_RNG, prob::OptimizationProblem{…}, logp::Pathfinder.var"#logp#23"{}; ntries::Int64, init_scale::Int64, init_sampler::Pathfinder.UniformSampler{…}, allow_mutating_init::Bool, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:263
 [16] _pathfinder_try_until_succeed
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:251 [inlined]
 [17] #22
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:180 [inlined]
 [18] progress(f::Pathfinder.var"#22#24"{}; name::String)
    @ ProgressLogging ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:262
 [19] progress
    @ ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:258 [inlined]
 [20] pathfinder(prob::OptimizationProblem{…}; rng::Random._GLOBAL_RNG, history_length::Int64, optimizer::Optim.LBFGS{…}, ndraws_elbo::Int64, ndraws::Int64, input::Function, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:179
 [21] pathfinder
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:165 [inlined]
 [22] #pathfinder#20
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:163 [inlined]
 [23] pathfinder
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/singlepath.jl:142 [inlined]
 [24] #29
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/multipath.jl:166 [inlined]
 [25] next
    @ ~/.julia/packages/Transducers/IAWgA/src/library.jl:54 [inlined]
 [26] (::Transducers.var"#50#51"{})(u0::Tuple{…}, iresult::BangBang.SafeCollector{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/library.jl:1302
 [27] wrapping
    @ ~/.julia/packages/Transducers/IAWgA/src/core.jl:734 [inlined]
 [28] next(rf::Transducers.Reduction{…}, result::Transducers.PrivateState{…}, input::Tuple{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/library.jl:1300
 [29] macro expansion
    @ ~/.julia/packages/Transducers/IAWgA/src/core.jl:181 [inlined]
 [30] macro expansion
    @ ~/.julia/packages/Transducers/IAWgA/src/processes.jl:239 [inlined]
 [31] macro expansion
    @ ~/.julia/packages/Transducers/IAWgA/src/simd.jl:41 [inlined]
 [32] __foldl__
    @ ~/.julia/packages/Transducers/IAWgA/src/processes.jl:238 [inlined]
 [33] (::Transducers.var"#257#259"{})(id::Symbol)
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/progress.jl:95
 [34] __progress(f::Transducers.var"#257#259"{}; name::String)
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/progress.jl:74
 [35] __progress
    @ ~/.julia/packages/Transducers/IAWgA/src/progress.jl:70 [inlined]
 [36] __foldl__
    @ ~/.julia/packages/Transducers/IAWgA/src/progress.jl:81 [inlined]
 [37] #transduce#141
    @ ~/.julia/packages/Transducers/IAWgA/src/processes.jl:519 [inlined]
 [38] transduce(rf1::Transducers.Reduction{…}, init::BangBang.SafeCollector{…}, coll::Transducers.ProgressLoggingFoldable{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/processes.jl:508
 [39] transduce(xform::Transducers.Composition{…}, f::Transducers.AdHocRF{…}, init::BangBang.SafeCollector{…}, coll::Transducers.ProgressLoggingFoldable{…}; kwargs::@Kwargs{})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/processes.jl:502
 [40] transduce(xform::Transducers.Composition{…}, f::Transducers.AdHocRF{…}, init::BangBang.SafeCollector{…}, coll::Transducers.ProgressLoggingFoldable{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/processes.jl:500
 [41] _collect(xf::Transducers.Map{…}, coll::Transducers.ProgressLoggingFoldable{…}, ::Transducers.SizeStable, ::Base.HasShape{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/processes.jl:806
 [42] collect
    @ ~/.julia/packages/Transducers/IAWgA/src/processes.jl:802 [inlined]
 [43] collect(itr::Transducers.Eduction{…}, ex::Transducers.SequentialEx{…})
    @ Folds.Implementations ~/.julia/packages/Folds/qbSal/src/collect.jl:10
 [44] multipathfinder(optim_fun::OptimizationFunction{…}, ndraws::Int64; init::Vector{…}, input::PEtabLogDensity{…}, nruns::Int64, ndraws_elbo::Int64, ndraws_per_run::Int64, rng::Random._GLOBAL_RNG, history_length::Int64, optimizer::Optim.LBFGS{…}, executor::Transducers.SequentialEx{…}, executor_per_run::Transducers.SequentialEx{…}, importance::Bool, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/WNKEJ/src/multipath.jl:188
 [45] multipathfinder
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/multipath.jl:132 [inlined]
 [46] #multipathfinder#27
    @ ~/.julia/packages/Pathfinder/WNKEJ/src/multipath.jl:130 [inlined]
 [47] top-level scope
    @ ~/Dropbox/PhD/Projects/Bayesian_benchmark/Code/Simulated_models/MVE.jl:63
Some type information was truncated. Use `show(err)` to see complete types.

I ran everything using Julia 1.10 on Ubuntu with the latest possible versions of each package.

@sethaxen
Member

I have mixed feelings about this. We do have a resample-and-restart mechanism, so we could suppress errors, but the source of the error matters. We should detect and gracefully handle all errors coming from a failed optimization; clearly in this case we don't, so I'll try to identify which case we are missing.

Errors in model evaluation, however, are another matter. I lean towards not suppressing these errors, as they do reflect problems that should be resolved, and an error message won't be as useful as a traceback (FWIW, Turing won't suppress these either). Instead, the model implementer should detect the errors that are acceptable and work around them, if possible.
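For illustration only, a minimal sketch of that model-side handling, assuming the target implements the LogDensityProblems interface; the RobustTarget wrapper and the set of exceptions treated as acceptable are made up for the example:

using LogDensityProblems, LinearAlgebra

# Illustrative wrapper: map failures the modeler has decided are acceptable to
# a log-density of -Inf instead of letting the exception propagate.
struct RobustTarget{T}
    inner::T
end

LogDensityProblems.dimension(t::RobustTarget) = LogDensityProblems.dimension(t.inner)
# Only zero-order evaluation is forwarded here; a wrapper around a
# gradient-providing target would also forward logdensity_and_gradient with
# the same guard and declare LogDensityOrder{1}().
LogDensityProblems.capabilities(::Type{<:RobustTarget}) = LogDensityProblems.LogDensityOrder{0}()

function LogDensityProblems.logdensity(t::RobustTarget, x)
    try
        return LogDensityProblems.logdensity(t.inner, x)
    catch err
        # Only swallow exception types deemed acceptable by the modeler.
        err isa Union{PosDefException,DomainError} || rethrow()
        return -Inf
    end
end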

I'm traveling until the end of the month, so I will look into this then. In the meantime, if this is a blocker for you: the issue is probably a non-finite number appearing in a position, gradient, or log-density value. You may be able to detect this in a callback and return true to trigger early termination.
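For example, something along these lines might work. This is a sketch only: it assumes extra keywords such as callback are forwarded through to the Optimization.jl solve (the stack trace above shows a callback keyword in Pathfinder's optimize_with_trace), and the exact callback signature depends on the Optimization.jl version installed.

# Sketch of a termination callback: request early termination as soon as a
# non-finite objective value or parameter shows up. Assumes the newer
# Optimization.jl callback form `(state, loss) -> Bool`, where `state.u` holds
# the current position; older Optimization.jl versions pass the position
# vector itself as the first argument.
nonfinite_callback(state, loss) = !isfinite(loss) || any(!isfinite, state.u)

pathfinder_res = multipathfinder(target, nstartguesses * 2;
                                 init = initial_points,
                                 callback = nonfinite_callback)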

@sebapersson
Author

Thanks for the fast reply!

I agree with you on this. Errors from a failed optimization should be caught, while errors from the model should be handled appropriately by the modeling package.

@sebapersson
Author

As an update to this issue: for another ODE model I encountered an Optim.jl-related error caused by a bad start guess (see below). The error is rare, but it can happen for ODE models with a bad start guess.

ERROR: AssertionError: isfinite(phi_c) && isfinite(dphi_c)
Stacktrace:
  [1] (::LineSearches.HagerZhang{…})(ϕ::Function, ϕdϕ::LineSearches.var"#ϕdϕ#6"{}, c::Float64, phi_0::Float64, dphi_0::Float64)
    @ LineSearches ~/.julia/packages/LineSearches/G1LRk/src/hagerzhang.jl:299
  [2] HagerZhang
    @ ~/.julia/packages/LineSearches/G1LRk/src/hagerzhang.jl:101 [inlined]
  [3] perform_linesearch!(state::Optim.LBFGSState{…}, method::Optim.LBFGS{…}, d::Optim.ManifoldObjective{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/utilities/perform_linesearch.jl:58
  [4] update_state!(d::NLSolversBase.TwiceDifferentiable{…}, state::Optim.LBFGSState{…}, method::Optim.LBFGS{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/solvers/first_order/l_bfgs.jl:204
  [5] optimize(d::NLSolversBase.TwiceDifferentiable{…}, initial_x::Vector{…}, method::Optim.LBFGS{…}, options::Optim.Options{…}, state::Optim.LBFGSState{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/optimize/optimize.jl:54
  [6] optimize(d::NLSolversBase.TwiceDifferentiable{…}, initial_x::Vector{…}, method::Optim.LBFGS{…}, options::Optim.Options{…})
    @ Optim ~/.julia/packages/Optim/ZhuZN/src/multivariate/optimize/optimize.jl:36
  [7] __solve(cache::OptimizationBase.OptimizationCache{…})
    @ OptimizationOptimJL ~/.julia/packages/OptimizationOptimJL/fdBis/src/OptimizationOptimJL.jl:214
  [8] solve!
    @ ~/.julia/packages/SciMLBase/JUp1I/src/solve.jl:188 [inlined]
  [9] #solve#625
    @ ~/.julia/packages/SciMLBase/JUp1I/src/solve.jl:96 [inlined]
 [10] optimize_with_trace(prob::OptimizationProblem{…}, optimizer::Optim.LBFGS{…}; progress_name::String, progress_id::Base.UUID, maxiters::Int64, callback::Nothing, fail_on_nonfinite::Bool, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/p5Vuk/src/optimize.jl:46
 [11] optimize_with_trace
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/optimize.jl:20 [inlined]
 [12] _pathfinder(rng::Random._GLOBAL_RNG, prob::OptimizationProblem{…}, logp::Pathfinder.var"#logp#23"{}; history_length::Int64, optimizer::Optim.LBFGS{…}, ndraws_elbo::Int64, executor::Transducers.SequentialEx{…}, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:288
 [13] _pathfinder
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:277 [inlined]
 [14] _pathfinder_try_until_succeed(rng::Random._GLOBAL_RNG, prob::OptimizationProblem{…}, logp::Pathfinder.var"#logp#23"{}; ntries::Int64, init_scale::Int64, init_sampler::Pathfinder.UniformSampler{…}, allow_mutating_init::Bool, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:263
 [15] _pathfinder_try_until_succeed
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:251 [inlined]
 [16] #22
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:180 [inlined]
 [17] progress(f::Pathfinder.var"#22#24"{}; name::String)
    @ ProgressLogging ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:262
 [18] progress
    @ ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:258 [inlined]
 [19] pathfinder(prob::OptimizationProblem{…}; rng::Random._GLOBAL_RNG, history_length::Int64, optimizer::Optim.LBFGS{…}, ndraws_elbo::Int64, ndraws::Int64, input::Function, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:179
 [20] pathfinder
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:165 [inlined]
 [21] #pathfinder#20
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:163 [inlined]
 [22] pathfinder
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/singlepath.jl:142 [inlined]
 [23] #29
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/multipath.jl:166 [inlined]
 [24] next
    @ ~/.julia/packages/Transducers/IAWgA/src/library.jl:54 [inlined]
 [25] (::Transducers.var"#50#51"{})(u0::Tuple{…}, iresult::BangBang.SafeCollector{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/library.jl:1302
 [26] wrapping
    @ ~/.julia/packages/Transducers/IAWgA/src/core.jl:734 [inlined]
 [27] next(rf::Transducers.Reduction{…}, result::Transducers.PrivateState{…}, input::Tuple{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/library.jl:1300
 [28] macro expansion
    @ ~/.julia/packages/Transducers/IAWgA/src/core.jl:181 [inlined]
 [29] macro expansion
    @ ~/.julia/packages/Transducers/IAWgA/src/processes.jl:239 [inlined]
 [30] macro expansion
    @ ~/.julia/packages/Transducers/IAWgA/src/simd.jl:41 [inlined]
 [31] __foldl__
    @ ~/.julia/packages/Transducers/IAWgA/src/processes.jl:238 [inlined]
 [32] (::Transducers.var"#257#259"{})(id::Symbol)
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/progress.jl:95
 [33] __progress(f::Transducers.var"#257#259"{}; name::String)
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/progress.jl:74
 [34] __progress
    @ ~/.julia/packages/Transducers/IAWgA/src/progress.jl:70 [inlined]
 [35] __foldl__
    @ ~/.julia/packages/Transducers/IAWgA/src/progress.jl:81 [inlined]
 [36] #transduce#141
    @ ~/.julia/packages/Transducers/IAWgA/src/processes.jl:519 [inlined]
 [37] transduce(rf1::Transducers.Reduction{…}, init::BangBang.SafeCollector{…}, coll::Transducers.ProgressLoggingFoldable{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/processes.jl:508
 [38] transduce(xform::Transducers.Composition{…}, f::Transducers.AdHocRF{…}, init::BangBang.SafeCollector{…}, coll::Transducers.ProgressLoggingFoldable{…}; kwargs::@Kwargs{})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/processes.jl:502
 [39] transduce(xform::Transducers.Composition{…}, f::Transducers.AdHocRF{…}, init::BangBang.SafeCollector{…}, coll::Transducers.ProgressLoggingFoldable{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/processes.jl:500
 [40] _collect(xf::Transducers.Map{…}, coll::Transducers.ProgressLoggingFoldable{…}, ::Transducers.SizeStable, ::Base.HasShape{…})
    @ Transducers ~/.julia/packages/Transducers/IAWgA/src/processes.jl:806
 [41] collect
    @ ~/.julia/packages/Transducers/IAWgA/src/processes.jl:802 [inlined]
 [42] collect(itr::Transducers.Eduction{…}, ex::Transducers.SequentialEx{…})
    @ Folds.Implementations ~/.julia/packages/Folds/qbSal/src/collect.jl:10
 [43] multipathfinder(optim_fun::OptimizationFunction{…}, ndraws::Int64; init::Vector{…}, input::PEtabLogDensity{…}, nruns::Int64, ndraws_elbo::Int64, ndraws_per_run::Int64, rng::Random._GLOBAL_RNG, history_length::Int64, optimizer::Optim.LBFGS{…}, executor::Transducers.SequentialEx{…}, executor_per_run::Transducers.SequentialEx{…}, importance::Bool, kwargs::@Kwargs{})
    @ Pathfinder ~/.julia/packages/Pathfinder/p5Vuk/src/multipath.jl:188
 [44] multipathfinder
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/multipath.jl:132 [inlined]
 [45] #multipathfinder#27
    @ ~/.julia/packages/Pathfinder/p5Vuk/src/multipath.jl:130 [inlined]
 [46] _get_starting_points(petab_problem::PEtabODEProblem{…}, dir_save::String; nstartguesses::Int64, van_der_pool::Bool)
    @ Main ~/Dropbox/PhD/Projects/Bayesian_benchmark/Benchmarks/Get_startguesses.jl:41
 [47] top-level scope
    @ ~/Dropbox/PhD/Projects/Bayesian_benchmark/Benchmarks/Test_Boehm.jl:11
Some type information was truncated. Use `show(err)` to see complete types.

@sethaxen
Member

I just tried to test this with the latest versions of all packages on Julia v1.10, but the MVE failed to load because ModelingToolkit failed to precompile. Do you happen to have a Manifest.toml you can share, or a smaller MVE?

@sebapersson
Author

Sorry for the late reply (I missed this in my notifications). Hopefully the attached manifest file (Manifest.zip) makes it run.

If this does not work, I can try to create a smaller MVE (it is hard, though, since this error mainly seems to occur with a hard-to-evaluate likelihood). Alternatively, I can try, via some StableRNG hacking, to set up an MVE where the failure happens faster (once I have updated PEtab to the new ModelingToolkit version, which should resolve the MTK problem).
