Unknown expression / IR code typeof(var) = Core.SSAValue #174

Closed
penelopeysm opened this issue Oct 21, 2024 · 3 comments · Fixed by #175
penelopeysm (Member) commented Oct 21, 2024

This code fails on both 1.10 and 1.11:

using Turing
@model function g()
    a ~ Normal(0, 1)
    b = 2
end
sample(g(), Gibbs(PG(10, :a), HMC(0.01, 4, :b)), 10)

Note that if we change b to ~ Normal(0, 1) as well, the code runs fine on both 1.10 and 1.11, so the bug has something to do with the plain assignment statement in the model.
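
For reference, here is the working variant just described (a minimal sketch; only the b line changes):

using Turing
@model function g()
    a ~ Normal(0, 1)
    b ~ Normal(0, 1)  # replacing the plain assignment `b = 2` avoids the error
end
sample(g(), Gibbs(PG(10, :a), HMC(0.01, 4, :b)), 10)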

On both Julia versions the error is similar:

┌ Error: Unknown IR code:
│   typeof(var) = Core.SSAValue
│   var = :(%85)
│   typeof(line) = Int64
│   line = 2
└ @ Libtask ~/.julia/packages/Libtask/0vMZ5/src/tapedfunction.jl:449
Sampling 100%|████████████████████████████████████████████████████████████████████| Time: 0:00:02
ERROR: Unknown IR code
Stacktrace:
  [1] translate!!(var::Core.SSAValue, line::Int64, bindings::Vector{Any}, isconst::Bool, ir::Core.CodeInfo)
    @ Libtask ~/.julia/packages/Libtask/0vMZ5/src/tapedfunction.jl:450
  [2] translate!(tape::Vector{Libtask.AbstractInstruction}, ir::Core.CodeInfo)
    @ Libtask ~/.julia/packages/Libtask/0vMZ5/src/tapedfunction.jl:320
  [3] Libtask.TapedFunction{…}(::typeof(g), ::DynamicPPL.Model{…}, ::Vararg{…}; cache::Bool, deepcopy_types::Type)
    @ Libtask ~/.julia/packages/Libtask/0vMZ5/src/tapedfunction.jl:73
  [4] Libtask.TapedFunction(::Function, ::DynamicPPL.Model{…}, ::Vararg{…}; cache::Bool, deepcopy_types::Type)
    @ Libtask ~/.julia/packages/Libtask/0vMZ5/src/tapedfunction.jl:80
  [5] TapedTask(::Function, ::DynamicPPL.Model{…}, ::Vararg{…}; deepcopy_types::Type)
    @ Libtask ~/.julia/packages/Libtask/0vMZ5/src/tapedtask.jl:76
  [6] TapedTask(::Turing.Essential.TracedModel{…}, ::AdvancedPS.TracedRNG{…}; kwargs::@Kwargs{…})
    @ Turing.Essential ~/ppl/lib/src/essential/container.jl:69
  [7] AdvancedPS.LibtaskModel(::Turing.Essential.TracedModel{…}, ::AdvancedPS.TracedRNG{…})
    @ AdvancedPSLibtaskExt ~/.julia/packages/AdvancedPS/qrjIF/ext/AdvancedPSLibtaskExt.jl:27
  [8] Trace
    @ ~/.julia/packages/AdvancedPS/qrjIF/ext/AdvancedPSLibtaskExt.jl:49 [inlined]
  [9] AdvancedPS.Trace(model::DynamicPPL.Model{…}, sampler::DynamicPPL.Sampler{…}, varinfo::DynamicPPL.TypedVarInfo{…}, rng::AdvancedPS.TracedRNG{…})
    @ Turing.Inference ~/ppl/lib/src/mcmc/particle_mcmc.jl:439
 [10] (::Turing.Inference.var"#66#67"{DynamicPPL.Model{…}, DynamicPPL.Sampler{…}, DynamicPPL.TypedVarInfo{…}})(::Int64)
    @ Turing.Inference ./none:0
 [11] iterate
    @ ./generator.jl:48 [inlined]
 [12] collect(itr::Base.Generator{UnitRange{…}, Turing.Inference.var"#66#67"{…}})
    @ Base ./array.jl:780
 [13] initialstep(rng::Random.TaskLocalRNG, model::DynamicPPL.Model{…}, spl::DynamicPPL.Sampler{…}, vi::DynamicPPL.TypedVarInfo{…}; kwargs::@Kwargs{…})
    @ Turing.Inference ~/ppl/lib/src/mcmc/particle_mcmc.jl:276
 [14] (::Turing.Inference.var"#75#77"{…})(local_spl::DynamicPPL.Sampler{…})
    @ Turing.Inference ~/ppl/lib/src/mcmc/gibbs.jl:208
 [15] map
    @ ./tuple.jl:356 [inlined]
 [16] initialstep(rng::Random.TaskLocalRNG, model::DynamicPPL.Model{…}, spl::DynamicPPL.Sampler{…}, vi::DynamicPPL.TypedVarInfo{…}; kwargs::@Kwargs{…})
    @ Turing.Inference ~/ppl/lib/src/mcmc/gibbs.jl:196
 [17] step(rng::Random.TaskLocalRNG, model::DynamicPPL.Model{…}, spl::DynamicPPL.Sampler{…}; initial_params::Nothing, kwargs::@Kwargs{})
    @ DynamicPPL ~/.julia/packages/DynamicPPL/ooLj8/src/sampler.jl:116
 [18] macro expansion
    @ ~/.julia/packages/DynamicPPL/ooLj8/src/sampler.jl:0 [inlined]
 [19] macro expansion
    @ ~/.julia/packages/ProgressLogging/6KXlp/src/ProgressLogging.jl:328 [inlined]
 [20] (::AbstractMCMC.var"#24#25"{…})()
    @ AbstractMCMC ~/.julia/packages/AbstractMCMC/jSkbw/src/logging.jl:12
 [21] with_logstate(f::AbstractMCMC.var"#24#25"{…}, logstate::Base.CoreLogging.LogState)
    @ Base.CoreLogging ./logging/logging.jl:522
 [22] with_logger(f::Function, logger::LoggingExtras.TeeLogger{Tuple{…}})
    @ Base.CoreLogging ./logging/logging.jl:632
 [23] with_progresslogger(f::Function, _module::Module, logger::Base.CoreLogging.ConsoleLogger)
    @ AbstractMCMC ~/.julia/packages/AbstractMCMC/jSkbw/src/logging.jl:36
 [24] macro expansion
    @ ~/.julia/packages/AbstractMCMC/jSkbw/src/logging.jl:11 [inlined]
 [25] mcmcsample(rng::Random.TaskLocalRNG, model::DynamicPPL.Model{…}, sampler::DynamicPPL.Sampler{…}, N::Int64; progress::Bool, progressname::String, callback::Nothing, num_warmup::Int64, discard_initial::Int64, thinning::Int64, chain_type::Type, initial_state::Nothing, kwargs::@Kwargs{})
    @ AbstractMCMC ~/.julia/packages/AbstractMCMC/jSkbw/src/sample.jl:142
 [26] sample(rng::Random.TaskLocalRNG, model::DynamicPPL.Model{…}, sampler::DynamicPPL.Sampler{…}, N::Int64; chain_type::Type, resume_from::Nothing, initial_state::Nothing, kwargs::@Kwargs{})
    @ DynamicPPL ~/.julia/packages/DynamicPPL/ooLj8/src/sampler.jl:93
 [27] sample
    @ ~/.julia/packages/DynamicPPL/ooLj8/src/sampler.jl:83 [inlined]
 [28] #sample#4
    @ ~/ppl/lib/src/mcmc/Inference.jl:305 [inlined]
 [29] sample
    @ ~/ppl/lib/src/mcmc/Inference.jl:296 [inlined]
 [30] #sample#3
    @ ~/ppl/lib/src/mcmc/Inference.jl:293 [inlined]
 [31] sample(model::DynamicPPL.Model{…}, alg::Gibbs{…}, N::Int64)
    @ Turing.Inference ~/ppl/lib/src/mcmc/Inference.jl:290
 [32] top-level scope
    @ REPL[4]:1
Some type information was truncated. Use `show(err)` to see complete types.
penelopeysm changed the title from "Unknown expression typeof(var) = Core.SSAValue" to "Unknown expression / IR code typeof(var) = Core.SSAValue" on Oct 22, 2024.
penelopeysm (Member, Author) commented:
This code fails on 1.11 only, and it's causing test failures upstream in TuringLang/Turing.jl#2341:

using Turing
using Turing.RandomMeasures: ChineseRestaurantProcess, DirichletProcess
@model function imm(y, alpha, ::Type{M}=Vector{Float64}) where {M}
    N = length(y)
    rpm = DirichletProcess(alpha)
    z = zeros(Int, N)
    cluster_counts = zeros(Int, N)
    fill!(cluster_counts, 0)
    for i in 1:N
        z[i] ~ ChineseRestaurantProcess(rpm, cluster_counts)
        cluster_counts[z[i]] += 1
    end
    Kmax = findlast(!iszero, cluster_counts)
    m = M(undef, Kmax)
    for k in 1:Kmax
        m[k] ~ Normal(1.0, 1.0)
    end
end
model = imm(randn(100), 1.0)
sample(model, Gibbs(PG(10, :z), HMC(0.01, 4, :m)), 100)

The error is slightly different from the one above:

┌ Error: Unknown Expression: 
│   typeof(var) = Core.SSAValue
│   var = :(%154)
│   typeof(line) = Expr
│   line = :($(Expr(:static_parameter, 1)))
└ @ Libtask ~/.julia/packages/Libtask/0vMZ5/src/tapedfunction.jl:443


KDr2 (Member) commented Oct 22, 2024

@penelopeysm Could you try this PR #175 on these cases?
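
(For anyone reproducing this locally, a minimal sketch of one way to try the PR, assuming the repository is TuringLang/Libtask.jl; the local path and the branch name pr-175 below are hypothetical labels:)

# Shell steps, run outside Julia:
#   git clone https://github.com/TuringLang/Libtask.jl
#   cd Libtask.jl && git fetch origin pull/175/head:pr-175 && git checkout pr-175
using Pkg
Pkg.develop(path="path/to/Libtask.jl")  # point the active environment at the checked-out PR
# then re-run the failing examples above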

KDr2 (Member) commented Oct 22, 2024

This code fails on both 1.10 and 1.11:

using Turing
@model function g()
    a ~ Normal(0, 1)
    b = 2
end
sample(g(), Gibbs(PG(10, :a), HMC(0.01, 4, :b)), 10)

Do you know what this model looks like after it is expanded? I want to create a new test case for it.
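
(A minimal sketch of one way to inspect the expansion, assuming the standard @model macro from Turing/DynamicPPL; this may not be exactly the form needed for a test case:)

using Turing
# @macroexpand returns the expression the @model macro generates, without evaluating it
expanded = @macroexpand @model function g()
    a ~ Normal(0, 1)
    b = 2
end
println(expanded)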

And also for this one:

@model function imm(y, alpha, ::Type{M}=Vector{Float64}) where {M}
    N = length(y)
    rpm = DirichletProcess(alpha)
    z = zeros(Int, N)
    cluster_counts = zeros(Int, N)
    fill!(cluster_counts, 0)
    for i in 1:N
        z[i] ~ ChineseRestaurantProcess(rpm, cluster_counts)
        cluster_counts[z[i]] += 1
    end
    Kmax = findlast(!iszero, cluster_counts)
    m = M(undef, Kmax)
    for k in 1:Kmax
        m[k] ~ Normal(1.0, 1.0)
    end
end

@penelopeysm @yebai
Thanks.

KDr2 self-assigned this on Oct 22, 2024.
KDr2 linked a pull request on Oct 22, 2024 that will close this issue.