Rethinking SMC #2789

Draft
charlesknipp wants to merge 10 commits into main from ck/smc

Conversation

@charlesknipp
Member

Following the latest major release of DynamicPPL, I finally scraped together my ideas for the long-awaited restructuring of SMC within Turing.

Major Changes

  • TracedModel works quite differently under the hood. Libtask is still a central component of this mechanism, but the produce call from accumulation is now delayed to ensure the VarInfo is caught up on log-likelihoods.
  • SMC no longer uses AbstractMCMC to interface; however ParticleGibbs still does.
  • We no longer use AdvancedPS as a dependency, opting instead to meld DynamicPPL directly with SMC algorithms. If one wants a more general interface outside of DynamicPPL, we direct the interested user to GeneralisedFilters.
  • An interface for particle rejuvenation has been added to the SMC stack, which would allow MCMC algorithms to explore the parameter space in case of particle degeneracy.
  • AbstractMCMCEnsemble now interacts with SMC samplers during the reweighting step, which is designed to operate in parallel.
  • The Libtask + DynamicPPL stack leaves a much smaller footprint than before.
  • TracedRNG has been removed in favor of taped global RNG manipulation to facilitate allocation-efficient replayability of referenced trajectories in Particle Gibbs.
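For readers unfamiliar with the reweighting step mentioned above, here is a rough, self-contained sketch of what it computes — the function name and signature are illustrative only, not the PR's actual code:

```julia
# Illustrative sketch, not the PR's implementation: `logws` are the
# per-particle incremental log-likelihoods collected at each produce
# checkpoint; reweighting normalizes them and reports the effective
# sample size used to trigger resampling/rejuvenation.
function reweight_sketch(logws::Vector{Float64})
    m = maximum(logws)
    ws = exp.(logws .- m)      # numerically stabilized softmax
    ws ./= sum(ws)             # normalized weights
    ess = 1 / sum(abs2, ws)    # effective sample size
    return ws, ess
end

# With equal log-weights all particles are equally weighted and ESS = N:
ws, ess = reweight_sketch(zeros(4))   # ws == fill(0.25, 4), ess == 4.0
```

Since each particle's weight depends only on its own log-likelihood, this map is embarrassingly parallel, which is what makes the ensemble hook a natural fit.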

TODO List

The original interface is largely replicated. There are a handful of minor tweaks I still need to make in order to better interface with some internal methods.

  • Finish AbstractMCMC wrapper for Particle Gibbs
  • Finish particle rejuvenation
    • Either a proper implementation of PartialLogDensity, or something to facilitate a partially observed model
    • Generalize rejuvenation to support external samplers
  • Link smcsample with MCMCChains/FlexiChains so that it's consistent with the rest of the module
  • Write unit tests
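For the MCMCChains item above, the glue could look roughly like this — a sketch under the assumption that each particle's parameter values can be flattened into an iterations × parameters matrix (the matrix and names below are stand-ins, not the PR's API):

```julia
using MCMCChains

# Hypothetical shape: 512 particles × 2 parameters (β, σ), flattened
# from the particles' VarInfos. `randn` is a stand-in for real draws.
vals = randn(512, 2)
chn = Chains(vals, [:β, :σ])   # wrap as a Chains object with 1 chain

size(chn)   # (512, 2, 1): iterations × parameters × chains
```

Weighted particles would additionally need either a resampling pass or a log-weight column before they behave like equally weighted MCMC draws.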

Notes

This is all based on my self-contained reimplementation here, which serves more as a workspace for demonstration and experimentation. You can think of this PR as a subset of my personal repo. With that being said, I have a couple of questions on proper integration:

  1. Should I move my DynamicPPL methods over there?
  2. What opinions do you have on the proposed interface changes?

Lastly, I would really appreciate help with the realization of particle rejuvenation. I have a demo over on TuringSMC which showcases a proof of concept.

@github-actions
Contributor

Turing.jl documentation for PR #2789 is available at:
https://TuringLang.github.io/Turing.jl/previews/PR2789/

@codecov

codecov bot commented Mar 13, 2026

Codecov Report

❌ Patch coverage is 0% with 176 lines in your changes missing coverage. Please review.
✅ Project coverage is 20.21%. Comparing base (33e1a22) to head (80bfac8).
⚠️ Report is 1 commit behind head on main.

Files with missing lines Patch % Lines
src/mcmc/smc.jl 0.00% 174 Missing ⚠️
src/mcmc/gibbs.jl 0.00% 2 Missing ⚠️

❗ There is a different number of reports uploaded between BASE (33e1a22) and HEAD (80bfac8). Click for more details.

HEAD has 18 fewer uploads than BASE: 24 uploads on BASE (33e1a22) vs 6 on HEAD (80bfac8).
Additional details and impacted files
@@             Coverage Diff             @@
##             main    #2789       +/-   ##
===========================================
- Coverage   86.34%   20.21%   -66.13%     
===========================================
  Files          22       23        +1     
  Lines        1435     1588      +153     
===========================================
- Hits         1239      321      -918     
- Misses        196     1267     +1071     

☔ View full report in Codecov by Sentry.

@charlesknipp
Member Author

charlesknipp commented Mar 18, 2026

@penelopeysm I am running into the following issue while touching up my implementation. Suppose we run the following simple example:

using Random
using Turing

@model function linear_regression(x, y)
    β ~ Normal(0, 1)
    σ ~ truncated(Cauchy(0, 3); lower=0)
    for t in eachindex(x)
        y[t] ~ Normal(β * x[t], σ)
    end
end

# condition the model
rng = MersenneTwister(1234)
x, y = rand(rng, 10), rand(rng, 10)
reg_model = linear_regression(x, y)

rng = MersenneTwister(1234)
particles = sample(rng, reg_model, SMC(0.5), 512, ensemble=MCMCSerial());

I get the following error:

ExceptionStack
LoadError: IR verification failed.
    Code location: .../julia/dev/Turing/src/mcmc/smc.jl:97
Stacktrace:
  [1] error(::String, ::String, ::String, ::Symbol, ::String, ::Int32)
    @ Base ./error.jl:54
  [2] (::Compiler.var"#raise_error#verify_ir##0"{Compiler.IRCode, Nothing})()
    @ Compiler .../julia/Compiler/src/ssair/verify.jl:125
  [3] check_op(ir::Compiler.IRCode, domtree::Compiler.GenericDomTree{false}, op::Any, use_bb::Int64, use_idx::Int64, printed_use_idx::Int64, print::Bool, isforeigncall::Bool, arg_idx::Int64, allow_frontend_forms::Bool, raise_error::Any)
    @ Compiler ./../usr/share/julia/Compiler/src/ssair/verify.jl:71
  [4] verify_ir(ir::Compiler.IRCode, print::Bool, allow_frontend_forms::Bool, 𝕃ₒ::Compiler.PartialsLattice{Compiler.ConstsLattice}, mi::Nothing)
    @ Compiler .../julia/Compiler/src/ssair/verify.jl:429
  [5] verify_ir
    @ .../julia/Compiler/src/ssair/verify.jl:110 [inlined]
  [6] optimise_ir!(ir::Compiler.IRCode; show_ir::Bool, do_inline::Bool)
    @ Libtask .../julia/packages/Libtask/NP9j5/src/utils.jl:46
  [7] optimise_ir!
    @ .../julia/packages/Libtask/NP9j5/src/utils.jl:40 [inlined]
  [8] build_callable(sig::Type{Tuple{typeof(Turing.Inference.init_context), Random123.Philox2x{UInt64, 10}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, AbstractPPL.VarName{:β, AbstractPPL.Iden}}})
    @ Libtask .../julia/packages/Libtask/NP9j5/src/copyable_task.jl:173
  [9] (::Libtask.DynamicCallable{Dict{Any, Any}})(::Function, ::Random123.Philox2x{UInt64, 10}, ::DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, ::AbstractPPL.VarName{:β, AbstractPPL.Iden})
    @ Libtask .../julia/packages/Libtask/NP9j5/src/copyable_task.jl:1407
 [10] tilde_assume!!
    @ .../julia/dev/Turing/src/mcmc/smc.jl:107 [inlined]
 [11] (::Tuple{Base.RefValue{AbstractRNG}, Base.RefValue{Any}, Base.RefValue{Any}, Base.RefValue{Any}, Base.RefValue{Any}, Base.RefValue{Any}, Base.RefValue{Any}, Base.RefValue{Any}, Base.RefValue{Tuple{Any, Any}}, Base.RefValue{Libtask.DynamicCallable{Dict{Any, Any}}}, Base.RefValue{Libtask.DynamicCallable{Dict{Any, Any}}}, Base.RefValue{Libtask.DynamicCallable{Dict{Any, Any}}}, Base.RefValue{Libtask.DynamicCallable{Dict{Any, Any}}}, Base.RefValue{Int32}})(_2::typeof(DynamicPPL.tilde_assume!!), _3::Turing.Inference.SMCContext, _4::Normal{Float64}, _5::AbstractPPL.VarName{:β, AbstractPPL.Iden}, _6::DynamicPPL.VarNamedTuples.NoTemplate, _7::DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}})
    @ Base.Experimental ./<missing>:0
 [12] (::MistyClosures.MistyClosure{Core.OpaqueClosure{Tuple{typeof(DynamicPPL.tilde_assume!!), Turing.Inference.SMCContext, Normal{Float64}, AbstractPPL.VarName{:β, AbstractPPL.Iden}, DynamicPPL.VarNamedTuples.NoTemplate, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}}, Union{Tuple{Any, Any}, Libtask.ProducedValue}}})(::Function, ::Turing.Inference.SMCContext, ::Normal{Float64}, ::AbstractPPL.VarName{:β, AbstractPPL.Iden}, ::DynamicPPL.VarNamedTuples.NoTemplate, ::DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}})
    @ MistyClosures .../julia/packages/MistyClosures/2vtLL/src/MistyClosures.jl:22
 [13] (::Libtask.LazyCallable{Tuple{typeof(DynamicPPL.tilde_assume!!), Turing.Inference.SMCContext, Normal{Float64}, AbstractPPL.VarName{:β, AbstractPPL.Iden}, DynamicPPL.VarNamedTuples.NoTemplate, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}}, Union{Tuple{Any, Any}, Libtask.ProducedValue}})(::Function, ::Turing.Inference.SMCContext, ::Normal{Float64}, ::AbstractPPL.VarName{:β, AbstractPPL.Iden}, ::DynamicPPL.VarNamedTuples.NoTemplate, ::DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}})
    @ Libtask .../julia/packages/Libtask/NP9j5/src/copyable_task.jl:1381
 [14] linear_regression
    @ .../julia/dev/Turing/testing.jl:4 [inlined]
 [15] (::Tuple{Base.RefValue{Tuple{Any, Any}}, Base.RefValue{Any}, Base.RefValue{Any}, Base.RefValue{Tuple{Any, Any}}, Base.RefValue{Any}, Base.RefValue{Any}, Base.RefValue{Tuple{Int64}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{Bool}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{Int64}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{Int64}, Base.RefValue{Any}, Base.RefValue{Vector{Float64}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{UInt64}, Base.RefValue{Tuple{Int64}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{UInt64}, Base.RefValue{Bool}, Base.RefValue{Tuple{Int64}}, Base.RefValue{MemoryRef{Float64}}, Base.RefValue{MemoryRef{Float64}}, Base.RefValue{Float64}, Base.RefValue{Any}, Base.RefValue{Normal}, Base.RefValue{Tuple{Int64}}, Base.RefValue{AbstractPPL.Index{Tuple{Int64}, @NamedTuple{}, AbstractPPL.Iden}}, Base.RefValue{AbstractPPL.VarName{:y, AbstractPPL.Index{Tuple{Int64}, @NamedTuple{}, AbstractPPL.Iden}}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{UInt64}, Base.RefValue{Tuple{Int64}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{UInt64}, Base.RefValue{Bool}, Base.RefValue{Tuple{Int64}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{UInt64}, Base.RefValue{Tuple{Int64}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{UInt64}, Base.RefValue{Bool}, Base.RefValue{Tuple{Int64}}, Base.RefValue{MemoryRef{Float64}}, Base.RefValue{MemoryRef{Float64}}, Base.RefValue{Float64}, Base.RefValue{Tuple{Float64, Union{DynamicPPL.OnlyAccsVarInfo, DynamicPPL.ThreadSafeVarInfo, DynamicPPL.VarInfo}}}, Base.RefValue{Float64}, Base.RefValue{Union{DynamicPPL.OnlyAccsVarInfo, DynamicPPL.ThreadSafeVarInfo, DynamicPPL.VarInfo}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{UInt64}, Base.RefValue{Tuple{Int64}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{UInt64}, Base.RefValue{Bool}, Base.RefValue{Tuple{Int64}}, Base.RefValue{MemoryRef{Float64}}, 
Base.RefValue{MemoryRef{Float64}}, Base.RefValue{Bool}, Base.RefValue{Int64}, Base.RefValue{Int64}, Base.RefValue{Int64}, Base.RefValue{Bool}, Base.RefValue{Bool}, Base.RefValue{Any}, Base.RefValue{Tuple{Nothing, Any}}, Base.RefValue{Libtask.LazyCallable{Tuple{typeof(DynamicPPL.tilde_assume!!), Turing.Inference.SMCContext, Normal{Float64}, AbstractPPL.VarName{:β, AbstractPPL.Iden}, DynamicPPL.VarNamedTuples.NoTemplate, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}}, Union{Tuple{Any, Any}, Libtask.ProducedValue}}}, Base.RefValue{Libtask.DynamicCallable{Dict{Any, Any}}}, Base.RefValue{Libtask.DynamicCallable{Dict{Any, Any}}}, Base.RefValue{Libtask.DynamicCallable{Dict{Any, Any}}}, Base.RefValue{Libtask.DynamicCallable{Dict{Any, Any}}}, Base.RefValue{Int32}})(_2::typeof(linear_regression), _3::DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, _4::DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, _5::Vector{Float64}, _6::Vector{Float64})
    @ Base.Experimental ./<missing>:0
 [16] consume
    @ .../julia/packages/Libtask/NP9j5/src/copyable_task.jl:472 [inlined]
 [17] consume(trace::Turing.Inference.TracedModel{Libtask.TapedTask{Random123.Philox2x{UInt64, 10}, Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, MistyClosures.MistyClosure{Core.OpaqueClosure{Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, Union{Tuple{Nothing, Any}, Libtask.ProducedValue}}}}})
    @ Turing.Inference .../julia/dev/Turing/src/mcmc/smc.jl:19
 [18] advance!
    @ .../julia/dev/Turing/src/mcmc/smc.jl:201 [inlined]
 [19] (::Turing.Inference.var"#reweight!##0#reweight!##1")(particle::Turing.Inference.Particle{Turing.Inference.TracedModel{Libtask.TapedTask{Random123.Philox2x{UInt64, 10}, Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, MistyClosures.MistyClosure{Core.OpaqueClosure{Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, Union{Tuple{Nothing, Any}, Libtask.ProducedValue}}}}}, Float64})
    @ Turing.Inference .../julia/dev/Turing/src/mcmc/smc.jl:319
 [20] iterate
    @ ./generator.jl:48 [inlined]
 [21] _collect(c::Vector{Turing.Inference.Particle{Turing.Inference.TracedModel{Libtask.TapedTask{Random123.Philox2x{UInt64, 10}, Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, MistyClosures.MistyClosure{Core.OpaqueClosure{Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, Union{Tuple{Nothing, Any}, Libtask.ProducedValue}}}}}, Float64}}, itr::Base.Generator{Vector{Turing.Inference.Particle{Turing.Inference.TracedModel{Libtask.TapedTask{Random123.Philox2x{UInt64, 10}, Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, MistyClosures.MistyClosure{Core.OpaqueClosure{Tuple{typeof(linear_regression), 
DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, Union{Tuple{Nothing, Any}, Libtask.ProducedValue}}}}}, Float64}}, Turing.Inference.var"#reweight!##0#reweight!##1"}, ::Base.EltypeUnknown, isz::Base.HasShape{1})
    @ Base ./array.jl:810
 [22] collect_similar
    @ ./array.jl:732 [inlined]
 [23] map
    @ ./abstractarray.jl:3372 [inlined]
 [24] reweight!(particles::Vector{Turing.Inference.Particle{Turing.Inference.TracedModel{Libtask.TapedTask{Random123.Philox2x{UInt64, 10}, Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, MistyClosures.MistyClosure{Core.OpaqueClosure{Tuple{typeof(linear_regression), DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, Turing.Inference.SMCContext, false}, DynamicPPL.VarInfo{DynamicPPL.UnlinkAll, VarNamedTuple{(), Tuple{}}, DynamicPPL.AccumulatorTuple{3, @NamedTuple{LogPrior::DynamicPPL.LogPriorAccumulator{Float64}, LogJacobian::DynamicPPL.LogJacobianAccumulator{Float64}, LogLikelihood::Turing.Inference.ProduceLogLikelihoodAccumulator{Float64}}}}, Vector{Float64}, Vector{Float64}}, Union{Tuple{Nothing, Any}, Libtask.ProducedValue}}}}}, Float64}}, ::MCMCSerial)
    @ Turing.Inference .../julia/dev/Turing/src/mcmc/smc.jl:318
 [25] smcsample(rng::MersenneTwister, model::DynamicPPL.Model{typeof(linear_regression), (:x, :y), (), (), Tuple{Vector{Float64}, Vector{Float64}}, Tuple{}, DynamicPPL.DefaultContext, false}, sampler::SMC{Turing.Inference.ESSResampler{Float64}, Nothing}, ensemble::MCMCSerial, N::Int64; ref::Nothing)
    @ Turing.Inference .../julia/dev/Turing/src/mcmc/smc.jl:367
 [26] smcsample
    @ .../julia/dev/Turing/src/mcmc/smc.jl:354 [inlined]
 [27] #sample#53
    @ .../julia/dev/Turing/src/mcmc/smc.jl:382 [inlined]
 [28] top-level scope
    @ .../julia/dev/Turing/testing.jl:18
in expression starting at .../julia/dev/Turing/testing.jl:18

I'm not entirely sure why this happens, considering that when I run the same exact code from smc.jl outside of the Turing environment, it works perfectly fine.

@penelopeysm
Member

First, I found that src/mcmc/smc.jl needs using StatsFuns: softmax. But after adding that I can reproduce the error. On 1.11 I'm getting Unbound GlobalRef not allowed in value position, which looks to me to be the same thing as TuringLang/Libtask.jl#211. So maybe I could spend some time on that issue, and in the meantime ask you to use another model 😅?

I fixed this locally, but forgot to commit to the PR

Co-authored-by: Penelope Yong <penelopeysm@gmail.com>
@charlesknipp
Member Author

use another model 😅?

lmao yeah, it's about time to retire the linear regression. In the interest of keeping it simple, I also ran it with coinflip, normal, and test and got the same results

@penelopeysm
Member

This PR TuringLang/Libtask.jl#219 should fix the Libtask errors; with that PR + once you add the DynamicPPL imports, it runs correctly for me.

If I had to take a guess, it was probably working for you before this because the variables weren't truly global (were they in a function, or some local scope?)

charlesknipp and others added 2 commits March 18, 2026 15:08
Co-authored-by: Penelope Yong <penelopeysm@gmail.com>
Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
@charlesknipp
Member Author

This PR TuringLang/Libtask.jl#219 should fix the Libtask errors; with that PR + once you add the DynamicPPL imports, it runs correctly for me.

All it took was the imports, though I appreciate the second look at Libtask. I don't think there's any global manipulation here, unless you count the task local storage trickery used by Libtask.

charlesknipp and others added 4 commits March 18, 2026 16:56
I stg I need to enable auto-format upon saving

Co-authored-by: github-actions[bot] <41898282+github-actions[bot]@users.noreply.github.com>
@penelopeysm
Member

I don't understand that at all, but if there's no error now, I guess I'll take the win.

@penelopeysm
Member

(The point about the globals was that if you run the code snippet above in the REPL, then x and y are global variables.)
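A minimal illustration of that scoping point (nothing Turing-specific, just base Julia):

```julia
# At the REPL top level, these bindings are global; code capturing them
# sees GlobalRefs, which seems to be what tripped Libtask's IR pass
# (TuringLang/Libtask.jl#211).
x, y = rand(10), rand(10)

# Wrapping the same setup in a function makes x and y local, which is
# one reason the identical snippet can behave differently in a script
# or function body than at the REPL.
function make_data(n=10)
    x, y = rand(n), rand(n)   # locals, not GlobalRefs
    return x, y
end
```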
