
feat: Gibbs variable selection + single-chain default for MCMC#74

Merged
eprifti merged 2 commits into main from feat/mcmc-gibbs
Mar 25, 2026

Conversation


@eprifti eprifti commented Mar 25, 2026

Summary

  • New MCMC mode: method=gibbs (default) — joint feature+coefficient sampling in a single MCMC run
  • Replaces the O(n_features) SBS loop with a single run using sparsity prior P(k)
  • Default to 1 chain (Vadim's recommendation: short chains don't converge)
  • New params: method (gibbs/sbs), p0 (prior inclusion prob), n_chains
  • Adaptive BTR threshold: P(active) > 0.5, fallback to top-k
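The new parameters listed above might be wired together as follows. This is a sketch only: the parameter names `method`, `p0`, and `n_chains` come from this PR, but the `fit_mcmc` entry point and its signature are hypothetical, not this repo's actual API.

```python
# Hypothetical call site illustrating the new MCMC parameters from this PR.
# `fit_mcmc` is a stand-in name, not a real function in this repo.
def fit_mcmc(X, y, method="gibbs", p0=0.1, n_chains=1, n_iter=5000, n_burn=500):
    """Validate and echo the MCMC configuration (illustrative only)."""
    if method not in ("gibbs", "sbs"):
        raise ValueError(f"unknown method: {method!r}")
    if not 0.0 < p0 < 1.0:
        raise ValueError("p0 must be a probability in (0, 1)")
    return {"method": method, "p0": p0, "n_chains": n_chains,
            "n_iter": n_iter, "n_burn": n_burn}

cfg = fit_mcmc(X=None, y=None)  # defaults match the PR: gibbs, single chain
```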

Benchmark (Qin2014)

| Mode | Test AUC | k | Time |
|------|----------|---|------|
| SBS (n_iter=200) | 0.707 | 49 | 3.7 s |
| Gibbs (n_iter=5000) | 0.679 | 14 | 3.4 s |

Gibbs produces sparser models with more FBM diversity.

Test plan

  • 776 lib tests pass
  • Benchmark on Qin2014

Refs #70, #73

eprifti added 2 commits March 25, 2026 17:14
…ing)

New MCMC mode: method='gibbs' (default) vs method='sbs' (original)

Gibbs variable selection samples feature inclusion/exclusion within the
MCMC iterations, with a sparsity prior P(k) = Binomial(k; n, p0).
This replaces the O(n_features) SBS loop with a single MCMC run.
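The joint update described above can be sketched as a Metropolis-within-Gibbs sweep over per-feature inclusion indicators, scored against the Binomial sparsity prior P(k) = Binomial(k; n, p0). This is a minimal illustrative sketch, assuming a generic log-likelihood callback; `gibbs_select`, `log_lik`, and the toy model below are not this repo's code.

```python
import math
import random

def log_prior_k(k, n, p0):
    """log Binomial(k; n, p0): sparsity prior over the number of active features."""
    return (math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)
            + k * math.log(p0) + (n - k) * math.log(1 - p0))

def gibbs_select(n_features, log_lik, p0=0.1, n_iter=5000, seed=0):
    """Sample inclusion indicators; return per-feature posterior inclusion probs."""
    rng = random.Random(seed)
    gamma = [False] * n_features           # current inclusion vector
    counts = [0] * n_features
    cur = log_lik(gamma) + log_prior_k(sum(gamma), n_features, p0)
    for _ in range(n_iter):
        j = rng.randrange(n_features)      # pick one indicator to update
        gamma[j] = not gamma[j]            # propose toggling feature j
        prop = log_lik(gamma) + log_prior_k(sum(gamma), n_features, p0)
        if math.log(rng.random()) < prop - cur:
            cur = prop                     # accept the toggle
        else:
            gamma[j] = not gamma[j]        # reject: undo the toggle
        for i, g in enumerate(gamma):
            counts[i] += g                 # accumulate inclusion frequencies
    return [c / n_iter for c in counts]

# Toy likelihood that rewards including features 0 and 1 and penalizes the rest.
probs = gibbs_select(5, lambda g: 3.0 * (g[0] + g[1]) - 2.0 * (g[2] + g[3] + g[4]))
```

A single run of this sampler yields the posterior inclusion probabilities directly, which is what makes the per-feature SBS loop unnecessary.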

New params:
- mcmc.method: 'gibbs' (default) or 'sbs'
- mcmc.p0: prior inclusion probability (default 0.1)

Adaptive BTR threshold: selects features with P(active) > 0.5, falls
back to top-k by posterior if too few qualify.
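The adaptive threshold rule above can be sketched as a small helper. Illustrative only: `select_features` and the `k_min` fallback size are assumptions, not this repo's actual function or defaults.

```python
def select_features(posterior, k_min=3, threshold=0.5):
    """Keep features with P(active) > threshold; fall back to top-k otherwise.

    `posterior` is a list of posterior inclusion probabilities, one per feature.
    `k_min` is an assumed fallback size, not a value from this PR.
    """
    active = [i for i, p in enumerate(posterior) if p > threshold]
    if len(active) >= k_min:
        return sorted(active)
    # Too few features clear the threshold: take the k_min highest-posterior ones.
    ranked = sorted(range(len(posterior)), key=lambda i: posterior[i], reverse=True)
    return sorted(ranked[:k_min])

# Only feature 0 clears 0.5, so we fall back to the top 3 by posterior.
print(select_features([0.9, 0.4, 0.1, 0.45]))  # → [0, 1, 3]
```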

Refs #70, #73
…aram

Parallel chains with short n_iter don't converge — each chain burns n_burn
iterations independently, leaving too few effective samples. Default to
single chain; users can set n_chains > 1 when n_iter is large enough.
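The burn-in argument can be made concrete with a quick calculation. A sketch, assuming a fixed total iteration budget split evenly across chains (the split is an assumption, not necessarily how the implementation allocates iterations):

```python
def effective_samples(total_iter, n_chains, n_burn):
    """Post-burn-in samples when a fixed iteration budget is split across chains.

    Each chain independently discards `n_burn` iterations, so more chains
    means more total burn-in overhead for the same budget.
    """
    per_chain = total_iter // n_chains
    return n_chains * max(0, per_chain - n_burn)

# With a 5000-iteration budget and 1000 burn-in iterations per chain:
print(effective_samples(5000, 1, 1000))  # 1 chain: 4000 effective samples
print(effective_samples(5000, 4, 1000))  # 4 chains: 4 * (1250 - 1000) = 1000
```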
