[Docs] Add PRISM test-time scaling reference for LLaDA-2.0-mini in README #1

@viiika

Description

Hi authors,

We recently released a preprint on efficient test-time scaling for discrete diffusion language models (dLLMs), titled:

PRISM: Efficient Test-Time Scaling via Hierarchical Search and Self-Verification for Discrete Diffusion Language Models

arXiv GitHub

PRISM is a test-time scaling framework that integrates:

  • Hierarchical Trajectory Search (HTS): dynamically prunes/reallocates compute in an early-to-mid denoising window
  • Local branching via partial remasking: explores diverse realizations while preserving high-confidence tokens
  • Self-Verified Feedback (SVF): uses the same dLLM as a lightweight Yes/No verifier on intermediate completions (no external RM required)

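To make the interplay of these components concrete, here is a minimal, self-contained sketch of the three ideas in isolation. All names (`svf_score`, `prune_and_reallocate`, `partial_remask`) and interfaces are hypothetical illustrations, not the actual PRISM implementation; in particular, the Yes/No logits would come from prompting the dLLM itself on an intermediate completion, and the confidences from the model's per-token probabilities.

```python
import math

def svf_score(yes_logit: float, no_logit: float) -> float:
    """Self-Verified Feedback (sketch): score a partial completion as
    P("Yes") under a two-way softmax over the model's own Yes/No logits.
    No external reward model is involved."""
    m = max(yes_logit, no_logit)  # subtract max for numerical stability
    ey = math.exp(yes_logit - m)
    en = math.exp(no_logit - m)
    return ey / (ey + en)

def prune_and_reallocate(trajectories, scores, keep_k):
    """Hierarchical Trajectory Search (sketch): keep the top-k scoring
    partial denoising trajectories and split the freed compute budget
    evenly among the survivors as extra branches."""
    ranked = sorted(zip(scores, trajectories), key=lambda p: p[0], reverse=True)
    survivors = [t for _, t in ranked[:keep_k]]
    branches_per_survivor = max(1, len(trajectories) // keep_k)
    return survivors, branches_per_survivor

def partial_remask(tokens, confidences, keep_threshold, mask_token="[MASK]"):
    """Local branching (sketch): re-mask low-confidence positions so the
    sampler can explore alternative realizations while high-confidence
    tokens are preserved."""
    return [t if c >= keep_threshold else mask_token
            for t, c in zip(tokens, confidences)]
```

For example, with four candidate trajectories and verifier scores `[0.1, 0.9, 0.4, 0.7]`, `prune_and_reallocate(..., keep_k=2)` keeps the two best candidates and gives each of them two fresh branches; `partial_remask` then decides, per survivor, which positions those branches are allowed to resample.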
We evaluate PRISM on three dLLMs, including LLaDA-2.0-mini. Would you consider adding a short note in the README under “Decoding / Inference / Test-time scaling” pointing users to PRISM as a tested decoding recipe for LLaDA-2.0-mini?

Suggested README snippet (feel free to edit):

Test-time scaling (PRISM)

Thanks for considering!
