
Don’t Let It Fade: Preserving Edits in Diffusion Language Models via Token Timestep Allocation

This repository contains the official implementation for the paper:
"Don’t Let It Fade: Preserving Edits in Diffusion Language Models via Token Timestep Allocation"

🌐 Project Page 📄 Paper

📜 News

  • [2025.09.19] 🎉🎉 Our paper "Don’t Let It Fade: Preserving Edits in Diffusion Language Models via Token Timestep Allocation" is accepted at NeurIPS 2025!

Setup

git clone https://github.com/AIDASLab/TTA-Diffusion.git
cd TTA-Diffusion
conda env create -f sdlm.yaml

How to Train

# to be done

How to Run

# to be done

Citation

If you find our work useful, please cite:

@inproceedings{kim2025tta,
  title     = {Don’t Let It Fade: Preserving Edits in Diffusion Language Models via Token Timestep Allocation},
  author    = {Kim, Woojin and Do, Jaeyoung},
  booktitle = {Advances in Neural Information Processing Systems (NeurIPS)},
  year      = {2025}
}

📧 Contact

For questions or collaborations, feel free to reach out: wjk9904@snu.ac.kr