
LoopFormer: Elastic-Depth Looped Transformers for Latent Reasoning via Shortcut Modulation (ICLR 2026)

Authors:
Ahmadreza Jeddi, Marco Ciccone, Babak Taati

This repository contains the official implementation of LoopFormer.

The codebase is a fork of NanoGPT, and we intentionally keep it as close as possible to the original implementation for clarity and reproducibility. Beyond the looped / elastic-depth components, the main architectural difference is using RMSNorm instead of LayerNorm.
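For reference, RMSNorm differs from LayerNorm only in that it skips mean-centering and the bias term, normalizing by the root-mean-square of the activations. A minimal NumPy sketch of the two operations (function names are illustrative, not taken from this codebase):

```python
import numpy as np

def rms_norm(x, gain, eps=1e-6):
    # RMSNorm: rescale by the root-mean-square over the last axis.
    # No mean subtraction and no bias, unlike LayerNorm.
    rms = np.sqrt(np.mean(x * x, axis=-1, keepdims=True) + eps)
    return x / rms * gain

def layer_norm(x, gain, bias, eps=1e-6):
    # LayerNorm for comparison: center by the mean, then normalize
    # by the standard deviation, with a learnable gain and bias.
    mu = np.mean(x, axis=-1, keepdims=True)
    var = np.var(x, axis=-1, keepdims=True)
    return (x - mu) / np.sqrt(var + eps) * gain + bias

x = np.array([[1.0, 2.0, 3.0, 4.0]])
g = np.ones(4)
print(rms_norm(x, g))          # normalized so mean(y^2) ~= 1
print(layer_norm(x, g, 0.0))   # normalized so mean(y) ~= 0, var(y) ~= 1
```

In the actual model this would be a `torch.nn` module with a learnable gain; the sketch above only illustrates the math.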


Installation

```shell
pip install torch numpy transformers datasets tiktoken wandb tqdm
```

Citation

If you find this work useful, please consider citing our paper:

```bibtex
@misc{jeddi2026loopformerelasticdepthloopedtransformers,
      title={LoopFormer: Elastic-Depth Looped Transformers for Latent Reasoning via Shortcut Modulation},
      author={Ahmadreza Jeddi and Marco Ciccone and Babak Taati},
      year={2026},
      eprint={2602.11451},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2602.11451},
}
```