A theoretical reconstruction of the Claude Mythos architecture, built from first principles using the available research literature.
LoopFormer is an elastic-depth looped Transformer trained on variable-length trajectories, using time/step-size conditioning and a shortcut-consistency objective to enable budget-conditioned language modeling and latent reasoning without retraining.
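A minimal sketch of the core idea, assuming a standard pre-norm block: one weight-tied Transformer block is unrolled for a caller-chosen number of steps, with the loop position `t` and step size `dt` injected as an additive conditioning embedding. All names here are illustrative, not LoopFormer's actual API; causal masking and the shortcut-consistency objective (which ties one large step to the composition of smaller steps) are omitted for brevity.

```python
import torch
import torch.nn as nn

class LoopedBlock(nn.Module):
    """One shared Transformer block, reused at every loop iteration."""
    def __init__(self, d_model: int, n_heads: int):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.mlp = nn.Sequential(
            nn.Linear(d_model, 4 * d_model), nn.GELU(), nn.Linear(4 * d_model, d_model)
        )
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)
        # Conditioning on (t, dt): loop position and step size, injected as an
        # additive embedding (an assumption; the repo may condition differently).
        self.cond = nn.Linear(2, d_model)

    def forward(self, x: torch.Tensor, t: float, dt: float) -> torch.Tensor:
        c = self.cond(torch.tensor([t, dt], device=x.device, dtype=x.dtype))
        h = x + c  # conditioning broadcasts over batch and sequence
        q = self.norm1(h)
        h = h + self.attn(q, q, q, need_weights=False)[0]  # causal mask omitted
        return h + self.mlp(self.norm2(h))

class BudgetConditionedLM(nn.Module):
    """Unrolls the shared block for a caller-chosen compute budget."""
    def __init__(self, vocab: int, d_model: int = 256, n_heads: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab, d_model)
        self.block = LoopedBlock(d_model, n_heads)
        self.head = nn.Linear(d_model, vocab)

    def forward(self, tokens: torch.Tensor, n_steps: int) -> torch.Tensor:
        x = self.embed(tokens)
        dt = 1.0 / n_steps  # step size shrinks as the budget grows
        for i in range(n_steps):
            x = self.block(x, t=i * dt, dt=dt)
        return self.head(x)
```

Because the loop count is a forward-pass argument, `model(tokens, n_steps=4)` and `model(tokens, n_steps=16)` reuse the same weights at different depths, which is what makes budget-conditioned inference possible without retraining.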
Experimental implementation of "Looped Transformers are Better at Learning Learning Algorithms", showing superior performance with 12x fewer parameters. Includes a complete environment setup, pre-trained weights, and extensive experiments comparing looped Transformers with standard Transformers for in-context learning.
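A toy illustration of where the parameter saving comes from, with hypothetical module names rather than the repo's actual code: a depth-L stack stores L distinct blocks, while a looped Transformer stores one block and applies it L times, so the parameter count drops by roughly the loop factor.

```python
import torch.nn as nn

def make_block(d_model: int, n_heads: int) -> nn.TransformerEncoderLayer:
    return nn.TransformerEncoderLayer(
        d_model, n_heads, dim_feedforward=4 * d_model, batch_first=True
    )

class StandardTF(nn.Module):
    """Depth-L stack: L distinct blocks, L times the parameters."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, depth: int = 12):
        super().__init__()
        self.layers = nn.ModuleList(make_block(d_model, n_heads) for _ in range(depth))

    def forward(self, x):
        for layer in self.layers:
            x = layer(x)
        return x

class LoopedTF(nn.Module):
    """One block applied `loops` times: the same depth, one block's parameters."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, loops: int = 12):
        super().__init__()
        self.block = make_block(d_model, n_heads)
        self.loops = loops

    def forward(self, x):
        for _ in range(self.loops):
            x = self.block(x)
        return x

standard, looped = StandardTF(), LoopedTF()
n_params = lambda m: sum(p.numel() for p in m.parameters())
print(f"standard: {n_params(standard):,} params, "
      f"looped: {n_params(looped):,} params "
      f"(~{n_params(standard) / n_params(looped):.0f}x fewer)")
```

With matching depth and loop count of 12, the ratio printed is about 12x, matching the headline claim; the paper's actual architecture also re-injects the input at each loop iteration, which this sketch leaves out.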
Benchmarks and evaluates ByteDance's Ouro model, built on Looped Language Models, across several reasoning tasks.
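A minimal sketch of what such an evaluation loop might look like using Hugging Face `transformers`; the model id, prompt format, and exact-match metric here are placeholder assumptions, not the repo's actual configuration.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "ByteDance/Ouro"  # hypothetical id; substitute the real checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, torch_dtype=torch.bfloat16)
model.eval()

def answer(question: str, max_new_tokens: int = 256) -> str:
    inputs = tokenizer(question, return_tensors="pt")
    with torch.no_grad():
        out = model.generate(**inputs, max_new_tokens=max_new_tokens, do_sample=False)
    # Decode only the newly generated tokens, not the prompt.
    return tokenizer.decode(out[0, inputs["input_ids"].shape[1]:],
                            skip_special_tokens=True)

# Toy exact-match evaluation over (question, gold answer) pairs.
dataset = [("What is 17 + 25?", "42")]
correct = sum(gold in answer(q) for q, gold in dataset)
print(f"accuracy: {correct / len(dataset):.2%}")
```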