Tools for merging pretrained Large Language Models and creating Mixtures of Experts (MoE) from open-source models.
An attempt at a glass pane through which to view the internals of power. 🪟
End-to-end workflow to build a Case-law Citation Helper by fine-tuning two specialized LoRA “experts” and then merging them into a single adapter using TIES and DARE-style strategies.
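For orientation, here is a minimal sketch of that kind of adapter merge using Hugging Face PEFT's `add_weighted_adapter`; the base model, adapter paths, adapter names, and hyperparameters are illustrative assumptions, not the repo's actual settings.

```python
# Sketch: merge two fine-tuned LoRA "experts" into one adapter with PEFT.
# All names and paths below are placeholders.
from peft import PeftModel
from transformers import AutoModelForCausalLM

base = AutoModelForCausalLM.from_pretrained("mistralai/Mistral-7B-v0.1")

# Load the first expert, then attach the second under its own name.
model = PeftModel.from_pretrained(
    base, "path/to/citation-format-expert", adapter_name="format"
)
model.load_adapter("path/to/citation-context-expert", adapter_name="context")

# TIES merge: trim each task vector to the top `density` fraction of
# weights, elect signs, then combine. Use combination_type="dare_ties"
# for a DARE-style merge instead.
model.add_weighted_adapter(
    adapters=["format", "context"],
    weights=[1.0, 1.0],
    adapter_name="merged",
    combination_type="ties",
    density=0.5,
)
model.set_adapter("merged")
model.save_pretrained("citation-helper-adapter", selected_adapters=["merged"])
```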
Mergekit Assistant is a toolkit for merging pre-trained language models. It supports a wide range of model architectures, offers several merging methods, and runs in low-resource environments on both CPU and GPU.
Copy of https://tieknots.how with a few tweaks
🛠️ Merge pre-trained language models efficiently with `mergekit`, using minimal resources and diverse algorithms for powerful, flexible solutions.
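As a rough sketch of that workflow, the snippet below writes a TIES merge config and invokes the `mergekit-yaml` CLI from Python; the model names, densities, and weights are placeholder assumptions, not a recommended recipe.

```python
# Sketch: drive a TIES merge with mergekit by writing a YAML config and
# calling the mergekit-yaml CLI. Model names and parameters are placeholders.
import subprocess
from pathlib import Path

CONFIG = """\
merge_method: ties
base_model: mistralai/Mistral-7B-v0.1
models:
  - model: teknium/OpenHermes-2.5-Mistral-7B
    parameters:
      density: 0.5   # keep the top 50% of delta weights before sign election
      weight: 0.5
  - model: HuggingFaceH4/zephyr-7b-beta
    parameters:
      density: 0.5
      weight: 0.3
parameters:
  normalize: true
dtype: float16
"""

Path("ties-merge.yml").write_text(CONFIG)

# Runs on CPU by default; add "--cuda" to merge on GPU.
subprocess.run(["mergekit-yaml", "ties-merge.yml", "./merged-model"], check=True)
```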