RAG Time: A 5-week Learning Journey to Mastering RAG
Updated Jun 17, 2025 - Jupyter Notebook
Official Code for Paper: Beyond Matryoshka: Revisiting Sparse Coding for Adaptive Representation
Code and pretrained models for the paper: "MatMamba: A Matryoshka State Space Model"
Enterprise RAG ecosystem managing 15,000+ semantic chunks. Features hybrid parsing (LlamaParse/PyMuPDF) and 256-dimensional MRL embeddings suited to 512 MB RAM environments.
Chem-MRL: SMILES-based Matryoshka Representation Learning Embedding Model
Gradio app showcasing Chem-MRL embeddings for SMILES-based similarity search.
Application of Matryoshka Representation Learning on Text Embeddings
An investigation into how Matryoshka Representation Learning (MRL) with Relational Distillation affects the geometric structure and social bias encoding of Word2Vec embeddings.
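The MRL projects above all rely on the same core trick: an embedding model is trained so that any leading prefix of the full vector is itself a usable lower-dimensional embedding (e.g. the 256-dim variant mentioned earlier). A minimal sketch of that truncate-and-renormalize step, with illustrative function and variable names (not taken from any of the repositories listed):

```python
import numpy as np

def truncate_mrl(embedding, dim):
    """Keep the first `dim` coordinates of a Matryoshka-style embedding
    and re-normalize so cosine similarity still works at the smaller size.
    Hypothetical helper; real MRL models are trained so prefixes stay useful."""
    prefix = np.asarray(embedding, dtype=np.float64)[:dim]
    norm = np.linalg.norm(prefix)
    return prefix / norm if norm > 0 else prefix

# Toy example: a 1024-dim vector truncated to a 256-dim prefix.
full = np.random.default_rng(0).standard_normal(1024)
small = truncate_mrl(full, 256)
print(small.shape)  # (256,)
```

Note that simply truncating an ordinary embedding loses most of its quality; the point of MRL training is that the prefix dimensions are explicitly optimized to carry the coarsest, most important information.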