Awesome Knowledge Distillation
Updated Mar 22, 2026
Awesome Knowledge-Distillation: a categorized collection of knowledge distillation papers (2014–2021).
[ECCV2022] Factorizing Knowledge in Neural Networks
Multiple Generation Based Knowledge Distillation: A Roadmap
Official implementation of "Learn From One Specialized Sub-Teacher: One-to-One Mapping for Feature-Based Knowledge Distillation", accepted to EMNLP Findings 2023.
Production-ready pipeline to build an ontology from documents. Jobs are submitted via a REST API and processed asynchronously with Apache Pulsar. Features retry logic, dynamic topic discovery, health checks, and input validation.
Code reproduction of the paper "Distillation Decision Tree".
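Most of the repositories above build on the classic distillation objective: the student is trained to match the teacher's temperature-softened output distribution. A minimal sketch of that loss in pure Python (the function names and the temperature value are illustrative, not taken from any repo listed here):

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the distribution: a higher temperature spreads probability mass
    # across classes, exposing the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude as T varies
    # (as in Hinton et al., "Distilling the Knowledge in a Neural Network").
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return kl * temperature ** 2

# Identical logits yield (near-)zero loss; disagreement yields a positive loss.
print(distillation_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))
print(distillation_loss([3.0, 2.0, 1.0], [1.0, 2.0, 3.0]))
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels via a weighting coefficient; feature-based methods (such as the EMNLP Findings 2023 work above) instead match intermediate representations rather than output logits.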