Pinned

  1. vllm Public

    A high-throughput and memory-efficient inference and serving engine for LLMs (see the usage sketch after this pinned list)

    Python · 64.9k stars · 11.8k forks

  2. llm-compressor Public

    Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM

    Python · 2.4k stars · 311 forks

  3. recipes Public

    Common recipes to run vLLM

    Jupyter Notebook · 270 stars · 99 forks

  4. speculators Public

    A unified library for building, evaluating, and storing speculative decoding algorithms for LLM inference in vLLM

    Python · 145 stars · 20 forks
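
A minimal offline-inference sketch for the pinned vllm engine, assuming vLLM is installed (pip install vllm); the model name and prompts are illustrative placeholders, not project recommendations.

```python
# Minimal offline-inference sketch for the vllm engine pinned above.
# Assumes `pip install vllm`; the model name is an illustrative placeholder.
from vllm import LLM, SamplingParams

prompts = [
    "The capital of France is",
    "High-throughput LLM serving works by",
]
sampling = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

# LLM() loads the weights and allocates the KV cache once;
# generate() then batches all prompts through the engine.
llm = LLM(model="facebook/opt-125m")
for output in llm.generate(prompts, sampling):
    print(output.prompt, "->", output.outputs[0].text)
```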

Repositories

Showing 10 of 28 repositories
  • llm-compressor Public

    Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM

    Python · 2,357 stars · Apache-2.0 · 311 forks · 76 open issues (16 need help) · 43 open PRs · Updated Dec 9, 2025
  • tpu-inference Public

    TPU inference for vLLM, with unified JAX and PyTorch support.

    Python · 182 stars · Apache-2.0 · 54 forks · 18 open issues (1 needs help) · 67 open PRs · Updated Dec 9, 2025
  • vllm Public

    A high-throughput and memory-efficient inference and serving engine for LLMs (a serving sketch follows at the end of this repository list)

    Python · 64,935 stars · Apache-2.0 · 11,826 forks · 1,880 open issues (34 need help) · 1,274 open PRs · Updated Dec 9, 2025
  • ci-infra Public

    This repo hosts code for vLLM CI & Performance Benchmark infrastructure.

    HCL · 27 stars · Apache-2.0 · 48 forks · 0 open issues · 26 open PRs · Updated Dec 9, 2025
  • vllm-gaudi Public

    Community-maintained hardware plugin for vLLM on Intel Gaudi

    Python · 19 stars · Apache-2.0 · 77 forks · 1 open issue · 69 open PRs · Updated Dec 9, 2025
  • speculators Public

    A unified library for building, evaluating, and storing speculative decoding algorithms for LLM inference in vLLM

    Python · 145 stars · Apache-2.0 · 20 forks · 9 open issues (5 need help) · 11 open PRs · Updated Dec 9, 2025
  • vllm-ascend Public

    Community-maintained hardware plugin for vLLM on Ascend

    Python · 1,440 stars · Apache-2.0 · 640 forks · 800 open issues (7 need help) · 280 open PRs · Updated Dec 9, 2025
  • vllm-omni Public

    A framework for efficient model inference with omni-modality models

    Python · 759 stars · Apache-2.0 · 94 forks · 50 open issues (19 need help) · 26 open PRs · Updated Dec 9, 2025
  • guidellm Public

    Evaluate and Enhance Your LLM Deployments for Real-World Inference Needs

    Python · 736 stars · Apache-2.0 · 105 forks · 42 open issues (3 need help) · 22 open PRs · Updated Dec 9, 2025
  • semantic-router Public

    Intelligent Router for Mixture-of-Models

    Go · 2,374 stars · Apache-2.0 · 303 forks · 92 open issues (15 need help) · 31 open PRs · Updated Dec 8, 2025
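
On the serving side, the vllm engine also exposes an OpenAI-compatible HTTP API. A minimal client sketch, assuming a local `vllm serve` process; the model name and port are illustrative placeholders.

```python
# Minimal client sketch against a locally running, OpenAI-compatible vLLM server.
# Start the server first (model name and port are illustrative placeholders):
#   vllm serve Qwen/Qwen2.5-0.5B-Instruct --port 8000
# Assumes `pip install openai`; any OpenAI-compatible client works the same way.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-0.5B-Instruct",
    messages=[{"role": "user", "content": "Summarize what vLLM does in one sentence."}],
    max_tokens=64,
)
print(response.choices[0].message.content)
```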