From 2f5408f884c1b167739a10cd3f10157fc8573863 Mon Sep 17 00:00:00 2001
From: "dependabot[bot]" <49699333+dependabot[bot]@users.noreply.github.com>
Date: Fri, 27 Mar 2026 15:56:56 +0000
Subject: [PATCH] chore(deps): bump vllm from 0.14.1 to 0.18.0 in /scripts

Bumps [vllm](https://github.com/vllm-project/vllm) from 0.14.1 to 0.18.0.
- [Release notes](https://github.com/vllm-project/vllm/releases)
- [Changelog](https://github.com/vllm-project/vllm/blob/main/RELEASE.md)
- [Commits](https://github.com/vllm-project/vllm/compare/v0.14.1...v0.18.0)

---
updated-dependencies:
- dependency-name: vllm
  dependency-version: 0.18.0
  dependency-type: direct:production
...

Signed-off-by: dependabot[bot]
---
 scripts/requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/scripts/requirements.txt b/scripts/requirements.txt
index 3a546d3..a35417c 100644
--- a/scripts/requirements.txt
+++ b/scripts/requirements.txt
@@ -1,5 +1,5 @@
 # Torch will be installed as a dependency of vLLM, which will select the correct backend.
-vllm==0.14.1
+vllm==0.18.0
 flashinfer-python
 bitsandbytes>=0.45.3; sys_platform == 'linux'
 triton; sys_platform == 'linux'