Some of the failing nightly builds show import changes in vLLM 0.11.1+ that will affect widening or upgrading the compatible vLLM range for this adapter: https://github.com/foundation-model-stack/vllm-detector-adapter/actions/runs/20047013858/job/57494654334
There are at least a couple of vLLM PRs of note:

In https://github.com/vllm-project/vllm/pull/27188 and https://github.com/vllm-project/vllm/pull/27567, the FlexibleArgumentParser import path changed and StoreBoolean was removed. The adapter currently uses both to parse environment variables. One potential workaround is to copy the StoreBoolean class into this repo; another is to refactor LocalEnvVarArgumentParser entirely.
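If we go the vendoring route, a minimal sketch of a StoreBoolean-style argparse action might look like the following, assuming the removed vLLM class behaved as a case-insensitive "true"/"false" parser (the flag name is illustrative):

```python
import argparse


class StoreBoolean(argparse.Action):
    """Parse a 'true'/'false' string (case-insensitive) into a bool."""

    def __call__(self, parser, namespace, values, option_string=None):
        if values.lower() == "true":
            setattr(namespace, self.dest, True)
        elif values.lower() == "false":
            setattr(namespace, self.dest, False)
        else:
            raise ValueError(
                f"Invalid boolean value: {values!r}. Expected 'true' or 'false'."
            )


# Illustrative usage with a hypothetical flag name:
parser = argparse.ArgumentParser()
parser.add_argument("--enable-feature", action=StoreBoolean, default=False)
args = parser.parse_args(["--enable-feature", "true"])
print(args.enable_feature)  # True
```

This keeps the existing env-var parsing behavior intact without depending on the removed vLLM symbol.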
https://github.com/vllm-project/vllm/pull/26427 removed model_config from the initialization of OpenAIServingModels, OpenAIServingChat, etc. This will affect the API server and many of the tests in this adapter.
There may be enough breaking changes here to warrant a minor version bump of the adapter with non-backwards-compatible changes, in order to support the latest vLLM releases.