
chore(deps): bump vllm to 0.16.0 #267

Closed
mplatzer wants to merge 3 commits into main from chore/bump-vllm-0-16-0

Conversation

Contributor

@mplatzer commented Mar 2, 2026

Summary

  • bump vllm in gpu optional dependency from 0.12.0 to 0.16.0
  • refresh uv.lock to resolve new vLLM transitive dependency set
  • update stale compatibility comment for transformers

Validation

  • uv lock — regenerate uv.lock against the updated pyproject.toml
  • uv lock --check — verify the committed lockfile is up to date

Note

Medium Risk
This is a major runtime dependency bump for GPU inference (vllm), plus a large lockfile refresh that changes key ML-stack versions (the Torch family, protobuf) and adds new transitive networking/runtime dependencies.

Overview
Bumps the optional gpu extra to vllm==0.16.0 and updates the transformers compatibility comment accordingly.

Refreshes uv.lock to match the new vLLM dependency graph. This adds new transitive packages (notably grpcio, grpcio-reflection, mcp, ijson, httpx-sse, pyjwt, sse-starlette) and shifts several versions and markers, such as platform-specific pins for torch/torchaudio/torchvision, xgrammar updates for macOS/Linux, and a protobuf version change.

Written by Cursor Bugbot for commit ce29ca6.

@mplatzer closed this Mar 2, 2026