
Python · Machine Learning · Apache-2.0 · Latest: v0.20.2

vLLM

High-throughput and memory-efficient LLM inference and serving engine

Project Health at a Glance

Live data from GitHub and PyPI, updated daily.

- ⭐ GitHub Stars: 80.0K (+31)
- 📦 Latest Release: v0.20.2 · 5 days ago
- 🔄 Avg. Release Cadence: 8d
- 🐛 Open Issues: 4.9K
- 📅 Last Commit: Today
- 🔒 Active CVEs: 10

Data last fetched: 2026-05-15

Known Vulnerabilities

10 active CVEs reported via OSV.dev
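Advisory data like this can be pulled directly from OSV.dev's public query API. A minimal sketch using only the standard library (the endpoint and payload shape follow OSV.dev's documented `/v1/query` API; `fetch_vulns` is an illustrative helper, not part of any vLLM or DepKeep tooling):

```python
import json
from urllib import request

OSV_QUERY_URL = "https://api.osv.dev/v1/query"

def build_osv_query(package: str, ecosystem: str = "PyPI") -> dict:
    """Build the JSON request body for OSV.dev's /v1/query endpoint."""
    return {"package": {"name": package, "ecosystem": ecosystem}}

def fetch_vulns(package: str, ecosystem: str = "PyPI") -> list:
    """POST the query and return the list of vulnerability records."""
    body = json.dumps(build_osv_query(package, ecosystem)).encode()
    req = request.Request(
        OSV_QUERY_URL,
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:
        return json.load(resp).get("vulns", [])

# Example (performs a live network call):
# for vuln in fetch_vulns("vllm"):
#     print(vuln["id"], vuln.get("summary", ""))
```

Each returned record carries an `id` (e.g. a GHSA identifier), a summary, and affected-version ranges, which is how the "Fixed in" fields below are derived.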

vLLM affected by RCE via auto_map dynamic module loading during model initialization

Published: 2026-01-21 Fixed in: 0.14.0

vLLM is vulnerable to Server-Side Request Forgery (SSRF) through `MediaConnector` class

Published: 2025-10-07 Fixed in: 0.11.0

vLLM: Unauthenticated OOM Denial of Service via Unbounded `n` Parameter in OpenAI API Server

Published: 2026-04-03 Fixed in: 0.19.0

Potential Timing Side-Channel Vulnerability in vLLM's Chunk-Based Prefix Caching

Published: 2025-05-28 Fixed in: 0.9.0

vLLM RCE in video processing

Published: 2026-02-02 Fixed in: 0.14.1

vLLM Deserialization of Untrusted Data vulnerability

Published: 2025-03-20 No fix available

vLLM vulnerable to DoS via large Chat Completion or Tokenization requests with specially crafted `chat_template_kwargs`

Published: 2025-11-20 Fixed in: 0.11.1

vLLM: Resource-Exhaustion (DoS) through Malicious Jinja Template in OpenAI-Compatible Server

Published: 2025-10-07 Fixed in: 0.11.0

vLLM DoS: remotely crash vLLM over HTTP with an invalid JSON schema

Published: 2025-05-28 Fixed in: 0.9.0

vLLM: hardcoded trust override in model files enables RCE despite explicit user opt-out

Published: 2026-03-27 Fixed in: 0.18.0

Alternatives to vLLM

Other Machine Learning projects in the Python ecosystem worth evaluating.

Support Options for vLLM

Enterprise Support via DepKeep

Get SLA-backed support, security patches, and direct access to senior engineers for vLLM, without relying on volunteer maintainers.

Talk to an Expert →