CVE-2025-48944

Source
https://nvd.nist.gov/vuln/detail/CVE-2025-48944
Import Source
https://storage.googleapis.com/cve-osv-conversion/osv-output/CVE-2025-48944.json
JSON Data
https://api.osv.dev/v1/vulns/CVE-2025-48944
Aliases
Related
Published
2025-05-30T19:15:30Z
Modified
2025-07-02T05:39:37Z
Summary
[none]
Details

vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, the vLLM backend serving the OpenAI-compatible /v1/chat/completions endpoint fails to validate unexpected or malformed input in the "pattern" and "type" fields when the tools functionality is invoked. Because these fields are compiled or parsed without prior validation, a single request can crash the inference worker, and the worker remains down until it is restarted. Version 0.9.0 fixes the issue.
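
The mitigation concept can be illustrated with a minimal sketch. This is not vLLM's actual patch; the names below (validate_tool_parameters, ALLOWED_JSON_SCHEMA_TYPES, ToolSchemaError) are hypothetical. The idea is to check the "type" field against known JSON Schema types and test-compile any "pattern" regex before the tool schema reaches the guided-decoding machinery, so a malformed value yields an error response instead of an unhandled exception in the worker.

```python
# Sketch only: pre-validate "pattern"/"type" in a tool parameter schema
# so a bad request is rejected before it can crash the inference worker.
import re

# JSON Schema primitive types accepted by this sketch (assumption, not vLLM's list).
ALLOWED_JSON_SCHEMA_TYPES = {
    "object", "array", "string", "number", "integer", "boolean", "null",
}


class ToolSchemaError(ValueError):
    """Invalid tool parameter schema; would map to an HTTP 400 response."""


def validate_tool_parameters(schema: dict) -> None:
    """Recursively check the 'type' and 'pattern' fields of a JSON schema."""
    if not isinstance(schema, dict):
        raise ToolSchemaError("schema must be a JSON object")

    type_field = schema.get("type")
    if type_field is not None and type_field not in ALLOWED_JSON_SCHEMA_TYPES:
        raise ToolSchemaError(f"unsupported 'type': {type_field!r}")

    pattern = schema.get("pattern")
    if pattern is not None:
        if not isinstance(pattern, str):
            raise ToolSchemaError("'pattern' must be a string")
        try:
            # Catch bad regexes here, at request-validation time,
            # rather than letting them blow up inside the worker.
            re.compile(pattern)
        except re.error as exc:
            raise ToolSchemaError(f"invalid 'pattern': {exc}") from exc

    # Recurse into nested property and item definitions.
    for sub in schema.get("properties", {}).values():
        validate_tool_parameters(sub)
    items = schema.get("items")
    if isinstance(items, dict):
        validate_tool_parameters(items)


if __name__ == "__main__":
    # A malformed regex in "pattern" is rejected up front.
    bad_schema = {
        "type": "object",
        "properties": {"zip": {"type": "string", "pattern": "[0-9"}},
    }
    try:
        validate_tool_parameters(bad_schema)
    except ToolSchemaError as exc:
        print("rejected request:", exc)
```

Running the sketch against the unterminated character class "[0-9" prints a rejection message, illustrating how early validation keeps a single malformed tool definition from taking the worker down.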

References

Affected packages

Git / github.com/vllm-project/vllm

Affected ranges

Type
GIT
Repo
https://github.com/vllm-project/vllm
Events