
vllm v0.16.0

A high-throughput and memory-efficient inference and serving engine for LLMs

PyPI · vLLM Team · First seen Feb 26, 2026

Total: 0 · Critical: 0 · High: 0 · Medium: 0

Findings

No findings detected — this package appears clean.