Status: Active investigation
Last updated: March 30, 2026
Update (March 30): A clean release, LiteLLM v1.83.0, is now available. It was published through our new CI/CD v2 pipeline, which adds isolated build environments, stronger security gates, and stricter release separation for LiteLLM.
Update (March 27): Review the Townhall updates, including an explanation of the incident, what we've done, and what comes next. Learn more
Update (March 27): Added Verified safe versions section with SHA-256 checksums for all audited PyPI and Docker releases.
Update (March 26): Added checkmarx[.]zone to Indicators of compromise
Update (March 25): Added community-contributed scripts for scanning GitHub Actions and GitLab CI pipelines for the compromised versions. See How to check if you are affected. Thanks to @Zach Fury for these scripts.
- The compromised PyPI packages were litellm==1.82.7 and litellm==1.82.8. Those packages were live on March 24, 2026 from 10:39 UTC for about 40 minutes before being quarantined by PyPI.
- We believe that the compromise originated from the Trivy dependency used in our CI/CD security scanning workflow.
- Customers running the official LiteLLM Proxy Docker image were not impacted. That deployment path pins dependencies in requirements.txt and does not rely on the compromised PyPI packages.
We paused all new LiteLLM releases until we completed a broader supply-chain review and confirmed the release path is safe. Updated: We have since released a clean version, LiteLLM v1.83.0, through our new CI/CD v2 pipeline, which adds isolated build environments, stronger security gates, and stricter release separation. We have also verified that the codebase is safe and that no malicious code was pushed to main.
Overview
LiteLLM AI Gateway is investigating a suspected supply chain attack involving unauthorized PyPI package publishes. Current evidence suggests a maintainer's PyPI account may have been compromised and used to distribute malicious code.
At this time, we believe this incident may be linked to the broader Trivy security compromise, in which stolen credentials were reportedly used to gain unauthorized access to the LiteLLM publishing pipeline.
This investigation is ongoing. Details below may change as we confirm additional findings.
Confirmed affected versions
The following LiteLLM versions published to PyPI were impacted:
- v1.82.7: contained a malicious payload in the LiteLLM AI Gateway proxy_server.py
- v1.82.8: contained litellm_init.pth and a malicious payload in the LiteLLM AI Gateway proxy_server.py
If you installed or ran either of these versions, review the recommendations below immediately.
Note: These versions have already been removed from PyPI.
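If you are unsure which version a given environment is running, the installed version can be compared against the two compromised releases. A minimal sketch using the standard library; the helper names are illustrative, not part of LiteLLM:

```python
# Sketch: flag an environment whose installed litellm is one of the
# compromised releases (1.82.7 or 1.82.8). Helper names are illustrative.
from importlib import metadata

COMPROMISED_VERSIONS = {"1.82.7", "1.82.8"}

def is_compromised_version(version: str) -> bool:
    """True if the given version string matches a known-bad release."""
    return version in COMPROMISED_VERSIONS

def installed_litellm_compromised() -> bool:
    """Check the litellm package installed in the current environment."""
    try:
        return is_compromised_version(metadata.version("litellm"))
    except metadata.PackageNotFoundError:
        return False  # litellm is not installed here
```

Run this in each virtualenv or container image you suspect, since every environment has its own installed copy.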
What happened
Initial evidence suggests the attacker bypassed official CI/CD workflows and uploaded malicious packages directly to PyPI.
These compromised versions appear to have included a credential stealer designed to:
- Harvest secrets by scanning for:
  - environment variables
  - SSH keys
  - cloud provider credentials (AWS, GCP, Azure)
  - Kubernetes tokens
  - database passwords
- Encrypt and exfiltrate data via a POST request to models.litellm.cloud, which is not an official BerriAI / LiteLLM domain
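Based on the indicators above, a local environment can be scanned for the litellm_init.pth dropper and for references to the exfiltration domain inside installed litellm files. A hedged sketch, not an official detection tool; the function name and output format are our own:

```python
# Sketch: scan site-packages directories for two indicators of compromise
# described in this incident: the litellm_init.pth dropper file and the
# attacker-controlled domain embedded in installed litellm sources.
import site
from pathlib import Path

IOC_DOMAIN = b"models.litellm.cloud"

def scan_site_packages() -> list[str]:
    """Return human-readable findings; an empty list means no IOCs found."""
    findings = []
    roots = list(site.getsitepackages()) + [site.getusersitepackages()]
    for sp in roots:
        root = Path(sp)
        if not root.is_dir():
            continue
        pth = root / "litellm_init.pth"
        if pth.exists():
            findings.append(f"dropper file found: {pth}")
        # Search installed litellm sources for the exfiltration domain.
        for py in (root / "litellm").rglob("*.py"):
            if IOC_DOMAIN in py.read_bytes():
                findings.append(f"IOC domain found in: {py}")
    return findings
```

An empty result from this script is not proof of safety on its own; treat any machine that ran a compromised version as exposed and rotate its secrets.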
Who is affected
You may be affected if any of the following are true:
- You installed or upgraded LiteLLM via pip on March 24, 2026, between 10:39 UTC and 16:00 UTC
- You ran pip install litellm without pinning a version and received v1.82.7 or v1.82.8
- You built a Docker image during this window that included pip install litellm without a pinned version
- A dependency in your project pulled in LiteLLM as a transitive, unpinned dependency (for example, through AI agent frameworks, MCP servers, or LLM orchestration tools)
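For projects rather than live environments, dependency manifests and lock files can be searched for pins of the two bad releases. A small sketch; the set of file names and the regex are assumptions, so extend them for whatever lock formats your projects use:

```python
# Sketch: recursively find manifest/lock files that pin a compromised
# litellm release. File names and pattern are assumptions to adapt.
import re
from pathlib import Path

# Matches pins of the two compromised releases, e.g. "litellm==1.82.7".
BAD_PIN = re.compile(r"litellm\s*==\s*1\.82\.[78]\b")

LOCK_FILES = ("requirements.txt", "requirements.lock", "poetry.lock", "uv.lock")

def scan_project(root: str = ".") -> list[str]:
    """Return paths of files under root that pin a compromised version."""
    hits = []
    for name in LOCK_FILES:
        for path in Path(root).rglob(name):
            if BAD_PIN.search(path.read_text(errors="ignore")):
                hits.append(str(path))
    return hits
```

Note that a clean lock file does not rule out exposure from an unpinned `pip install litellm` run during the affected window; check install logs and image build timestamps as well.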
You are not affected if any of the following are true:
- You are using LiteLLM Cloud
- You are running the official LiteLLM AI Gateway/Proxy Docker image (ghcr.io/berriai/litellm). That deployment path pins dependencies in requirements.txt and does not rely on the compromised PyPI packages
- You are on v1.82.6 or earlier and did not upgrade during the affected window
- You installed LiteLLM from source via the GitHub repository, which was not compromised
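When pulling a known-good artifact, its checksum can be compared against the published SHA-256 values from the Verified safe versions section mentioned in the March 27 update. A minimal sketch; the file name and expected digest in the usage comment are placeholders, not real release values:

```python
# Sketch: compute a file's SHA-256 so it can be compared against the
# published checksums for audited releases.
import hashlib

def sha256_of(path: str) -> str:
    """Stream a file in chunks and return its hex SHA-256 digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

# Usage (placeholder values; substitute the published checksum):
# expected = "<digest from the Verified safe versions section>"
# assert sha256_of("litellm-1.83.0-py3-none-any.whl") == expected
```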
How to check if you are affected