| Version | Supported |
|---|---|
| 0.1.x | ✅ |
The llm-cache project takes security seriously. If you discover a security vulnerability, please report it privately rather than opening a public issue.
To report a vulnerability:
- Go to https://github.com/reaatech/llm-cache/security/advisories/new
- Fill in the advisory form with a detailed description
- Include steps to reproduce if possible
You can expect:
- Acknowledgment within 48 hours
- A resolution timeline within 5 business days
- Credit in the release notes (unless you request anonymity)
When using the `EncryptionService`, always provide a unique salt via the `salt` option, or persist the auto-generated salt from `encryption.salt`. Reusing the same passphrase without a unique salt across installations weakens key derivation. See `packages/core/src/utils/encryption.ts`.
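A minimal sketch of why the unique salt matters, using Node's built-in `scrypt` for key derivation. The `deriveKey` helper below is illustrative only and is not the actual `EncryptionService` implementation; only the option name `salt` comes from the text above.

```typescript
import { randomBytes, scryptSync } from "node:crypto";

// Hypothetical helper: derive an encryption key from a passphrase,
// generating a fresh random salt when none is supplied.
function deriveKey(passphrase: string, salt?: Buffer) {
  const effectiveSalt = salt ?? randomBytes(16);
  const key = scryptSync(passphrase, effectiveSalt, 32);
  // Persist effectiveSalt alongside the ciphertext so the key can be
  // re-derived later (analogous to persisting encryption.salt).
  return { key, salt: effectiveSalt };
}

// Two installations sharing a passphrase but using unique salts
// derive different keys, so compromising one does not expose the other.
const a = deriveKey("correct horse battery staple");
const b = deriveKey("correct horse battery staple");
```

Reusing one fixed salt everywhere would make `a.key` and `b.key` identical, enabling precomputed-dictionary attacks across installations.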
- Never commit `.env` files to version control
- Use short-lived API keys where possible
- Rotate `LLM_CACHE_API_KEY` regularly
- Store AWS credentials via IAM roles or environment variables, never in code
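The "environment variables, never in code" guidance above can be sketched as a small startup check. The `requireApiKey` helper is hypothetical; only the variable name `LLM_CACHE_API_KEY` comes from the list above.

```typescript
// Hypothetical sketch: read the API key from the environment at startup
// and fail fast if it is missing, rather than hard-coding a secret.
function requireApiKey(): string {
  const key = process.env.LLM_CACHE_API_KEY;
  if (!key) {
    throw new Error("LLM_CACHE_API_KEY is not set; refusing to start");
  }
  return key;
}
```

Failing fast at startup keeps a misconfigured deployment from silently running unauthenticated, and keeps the secret out of source control entirely.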
- Always use authentication (passwords/API keys) in production
- Bind services to localhost when possible, or use a private network
- Enable TLS for connections over untrusted networks
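The first two points above (always authenticate, bind to localhost) can be sketched with Node's built-in HTTP server. The `x-api-key` header name is an assumption for illustration, not the project's actual authentication scheme.

```typescript
import { createServer } from "node:http";

// Hedged sketch: require an API key on every request and bind only to
// the loopback interface so the service is unreachable from other hosts.
const server = createServer((req, res) => {
  // Assumed header name; substitute whatever scheme your deployment uses.
  if (req.headers["x-api-key"] !== process.env.LLM_CACHE_API_KEY) {
    res.writeHead(401).end("unauthorized");
    return;
  }
  res.writeHead(200).end("ok");
});

// Passing "127.0.0.1" as the host restricts the listener to localhost;
// omit it (or pass "0.0.0.0") only behind a private network or TLS proxy.
server.listen(0, "127.0.0.1");
```

For traffic that must cross an untrusted network, terminate TLS in front of this listener rather than exposing it directly.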
The logger automatically redacts sensitive fields (passwords, API keys, tokens, prompts) from log output. If you extend the logging system, ensure you maintain this redaction for any fields that may contain PII or credentials.
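If you extend the logging system, the redaction described above amounts to filtering known-sensitive keys before a record is emitted. The field list and `redactFields` helper below are illustrative, not the project's actual logger API.

```typescript
// Hypothetical sketch of log-field redaction: replace the values of
// sensitive keys with a placeholder before the record is written out.
const SENSITIVE_FIELDS = new Set(["password", "apiKey", "token", "prompt"]);

function redactFields(record: Record<string, unknown>): Record<string, unknown> {
  const out: Record<string, unknown> = {};
  for (const [key, value] of Object.entries(record)) {
    out[key] = SENSITIVE_FIELDS.has(key) ? "[REDACTED]" : value;
  }
  return out;
}

// Non-sensitive fields pass through untouched; credentials and prompts do not.
const safe = redactFields({ user: "alice", apiKey: "sk-123", prompt: "hi" });
```

Any new field that may carry PII or credentials should be added to the sensitive set, so redaction keeps pace with the schema.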
We run `pnpm audit` on every CI run. Dependabot is configured to open PRs for critical security updates. Please keep your dependencies up to date.
We follow a 90-day coordinated disclosure policy. Critical fixes are released as patch versions as soon as they are available.