Muninn is a memory-backed RAG assistant for portfolio sites. It answers from a bundled corpus, streams responses from Groq, and returns citation metadata so every answer can point back to its source.
Named after Odin's raven of memory, Muninn is designed to be small, cheap to run, and easy to deploy.
- Build-time MiniLM embeddings with `@xenova/transformers`
- Bundled JSON vector index, no hosted vector database
- Groq `llama-3.3-70b-versatile` streaming responses
- Citation chips via the `x-muninn-citations` response header
- Optional Upstash Redis rate limiting
- Next.js App Router API route at `/api/muninn`
- Next.js 16
- React 19
- Vercel AI SDK
- Groq
- MiniLM local embeddings
- Upstash Ratelimit (optional)
```
npm install
cp .env.example .env.local
npm run build:muninn
npm run dev
```

Open http://localhost:3000.
Required:

```
GROQ_API_KEY=
```

Optional:

```
UPSTASH_REDIS_REST_URL=
UPSTASH_REDIS_REST_TOKEN=
```

If the Upstash variables are missing, Muninn still runs locally and skips rate limiting.
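The "skip rate limiting when Upstash is not configured" behavior can be sketched as a guard around whatever limiter the route uses. This is a minimal illustration with hypothetical names (`upstashConfigured`, `checkRateLimit`, the `limit` callback), not the route's actual code:

```typescript
// Hypothetical sketch of the optional rate-limit guard. When the Upstash
// env vars are absent, the guard resolves to "allowed" so local
// development needs no Redis instance at all.
type LimitResult = { success: boolean };

function upstashConfigured(env: Record<string, string | undefined>): boolean {
  return Boolean(env.UPSTASH_REDIS_REST_URL && env.UPSTASH_REDIS_REST_TOKEN);
}

async function checkRateLimit(
  env: Record<string, string | undefined>,
  // In the real app this would wrap e.g. Upstash Ratelimit's limit() call.
  limit: (ip: string) => Promise<LimitResult>,
  ip: string,
): Promise<LimitResult> {
  if (!upstashConfigured(env)) return { success: true }; // limiting skipped
  return limit(ip);
}
```

The guard keeps the Redis dependency strictly optional: the route only touches the limiter when both credentials are present.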
Muninn's memory lives in `data/corpus.json`.
After editing the corpus, rebuild the index:

```
npm run build:muninn
```

This writes `data/muninn-index.json`, which is imported directly by the runtime route. The app does not need a database or a vector-store service.
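Because the index is a plain JSON file, retrieval at request time can be as simple as a cosine-similarity scan over its entries. A minimal sketch, assuming a hypothetical entry shape of `{ id, text, embedding }` (the real layout of `data/muninn-index.json` may differ):

```typescript
// Hypothetical index entry shape; the actual file layout may differ.
type IndexEntry = { id: string; text: string; embedding: number[] };

// Cosine similarity between two equal-length vectors.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Return the k entries most similar to the query embedding.
function topK(index: IndexEntry[], query: number[], k: number): IndexEntry[] {
  return [...index]
    .sort((x, y) => cosine(y.embedding, query) - cosine(x.embedding, query))
    .slice(0, k);
}
```

With a corpus this small, a full scan is cheap enough that no approximate-nearest-neighbor structure is needed, which is what makes the "no vector-store service" design workable.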
`POST /api/muninn`

Request:

```json
{
  "messages": [
    { "role": "user", "content": "Explain Muninn's RAG architecture." }
  ]
}
```

Response:

- Body: plain text stream
- Header: `x-muninn-citations`, URL-encoded JSON citation metadata
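On the client, the citation metadata can be recovered by URL-decoding the header and parsing it as JSON. A small sketch; the `Citation` field names here are illustrative assumptions, not the header's guaranteed schema:

```typescript
// Assumed citation shape for illustration; the real metadata may carry
// different fields.
type Citation = { source: string; title?: string };

// Decode the x-muninn-citations header value (URL-encoded JSON).
// Returns an empty list when the header is absent.
function decodeCitations(headerValue: string | null): Citation[] {
  if (!headerValue) return [];
  return JSON.parse(decodeURIComponent(headerValue)) as Citation[];
}
```

In a fetch-based client this would typically be called as `decodeCitations(response.headers.get("x-muninn-citations"))` before rendering the citation chips.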
```
npm run dev          # start local dev server
npm run build:muninn # rebuild vector index
npm run build        # build index, then build Next app
npm run start        # start production server
npm run lint         # run ESLint
```

- Push this folder as its own repository.
- Add `GROQ_API_KEY` to the deployment environment.
- Optionally add Upstash Redis REST credentials.
- Deploy with the default Next.js build command: `npm run build`.
MIT
