Brainframe

A provider-agnostic CLI agent that reads your code, writes files, runs commands, and searches your codebase. The brain is pluggable — use Claude, GPT, Gemini, DeepSeek, or run fully local with Ollama. Zero AI SDK dependencies.

Third in the trilogy:

  • Blackbox — curates what the AI knows
  • Blueprint — structures how the AI is briefed
  • Brainframe — the brain that acts

Quick Start

Prerequisites

  • Bun runtime (curl -fsSL https://bun.sh/install | bash)
  • An API key from any supported provider, or Ollama for free local inference

Install

```shell
git clone https://github.com/vj-bunbun/brainframe.git
cd brainframe
bun install
```

Option A: Cloud Provider (Anthropic, OpenAI, etc.)

Pick a provider, set the key, run:

```
export ANTHROPIC_API_KEY=sk-ant-...
bun run src/cli.ts
Brainframe — anthropic/claude-sonnet-4-20250514
Type /exit to quit, /clear to reset conversation.

you> what files are in this project?
```

Brainframe reads your codebase, calls tools, and streams the response — just like Claude Code, but you own every line.

Option B: Local with Ollama (Free, Private)

Nothing leaves your machine. No API key. No cost.

```
# Install Ollama (https://ollama.com) then pull a model
ollama pull llama3.1

# Run Brainframe against it
bun run src/cli.ts --provider ollama
Brainframe — ollama/llama3.1
Type /exit to quit, /clear to reset conversation.

you> find all TODO comments in this project
  [tool] Grep
           /TODO/ .
  [result] src/agent.ts:42: // TODO: add parallel tool execution
           src/cli.ts:18: // TODO: add history

Found 2 TODOs across the project...
```

Option C: Point at Any Project

Brainframe loads CLAUDE.md from the current directory as its system prompt. Just cd into your project:

```shell
cd ~/my-project
bun run ~/brainframe/src/cli.ts --provider ollama --model codellama
```

Or pass a custom system prompt:

```shell
bun run src/cli.ts --system ./my-custom-prompt.md --provider deepseek
```
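The resolution order described here (an explicit `--system` path, else `CLAUDE.md` in the current directory) can be sketched as follows. This is a hypothetical illustration, not the actual `src/system.ts`; the fallback prompt string is an assumption.

```typescript
// Hypothetical sketch of system-prompt resolution: prefer an explicit
// --system path, else CLAUDE.md in the working directory, else a generic
// fallback (the fallback text is an assumption, not Brainframe's actual one).
import { existsSync, readFileSync } from 'node:fs';
import { join } from 'node:path';

function loadSystemPrompt(cwd: string, explicitPath?: string): string {
  const candidate = explicitPath ?? join(cwd, 'CLAUDE.md');
  if (existsSync(candidate)) return readFileSync(candidate, 'utf8');
  // A missing explicit path is an error; a missing CLAUDE.md just means defaults.
  if (explicitPath) throw new Error(`system prompt not found: ${explicitPath}`);
  return 'You are a helpful coding agent.';
}
```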

Providers

Brainframe talks to any model through raw fetch() + SSE. No SDKs.

| Provider | Flag | Default Model | API Key Env Var |
| --- | --- | --- | --- |
| Anthropic | `--provider anthropic` | claude-sonnet-4-20250514 | `ANTHROPIC_API_KEY` |
| OpenAI | `--provider openai` | gpt-4o | `OPENAI_API_KEY` |
| Ollama | `--provider ollama` | llama3.1 | none (local) |
| DeepSeek | `--provider deepseek` | deepseek-chat | `DEEPSEEK_API_KEY` |
| Groq | `--provider groq` | llama-3.3-70b-versatile | `GROQ_API_KEY` |
| Google | `--provider google` | gemini-2.0-flash | `GOOGLE_API_KEY` |
| Custom | `--provider http://host:port/v1` | set with `--model` | optional |
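"Raw fetch() + SSE" means each provider is just an HTTP request plus a Server-Sent-Events line parser. A minimal sketch of such a parser (hypothetical names; not the actual `src/providers/sse.ts`):

```typescript
// Minimal SSE parser sketch (hypothetical; not Brainframe's actual code).
// Feed it raw text chunks from a fetch() response body; it returns the JSON
// payload of each complete `data:` line, buffering partial lines and skipping
// blanks and the OpenAI-style [DONE] sentinel.
function createSSEParser() {
  let buffer = '';
  return function feed(chunk: string): unknown[] {
    buffer += chunk;
    const events: unknown[] = [];
    let newline: number;
    while ((newline = buffer.indexOf('\n')) !== -1) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (!line.startsWith('data:')) continue; // comments, blank keep-alives
      const payload = line.slice(5).trim();
      if (payload === '[DONE]') continue;      // end-of-stream sentinel
      events.push(JSON.parse(payload));
    }
    return events;
  };
}
```

Because the parser buffers across calls, it handles payloads split mid-line across network chunks, which real SSE streams routinely do.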

Switch providers any time — same conversation format, same tools, different brain:

```shell
# Fast and cheap
bun run src/cli.ts -p groq -m llama-3.3-70b-versatile

# Most capable
bun run src/cli.ts -p anthropic -m claude-sonnet-4-20250514

# Fully private
bun run src/cli.ts -p ollama -m deepseek-coder-v2
```

Set Defaults (Skip the Flags)

Add to your shell profile (~/.bashrc, ~/.zshrc, etc.):

```shell
export BRAINFRAME_PROVIDER=ollama
export BRAINFRAME_MODEL=llama3.1
```

Then just:

```shell
cd ~/my-project
bun run ~/brainframe/src/cli.ts
```
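The implied precedence — an explicit flag beats a `BRAINFRAME_*` env var, which beats a built-in default — can be sketched as a tiny resolver. This is illustrative only; the default of `anthropic` is an assumption, not confirmed by the docs.

```typescript
// Hypothetical resolver sketch: CLI flag > BRAINFRAME_PROVIDER env var >
// built-in default ('anthropic' here is an assumption).
function resolveProvider(
  flag: string | undefined,
  env: Record<string, string | undefined>,
): string {
  return flag ?? env.BRAINFRAME_PROVIDER ?? 'anthropic';
}
```

In the CLI itself you would call this as `resolveProvider(parsedFlags.provider, process.env)`.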

Commands

| Command | Action |
| --- | --- |
| /clear | Reset conversation |
| /exit | Quit |

What It Looks Like

```
Brainframe — anthropic/claude-sonnet-4-20250514
Type /exit to quit, /clear to reset conversation.

you> there's a bug in the login flow — users get redirected to a blank page after oauth

  [tool] Grep
           /oauth|redirect|login/ src/
  [result] src/auth/callback.ts:14: const redirectUrl = ...
           src/auth/callback.ts:28: res.redirect(redirectUrl)
           src/auth/login.ts:8: ...

  [tool] Read
           src/auth/callback.ts
  [result] 1  import { ...

Found it — line 28 in callback.ts redirects to `redirectUrl` but that
variable is null when the OAuth state cookie expires. The fix:

  [tool] Write
           src/auth/callback.ts
  [result] Updated: src/auth/callback.ts

Added a fallback: if `redirectUrl` is null, redirect to `/dashboard`
instead of passing null to `res.redirect()`.

you> nice, does it handle the case where the session also expired?

  [tool] Read
           src/auth/session.ts
  [result] 1  import { ...

Yes — session.ts already checks for expired sessions on line 42 and
redirects to /login. The cookie expiry was the only gap.
```

The agent reads, writes, searches, and runs commands — looping as many times as needed until the task is done.
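That stream → tools → loop cycle can be sketched in a few lines. This is a hypothetical simplification of what `src/agent.ts` does, with invented helper names (`streamTurn`, `runTool`):

```typescript
// Hypothetical agent-loop sketch: stream one model turn, execute any
// requested tool calls, feed the results back, repeat until the model
// stops asking for tools (or a step budget runs out).
type ToolCall = { name: string; input: unknown };
type Turn = { text: string; toolCalls: ToolCall[] };

async function runAgent(
  streamTurn: (history: string[]) => Promise<Turn>,
  runTool: (call: ToolCall) => Promise<string>,
  userMessage: string,
  maxSteps = 10,
): Promise<string> {
  const history = [userMessage];
  for (let step = 0; step < maxSteps; step++) {
    const turn = await streamTurn(history);
    history.push(turn.text);
    if (turn.toolCalls.length === 0) return turn.text; // no tools: done
    for (const call of turn.toolCalls) {
      history.push(await runTool(call)); // feed tool output back to the model
    }
  }
  throw new Error('agent exceeded step budget');
}
```

The step budget is a common safety valve in loops like this; whether Brainframe caps iterations is not stated in this README.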

Tools

Brainframe ships with 5 tools:

| Tool | What it does |
| --- | --- |
| Bash | Run shell commands |
| Read | Read files with line numbers |
| Write | Create or update files |
| Grep | Search file contents (ripgrep) |
| Glob | Find files by pattern |

Adding a Tool

Create a file in src/tools/ and register it in src/tools/index.ts:

```typescript
// src/tools/my-tool.ts
import { z } from 'zod';
import { defineTool } from './types.js';

export const myTool = defineTool({
  name: 'MyTool',
  description: 'What the tool does.',
  inputSchema: z.object({
    input: z.string().describe('What this input is'),
  }),
  async call(input) {
    // do the thing
    return { content: 'result' };
  },
});
```

```typescript
// src/tools/index.ts
import { myTool } from './my-tool.js';
export const tools: Tool[] = [bashTool, readTool, writeTool, grepTool, globTool, myTool];
```

With Blackbox + Blueprint

Brainframe is designed to consume output from its companion tools:

```shell
# 1. Blackbox assembles relevant project knowledge
cd ~/blackbox/scripts
bun run context.ts --vault ~/my-vault --task "fixing auth" --output ~/prompts/_context.md

# 2. Blueprint builds the system prompt
cd ~/blueprint/scripts
bun run build.ts --dir ~/prompts --execute --output ~/project/CLAUDE.md

# 3. Brainframe runs with full context
cd ~/project
bun run ~/brainframe/src/cli.ts --provider ollama --model deepseek-coder-v2
```

Brainframe automatically loads CLAUDE.md from the current directory as its system prompt. No configuration needed — just run it where your code lives.

Architecture

```
src/
├── cli.ts              Entry point, readline loop
├── agent.ts            Core agent loop (stream → tools → loop)
├── system.ts           System prompt file loader
├── tools/
│   ├── types.ts        Tool interface + helpers
│   ├── index.ts        Tool registry
│   ├── bash.ts         Shell execution
│   ├── read.ts         File reading
│   ├── write.ts        File writing
│   ├── grep.ts         ripgrep wrapper
│   └── glob.ts         File pattern matching
└── providers/
    ├── types.ts        Provider-agnostic message types
    ├── sse.ts          Shared SSE stream parser
    ├── anthropic.ts    Anthropic API (raw fetch)
    ├── openai.ts       OpenAI-compatible (OpenAI, Ollama, DeepSeek, Groq)
    ├── google.ts       Google Gemini API
    └── index.ts        Provider registry + resolver
```

~1,200 lines. Three dependencies: chalk, zod, and zod-to-json-schema.

Adding a Provider

Implement the Provider interface from src/providers/types.ts:

```typescript
interface Provider {
  name: string;
  stream(params: StreamParams): AsyncGenerator<StreamEvent>;
  validate(): string | null;
}
```
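A toy implementation shows how little the interface demands. The `StreamParams` and `StreamEvent` shapes below are assumptions for illustration — the real types live in `src/providers/types.ts` and are not reproduced in this README:

```typescript
// Hypothetical minimal provider. The StreamParams/StreamEvent shapes are
// assumed, not Brainframe's actual types. This "echo" provider streams the
// last user message back word by word — enough to exercise an agent loop
// without any API key.
type StreamParams = { messages: { role: string; content: string }[] };
type StreamEvent = { type: 'text'; text: string } | { type: 'done' };

interface Provider {
  name: string;
  stream(params: StreamParams): AsyncGenerator<StreamEvent>;
  validate(): string | null; // null = ready; string = error to show the user
}

const echoProvider: Provider = {
  name: 'echo',
  validate: () => null, // no credentials required
  async *stream(params) {
    const last = params.messages.at(-1)?.content ?? '';
    for (const word of last.split(' ')) {
      yield { type: 'text', text: word + ' ' };
    }
    yield { type: 'done' };
  },
};
```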

Your provider receives Brainframe's message types and streams Brainframe's event types. The agent loop doesn't care where the intelligence comes from.

License

MIT
