Research-inspired compatible runtime for Claude Code style terminal workflows.
This repository is an unofficial compatible implementation inspired by public research, including sourcemap-based analysis of @anthropic-ai/claude-code.
- It is not a mirror of any Anthropic internal repository.
- It does not claim behavioral parity with Anthropic's official product.
- It exists to study and build reproducible agent CLI architecture in the open.
- claude-code-runtime: the runnable compatibility-focused runtime
- claude-code-playbook: the docs and architecture companion
- operator-kernel: the next-generation agent kernel extracted from the learnings
- CLI entrypoint and command registry
- Agent loop and tool execution
- Session context and config loading
- Provider adapters for Anthropic and OpenAI-compatible APIs
- Plugin hook surface for extension
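To make the component list above concrete, here is a minimal sketch of how a tool registry and a single tool-execution step could fit together. All names here (`Tool`, `ToolRegistry`, `runToolCall`) are illustrative, not the runtime's actual API:

```typescript
// Hypothetical sketch: a tool registry plus one execution step of the
// agent loop. Real tools would also carry schemas and permissions.
type Tool = {
  name: string;
  run: (input: string) => Promise<string>;
};

class ToolRegistry {
  private tools = new Map<string, Tool>();
  register(tool: Tool): void {
    this.tools.set(tool.name, tool);
  }
  get(name: string): Tool | undefined {
    return this.tools.get(name);
  }
}

// One loop iteration: the planner proposes a tool call; the runtime
// executes it and returns the result to feed back into context.
async function runToolCall(
  registry: ToolRegistry,
  name: string,
  input: string
): Promise<string> {
  const tool = registry.get(name);
  if (!tool) return `error: unknown tool "${name}"`;
  return tool.run(input);
}
```

An unknown tool name yields an error string rather than a thrown exception, so a single bad tool call cannot take down the whole loop.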
```
pnpm install
pnpm test
pnpm dev run "summarize my workspace"
```

By default, claude-code-runtime uses a local deterministic planning path, so the CLI remains runnable without external credentials.
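The fallback selection might look roughly like the sketch below; `selectPlanner`, `deterministicPlan`, and the step names are illustrative assumptions, not the actual implementation:

```typescript
// Hypothetical sketch of the offline fallback: without --live and valid
// credentials, planning stays local and reproducible.
type PlanStep = { tool: string; input: string };

function deterministicPlan(prompt: string): PlanStep[] {
  // A trivial, reproducible plan: inspect the workspace, then summarize.
  return [
    { tool: "list_files", input: "." },
    { tool: "summarize", input: prompt },
  ];
}

function selectPlanner(live: boolean, hasCredentials: boolean): string {
  // Only go live when the flag is set AND credentials actually resolved.
  return live && hasCredentials ? "provider" : "deterministic";
}
```

The key property is that the deterministic path never touches the network, which is what keeps the CLI runnable in CI and on fresh machines.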
Use `--live` to call a real provider API:

```
pnpm dev run "inspect this repo" --provider anthropic --live
pnpm dev run "draft release notes" --provider openai-compatible --live
```

```
# Anthropic
export ANTHROPIC_AUTH_TOKEN=...
# optional
export ANTHROPIC_BASE_URL=https://api.anthropic.com
export ANTHROPIC_MODEL=claude-sonnet-4-20250514

# OpenAI-compatible
export OPENAI_API_KEY=...
# optional
export OPENAI_BASE_URL=https://api.openai.com/v1
export OPENAI_MODEL=gpt-5-mini
export OPENAI_WIRE_API=chat-completions
```

If `--live` is used without matching credentials, the runtime returns a clear failure message instead of crashing.
claude-code-runtime resolves provider credentials in this order:
- process environment
- a local `.env` file
- `~/.codex/auth.json` (for `OPENAI_API_KEY`)
This makes it easy to reuse the key already stored by Codex without copying secrets into your shell profile.
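The lookup chain can be sketched as below. The naive `.env` parsing and the assumption that `auth.json` stores the key under an `OPENAI_API_KEY` field are simplifications for illustration:

```typescript
import { readFileSync } from "node:fs";
import { homedir } from "node:os";
import { join } from "node:path";

// Hypothetical sketch of the resolution order described above:
// process env, then a local .env file, then ~/.codex/auth.json.
function resolveOpenAIKey(
  env: Record<string, string | undefined> = process.env
): string | undefined {
  // 1. Process environment wins.
  if (env.OPENAI_API_KEY) return env.OPENAI_API_KEY;

  // 2. Local .env: naive KEY=VALUE scan (real parsers handle quoting).
  try {
    for (const line of readFileSync(".env", "utf8").split("\n")) {
      const m = line.match(/^OPENAI_API_KEY=(.+)$/);
      if (m) return m[1].trim();
    }
  } catch {
    /* no .env file: fall through */
  }

  // 3. Codex auth file (field name assumed).
  try {
    const auth = JSON.parse(
      readFileSync(join(homedir(), ".codex", "auth.json"), "utf8")
    );
    if (typeof auth.OPENAI_API_KEY === "string") return auth.OPENAI_API_KEY;
  } catch {
    /* no auth.json: give up */
  }

  return undefined;
}
```

Putting the process environment first means an explicit `export` in the shell always overrides whatever is on disk.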
For OpenAI-compatible providers that implement the Responses API instead of Chat Completions, set:
```
export OPENAI_WIRE_API=responses
```

If the provider uses a root host like https://nimabo.cn, the runtime will normalize it to the matching OpenAI-style API path automatically.
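A sketch of that normalization, assuming the simple heuristic of appending `/v1` whenever the URL carries no path (the runtime's actual rules may differ):

```typescript
// Hypothetical sketch: normalize a bare host to an OpenAI-style /v1 path,
// while leaving URLs that already specify a path untouched.
function normalizeBaseUrl(raw: string): string {
  const url = new URL(raw);
  if (url.pathname === "" || url.pathname === "/") {
    url.pathname = "/v1";
  }
  // Drop a trailing slash so later path joining is consistent.
  return url.toString().replace(/\/$/, "");
}
```

So `https://nimabo.cn` becomes `https://nimabo.cn/v1`, while a URL that already ends in `/v1` passes through unchanged.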
If your environment exposes https_proxy, http_proxy, or all_proxy, live provider calls will route through that proxy automatically.
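Proxy discovery can be sketched as a simple first-match scan over those variables; the precedence order shown (https over http over all, lowercase checked before uppercase) is an assumption:

```typescript
// Hypothetical sketch: pick the proxy URL for live provider calls from
// the conventional environment variables.
function resolveProxyUrl(
  env: Record<string, string | undefined>
): string | undefined {
  for (const name of [
    "https_proxy", "HTTPS_PROXY",
    "http_proxy", "HTTP_PROXY",
    "all_proxy", "ALL_PROXY",
  ]) {
    const value = env[name];
    if (value) return value;
  }
  return undefined; // no proxy configured: connect directly
}
```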
See ROADMAP.md.