Experimental. Use at your own risk.
Use your Codex subscription as the backend for Claude Code — one subscription, both sides.
Minimal Anthropic Messages API gateway that talks to your local codex app-server.
Option A — global (recommended)
```bash
git clone https://github.com/EgonexAI/codex2cc.git
cd codex2cc
npm install -g .
```

Then you can run `codex-gateway` from any directory.
Option B — use without installing
From your project: `npx codex2cc start` (and `npx codex-gateway help` when needed).
1. Start the gateway from your project directory (so Codex uses it as workdir):
```bash
codex-gateway start
```

If not installed globally: `npx codex2cc start`
2. Set env and use Claude (other terminal or IDE):
macOS / Linux:
```bash
export ANTHROPIC_BASE_URL=http://127.0.0.1:8080
export ANTHROPIC_API_KEY=dummy
claude --setting-sources local
```

Windows (recommended): edit `%USERPROFILE%\.claude\settings.json`:

```json
{
  "env": {
    "ANTHROPIC_BASE_URL": "http://127.0.0.1:8080",
    "ANTHROPIC_API_KEY": "dummy"
  }
}
```

Then run `claude`.
If your user settings file (`~/.claude/settings.json`; Windows: `%USERPROFILE%\.claude\settings.json`) overrides the URL, run Claude with `--setting-sources local`.
| Option | Default | Description |
|---|---|---|
| `GATEWAY_PORT` | `8080` | HTTP port |
| `CODEX_PATH` | `codex` | Codex CLI path (on Windows, auto-resolves `.exe` / `.cmd` / `.bat`) |
| `CODEX_WORKDIR` | cwd | Working directory (where you run the gateway) |
| `CODEX_SANDBOX` | `workspace-write` | `read-only`, `workspace-write`, `danger-full-access`, `seatbelt` |
| `DEFAULT_CODEX_MODEL` | `gpt-5.2` | Fallback Codex model |
More: `AUTO_APPROVE`, `FORCE_STREAM_FALSE`, `AUTO_RESTART`, timeouts, etc. Run `codex-gateway help` (or `npx codex-gateway help`) for the full list of CLI flags.
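For a one-off run, the options above can also be set inline as environment variables. A sketch (the port and sandbox values here are examples, not defaults):

```shell
# Run the gateway on port 9090 with a read-only sandbox.
# GATEWAY_PORT and CODEX_SANDBOX are the options from the table above.
GATEWAY_PORT=9090 CODEX_SANDBOX=read-only codex-gateway start
```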
- `POST /v1/messages` — non-streaming.
- `POST /v1/messages/count_tokens`, `GET /v1/models` — minimal compatible responses.
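A quick smoke test against the gateway can be done with `curl`; the request body below follows the standard Anthropic Messages API shape (the model name is only an example and will be mapped as described below):

```shell
# Send a minimal Messages API request to the local gateway.
curl http://127.0.0.1:8080/v1/messages \
  -H 'content-type: application/json' \
  -d '{
    "model": "claude-sonnet-4-5",
    "max_tokens": 256,
    "messages": [{"role": "user", "content": "Say hello"}]
  }'
```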
Model mapping: `opus` → `gpt-5.3-codex`; `sonnet` / `haiku` → `gpt-5.2`; others use `DEFAULT_CODEX_MODEL`.
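In shell terms, that mapping behaves roughly like this (`map_model` is a hypothetical helper sketching the behavior, not the gateway's actual code):

```shell
# Sketch of the model mapping: match on the Anthropic model name.
map_model() {
  case "$1" in
    *opus*)           echo "gpt-5.3-codex" ;;
    *sonnet*|*haiku*) echo "gpt-5.2" ;;
    *)                echo "${DEFAULT_CODEX_MODEL:-gpt-5.2}" ;;
  esac
}

map_model "claude-opus-4-1"    # gpt-5.3-codex
map_model "claude-sonnet-4-5"  # gpt-5.2
```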
```bash
curl http://127.0.0.1:8080/healthz
```