11 changes: 9 additions & 2 deletions CLAUDE.md
@@ -418,11 +418,18 @@ Run coding agents via Google Cloud Vertex AI:

**Note:** Use `@` instead of `-` in model names for Vertex AI.

-### Alternative API Providers (GLM, Ollama, Kimi, Custom)
+### Alternative API Providers (GLM, Ollama, Kimi, GitHub Copilot, Custom)

Alternative providers are configured via the **Settings UI** (gear icon > API Provider section). Select a provider, set the base URL, auth token, and model — no `.env` changes needed.

-**Available providers:** Claude (default), GLM (Zhipu AI), Ollama (local models), Kimi (Moonshot), Custom
+**Available providers:** Claude (default), GLM (Zhipu AI), Ollama (local models), Kimi (Moonshot), GitHub Copilot (Enterprise), Custom

**GitHub Copilot (Enterprise) notes:**
- Uses the [copilot-api](https://github.com/ericc-ch/copilot-api) proxy to expose your GitHub Copilot Enterprise subscription as an Anthropic-compatible API
- Setup: `npx copilot-api@latest start` (authenticates via GitHub OAuth in browser)
- Available models: Claude Opus 4.6, Claude Opus 4.6 Fast, GPT 5.2 Codex, GPT 5.3 Codex
- No API key needed in AutoCoder — the proxy handles authentication
- Check proxy status: `curl http://localhost:4141/v1/models`

**Ollama notes:**
- Requires Ollama v0.14.0+ with Anthropic API compatibility
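
Since the GitHub Copilot notes above describe copilot-api as an Anthropic-compatible endpoint on `localhost:4141`, a quick way to sanity-check the proxy outside AutoCoder is to point the official `anthropic` Python SDK at it. This is a minimal sketch and not part of this diff; it assumes the proxy is already running via `npx copilot-api@latest start` and that a placeholder API key is acceptable because authentication happens through GitHub OAuth.

```python
# Minimal sketch (not part of this PR): call the copilot-api proxy directly
# with the official anthropic SDK. Assumes the proxy from the notes above is
# running on localhost:4141; the api_key is a placeholder since the proxy
# handles authentication via GitHub OAuth.
import anthropic

client = anthropic.Anthropic(
    base_url="http://localhost:4141",
    api_key="placeholder",  # required by the SDK, ignored by the proxy
)

response = client.messages.create(
    model="claude-opus-4.6",  # model id from the provider list above
    max_tokens=128,
    messages=[{"role": "user", "content": "ping"}],
)
print(response.content[0].text)
```

If this fails, the `curl http://localhost:4141/v1/models` check from the notes above is the fastest way to confirm the proxy is actually up.
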
5 changes: 4 additions & 1 deletion client.py
@@ -345,19 +345,22 @@ def create_client(
from registry import get_effective_sdk_env
sdk_env = get_effective_sdk_env()

-# Detect alternative API mode (Ollama, GLM, or Vertex AI)
+# Detect alternative API mode (Ollama, GLM, GitHub Copilot, or Vertex AI)
base_url = sdk_env.get("ANTHROPIC_BASE_URL", "")
is_vertex = sdk_env.get("CLAUDE_CODE_USE_VERTEX") == "1"
is_alternative_api = bool(base_url) or is_vertex
is_ollama = "localhost:11434" in base_url or "127.0.0.1:11434" in base_url
is_azure = "services.ai.azure.com" in base_url
is_github_copilot = "localhost:4141" in base_url or "127.0.0.1:4141" in base_url
model = convert_model_for_vertex(model)
if sdk_env:
print(f" - API overrides: {', '.join(sdk_env.keys())}")
if is_vertex:
project_id = sdk_env.get("ANTHROPIC_VERTEX_PROJECT_ID", "unknown")
region = sdk_env.get("CLOUD_ML_REGION", "unknown")
print(f" - Vertex AI Mode: Using GCP project '{project_id}' with model '{model}' in region '{region}'")
elif is_github_copilot:
print(" - GitHub Copilot Mode: Using copilot-api proxy (auth via GitHub OAuth)")
elif is_ollama:
print(" - Ollama Mode: Using local models")
elif is_azure:
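
The client.py change above keys provider detection purely off the base URL returned by `get_effective_sdk_env()`. As a standalone illustration (the `detect_provider` helper below is hypothetical, not code from this repo), the new branch distinguishes the Copilot proxy from Ollama and Azure by its port:

```python
# Illustration only: the same substring checks as create_client above, wrapped
# in a hypothetical helper so the precedence of the branches is easy to see.
def detect_provider(sdk_env: dict[str, str]) -> str:
    base_url = sdk_env.get("ANTHROPIC_BASE_URL", "")
    if sdk_env.get("CLAUDE_CODE_USE_VERTEX") == "1":
        return "vertex"
    if "localhost:4141" in base_url or "127.0.0.1:4141" in base_url:
        return "github-copilot"
    if "localhost:11434" in base_url or "127.0.0.1:11434" in base_url:
        return "ollama"
    if "services.ai.azure.com" in base_url:
        return "azure"
    return "custom" if base_url else "claude"

assert detect_provider({"ANTHROPIC_BASE_URL": "http://localhost:4141"}) == "github-copilot"
```
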
13 changes: 13 additions & 0 deletions registry.py
@@ -699,6 +699,19 @@ def get_all_settings() -> dict[str, str]:
],
"default_model": "qwen3-coder",
},
"github-copilot": {
"name": "GitHub Copilot (Enterprise)",
"base_url": "http://localhost:4141",
"requires_auth": False, # copilot-api handles auth via GitHub OAuth
"models": [
{"id": "claude-opus-4.6", "name": "Claude Opus 4.6"},
{"id": "claude-opus-4.6-fast", "name": "Claude Opus 4.6 Fast"},
{"id": "gpt-5.2-codex", "name": "GPT 5.2 Codex"},
{"id": "gpt-5.3-codex", "name": "GPT 5.3 Codex"},
],
"default_model": "claude-opus-4.6",
"setup_instructions": "npx copilot-api@latest start",
},
"custom": {
"name": "Custom Provider",
"base_url": "",
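
To connect this registry entry to the client.py detection above: a sketch, under the assumption that selecting a provider in the Settings UI maps its `base_url` into the `ANTHROPIC_BASE_URL` override returned by `get_effective_sdk_env()` (the actual mapping lives elsewhere in registry.py and is not shown in this diff):

```python
# Sketch of the assumed mapping from a provider entry to SDK env overrides.
# sdk_env_for is hypothetical; the real logic is registry.get_effective_sdk_env().
PROVIDERS = {
    "github-copilot": {
        "base_url": "http://localhost:4141",
        "requires_auth": False,  # copilot-api handles auth via GitHub OAuth
        "default_model": "claude-opus-4.6",
    },
}

def sdk_env_for(provider_id: str, auth_token: str = "") -> dict[str, str]:
    provider = PROVIDERS[provider_id]
    env: dict[str, str] = {}
    if provider["base_url"]:
        env["ANTHROPIC_BASE_URL"] = provider["base_url"]
    if provider["requires_auth"] and auth_token:
        env["ANTHROPIC_AUTH_TOKEN"] = auth_token  # assumed override for providers that need a token
    return env

print(sdk_env_for("github-copilot"))  # {'ANTHROPIC_BASE_URL': 'http://localhost:4141'}
```

Because `requires_auth` is `False` here, no token is needed for this provider, which is consistent with the CLAUDE.md note above that no API key is required in AutoCoder.
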
1 change: 1 addition & 0 deletions ui/src/components/SettingsModal.tsx
@@ -24,6 +24,7 @@ const PROVIDER_INFO_TEXT: Record<string, string> = {
kimi: 'Get an API key at kimi.com',
glm: 'Get an API key at open.bigmodel.cn',
ollama: 'Run models locally. Install from ollama.com',
'github-copilot': 'Requires copilot-api proxy. Run: npx copilot-api@latest start',
custom: 'Connect to any OpenAI-compatible API endpoint.',
}
