diff --git a/README.md b/README.md
index 51574b4..2153e30 100644
--- a/README.md
+++ b/README.md
@@ -170,6 +170,8 @@ Free-Way exposes **both** OpenAI and Anthropic compatible endpoints, so most cod
 - [OpenRouter Provider Setup](./docs/providers/openrouter.md)
 - [OpenCode setup guide](./docs/agents/opencode.md)
 - [Aider Integration Guide](./docs/agents/aider.md)
+- [Continue.dev setup guide](./docs/agents/continue.md)
+- [Cline / Roo Code setup guide](./docs/agents/cline-roo-code.md)
 
 ### Claude Code
 
diff --git a/docs/agents/cline-roo-code.md b/docs/agents/cline-roo-code.md
new file mode 100644
index 0000000..2defb82
--- /dev/null
+++ b/docs/agents/cline-roo-code.md
@@ -0,0 +1,34 @@
+# Cline / Roo Code Setup Guide
+
+Follow these steps to connect Cline or Roo Code to Free-Way through the OpenAI-compatible endpoint.
+
+## 1. Choose an OpenAI-compatible provider mode
+
+In Cline or Roo Code, choose the OpenAI-compatible provider option and set:
+
+```text
+Base URL: http://localhost:8787/v1
+API key: your FREEWAY_API_KEY
+Model: llama-3.3-70b
+```
+
+Use a model ID that appears in your local Free-Way model catalog. If Free-Way gateway auth is disabled, the API key can be any non-empty placeholder value.
+
+## 2. Keep the path shape exact
+
+Use `http://localhost:8787/v1` for OpenAI-compatible clients.
+
+Do not use `http://localhost:8787` in OpenAI-compatible provider mode, because the client will usually append `/chat/completions` or `/models` under `/v1`.
+
+## 3. Verify the connection
+
+Send a small prompt from Cline or Roo Code, then check Free-Way's **Usage** tab to confirm the request reached your local gateway.
+
+## Troubleshooting
+
+| Problem | Check |
+| --- | --- |
+| 404 or route not found | The base URL includes `/v1` |
+| Model not found | The model ID exists in Free-Way's model catalog |
+| Unauthorized | `FREEWAY_API_KEY` matches the gateway key configured in Free-Way |
+| No request in Usage | The selected provider mode is OpenAI-compatible and points to localhost |
diff --git a/docs/agents/continue.md b/docs/agents/continue.md
new file mode 100644
index 0000000..d2dadb3
--- /dev/null
+++ b/docs/agents/continue.md
@@ -0,0 +1,46 @@
+# Continue.dev Setup Guide
+
+Follow these steps to connect Continue.dev to Free-Way through the OpenAI-compatible endpoint.
+
+## 1. Confirm the Free-Way endpoint
+
+OpenAI-compatible clients should use:
+
+```text
+http://localhost:8787/v1
+```
+
+Anthropic-compatible clients should use `http://localhost:8787` without `/v1`.
+
+## 2. Add a model to Continue
+
+Add this entry to your Continue `config.json`:
+
+```json
+{
+  "models": [
+    {
+      "title": "Free-Way",
+      "provider": "openai",
+      "model": "llama-3.3-70b",
+      "apiBase": "http://localhost:8787/v1",
+      "apiKey": "your FREEWAY_API_KEY"
+    }
+  ]
+}
+```
+
+Use a model ID that appears in your local Free-Way model catalog. If gateway auth is disabled, the API key can be any non-empty placeholder value.
+
+## 3. Verify the connection
+
+Run a simple chat request from Continue, then check Free-Way's **Usage** tab to confirm the request reached your local gateway.
+
+## Troubleshooting
+
+| Problem | Check |
+| --- | --- |
+| Connection refused | Free-Way is running on `localhost:8787` |
+| Model not found | The configured model exists in the Free-Way model catalog |
+| Unauthorized | `FREEWAY_API_KEY` matches the gateway key configured in Free-Way |
+| Requests do not appear in Usage | Continue is using this model entry and `apiBase` includes `/v1` |
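
Both added guides end with a "Verify the connection" step that goes through an agent. Before involving an agent at all, the gateway can be probed directly from a shell; a minimal sketch, assuming Free-Way is listening on `localhost:8787` and `FREEWAY_API_KEY` is exported (`/v1/models` is the standard OpenAI-compatible catalog route that these clients append under `/v1`):

```shell
# Probe the OpenAI-compatible model catalog exposed by Free-Way.
# Assumes the gateway runs on localhost:8787; if gateway auth is
# disabled, any non-empty placeholder works as the key.
BASE_URL="http://localhost:8787/v1"

curl -sf "$BASE_URL/models" \
  -H "Authorization: Bearer ${FREEWAY_API_KEY:-placeholder}" \
  || echo "Free-Way not reachable at $BASE_URL"
```

A successful response lists the model IDs that are valid in the Cline / Roo Code `Model` field or the Continue `model` key; a connection failure here means the troubleshooting tables above apply before any agent configuration does.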