2 changes: 2 additions & 0 deletions README.md
Expand Up @@ -170,6 +170,8 @@ Free-Way exposes **both** OpenAI and Anthropic compatible endpoints, so most cod
- [OpenRouter Provider Setup](./docs/providers/openrouter.md)
- [OpenCode setup guide](./docs/agents/opencode.md)
- [Aider Integration Guide](./docs/agents/aider.md)
- [Continue.dev setup guide](./docs/agents/continue.md)
- [Cline / Roo Code setup guide](./docs/agents/cline-roo-code.md)

### Claude Code

Expand Down
34 changes: 34 additions & 0 deletions docs/agents/cline-roo-code.md
@@ -0,0 +1,34 @@
# Cline / Roo Code Setup Guide

Follow these steps to connect Cline or Roo Code to Free-Way through the OpenAI-compatible endpoint.

## 1. Choose an OpenAI-compatible provider mode

In Cline or Roo Code, choose the OpenAI-compatible provider option and set:

```text
Base URL: http://localhost:8787/v1
API key: your FREEWAY_API_KEY
Model: llama-3.3-70b
```

Use a model ID that appears in your local Free-Way model catalog. If Free-Way gateway auth is disabled, the API key can be any non-empty placeholder value.
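To see which model IDs your gateway currently exposes, you can query the models endpoint directly. This is a sketch that assumes Free-Way is running on its default port and that `FREEWAY_API_KEY` is exported in your shell:

```shell
# List the model IDs served by the local Free-Way gateway.
# Assumes the default port 8787; adjust if you changed it.
curl -s http://localhost:8787/v1/models \
  -H "Authorization: Bearer $FREEWAY_API_KEY"
```

Any `id` in the returned list is a valid value for the Model field above.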

## 2. Keep the path shape exact

Use `http://localhost:8787/v1` for OpenAI-compatible clients.

Do not use `http://localhost:8787` (without `/v1`) in OpenAI-compatible provider mode: most clients append only `/chat/completions` or `/models` to the base URL, so dropping `/v1` sends requests to paths the gateway does not serve.

## 3. Verify the connection

Send a small prompt from Cline or Roo Code, then check Free-Way's **Usage** tab to confirm the request reached your local gateway.
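If the request does not appear in the Usage tab, you can rule out editor-side misconfiguration by sending a minimal chat request straight to the gateway. A sketch assuming the default port, an exported `FREEWAY_API_KEY`, and a model ID from your catalog:

```shell
# Minimal OpenAI-compatible chat request against the local gateway.
# The model ID is an example; use one from your Free-Way catalog.
curl -s http://localhost:8787/v1/chat/completions \
  -H "Authorization: Bearer $FREEWAY_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "llama-3.3-70b", "messages": [{"role": "user", "content": "ping"}]}'
```

If this request shows up in Usage but requests from the editor do not, the problem is in the Cline or Roo Code provider settings rather than in Free-Way.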

## Troubleshooting

| Problem | Check |
| --- | --- |
| 404 or route not found | The base URL includes `/v1` |
| Model not found | The model ID exists in Free-Way's model catalog |
| Unauthorized | `FREEWAY_API_KEY` matches the gateway key configured in Free-Way |
| No request in Usage | The selected provider mode is OpenAI-compatible and points to localhost |
46 changes: 46 additions & 0 deletions docs/agents/continue.md
@@ -0,0 +1,46 @@
# Continue.dev Setup Guide

Follow these steps to connect Continue.dev to Free-Way through the OpenAI-compatible endpoint.

## 1. Confirm the Free-Way endpoint

OpenAI-compatible clients should use:

```text
http://localhost:8787/v1
```

Anthropic-compatible clients should use `http://localhost:8787` without `/v1`.
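A quick way to confirm the endpoint shape before editing Continue's config is to request the model list. This is a sketch that assumes the default port and an exported `FREEWAY_API_KEY`:

```shell
# Confirm the OpenAI-compatible endpoint responds under /v1.
curl -s http://localhost:8787/v1/models \
  -H "Authorization: Bearer $FREEWAY_API_KEY"
```

A JSON model list confirms the base URL to use in the next step; a 404 usually means the `/v1` prefix is missing or the gateway is not running.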

## 2. Add a model to Continue

Add this entry to your Continue `config.json`:

```json
{
"models": [
{
"title": "Free-Way",
"provider": "openai",
"model": "llama-3.3-70b",
"apiBase": "http://localhost:8787/v1",
"apiKey": "your FREEWAY_API_KEY"
}
]
}
```

Use a model ID that appears in your local Free-Way model catalog. If gateway auth is disabled, the API key can be any non-empty placeholder value.

## 3. Verify the connection

Run a simple chat request from Continue, then check Free-Way's **Usage** tab to confirm the request reached your local gateway.

## Troubleshooting

| Problem | Check |
| --- | --- |
| Connection refused | Free-Way is running on `localhost:8787` |
| Model not found | The configured model exists in the Free-Way model catalog |
| Unauthorized | `FREEWAY_API_KEY` matches the gateway key configured in Free-Way |
| Requests do not appear in Usage | Continue is using this model entry and `apiBase` includes `/v1` |