Integrate multiple LLM API providers into VS Code's GitHub Copilot Chat using the Language Model API.
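The core idea behind integrating multiple providers is a registry that maps a vendor name to an endpoint and model, then builds each chat request against the selected provider. The sketch below shows that routing pattern in plain TypeScript; it is a minimal illustration, not the actual VS Code Language Model API, and all names here (`ProviderConfig`, `ProviderRegistry`, the example `baseUrl` values) are hypothetical. It assumes providers expose an OpenAI-compatible `/chat/completions` endpoint.

```typescript
// Hypothetical sketch of multi-provider routing. None of these types or
// names come from the VS Code API; they only illustrate the pattern of
// registering several LLM backends and selecting one per request.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

interface ProviderConfig {
  vendor: string;   // e.g. "openai", "ollama" (illustrative values)
  baseUrl: string;  // assumed OpenAI-compatible /v1 endpoint
  model: string;
}

class ProviderRegistry {
  private providers = new Map<string, ProviderConfig>();

  // Register a provider under its vendor name.
  register(cfg: ProviderConfig): void {
    this.providers.set(cfg.vendor, cfg);
  }

  // Look up a provider, failing loudly if the vendor is unknown.
  select(vendor: string): ProviderConfig {
    const cfg = this.providers.get(vendor);
    if (!cfg) throw new Error(`No provider registered for vendor "${vendor}"`);
    return cfg;
  }

  // Build the URL and body a caller would POST for a chat request.
  buildRequest(
    vendor: string,
    messages: ChatMessage[]
  ): { url: string; body: { model: string; messages: ChatMessage[] } } {
    const cfg = this.select(vendor);
    return {
      url: `${cfg.baseUrl}/chat/completions`,
      body: { model: cfg.model, messages },
    };
  }
}

// Usage: register two backends and route a request to one of them.
const registry = new ProviderRegistry();
registry.register({
  vendor: "ollama",
  baseUrl: "http://localhost:11434/v1", // assumed local endpoint
  model: "llama3",
});
registry.register({
  vendor: "openai",
  baseUrl: "https://api.openai.com/v1",
  model: "gpt-4o",
});

const req = registry.buildRequest("ollama", [
  { role: "user", content: "hello" },
]);
console.log(req.url); // http://localhost:11434/v1/chat/completions
```

In an actual extension, the registry lookup would happen inside whatever provider hook the Language Model API exposes, but the vendor-to-endpoint mapping stays the same shape.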
Updated Jan 19, 2026 (TypeScript)
Related repositories under this topic:
- Local LLM proxy, DevOps friendly
- Simplified API for OpenAI requests, providing basic access and response handling... Created at https://coslynx.com
- AI-powered backend for user queries, providing quick and accurate responses... Created at https://coslynx.com