
TenetLLMLocal

TenetLLMLocal provides local inference server adapters for the Tenet platform.

Badges/Status

CI · Python 3.11+ · License: AGPL-3.0-only

Platform Context + EGRF Traceability

Overview

This package includes adapters for LM Studio, Ollama, and generic OpenAI-compatible local servers.

Installation

pip install tenet-llm-local

Quick Start / Usage

Configure TenetCore backend provider entries to use the adapter names lmstudio, ollama, or local.
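
As an illustrative sketch only (the actual TenetCore provider schema is defined by TenetCore and not shown here; field and model names below are assumptions), a provider entry might select an adapter by name:

    # Hypothetical provider entries; keys and values are assumptions.
    providers = {
        "local-llama": {"adapter": "ollama", "model": "llama3"},
        "studio": {"adapter": "lmstudio", "model": "qwen2.5-7b-instruct"},
        "compat": {"adapter": "local", "base_url": "http://127.0.0.1:8080/v1"},
    }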

Configuration

Endpoints are configured through the host runtime configuration and environment variables. SSRF safeguards restrict request targets to loopback and private-network destinations.
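
The package's actual safeguard implementation is not shown here, but a minimal sketch of this kind of loopback/private-network check, using only the Python standard library, looks like this:

    import ipaddress
    import socket
    from urllib.parse import urlparse

    def is_allowed_endpoint(url: str) -> bool:
        """Illustrative SSRF guard: allow only loopback/private destinations."""
        host = urlparse(url).hostname
        if host is None:
            return False
        try:
            # Resolve first so hostname-based bypasses are also caught.
            addr = ipaddress.ip_address(socket.gethostbyname(host))
        except (socket.gaierror, ValueError):
            return False
        return addr.is_loopback or addr.is_private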

API/CLI Surface

  • Python entry-point group: tenet.llm_adapters
  • Standalone CLI: not applicable
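
Adapters registered under the tenet.llm_adapters group can be enumerated with the standard importlib.metadata machinery, for example:

    from importlib.metadata import entry_points

    # List every adapter registered under the entry-point group.
    for ep in entry_points(group="tenet.llm_adapters"):
        print(ep.name, "->", ep.value)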

Architecture

  • Product code: src/tenet_llm_local/
  • Adapter implementations: _lmstudio.py, _ollama.py, _generic.py
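
The adapter interface itself lives in src/tenet_llm_local/. As a hypothetical sketch of the shape such adapters share (the class name and method signature below are assumptions, not the package's actual API):

    from typing import Protocol

    class LocalAdapter(Protocol):
        """Assumed adapter shape, for illustration only."""

        name: str

        def complete(self, prompt: str, **options: object) -> str:
            """Send a completion request to the local server and return the text."""
            ...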

Testing

pytest tests/

Development & Contributing

Changelog

See CHANGELOG.md.

License

AGPL-3.0-only. See LICENSE.