
TenetLLMLocal

Local inference server adapters for the Tenet platform.

This package provides:

  • LMStudioAdapter — targets LM Studio's native v1 model-lifecycle APIs plus its OpenAI-compatible generation endpoints
  • OllamaAdapter — targets Ollama's native APIs
  • GenericLocalAdapter — targets any local OpenAI-compatible server

All adapters enforce an SSRF guard that only permits loopback and private-network targets.
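The guard described above could be sketched roughly as follows. This is a hypothetical illustration, not the package's actual code: the function name `is_allowed_target` and its shape are assumptions; the real implementation may differ.

```python
import ipaddress
import socket
from urllib.parse import urlparse

def is_allowed_target(url: str) -> bool:
    """Hypothetical SSRF guard: allow a URL only if every address its
    host resolves to is loopback or private-network (RFC 1918 / ULA)."""
    host = urlparse(url).hostname
    if host is None:
        return False
    try:
        # Resolve the hostname to all of its IPv4/IPv6 addresses.
        infos = socket.getaddrinfo(host, None)
    except socket.gaierror:
        return False
    for _, _, _, _, sockaddr in infos:
        addr = ipaddress.ip_address(sockaddr[0])
        # Reject if any resolved address is publicly routable.
        if not (addr.is_loopback or addr.is_private):
            return False
    return True
```

Resolving before checking (rather than string-matching the hostname) is what blocks DNS-rebinding-style bypasses, where a public name resolves to a public address.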

Platform Context

Installation

pip install tenet-llm-local

Entry Points

[project.entry-points."tenet.llm_adapters"]
lmstudio = "tenet_llm_local._lmstudio:LMStudioAdapter"
ollama = "tenet_llm_local._ollama:OllamaAdapter"
local = "tenet_llm_local._generic:GenericLocalAdapter"
