Request: Ollama as an LLM backend #94

@rdamron

Description

I was wondering whether adding Ollama as a backend server would be an option. I believe it's a polished wrapper around llama.cpp. https://ollama.com
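For reference, a backend integration would presumably talk to Ollama's local REST API (it listens on `http://localhost:11434` by default and exposes a `POST /api/generate` endpoint). Below is a minimal sketch of what such an adapter might look like; the function names are hypothetical, not part of any existing codebase:

```python
# Hypothetical sketch of an Ollama backend adapter.
# Assumes a local Ollama server at its default address (localhost:11434)
# and uses Ollama's native /api/generate endpoint.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a non-streaming generate request for a local Ollama server."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for a single JSON object instead of a stream
    }).encode("utf-8")
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )


def generate(model: str, prompt: str) -> str:
    """Send the request and return the generated text from the 'response' field."""
    with urllib.request.urlopen(build_request(model, prompt)) as resp:
        return json.loads(resp.read())["response"]
```

Since Ollama speaks plain HTTP with JSON bodies, wiring it in as a backend should mostly be a matter of mapping the existing request/response types onto this endpoint.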

Metadata

Assignees
No one assigned

Labels
No labels

Projects
No projects

Milestone
No milestone

Relationships
None yet

Development
No branches or pull requests