
Thinking models in ollama exit early #2481

@sam-baumann

Description


Have you searched existing issues? 🔎

  • I have searched and found no existing issues

Describe the bug

Even when passing "stop": None to bertopic.representation.OpenAI via generator_kwargs, "stop": "\n" still gets passed into the OpenAI model call. This causes thinking models in Ollama (in particular, gemma4:e2b) to exit early.

Reproduction

from bertopic import BERTopic
from bertopic.representation import OpenAI
import openai

# Point the OpenAI-compatible client at a local Ollama server
client = openai.OpenAI(
    base_url="http://localhost:11435/v1",
    api_key="ollama"
)

# "stop": None should disable the stop sequence, but "\n" is still sent
representation_model = OpenAI(client, model="gemma4:e2b", generator_kwargs={"stop": None})
topic_model = BERTopic(representation_model=representation_model)
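A minimal sketch of the kind of kwargs-merge pitfall that could produce this behavior (hypothetical, not BERTopic's actual code): if None-valued entries are filtered out before user kwargs are merged over library defaults, a user-supplied {"stop": None} can never override a default "stop": "\n".

```python
# Library-side defaults for the model call (illustrative values only)
defaults = {"stop": "\n", "temperature": 0}

def merge_dropping_none(user_kwargs):
    # Stripping None values means {"stop": None} never reaches the API call,
    # so the default "\n" wins.
    cleaned = {k: v for k, v in user_kwargs.items() if v is not None}
    return {**defaults, **cleaned}

def merge_preserving_none(user_kwargs):
    # Passing user values through unchanged lets None override the default.
    return {**defaults, **user_kwargs}

print(merge_dropping_none({"stop": None}))    # {'stop': '\n', 'temperature': 0}
print(merge_preserving_none({"stop": None}))  # {'stop': None, 'temperature': 0}
```

With the second merge strategy, "stop": None would propagate to the API call and the stop sequence would be disabled as expected.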

BERTopic Version

0.17.4

Labels

bug
