Have you searched existing issues? 🔎
Describe the bug
Even when explicitly passing `"stop": None` in `generator_kwargs` to `bertopic.representation.OpenAI`, `"stop": "\n"` is still sent in the OpenAI model call. This causes thinking models served through Ollama (in particular, gemma4:e2b) to stop generating early.
Reproduction
from bertopic import BERTopic
from bertopic.representation import OpenAI
import openai

# Point the OpenAI-compatible client at a local Ollama server
client = openai.OpenAI(
    base_url="http://localhost:11435/v1",
    api_key="ollama",
)

# Explicitly pass stop=None, expecting no stop token to be sent
representation_model = OpenAI(client, model="gemma4:e2b", generator_kwargs={"stop": None})
topic_model = BERTopic(representation_model=representation_model)
# Fitting the model still sends "stop": "\n" in the completion request
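For context, a minimal sketch of the kind of kwargs merging that could produce this behavior (this is an assumption about the cause, not BERTopic's actual code): if the library merges user kwargs over its defaults but treats `None` values as "unset" and filters them out, the default `"stop": "\n"` survives even when the user explicitly passes `"stop": None`.

```python
# Hypothetical default kwargs; "\n" as the default stop token matches the
# behavior reported above, the rest is illustrative.
DEFAULTS = {"stop": "\n"}

def merge_kwargs(user_kwargs):
    """Merge user kwargs over defaults, dropping None values (hypothetical)."""
    merged = dict(DEFAULTS)
    # Filtering out None values silently re-introduces the default stop token:
    merged.update({k: v for k, v in user_kwargs.items() if v is not None})
    return merged

# The user asked for no stop token, but the default "\n" is still sent:
print(merge_kwargs({"stop": None}))  # {'stop': '\n'}
```

If this is the mechanism, a fix would be to distinguish "key absent" from "key explicitly set to None" (e.g., `merged.update(user_kwargs)` without the filter, or a sentinel default), so that `"stop": None` actually suppresses the stop token.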
BERTopic Version
0.17.4