
fix: Use local Ollama as default in GitHub Issue Agent demo (manual)#252

Merged
pdettori merged 2 commits into kagenti:main from rh-dnagornuks:fix/demo-agent-ollama-local-env
Apr 8, 2026

Conversation

@rh-dnagornuks
Contributor

@rh-dnagornuks rh-dnagornuks commented Mar 19, 2026

Summary

  • Add/modify OLLAMA_API_BASE and LLM_API_BASE in AuthBridge/demos/github-issue/k8s/git-issue-agent-deployment.yaml, pointing both at http://host.docker.internal:11434
  • Add comments in AuthBridge/demos/github-issue/k8s/git-issue-agent-deployment.yaml listing the URLs for the cluster and local instance options
  • Add a note in AuthBridge/demos/github-issue/demo-manual.md documenting the default instance type and explaining how to switch to the cluster Ollama instance
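A rough sketch of what the resulting env section of git-issue-agent-deployment.yaml might look like; the in-cluster URL in the comment is an illustrative assumption, not a value taken from the diff:

```yaml
env:
  - name: TASK_MODEL_ID
    value: "ollama/ibm/granite4:latest"
  # Ollama API base URL. Required by litellm (used by crewai >=1.10).
  # Local (Docker Desktop / Kind): http://host.docker.internal:11434
  # In-cluster alternative (hypothetical service name):
  #   http://ollama.ollama.svc.cluster.local:11434
  - name: OLLAMA_API_BASE
    value: "http://host.docker.internal:11434"
  - name: LLM_API_BASE
    value: "http://host.docker.internal:11434"
```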

Context

Currently, the manual demo for the GitHub Issue Agent defaults to a cluster-based Ollama instance. This differs both from the environment variables used in the UI demo (https://github.com/kagenti/agent-examples/blob/main/a2a/git_issue_agent/.env.ollama) and from what is stated in the current version of the deployment YAML (https://github.com/kagenti/kagenti-extensions/blob/main/AuthBridge/demos/github-issue/k8s/git-issue-agent-deployment.yaml).

Users are also given no indication of whether, or how, they need to change the environment variables to reach the correct Ollama instance. This change sets the manual demo's default to the local Ollama instance and adds instructions for switching to a cluster-based Ollama instance if desired.

Tests

  • The GitHub Issue Agent successfully connects to the local Ollama instance with the modified YAML file
  • The GitHub Issue Agent successfully connects after overriding the environment variables with `kubectl set env`
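The second test above can be reproduced with `kubectl set env`; the deployment name and in-cluster Ollama URL below are illustrative assumptions, not values from the repo:

```shell
# Point the agent at an in-cluster Ollama instead of the local one.
# Deployment name and service URL are assumptions; adjust for your cluster.
kubectl set env deployment/git-issue-agent \
  OLLAMA_API_BASE=http://ollama.ollama.svc.cluster.local:11434 \
  LLM_API_BASE=http://ollama.ollama.svc.cluster.local:11434

# Confirm the override took effect.
kubectl set env deployment/git-issue-agent --list
```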

Update the GitHub Issue Agent YAML to use local Ollama by default, matching the behaviour of the 'agent-examples' repo.
Preserve cluster Ollama URL as a commented alternative.

Signed-off-by: Daniels Nagornuks <dnagornu@redhat.com>
Add a note to the 'Verify Ollama is running' step explaining how to configure environment variables to connect to an in-cluster Ollama instance.

Signed-off-by: Daniels Nagornuks <dnagornu@redhat.com>
Contributor

@rubambiza rubambiza left a comment


Clean, well-scoped fix. The OLLAMA_API_BASE addition is needed for litellm (crewai >=1.10), and aligning the default to local Ollama matches agent-examples. Comments showing both URL options are a nice touch. LGTM.

Contributor

@mrsabath mrsabath left a comment


Review Summary

Clean, well-scoped fix that aligns the manual demo defaults with the agent-examples repo. The OLLAMA_API_BASE addition is correct for litellm compatibility (crewai >=1.10). Comments in the YAML and the documentation note make the configuration options clear for users.

Areas reviewed: YAML, Docs, Security, Commit conventions
Commits: 2 commits, all signed-off ✓
CI status: all passing ✓

No issues found. LGTM.

    - name: TASK_MODEL_ID
      value: "ollama/ibm/granite4:latest"
    # Ollama API base URL. Required by litellm (used by crewai >=1.10).
    # For Docker Desktop / Kind: http://host.docker.internal:11434
Contributor


👍 Nice touch adding both URL options as comments — makes it easy for users to switch between local and in-cluster Ollama.

Contributor

@pdettori pdettori left a comment


LGTM

@pdettori pdettori merged commit 6eb98f5 into kagenti:main Apr 8, 2026
18 checks passed
@rh-dnagornuks rh-dnagornuks deleted the fix/demo-agent-ollama-local-env branch April 9, 2026 11:21