
feat: add llm-cli-gateway as optional multi-LLM orchestration add-in #866

Open
mr-k-man wants to merge 1 commit into garrytan:main from mr-k-man:feat/llm-gateway-add-in-clean

Conversation

mr-k-man (Contributor) commented Apr 6, 2026

Summary

  • Adds llm-cli-gateway as an optional gstack add-in via the contrib/add-tool/ pattern
  • When installed, 6 skills gain a "Multi-LLM Orchestration" section with contextual MCP tool recommendations
  • Adds Gemini as a third review voice, async parallel orchestration, and session continuity — complementary to existing CODEX_SECOND_OPINION / ADVERSARIAL_STEP patterns

What's included

| Component | Files |
| --- | --- |
| Contrib structure | `contrib/add-tool/README.md`, `contrib/add-tool/llm-gateway/*` (README, tools.json, detection.sh, install.sh, uninstall.sh) |
| Resolver | `scripts/resolvers/llm-gateway.ts` — reads tools.json, emits conditional markdown per skill with host-aware CLI filtering |
| Preamble detection | 12 lines of bash in `preamble.ts` — detects the llm-cli-gateway binary plus per-CLI availability (claude, codex, gemini) |
| Template integration | `{{LLM_GATEWAY_CONTEXT}}` added to review, investigate, plan-eng-review, plan-ceo-review, ship, retro |
| Tests | 109 new tests (resolver, schema validation, host suppression, integration) |
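The resolver's first job is reading and validating tools.json. A minimal sketch of what that contract might look like — field names and the `parseToolsJson` helper are illustrative assumptions, not the actual schema in `contrib/add-tool/llm-gateway/tools.json`:

```typescript
// Hypothetical shape of a tools.json entry read by the resolver.
// All names here are assumptions for illustration only.
interface ToolEntry {
  name: string;          // MCP tool name exposed by llm-cli-gw
  requires_cli: string;  // underlying CLI: "claude" | "codex" | "gemini"
  async?: boolean;       // whether an _async variant exists
  description: string;
}

// Parse and validate the raw file contents; throw on a malformed contract
// so schema problems surface at generation time rather than in skill docs.
function parseToolsJson(raw: string): ToolEntry[] {
  const data = JSON.parse(raw);
  if (!Array.isArray(data)) throw new Error("tools.json must be an array");
  for (const t of data) {
    for (const field of ["name", "requires_cli", "description"]) {
      if (typeof t[field] !== "string") {
        throw new Error(`tool entry missing string field "${field}"`);
      }
    }
  }
  return data as ToolEntry[];
}
```

Validating up front is what lets the schema-validation tests assert on error messages instead of on downstream markdown output.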

Design decisions

  • MCP server name: llm-cli-gw (avoids collision with simonw's llm tool)
  • Complementary: Does NOT replace existing Codex exec patterns — adds Gemini + async + sessions alongside them
  • Host filtering: A generalized `ctx.host === t.requires_cli` check suppresses any tool that would re-invoke the current host CLI, preventing self-invocation on every host
  • Async only for parallel work: review/ship use _async variants; investigate/plan/retro use sync
  • Graceful degradation: Resolver returns empty string if gateway not installed or tools.json missing
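The host-filtering and graceful-degradation decisions above can be sketched together. This is an illustrative reconstruction, not the actual code in `scripts/resolvers/llm-gateway.ts` — the `GatewayTool`, `ResolverCtx`, and `resolveGatewayContext` names are assumptions:

```typescript
// Hypothetical sketch of host suppression + graceful degradation.
interface GatewayTool {
  name: string;
  requires_cli: string;  // CLI this MCP tool shells out to
  description: string;
}

interface ResolverCtx {
  host: string;              // host CLI the skill docs are generated for
  gatewayInstalled: boolean; // result of preamble detection
}

function resolveGatewayContext(ctx: ResolverCtx, tools: GatewayTool[]): string {
  // Graceful degradation: no gateway (or no tools.json), no section.
  if (!ctx.gatewayInstalled || tools.length === 0) return "";

  // Host filtering: never recommend a tool that would re-invoke the host itself.
  const usable = tools.filter((t) => ctx.host !== t.requires_cli);
  if (usable.length === 0) return "";

  const lines = usable.map((t) => `- \`${t.name}\`: ${t.description}`);
  return ["## Multi-LLM Orchestration", ...lines].join("\n");
}
```

Returning an empty string keeps `{{LLM_GATEWAY_CONTEXT}}` harmless on hosts without the gateway: the template placeholder simply resolves to nothing.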

Test plan

  • 109 new tests pass (resolver unit, tools.json schema, host suppression, generated SKILL.md content, preamble detection)
  • 335 existing gen-skill-docs tests pass
  • 71 existing host-config tests pass (golden files updated)
  • bun run gen:skill-docs --host all succeeds for all 8 hosts
  • Manual: install llm-cli-gateway, run /review on a real diff, verify Multi-LLM section appears and tools work

Add llm-cli-gateway integration via the contrib/add-tool/ pattern. When
installed, 6 gstack skills gain a "Multi-LLM Orchestration" section with
contextual MCP tool recommendations for Gemini + Codex parallel reviews,
async job orchestration, and session continuity.

Components:
- contrib/add-tool/llm-gateway/: routing contract, detection, install/uninstall
- scripts/resolvers/llm-gateway.ts: resolver with host-aware CLI filtering
- Preamble detection for llm-cli-gateway binary + per-CLI availability
- {{LLM_GATEWAY_CONTEXT}} in review, investigate, plan-eng-review,
  plan-ceo-review, ship, retro templates
- 109 new tests (resolver, schema, integration, preamble)

Complementary to existing CODEX_SECOND_OPINION — does not replace it.
MCP server name: llm-cli-gw (avoids collision with simonw's llm tool).
