Symptom
/codex:adversarial-review (and presumably /codex:review) fails immediately with:
[codex] Codex error: {"type":"error","status":400,"error":{"type":"invalid_request_error",
"message":"The 'gpt-5.5' model requires a newer version of Codex. Please upgrade to the latest app or CLI and try again."}}
[codex] Turn failed.
Result file shows: Codex did not return valid structured JSON. — Parse error: <same 400 above>.
Environment
- Codex CLI: 0.120.0 → upgraded to 0.130.0 (latest as of 2026-05-09). Same failure on both.
- Plugin: codex@openai-codex v1.0.4 (latest, reinstalled cleanly to confirm).
- Auth: ChatGPT login active (verified via node scripts/codex-companion.mjs setup --json → ready: true, auth.loggedIn: true).
- Config: ~/.codex/config.toml has model = "gpt-5.5" (default after CLI upgrade).
- Direct codex review --base <ref> works correctly with the same model and same config — the issue is plugin-specific.
Reproduction
- Codex CLI 0.130.0, ChatGPT login, model = "gpt-5.5" in config.toml
- Run /codex:adversarial-review --base <ref> --background (or --wait)
- Plugin task-thread returns HTTP 400 from Codex API
Root Cause Hypothesis
scripts/codex-companion.mjs adversarial-review (and review) only accept `[--wait|--background|--base|--scope] [focus]` — there is no `--model` flag (only the `task` subcommand accepts `--model`). The companion script therefore falls back to the config.toml default model, which on a freshly upgraded CLI is `gpt-5.5`. The plugin's task-thread protocol layer apparently doesn't advertise a sufficient client version to the API for gpt-5.5 dispatch — direct `codex review --base <ref>` (CLI-native, not task-thread) succeeds with the same model and config.
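The suspected fallback path can be sketched as follows. This is a hypothetical reconstruction, not the plugin's actual code: `resolveModel`, the regex-based config read, and the built-in default are all assumptions made for illustration.

```javascript
// Hypothetical sketch of how the companion script likely resolves the model:
// with no per-invocation flag, it falls through to the config.toml default.
function resolveModel(cliFlagValue, configTomlText) {
  // 1. An explicit --model flag would win — but review/adversarial-review
  //    never set one, so cliFlagValue is always undefined on those paths.
  if (cliFlagValue) return cliFlagValue;
  // 2. Fall back to the `model = "..."` line in ~/.codex/config.toml.
  const m = configTomlText.match(/^\s*model\s*=\s*"([^"]+)"/m);
  // 3. A hypothetical built-in default if config.toml has no model line.
  return m ? m[1] : "gpt-5.4";
}

const toml = 'model = "gpt-5.5"\n'; // what a fresh 0.130.0 upgrade writes
console.log(resolveModel(undefined, toml)); // → gpt-5.5
```

Under this reading, the bug is not the fallback itself but that the review subcommands give the user no way to override it.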
Workaround
Edit `~/.codex/config.toml`: `model = "gpt-5.4"` (older but supported by plugin's task-thread protocol). Revert after invocation.
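If you hit this often, the edit-and-revert dance can be scripted. A minimal sketch of the text transformation (the `gpt-5.4` pin is from the workaround above; the `pinModel` helper is hypothetical):

```javascript
// Pin a known-good model in config.toml text, returning the patched text
// plus the original so it can be written back after the plugin invocation.
function pinModel(tomlText, model = "gpt-5.4") {
  const patched = tomlText.replace(
    /^(\s*model\s*=\s*)"[^"]*"/m,
    `$1"${model}"`
  );
  return { patched, original: tomlText };
}

const before = 'model = "gpt-5.5"\n';
const { patched, original } = pinModel(before);
console.log(patched); // model = "gpt-5.4"
// ...run /codex:adversarial-review, then restore `original` to config.toml.
```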
Suggested Fix
Either:
- Expose a `--model <model>` flag on the `review` and `adversarial-review` subcommands (mirroring what `task` already does), so users can pin a known-good model per invocation, or
- Update plugin's task-thread protocol layer to advertise the current Codex version to the API, matching what direct `codex review` does.
Plugin 1.0.4 was published 2026-04-18; gpt-5.5 launched 2026-04-23 — plugin is 5 days behind on protocol compatibility.
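For the first option, the flag could be threaded through with Node's built-in `util.parseArgs` — a sketch, assuming the subcommands parse argv roughly like this (the option names other than `--model` come from the usage string above; the surrounding function is hypothetical):

```javascript
// Sketch: accept --model on review/adversarial-review, mirroring `task`.
import { parseArgs } from "node:util";

function parseReviewArgs(argv) {
  const { values, positionals } = parseArgs({
    args: argv,
    options: {
      wait: { type: "boolean" },
      background: { type: "boolean" },
      base: { type: "string" },
      scope: { type: "string" },
      model: { type: "string" }, // new: per-invocation override
    },
    allowPositionals: true, // the optional [focus] argument
  });
  return { ...values, focus: positionals[0] };
}

console.log(parseReviewArgs(["--base", "main", "--model", "gpt-5.4"]));
// e.g. { base: 'main', model: 'gpt-5.4', focus: undefined }
```

The parsed `model` value would then take priority over the config.toml default, the same precedence `task` presumably applies today.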
Impact
Plugin currently unusable on default-config CLI 0.130.0. New users hitting this immediately after install will get the cryptic "upgrade your CLI" message even though CLI is already at latest.