Codex-style tools for Pi.
> **Note**
> Use the npm package for normal installs. Avoid `pi install git:...` unless you know you want the development checkout; see Development checkout.
GPT/Codex models are strongest when the tool surface looks like the Codex CLI they were trained around: shell commands, resumable terminal sessions, and patch-based edits. This extension brings that workflow to Pi while keeping Pi's runtime, sessions, project context, skills, and UI.
The point is to give the model tools it already knows how to use well: shell-first inspection, resumable command sessions, and large one-shot patch edits instead of piecemeal read/edit/write steps.
```sh
pi install npm:@howaboua/pi-codex-conversion
```

The Git checkout is mostly for development and mirrors the maintainer workflow. If you run it directly, you may need to build the bundled `apply_patch` binary for your platform.
Run the current checkout without installing globally:
```sh
pi --no-extensions --no-skills -e /path/to/pi-codex-conversion
```

When the adapter is active, the LLM sees these tools:
- `exec_command` — shell execution with Codex-style `cmd` parameters and resumable sessions
- `write_stdin` — continue or poll a running exec session
- `apply_patch` — patch tool
- `web_search` — native OpenAI Codex Responses web search, enabled only on the `openai-codex` provider
- `image_generation` — native OpenAI Codex Responses image generation, enabled only on image-capable `openai-codex` models
- `view_image` — image-only wrapper around Pi's native image reading, enabled only for image-capable models
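As a rough illustration of the call shapes involved, the payloads below sketch how the first two tools might be invoked. Only `cmd`, `session_id`, and `chars` are parameter names taken from this document; the surrounding envelope (and whether `cmd` is a string or an array) is an assumption, not the tool's authoritative schema.

```python
# Hypothetical tool-call payloads, for illustration only.
# The exact envelope and the shape of `cmd` are assumptions.
exec_call = {
    "name": "exec_command",
    "arguments": {"cmd": "rg -n foo src"},
}

# Empty `chars` polls a running session; non-empty input interacts with it.
stdin_call = {
    "name": "write_stdin",
    "arguments": {"session_id": "sess-1", "chars": "y\n"},
}
```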
Notably:

- there is no dedicated `read`, `edit`, or `write` tool in adapter mode
- local text-file inspection should happen through `exec_command`
- file creation and edits should default to `apply_patch`
- Pi may still expose additional runtime tools such as `parallel`; the prompt is written to tolerate that instead of assuming a fixed four-tool universe
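For orientation, a minimal patch body in the style `apply_patch` consumes might look like the following, assuming the standard Codex patch grammar (the file path and lines are placeholders; treat the tool's own description as the authoritative reference):

```
*** Begin Patch
*** Update File: src/example.py
@@ def greet():
-    print("hi")
+    print("hello")
*** End Patch
```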
- Adapter mode activates automatically for OpenAI `gpt*` and `codex*` models, then restores the previous tool set when you switch away.
- Pi's composed prompt is preserved; the extension only adds a small Codex-style tool-use nudge.
- Shell activity is rendered with Codex-like labels such as `Ran`, `Explored`, `Read`, and background-terminal status.
- `apply_patch` renders as Codex-style `Added`/`Edited`/`Deleted` blocks, including inline partial-failure state.
- Native web search appears as a compact expandable summary after a turn, with queries and sources in the expanded view.
- Generated images are saved under `.pi/openai-codex-images/` at the workspace/repo root, with the latest image mirrored to `latest.png`.
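The save-and-mirror behaviour for generated images can be sketched as follows. This is a simplified illustration of the described layout, not the extension's actual code; the function name is hypothetical.

```python
import shutil
from pathlib import Path


def save_generated_image(repo_root: str, data: bytes, name: str) -> Path:
    """Illustrative sketch: write an image under .pi/openai-codex-images/
    at the repo root and mirror the newest image to latest.png."""
    out_dir = Path(repo_root) / ".pi" / "openai-codex-images"
    out_dir.mkdir(parents=True, exist_ok=True)
    target = out_dir / name
    target.write_bytes(data)
    # Keep latest.png pointing at the most recent image.
    shutil.copyfile(target, out_dir / "latest.png")
    return target
```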
- `rg -n foo src` -> `Explored / Search foo in src`
- `rg --files src | head -n 50` -> `Explored / List src`
- `cat README.md` -> `Explored / Read README.md`
- `npm test` -> `Ran npm test`
- `write_stdin({ session_id, chars: "" })` -> `Waited for background terminal`
- `write_stdin({ session_id, chars: "y\n" })` -> `Interacted with background terminal`
Raw command output is still available by expanding the tool result.
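The `Ran`/`Explored` split above can be sketched roughly as the classification below. This is a deliberately simplified illustration; the extension's real labelling rules are richer (distinguishing `Search`, `List`, and `Read` sublabels, for example).

```python
import shlex


def shell_label(cmd: str) -> str:
    """Rough sketch of Codex-style labels for shell commands (illustrative)."""
    argv = shlex.split(cmd)
    head = argv[0] if argv else ""
    if head in {"rg", "grep", "find", "ls"}:
        return "Explored"   # search / listing commands
    if head in {"cat", "head", "tail"}:
        return "Explored"   # file reads render as Explored / Read ...
    return f"Ran {cmd}"     # everything else renders as Ran <cmd>
```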
- `exec_command` and `write_stdin` use a PTY-backed session manager for interactive commands and long-running processes.
- `apply_patch` accepts absolute paths as-is and resolves relative paths against the current working directory.
- Shell `apply_patch` is also available inside `exec_command`, but the dedicated `apply_patch` tool is preferred unless you are chaining edits with other shell steps.
- Native `web_search` and `image_generation` are forwarded to OpenAI Codex Responses tools rather than executed as local function tools.
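The path handling described for `apply_patch` amounts to the small sketch below; `resolve_patch_path` is a hypothetical name for illustration, not an exported function.

```python
from pathlib import Path


def resolve_patch_path(path: str, cwd: str) -> Path:
    """Illustrative sketch: absolute paths pass through unchanged,
    relative paths resolve against the current working directory."""
    p = Path(path)
    return p if p.is_absolute() else Path(cwd) / p
```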
MIT
