
feat: make LLM call timeout configurable via MEMORIX_LLM_TIMEOUT_MS #59

Draft · Copilot wants to merge 2 commits into main from copilot/handle-pr
Conversation


Copilot AI commented Apr 6, 2026

The hardcoded 12s LLM timeout is too short for providers behind proxy chains (e.g., CLIProxyAPI → Chutes/Fireworks), causing every call to fall back to the heuristic path and effectively disabling LLM-enhanced mode.

Changes

  • src/llm/provider.ts: Replace LLM_CALL_TIMEOUT_MS = 12_000 with a value read from MEMORIX_LLM_TIMEOUT_MS env var, defaulting to 30s. Invalid/non-positive values fall back to the default.
```sh
export MEMORIX_LLM_TIMEOUT_MS=60000  # 60s for slow proxy chains
```
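A minimal sketch of the parsing behavior described above. The helper name `resolveLlmTimeoutMs` and the constant names are assumptions for illustration; only `MEMORIX_LLM_TIMEOUT_MS` and the 30s default come from the PR description.

```typescript
// Sketch only: helper and constant names are assumptions, not the actual src/llm/provider.ts code.
const DEFAULT_LLM_TIMEOUT_MS = 30_000; // 30s default per the PR description

function resolveLlmTimeoutMs(raw: string | undefined): number {
  if (raw === undefined) return DEFAULT_LLM_TIMEOUT_MS;
  const parsed = Number(raw);
  // Invalid or non-positive values fall back to the default,
  // so a configuration mistake cannot break the LLM path.
  if (!Number.isFinite(parsed) || parsed <= 0) return DEFAULT_LLM_TIMEOUT_MS;
  return parsed;
}

const LLM_CALL_TIMEOUT_MS = resolveLlmTimeoutMs(process.env.MEMORIX_LLM_TIMEOUT_MS);
```

With this shape, `MEMORIX_LLM_TIMEOUT_MS=60000` yields a 60s timeout, while an unset, non-numeric, or negative value silently keeps the 30s default.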

Copilot AI changed the title from "[WIP] Handle pull request processing" to "feat: make LLM call timeout configurable via MEMORIX_LLM_TIMEOUT_MS" on Apr 6, 2026
Copilot AI requested a review from AVIDS2 April 6, 2026 04:09
Owner

AVIDS2 left a comment


This looks like the stronger of the two timeout PRs. The env-var approach is narrow, safe, and importantly it validates invalid/non-positive values instead of letting configuration mistakes break the LLM path. This does not solve the full timeout family from #57 yet, but as a focused first step this change is acceptable.

Owner

AVIDS2 commented Apr 9, 2026

We carried forward #58, which now covers this timeout direction more completely (validation, tests, and issue linkage). So from the maintainer side, I’m treating this PR as superseded rather than the primary path forward. If you want, we can close this one to keep the timeout workstream tidy.

