
v0.6.0 release #235

Merged: sethconvex merged 21 commits into main from release/0.6.0 on Mar 19, 2026

Conversation

@sethconvex (Contributor) commented Mar 6, 2026

Summary

  • Auto-deny unresolved tool approvals when a new generation starts (merged from auto-deny-unresolved-approvals branch)
  • Fix unhandled rejection in DeltaStreamer when transaction is torn down after stream consumption
  • Remove orphaned streaming demo example files (backend was already deleted)
  • Fix pre-existing type errors in streaming integration tests
  • Bump version to 0.6.0 and update changelog

Test plan

  • npm run typecheck passes
  • npm test — 264 tests pass, 0 type errors, 0 unhandled rejections
  • npm run lint — 0 errors (2 pre-existing warnings)

🤖 Generated with Claude Code

Summary by CodeRabbit

  • New Features

    • Auto-deny unresolved tool approvals when a new generation starts
    • Tool approval workflow with server/client flows and UI approval states
  • Bug Fixes

    • Fix unhandled rejection during stream teardown
    • Avoid finishing streams after they've been aborted; persist final step on stream errors
  • Tests

    • Added comprehensive streaming/delta integration tests and approval flow tests
  • Documentation

    • Added detailed tool approval guide with examples and UI patterns
  • Chores

    • Release version updated to 0.6.0; changelog and .gitignore updated

sethconvex and others added 5 commits February 21, 2026 00:20
When a user sends a new message instead of resolving pending tool approvals,
inject synthetic denial responses rather than silently dropping the orphaned
tool calls. This preserves context in the message history and prevents the
AI SDK from receiving incomplete tool-call/result pairs.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
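The auto-deny behavior this commit describes can be sketched roughly as follows. The message and part shapes here are simplified stand-ins for the real types in `src/mapping.ts`, and the helper is illustrative rather than the actual implementation:

```typescript
type Part =
  | { type: "approval-request"; approvalId: string }
  | { type: "approval-response"; approvalId: string; approved: boolean; reason?: string };

interface Message {
  role: "assistant" | "tool";
  parts: Part[];
}

// Sketch: inject synthetic denials for any approval-request that never
// received a matching approval-response.
function autoDenyUnresolvedApprovals(messages: Message[]): Message[] {
  // Collect every approvalId that already has a response.
  const responded = new Set(
    messages.flatMap((m) =>
      m.parts.filter((p) => p.type === "approval-response").map((p) => p.approvalId),
    ),
  );
  const denials: Message[] = [];
  for (const m of messages) {
    // Unresolved = requested but never responded to.
    const unresolved = m.parts.filter(
      (p) => p.type === "approval-request" && !responded.has(p.approvalId),
    );
    if (unresolved.length > 0) {
      console.warn(`Auto-denying ${unresolved.length} unresolved approval(s)`);
      // Inject a synthetic denial so tool-call/result pairs stay complete.
      denials.push({
        role: "tool",
        parts: unresolved.map((p) => ({
          type: "approval-response" as const,
          approvalId: p.approvalId,
          approved: false,
          reason: "auto-denied: new generation started",
        })),
      });
    }
  }
  return [...messages, ...denials];
}
```

The key property is that the original history is never mutated; denials are appended, so the model still sees the tool call alongside an explicit (synthetic) denial.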
…Id usage

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- Auto-deny unresolved tool approvals when a new generation starts
- Fix unhandled rejection in DeltaStreamer on transaction teardown
- Remove orphaned streaming demo example files
- Fix pre-existing type errors in streaming integration tests
- Bump version to 0.6.0 and update changelog

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
coderabbitai bot commented Mar 6, 2026

📝 Walkthrough

Bumps package to 0.6.0 and documents a Tool Approval flow; implements auto-deny of unresolved approvals, merges approval responses when possible, updates client-side filtering to retain pending tool-calls, tightens DeltaStreamer abort/finish behavior, adds streaming integration tests, and adds docs for tool approval.

Changes

| Cohort / File(s) | Summary |
| --- | --- |
| **Release & Metadata**<br>`CHANGELOG.md`, `package.json`, `.gitignore` | Version bumped to 0.6.0, changelog updated (auto-deny unresolved approvals; DeltaStreamer teardown fix), and `.mcp.json` added to `.gitignore`. |
| **Docs: Tool Approval**<br>`docs/tool-approval.mdx` | New documentation describing `needsApproval`, server & client flows, approval mutations/actions, UI states, multi-approval handling, and auto-deny semantics. |
| **Mapping & Serialization**<br>`src/mapping.ts`, `src/mapping.test.ts` | Replaced prior merge behavior with exported `autoDenyUnresolvedApprovals(messages)` that groups unresolved approvals by assistant step, emits warnings, and injects synthetic denial responses; refactored the serialization API to `serializeResponseMessages` and adjusted internal serialization helpers/tests. |
| **Client: context & search**<br>`src/client/search.ts`, `src/client/search.test.ts` | Keep tool-call messages that have pending approval requests during context assembly; call `autoDenyUnresolvedApprovals` as post-processing; tests updated to reflect retained tool-call and auto-deny behavior. |
| **Client: approval handling & merging**<br>`src/client/index.ts` | Added private `findApprovalContext(...)` to locate the prompt message and an optional existing response; when found, merge the new tool-approval-response into the existing message instead of always creating a new response. |
| **UIMessages: approval & denied handling**<br>`src/UIMessages.ts` | Pre-extract approval-request/response and execution-denied parts, apply approval metadata to toolCall parts, and set part states (approval-requested, approval-responded, output-denied) during UI message assembly. |
| **Streaming: runtime changes**<br>`src/client/streaming.ts`, `src/client/streamText.ts` | Tighter abort handling: await in-flight stream creation before aborting, skip finish/send when aborted, swallow send errors and trigger abort cleanup; ensure the pending final step is persisted on stream error. |
| **Streaming: tests**<br>`src/client/streaming.integration.test.ts` | Adds comprehensive integration tests for the DeltaStreamer lifecycle, delta persistence/reconstruction, compression, filtering, abort/finish semantics, thread isolation, and helpers (`deriveUIMessagesFromDeltas`, `compressUIMessageChunks`, etc.). |
| **Start & callsite updates**<br>`src/client/start.ts` | Replaced `serializeNewMessagesInStep` with `serializeResponseMessages(...)` and updated callsites to pass the model and new response messages. |
| **Approval tests & examples**<br>`src/client/approval.test.ts`, `example/*` | Added multi-tool approval test flows and test helpers (`testMultiToolApproveFlow`, `submitApprovalForMultiToolAgent`); minor formatting/comment tweaks in examples. |

Sequence Diagram(s)

```mermaid
sequenceDiagram
  autonumber
  participant UI as Client UI
  participant Server as Server/API
  participant Agent as Generator/Agent
  participant DB as DB/Message Store

  UI->>Server: sendMessage(user message)
  Server->>DB: save user message
  Server->>Agent: schedule generation
  Agent->>DB: append assistant message (tool-call + approval-request)
  Agent-->>Agent: pause generation awaiting approvals
  UI->>Server: submitApproval(approvalId, approve/deny)
  Server->>DB: save approval response (merge if existingResponseMessage)
  Server->>Agent: continueAfterApprovals
  Agent->>DB: append tool-result / continue assistant output
  Note over Server,DB: If a new generation starts before approvals resolve,<br/>Server runs autoDenyUnresolvedApprovals -> inject synthetic denial messages
```

Estimated code review effort

🎯 4 (Complex) | ⏱️ ~45 minutes

Possibly related PRs

  • AI SDK v6 Support #216 — Prior PR modifying tool-approval handling and touching mapping, UIMessages, and client search/approval areas; strongly related.

Poem

A rabbit hops in code and cheer, 🐰
Denials spring when approvals disappear,
Streams tidy up and merges hum with care,
Docs and tests parade — the repo breathes fresh air,
Version 0.6.0 — a joyful little share!

🚥 Pre-merge checks | ✅ 2 | ❌ 1

❌ Failed checks (1 warning)

| Check name | Status | Explanation | Resolution |
| --- | --- | --- | --- |
| Docstring Coverage | ⚠️ Warning | Docstring coverage is 42.86%, below the required threshold of 80.00%. | Write docstrings for the functions missing them to satisfy the coverage threshold. |

✅ Passed checks (2 passed)

| Check name | Status | Explanation |
| --- | --- | --- |
| Description Check | ✅ Passed | Check skipped - CodeRabbit's high-level summary is enabled. |
| Title Check | ✅ Passed | The title 'v0.6.0 release' directly aligns with the primary purpose of the PR: bumping the version to 0.6.0 and releasing it, as evidenced by version updates in package.json and CHANGELOG.md. |


coderabbitai bot left a comment

Actionable comments posted: 5

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/client/streaming.ts (1)

337-343: ⚠️ Potential issue | 🔴 Critical

Stream state will be marked as successfully completed even when delta writes fail.

The change silently returns from sendDelta() on error instead of propagating the exception. This causes a critical issue:

  1. When an error occurs, onAsyncAbort(call.fail) notifies Convex of the failure
  2. abortController.abort() prevents further writes
  3. However, consumeStream() continues its loop (unaware of the error)
  4. The loop completes normally and calls finish(), marking the stream as successfully completed in the database
  5. This is incorrect—the stream should be marked as failed/aborted via fail(), not finished

The original exception-based flow was correct: the exception propagated to the streamText.ts catch block (lines 168-171), which explicitly called streamer?.fail() to mark the stream with the proper failed state before rethrowing.

With this change, streams that fail during delta transmission will have incorrect state in the database, appearing as successfully completed when they actually failed.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/streaming.ts` around lines 337 - 343, sendDelta currently swallows
errors by calling onAsyncAbort and abortController.abort() then returning, which
lets consumeStream() finish and call finish() instead of marking the stream
failed; change sendDelta (the catch block in sendDelta) to rethrow the caught
error after calling this.config.onAsyncAbort(...) and
this.abortController.abort() so the exception propagates to the streamText.ts
catch path (which calls streamer?.fail()), preserving the original error (throw
e) rather than returning silently.
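The review's suggested shape, reduced to a self-contained sketch. The class and callback names are illustrative stand-ins, not the real `DeltaStreamer` API:

```typescript
// Sketch of the fix: notify + abort, then rethrow so the caller's catch
// path (which marks the stream failed) actually runs.
class DeltaSender {
  private abortController = new AbortController();

  constructor(private onAsyncAbort: (reason: string) => void) {}

  async sendDelta(write: () => Promise<void>): Promise<void> {
    try {
      await write();
    } catch (e) {
      // Still notify and stop further writes...
      this.onAsyncAbort(String(e));
      this.abortController.abort();
      // ...but rethrow instead of returning, so consumeStream() cannot
      // loop to completion and call finish() on a failed stream.
      throw e;
    }
  }
}
```

With the rethrow in place, the exception propagates to the `streamText.ts` catch block, which calls `streamer?.fail()` and records the proper failed state.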
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In @.mcp.json:
- Around line 1-14: The committed .mcp.json contains machine- and user-specific
paths (the "command" entry referencing /Applications/Hive.app/... and the
MCP_SOCKET_PATH with /Users/magicseth/...), so remove this file from the repo
and replace it with a portable approach: add ".mcp.json" to .gitignore, create a
".mcp.json.example" with placeholder values for "command" and
"env.MCP_SOCKET_PATH" (e.g., "<PATH_TO_HIVE>" and "<MCP_SOCKET_PATH>") and
update any code that reads .mcp.json to fall back to environment variables
(e.g., process.env.MCP_COMMAND and process.env.MCP_SOCKET_PATH) when present so
local machine paths are not stored in source control.

In `@docs/http-streaming-requirements.md`:
- Around line 35-52: The fenced ASCII diagram block lacks a language specifier;
update the opening fence to include a plain text language (for example use
```text or ```plaintext) so the diagram containing create(), addDelta(),
finish(), abort()/timeout, and cleanup is lint-compliant and rendered as plain
text.
- Around line 120-123: The fenced code block containing the index query snippet
using .withIndex("streamId_start_end", (q) => q.eq("streamId",
cursor.streamId).gte("start", cursor.cursor)) should include a language
specifier (e.g., add ```text or ```js) or a short inline comment indicating the
format; update the block surrounding the symbols .withIndex,
"streamId_start_end", q.eq, gte, cursor.streamId, and cursor.cursor to start
with a language tag so syntax/formatting and reader intent are clear.

In `@src/client/streaming.integration.test.ts`:
- Line 1: Remove the unused import symbol `vi` from the test file's import
statement (the line importing beforeEach, describe, expect, test, vi) so the
file only imports the used symbols; update the import to exclude `vi` (e.g.,
import { beforeEach, describe, expect, test } from "vitest") to eliminate the
static analysis warning.
- Around line 7-12: Remove the unused symbol DEFAULT_STREAMING_OPTIONS from the
import list in src/client/streaming.integration.test.ts; update the import
statement that currently imports compressUIMessageChunks, DeltaStreamer,
mergeTransforms, DEFAULT_STREAMING_OPTIONS from "./streaming.js" to only import
compressUIMessageChunks, DeltaStreamer, and mergeTransforms so the test file no
longer contains an unused import.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: dbc1b56b-6e79-4c27-aeaa-289daa6aed40

📥 Commits

Reviewing files that changed from the base of the PR and between 6fbf088 and 0b000ca.

📒 Files selected for processing (12)
  • .mcp.json
  • CHANGELOG.md
  • docs/http-streaming-requirements.md
  • docs/tool-approval.mdx
  • example/ui/main.tsx
  • package.json
  • src/client/search.test.ts
  • src/client/search.ts
  • src/client/streaming.integration.test.ts
  • src/client/streaming.ts
  • src/mapping.test.ts
  • src/mapping.ts

Comment on lines +35 to +52
```
create() addDelta() (with heartbeat)
│ │
▼ ▼
┌──────────┐ ┌──────────┐
│ streaming │─────▶│ streaming │──── heartbeat every ~2.5 min
└──────────┘ └──────────┘
│ │
│ finish() │ abort() / timeout (10 min)
▼ ▼
┌──────────┐ ┌─────────┐
│ finished │ │ aborted │
└──────────┘ └─────────┘
│ cleanup (5 min delay)
[deleted]
```

⚠️ Potential issue | 🟡 Minor

Add language specifier to fenced code block.

This ASCII diagram code block is missing a language specifier. Consider using text or plaintext to satisfy linting and improve clarity.

-```
+```text
   create()          addDelta() (with heartbeat)
🧰 Tools
🪛 markdownlint-cli2 (0.21.0)

[warning] 35-35: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


Comment on lines +120 to +123
```
.withIndex("streamId_start_end", (q) =>
q.eq("streamId", cursor.streamId).gte("start", cursor.cursor))
```

⚠️ Potential issue | 🟡 Minor

Add language specifier to fenced code block.

This code block is missing a language specifier. Since it shows an index query pattern, consider using text or a comment indicating the format.

-```
+```text
 .withIndex("streamId_start_end", (q) =>
🧰 Tools
🪛 markdownlint-cli2 (0.21.0)

[warning] 120-120: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


ianmacartney (Member) left a comment

seems like there's some leftover changes in here.
overall seems good though

…nings

- Add .mcp.json to .gitignore and remove from tracking (machine-specific)
- Remove docs/http-streaming-requirements.md (would auto-publish to docs site)
- Remove unused imports (vi, DEFAULT_STREAMING_OPTIONS) in streaming tests

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
pkg-pr-new bot commented Mar 6, 2026

npm i https://pkg.pr.new/get-convex/agent/@convex-dev/agent@235

commit: 42927ba

coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (1)
src/client/streaming.integration.test.ts (1)

46-47: Keep the new integration suite fully typed.

This file widens Convex IDs to string and falls back to any in several assertions. That weakens the strict typing these tests should be preserving; prefer the concrete Id<...> returned by createThread and narrow parts/transforms with typed predicates instead of any.

As per coding guidelines, src/**/*.ts: Use Id<'tableName'> for document IDs and maintain strict TypeScript type safety.

Also applies to: 232-233, 369-371, 442-443, 758-759, 963-964, 155-157, 625-635, 746-748, 946-949, 1023-1025, 1203-1203

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/streaming.integration.test.ts` around lines 46 - 47, The tests
currently widen Convex IDs to string and use any for assertions; change threadId
from string to the concrete Id<'threads'> (or appropriate table name) and assign
it from the typed return of createThread so it preserves the Convex Id type;
replace loose any uses in assertions for parts/transforms with narrow type
predicates or type guards (e.g., isTextPart/isTransformType) and assert against
those narrowed types instead of casting to any; update occurrences around
variables t, threadId, createThread, and the parts/transforms checks so all
expectations use the proper Id<'...'> and narrowed types rather than string/any.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/client/streaming.integration.test.ts`:
- Around line 21-29: The shared test configuration defaultTestOptions currently
swallows every async abort via onAsyncAbort, hiding real streaming write
failures; change onAsyncAbort in defaultTestOptions to fail fast (e.g., throw
the received _reason or call a test failure helper) so unexpected aborts
surface, and then update only the specific tests that expect teardown/abort
behavior to override onAsyncAbort with the current no-op handler; locate
defaultTestOptions and the onAsyncAbort property to implement the change and
adjust the few tests that intentionally exercise aborts to provide their own
override.
- Around line 171-178: The tests call streamText(...) and await
streamer.consumeStream(result.toUIMessageStream()) but fail to drain the AI SDK
result itself; for each occurrence (e.g., the variable result from streamText
and its toUIMessageStream()), also call and await result.consumeStream() after
awaiting streamer.consumeStream(...) so the streamText internal async work is
fully consumed; locate uses of streamText(...) and ensure both
streamer.consumeStream(result.toUIMessageStream()) and await
result.consumeStream() are present (matching the pattern used in the first test
case) to prevent leaked async tasks/unhandled rejections.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 174ddac9-ead6-4473-b4af-9cca3939b322

📥 Commits

Reviewing files that changed from the base of the PR and between 0b000ca and 7ba8b13.

📒 Files selected for processing (2)
  • .gitignore
  • src/client/streaming.integration.test.ts

Comment on lines +21 to +29
```ts
const defaultTestOptions = {
  throttleMs: 0,
  abortSignal: undefined,
  compress: null,
  onAsyncAbort: async (_reason: string) => {
    // In integration tests, async aborts can happen when the stream
    // finishes before a pending delta write completes. This is expected.
  },
};
```

⚠️ Potential issue | 🟠 Major

Don't suppress every async abort in the shared test config.

This default handler makes real streaming write failures indistinguishable from the one teardown race you're documenting. I'd make the shared default fail fast and only override it in the handful of cases where an abort is actually expected.

🔧 Minimal fix
```diff
 const defaultTestOptions = {
   throttleMs: 0,
   abortSignal: undefined,
   compress: null,
-  onAsyncAbort: async (_reason: string) => {
-    // In integration tests, async aborts can happen when the stream
-    // finishes before a pending delta write completes. This is expected.
-  },
+  onAsyncAbort: async (reason: string) => {
+    throw new Error(`Unexpected async abort in test: ${reason}`);
+  },
 };
```

Then override onAsyncAbort only in tests that intentionally exercise teardown/abort behavior.
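A self-contained sketch of that arrangement; the option names mirror the test config quoted above, but this is illustrative, not the actual test file:

```typescript
// Shared default: fail fast so unexpected aborts surface as test failures.
const failFastOptions = {
  onAsyncAbort: async (reason: string) => {
    throw new Error(`Unexpected async abort in test: ${reason}`);
  },
};

// A teardown-focused test overrides just this one handler, keeping the
// deliberate swallow-and-document behavior local to tests that need it.
const teardownTestOptions = {
  ...failFastOptions,
  onAsyncAbort: async (_reason: string) => {
    // Expected during teardown races; intentionally swallowed here only.
  },
};
```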


Comment on lines +171 to +178
```ts
const result = streamText({
  model: mockModel({
    content: [{ type: "text", text: "Done" }],
  }),
  prompt: "Test",
});

await streamer.consumeStream(result.toUIMessageStream());
```

⚠️ Potential issue | 🟠 Major

🧩 Analysis chain

Scripts executed against the repository (get-convex/agent):

```shell
# Locate the test file and get an overview of its size and neighbors
find . -name "streaming.integration.test.ts" -type f
wc -l src/client/streaming.integration.test.ts
git ls-files src/client/ | head -20

# Read the first test case (lines 141-147) and the review target (lines 171-178)
sed -n '135,155p' src/client/streaming.integration.test.ts
sed -n '165,185p' src/client/streaming.integration.test.ts

# Check the other flagged line ranges for the same pattern
sed -n '245,260p' src/client/streaming.integration.test.ts
sed -n '280,300p' src/client/streaming.integration.test.ts
sed -n '455,475p' src/client/streaming.integration.test.ts
sed -n '500,530p' src/client/streaming.integration.test.ts
sed -n '550,570p' src/client/streaming.integration.test.ts
sed -n '600,620p' src/client/streaming.integration.test.ts
sed -n '975,1000p' src/client/streaming.integration.test.ts
sed -n '1070,1090p' src/client/streaming.integration.test.ts
sed -n '1115,1135p' src/client/streaming.integration.test.ts

# Count streamText() calls vs. result.consumeStream() drains
rg 'streamText\s*\(' src/client/streaming.integration.test.ts | wc -l
rg 'await result\.consumeStream\(\)' src/client/streaming.integration.test.ts | wc -l

# Inspect streamText and its consumeStream usage
cat -n src/client/streamText.ts | head -80
rg 'consumeStream' src/client/streamText.ts -B 3 -A 3
rg 'teardown|unhandled|rejection|abort' src/client/streaming.integration.test.ts | head -20

# Check the imports section and where streamText comes from in the test
sed -n '1,30p' src/client/streaming.integration.test.ts
rg 'from.*streamText' src/client/streaming.integration.test.ts
sed -n '80,130p' src/client/streaming.integration.test.ts
```


Drain each streamText() result to prevent async work leaking past t.run().

The first test case (lines 141-147) explicitly does this with a comment "Ensure the AI SDK result is also fully consumed", calling both await streamer.consumeStream(result.toUIMessageStream()) and await result.consumeStream(). However, all 12 later instances only await the streamer and skip the result drain, leaving AI SDK internals unfinished when the test scope exits. This can surface as unhandled rejections during teardown.

🔧 Representative fix

```diff
 const result = streamText({
   model: mockModel({
     content: [{ type: "text", text: "Done" }],
   }),
   prompt: "Test",
 });

 await streamer.consumeStream(result.toUIMessageStream());
+await result.consumeStream();
```

Also applies to: 251–255, 287–291, 461–468, 507–511, 519–523, 557–563, 606–615, 983–993, 1077–1083, 1122–1126
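One way to avoid repeating the two-step drain at every call site is a small helper. This is a hypothetical pattern, not part of the codebase; the structural types only capture the two methods the pattern relies on:

```typescript
// Hypothetical helper capturing the review's pattern: drain both the
// streamer's UI-message stream and the AI SDK result's own internal work.
async function drainStreamerAndResult(
  streamer: { consumeStream(s: unknown): Promise<void> },
  result: { toUIMessageStream(): unknown; consumeStream(): Promise<void> },
): Promise<void> {
  await streamer.consumeStream(result.toUIMessageStream());
  await result.consumeStream(); // prevents async work leaking past the test scope
}
```

Each test body then collapses to `await drainStreamerAndResult(streamer, result)`, making it impossible to forget the second drain.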


sethconvex (Contributor, Author) replied:

> seems like there's some leftover changes in here.
> overall seems good though

done

@sethconvex sethconvex requested a review from ianmacartney March 6, 2026 22:17
ianmacartney (Member) left a comment

I noticed mergeApprovalResponseMessages isn't applied at the same place where we drop hanging tool messages - so it isn't reflected in fetchRecentAndSearchMessages / fetchContextMessages. I can understand auto-rejecting only during the fetchContextWithPrompt

I'm also curious why we merge at read time instead of writing merged messages when approving / denying?
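Merging at write time, as this question suggests, could look roughly like the sketch below. The shapes are illustrative stand-ins, not the actual `respondToToolCallApproval` implementation:

```typescript
type ApprovalPart = {
  type: "tool-approval-response";
  approvalId: string;
  approved: boolean;
};

interface ResponseMessage {
  promptId: string;
  parts: ApprovalPart[];
}

// Sketch: if a response message for the same prompt already exists,
// append the new approval part to it instead of creating a new message.
// The DB then always holds a single merged response per step, and read
// paths need no merge logic.
function recordApproval(
  existing: ResponseMessage[],
  promptId: string,
  part: ApprovalPart,
): ResponseMessage[] {
  const match = existing.find((m) => m.promptId === promptId);
  if (match) {
    match.parts.push(part); // merge into the existing response message
    return existing;
  }
  return [...existing, { promptId, parts: [part] }];
}
```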

Comment on lines +96 to +130
```ts
// 3. Submit an approval decision
export const submitApproval = mutation({
  args: {
    threadId: v.string(),
    approvalId: v.string(),
    approved: v.boolean(),
    reason: v.optional(v.string()),
  },
  returns: v.object({ messageId: v.string() }),
  handler: async (ctx, { threadId, approvalId, approved, reason }) => {
    const { messageId } = approved
      ? await approvalAgent.approveToolCall(ctx, {
          threadId, approvalId, reason,
        })
      : await approvalAgent.denyToolCall(ctx, {
          threadId, approvalId, reason,
        });
    return { messageId };
  },
});

// 4. Continue generation after all approvals resolved.
// Pass the last approval message ID as promptMessageId so the agent
// resumes generation from where the approval was issued.
export const continueAfterApprovals = internalAction({
  args: { threadId: v.string(), lastApprovalMessageId: v.string() },
  handler: async (ctx, { threadId, lastApprovalMessageId }) => {
    const result = await approvalAgent.streamText(
      ctx,
      { threadId },
      { promptMessageId: lastApprovalMessageId },
    );
    await result.consumeStream();
  },
});
```
ianmacartney (Member) commented:

I think it's worth calling out that these can be in the same function - you can approve tool calls from a mutation or an action

```ts
      ...inputPrompt,
      ...existingResponses,
    ]),
  );
```
ianmacartney (Member) commented:

With this, if the user's contextHandler doesn't call autoDenyUnresolvedApprovals and mergeApprovalResponseMessages then they'll end up with invalid results. I think we should do this after the context handler as post-processing, or add them to search/recent/input/prompt/existing, though there the messages might span two lists.. I'd rather we merge at write time.

- respondToToolCallApproval now appends to an existing approval-response
  message for the same step, so the DB always has a single merged message
- Remove mergeApprovalResponseMessages from the read path in search.ts
- Move autoDenyUnresolvedApprovals to post-process after contextHandler
  so custom handlers don't need to worry about it
- Fix consumeStream to skip finish() when stream was aborted
- Note in tool-approval docs that approve/deny works from mutations or actions

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Contributor Author
Ok, can you take one more look?

@coderabbitai coderabbitai bot left a comment

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (2)
src/client/search.ts (1)

651-674: ⚠️ Potential issue | 🟠 Major

Don't auto-deny approvals pulled in from search results.

processedMessages already includes search hits here. When searchOtherThreads surfaces an unresolved approval from another thread, autoDenyUnresolvedApprovals will synthesize an "auto-denied: new generation started" response for that unrelated conversation. That mutates retrieved history and can mislead the model. Please scope this to current-thread messages, or carry thread provenance through post-processing.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/search.ts` around lines 651 - 674, The current code runs
autoDenyUnresolvedApprovals(processedMessages) over messages that include search
hits from other threads, which causes synthesized auto-denials to be injected
into unrelated conversation history; restrict the mutation to only messages from
the current thread by either (A) filtering processedMessages to only messages
with the current threadId before calling autoDenyUnresolvedApprovals (use the
threadId variable and the message.threadId/provenance field) and then merge
back, or (B) change autoDenyUnresolvedApprovals to accept a threadId (or
provenance) and have it only modify messages whose threadId matches the provided
threadId; update any callers (here around processedMessages, allMessages,
search, recent, inputMessages, inputPrompt, existingResponses and
args.contextHandler) accordingly so search-sourced approvals are not mutated.
src/client/index.ts (1)

1127-1155: ⚠️ Potential issue | 🟠 Major

Scope merge candidates to the matched approval step.

existingResponseMessage is set to the first tool-approval-response seen anywhere later in the thread. If an older approval is handled after a newer step has already written its own approval response, this path will append into that unrelated message and return the wrong messageId. Please either reject stale approvals once newer history exists, or only reuse a response message that belongs to the same prompt/step as the matched request.
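One way to sketch the scoped lookup, under hypothetical simplified shapes: walking newest-first, a candidate response message is forgotten as soon as a different step's assistant message is crossed, so only a response belonging to the matched request's own step is ever reused.

```typescript
interface StepMsg {
  _id: string;
  role: "assistant" | "tool";
  // Approval requests (on assistant messages) or responses (on tool messages).
  approvalIds: string[];
}

function findMergeTarget(
  newestFirst: StepMsg[],
  approvalId: string,
): { promptMessageId: string; existingResponseId?: string } | undefined {
  let existingResponseId: string | undefined;
  for (const msg of newestFirst) {
    if (msg.role === "tool") {
      // Candidate merge target; only valid if the matching request is the
      // very next assistant message we encounter.
      existingResponseId = msg._id;
    } else if (msg.approvalIds.includes(approvalId)) {
      // Found the request; any remembered response is from the same step.
      return { promptMessageId: msg._id, existingResponseId };
    } else {
      // Crossed a different step's assistant message: responses seen so far
      // belong to that step, not ours.
      existingResponseId = undefined;
    }
  }
  return undefined;
}
```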

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/index.ts` around lines 1127 - 1155, The scan currently sets
existingResponseMessage to the first tool-approval-response anywhere in the
thread, which can cause merging into a response for a different prompt/step;
change the logic so you locate the tool-approval-request (matching
typedPart.approvalId === args.approvalId) first (capture its message._id as
promptMessageId) and only accept/reuse a tool-approval-response if that response
explicitly references the same prompt/step (e.g. has a parent/replyTo/promptId
field equal to promptMessageId or an explicit stepId that matches the request),
otherwise treat a pre-existing response that is not tied to this request as a
newer unrelated entry and reject the stale approval; update the loop around
listMessages/listMessages(...) and the existingResponseMessage assignment to
enforce this scoped check (references: existingResponseMessage, listMessages,
typedPart, tool-approval-response, tool-approval-request, approvalId,
promptMessageId).
🧹 Nitpick comments (1)
example/convex/approval.test.ts (1)

285-295: Assert the merged messageId explicitly.

This only feeds lastMessageId into the continuation; it never proves that the first and second approvals reused the same stored message or that only one tool-response message was written. A regression back to separate writes would still pass here.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@example/convex/approval.test.ts` around lines 285 - 295, Collect and assert
the merged messageId explicitly instead of only using lastMessageId: when
iterating approvalParts and calling ctx.runMutation with
submitApprovalForTestMultiToolAgent, capture the first returned messageId (e.g.,
firstMessageId) and assert that every subsequent messageId equals firstMessageId
(or assert firstMessageId === lastMessageId after the loop); optionally also
assert the SDK/DB shows only one tool-response message for the step to guard
against separate writes. Ensure you use the existing variables approvalParts,
lastMessageId, and the ctx.runMutation(submitApprovalForTestMultiToolAgent) call
sites to locate where to add the assertions.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: b5be4c89-991e-432e-a970-19052b86791c

📥 Commits

Reviewing files that changed from the base of the PR and between 7ba8b13 and 5ed702f.

📒 Files selected for processing (5)
  • docs/tool-approval.mdx
  • example/convex/approval.test.ts
  • src/client/index.ts
  • src/client/search.ts
  • src/client/streaming.ts
🚧 Files skipped from review as they are similar to previous changes (2)
  • src/client/streaming.ts
  • docs/tool-approval.mdx

- finish() now checks abortController.signal.aborted after awaiting
  ongoingWrite and sendDelta, preventing streams.finish from running
  on aborted streams
- findApprovalContext resets existingResponseMessage when crossing a
  different step's assistant message, preventing approval responses
  from merging into a different step's tool message

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@coderabbitai coderabbitai bot left a comment

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/client/streaming.ts (1)

338-344: ⚠️ Potential issue | 🟠 Major

Stream may be left in orphaned "streaming" state when delta write fails.

When addDelta throws, the error is caught and swallowed after calling onAsyncAbort (which is call.fail) and setting the local abort signal. However, streams.abort is never called to mark the stream as aborted in the database.

The call.fail callback only finalizes the pending message; it does not update the stream state. Since the error doesn't propagate, the catch block in streamText.ts (lines 150-153) that would call streamer?.fail() and properly abort the stream never executes. This leaves the stream in "streaming" status in the database indefinitely, and callers receive no indication that streaming failed.

Consider calling streams.abort here before returning, or ensure onAsyncAbort is responsible for marking the stream as aborted in the database.
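A sketch of that suggestion, with the Convex calls replaced by injected hypothetical callbacks (abortStreamInDb standing in for the streams.abort mutation, onAsyncAbort for call.fail): on a failed delta write, the stream is marked aborted in the database as well as locally, so it never sits in "streaming" status forever.

```typescript
async function flushDelta(opts: {
  addDelta: () => Promise<void>;
  abortStreamInDb: (reason: string) => Promise<void>;
  onAsyncAbort: (reason: string) => Promise<void>;
  localAbort: AbortController;
}): Promise<boolean> {
  try {
    await opts.addDelta();
    return true;
  } catch (e) {
    const reason = e instanceof Error ? e.message : String(e);
    // Stop further local writes first.
    opts.localAbort.abort();
    // Finalize the pending message (what call.fail does today).
    await opts.onAsyncAbort(reason);
    // Also clear the stream's "streaming" status in the database.
    await opts.abortStreamInDb(reason);
    return false;
  }
}
```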

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/streaming.ts` around lines 338 - 344, When addDelta throws in
src/client/streaming.ts the catch currently calls this.config.onAsyncAbort and
aborts the local abortController but never marks the stream as aborted in the
DB, leaving it stuck in "streaming"; update the catch block to also call the
stream-abort path (e.g., invoke streams.abort with the current stream
id/context) before returning, or ensure this.config.onAsyncAbort delegates to
streams.abort — specifically modify the error handler that references addDelta,
this.config.onAsyncAbort, and this.abortController.abort() to also call the
stream abort method (streams.abort) so the stream state is cleared like
streamer?.fail() does in streamText.ts.
🧹 Nitpick comments (1)
src/client/index.ts (1)

1157-1163: Clarify: tracking logic captures first encountered response regardless of approvalId.

The current logic tracks the first (newest) tool-approval-response message seen during iteration, regardless of which approvalId it contains. This is intentional for same-step merging, but the comment could clarify that responses for any approval in the same step are valid merge targets.

Consider enhancing the comment:

-          // Track the most recent tool-approval-response message for merging
+          // Track the most recent tool-approval-response message for merging.
+          // Any response in the same step is a valid merge target, regardless
+          // of which approvalId it contains.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/index.ts` around lines 1157 - 1163, The comment above the block
that sets existingResponseMessage when typedPart.type ===
"tool-approval-response" should be clarified: update the comment to state that
this intentionally captures the first encountered tool-approval-response message
for the current step regardless of its approvalId (i.e., responses for any
approval in the same step are valid merge targets), referencing the
typedPart.type check and the existingResponseMessage assignment so readers
understand the intended merging behavior.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: cdb914be-e64d-4d27-8c18-307c81aa85fd

📥 Commits

Reviewing files that changed from the base of the PR and between 5ed702f and 8994fe3.

📒 Files selected for processing (2)
  • src/client/index.ts
  • src/client/streaming.ts

Member

@ianmacartney ianmacartney left a comment

The full table scan and the inconsistent-write scenario both need fixing

- Auto-deny unresolved tool approvals when a new generation starts
- Fix unhandled rejection in DeltaStreamer on transaction teardown

**Note on versioning:** This release jumps from 0.3.2 to 0.6.0, skipping versions 0.4.0 and 0.5.0. This is intentional and aligns the `@convex-dev/agent` package version with the AI SDK v6 major version for clearer compatibility signaling. Going forward, the minor version of this package will track the AI SDK major version to make it easier for developers to identify which version of the AI SDK is supported.
Member

note: this does constrain us a bit, in that we can't make breaking changes between major versions of the AI SDK, though we can always break from this convention in the future.

let cursor: string | null = null;
let existingResponseMessage: MessageDoc | undefined;
do {
const page = await this.listMessages(ctx, {
Member

This is walking every message in the chat history. If we can pass in the order / promptMessageId / something, then we can constrain it tightly, and we could likely enforce that you can only respond to tool calls within the latest N messages.
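The constraint could be sketched roughly like this, with listPage as a hypothetical stand-in for the paginated listMessages call: scan newest-first and treat the approval as stale once a message budget is exhausted.

```typescript
interface ApprovalMsg {
  _id: string;
  approvalIds: string[];
}

async function findApprovalBounded(
  listPage: (
    cursor: string | null,
  ) => Promise<{ page: ApprovalMsg[]; cursor: string | null }>,
  approvalId: string,
  maxMessages = 100,
): Promise<string | undefined> {
  let cursor: string | null = null;
  let scanned = 0;
  do {
    const { page, cursor: next } = await listPage(cursor);
    for (const msg of page) {
      if (msg.approvalIds.includes(approvalId)) return msg._id;
      // Approvals should live near the end of the thread; stop early
      // instead of walking the full history.
      if (++scanned >= maxMessages) return undefined;
    }
    cursor = next;
  } while (cursor !== null);
  return undefined;
}
```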

Comment on lines +1073 to +1099
const { promptMessageId, existingResponseMessage } =
await this.findApprovalContext(ctx, {
threadId: args.threadId,
approvalId: args.approvalId,
});

const newPart = {
type: "tool-approval-response" as const,
approvalId: args.approvalId,
approved: args.approved,
reason: args.reason,
};

// Merge into an existing approval-response message for this step
// so the AI SDK sees a single tool message per step.
if (existingResponseMessage) {
const existingContent = existingResponseMessage.message?.content;
const mergedContent = Array.isArray(existingContent)
? [...(existingContent as any[]), newPart]
: [newPart];
await this.updateMessage(ctx, {
messageId: existingResponseMessage._id,
patch: {
message: { role: "tool", content: mergedContent },
status: "success",
},
});
Member

note: if this is called from an action, then this could race and replace the content with last-write-wins, losing the previous approval... can we do this in saveMessage so anyone who calls saveMessage with a tool approval will have it auto-squashed into the appropriate message? It should generally be one of the last messages (maybe it has to be the last message because of Gemini?).
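A rough sketch of the squash-on-save idea, with simplified hypothetical shapes: because a Convex mutation runs transactionally, doing the append inside saveMessage removes the last-write-wins race that an action-side read-modify-write has.

```typescript
interface ToolMsg {
  role: string;
  parts: { type: string; approvalId: string }[];
}

function saveApprovalResponse(
  thread: ToolMsg[],
  part: { type: "tool-approval-response"; approvalId: string },
): ToolMsg[] {
  const last = thread[thread.length - 1];
  if (
    last &&
    last.role === "tool" &&
    last.parts.every((p) => p.type === "tool-approval-response")
  ) {
    // Squash into the existing approval-response message for this step.
    last.parts.push(part);
    return thread;
  }
  // Otherwise start a new tool message for this step's responses.
  return [...thread, { role: "tool", parts: [part] }];
}
```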

DangerouslyShip and others added 2 commits March 10, 2026 16:14
…lict

Write-time merge in findApprovalContext handles all cases, making the
read-time mergeApprovalResponseMessages unnecessary. Also resolves
merge conflict in toModelMessageContent tool-approval-response case.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@danyi1212

Hey @sethconvex, thank you for the great work!!

When do you expect this version to be released? Looking forward to it 🤞

sethconvex and others added 7 commits March 17, 2026 17:53
- Remove stash conflict markers from UIMessages.ts (kept approval/execution-denied UI processing code)
- Remove misplaced filterApprovalPartsForProvider call on MessageDoc[] in search.ts (only needed on ModelMessage[] in fetchContextWithPrompt)
- Update approval test expectations to match write-time merge behavior (one tool message instead of two)
- Remove duplicate JSDoc comment

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Prevents scanning entire thread history when looking for approval
requests. Approvals should always be near the end of the thread.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
serializeResponseMessages takes explicit response messages (used by
start.ts streaming loop). serializeNewMessagesInStep keeps the
heuristic path for callers that don't track message counts. Shared
logic extracted to serializeStepMessages.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
findApprovalContext was incorrectly resetting existingResponseMessage
when the same assistant message had multiple approval requests (e.g.,
two tool calls needing approval in one step). Now only resets when the
target approval is in a genuinely different step.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
The filter was stripping tool-approval-request and tool-approval-response
parts before the AI SDK could process them via collectToolApprovals.
The SDK handles filtering internally before sending to providers.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Verifies that approving two tool calls from the same assistant message
correctly merges approval responses into one tool message via
findApprovalContext's step-boundary logic.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
- compressTextStreamParts: file parts were double-pushed (sanitized copy
  + original with binary data). Fixed with else-if chain.
- DeltaStreamer abort handler: abort the controller immediately and await
  in-flight stream creation before cleanup, preventing orphaned streams.
- streamText: save deferred final step in catch block so generated
  messages aren't lost when consumeStream throws.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@sethconvex
Contributor Author

@codex review

@chatgpt-codex-connector

To use Codex here, create a Codex account and connect it to GitHub.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

Caution

Some comments are outside the diff and can’t be posted inline due to platform limitations.

⚠️ Outside diff range comments (1)
src/client/streaming.ts (1)

246-262: ⚠️ Potential issue | 🟠 Major

Guard the abort event listener from unhandled Promise rejections.

Line 246 uses an async event listener; if Line 253 or Lines 257-260 reject, that rejection can go unhandled. Wrap the async work in a caught task (or try/catch) so this path cannot reintroduce unhandled rejections.

Proposed hardening
-      config.abortSignal.addEventListener("abort", async () => {
+      config.abortSignal.addEventListener("abort", () => {
+        void (async () => {
         if (this.abortController.signal.aborted) {
           return;
         }
         this.abortController.abort();
         // Wait for in-flight stream creation before trying to abort it
         if (this.#creatingStreamIdPromise) {
           await this.#creatingStreamIdPromise;
         }
         if (this.streamId) {
           await this.#ongoingWrite;
           await this.ctx.runMutation(this.component.streams.abort, {
             streamId: this.streamId,
             reason: "abortSignal",
           });
         }
-      });
+        })().catch(async (e) => {
+          await this.config.onAsyncAbort(
+            e instanceof Error ? e.message : "abort listener failure",
+          );
+        });
+      });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/streaming.ts` around lines 246 - 262, The abortSignal listener
added via config.abortSignal.addEventListener currently uses an async callback
that can produce unhandled rejections from awaiting
this.#creatingStreamIdPromise, this.#ongoingWrite, or
this.ctx.runMutation(this.component.streams.abort); wrap the entire async body
in a try/catch (or dispatch the work to a caught task) so any thrown error is
caught and handled (e.g., log via processLogger or swallow if intentional) and
ensure the listener still respects this.abortController.signal.aborted and
performs the abort flow for this.abortController, this.streamId, and related
awaits inside the guarded block.
🧹 Nitpick comments (1)
src/client/streaming.ts (1)

446-453: Avoid the undefined as unknown as Uint8Array double-cast here.

Line 451 bypasses the type system and weakens safety for downstream consumers of file.uint8Array. Prefer introducing a compressed/persisted file-part type where bytes are optional/omitted, then encode that type directly instead of force-casting.

As per coding guidelines src/**/*.ts: maintain strict TypeScript type safety.
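One possible shape for that, with hypothetical names (PersistedFilePart is not an existing type in the codebase): make the bytes structurally absent on the persisted variant so no cast is needed.

```typescript
interface FilePart {
  type: "file";
  mediaType: string;
  uint8Array: Uint8Array;
}

// Persisted variant: same metadata, but bytes are genuinely absent.
type PersistedFilePart = Omit<FilePart, "uint8Array"> & {
  uint8Array?: undefined;
};

function stripBytes(part: FilePart): PersistedFilePart {
  // Copy metadata and drop the binary payload before persisting the delta,
  // instead of force-casting undefined to Uint8Array.
  const { uint8Array: _bytes, ...meta } = part;
  return meta;
}
```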

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/streaming.ts` around lines 446 - 453, The code is force-casting
file.uint8Array to Uint8Array with "undefined as unknown as Uint8Array" inside
the part.type === "file" branch (the compressed.push block); replace this unsafe
double-cast by introducing a dedicated compressed/persisted file-part shape
where the file field omits or makes uint8Array optional (e.g., PersistedFilePart
or CompressedFilePart) and construct that object explicitly in the
compressed.push call (copy other file metadata but do not include the bytes),
then adjust any encoder/consumer to accept the new type instead of relying on a
coerced Uint8Array.
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/client/approval.test.ts`:
- Around line 314-326: The type error comes from the flatMap/filter pipeline
producing a broad union that isn't narrowed by the final assertion on
approvalParts; replace the inline filter with a proper type predicate function
so TypeScript can narrow each item to the shape with type ===
"tool-approval-request" and then map to { approvalId, toolCallId } — target the
pipeline that starts from result1.savedMessages and the filter call, implement a
predicate like function isApprovalPart(p): p is { type: "tool-approval-request";
approvalId: string; toolCallId: string } and use it in .filter(isApprovalPart)
(and then .map(...) if needed) to eliminate the type mismatch for approvalParts.

In `@src/client/streamText.ts`:
- Around line 172-174: When persisting the deferred step (pendingFinalStep) in
the recovery block, wrap the await call.save({ step: pendingFinalStep }, false)
in its own try/catch so any error from call.save does not overwrite the original
caught error variable (e); if save fails, record or log that save error (e.g.,
attach as a suppressed/inner error or processLogger.error) but rethrow the
original stream error (e) so the root failure is preserved. Ensure you reference
pendingFinalStep, call.save and the original error variable (e) in the fix.


ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: fb9e969e-f421-45b4-b497-03679df48a1f

📥 Commits

Reviewing files that changed from the base of the PR and between 8994fe3 and f72df70.

📒 Files selected for processing (9)
  • docs/tool-approval.mdx
  • src/UIMessages.ts
  • src/client/approval.test.ts
  • src/client/index.ts
  • src/client/start.ts
  • src/client/streamText.ts
  • src/client/streaming.ts
  • src/mapping.test.ts
  • src/mapping.ts
🚧 Files skipped from review as they are similar to previous changes (1)
  • docs/tool-approval.mdx

- Wrap deferred step save in catch block with its own try/catch so
  a save failure doesn't mask the original stream error.
- Wrap abort signal listener body in try/catch to prevent unhandled
  promise rejections from crashing the process.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@coderabbitai coderabbitai bot left a comment

♻️ Duplicate comments (1)
src/client/approval.test.ts (1)

314-320: ⚠️ Potential issue | 🟡 Minor

Fix the TypeScript type error causing pipeline failure.

The flatMap produces a broad union type that TypeScript cannot narrow through the filter + final type assertion. Use a type predicate to properly narrow the type.

🔧 Proposed fix
     // Extract both approval IDs
     const approvalParts = result1.savedMessages
       ?.flatMap((m) =>
-        Array.isArray(m.message?.content) ? m.message.content : [],
+        Array.isArray(m.message?.content)
+          ? (m.message.content as unknown[])
+          : [],
       )
       .filter(
-        (p) => (p as { type: string }).type === "tool-approval-request",
-      ) as Array<{ approvalId: string; toolCallId: string }>;
+        (
+          p,
+        ): p is {
+          type: "tool-approval-request";
+          approvalId: string;
+          toolCallId: string;
+        } => (p as { type?: string }).type === "tool-approval-request",
+      );
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@src/client/approval.test.ts` around lines 314 - 320, The current pipeline
uses flatMap + filter with an unsafe type assertion for approvalParts causing a
TypeScript error; replace the final type assertion by adding a type predicate
function (e.g., isToolApprovalRequest) that checks p?.type ===
"tool-approval-request" and returns p is { approvalId: string; toolCallId:
string }, then pass that predicate into the filter used after flatMap (on
result1.savedMessages.flatMap(...)). Update approvalParts to be declared from
that filtered result (no cast) so TypeScript can correctly narrow the union.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 0fa66f39-c370-49a5-a45a-e264327bcb96

📥 Commits

Reviewing files that changed from the base of the PR and between 8994fe3 and c006492.

📒 Files selected for processing (9)
  • docs/tool-approval.mdx
  • src/UIMessages.ts
  • src/client/approval.test.ts
  • src/client/index.ts
  • src/client/start.ts
  • src/client/streamText.ts
  • src/client/streaming.ts
  • src/mapping.test.ts
  • src/mapping.ts
🚧 Files skipped from review as they are similar to previous changes (3)
  • src/client/streaming.ts
  • src/client/start.ts
  • src/client/streamText.ts

Contributor Author

@sethconvex sethconvex left a comment

ok, I think I've addressed the issues here. I've reviewed all the code by eye, I've tested locally, and had multiple review passes from AI agents. I think we're ready to release.

@sethconvex sethconvex requested a review from ianmacartney March 18, 2026 18:24
Use a type predicate in the filter instead of a cast to properly
narrow the approval parts type.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@src/client/approval.test.ts`:
- Around line 337-344: Rename the intentionally unused destructured variable
msgId1 to have a leading underscore to satisfy the ESLint rule; update the first
await call that uses ctx.runMutation and
anyApi["approval.test"].submitApprovalForMultiToolAgent to destructure as
messageId: _msgId1 (leaving the second destructuring to messageId: msgId2
unchanged) so the variable is clearly marked as intentionally unused.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 3134c1b1-dd8a-4d99-b912-99766227bef2

📥 Commits

Reviewing files that changed from the base of the PR and between c006492 and 57b9785.

📒 Files selected for processing (1)
  • src/client/approval.test.ts

Comment on lines +337 to +344
    const { messageId: msgId1 } = await ctx.runMutation(
      anyApi["approval.test"].submitApprovalForMultiToolAgent,
      { threadId: thread.threadId, approvalId: approvalParts[0].approvalId },
    );
    const { messageId: msgId2 } = await ctx.runMutation(
      anyApi["approval.test"].submitApprovalForMultiToolAgent,
      { threadId: thread.threadId, approvalId: approvalParts[1].approvalId },
    );

⚠️ Potential issue | 🟡 Minor

Prefix unused variable with underscore.

msgId1 is assigned but never used. Per the ESLint rule, unused variables should match /^_/u. Since both approvals are merged at write time and only the last promptMessageId is needed to continue generation, this is intentionally unused.

🔧 Proposed fix
     // Approve both tool calls
-    const { messageId: msgId1 } = await ctx.runMutation(
+    const { messageId: _msgId1 } = await ctx.runMutation(
       anyApi["approval.test"].submitApprovalForMultiToolAgent,
       { threadId: thread.threadId, approvalId: approvalParts[0].approvalId },
     );
🧰 Tools
🪛 GitHub Check: Test and lint

[warning] 337-337:
'msgId1' is assigned a value but never used. Allowed unused vars must match /^_/u


Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
let existingResponseMessage: MessageDoc | undefined;
// Limit the search to the most recent messages. Approvals should always
// be near the end of the thread — scanning further is wasteful.
const MAX_MESSAGES_TO_SCAN = 200;
Member


nit: let's make this 100 and just fetch a single page instead of the do/while
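The nit above — fetch one page of 100 rather than loop with do/while — can be sketched like this. The in-memory message list and `latestPage` helper are hypothetical stand-ins for the real paginated query:

```typescript
// Hypothetical message store; the real code would query the database.
type Message = { id: number; hasApproval: boolean };

const MESSAGES: Message[] = Array.from({ length: 250 }, (_, i) => ({
  id: i,
  hasApproval: i === 240, // one approval near the end of the thread
}));

// Single-page fetch of the most recent N messages, replacing a do/while
// loop that pages backward until a match or a scan limit is hit.
function latestPage(pageSize: number): Message[] {
  return MESSAGES.slice(-pageSize);
}

const MAX_MESSAGES_TO_SCAN = 100;
const page = latestPage(MAX_MESSAGES_TO_SCAN);
// Approvals should always be near the end, so one page suffices.
const existing = page.find((m) => m.hasApproval);
console.log(existing?.id); // → 240
```

If an approval ever sat further back than one page, this would miss it — which is exactly the trade-off the reviewer is accepting by capping the scan.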

});
}
} catch (e) {
console.error("Error during stream abort cleanup:", e);
Member


is it ok to swallow the error here?
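One way to answer that question is to narrow the catch: swallow only errors that are expected during abort teardown and rethrow the rest. This is a sketch, not the PR's actual code — the `TransactionTornDown` error class is a hypothetical marker for the expected case:

```typescript
// Hypothetical error type marking the expected teardown failure.
class TransactionTornDown extends Error {}

function cleanup(fail: "expected" | "unexpected" | "none"): void {
  if (fail === "expected") throw new TransactionTornDown("stream already gone");
  if (fail === "unexpected") throw new Error("bug in cleanup");
}

function abortWithCleanup(fail: "expected" | "unexpected" | "none"): string {
  try {
    cleanup(fail);
    return "ok";
  } catch (e) {
    if (e instanceof TransactionTornDown) {
      // Expected during abort: log and continue tearing down.
      console.error("Error during stream abort cleanup:", e);
      return "swallowed";
    }
    throw e; // Unexpected errors still surface to the caller.
  }
}

console.log(abortWithCleanup("expected")); // → swallowed
```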

  // Skip finish if it will be handled externally (atomically with message save)
- if (!this.#finishedExternally) {
+ // or if the stream was aborted (e.g., due to a failed delta write)
+ if (!this.#finishedExternally && !this.abortController.signal.aborted) {
Member


to confirm - calling streams.abort means we don't need to call this.finish for aborted streams?

Comment on lines +387 to +389
if (this.abortController.signal.aborted) {
  return;
}
Member


nit: this isn't necessary, since sendDelta does this as the first line

@sethconvex sethconvex merged commit 853a2b7 into main Mar 19, 2026
2 checks passed

3 participants