Add protobuf-ts interceptor support and implementation plan for #101 #603

soc221b wants to merge 6 commits into codex-protobuf-ts
Conversation
Pull request overview
This PR adds first-class protobuf-ts (gRPC-Web transport) support to the gRPC Devtools browser extension, addressing issue #101. It introduces a new interceptor that captures unary and server-streaming RPC calls and forwards them to the existing devtools message pipeline, along with documentation and usage examples.
Changes:
- New `protobuf-ts` interceptor implementation in `content-scripts/src/main/protobuf-ts.ts` with `interceptUnary` and `interceptServerStreaming` methods, plus metadata normalization.
- Exposed `protobufTsInterceptor` on the injected `window.__gRPC_devtools__` global and added README documentation and example snippets.
- Added an implementation plan document at `docs/issue-101-protobuf-ts-plan.md`.
Reviewed changes
Copilot reviewed 7 out of 7 changed files in this pull request and generated 3 comments.
| File | Description |
|---|---|
| `content-scripts/src/main/protobuf-ts.ts` | New interceptor implementation with unary and server-streaming support |
| `content-scripts/src/main/index.ts` | Imports and exposes the new interceptor on the injected global |
| `README.md` | Documents protobuf-ts usage with code snippets |
| `example/protobuf-ts/client/src/grpc-devtools.ts` | Example interceptor wiring snippet |
| `example/protobuf-ts/client/src/transport.ts` | Example transport configuration snippet |
| `example/protobuf-ts/README.md` | Example directory documentation |
| `docs/issue-101-protobuf-ts-plan.md` | Implementation and rollout plan for issue #101 |
```ts
call.then(
  (finishedUnaryCall) => {
    postMessageToContentScript({
      id,
      responseMetadata: toMetadataRecord(finishedUnaryCall.headers),
      responseMessage: finishedUnaryCall.response,
    });

    postMessageToContentScript({
      id,
      responseMessage: "EOF",
    });

    return finishedUnaryCall;
  },
  (error: RpcError) => {
    postMessageToContentScript({
      id,
      responseMessage: toSerializableError(error),
      errorMetadata: toMetadataRecord(error.meta),
    });
    throw error;
  },
);
```
The .then(onFulfilled, onRejected) call on line 108 creates a new promise, but its result is never stored or chained. When the onRejected handler on line 129 throws, the resulting rejected promise is unhandled, which will trigger an UnhandledPromiseRejection warning in the browser console.
Compare this to the gRPC-Web unary interceptor in grpc.ts (line 26), which returns the chained promise so the rejection is properly propagated.
Either remove the throw error in the rejection handler (just let the original call promise propagate the error to the caller), or add a .catch(() => {}) to suppress the dangling rejection from the forked promise chain.
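One way to apply the second option can be sketched as follows. Everything here is illustrative: `forwardToDevtools` and the in-memory `postMessageToContentScript` are hypothetical stand-ins for the PR's actual helpers. The devtools side-effect chain is forked off the call, a no-op `.catch()` suppresses the fork's rejection, and the original promise is returned so the caller still observes the failure.

```typescript
type DevtoolsMessage = { id: number; responseMessage: unknown };

const messages: DevtoolsMessage[] = [];

// Hypothetical stand-in for the PR's postMessageToContentScript helper.
function postMessageToContentScript(msg: DevtoolsMessage): void {
  messages.push(msg);
}

// Fork a devtools side-effect chain off `call`, but return the ORIGINAL
// promise so the caller owns the rejection. The trailing no-op .catch()
// suppresses the otherwise-unhandled rejection of the forked chain.
function forwardToDevtools<T>(call: Promise<T>, id: number): Promise<T> {
  call
    .then(
      (response) => {
        postMessageToContentScript({ id, responseMessage: response });
      },
      (error: Error) => {
        postMessageToContentScript({ id, responseMessage: error.message });
        throw error;
      },
    )
    .catch(() => {
      // Intentionally empty: the caller handles errors on the original call.
    });
  return call;
}

// The caller still sees the failure on the original promise, and no
// unhandled-rejection warning is emitted for the forked chain.
forwardToDevtools(Promise.reject(new Error("boom")), 1).catch((e: Error) => {
  console.log(e.message, messages[0].responseMessage); // both "boom"
});
```

Returning the original `call` (rather than the `.then()` result) keeps the interceptor transparent to the caller while still letting devtools observe both outcomes.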
```ts
call.responses.onError((error) => {
  postMessageToContentScript({
    id,
    responseMessage: toSerializableError(error),
    errorMetadata: toMetadataRecord(error.meta),
  });

  throw error;
```
throw error inside the onError callback will cause an uncaught exception. Unlike the unary case where the throw is inside a .then() rejection handler (which creates a rejected promise), here the throw is inside a plain callback invoked by the stream's internal event machinery. There's no try-catch or promise context to catch it, so it becomes an unhandled runtime error.
Compare to the gRPC-Web stream interceptor in grpc.ts (lines 70-82), which only posts the error message to the devtools pipeline but does not re-throw. The throw error on this line should be removed.
```ts
      value,
    ],
  ) => {
    acc[key] = Array.isArray(value) ? String(value[0] ?? "") : String(value);
```
When metadata values are arrays with multiple elements, only the first element is kept and the rest are silently dropped. This is a lossy conversion that could hide important metadata from the devtools display. The standard HTTP convention for multi-valued headers is to join them with ", " (comma-space). Consider using value.join(", ") instead of String(value[0] ?? "") so all metadata values are preserved in the devtools output.
Pull request overview
Copilot reviewed 10 out of 11 changed files in this pull request and generated 3 comments.
```ts
postMessageToContentScript({
  id,
  responseMessage: toSerializableError(error),
  errorMetadata: toMetadataRecord(error.meta),
```
The onError callback in protobuf-ts's RpcOutputStream is typed as (error: Error) => void, but here error is passed to toSerializableError (which expects RpcError) and error.meta is accessed (which doesn't exist on Error). At runtime, the actual errors in gRPC calls are indeed RpcError instances so this works in practice, but the type mismatch means TypeScript won't catch misuses. The error parameter should be cast to RpcError (e.g., as RpcError) to make the type intent explicit and ensure error.code and error.meta accesses are type-safe.
Suggested change:

```diff
-postMessageToContentScript({
-  id,
-  responseMessage: toSerializableError(error),
-  errorMetadata: toMetadataRecord(error.meta),
+const rpcError = error as RpcError;
+postMessageToContentScript({
+  id,
+  responseMessage: toSerializableError(rpcError),
+  errorMetadata: toMetadataRecord(rpcError.meta),
```
```ts
const fs = require("fs");
const path = require("path");

test("protobuf-ts interceptor is exposed in injected globals", () => {
  const indexPath = path.resolve(__dirname, "..", "content-scripts", "src", "main", "index.ts");
  const content = fs.readFileSync(indexPath, "utf-8");

  expect(content).toContain("protobufTsInterceptor");
  expect(content).toContain("value: protobufTsInterceptor");
});

test("protobuf-ts interceptor handles unary and server streaming flows", () => {
  const filePath = path.resolve(
    __dirname,
    "..",
    "content-scripts",
    "src",
    "main",
    "protobuf-ts.ts",
  );
  const content = fs.readFileSync(filePath, "utf-8");

  expect(content).toContain("interceptUnary");
  expect(content).toContain("interceptServerStreaming");
  expect(content).toContain('responseMessage: "EOF"');
  expect(content).toContain("toMetadataRecord(call.requestHeaders)");
  expect(content).toContain("responseMetadata");
});
```
These tests only assert that certain strings exist in the source file text, rather than actually testing the interceptor's behavior (e.g., that postMessageToContentScript is called with the expected arguments for unary/streaming flows, error handling, and metadata normalization). This provides no real test coverage — any refactoring that preserves behavior but changes variable names or code structure would break these tests, while bugs in the actual logic would go undetected.
The existing __tests__/grpc-web.ts test was a special case: it inspected the minified internals of a third-party library to ensure compatibility. The new interceptor is first-party code that could be tested with mocks and assertions on postMessageToContentScript calls.
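For illustration only, a behavior-level test might exercise a seam like the one below. Everything here is hypothetical: `makeUnaryHook` stands in for the PR's unary flow with the devtools sink injected as a parameter, so a test can assert on posted messages instead of source-file strings.

```typescript
type Sink = (msg: { id: number; responseMessage: unknown }) => void;

// Hypothetical refactor of the unary flow with the devtools sink injected,
// so tests can observe behavior rather than grep the source text.
function makeUnaryHook(sink: Sink) {
  return async <T>(id: number, call: Promise<T>): Promise<T> => {
    const response = await call;
    sink({ id, responseMessage: response }); // the response payload
    sink({ id, responseMessage: "EOF" }); // the end-of-call marker
    return response;
  };
}

// Behavior assertion: the sink sees the response followed by "EOF".
const posted: unknown[] = [];
makeUnaryHook((m) => posted.push(m.responseMessage))(1, Promise.resolve({ ok: true }))
  .then(() => console.log(posted)); // [ { ok: true }, 'EOF' ]
```

Tests written against such a seam survive renames and restructuring, and fail when the actual message ordering or payloads regress.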
```ts
      responseMessage: toSerializableError(error),
      errorMetadata: toMetadataRecord(error.meta),
    });
    throw error;
```
The call.then(onFulfilled, onRejected) on line 54 creates a new promise whose return value is discarded (the original call is returned on line 79 instead). When the call fails, the onRejected handler at line 69 re-throws the error, which produces an unhandled promise rejection from this discarded promise — even though the caller may correctly handle the error on the original call.
This differs from the grpc.ts unary interceptor, which returns the promise chain directly (return invoker(request).then(...).catch(...) at grpc.ts:26), so the caller handles the final rejection.
To fix this, either remove throw error at line 75 (since the original call already propagates the rejection to the caller), or attach a no-op .catch() to the new promise returned by call.then(...).
```ts
throw error;
```
Pull request overview
Copilot reviewed 10 out of 11 changed files in this pull request and generated 3 comments.
```ts
return Object.entries(metadata).reduce<Record<string, string>>(
  (
    acc,
    [
      key,
      value,
    ],
  ) => {
    acc[key] = Array.isArray(value) ? String(value[0] ?? "") : String(value);
    return acc;
  },
  {},
);
```
toMetadataRecord() drops all but the first value when a metadata entry is a string[] (it uses value[0]). This can silently lose header/trailer values (e.g., repeated metadata keys). Consider joining all values into a single string (e.g., comma-joined) so normalization to Record<string, string> preserves information instead of truncating it.
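A lossless version of the normalization could look like this (a sketch reusing the PR's `toMetadataRecord` name; `RpcMetadata` here is simply the `Record<string, string | string[]>` shape this PR describes):

```typescript
type RpcMetadata = Record<string, string | string[]>;

// Join multi-valued metadata entries with ", " (the HTTP convention for
// repeated headers) instead of keeping only the first element.
function toMetadataRecord(metadata: RpcMetadata): Record<string, string> {
  return Object.entries(metadata).reduce<Record<string, string>>((acc, [key, value]) => {
    acc[key] = Array.isArray(value) ? value.join(", ") : String(value);
    return acc;
  }, {});
}

console.log(toMetadataRecord({ "x-trace": ["a", "b"], "content-type": "application/grpc-web" }));
// { 'x-trace': 'a, b', 'content-type': 'application/grpc-web' }
```

The comma-space join mirrors how user agents combine repeated header fields, so no metadata value is silently dropped from the Devtools display.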
```json
  },
  "devDependencies": {
    "@connectrpc/connect": "2.1.0",
    "@protobuf-ts/runtime-rpc": "^2.11.1",
```
Version ranges are pinned to exact versions elsewhere in this file, but @protobuf-ts/runtime-rpc is added with a caret range (^2.11.1). This can lead to non-reproducible installs and unexpected lockfile churn; consider pinning it to an exact version (e.g., 2.11.1) to match the existing dependency style.
Suggested change:

```diff
-    "@protobuf-ts/runtime-rpc": "^2.11.1",
+    "@protobuf-ts/runtime-rpc": "2.11.1",
```
```ts
import { interceptors } from "./grpc-devtools";

const transport = new GrpcWebFetchTransport({
  baseUrl: "http://localhost:8080",
```
The protobuf-ts snippet uses baseUrl: "http://localhost:8080", but the new protobuf-ts example (and existing Connect-ES docs) use http://localhost:3003 for the backend. This mismatch can cause copy/paste setups to fail; consider aligning the README snippet with the example/baseUrl used elsewhere.
Suggested change:

```diff
-  baseUrl: "http://localhost:8080",
+  baseUrl: "http://localhost:3003",
```
Pull request overview
Copilot reviewed 10 out of 11 changed files in this pull request and generated 3 comments.
```ts
void Promise.allSettled([
  call.headers,
  call.trailers,
]).then(
  ([
    headersResult,
    trailersResult,
  ]) => {
    responseMetadata = mergeMetadata(
      headersResult.status === "fulfilled" ? headersResult.value : undefined,
      trailersResult.status === "fulfilled" ? trailersResult.value : undefined,
    );
  },
);
```
responseMetadata is populated via Promise.allSettled([call.headers, call.trailers]), which will only resolve after call.trailers resolves (typically at end-of-stream). As a result, streamed onMessage events will almost always be posted with responseMetadata still undefined, so Devtools won't show response headers for the stream messages. Consider setting metadata from call.headers as soon as it resolves, and then merging in trailers on completion (or posting a separate metadata-only message when headers arrive).
Suggested change:

```diff
-void Promise.allSettled([
-  call.headers,
-  call.trailers,
-]).then(
-  ([
-    headersResult,
-    trailersResult,
-  ]) => {
-    responseMetadata = mergeMetadata(
-      headersResult.status === "fulfilled" ? headersResult.value : undefined,
-      trailersResult.status === "fulfilled" ? trailersResult.value : undefined,
-    );
-  },
-);
+let responseHeaders: RpcMetadata | undefined;
+// Populate metadata as soon as headers are available, then merge in trailers later.
+void call.headers
+  .then((headers) => {
+    responseHeaders = headers;
+    responseMetadata = mergeMetadata(headers, undefined);
+  })
+  .catch(() => {
+    // Ignore header errors here; they will surface via onError.
+  });
+void call.trailers
+  .then((trailers) => {
+    responseMetadata = mergeMetadata(responseHeaders, trailers);
+  })
+  .catch(() => {
+    // Ignore trailer errors here; they will surface via onError.
+  });
```
```ts
  methodName: method.name,
  serviceName: method.service.typeName,
  requestMessage: call.request,
  requestMetadata: toMetadataRecord(call.requestHeaders),
});
```
The interceptor forwards call.request / finishedUnaryCall.response directly. For protobuf-ts-style message objects, this often includes internal fields like $typeName / $unknown (similar to what connect-es already strips before posting). Consider normalizing/sanitizing request/response message objects before sending them to postMessageToContentScript to keep the Devtools payload format consistent and avoid leaking internal bookkeeping fields.
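A minimal sanitizer along those lines might look like the sketch below. This is not the Connect-ES implementation, and the `$`-prefixed field names are assumptions about the runtime's bookkeeping keys:

```typescript
// Recursively drop keys beginning with "$" (e.g. $typeName, $unknown —
// assumed bookkeeping field names) before posting a payload to devtools.
function sanitizeMessage(message: unknown): unknown {
  if (Array.isArray(message)) {
    return message.map(sanitizeMessage);
  }
  if (message === null || typeof message !== "object") {
    return message;
  }
  return Object.fromEntries(
    Object.entries(message as Record<string, unknown>)
      .filter(([key]) => !key.startsWith("$"))
      .map(([key, value]) => [key, sanitizeMessage(value)]),
  );
}

console.log(sanitizeMessage({ $typeName: "pkg.Msg", name: "a", nested: { $unknown: [], n: 1 } }));
// { name: 'a', nested: { n: 1 } }
```

Applying the same normalization to both unary responses and streamed messages would keep the Devtools payload shape consistent across the interceptors.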
```ts
call.responses.onMessage((message) => {
  postMessageToContentScript({
    id,
    responseMetadata,
    responseMessage: message,
  });
```
Streamed responses are posted with responseMessage: message without any normalization. If protobuf-ts messages include internal fields (e.g. $typeName, $unknown) like other protobuf runtimes, these will show up in Devtools and diverge from the sanitized shapes used by the Connect-ES interceptor. Consider applying the same message sanitization/normalization here before posting.
Pull request overview
Copilot reviewed 10 out of 11 changed files in this pull request and generated 1 comment.
```json
  },
  "devDependencies": {
    "@connectrpc/connect": "2.1.0",
    "@protobuf-ts/runtime-rpc": "^2.11.1",
```
Motivation
- Add first-class `protobuf-ts` (gRPC-Web transport) support so applications using `@protobuf-ts/grpcweb-transport` can plug into Devtools without custom user code.
- Align `protobuf-ts` metadata and message shapes with the existing Devtools payload format so messages, metadata, EOF markers and errors are displayed consistently.

Description
- Added a new interceptor at `content-scripts/src/main/protobuf-ts.ts` that implements `interceptUnary` and `interceptServerStreaming` to forward requests, responses, stream messages, EOF and errors to the existing Devtools message pipeline, and normalizes metadata (`Record<string, string | string[]>` → `Record<string, string>`).
- Exposed `window.__gRPC_devtools__.protobufTsInterceptor` by updating `content-scripts/src/main/index.ts` so users can reference it the same way as existing integrations.
- Added a `grpc-devtools.ts` snippet and a `GrpcWebFetchTransport` wiring example in `README.md` under a new "protobuf-ts" section.
- Added an implementation plan at `docs/issue-101-protobuf-ts-plan.md` describing goals, delivery steps and validation.

Testing
- Ran `npm run build`, which completed successfully (webpack compiled production bundles; there were performance size warnings but no build failures). ✅
- Ran `npm run test --workspaces --if-present`, and all test suites passed (4 suites, 60 tests). ✅

Codex Task