L2ps implementation #514
base: l2ps_simplified
Conversation
- Import UDIdentityAssignPayload and UDIdentityManager
- Add 'ud_identity_assign' case routing to UDIdentityManager.verifyPayload()
- Add 'ud_identity_remove' to identity removal cases
- Follows signature-based verification pattern like XM
- Create UDIdentityManager class following XM signature-based pattern
- Implement resolveUDDomain() with UNS/CNS registry fallback
- Implement verifyPayload() with signature and ownership verification
- Add helper methods for getting UD identities
- Use ethers.js for free Ethereum contract reads
- Full error handling and logging
- Add SavedUdIdentity interface to IdentityTypes.ts
- Update StoredIdentities type to include ud: SavedUdIdentity[]
- Add ud: [] to default identities initialization in handleGCR.ts
- Update UDIdentityManager with proper type imports
- Database will auto-update via JSONB (no migration needed)
- Update SavedUdIdentity interface with network field
- Add Polygon L2 UNS registry: 0xa9a6A3626993D487d2Dbda3173cf58cA1a9D9e9f
- Rewrite resolveUDDomain() with 3-tier fallback strategy:
  1. Polygon L2 UNS (primary - most new domains, cheaper gas)
  2. Ethereum L1 UNS (legacy domains)
  3. Ethereum L1 CNS (oldest legacy domains)
- Add network verification in verifyPayload()
- Add logging for fallback chain attempts
- Per UD docs: check Polygon first for better UX
- Added Base L2 (0xF6c1b83977DE3dEffC476f5048A0a84d3375d498) registry support
- Added Sonic (0xDe1DAdcF11a7447C3D093e97FdbD513f488cE3b4) registry support
- Updated resolution strategy: Polygon → Base → Sonic → Ethereum UNS → Ethereum CNS (sketched below)
- Extended network type to include "base" and "sonic" in UD identity types
- Maintained backward compatibility with existing Polygon and Ethereum domains

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
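A minimal sketch of what such a cascading registry lookup can look like with ethers.js, assuming ERC-721-style `ownerOf()` registries. The RPC endpoints, the registry table, and the token-ID derivation (UD's namehash) are illustrative assumptions, not the PR's actual implementation:

```ts
import { ethers } from "ethers"

// Hypothetical registry table mirroring the resolution order described above.
// RPC URLs are placeholders; only the first two registry addresses come from the commits.
const REGISTRIES = [
    { network: "polygon", type: "UNS", address: "0xa9a6A3626993D487d2Dbda3173cf58cA1a9D9e9f", rpc: "https://polygon-rpc.com" },
    { network: "base", type: "UNS", address: "0xF6c1b83977DE3dEffC476f5048A0a84d3375d498", rpc: "https://mainnet.base.org" },
    // ... sonic, ethereum L1 UNS, ethereum L1 CNS in fallback order
]

const REGISTRY_ABI = ["function ownerOf(uint256 tokenId) view returns (address)"]

// tokenId is assumed to be the UD namehash of the domain, computed elsewhere
async function resolveUDDomain(tokenId: bigint) {
    for (const reg of REGISTRIES) {
        try {
            const provider = new ethers.JsonRpcProvider(reg.rpc)
            const registry = new ethers.Contract(reg.address, REGISTRY_ABI, provider)
            const owner: string = await registry.ownerOf(tokenId)
            if (owner && owner !== ethers.ZeroAddress) {
                return { owner, network: reg.network, registryType: reg.type }
            }
        } catch {
            // Domain not on this registry (or RPC error): fall through to the next tier
        }
    }
    return null // not found on any network
}
```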
Add signatureDetector utility for detecting signature types from address formats, supporting multi-address UD identity verification.

Features:
- detectSignatureType(): Auto-detect EVM vs Solana from address format
- validateAddressType(): Validate address matches expected type
- isSignableAddress(): Check if address is in recognized format

Pattern matching:
- EVM: 0x-prefixed 40 hex characters (secp256k1)
- Solana: Base58-encoded 32-44 characters (ed25519)

Includes comprehensive test coverage in local_tests/. Part of UD multi-chain identity refactor supporting multi-address signing (users can sign with any address in domain records).

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
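A minimal sketch of the pattern matching the commit describes. The function name mirrors the commit's `detectSignatureType()`, but the exact regexes are assumptions derived from the stated rules (0x + 40 hex for EVM, base58 32-44 chars for Solana):

```ts
type SignatureType = "evm" | "solana"

// EVM: 0x-prefixed, exactly 40 hex characters.
// Solana: base58 alphabet (no 0, O, I, l), 32-44 characters.
const EVM_ADDRESS = /^0x[0-9a-fA-F]{40}$/
const SOLANA_ADDRESS = /^[1-9A-HJ-NP-Za-km-z]{32,44}$/

function detectSignatureType(address: string): SignatureType | null {
    if (typeof address !== "string" || address.length === 0) return null
    const trimmed = address.trim()
    if (EVM_ADDRESS.test(trimmed)) return "evm"
    if (SOLANA_ADDRESS.test(trimmed)) return "solana"
    return null // unrecognized format
}
```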
Extend EVM domain resolution to fetch ALL domain records, not just the owner address. This enables multi-address verification where users can sign with any address in their domain records.

Changes:
- Updated resolveUDDomain() return type to EVMDomainResolution
- Added resolver ABI with get() method for record fetching
- Defined UD_RECORD_KEYS array (8 common crypto address records)
- Created fetchDomainRecords() helper for batch record retrieval
- Created extractSignableAddresses() helper with auto-detection
- Applied record fetching to all 5 networks:
  * Polygon L2 UNS
  * Base L2 UNS
  * Sonic UNS
  * Ethereum L1 UNS
  * Ethereum L1 CNS

Technical details:
- Resolver may differ from registry (fetched via resolverOf())
- Records include: crypto.ETH/SOL/BTC/MATIC.address, token.EVM/SOL.*
- Signature type auto-detected via signatureDetector utility
- Null/empty records handled gracefully

Next phase: Integrate Solana resolution with EVM fallback

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
…tion

Phase 3: Solana Resolution Integration
- Add Solana fallback after EVM cascade in resolveUDDomain()
- Migrate to UnifiedDomainResolution format (EVM + Solana)
- Add evmToUnified() and solanaToUnified() conversion helpers
- Support .demos domains via SolanaDomainResolver
- Return authorizedAddresses array from all domain records

Phase 4: Multi-Signature Verification
- Rewrite verifyPayload() for multi-address authorization
- Add verifySignature() helper supporting EVM + Solana
- Install tweetnacl & bs58 for Solana signature verification
- Users can now sign with ANY address in domain records
- Support mixed signature types (EVM + Solana in same domain)

Technical:
- EVM verification: ethers.verifyMessage()
- Solana verification: nacl.sign.detached.verify() with base58
- Enhanced error messages listing authorized addresses
- Detailed logging of signing address and signature type

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
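A sketch of what a dual-path `verifySignature()` helper can look like with the libraries named above (ethers, tweetnacl, bs58). The message/signature encodings are assumptions; the actual node code may differ:

```ts
import { ethers } from "ethers"
import nacl from "tweetnacl"
import bs58 from "bs58"

async function verifySignature(
    message: string,
    signature: string,
    address: string,
    signatureType: "evm" | "solana",
): Promise<boolean> {
    if (signatureType === "evm") {
        // ethers recovers the signing address from an EIP-191 personal_sign signature
        const recovered = ethers.verifyMessage(message, signature)
        return recovered.toLowerCase() === address.toLowerCase()
    }
    // Solana: ed25519 detached signature; the address is the base58-encoded public key.
    // Assumes the signature is also transported as base58.
    return nacl.sign.detached.verify(
        new TextEncoder().encode(message),
        bs58.decode(signature),
        bs58.decode(address),
    )
}
```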
Breaking changes to SavedUdIdentity interface:
- resolvedAddress → signingAddress (address that signed challenge)
- Added signatureType field ("evm" | "solana")
- Added "solana" to network options
Updated GCRIdentityRoutines to handle new signature types and validate
multi-address verification. Users can now sign with ANY address from
their domain records, not just the owner address.
This phase completes the node-side changes for multi-chain UD support.
Phase 6 (SDK updates) is in progress in ../sdks repository.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Security Fixes:
1. Challenge validation: Replace unsafe substring matching with regex-based exact match validation to prevent public key spoofing attacks
2. Input validation: Add comprehensive validation for Solana domain resolution methods (label, tld, recordKeys) with early-return pattern
3. Type safety: Fix unsafe indexing in validateUint8Array with proper type narrowing and runtime validation
4. BigInt conversion: Validate recordVersion is non-negative integer before conversion to prevent TypeError

Code Quality Improvements:
1. Major refactoring: Extract duplicated EVM network resolution logic into reusable tryEvmNetwork() helper method, reducing code by 73 lines
2. Security documentation: Add comprehensive rationale for network/registry mismatch handling with clear threat model explanation
3. Error messages: Improve clarity and consistency across all validation errors
4. Performance: Early validation prevents unnecessary async operations

Changes:
- udIdentityManager.ts: 331 lines changed (129+, 202-)
  * Refactored 5 nested try/catch blocks into clean loop
  * Added tryEvmNetwork() helper method
  * Fixed challenge validation vulnerability
  * Added security documentation for mismatch handling
  * Updated all resolvedAddress → signingAddress references
- udSolanaResolverHelper.ts: 66 lines changed (53+, 13-)
  * Added input validation for resolveRecord() method
  * Added comprehensive validation for resolve() method
  * Added recordVersion validation before BigInt conversion
  * Handles empty recordKeys array correctly
- validateUint8Array.ts: 16 lines changed (11+, 5-)
  * Fixed unsafe indexing into unknown type
  * Added proper type narrowing with validation
  * Ensures all values are numbers before conversion

All changes are ESLint compliant and TypeScript safe.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
Add comprehensive points system for Unstoppable Domains integration:
- Award 3 points for .demos TLD domains
- Award 1 point for other UD domains
- Implement awardUdDomainPoints() with duplicate detection and referral support
- Implement deductUdDomainPoints() with domain-specific point tracking
- Extend GCR entity breakdown with udDomains field
- Add telegram field to socialAccounts breakdown
- Update UserPoints interface to match GCR structure

Resolves missing UD points methods in IncentiveManager hooks.

🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
SECURITY: Prevent point farming by normalizing all UD domain names to lowercase before comparison, lookup, and storage.

This ensures:
- Same domain with different cases (MyDomain.demos vs mydomain.demos) cannot be linked multiple times to farm points
- Domain lookups always work regardless of input case
- Behavior matches DNS standard (domains are case-insensitive)

Changes:
- awardUdDomainPoints(): Normalize domain at start, use normalizedDomain for TLD check, duplicate detection, and storage
- deductUdDomainPoints(): Normalize domain at start, use normalizedDomain for TLD check, lookup, and deduction

Fixes vulnerability identified in code review where case-variant domains could bypass duplicate detection and allow unlimited point farming.
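A simplified sketch of the normalized award path. The `UserPoints` shape is a hypothetical stand-in (the real breakdown lives in the GCR entity) and referral handling is omitted; only the normalization, TLD check, and duplicate detection from the commit are illustrated:

```ts
// Hypothetical shape; the real structure is the GCR entity breakdown.
interface UserPoints {
    total: number
    udDomains?: { [domain: string]: number }
}

function awardUdDomainPoints(points: UserPoints, domain: string): UserPoints {
    // Normalize first so MyDomain.demos and mydomain.demos are the same key
    const normalized = domain.toLowerCase()
    const earned = normalized.endsWith(".demos") ? 3 : 1
    const udDomains = points.udDomains ?? {}
    if (udDomains[normalized] !== undefined) {
        return points // duplicate: already awarded for this domain
    }
    return {
        ...points,
        total: points.total + earned,
        udDomains: { ...udDomains, [normalized]: earned },
    }
}
```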
SECURITY VULNERABILITY FIX: Prevent authentication bypass via case manipulation

Issue: Previous code applied toLowerCase() to ALL addresses, including Solana addresses, which are case-sensitive (base58 encoding). This allowed attackers to bypass authorization by changing the case of a valid Solana address.

Example Attack:
- Domain authorized for: 8VqZ8cqQ8h9FqF7cXNx5bXKqNz9V8F7h9FqF7cXNx5b
- Attacker uses: 8vqz8cqq8h9fqf7cxnx5bxkqnz9v8f7h9fqf7cxnx5b
- Previous code: ACCEPTED (case-insensitive match)
- Fixed code: REJECTED (case-sensitive for Solana)

Fix:
- Conditional address comparison based on signature type
- Solana addresses: Exact case-sensitive match (base58 standard)
- EVM addresses: Case-insensitive match (EVM standard)

Security Impact: HIGH - Closes critical authentication bypass vulnerability
Affected: All Solana UD domain verifications in production
Reviewer Concern kynesyslabs#4
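The fix reduces to a small comparison helper along these lines (a sketch, not the PR's exact code):

```ts
function addressesMatch(a: string, b: string, signatureType: "evm" | "solana"): boolean {
    if (signatureType === "solana") {
        // Base58 is case-sensitive: two strings differing only in case are different keys
        return a === b
    }
    // EVM hex addresses are case-insensitive (EIP-55 checksum casing is display-only)
    return a.toLowerCase() === b.toLowerCase()
}
```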
Issue: Function called regex.test() on address parameter without validating it is a string, causing TypeError when called with null/undefined/non-string.

Fix:
- Add early type guard checking typeof address === 'string'
- Return null for non-string or empty inputs
- Trim whitespace before regex matching for consistency

Security Impact: Prevents runtime crashes from malformed inputs
Error Handling: Gracefully returns null instead of throwing TypeError
Reviewer Concern kynesyslabs#2
Issue: evmToUnified() hardcoded registryType as 'UNS', mislabeling CNS domains and losing registry type information from the resolver.

Fix:
- Update evmToUnified() signature to accept registryType parameter
- Pass registryType from tryEvmNetwork() to evmToUnified()
- Correctly label CNS domains as CNS instead of UNS

Data Integrity: Ensures accurate domain metadata
Client Impact: Enables correct filtering and display based on registry type
Reviewer Concern kynesyslabs#5
Issue: New required fields (udDomains, telegram) break loading historical
JSONB records that don't have these fields, causing TypeORM/PostgreSQL
errors when loading existing user data.
Fix:
- Make udDomains optional: udDomains?: { [domain: string]: number }
- Make telegram optional: telegram?: number
- Ensures backward compatibility with pre-existing records
- All usage sites already handle undefined with safe access patterns
Database Impact: Prevents data loading failures in production
Migration: No migration needed - field addition is non-breaking
Reviewer Concerns kynesyslabs#9 & kynesyslabs#10
Issue: Only checked field presence, not valid values. Invalid enum values or timestamps could cause downstream errors with unclear messages.

Fix:
- Validate signatureType is 'evm' or 'solana'
- Validate network is one of: polygon, base, sonic, ethereum, solana
- Validate registryType is 'UNS' or 'CNS'
- Validate timestamp is valid number or parseable date (>0)
- Provide clear, actionable error messages for each validation failure

Data Integrity: Prevents invalid data from entering system
Developer Experience: Clear error messages aid debugging
Error Prevention: Catches issues before database write
Reviewer Concern #1
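A sketch of value-level validation along the lines described. Field names follow the commit messages above; the error strings and the return-a-string-or-null convention are illustrative:

```ts
const VALID_SIGNATURE_TYPES = new Set(["evm", "solana"])
const VALID_NETWORKS = new Set(["polygon", "base", "sonic", "ethereum", "solana"])
const VALID_REGISTRY_TYPES = new Set(["UNS", "CNS"])

// Returns an error message, or null if the identity passes all value checks
function validateUdIdentity(identity: Record<string, unknown>): string | null {
    if (!VALID_SIGNATURE_TYPES.has(identity.signatureType as string)) {
        return `Invalid signatureType: ${identity.signatureType} (expected 'evm' or 'solana')`
    }
    if (!VALID_NETWORKS.has(identity.network as string)) {
        return `Invalid network: ${identity.network} (expected polygon, base, sonic, ethereum, or solana)`
    }
    if (!VALID_REGISTRY_TYPES.has(identity.registryType as string)) {
        return `Invalid registryType: ${identity.registryType} (expected 'UNS' or 'CNS')`
    }
    const ts = new Date(identity.timestamp as string | number).getTime()
    if (!Number.isFinite(ts) || ts <= 0) {
        return `Invalid timestamp: ${identity.timestamp}`
    }
    return null
}
```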
Actionable comments posted: 4
♻️ Duplicate comments (1)
src/libs/l2ps/L2PSConsensus.ts (1)
377-425: Using `getProofs("", "applied", ...)` still filters by empty UID, breaking rollback and stats

`rollbackProofsForBlock` and `getBlockStats` both call:

```ts
const appliedProofs = await L2PSProofManager.getProofs("", "applied", ...)
```

Passing `""` does not mean "all networks"; it filters for `l2ps_uid === ""`. Unless you actually have an L2PS network with an empty UID, both methods will see zero proofs:

- No proofs rolled back when a block fails.
- Block stats always report zero applied proofs/edits.

Please switch to a helper that explicitly means "all applied proofs for a given block", e.g.:

```ts
const proofsToRollback = await L2PSProofManager.getAppliedProofsForBlock(blockNumber, 1000)
```

and likewise in `getBlockStats`:

```ts
const blockProofs = await L2PSProofManager.getAppliedProofsForBlock(blockNumber, 10000)
```

(or whatever helper you've implemented on `L2PSProofManager` to cover this case).

This is a functional correctness issue in consensus rollback and observability.
Also applies to: 439-451
🧹 Nitpick comments (3)
src/libs/network/routines/transactions/handleL2PS.ts (1)
25-29: Consider optional chaining for nested structure checks

The nested guards on `l2psTx.content.data[1].l2ps_uid` and `decryptedTx.content.from` are correct but a bit noisy. You could simplify with optional chaining for readability, e.g.:

```ts
if (!l2psTx.content?.data?.[1]?.l2ps_uid) {
    return "Invalid L2PS transaction structure: missing l2ps_uid in data payload"
}

if (!decryptedTx?.content?.from) {
    return { decryptedTx: null, error: "Invalid decrypted transaction structure" }
}
```

Pure readability / style; behaviour stays the same.
Also applies to: 61-62
src/libs/l2ps/L2PSBatchAggregator.ts (1)
395-408: Nonce handling is fine, but comments overstate persistence guarantees

`getNextBatchNonce` derives nonces from `Date.now() * 1000` and a value cached in `sharedState.l2psBatchNonce`. This gives a monotonic sequence within a process and across restarts as long as the system clock is monotonic, but it does not persist across process restarts (no DB/FS write).

To avoid confusion for future maintainers, I'd tighten the comments:
- Emphasize that persistence is per-process (via shared state), not durable.
- Note that uniqueness across restarts is time-based, not strictly persisted.
Purely documentation; logic itself looks reasonable given the current constraints.
Also applies to: 413-425
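For reference, the scheme this comment describes boils down to something like the following sketch; `sharedState` stands in for the node's shared-state object, and nothing here survives a process restart:

```ts
// Monotonic within a process: each nonce is at least lastNonce + 1 and at
// least the current time seed. Persistence is per-process only (shared state),
// so uniqueness across restarts relies on the clock, not on durable storage.
function getNextBatchNonce(sharedState: { l2psBatchNonce?: number }): number {
    const timeSeed = Date.now() * 1000
    const next = Math.max(timeSeed, (sharedState.l2psBatchNonce ?? 0) + 1)
    sharedState.l2psBatchNonce = next
    return next
}
```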
src/libs/l2ps/L2PSTransactionExecutor.ts (1)
229-297: Native handler lacks structure checks on `data` and ignores `simulate` (optional tightening)

`handleNativeTransaction` assumes `tx.content.data` is a `["native", INativePayload]` tuple and doesn't use the `simulate` flag:

```ts
const nativePayloadData = tx.content.data as ["native", INativePayload]
const nativePayload = nativePayloadData[1]
```

If a malformed transaction reaches here, this can throw before validation. Also, `simulate` does not affect any behaviour (we always hit the DB and build edits).

Not strictly wrong, but you might want to harden this:

```ts
if (!Array.isArray(tx.content.data) || tx.content.data[0] !== "native") {
    return { success: false, message: "Invalid native payload structure" }
}
const [, nativePayload] = tx.content.data as ["native", INativePayload]
```

and optionally gate side effects in simulate mode if that's part of your design contract.
Pure robustness / clarity; current path is logically correct for well-formed inputs.
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (4)
- src/libs/l2ps/L2PSBatchAggregator.ts (1 hunks)
- src/libs/l2ps/L2PSConsensus.ts (1 hunks)
- src/libs/l2ps/L2PSTransactionExecutor.ts (1 hunks)
- src/libs/network/routines/transactions/handleL2PS.ts (2 hunks)
🧰 Additional context used
🧬 Code graph analysis (2)
src/libs/network/routines/transactions/handleL2PS.ts (5)
- src/libs/l2ps/parallelNetworks.ts (1): ParallelNetworks (82-439)
- src/libs/blockchain/transaction.ts (1): Transaction (44-534)
- src/utilities/logger.ts (1): error (125-132)
- src/libs/network/server_rpc.ts (1): emptyResponse (46-51)
- src/libs/l2ps/L2PSTransactionExecutor.ts (1): L2PSTransactionExecutor (49-487)

src/libs/l2ps/L2PSBatchAggregator.ts (6)
- src/utilities/logger.ts (1): error (125-132)
- src/utilities/sharedState.ts (2): getSharedState (276-278), SharedState (17-271)
- src/libs/crypto/hashing.ts (1): Hashing (15-26)
- src/libs/utils/calibrateTime.ts (1): getNetworkTimestamp (17-23)
- src/libs/blockchain/transaction.ts (1): hash (106-114)
- src/libs/blockchain/mempool_v2.ts (1): Mempool (18-227)
🪛 GitHub Check: SonarCloud Code Analysis
src/libs/network/routines/transactions/handleL2PS.ts
[warning] 26-26: Prefer using an optional chain expression instead, as it's more concise and easier to read.
[warning] 61-61: Prefer using an optional chain expression instead, as it's more concise and easier to read.
src/libs/l2ps/L2PSBatchAggregator.ts
[warning] 436-436: 'error' will use Object's default stringification format ('[object Object]') when stringified.
🔇 Additional comments (2)
src/libs/network/routines/transactions/handleL2PS.ts (1)

129-146: Mempool status updates are now keyed by encrypted hash (looks correct)

The execution and error paths consistently call `L2PSMempool.updateStatus(l2psTx.hash, ...)`, which matches the mempool's use of the encrypted transaction hash as its primary key. This fixes the earlier mismatch with `originalHash` and should make status transitions (processed → failed/executed) reliable.

src/libs/l2ps/L2PSBatchAggregator.ts (1)

336-383: Clarify and standardize `getSharedState` usage

The code pattern `const sharedState = getSharedState` followed by property access like `sharedState.keypair?.privateKey` suggests a potential issue. If `getSharedState` is a function (which the naming convention indicates), this will fail at runtime.

Confirm whether `getSharedState` is a function and, if so, add invocation parentheses in:
- createBatchPayload (line 338)
- getLastNonceFromStorage
- saveNonceToStorage
- submitBatchToMempool

Alternatively, if it's already a singleton instance, rename it to `sharedState` to clarify the intent and prevent future confusion.
```ts
getStatus(): {
    isRunning: boolean;
    isAggregating: boolean;
    intervalMs: number;
    joinedL2PSCount: number;
} {
    return {
        isRunning: this.isRunning,
        isAggregating: this.isAggregating,
        intervalMs: this.AGGREGATION_INTERVAL,
        joinedL2PSCount: SharedState.getInstance().l2psJoinedUids?.length || 0,
    }
}
```
SharedState is not imported – getStatus will throw at runtime
getStatus() calls SharedState.getInstance() but this symbol is never imported in this file (only getSharedState is). This will fail at compile time or as a ReferenceError at runtime.
Prefer reusing the existing shared state accessor:
```diff
- getStatus(): {
-     isRunning: boolean;
-     isAggregating: boolean;
-     intervalMs: number;
-     joinedL2PSCount: number;
- } {
-     return {
-         isRunning: this.isRunning,
-         isAggregating: this.isAggregating,
-         intervalMs: this.AGGREGATION_INTERVAL,
-         joinedL2PSCount: SharedState.getInstance().l2psJoinedUids?.length || 0,
-     }
- }
+ getStatus(): {
+     isRunning: boolean;
+     isAggregating: boolean;
+     intervalMs: number;
+     joinedL2PSCount: number;
+ } {
+     const sharedState = getSharedState
+     return {
+         isRunning: this.isRunning,
+         isAggregating: this.isAggregating,
+         intervalMs: this.AGGREGATION_INTERVAL,
+         joinedL2PSCount: sharedState.l2psJoinedUids?.length || 0,
+     }
+ }
```

(or, if getSharedState is a function, call it accordingly).
🤖 Prompt for AI Agents
In src/libs/l2ps/L2PSBatchAggregator.ts around lines 605 to 617, getStatus()
calls SharedState.getInstance() but SharedState is not imported here; this will
fail at compile/runtime. Fix by replacing that call with the existing
shared-state accessor used elsewhere in the file (e.g. call getSharedState() or
the appropriate function already imported) or alternatively add an import for
SharedState from its module; update the return value to use the correctly
referenced shared state (e.g. getSharedState().l2psJoinedUids?.length) so the
symbol is defined and the code compiles.
```ts
private static async rollbackEdits(
    proof: L2PSProof,
    editResults: GCRResult[],
    mockTx: any
): Promise<void> {
    for (let i = editResults.length - 2; i >= 0; i--) {
        if (editResults[i].success) {
            const rollbackEdit = { ...proof.gcr_edits[i], isRollback: true }
            await HandleGCR.apply(rollbackEdit, mockTx, true, false)
        }
    }
}

/**
 * Apply GCR edits from a proof
 */
private static async applyGCREdits(
    proof: L2PSProof,
    simulate: boolean,
    proofResult: ProofResult
): Promise<boolean> {
    const editResults: GCRResult[] = []

    for (const edit of proof.gcr_edits) {
        const editAccount = 'account' in edit ? edit.account as string : proof.affected_accounts[0] || ''
        const mockTx = this.createMockTx(proof, editAccount)

        const editResult = await HandleGCR.apply(edit, mockTx as any, false, simulate)
        editResults.push(editResult)

        if (!editResult.success) {
            proofResult.message = `GCR edit failed: ${editResult.message}`
            if (!simulate) {
                await this.rollbackEdits(proof, editResults, mockTx)
                await L2PSProofManager.markProofRejected(proof.id, proofResult.message)
            }
            return false
        }

        proofResult.editsApplied++
    }

    return true
}
```
Rollback uses a single mockTx for all edits, potentially mismatching account context
In applyGCREdits / rollbackEdits, you:
- Build `mockTx` per edit during forward application.
- On failure, call `rollbackEdits(proof, editResults, mockTx)`, passing only the last `mockTx`.
- In `rollbackEdits`, you reuse that same `mockTx` for every rollback edit.
If HandleGCR.apply relies on mockTx.content.from / to for authorization or accounting, earlier edits may be rolled back with the wrong account context (that of the last edit).
Safer pattern:
```diff
 private static async rollbackEdits(
     proof: L2PSProof,
     editResults: GCRResult[],
-    mockTx: any
+    _lastMockTx: any
 ): Promise<void> {
     for (let i = editResults.length - 2; i >= 0; i--) {
         if (editResults[i].success) {
-            const rollbackEdit = { ...proof.gcr_edits[i], isRollback: true }
-            await HandleGCR.apply(rollbackEdit, mockTx, true, false)
+            const originalEdit = proof.gcr_edits[i]
+            const editAccount = "account" in originalEdit
+                ? (originalEdit.account as string)
+                : proof.affected_accounts[0] || ""
+            const rollbackEdit = { ...originalEdit, isRollback: true }
+            const mockTx = this.createMockTx(proof, editAccount)
+            await HandleGCR.apply(rollbackEdit, mockTx as any, true, false)
         }
     }
 }
```
```ts
static async execute(
    l2psUid: string,
    tx: Transaction,
    l1BatchHash: string,
    simulate: boolean = false
): Promise<L2PSExecutionResult> {
    try {
        log.info(`[L2PS Executor] Processing tx ${tx.hash} from L2PS ${l2psUid} (type: ${tx.content.type})`)

        // Generate GCR edits based on transaction type
        const editsResult = await this.generateGCREdits(tx, simulate)
        if (!editsResult.success) {
            return editsResult
        }

        const gcrEdits = editsResult.gcr_edits || []
        const affectedAccounts = editsResult.affected_accounts || []

        // Create proof with GCR edits (if not simulating)
        if (!simulate && gcrEdits.length > 0) {
            return this.createProofAndRecord(l2psUid, tx, l1BatchHash, gcrEdits, affectedAccounts)
        }

        return {
            success: true,
            message: simulate
                ? `Validated: ${gcrEdits.length} GCR edits would be generated`
                : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`,
            gcr_edits: gcrEdits,
            affected_accounts: [...new Set(affectedAccounts)]
        }
```
execute reports “Proof created” even when no proof is created (zero-edit case)
When `gcrEdits.length === 0`, the `if (!simulate && gcrEdits.length > 0)` branch is skipped, so no proof is created or recorded. However the fallback success path still says:

```ts
message: simulate
    ? `Validated: ${gcrEdits.length} GCR edits would be generated`
    : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`
```

For zero edits this reports "Proof created with 0 GCR edits" even though no proof exists, which can confuse callers and operators.
Consider distinguishing the “no edits / no proof” case:
```diff
- return {
-     success: true,
-     message: simulate
-         ? `Validated: ${gcrEdits.length} GCR edits would be generated`
-         : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`,
-     gcr_edits: gcrEdits,
-     affected_accounts: [...new Set(affectedAccounts)]
- }
+ const base = {
+     success: true,
+     gcr_edits: gcrEdits,
+     affected_accounts: [...new Set(affectedAccounts)],
+ }
+
+ if (simulate) {
+     return {
+         ...base,
+         message: `Validated: ${gcrEdits.length} GCR edits would be generated`,
+     }
+ }
+
+ return {
+     ...base,
+     message: gcrEdits.length === 0
+         ? "No GCR edits generated; transaction recorded without proof"
+         : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`,
+ }
```
+ }This keeps semantics and observability aligned.
🤖 Prompt for AI Agents
In src/libs/l2ps/L2PSTransactionExecutor.ts around lines 113 to 144, the success
fallback message incorrectly claims a proof was created when gcrEdits.length ===
0; change the logic so when not simulating and gcrEdits.length === 0 you return
a success result that explicitly states no GCR edits were generated and that no
proof was created/recorded, and preserve the existing proof-created message only
when a proof was actually created; keep gcr_edits as an empty array and ensure
affected_accounts is deduplicated in the returned result.
```diff
 const verificationResult = await Transaction.confirmTx(decryptedTx, decryptedTx.content.from)
 if (!verificationResult) {
-    response.result = 400
-    response.response = false
-    response.extra = "Transaction signature verification failed"
-    return response
+    return { decryptedTx: null, error: "Transaction signature verification failed" }
 }

 // REVIEW: PR Fix #11 - Validate encrypted payload structure before type assertion
 // Reuse payloadData extracted earlier (line 38)
 return { decryptedTx: decryptedTx as unknown as Transaction, error: null }
```
Signature verification result is not checked, potentially accepting invalid transactions
`Transaction.confirmTx` returns a structured result (with a `success` flag), but the code only checks that the result is truthy. This can mark a transaction as valid even when signature verification fails.

Update the logic to inspect the `success` flag (and optionally include the message):

```diff
 const verificationResult = await Transaction.confirmTx(decryptedTx, decryptedTx.content.from)
-if (!verificationResult) {
-    return { decryptedTx: null, error: "Transaction signature verification failed" }
-}
+if (!verificationResult || !verificationResult.success) {
+    const msg = verificationResult?.message || "Transaction signature verification failed"
+    return { decryptedTx: null, error: msg }
+}
```
+ }This ensures only properly verified transactions proceed.
🤖 Prompt for AI Agents
In src/libs/network/routines/transactions/handleL2PS.ts around lines 65 to 71,
the code treats the result of Transaction.confirmTx as a boolean but that
function returns a structured object with a success flag and optional message;
update the logic to check verificationResult.success (and, if false, return an
error incorporating verificationResult.message) and only return the decryptedTx
when success is true so that failed signature verifications are rejected.
…nd enhance error handling in handleL2PS
- Added L2PSBatchProver class to generate and verify PLONK proofs for L2PS transaction batches.
- Introduced batch size selection logic (5 or 10 transactions) with zero-amount transfer padding (sketched below).
- Created circuits for 5 and 10 transaction batches using Poseidon hash function.
- Developed setup script to generate ZK keys for batch circuits.
- Updated README with usage instructions and performance metrics.
- Enhanced L2PSProof entity to support multiple proof types including PLONK and STARK.
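A sketch of the batch-size selection and padding logic this commit describes. The transaction shape and the zero-amount filler are assumptions; only the 5-vs-10 selection rule and zero-amount padding come from the commit message:

```ts
interface L2PSTransaction {
    from: string
    to: string
    amount: bigint
}

// Illustrative filler: a transfer that moves nothing, so padded slots
// cannot change any balance inside the circuit.
const PAD_TX: L2PSTransaction = { from: "0x0", to: "0x0", amount: 0n }

// Pick the smallest supported circuit (5 or 10 inputs) that fits the batch,
// then pad with zero-amount transfers so the witness matches the fixed arity.
function padBatch(txs: L2PSTransaction[]): L2PSTransaction[] {
    if (txs.length === 0 || txs.length > 10) {
        throw new Error(`Unsupported batch size: ${txs.length}`)
    }
    const target = txs.length <= 5 ? 5 : 10
    return [...txs, ...Array(target - txs.length).fill(PAD_TX)]
}
```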
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
package.json (1)
51-102: Verify @solana/web3.js version explicitly due to recent supply-chain incident.

The new dependencies appropriately support the ZK-SNARK infrastructure (snarkjs 0.7.5, circomlib 2.0.5, circomlibjs 0.1.7), Solana/Anchor integrations, and security hardening (helmet 8.1.0). However, @solana/web3.js experienced a supply-chain attack in December 2024 (CVE-2024-54134) affecting versions 1.95.6–1.95.7. The pinned version 1.98.4 is post-patch and should be safe, but confirm this version is intentional and monitor for any subsequent releases. Additionally, ensure snarkjs remains at ≥0.7.0 to maintain the fix for CVE-2023-33252, and keep circomlib updated as maintenance activity is limited.
♻️ Duplicate comments (6)
src/libs/network/routines/transactions/handleL2PS.ts (1)
65-68: Signature verification still not checking the `success` flag.

This issue was flagged in a previous review. `Transaction.confirmTx` returns a structured result with a `success` property, but the code only checks if the result is truthy. This could allow invalid signatures to pass.

```diff
 const verificationResult = await Transaction.confirmTx(decryptedTx, decryptedTx.content.from)
-if (!verificationResult) {
-    return { decryptedTx: null, error: "Transaction signature verification failed" }
+if (!verificationResult || !verificationResult.success) {
+    const msg = verificationResult?.message || "Transaction signature verification failed"
+    return { decryptedTx: null, error: msg }
 }
```

src/libs/l2ps/L2PSBatchAggregator.ts (2)

609-612: Verify `reference_block: 0` handling.

A past review flagged that `reference_block: 0` may not be recalculated by the mempool, potentially bypassing ordering/anchoring semantics. The comment indicates this was addressed, but the code still shows `reference_block: 0`. Please verify that either:
- The mempool now sets a proper reference block, or
- A meaningful reference block should be computed here

562-563: Unchecked type cast for `publicKey` may cause runtime error.

The cast `sharedState.keypair.publicKey as Uint8Array` assumes the key is always a `Uint8Array`, but per past review comments, this could fail if the key is a different type.

Add a runtime check:

```diff
-const nodeIdentityHex = uint8ArrayToHex(sharedState.keypair.publicKey as Uint8Array)
+const publicKey = sharedState.keypair.publicKey
+if (!(publicKey instanceof Uint8Array)) {
+    log.error("[L2PS Batch Aggregator] publicKey is not a Uint8Array")
+    return false
+}
+const nodeIdentityHex = uint8ArrayToHex(publicKey)
```

Alternatively, if `sharedState.publicKeyHex` exists (as suggested in past comments), use that directly.

src/libs/l2ps/L2PSTransactionExecutor.ts (1)

136-143: Misleading success message when no proof is created.

When `gcrEdits.length === 0` and not simulating, the method reports "Proof created with 0 GCR edits" even though no proof was created (the `if` condition at line 132 was false).

Apply this diff to distinguish the zero-edit case:

```diff
+const uniqueAccounts = [...new Set(affectedAccounts)]
+
+if (simulate) {
+    return {
+        success: true,
+        message: `Validated: ${gcrEdits.length} GCR edits would be generated`,
+        gcr_edits: gcrEdits,
+        affected_accounts: uniqueAccounts
+    }
+}
+
 return {
     success: true,
-    message: simulate
-        ? `Validated: ${gcrEdits.length} GCR edits would be generated`
-        : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`,
+    message: gcrEdits.length === 0
+        ? "No GCR edits generated; transaction recorded without proof"
+        : `Transaction recorded with ${gcrEdits.length} GCR edits (proof will be created during batching)`,
     gcr_edits: gcrEdits,
-    affected_accounts: [...new Set(affectedAccounts)]
+    affected_accounts: uniqueAccounts
 }
```

This clarifies that when no edits are generated, no proof is created.

src/model/entities/L2PSProofs.ts (1)

144-149: Fix JSONB default value syntax.

The string literal `default: "[]"` does not produce a correct JSONB default at the database level. TypeORM requires raw SQL expressions for JSONB defaults.

```diff
 /**
  * Individual transaction hashes from L2PS mempool
  * Used to update mempool status to 'confirmed' after proof application
  */
-@Column("jsonb", { default: "[]" })
+@Column("jsonb", { default: () => "'[]'" })
 transaction_hashes: string[]
```

Alternatively, you can use an explicit cast: `default: () => "'[]'::jsonb"`

Based on past review comments flagging TypeORM JSONB default value requirements.

src/libs/l2ps/L2PSProofManager.ts (1)

227-233: Nonce edits should also require the `account` field.

The comment states "Balance and nonce edits require account field", but the validation only enforces this for `balance` edits. Nonce edits without an `account` field would pass validation, creating inconsistency.

```diff
 // Validate each GCR edit has required fields
 for (const edit of proof.gcr_edits) {
-    if (!edit.type || (edit.type === 'balance' && !('account' in edit))) {
+    if (!edit.type) {
         log.warning(`[L2PS ProofManager] Proof ${proof.id} has invalid GCR edit`)
         return false
     }
+
+    // Balance and nonce edits require account field
+    if ((edit.type === 'balance' || edit.type === 'nonce') && !('account' in edit)) {
+        log.warning(`[L2PS ProofManager] Proof ${proof.id} has invalid GCR edit: ${edit.type} missing account`)
+        return false
+    }
 }
```
🧹 Nitpick comments (14)
src/libs/l2ps/zk/README.md (1)

65-86: Add language specifiers to fenced code blocks and fix batch_20 inconsistency.

Two minor issues:
- Lines 65 and 80 have fenced code blocks without language specifiers (use `text` or `plaintext`)
- Line 84 references `batch_20/` in the generated keys structure, but the setup script (line 90) notes that batch_20 was removed due to stability issues, so drop it from the directory listing

src/libs/l2ps/zk/scripts/setup_all_batches.sh (3)

33-36: Fix quoting and portability issues in file size check.

Two issues:
1. The `stat` output should be quoted to prevent word splitting (SC2046)
2. `stat -c%s` is GNU/Linux-specific; macOS uses `stat -f%z`

```diff
-    if [ ! -f "$PTAU_DIR/$file" ] || [ $(stat -c%s "$PTAU_DIR/$file") -lt 1000000 ]; then
+    if [ ! -f "$PTAU_DIR/$file" ]; then
+        need_download=true
+    elif [[ "$OSTYPE" == "darwin"* ]]; then
+        [ "$(stat -f%z "$PTAU_DIR/$file")" -lt 1000000 ] && need_download=true
+    else
+        [ "$(stat -c%s "$PTAU_DIR/$file")" -lt 1000000 ] && need_download=true
+    fi
+
+    if [ "${need_download:-false}" = true ]; then
         echo -e "${YELLOW}Downloading pot${size}...${NC}"
```

Alternatively, use a portable approach with `wc -c`:

```sh
if [ ! -f "$PTAU_DIR/$file" ] || [ "$(wc -c < "$PTAU_DIR/$file")" -lt 1000000 ]; then
```

36-36: Add error handling for curl download.

`curl` may fail silently without `-f` (fail on HTTP errors). Consider adding validation:

```diff
-    curl -L -o "$PTAU_DIR/$file" "$url"
+    if ! curl -fL -o "$PTAU_DIR/$file" "$url"; then
+        echo -e "\033[0;31mFailed to download $file\033[0m"
+        exit 1
+    fi
```

1-14: Consider adding dependency validation.

The script assumes `circom`, `npx`, and `snarkjs` are installed. Adding validation upfront would provide clearer error messages.

```diff
 set -e

+# Check dependencies
+for cmd in circom npx; do
+    if ! command -v "$cmd" &> /dev/null; then
+        echo -e "\033[0;31mError: $cmd is required but not installed.\033[0m"
+        exit 1
+    fi
+done
+
 SCRIPT_DIR="$(cd "$(dirname "$0")" && pwd)"
```

src/libs/l2ps/zk/circuits/l2ps_batch_10.circom (1)

20-32: `pre_hash` output is computed but never used.

The `BalanceTransfer` template computes `pre_hash` (lines 20, 29-32) but `L2PSBatch` only uses `post_hash` for state chaining. Either integrate `pre_hash` into the state verification or remove it to reduce constraint count.

src/libs/l2ps/zk/circuits/l2ps_batch_5.circom (2)

26-27: Unused constraint signal - dead code in circuit.

The `check` signal is computed but never constrained against anything. In Circom, signals that are computed but not used in constraints don't contribute to the proof's security guarantees. This appears to be leftover debug code.

Either remove it or add a meaningful constraint:

```diff
-    signal check;
-    check <== sender_after * sender_after;
```

20-21: `pre_hash` output is declared but never used.

The `pre_hash` output is computed using Poseidon but is never referenced by the `L2PSBatch` template (only `post_hash` is used at line 71). This adds unnecessary constraints to the circuit.

If `pre_hash` is needed for future verification, keep it. Otherwise, consider removing to reduce circuit size:

```diff
-    signal output pre_hash;
     signal output post_hash;

     sender_after === sender_before - amount;
     receiver_after === receiver_before + amount;
-
-    signal check;
-    check <== sender_after * sender_after;
-
-    component preHasher = Poseidon(2);
-    preHasher.inputs[0] <== sender_before;
-    preHasher.inputs[1] <== receiver_before;
-    pre_hash <== preHasher.out;
```

Also applies to: 29-32

src/libs/l2ps/zk/L2PSBatchProver.ts (3)

9-23: Global prototype patching at import time is risky.

Patching `EventTarget.prototype.dispatchEvent` at module load time affects all code in the process, not just this module. This could cause unexpected behavior in other libraries that rely on standard `dispatchEvent` behavior.

Consider isolating this patch:
- Move it inside `initialize()` where it's only applied when the prover is actually used
- Or wrap snarkjs calls in a try-catch instead of globally patching

```diff
-// Bun compatibility: patch web-worker before importing snarkjs
-const isBun = typeof (globalThis as any).Bun !== 'undefined';
-if (isBun) {
-  // Suppress web-worker errors in Bun by patching dispatchEvent
-  const originalDispatchEvent = EventTarget.prototype.dispatchEvent;
-  EventTarget.prototype.dispatchEvent = function(event: any) {
-    ...
-  };
-}
+// Bun compatibility handled in initialize() to avoid global side effects
```

139-158: Clarify that `loadedKeys` caches paths, not loaded key data.

The method is named `loadKeys` and the cache `loadedKeys`, but it actually caches file paths that snarkjs will load on each `fullProve` call. This is correct behavior for snarkjs, but the naming suggests pre-loading into memory.

Consider renaming for clarity:

```diff
-    private loadedKeys: Map<BatchSize, { zkey: any; wasm: string }> = new Map();
+    private resolvedKeyPaths: Map<BatchSize, { zkeyPath: string; wasmPath: string }> = new Map();
```

290-294: Synchronous file read blocks the event loop.

`fs.readFileSync` on line 294 blocks the event loop during proof verification. For a service processing multiple batches, this could impact responsiveness.

```diff
-    const vkey = JSON.parse(fs.readFileSync(vkeyPath, 'utf-8'));
+    const vkey = JSON.parse(await fs.promises.readFile(vkeyPath, 'utf-8'));
```

Alternatively, cache the verification key in `loadedKeys` after first load.

src/libs/l2ps/L2PSBatchAggregator.ts (1)

93-94: Unused readonly field `batchNonceCounter`.

This field is declared but never used - nonce management now uses `getNextBatchNonce()` with shared state storage. Remove the dead code:

```diff
-    /** Persistent nonce counter for batch transactions */
-    private readonly batchNonceCounter: number = 0
```

src/libs/l2ps/zk/circomlibjs.d.ts (3)

32-42: Replace `any` with a more specific type for field operations.

Line 33 uses `any` for the `F` field, which defeats TypeScript's type safety. The field operations object likely shares a similar structure with the PoseidonF interface (lines 12-15) or could be typed as a common `FieldOperations` interface.

Consider defining a shared interface:

```ts
interface FieldOperations {
    toObject(element: Uint8Array): bigint
    toString(element: Uint8Array): string
    // Add other common field operation methods if applicable
}
```

Then apply this diff:

```diff
 export function buildBabyjub(): Promise<{
-    F: any
+    F: FieldOperations
     Generator: [bigint, bigint]
```

47-52: Replace `any` with a more specific type for field operations.

Line 48 uses `any` for the `F` field. This should use the same typed interface as suggested for other field operations to maintain consistency and type safety across the circomlibjs declarations.

```diff
 export function buildEddsa(): Promise<{
-    F: any
+    F: FieldOperations
     prv2pub(privateKey: Uint8Array): [bigint, bigint]
```

57-61: Replace `any` with a more specific type for field operations.

Line 58 uses `any` for the `F` field. For consistency and type safety, this should use the same typed interface as the other field operations.

```diff
 export function buildMimcSponge(): Promise<{
-    F: any
+    F: FieldOperations
     hash(left: bigint, right: bigint, key: bigint): bigint
```
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (15)
- package.json (3 hunks)
- scripts/send-l2-batch.ts (1 hunks)
- src/libs/l2ps/L2PSBatchAggregator.ts (1 hunks)
- src/libs/l2ps/L2PSProofManager.ts (1 hunks)
- src/libs/l2ps/L2PSTransactionExecutor.ts (1 hunks)
- src/libs/l2ps/zk/BunPlonkWrapper.ts (1 hunks)
- src/libs/l2ps/zk/L2PSBatchProver.ts (1 hunks)
- src/libs/l2ps/zk/README.md (1 hunks)
- src/libs/l2ps/zk/circomlibjs.d.ts (1 hunks)
- src/libs/l2ps/zk/circuits/l2ps_batch_10.circom (1 hunks)
- src/libs/l2ps/zk/circuits/l2ps_batch_5.circom (1 hunks)
- src/libs/l2ps/zk/scripts/setup_all_batches.sh (1 hunks)
- src/libs/l2ps/zk/snarkjs.d.ts (1 hunks)
- src/libs/network/routines/transactions/handleL2PS.ts (2 hunks)
- src/model/entities/L2PSProofs.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (5)
src/libs/l2ps/zk/BunPlonkWrapper.ts (1)
- src/utilities/logger.ts (1): error (125-132)

src/libs/l2ps/L2PSProofManager.ts (3)
- src/libs/crypto/hashing.ts (1): Hashing (15-26)
- src/model/entities/L2PSProofs.ts (1): L2PSProofStatus (27-31)
- src/utilities/logger.ts (1): error (125-132)

src/libs/l2ps/L2PSTransactionExecutor.ts (3)
- src/libs/blockchain/transaction.ts (1): Transaction (44-534)
- src/libs/l2ps/L2PSProofManager.ts (1): L2PSProofManager (71-357)
- src/libs/l2ps/zk/L2PSBatchProver.ts (2): L2PSTransaction (43-49), hash (163-166)

src/libs/l2ps/zk/L2PSBatchProver.ts (1)
- src/libs/l2ps/zk/BunPlonkWrapper.ts (1): plonkVerifyBun (86-187)

src/model/entities/L2PSProofs.ts (1)
- src/model/entities/L2PSTransactions.ts (1): Entity (27-143)
🪛 markdownlint-cli2 (0.18.1)
src/libs/l2ps/zk/README.md
65-65: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
80-80: Fenced code blocks should have a language specified
(MD040, fenced-code-language)
🪛 Shellcheck (0.11.0)
src/libs/l2ps/zk/scripts/setup_all_batches.sh
[warning] 33-33: Quote this to prevent word splitting.
(SC2046)
🔇 Additional comments (30)
src/libs/l2ps/zk/circuits/l2ps_batch_10.circom (1)

26-27: Unused `check` signal - clarify intent or remove.

The `check` signal computes `sender_after * sender_after` but is never used in any constraint. If the intent is to enforce non-negative balances, note that squaring in a finite field doesn't prevent underflow—negative values (large field elements) still produce valid squares.

If this is intentional for future use or as a constraint-generation placeholder, consider adding a comment. Otherwise, remove it:

```diff
-    signal check;
-    check <== sender_after * sender_after;
+    // Note: Non-negativity is enforced off-chain before proof generation
```

src/libs/l2ps/zk/snarkjs.d.ts (1)

1-78: LGTM - Minimal but functional type declarations for snarkjs.

The type declarations provide adequate coverage for PLONK/Groth16 proof generation and verification workflows. Using `any` for proof objects is pragmatic given snarkjs's dynamic nature.

src/libs/network/routines/transactions/handleL2PS.ts (3)

12-20: LGTM - Clean error response helper.

The `createErrorResponse` helper centralizes error formatting, improving code consistency.

129-145: Execution flow and status updates look correct.

The execution delegates to `L2PSTransactionExecutor`, properly handles errors with logging, and updates mempool status appropriately (`failed` on error, `executed` on success).

70-70: Double cast is necessary for type conversion between SDK and local Transaction classes, not a type safety bypass.

The cast converts from the SDK's `Transaction` type (returned by `l2psInstance.decryptTx()`) to the local `Transaction` class required by the function's return type. The local `Transaction` class implements the SDK's `ITransaction` interface, and the object passes validation via `Transaction.confirmTx()` before the cast, ensuring structural correctness at runtime.

src/libs/l2ps/zk/circuits/l2ps_batch_5.circom (1)

23-24: Consider adding balance underflow protection.

The constraint `sender_after === sender_before - amount` operates in a finite field where underflow wraps around rather than failing. A malicious prover could create a valid proof with `sender_before < amount` resulting in a wrapped value.

If balance validity must be enforced on-chain, add a range check or ensure the off-chain prover validates `sender_before >= amount` before proof generation.
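A minimal sketch of the suggested off-chain guard (TypeScript, hypothetical field names), run before witness generation:

```ts
// Reject batches where any transfer would underflow before handing the
// witness to the prover; inside the circuit, field arithmetic would
// silently wrap instead of failing.
function assertNoUnderflow(transfers: { senderBefore: bigint; amount: bigint }[]): void {
    for (const t of transfers) {
        if (t.senderBefore < t.amount) {
            throw new Error(`Balance underflow: ${t.senderBefore} < ${t.amount}`)
        }
    }
}
```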
215-278: LGTM - proof generation flow is well-structured.The
generateProofmethod properly:
- Ensures initialization before use
- Validates input (empty batch check)
- Selects appropriate batch size
- Pads transactions correctly
- Uses
singleThreadoption for Bun compatibilitysrc/libs/l2ps/zk/BunPlonkWrapper.ts (2)
86-187: LGTM - Bun-compatible PLONK verification is well-implemented.The implementation:
- Correctly uses
singleThread: trueto avoid Bun worker crashes- Properly terminates the curve in
finallyblock to prevent memory leaks- Returns
falseon errors rather than throwing, which is appropriate for verification- Follows the snarkjs reference implementation faithfully
29-78: LGTM - Keccak256Transcript correctly implements Fiat-Shamir.The transcript class properly handles polynomial commitments and scalars for challenge derivation, matching the snarkjs reference implementation.
src/libs/l2ps/L2PSBatchAggregator.ts (4)
663-669: LGTM - Cycle time calculation is correct.The
updateCycleTimemethod is only called aftersuccessfulCyclesis incremented (line 248), so division by zero cannot occur. The running average formula correctly handles the first cycle case.
444-456: ZK proof uses placeholder data - proofs won't verify actual balances.The TODO at line 447 notes this, but it's worth emphasizing: the current implementation derives balance values from transaction hashes rather than actual amounts. This means:
- ZK proofs will be mathematically valid but semantically meaningless
- Balance underflows/overflows won't be caught
- The proof only demonstrates circuit execution, not transaction validity
Until actual transaction amounts are extracted, consider either:
- Disabling ZK proofs entirely (
zkEnabled = false)- Clearly logging that proofs are in "placeholder mode"
397-410: LGTM - HMAC key handling now fails fast as recommended.The code now throws an error when
keypair.privateKeyis unavailable instead of using an insecurebatchHashfallback. This addresses the security concern from the previous review.
514-526: Nonce persistence is process-lifetime only - consider database storage for production.The current implementation persists nonces in
sharedState, which survives within the same process but resets on restart. The timestamp-based fallback (line 500-503) provides reasonable collision resistance, but for production deployments with strict ordering requirements, consider persisting to the database.The current approach is acceptable for development/testing.
src/libs/l2ps/L2PSProofManager.ts (5)
34-44: LGTM! Deterministic stringify implementation is solid.The alphabetical key sorting ensures consistent hashing across JSONB round-trips, and arrays are correctly left in their original order.
111-171: LGTM! Proof creation uses deterministic hashing.The previous concern about non-deterministic
Date.now()timestamps has been resolved. The implementation now usesdeterministicStringifyconsistently for stable hash generation.
194-211: LGTM! Reserved parameter for future block-specific filtering.The
blockNumberparameter is intentionally unused with a clearFUTUREcomment. This reserves the API signature for block-specific batching without requiring breaking changes later.
258-292: LGTM! Status update methods are correctly implemented.Both
markProofAppliedandmarkProofRejectedproperly update proof status with appropriate metadata (block number, error message, timestamps).
332-356: LGTM! Statistics query uses proper QueryBuilder cloning.The
clone()pattern correctly creates independent queries for each status count, andPromise.allefficiently executes them in parallel.src/model/entities/L2PSProofs.ts (3)
38-44: LGTM! Index definitions are correct and comprehensive.All index names follow a consistent naming convention, and the compound index on
[l2ps_uid, status]will optimize common query patterns.
85-97: LGTM! Proof structure is well-defined and flexible.The JSONB column supports multiple proof systems (hash, plonk, snark, stark) with comprehensive metadata fields for production use.
99-167: LGTM! Column definitions are appropriate and well-documented.All columns have correct types, nullability, and defaults. The use of
simple-arrayforaffected_accountsis suitable for a string array.src/libs/l2ps/L2PSTransactionExecutor.ts (7)
49-77: LGTM! Repository initialization uses proper promise guard pattern.The lazy initialization with
initPromiseguard prevents race conditions when multiple calls occur before initialization completes.
154-186: LGTM! GCR edit generation logic is clear and handles all cases.The method properly delegates to
handleNativeTransactionfor native transactions and validates providedgcr_editsfor other types.
226-296: LGTM! Native transaction handling includes proper validation.The amount validation checks type, finiteness, and ensures positive values. Variables are properly scoped within the if/else blocks.
298-333: LGTM! GCR edit validation is properly scoped.All switch cases are wrapped in braces, and the unused
repovariable issue from past reviews has been resolved.
335-370: LGTM! Transaction recording handles BigInt conversions correctly.The method properly converts
amount,nonce, andtimestamptoBigInttypes as required by the entity schema.
372-396: LGTM! Status update now detects when no rows match.The method checks
result.affected === 0and logs a warning, preventing silent failures when updating non-existent transaction hashes.
398-484: LGTM! Query methods provide comprehensive L2PS data access.The account transaction query correctly uses an array of where conditions to match both sender and receiver addresses. Statistics properly combine transaction and proof data.
src/libs/l2ps/zk/circomlibjs.d.ts (2)
1-5: LGTM!The header comment clearly describes the file's purpose.
6-27: LGTM!The Poseidon interface and builder declarations are well-typed with appropriate JSDoc comments.
scripts/send-l2-batch.ts
Outdated
```ts
// Get initial nonce and track locally to avoid conflicts
let currentNonce = (await demos.getAddressNonce(nonceAccount)) + 1
console.log(`   Starting nonce: ${currentNonce}`)

for (let i = 0; i < options.count; i++) {
    console.log(`\n🔄 Transaction ${i + 1}/${options.count} (nonce: ${currentNonce})`)

    const payload: TxPayload = {
        l2ps_uid: options.uid,
    }
    if (options.data) {
        payload.message = `${options.data} [${i + 1}/${options.count}]`
    }

    console.log("   🧱 Building inner transaction (L2 payload)...")
    const innerTx = await buildInnerTransaction(
        demos,
        toAddress,
        amount,
        payload,
    )

    console.log("   🔐 Encrypting with L2PS key material...")
    const encryptedTx = await l2ps.encryptTx(innerTx)
    const [, encryptedPayload] = encryptedTx.content.data

    console.log("   🧱 Building outer L2PS transaction...")
    const subnetTx = await buildL2PSTransaction(
        demos,
        encryptedPayload as L2PSEncryptedPayload,
        toAddress,
        currentNonce,
    )

    console.log("   ✅ Confirming transaction with node...")
    const validityResponse = await demos.confirm(subnetTx)
    const validityData = validityResponse.response

    if (!validityData?.data?.valid) {
        throw new Error(
            `Transaction invalid: ${validityData?.data?.message ?? "Unknown error"}`,
        )
    }

    console.log("   📤 Broadcasting encrypted L2PS transaction to L1...")
    const broadcastResponse = await demos.broadcast(validityResponse)

    const txResult = {
        index: i + 1,
        hash: subnetTx.hash,
        innerHash: innerTx.hash,
        nonce: currentNonce,
        payload: payload,
        response: broadcastResponse,
    }

    results.push(txResult)

    console.log(`   ✅ Outer hash: ${subnetTx.hash}`)
    console.log(`   ✅ Inner hash: ${innerTx.hash}`)

    // Small delay between transactions to avoid nonce conflicts
    if (i < options.count - 1) {
        await new Promise(resolve => setTimeout(resolve, 500))
    }
}
```
Nonce is never incremented between transactions - all but first will be rejected.
currentNonce is initialized once before the loop (line 306) and passed to each transaction (line 336), but it's never incremented inside the loop. All transactions after the first will have the same nonce and be rejected by the node.
// Small delay between transactions to avoid nonce conflicts
if (i < options.count - 1) {
await new Promise(resolve => setTimeout(resolve, 500))
}
+
+ // Increment nonce for next transaction
+ currentNonce++
}
🤖 Prompt for AI Agents
In scripts/send-l2-batch.ts around lines 305 to 370: the nonce (currentNonce) is
only set once before the loop and never incremented, causing all transactions
after the first to reuse the same nonce and be rejected; fix by incrementing
currentNonce after a successful transaction (e.g., after push to results or
after confirming/broadcasting the transaction) so each iteration uses a unique
nonce (currentNonce++), or alternatively compute nonce per-iteration from the
initial value plus i; ensure the increment happens only after the transaction is
accepted to avoid gaps.
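As a complement to the suggested increment, here is a minimal sketch of the nonce-per-iteration alternative mentioned in the prompt, computing each nonce from the initial value plus the index (getInitialNonce and sendOne are hypothetical stand-ins for the script's helpers):

// Hypothetical sketch: each iteration derives its nonce from the starting
// value plus the loop index, so uniqueness holds by construction.
async function sendBatch(
    getInitialNonce: () => Promise<number>,
    sendOne: (nonce: number) => Promise<void>,
    count: number,
): Promise<void> {
    const initialNonce = (await getInitialNonce()) + 1
    for (let i = 0; i < count; i++) {
        await sendOne(initialNonce + i)
    }
}

The trade-off noted in the prompt still applies: deriving nonces up front leaves gaps if a transaction fails mid-batch, whereas incrementing only after acceptance avoids them.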
| declare module "circomlibjs" { | ||
| /** | ||
| * Poseidon hasher instance | ||
| */ | ||
| interface Poseidon { | ||
| (inputs: bigint[]): Uint8Array | ||
| F: { | ||
| toObject(element: Uint8Array): bigint | ||
| toString(element: Uint8Array): string | ||
| } | ||
| } | ||
|
|
||
| /** | ||
| * Build Poseidon hasher | ||
| * @returns Poseidon instance with field operations | ||
| */ | ||
| export function buildPoseidon(): Promise<Poseidon> | ||
|
|
||
| /** | ||
| * Build Poseidon reference (slower but simpler) | ||
| */ | ||
| export function buildPoseidonReference(): Promise<Poseidon> | ||
|
|
||
| /** | ||
| * Build baby jubjub curve operations | ||
| */ | ||
| export function buildBabyjub(): Promise<{ | ||
| F: any | ||
| Generator: [bigint, bigint] | ||
| Base8: [bigint, bigint] | ||
| order: bigint | ||
| subOrder: bigint | ||
| mulPointEscalar(point: [bigint, bigint], scalar: bigint): [bigint, bigint] | ||
| addPoint(p1: [bigint, bigint], p2: [bigint, bigint]): [bigint, bigint] | ||
| inSubgroup(point: [bigint, bigint]): boolean | ||
| inCurve(point: [bigint, bigint]): boolean | ||
| }> | ||
|
|
||
| /** | ||
| * Build EdDSA operations | ||
| */ | ||
| export function buildEddsa(): Promise<{ | ||
| F: any | ||
| prv2pub(privateKey: Uint8Array): [bigint, bigint] | ||
| sign(privateKey: Uint8Array, message: bigint): { R8: [bigint, bigint], S: bigint } | ||
| verify(message: bigint, signature: { R8: [bigint, bigint], S: bigint }, publicKey: [bigint, bigint]): boolean | ||
| }> | ||
|
|
||
| /** | ||
| * Build MiMC sponge hasher | ||
| */ | ||
| export function buildMimcSponge(): Promise<{ | ||
| F: any | ||
| hash(left: bigint, right: bigint, key: bigint): bigint | ||
| multiHash(arr: bigint[], key?: bigint, numOutputs?: number): bigint[] | ||
| }> | ||
| } |
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Check circomlibjs version and usage patterns
echo "=== Checking circomlibjs version in package.json ==="
if [ -f package.json ]; then
jq '.dependencies.circomlibjs // .devDependencies.circomlibjs // "not found"' package.json
else
echo "package.json not found"
fi
echo -e "\n=== Finding circomlibjs imports and usage ==="
rg -n "circomlibjs" --type=ts --type=js -A 3 2>/dev/null | head -50
echo -e "\n=== Checking for test files using circomlibjs ==="
fd -e test.ts -e spec.ts -e test.js -e spec.js 2>/dev/null | while read f; do
if rg -l "circomlibjs" "$f" >/dev/null 2>&1; then
echo "Found: $f"
fi
done
echo -e "\n=== Checking if circomlibjs is listed in package.json at all ==="
grep -A 20 '"dependencies"' package.json 2>/dev/null | grep -i circom || echo "Not found in dependencies"
grep -A 20 '"devDependencies"' package.json 2>/dev/null | grep -i circom || echo "Not found in devDependencies"Repository: kynesyslabs/node
Length of output: 1088
🌐 Web query:
circomlibjs npm package API buildPoseidon buildBabyjub buildEddsa buildMimcSponge official documentation
💡 Result:
Short answer — official docs and source:
-
circomlibjs (exports including buildPoseidon, buildBabyjub, buildEddsa, buildMimcSponge) — source + README on the iden3 GitHub repo. See the package repo for the exact API and source code: https://github.com/iden3/circomlibjs. [1]
-
npm package page (package metadata / install): https://www.npmjs.com/package/circomlibjs. [2]
-
Circom project docs (context about circomlib / circomlibjs usage): https://docs.circom.io/ (see circomlib/circomlibjs sections). [3]
If you want, I can open the repo and copy the exact exported function signatures (from src/index.js and relevant files) and give small usage examples with line citations. Which would you prefer?
🏁 Script executed:
#!/bin/bash
# Try to find circomlibjs in node_modules to inspect actual API
echo "=== Checking for circomlibjs in node_modules ==="
if [ -d "node_modules/circomlibjs" ]; then
echo "Found circomlibjs package"
echo -e "\n=== Package.json of circomlibjs ==="
cat node_modules/circomlibjs/package.json | jq '.main, .exports, .version' 2>/dev/null || echo "Could not read package.json"
echo -e "\n=== Main entry point (index.js) ==="
if [ -f "node_modules/circomlibjs/src/index.js" ]; then
head -100 node_modules/circomlibjs/src/index.js
elif [ -f "node_modules/circomlibjs/index.js" ]; then
head -100 node_modules/circomlibjs/index.js
else
echo "Could not find index.js"
ls -la node_modules/circomlibjs/ | head -20
fi
else
echo "circomlibjs not found in node_modules"
echo -e "\n=== Checking dependencies in package.json ==="
jq '.dependencies | keys[]' package.json 2>/dev/null | grep -i circom || echo "No circom-related dependencies"
fi
Repository: kynesyslabs/node
Length of output: 219
🌐 Web query:
iden3 circomlibjs github source code index.js buildPoseidon buildBabyjub buildEddsa export
💡 Result:
On the iden3/circomlibjs package the library exposes async builders for the primitives — e.g. buildPoseidon, buildBabyjub and buildEddsa — from its entry module (index). You can import/use them like:
- const { buildEddsa, buildPoseidon, buildBabyjub } = require('circomlibjs'); then await buildEddsa(), await buildPoseidon(n), etc. [1][2]
Sources:
- iden3/circomlibjs repository (package entry and file tree showing the library). [1]
- Example usage showing buildEddsa (EdDSA / Poseidon) from circomlibjs. [2]
References:
[1] https://github.com/iden3/circomlibjs
[2] https://gist.github.com/neuodev/df04d1ff944d890d6d6a549a43d436e1
🏁 Script executed:
#!/bin/bash
# Fetch circomlibjs source from GitHub to verify API
echo "=== Fetching circomlibjs index.js from GitHub ==="
curl -s https://raw.githubusercontent.com/iden3/circomlibjs/main/src/index.js | head -150
echo -e "\n=== Checking for type definitions in circomlibjs ==="
curl -s https://raw.githubusercontent.com/iden3/circomlibjs/main/package.json | jq '.main, .exports, .types, .version' 2>/dev/null
echo -e "\n=== Checking available exported functions in circomlibjs ==="
curl -s https://raw.githubusercontent.com/iden3/circomlibjs/main/src/index.js | grep -E "^(async function|function|export|module.exports)" | head -20
Repository: kynesyslabs/node
Length of output: 332
🌐 Web query:
site:github.com iden3/circomlibjs buildPoseidonReference buildMimcSponge export functions
💡 Result:
Yes — circomlibjs exposes Poseidon and MiMC builder functions. See the repository source and built bundle which include the Poseidon builders (e.g. buildPoseidon / poseidon utilities) and MiMC builders (buildMimc* / MimcSponge). [1][2]
Sources:
- iden3/circomlibjs repository (src listing). [1]
- Built bundle / mirror showing exported MiMC and Poseidon builders (contains buildMimc7 / MiMC and Poseidon constants). [2]
References:
[1] https://github.com/iden3/circomlibjs
[2] mirror of circomlibjs (built main.cjs) showing buildMimc and Poseidon code.
🏁 Script executed:
#!/bin/bash
# Fetch the actual built bundle to see exported functions
echo "=== Fetching built main.js from circomlibjs npm CDN ==="
curl -s "https://unpkg.com/circomlibjs@0.1.8/main.js" | head -200
echo -e "\n=== Checking for specific exported functions ==="
curl -s "https://unpkg.com/circomlibjs@0.1.8/main.js" | grep -E "(buildPoseidon|buildBabyjub|buildEddsa|buildMimic)" | head -20
echo -e "\n=== Looking for function definitions ==="
curl -s "https://unpkg.com/circomlibjs@0.1.8/main.js" | grep -oE "(buildPoseidon[a-zA-Z]*|buildBabyjub|buildEddsa|buildMimic[a-zA-Z0-9]*)" | sort -uRepository: kynesyslabs/node
Length of output: 251
🌐 Web query:
circomlibjs 0.1.7 0.1.8 buildPoseidonReference buildMimcSponge actual exported functions list
💡 Result:
Short answer:
-
circomlibjs v0.1.7 is the published latest release; v0.1.8 is not published on npm. [1][2]
-
The package root exports (the top-level functions you can import from 'circomlibjs') include the builder functions such as:
- buildBabyJub
- buildEddsa
- buildPedersenHash
- buildPoseidon
- buildPoseidonReference
- poseidon_gencontract
- buildMimc7
- buildMimcSponge
- …and other circom helper/build functions (these are the same top-level exports shown in the repo / package). [2][3]
-
What each builder returns (the “actual exported functions” on the objects returned by the builders):
- buildMimc* (e.g. buildMimc7 / buildMimcSponge) produce instances exposing at least:
- hash(...)
- multiHash(...)
- (and the internal field F / constants accessors used by the implementation). [3]
- buildPoseidonReference / buildPoseidon produce the Poseidon primitives used by the package (poseidon contract generator + JS hash functions / constants). The repo shows poseidon-related exports (poseidon constants, generator) alongside buildPoseidonReference. [3]
Sources:
- npm package page for circomlibjs (v0.1.7 published; no v0.1.8). [1]
- circomlibjs repository (top-level exports and implementation of Mimc/Poseidon builders). [2]
If you want, I can fetch the exact top-level export list and the full method names returned by buildPoseidonReference and buildMimcSponge by opening the packed build file (main.cjs) for v0.1.7 and paste the exact identifiers. Which do you prefer?
🏁 Script executed:
#!/bin/bash
# Fetch circomlibjs v0.1.7 from npm CDN
echo "=== Fetching circomlibjs v0.1.7 main.js ==="
curl -s "https://unpkg.com/circomlibjs@0.1.7/main.js" 2>&1 | head -300 | tail -200
echo -e "\n=== Searching for exported function names ==="
curl -s "https://unpkg.com/circomlibjs@0.1.7/main.js" 2>&1 | grep -oE "build[A-Za-z0-9]+" | sort -u
echo -e "\n=== Checking for case sensitivity in exports ==="
curl -s "https://unpkg.com/circomlibjs@0.1.7/main.js" 2>&1 | grep -E "(buildBaby|buildBabyJ)" | head -10Repository: kynesyslabs/node
Length of output: 1456
🌐 Web query:
circomlibjs v0.1.7 npm "buildBabyjub" OR "buildBabyJub" exact export name case sensitive
💡 Result:
The package exports buildBabyJub (capital "J") — not buildBabyjub. [1][2]
Sources:
[1] circomlibjs build/main.cjs (shows async function buildBabyJub).
[2] circomlibjs@0.1.7 package files listing (build/ exists for v0.1.7).
🏁 Script executed:
#!/bin/bash
# Fetch implementation files to verify exact return types and signatures
echo "=== Checking Poseidon implementation ==="
curl -s "https://raw.githubusercontent.com/iden3/circomlibjs/main/src/poseidon_reference.js" 2>&1 | head -80
echo -e "\n=== Checking baby jubjub implementation ==="
curl -s "https://raw.githubusercontent.com/iden3/circomlibjs/main/src/babyjub.js" 2>&1 | head -80
echo -e "\n=== Checking EdDSA implementation ==="
curl -s "https://raw.githubusercontent.com/iden3/circomlibjs/main/src/eddsa.js" 2>&1 | head -80
echo -e "\n=== Checking MiMC Sponge implementation ==="
curl -s "https://raw.githubusercontent.com/iden3/circomlibjs/main/src/mimcsponge.js" 2>&1 | head -80Repository: kynesyslabs/node
Length of output: 10386
🌐 Web query:
site:github.com/iden3/circomlibjs "F.toObject" OR "F.toString" Poseidon interface structure
💡 Result:
Quick answer: I couldn't find any occurrences of F.toObject or F.toString inside the iden3/circomlibjs repository. circomlibjs's Poseidon code returns/uses field elements via the underlying field/curve utilities (ffjavascript / snarkjs-style curve objects) which expose toObject/toString-like conversions (e.g. G1.toObject, Fr/curve field utilities) — those conversion methods live in the curve/ffjavascript code, not directly on a Poseidon interface in circomlibjs. See the circomlibjs repo and examples in snarkjs showing toObject usage for curve/field objects. [1][2]
Sources
- iden3 / circomlibjs (repo root). [1]
- snarkjs build code showing curve/Fr/G1.toObject usage (example of where .toObject is used for field/curve objects). [2]
🏁 Script executed:
#!/bin/bash
# Verify MimcSponge multiHash and EdDSA verify signatures
echo "=== Checking MimcSponge multiHash return type ==="
curl -s "https://raw.githubusercontent.com/iden3/circomlibjs/main/src/mimcsponge.js" 2>&1 | grep -A 30 "multiHash"
echo -e "\n=== Checking EdDSA verify method ==="
curl -s "https://raw.githubusercontent.com/iden3/circomlibjs/main/src/eddsa.js" 2>&1 | grep -A 15 "verify"
echo -e "\n=== Checking Poseidon return type structure ==="
curl -s "https://raw.githubusercontent.com/iden3/circomlibjs/main/src/poseidon_reference.js" 2>&1 | tail -20
echo -e "\n=== Checking Poseidon.F capabilities from ffjavascript ==="
curl -s "https://raw.githubusercontent.com/iden3/circomlibjs/main/src/poseidon_wasm.js" 2>&1 | head -50Repository: kynesyslabs/node
Length of output: 5708
🌐 Web query:
ffjavascript Fr field object toObject toString methods API documentation
💡 Result:
See the ffjavascript source for the definitive API. The Finite Field "Fr" / F1Field code and its object/string helpers are in the iden3/ffjavascript repository on GitHub (src) and the published npm package.
- Repository (source/implementation): iden3/ffjavascript — check the src folder for the F1Field / Fr implementation and methods. [1]
- npm package (package info / versions): ffjavascript on npm. [2]
If you want, I can:
- open the exact source file and quote the toObject/toString implementations, or
- paste short usage examples showing how to call Fr.toObject() and Fr.toString().
Which would you prefer?
Sources:
[1] https://github.com/iden3/ffjavascript
[2] https://www.npmjs.com/package/ffjavascript
Fix type declaration mismatches with circomlibjs v0.1.7 API.

- buildMimcSponge.hash() return type is incorrect: The declaration specifies hash(left: bigint, right: bigint, key: bigint): bigint, but the actual implementation returns an {xL, xR} object, not a bigint. The multiHash() method correctly returns bigint[] or bigint.
- Poseidon return type doesn't match both implementations: The declaration specifies (inputs: bigint[]): Uint8Array, but poseidon_reference.js returns field elements (not Uint8Array), while poseidon_wasm.js returns Uint8Array. The interface should account for both return types or specify which implementation is being used.
- buildBabyjub casing doesn't match the published export: per the verification above, circomlibjs v0.1.7 exports buildBabyJub (capital "J"), so the declared buildBabyjub will not resolve against the real module.
- EdDSA interface is oversimplified: The declaration declares a generic verify() method, but the actual library provides four specific variants: verifyPedersen(), verifyMiMC(), verifyPoseidon(), and verifyMiMCSponge(). Update the interface to reflect the actual available methods or provide a union type for the signature parameter.
- Poseidon.F property documentation is misleading: While toObject() and toString() methods do exist (from ffjavascript's Fr object), the declaration should clarify these come from the underlying field implementation, not from circomlibjs itself.
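A sketch of what corrected declarations could look like under the findings above; this illustrates the review's points and is not verified against the package's own typings:

// Hypothetical corrected declarations, assuming circomlibjs v0.1.7 behavior
// as described in the verification chain above.
declare module "circomlibjs" {
    // The reference build returns field elements, the wasm build Uint8Array,
    // so the return type is left open here.
    type PoseidonOutput = Uint8Array | unknown

    interface Poseidon {
        (inputs: bigint[]): PoseidonOutput
        // F is the underlying ffjavascript field, not a circomlibjs type
        F: {
            toObject(element: PoseidonOutput): bigint
            toString(element: PoseidonOutput): string
        }
    }

    export function buildPoseidon(): Promise<Poseidon>
    export function buildPoseidonReference(): Promise<Poseidon>

    // Note the capital "J", matching the published export name
    export function buildBabyJub(): Promise<unknown>

    export function buildMimcSponge(): Promise<{
        F: unknown
        // hash() returns the sponge state, not a single bigint
        hash(left: bigint, right: bigint, key: bigint): { xL: bigint; xR: bigint }
        multiHash(arr: bigint[], key?: bigint, numOutputs?: number): bigint[] | bigint
    }>
}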
/*
 * L2PS Batch Circuit - 10 transactions
 * ~35K constraints → pot16 (64MB)
 *
 * For batches with 6-10 transactions.
 * Unused slots filled with zero-amount transfers.
 */
Incorrect constraint count in comment.
The comment states "~35K constraints → pot16" but per the README, batch_10 has ~74K constraints and requires pot17. The comment appears to be copied from batch_5.
/*
* L2PS Batch Circuit - 10 transactions
- * ~35K constraints → pot16 (64MB)
+ * ~74K constraints → pot17 (128MB)
*
* For batches with 6-10 transactions.
* Unused slots filled with zero-amount transfers.
*/📝 Committable suggestion
‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.
| /* | |
| * L2PS Batch Circuit - 10 transactions | |
| * ~35K constraints → pot16 (64MB) | |
| * | |
| * For batches with 6-10 transactions. | |
| * Unused slots filled with zero-amount transfers. | |
| */ | |
| /* | |
| * L2PS Batch Circuit - 10 transactions | |
| * ~74K constraints → pot17 (128MB) | |
| * | |
| * For batches with 6-10 transactions. | |
| * Unused slots filled with zero-amount transfers. | |
| */ |
🤖 Prompt for AI Agents
In src/libs/l2ps/zk/circuits/l2ps_batch_10.circom around lines 5 to 11, the
header comment incorrectly lists "~35K constraints → pot16" (copied from
batch_5); update the comment to reflect the correct constraint count and
power-of-two transcript size for batch_10 (≈74K constraints → pot17) and ensure
remaining descriptive text still indicates this circuit is for batches with 6–10
transactions and that unused slots are zero-amount transfers.
…lass; update L2PSBatchAggregator for improved ZK proof validation
Actionable comments posted: 5
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/libs/blockchain/chain.ts (1)

191-193: Inconsistent null handling: getTransactionFromHash may throw on null input.

Unlike the established pattern in getTxByHash (lines 76-92), this method calls Transaction.fromRawTransaction directly without checking if the repository returned null. If findOneBy returns null, fromRawTransaction will throw an error due to the defensive check at line 499 in transaction.ts.

Update the method to align with the getTxByHash pattern:

  static async getTransactionFromHash(hash: string): Promise<Transaction | null> {
+     const rawTx = await this.transactions.findOneBy({ hash: ILike(hash) })
+
+     if (!rawTx) {
+         return null
+     }
+
-     return Transaction.fromRawTransaction(
-         await this.transactions.findOneBy({ hash: ILike(hash) }),
-     )
+     return Transaction.fromRawTransaction(rawTx)
  }

This requires:

- Updating the return type from Promise<Transaction> to Promise<Transaction | null>
- Adding a null check before calling fromRawTransaction
- Updating the call site in src/libs/blockchain/routines/subOperations.ts (line 38) to handle null returns
♻️ Duplicate comments (2)
src/libs/l2ps/L2PSBatchAggregator.ts (2)
736-748: Critical: getSharedState must be called as a function.

On line 746, getSharedState is not invoked. This will fail at runtime when attempting to access the .l2psJoinedUids property on the function itself rather than the state object.

This issue was flagged in past review comments but remains unresolved.

Based on learnings, past review comment from coderabbitai flagged this exact issue. Apply this diff:

  getStatus(): {
      isRunning: boolean;
      isAggregating: boolean;
      intervalMs: number;
      joinedL2PSCount: number;
  } {
+     const sharedState = getSharedState()
      return {
          isRunning: this.isRunning,
          isAggregating: this.isAggregating,
          intervalMs: this.AGGREGATION_INTERVAL,
-         joinedL2PSCount: getSharedState.l2psJoinedUids?.length || 0,
+         joinedL2PSCount: sharedState.l2psJoinedUids?.length || 0,
      }
  }

638-638: Set meaningful reference_block value.

The comment states "Will be set by mempool" but according to past reviews, the mempool implementation passes reference_block through to persistence without recalculating it. Storing 0 may bypass anchoring/ordering semantics.

Consider setting a meaningful reference block:

- reference_block: 0, // Will be set by mempool
+ reference_block: sharedState.lastKnownBlock || 0, // Anchor to current block height

If lastKnownBlock is not available in SharedState, query the latest block height from the blockchain state before constructing this transaction, or document why 0 is intentionally correct for batch transactions.

Based on learnings, past review comment from coderabbitai indicates this needs to be set to a proper value.
🧹 Nitpick comments (3)
src/libs/l2ps/L2PSBatchAggregator.ts (3)
93-94: Remove unused batchNonceCounter property.

This property is declared but never used in the implementation. The actual nonce management correctly uses getLastNonceFromStorage() and saveNonceToStorage() methods. This dead code should be removed to avoid confusion.

Apply this diff:

- /** Persistent nonce counter for batch transactions */
- private readonly batchNonceCounter: number = 0

399-410: Add runtime type guard for privateKey.

The code assumes sharedState.keypair.privateKey is a Uint8Array without verification. If the actual type is a string or another format, the Buffer.from() cast will fail or produce incorrect results.

Apply this diff to add a type check:

  if (!sharedState.keypair?.privateKey) {
      throw new Error("[L2PS Batch Aggregator] Node keypair not available for HMAC generation")
  }

- const hmacKey = Buffer.from(sharedState.keypair.privateKey as Uint8Array)
+ const privateKey = sharedState.keypair.privateKey
+ if (!(privateKey instanceof Uint8Array)) {
+     throw new Error("[L2PS Batch Aggregator] Node private key must be Uint8Array format")
+ }
+
+ const hmacKey = Buffer.from(privateKey)
      .toString("hex")
      .slice(0, 64)

586-592: Add runtime type guard for publicKey.

The code assumes sharedState.keypair.publicKey is a Uint8Array without verification. If sharedState.publicKeyHex is already available and normalized, prefer reusing it. Otherwise, add a type check.

Based on learnings, past reviews flagged this assumption. Apply one of these approaches:

Option 1 (preferred if publicKeyHex is available):

- // Get node's public key as hex string for 'from' field
- const nodeIdentityHex = uint8ArrayToHex(sharedState.keypair.publicKey as Uint8Array)
+ // Get node's public key as hex string for 'from' field
+ const nodeIdentityHex = sharedState.publicKeyHex
+ if (!nodeIdentityHex) {
+     log.error("[L2PS Batch Aggregator] Node public key hex not available")
+     return false
+ }

Option 2 (if publicKeyHex is not available):

- // Get node's public key as hex string for 'from' field
- const nodeIdentityHex = uint8ArrayToHex(sharedState.keypair.publicKey as Uint8Array)
+ // Get node's public key as hex string for 'from' field
+ const publicKey = sharedState.keypair.publicKey
+ if (!(publicKey instanceof Uint8Array)) {
+     log.error("[L2PS Batch Aggregator] Node public key must be Uint8Array format")
+     return false
+ }
+ const nodeIdentityHex = uint8ArrayToHex(publicKey)
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (3)
- src/libs/blockchain/chain.ts (1 hunks)
- src/libs/blockchain/transaction.ts (1 hunks)
- src/libs/l2ps/L2PSBatchAggregator.ts (1 hunks)
🧰 Additional context used
🧬 Code graph analysis (1)
src/libs/blockchain/chain.ts (1)
src/libs/blockchain/transaction.ts (2)
hash (106-114)
Transaction (44-538)
🔇 Additional comments (1)
src/libs/blockchain/transaction.ts (1)
499-501: Good defensive check for null/undefined input.

The early validation prevents potential null pointer exceptions during transaction reconstruction. This aligns well with the updated null-handling pattern in chain.ts.
  static async getTxByHash(hash: string): Promise<Transaction | null> {
      try {
-         return Transaction.fromRawTransaction(
-             await this.transactions.findOneBy({
-                 hash: ILike(hash),
-             }),
-         )
+         const rawTx = await this.transactions.findOneBy({
+             hash: ILike(hash),
+         })
+
+         if (!rawTx) {
+             return null
+         }
+
+         return Transaction.fromRawTransaction(rawTx)
      } catch (error) {
          console.log("[ChainDB] [ ERROR ]: " + JSON.stringify(error))
          console.error(error)
-         throw error // It does not crash the node, as it is caught by the endpoint handler
+         return null
      }
  }
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Search for getTxByHash usage to ensure they handle null returns
rg -nP --type=ts -C3 '\.getTxByHash\s*\(' -g '!src/libs/blockchain/chain.ts'
Repository: kynesyslabs/node
Length of output: 1036
🏁 Script executed:
sed -n '191,193p' src/libs/blockchain/chain.ts
Repository: kynesyslabs/node
Length of output: 186
🏁 Script executed:
sed -n '185,200p' src/libs/blockchain/chain.ts
Repository: kynesyslabs/node
Length of output: 569
Improved null safety, but note the breaking change, error suppression, and critical inconsistency.

The explicit return type Promise<Transaction | null> and the null guard before calling fromRawTransaction improve safety and align with the defensive check added in transaction.ts. However, be aware:

- Breaking change: Callers expecting Promise<Transaction> must now handle null returns. Review of call sites shows:
  - scripts/send-l2-batch.ts:267 does not explicitly handle null returns (logs status directly)
  - src/libs/network/manageNodeCall.ts:141 handles it implicitly via try-catch but inconsistently
- Error suppression: Returning null on catch (line 90) may hide underlying issues (e.g., database connection failures, malformed data). Consider whether callers need to distinguish "transaction not found" from "error during retrieval."
- Critical inconsistency: getTransactionFromHash (lines 191-193) still calls fromRawTransaction directly on the findOneBy result without null checking, unlike getTxByHash. This will throw if the transaction is not found, creating an inconsistent API that will cause runtime errors.
🤖 Prompt for AI Agents
In src/libs/blockchain/chain.ts around lines 76 to 92: the method now returns
Promise<Transaction | null> but swallows errors by returning null in the catch,
and another method (getTransactionFromHash) still calls fromRawTransaction
without a null check causing inconsistent behavior; change getTxByHash to either
rethrow after logging (or return a discriminated error) instead of silently
returning null on exceptions so callers can distinguish "not found" from
"failure", or keep returning null only for the not-found path while logging and
rethrowing unexpected errors; then make getTransactionFromHash follow the same
contract (perform the null guard and return/throw consistently), and update
callers (e.g., scripts/send-l2-batch.ts line ~267 and
src/libs/network/manageNodeCall.ts line ~141) to handle the chosen contract
(explicit null checks or try/catch) so behavior is consistent and errors are not
suppressed.
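A minimal sketch of the contract the prompt describes, where null means only "not found" and real failures are rethrown; the types are simplified and findRawTx/fromRaw are hypothetical stand-ins for the real entities:

// Hypothetical sketch: callers can distinguish "not found" (null)
// from "lookup failed" (thrown error).
async function getTxByHashSketch(
    findRawTx: (hash: string) => Promise<object | null>,
    fromRaw: (raw: object) => object,
    hash: string,
): Promise<object | null> {
    let raw: object | null
    try {
        raw = await findRawTx(hash)
    } catch (error) {
        console.error("[ChainDB] lookup failed:", error)
        throw error // propagate real failures
    }
    if (!raw) {
        return null // not found
    }
    return fromRaw(raw)
}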
    l2psUid: string,
    transactions: L2PSMempoolTx[],
): Promise<L2PSBatchPayload> {
    const sharedState = getSharedState
Critical: getSharedState must be called as a function.
getSharedState is imported as a function but not invoked here. This will fail at runtime when the code attempts to access sharedState.keypair on line 399.
Apply this diff:
- const sharedState = getSharedState
+ const sharedState = getSharedState()
🤖 Prompt for AI Agents
In src/libs/l2ps/L2PSBatchAggregator.ts around line 376, the code assigns
getSharedState without invoking it; change the assignment to call the function
(i.e., use getSharedState()) so sharedState is the returned object and
subsequent access like sharedState.keypair will work; if desired, also add a
defensive null/undefined check after calling to throw or handle an error if the
function returns no state.
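Since the same mistake recurs at several call sites, a small sketch of a guard helper may help; this is hypothetical, reusing the getSharedState signature shown in the code graph, and makes the invocation explicit:

// Hypothetical helper: always invoke the getter and fail fast when the
// shared state is missing, rather than passing the function itself around.
interface SharedStateLike {
    keypair?: { privateKey: Uint8Array; publicKey: Uint8Array }
    l2psBatchNonce?: number
}

function requireSharedState(getSharedState: () => SharedStateLike): SharedStateLike {
    const sharedState = getSharedState() // the () call is the whole fix
    if (!sharedState) {
        throw new Error("[L2PS] Shared state unavailable")
    }
    return sharedState
}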
private async getLastNonceFromStorage(): Promise<number> {
    try {
        const sharedState = getSharedState
        // Use shared state to persist nonce across the session
        // This survives within the same process lifetime
        if (sharedState.l2psBatchNonce) {
            return sharedState.l2psBatchNonce
        }
        return 0
    } catch {
        return 0
    }
}
Critical: getSharedState must be called as a function.
Same issue as line 376 - getSharedState is not invoked. This will fail at runtime when accessing sharedState.l2psBatchNonce on line 533.
Apply this diff:
private async getLastNonceFromStorage(): Promise<number> {
try {
- const sharedState = getSharedState
+ const sharedState = getSharedState()
          // Use shared state to persist nonce across the session

🤖 Prompt for AI Agents
In src/libs/l2ps/L2PSBatchAggregator.ts around lines 528 to 540, getSharedState
is referenced but not invoked; change the code to call getSharedState() and use
the returned object when checking sharedState.l2psBatchNonce so the property
access occurs on the function result (wrap the call in a const sharedState =
getSharedState();), preserving the existing try/catch and returning 0 on error.
private async saveNonceToStorage(nonce: number): Promise<void> {
    try {
        const sharedState = getSharedState
        // Store in shared state for persistence
        sharedState.l2psBatchNonce = nonce
    } catch (error) {
        const errorMessage = error instanceof Error ? error.message : 'Unknown error'
        log.warning(`[L2PS Batch Aggregator] Failed to persist nonce: ${errorMessage}`)
    }
}
Critical: getSharedState must be called as a function.
Same issue - getSharedState is not invoked on line 547.
Apply this diff:
private async saveNonceToStorage(nonce: number): Promise<void> {
try {
- const sharedState = getSharedState
+ const sharedState = getSharedState()
          // Store in shared state for persistence

🤖 Prompt for AI Agents
In src/libs/l2ps/L2PSBatchAggregator.ts around lines 545 to 554, the code
assigns getSharedState instead of calling it; change the assignment to call the
function (const sharedState = getSharedState()) so you obtain the shared state
object, then store nonce as before (sharedState.l2psBatchNonce = nonce); keep
the existing try/catch and logging behavior, and optionally add a null/undefined
guard if getSharedState() can return no value.
 */
private async submitBatchToMempool(batchPayload: L2PSBatchPayload): Promise<boolean> {
    try {
        const sharedState = getSharedState
Critical: getSharedState must be called as a function.
Same issue - getSharedState is not invoked. This will fail at runtime when accessing sharedState.keypair on line 586.
Apply this diff:
- const sharedState = getSharedState
+ const sharedState = getSharedState()
🤖 Prompt for AI Agents
In src/libs/l2ps/L2PSBatchAggregator.ts around line 568, the code assigns
getSharedState instead of calling it, so sharedState is a function reference and
accessing sharedState.keypair (line 586) will fail; change the assignment to
invoke the function (call getSharedState()) so sharedState holds the returned
object, preserving any async/side-effect expectations of the getter.
…nd convert state root and volume to BigInt
Actionable comments posted: 2
♻️ Duplicate comments (4)
src/libs/l2ps/L2PSBatchAggregator.ts (4)
372-424: Critical: getSharedState isn't invoked; sharedState.keypair will crash.

This is the same "must call getSharedState()" issue as prior reviews.

Also: current HMAC key usage passes a hex string to createHmac, which uses UTF-8 bytes of that string (likely unintended). Prefer a Buffer key.

- const sharedState = getSharedState
+ const sharedState = getSharedState()
+ if (!sharedState) {
+     throw new Error("[L2PS Batch Aggregator] Shared state unavailable")
+ }
  ...
- const hmacKey = Buffer.from(sharedState.keypair.privateKey as Uint8Array)
-     .toString("hex")
-     .slice(0, 64)
+ const hmacKey = Buffer.from(sharedState.keypair.privateKey as Uint8Array)

  const hmacData = `${l2psUid}:${encryptedBatch}:${batchHash}:${transactionHashes.join(",")}`
  const authenticationTag = crypto
      .createHmac("sha256", hmacKey)
      .update(hmacData)
      .digest("hex")

528-554: Critical: nonce persistence uses getSharedState without calling it.

This is the same "must call getSharedState()" issue as prior reviews.

  private async getLastNonceFromStorage(): Promise<number> {
      try {
-         const sharedState = getSharedState
+         const sharedState = getSharedState()
          // Use shared state to persist nonce across the session
  ...
  private async saveNonceToStorage(nonce: number): Promise<void> {
      try {
-         const sharedState = getSharedState
+         const sharedState = getSharedState()
          // Store in shared state for persistence
          sharedState.l2psBatchNonce = nonce
      } catch (error) {

566-669: Critical: submission path uses getSharedState without calling it.

This is the same "must call getSharedState()" issue as prior reviews.

Also, two previously-raised robustness concerns still apply here:

- reference_block: 0 may get persisted as-is if mempool doesn't override it.
- publicKey as Uint8Array is a brittle cast; prefer a normalized hex field (e.g., sharedState.publicKeyHex) or runtime conversion.

  private async submitBatchToMempool(batchPayload: L2PSBatchPayload): Promise<boolean> {
      try {
-         const sharedState = getSharedState
+         const sharedState = getSharedState()
  ...
-         const nodeIdentityHex = uint8ArrayToHex(sharedState.keypair.publicKey as Uint8Array)
+         const nodeIdentityHex =
+             (sharedState as any).publicKeyHex ??
+             uint8ArrayToHex(sharedState.keypair.publicKey as Uint8Array)
  ...
          const batchTransaction = {
              ...
-             reference_block: 0, // Will be set by mempool
+             // TODO: set a real reference_block (or ensure Mempool.addTransaction overwrites it)
+             reference_block: 0,

744-756: Critical: getStatus() uses getSharedState as a value, not getSharedState().

This is the same "must call getSharedState()" issue as prior reviews.

  getStatus(): { ... } {
+     const sharedState = getSharedState()
      return {
          isRunning: this.isRunning,
          isAggregating: this.isAggregating,
          intervalMs: this.AGGREGATION_INTERVAL,
-         joinedL2PSCount: getSharedState.l2psJoinedUids?.length || 0,
+         joinedL2PSCount: sharedState.l2psJoinedUids?.length || 0,
      }
  }
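To make the HMAC point above concrete, a small self-contained illustration: Node's createHmac accepts both a Buffer and a string key, but a hex string is keyed by its UTF-8 bytes, so the two tags below differ:

import crypto from "node:crypto"

const rawKey = crypto.randomBytes(32)

// Keyed by the 32 raw bytes.
const tagFromBuffer = crypto
    .createHmac("sha256", rawKey)
    .update("data")
    .digest("hex")

// Keyed by the UTF-8 bytes of the 64-character hex string, a different key.
const tagFromHexString = crypto
    .createHmac("sha256", rawKey.toString("hex"))
    .update("data")
    .digest("hex")

console.log(tagFromBuffer === tagFromHexString) // false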
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
- src/libs/l2ps/L2PSBatchAggregator.ts (1 hunks)
/** Cleanup interval - remove batched transactions older than this (1 hour) */
private readonly CLEANUP_AGE_MS = 5 * 60 * 1000 // 5 minutes - confirmed txs can be cleaned up quickly
Fix doc mismatch: cleanup is “5 minutes”, not “1 hour”.
Line 87 says “(1 hour)” but CLEANUP_AGE_MS is 5 * 60 * 1000.
- /** Cleanup interval - remove batched transactions older than this (1 hour) */
+ /** Cleanup interval - remove batched transactions older than this (5 minutes) */
  private readonly CLEANUP_AGE_MS = 5 * 60 * 1000 // 5 minutes - confirmed txs can be cleaned up quickly

🤖 Prompt for AI Agents
In src/libs/l2ps/L2PSBatchAggregator.ts around lines 87-88, the inline comment
states "(1 hour)" but the CLEANUP_AGE_MS constant is set to 5 * 60 * 1000 (5
minutes); update the comment to reflect "5 minutes" (or the intended duration if
you meant 1 hour then change the constant to 60 * 60 * 1000) so the
documentation and value match.
const zkTransactions = transactions.map((tx) => {
    const amount = BigInt((tx.encrypted_tx as any)?.content?.amount || 0)

    // Neutral before/after while preserving the invariant:
    // senderAfter = senderBefore - amount, receiverAfter = receiverBefore + amount.
    const senderBefore = amount
    const senderAfter = senderBefore - amount
    const receiverBefore = 0n
    const receiverAfter = receiverBefore + amount
Guard BigInt(...) conversions to avoid throwing and killing proof generation.
If (tx.encrypted_tx as any)?.content?.amount is non-integer-ish (e.g., "10.5", "1e3", "0x..."), BigInt() throws.
const zkTransactions = transactions.map((tx) => {
- const amount = BigInt((tx.encrypted_tx as any)?.content?.amount || 0)
+ let amount = 0n
+ try {
+ const raw = (tx.encrypted_tx as any)?.content?.amount
+ amount = raw === undefined || raw === null ? 0n : BigInt(raw)
+ } catch {
+ amount = 0n
+ }

Also applies to: 467-468
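The guard above could also be factored into a small helper; a minimal sketch (toSafeBigInt is a hypothetical name, not part of the codebase):

// Hypothetical helper: returns 0n for missing or unparseable amounts
// instead of letting BigInt() throw mid-aggregation.
function toSafeBigInt(value: unknown): bigint {
    if (value === undefined || value === null) return 0n
    try {
        return BigInt(value as string | number | bigint)
    } catch {
        return 0n
    }
}

console.log(toSafeBigInt("42"))   // 42n
console.log(toSafeBigInt("10.5")) // 0n, since BigInt("10.5") would throw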
…nce code readability and maintainability
Actionable comments posted: 1
Fix all issues with AI Agents 🤖
In @src/libs/l2ps/zk/scripts/setup_all_batches.sh:
- Around line 28-41: In download_ptau(), the file-size check uses GNU-only stat
(-c%s) which breaks on macOS/BSD; replace that with a POSIX-safe command such as
using wc -c (e.g., [[ $(wc -c < "$PTAU_DIR/$file") -lt 1000000 ]]) or detect
platform and use stat -f%z on macOS, keeping the existing ! -f guard and error
handling; update the conditional in download_ptau to use the portable size check
so the download logic works across Linux and macOS.
♻️ Duplicate comments (10)
src/libs/l2ps/L2PSProofManager.ts (2)
228-234: Incomplete validation: nonce edits also require account field.

The comment states "Validate each GCR edit has required fields," but Line 230 only enforces the account field for the balance type. Nonce operations also require an account to be specified. Without this check, malformed nonce edits could pass validation and cause downstream errors during application.

🔎 Recommended fix: Enforce account field for both balance and nonce

  // Validate each GCR edit has required fields
  for (const edit of proof.gcr_edits) {
-     if (!edit.type || (edit.type === 'balance' && !('account' in edit))) {
-         log.warning(`[L2PS ProofManager] Proof ${proof.id} has invalid GCR edit`)
+     if (!edit.type) {
+         log.warning(`[L2PS ProofManager] Proof ${proof.id} has GCR edit missing type`)
          return false
      }
+
+     // Balance and nonce edits require an account field
+     if ((edit.type === "balance" || edit.type === "nonce") && !("account" in edit)) {
+         log.warning(`[L2PS ProofManager] Proof ${proof.id} has ${edit.type} edit without account`)
+         return false
+     }
  }

195-212: Remove or actively warn about the unused blockNumber parameter.

The method accepts blockNumber but ignores it entirely, creating a misleading API. Callers may assume block-specific filtering is working when it returns all pending proofs instead.

🔎 Recommended fix: Remove the parameter until implemented

- /**
-  * Get pending proofs for a specific block
-  *
-  * @param blockNumber - Target block number
-  * @returns Array of proofs targeting this block
-  */
- static async getProofsForBlock(blockNumber: number): Promise<L2PSProof[]> {
+ /**
+  * Get all pending proofs (block-specific filtering not yet implemented)
+  *
+  * @returns Array of all pending proofs in creation order
+  */
+ static async getProofsForBlock(): Promise<L2PSProof[]> {
      const repo = await this.getRepo()
-     // FUTURE: Filter proofs by target_block_number when block-specific batching is implemented
-     // For now, returns all pending proofs in creation order (blockNumber reserved for future use)
+     // TODO: Add target_block_number field to L2PSProof entity and filter here
      return repo.find({
          where: { status: "pending" as L2PSProofStatus },
          order: { created_at: "ASC" }
      })
  }

Alternatively, if you must keep the signature for compatibility, throw or warn:

  static async getProofsForBlock(blockNumber: number): Promise<L2PSProof[]> {
      const repo = await this.getRepo()
+
+     if (blockNumber !== undefined) {
+         log.warning(`[L2PS ProofManager] getProofsForBlock: blockNumber parameter not yet implemented, returning all pending proofs`)
+     }

      // FUTURE: Filter proofs by target_block_number when block-specific batching is implemented
      // For now, returns all pending proofs in creation order (blockNumber reserved for future use)
      return repo.find({
          where: { status: "pending" as L2PSProofStatus },
          order: { created_at: "ASC" }
      })
  }

src/libs/l2ps/L2PSBatchAggregator.ts (8)
377-377: 🔴 CRITICAL: getSharedState must be called as a function (STILL UNFIXED).

Despite being marked as addressed in previous reviews, line 377 still assigns getSharedState without invoking it:

const sharedState = getSharedState // ❌ Missing ()

This will fail at runtime when code attempts to access sharedState.keypair on line 400.

🔎 Required fix

- const sharedState = getSharedState
+ const sharedState = getSharedState()

529-541: 🔴 CRITICAL: getSharedState must be called as a function (STILL UNFIXED).

Line 531 has the same unfixed issue:

const sharedState = getSharedState // ❌ Missing ()

This will fail when accessing sharedState.l2psBatchNonce on line 534.

🔎 Required fix

  private async getLastNonceFromStorage(): Promise<number> {
      try {
-         const sharedState = getSharedState
+         const sharedState = getSharedState()

546-555: 🔴 CRITICAL: getSharedState must be called as a function (STILL UNFIXED).

Line 548 has the same unfixed issue:

const sharedState = getSharedState // ❌ Missing ()

🔎 Required fix

  private async saveNonceToStorage(nonce: number): Promise<void> {
      try {
-         const sharedState = getSharedState
+         const sharedState = getSharedState()

567-598: 🔴 CRITICAL: getSharedState must be called as a function (STILL UNFIXED).

Line 569 has the same unfixed issue:

const sharedState = getSharedState // ❌ Missing ()

This will fail when accessing sharedState.keypair on line 595.

🔎 Required fix

- const sharedState = getSharedState
+ const sharedState = getSharedState()

746-758: 🔴 CRITICAL: getSharedState must be called as a function (STILL UNFIXED).

Line 756 accesses getSharedState as an object instead of calling it:

joinedL2PSCount: getSharedState.l2psJoinedUids?.length || 0, // ❌ Missing ()

🔎 Required fix

  return {
      isRunning: this.isRunning,
      isAggregating: this.isAggregating,
      intervalMs: this.AGGREGATION_INTERVAL,
-     joinedL2PSCount: getSharedState.l2psJoinedUids?.length || 0,
+     joinedL2PSCount: getSharedState().l2psJoinedUids?.length || 0,
  }

87-88: Fix doc mismatch: cleanup is "5 minutes", not "1 hour" (STILL UNFIXED).

Line 87 comment states "(1 hour)" but CLEANUP_AGE_MS is 5 * 60 * 1000 (5 minutes).

🔎 Required fix

- /** Cleanup interval - remove batched transactions older than this (1 hour) */
+ /** Cleanup interval - remove batched transactions older than this (5 minutes) */
  private readonly CLEANUP_AGE_MS = 5 * 60 * 1000 // 5 minutes - confirmed txs can be cleaned up quickly

448-465: Guard BigInt() conversions to prevent aggregation failures.

Line 449 uses BigInt() without error handling. If amount contains non-integer strings (e.g., "10.5", "1e3", "0x..."), BigInt() will throw and kill the entire batch aggregation cycle.

🔎 Proposed fix

  const zkTransactions = transactions.map((tx) => {
-     const amount = BigInt((tx.encrypted_tx as any)?.content?.amount || 0)
+     let amount = 0n
+     try {
+         const raw = (tx.encrypted_tx as any)?.content?.amount
+         amount = raw === undefined || raw === null ? 0n : BigInt(raw)
+     } catch (error) {
+         log.warning(`[L2PS Batch Aggregator] Invalid amount in tx ${tx.hash}: ${error}`)
+         amount = 0n
+     }

639-650: Review reference_block: 0 — mempool may not recalculate it.

Line 647 sets reference_block: 0 with a comment that mempool will set it. However, based on the relevant code snippets provided, Mempool.addTransaction does not appear to recalculate reference_block — it's passed through to persistence.

If reference_block is important for transaction ordering, expiry, or anchoring semantics, storing L2PS batch transactions with reference_block = 0 may bypass these mechanisms.

Recommended fix: Set reference_block to a meaningful value (e.g., latest known block height or consensus block) instead of 0:

+ // Get current block reference for ordering/anchoring
+ const referenceBlock = sharedState.lastKnownBlock || 0
+ if (referenceBlock === 0) {
+     log.warning("[L2PS Batch Aggregator] No reference block available, batch ordering may be affected")
+ }
+
  const batchTransaction = {
      hash,
      content: transactionContent,
      signature: signature
          ? {
              type: sharedState.signingAlgorithm,
              data: uint8ArrayToHex(signature.signature),
              domain: this.SIGNATURE_DOMAIN,
          }
          : null,
-     reference_block: 0, // Will be set by mempool
+     reference_block: referenceBlock,
      status: "pending",
      extra: null,
  }
🧹 Nitpick comments (1)
src/libs/l2ps/zk/scripts/setup_all_batches.sh (1)
48-98: Consider adding dependency checks for required tools.

The script assumes circom, npx, and snarkjs are available but doesn't verify this. If these tools are missing, the script will fail with unclear error messages.

🔎 Proposed dependency check

Add this check after line 21 (before creating directories):

  echo -e "${YELLOW}Max batch size: 10 transactions${NC}"
+
+ # Check required dependencies
+ echo "Checking dependencies..."
+ for cmd in circom npx; do
+     if ! command -v $cmd &> /dev/null; then
+         echo -e "${RED}Error: $cmd is not installed${NC}"
+         echo "Please install required dependencies:"
+         echo "  - circom: https://docs.circom.io/getting-started/installation/"
+         echo "  - node/npx: https://nodejs.org/"
+         exit 1
+     fi
+ done
+
+ if ! npx snarkjs --help &> /dev/null; then
+     echo -e "${RED}Error: snarkjs is not available via npx${NC}"
+     echo "Install with: npm install -g snarkjs"
+     exit 1
+ fi
+
+ echo "All dependencies found ✓"

  # Create directories
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
Disabled knowledge base sources:
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (5)
- src/libs/l2ps/L2PSBatchAggregator.ts
- src/libs/l2ps/L2PSProofManager.ts
- src/libs/l2ps/zk/BunPlonkWrapper.ts
- src/libs/l2ps/zk/L2PSBatchProver.ts
- src/libs/l2ps/zk/scripts/setup_all_batches.sh
🧰 Additional context used
🧬 Code graph analysis (4)
src/libs/l2ps/zk/L2PSBatchProver.ts (1)
src/libs/l2ps/zk/BunPlonkWrapper.ts (1)
  plonkVerifyBun (103-188)

src/libs/l2ps/L2PSBatchAggregator.ts (7)
src/libs/l2ps/zk/L2PSBatchProver.ts (2)
  L2PSBatchProver (65-336)
  hash (164-167)
src/utilities/logger.ts (1)
  error (125-132)
src/libs/blockchain/l2ps_mempool.ts (1)
  L2PSMempool (48-659)
src/utilities/sharedState.ts (1)
  getSharedState (276-278)
src/libs/crypto/hashing.ts (1)
  Hashing (15-26)
src/libs/utils/calibrateTime.ts (1)
  getNetworkTimestamp (17-23)
src/libs/blockchain/mempool_v2.ts (1)
  Mempool (18-227)

src/libs/l2ps/zk/BunPlonkWrapper.ts (1)
src/utilities/logger.ts (1)
  error (125-132)

src/libs/l2ps/L2PSProofManager.ts (3)
src/libs/crypto/hashing.ts (1)
  Hashing (15-26)
src/model/entities/L2PSProofs.ts (1)
  L2PSProofStatus (27-31)
src/utilities/logger.ts (1)
  error (125-132)
🔇 Additional comments (21)
src/libs/l2ps/L2PSProofManager.ts (11)
1-28: LGTM — Clear documentation and appropriate imports.The module documentation provides a comprehensive overview of the L2PS proof flow and supported proof systems. All imports are relevant and properly structured.
30-44: LGTM — Solid deterministic stringify implementation.The key-sorting replacer ensures stable hashing across PostgreSQL JSONB round-trips and prevents non-deterministic hash generation. Arrays are correctly excluded from sorting.
46-64: LGTM — Well-defined result interfaces.Both interfaces are properly structured with appropriate optional fields and clear semantics for proof operations.
71-98: LGTM — Robust lazy initialization pattern.The initPromise guard prevents race conditions during concurrent initialization, and the getRepo wrapper ensures the repository is always ready before use.
100-172: LGTM — Deterministic hashing properly implemented.The previous non-determinism issue with
Date.now()has been resolved. The method now usesdeterministicStringifyfor bothtransactionsHashandproofData, ensuring consistent hashes for identical inputs. Error handling is comprehensive.
174-193: LGTM — Clean pending proofs retrieval.The optional filtering by
l2psUidis properly implemented, and ordering bycreated_atensures proofs are processed in creation order.
259-275: LGTM — Clean status transition to applied.The method properly updates all relevant fields (
status,applied_block_number,processed_at) and logs the action.
277-293: LGTM — Clean status transition to rejected.The method properly captures the rejection reason in
error_messageand logs it for debugging.
295-304: LGTM — Straightforward batch hash lookup.Simple and correct implementation for retrieving proofs by L1 batch transaction hash.
306-331: LGTM — Flexible proof querying with sensible defaults.The method supports optional status filtering and includes a reasonable default limit. Ordering by
created_at DESCis appropriate for general proof retrieval.
333-357: LGTM — Efficient statistics aggregation.The QueryBuilder pattern with
clone()correctly builds multiple status-specific queries from a common base. UsingPromise.allfor parallel execution is efficient.src/libs/l2ps/zk/scripts/setup_all_batches.sh (1)
1-26: LGTM: Script initialization and directory setup. The script header, error handling (set -e), and directory structure setup are well-organized. The use of mkdir -p ensures idempotent execution.
103-188: LGTM: Well-implemented Bun-compatible PLONK verifier. The implementation addresses Bun worker thread issues by using singleThread: true (line 117) and includes:
- Comprehensive error handling with try/catch/finally
- Proper resource cleanup via curve.terminate() in the finally block
- Optional logging support
- Clear documentation of the Bun compatibility workaround
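The overall shape of that cleanup discipline, sketched against an assumed ffjavascript-style API (the wrapper's real verification logic is elided):

import { buildBn128 } from "ffjavascript"

export async function plonkVerifySketch(): Promise<boolean> {
    const curve = await buildBn128(true) // singleThread: true sidesteps Bun worker crashes
    try {
        // ... transcript, challenge, and pairing checks would run here ...
        return true
    } catch (err) {
        console.error("PLONK verify error:", err instanceof Error ? err.message : String(err))
        return false
    } finally {
        await curve.terminate() // always release the curve's workers/wasm memory
    }
}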
190-448: LGTM: Helper functions are well-structured. The conversion and calculation helpers properly handle:
- Proof and verification key object transformations (lines 190-230)
- Structural validation (lines 232-245)
- PLONK-specific calculations following the protocol spec
The code is a faithful port of the snarkjs implementation adapted for Bun compatibility.
src/libs/l2ps/zk/L2PSBatchProver.ts (3)
75-211: LGTM: Initialization and helper methods are robust. The implementation includes:
- Graceful degradation when ZK keys are unavailable (lines 81-86)
- Key caching to avoid repeated file I/O (lines 140-143)
- Proper batch size selection with clear error messages (lines 113-134)
- Sound padding strategy using zero-amount transfers (lines 172-187)
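The padding idea in the last bullet can be sketched like this (the transaction shape and names are hypothetical):

interface ZkTransfer { from: bigint; to: bigint; amount: bigint }

function padBatch(txs: ZkTransfer[], batchSize: number): ZkTransfer[] {
    if (txs.length > batchSize) {
        throw new Error(`batch of ${txs.length} exceeds circuit size ${batchSize}`)
    }
    const padded = [...txs]
    while (padded.length < batchSize) {
        padded.push({ from: 0n, to: 0n, amount: 0n }) // zero-amount no-op transfer
    }
    return padded
}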
216-336: LGTM: Proof generation and verification properly handle Bun compatibility. Key features:
- Uses singleThread: true in proverOptions (line 265) to avoid Bun worker crashes
- Environment-aware verification: uses BunPlonkWrapper in Bun, native snarkjs in Node (lines 300-308)
- Graceful fallback for exportSolidityCallData if unavailable (lines 324-334)
- Comprehensive error handling throughout
9-23: Global prototype patching is safe in this codebase but consider architectural improvements. The code patches EventTarget.prototype.dispatchEvent globally to work around Bun's web-worker issues. Analysis shows no other code in the codebase uses EventTarget or dispatchEvent, so there are no conflicts with other code. However, from an architectural perspective:
- Module-level side effects: Patching at import time affects global state
- No restoration mechanism: Cannot revert the patch if needed
- Implicit runtime behavior: Intent of the patch isn't obvious to future maintainers
While this approach works in practice, consider these refinements:
- Lazy patching: Only patch when snarkjs is first invoked, not at module load
- Scoped patching: Patch only during snarkjs calls, then restore original
- Feature detection: Verify patching is still needed in current/future Bun versions
Consider adding a comment explaining why patching is necessary and when it can be removed.
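The scoped-patching refinement could look roughly like this (a sketch, not the PR's code; the swallowed failure mode is an assumption):

async function withPatchedDispatchEvent<T>(fn: () => Promise<T>): Promise<T> {
    const original = EventTarget.prototype.dispatchEvent
    EventTarget.prototype.dispatchEvent = function (this: EventTarget, event: Event): boolean {
        try {
            return original.call(this, event)
        } catch {
            return true // swallow the Bun worker teardown error
        }
    }
    try {
        return await fn() // e.g. the snarkjs call
    } finally {
        EventTarget.prototype.dispatchEvent = original // restore the global
    }
}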
src/libs/l2ps/L2PSBatchAggregator.ts (4)
138-180: LGTM: Service initialization with graceful ZK prover degradation. The start() and initializeZkProver() methods properly handle:
- Reentrancy protection with clear error messages
- Graceful degradation when ZK keys are unavailable (lines 172-179)
- Clear user guidance to run setup script
- Statistics reset using helper method
225-259: LGTM: Robust reentrancy protection. The safeAggregateAndSubmit() wrapper properly prevents overlapping cycles that could cause:
- Database conflicts
- Duplicate batch submissions
- Race conditions
Good use of statistics tracking for skipped, successful, and failed cycles.
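Stripped to its essentials, the wrapper's guard works like this (a sketch; the real method also updates statistics):

let isAggregating = false

async function safeAggregateAndSubmit(run: () => Promise<void>): Promise<void> {
    if (isAggregating) {
        return // skip this cycle instead of overlapping the previous one
    }
    isAggregating = true
    try {
        await run()
    } finally {
        isAggregating = false // always release the guard, even on failure
    }
}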
437-500: LGTM: ZK proof generation with local verification. The generateZkProofForBatch() method includes excellent safety measures:
- Proper conversion of transactions to ZK-friendly format
- Local verification of generated proofs before submission (lines 479-482)
- Graceful degradation on failure with clear logging
- Good use of modular arithmetic for state root (line 468)
The safety verification at lines 479-482 is particularly valuable for catching corrupted keys/wasm early.
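The prove-then-verify-locally step maps onto snarkjs' PLONK API roughly as follows (file paths and the helper name are hypothetical):

import * as snarkjs from "snarkjs"

async function proveAndCheck(
    input: Record<string, unknown>,
    wasmPath: string,
    zkeyPath: string,
    vkey: object,
) {
    const { proof, publicSignals } = await snarkjs.plonk.fullProve(input, wasmPath, zkeyPath)
    // Catch corrupted keys/wasm locally before anything is submitted
    const ok = await snarkjs.plonk.verify(vkey, publicSignals, proof)
    if (!ok) throw new Error("generated proof failed local verification")
    return { proof, publicSignals }
}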
673-731: LGTM: Cleanup and statistics tracking. The cleanup mechanism and statistics logging are well-implemented:
- Proper age-based cleanup to prevent unbounded growth
- Comprehensive statistics covering all aspects of the service
- Clear success rate calculation with percentage formatting
- Detailed logging for observability
download_ptau() {
    local size=$1
    local file="powersOfTau28_hez_final_${size}.ptau"
    local url="https://storage.googleapis.com/zkevm/ptau/$file"

    if [[ ! -f "$PTAU_DIR/$file" ]] || [[ $(stat -c%s "$PTAU_DIR/$file") -lt 1000000 ]]; then
        echo -e "${YELLOW}Downloading pot${size}...${NC}"
        rm -f "$PTAU_DIR/$file"
        curl -L -o "$PTAU_DIR/$file" "$url"
    else
        echo "pot${size} already exists"
    fi
    return 0
}
Fix portability: stat -c%s fails on macOS.
Line 33 uses GNU stat syntax that won't work on macOS or BSD systems. The -c flag is Linux-specific.
🔎 Proposed cross-platform fix
- if [[ ! -f "$PTAU_DIR/$file" ]] || [[ $(stat -c%s "$PTAU_DIR/$file") -lt 1000000 ]]; then
+ # Cross-platform file size check (works on Linux and macOS)
+ local filesize=0
+ if [[ -f "$PTAU_DIR/$file" ]]; then
+ if stat --version &>/dev/null; then
+ # GNU stat (Linux)
+ filesize=$(stat -c%s "$PTAU_DIR/$file")
+ else
+ # BSD stat (macOS)
+ filesize=$(stat -f%z "$PTAU_DIR/$file")
+ fi
+ fi
+
+ if [[ ! -f "$PTAU_DIR/$file" ]] || [[ $filesize -lt 1000000 ]]; then
Alternatively, use wc -c, which is POSIX-compliant:
- if [[ ! -f "$PTAU_DIR/$file" ]] || [[ $(stat -c%s "$PTAU_DIR/$file") -lt 1000000 ]]; then
+ if [[ ! -f "$PTAU_DIR/$file" ]] || [[ $(wc -c < "$PTAU_DIR/$file") -lt 1000000 ]]; then
🤖 Prompt for AI Agents
In @src/libs/l2ps/zk/scripts/setup_all_batches.sh around lines 28-41: in
download_ptau(), the file-size check uses GNU-only stat (-c%s) which breaks on
macOS/BSD; replace that with a POSIX-safe command such as using wc -c (e.g., [[
$(wc -c < "$PTAU_DIR/$file") -lt 1000000 ]]) or detect platform and use stat
-f%z on macOS, keeping the existing ! -f guard and error handling; update the
conditional in download_ptau to use the portable size check so the download
logic works across Linux and macOS.
…or improved clarity and maintainability
Actionable comments posted: 0
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/libs/l2ps/parallelNetworks.ts (1)
269-276: Critical: getSharedState must be called as a function. Line 272 assigns getSharedState without invoking it. This will fail at runtime when accessing sharedState.signingAlgorithm.
Suggested fix
  async encryptTransaction(
      uid: string,
      tx: Transaction,
      senderIdentity?: any,
  ): Promise<Transaction> {
      const l2ps = await this.loadL2PS(uid)
      const encryptedTx = await l2ps.encryptTx(tx, senderIdentity)

      // Sign encrypted transaction with node's private key
-     const sharedState = getSharedState
+     const sharedState = getSharedState()
      const signature = await ucrypto.sign(
♻️ Duplicate comments (12)
src/libs/l2ps/L2PSBatchAggregator.ts (8)
649-649: Review reference_block: 0 — mempool may not override this value. Setting reference_block: 0 with the comment "Will be set by mempool" may not be accurate. Based on the PR context, mempool's addTransaction passes the value through to persistence without recalculating it. L2PS batch transactions may be stored with reference_block = 0, potentially bypassing anchoring/ordering semantics. Consider computing the actual reference block here or verifying that mempool logic does set it.
87-88: Fix doc mismatch: cleanup is "5 minutes", not "1 hour". The comment says "(1 hour)" but CLEANUP_AGE_MS is set to 5 minutes.
Suggested fix
- /** Cleanup interval - remove batched transactions older than this (1 hour) */
+ /** Cleanup interval - remove batched transactions older than this (5 minutes) */
  private readonly CLEANUP_AGE_MS = 5 * 60 * 1000 // 5 minutes - confirmed txs can be cleaned up quickly
379-379: Critical: getSharedState must be called as a function. getSharedState is a function but is assigned without invocation. This will fail at runtime when accessing sharedState.keypair on line 402.
Suggested fix
- const sharedState = getSharedState
+ const sharedState = getSharedState()
450-451: Guard BigInt(...) conversion to avoid throwing. If (tx.encrypted_tx as any)?.content?.amount is not an integer-compatible value (e.g., "10.5", undefined), BigInt() will throw and terminate proof generation for the entire batch.
Suggested fix
  const zkTransactions = transactions.map((tx) => {
-     const amount = BigInt((tx.encrypted_tx as any)?.content?.amount || 0)
+     let amount = 0n
+     try {
+         const raw = (tx.encrypted_tx as any)?.content?.amount
+         amount = raw == null ? 0n : BigInt(Math.floor(Number(raw)))
+     } catch {
+         amount = 0n
+     }
531-543: Critical: getSharedState must be called as a function. Same issue as line 379 - getSharedState is not invoked. This will fail at runtime when accessing sharedState.l2psBatchNonce.
Suggested fix
  private async getLastNonceFromStorage(): Promise<number> {
      try {
-         const sharedState = getSharedState
+         const sharedState = getSharedState()
548-557: Critical: getSharedState must be called as a function. Same issue - getSharedState is not invoked on line 550.
Suggested fix
  private async saveNonceToStorage(nonce: number): Promise<void> {
      try {
-         const sharedState = getSharedState
+         const sharedState = getSharedState()
569-573: Critical: getSharedState must be called as a function. Same issue on line 571 - this will fail at runtime when accessing sharedState.keypair.
Suggested fix
- const sharedState = getSharedState
+ const sharedState = getSharedState()
749-761: Critical: getSharedState must be called as a function. Line 759 accesses getSharedState.l2psJoinedUids but getSharedState is a function that must be invoked.
Suggested fix
  getStatus(): {
      isRunning: boolean;
      isAggregating: boolean;
      intervalMs: number;
      joinedL2PSCount: number;
  } {
      return {
          isRunning: this.isRunning,
          isAggregating: this.isAggregating,
          intervalMs: this.AGGREGATION_INTERVAL,
-         joinedL2PSCount: getSharedState.l2psJoinedUids?.length || 0,
+         joinedL2PSCount: getSharedState().l2psJoinedUids?.length || 0,
      }
  }
src/libs/l2ps/L2PSConsensus.ts (3)
379-391: getProofs("") returns zero proofs — rollback will not work. getProofs("", "applied", ...) filters by l2ps_uid === "", which will return no proofs unless a network has an empty UID. This means no proofs are actually rolled back on block failure.
Consider implementing a dedicated helper in L2PSProofManager like getAppliedProofsForBlock(blockNumber) that queries by applied_block_number without filtering by UID.
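One possible shape for that helper, assuming a TypeORM-style repository and the column names used elsewhere in this review:

import type { Repository } from "typeorm"

async function getAppliedProofsForBlock(
    proofRepo: Repository<any>,
    blockNumber: number,
    limit = 1000,
): Promise<any[]> {
    return proofRepo.find({
        where: { status: "applied", applied_block_number: blockNumber },
        order: { created_at: "DESC" }, // callers can reverse for rollback order
        take: limit,
    })
}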
446-447: Same issue: getProofs("") will return zero proofs. getBlockStats has the same problem - it will always report zero proofs for any block because the UID filter matches nothing.
194-205: Rollback uses single mockTx for all edits — may mismatch account context. The rollbackEdits method reuses the same mockTx (from the last edit) for all rollback operations. If HandleGCR.apply relies on mockTx.content.from/to for authorization or accounting, earlier edits may be rolled back with incorrect account context.
Consider creating a fresh mockTx per edit using createMockTx with the appropriate account for each rollback.
src/libs/l2ps/L2PSTransactionExecutor.ts (1)
136-143: Message claims "Proof created" even when no proof was created. When gcrEdits.length === 0, the condition at line 132 is false, so no proof is created. However, the fallback return still says "Proof created with 0 GCR edits" which is misleading.
Suggested fix
  return {
      success: true,
      message: simulate
          ? `Validated: ${gcrEdits.length} GCR edits would be generated`
-         : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`,
+         : gcrEdits.length === 0
+             ? "No GCR edits generated; transaction recorded without proof"
+             : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`,
      gcr_edits: gcrEdits,
      affected_accounts: [...new Set(affectedAccounts)]
  }
🧹 Nitpick comments (3)
src/libs/l2ps/L2PSConcurrentSync.ts (1)
63-63: Minor: Logger debug call has inconsistent format. The logger call uses comma syntax (log.debug("...", message)) which may not interpolate the message correctly depending on the logger implementation. Consider using template literals or string concatenation for consistency with other log calls in the file.
Suggested fix
- log.debug(`[L2PS Sync] Failed to query peer ${peer.muid} for ${l2psUid}:`, message)
+ log.debug(`[L2PS Sync] Failed to query peer ${peer.muid} for ${l2psUid}: ${message}`)
Also applies to lines 188, 221, 278.
src/libs/l2ps/zk/BunPlonkWrapper.ts (1)
186-189: Consider using the project logger instead of console.error. For consistency with the rest of the codebase, consider using the project's logger utility instead of console.error.
Suggested fix
+ import log from "@/utilities/logger"
+
  // In plonkVerifyBun:
  } catch (error) {
      const message = error instanceof Error ? error.message : String(error)
-     console.error("PLONK Verify error:", message)
+     log.error(`[PLONK Verify] Error: ${message}`)
      return false
src/libs/l2ps/L2PSConsensus.ts (1)
419-427: Dynamic import pattern works but consider static import. The dynamic imports for datasource and entity could be replaced with static imports at the top of the file for consistency and potentially better tree-shaking.
Suggested approach
Add static imports at the top:
import Datasource from "@/model/datasource"
import { L2PSProof } from "@/model/entities/L2PSProofs"
Then use them directly in rollbackProofsForBlock.
📜 Review details
📒 Files selected for processing (7)
- src/libs/l2ps/L2PSBatchAggregator.ts
- src/libs/l2ps/L2PSConcurrentSync.ts
- src/libs/l2ps/L2PSConsensus.ts
- src/libs/l2ps/L2PSHashService.ts
- src/libs/l2ps/L2PSTransactionExecutor.ts
- src/libs/l2ps/parallelNetworks.ts
- src/libs/l2ps/zk/BunPlonkWrapper.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- src/libs/l2ps/L2PSHashService.ts
🧰 Additional context used
🧬 Code graph analysis (4)
src/libs/l2ps/L2PSConcurrentSync.ts (1)
- src/utilities/logger.ts (1)
  - error (125-132)
src/libs/l2ps/zk/BunPlonkWrapper.ts (1)
- src/utilities/logger.ts (1)
  - error (125-132)
src/libs/l2ps/parallelNetworks.ts (2)
- src/utilities/logger.ts (1)
  - error (125-132)
- src/utilities/sharedState.ts (1)
  - getSharedState (276-278)
src/libs/l2ps/L2PSBatchAggregator.ts (6)
- src/libs/l2ps/zk/L2PSBatchProver.ts (2)
  - L2PSBatchProver (65-336)
  - hash (164-167)
- src/libs/blockchain/l2ps_mempool.ts (2)
  - L2PSMempool (48-659)
  - L2PS_STATUS (16-29)
- src/utilities/sharedState.ts (1)
  - getSharedState (276-278)
- src/libs/crypto/hashing.ts (1)
  - Hashing (15-26)
- src/libs/utils/calibrateTime.ts (1)
  - getNetworkTimestamp (17-23)
- src/libs/blockchain/transaction.ts (1)
  - hash (106-114)
🔇 Additional comments (25)
src/libs/l2ps/L2PSBatchAggregator.ts (4)
16-37: LGTM - well-structured interface definition. The L2PSBatchPayload interface is well-documented with clear field descriptions and optional ZK proof support.
57-98: LGTM - solid singleton pattern with appropriate configuration. The class structure follows a clear singleton pattern with proper state management, reentrancy protection, and configurable intervals.
166-180: Good graceful degradation for ZK prover initialization. The ZK prover initialization properly catches errors and allows the service to continue without ZK proofs, with helpful logging indicating how to enable them.
190-217: Well-implemented graceful shutdown. The stop method properly clears the interval, waits for ongoing operations with a configurable timeout, and logs statistics before shutdown.
src/libs/l2ps/parallelNetworks.ts (6)
55-76: Good input validation for hex-encoded key files. The hexFileToBytes helper properly validates emptiness, even length, and hex character format before conversion. This prevents cryptographic initialization with malformed keys.
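Such a validator can be sketched as follows (the validation order is taken from the review; the PR's code may differ):

function hexFileToBytes(hex: string): Uint8Array {
    const trimmed = hex.trim()
    if (trimmed.length === 0) throw new Error("key file is empty")
    if (trimmed.length % 2 !== 0) throw new Error("hex string has odd length")
    if (!/^[0-9a-fA-F]+$/.test(trimmed)) throw new Error("non-hex characters in key file")
    const bytes = new Uint8Array(trimmed.length / 2)
    for (let i = 0; i < bytes.length; i++) {
        bytes[i] = Number.parseInt(trimmed.slice(i * 2, i * 2 + 2), 16)
    }
    return bytes
}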
108-133: Good concurrency control with promise locking. The loadingPromises map prevents concurrent loads for the same UID, avoiding race conditions during parallel initialization.
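The promise-locking idea in miniature (a sketch; the real loader presumably caches the loaded L2PS elsewhere):

const loadingPromises = new Map<string, Promise<unknown>>()

async function loadOnce<T>(uid: string, load: () => Promise<T>): Promise<T> {
    if (!loadingPromises.has(uid)) {
        // Store the in-flight promise so concurrent callers share a single load
        loadingPromises.set(uid, load().finally(() => loadingPromises.delete(uid)))
    }
    return loadingPromises.get(uid) as Promise<T>
}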
141-161: Good path traversal protection. The code validates that the resolved config path stays within the expected base directory, preventing directory traversal attacks via malicious UIDs.
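A containment check of that kind typically looks like this (the file layout is assumed):

import path from "node:path"

function resolveConfigPath(baseDir: string, uid: string): string {
    const resolved = path.resolve(baseDir, `${uid}.json`)
    if (!resolved.startsWith(path.resolve(baseDir) + path.sep)) {
        throw new Error(`invalid L2PS uid: ${uid}`) // rejects ../ traversal attempts
    }
    return resolved
}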
294-317: LGTM - proper signature verification before decryption. The decryption flow correctly verifies the signature before decrypting, with appropriate warning logging when signatures are missing.
338-356: LGTM - defensive validation of transaction data format. The code properly validates array structure before destructuring, preventing runtime errors from malformed transactions.
252-252: getSharedState is correctly used as a getter in both locations. getSharedState is exported from sharedState.ts as a getter property (get getSharedState() { return SharedState.getInstance() }). Both line 252 and line 272 in parallelNetworks.ts use it correctly:
- Line 252: getSharedState.l2psJoinedUids = l2psJoinedUids — accesses the getter (which returns a SharedState instance) and sets a property on it.
- Line 272: const sharedState = getSharedState — accesses the getter and assigns the returned SharedState instance to a variable.
Both patterns are valid getter usage. There is no inconsistency or error.
src/libs/l2ps/L2PSConcurrentSync.ts (3)
60-64: LGTM - consistent error handling pattern. Error message extraction using error instanceof Error ? error.message : String(error) is a good defensive pattern that handles both Error objects and other throwables.
119-122: LGTM - appropriate use of log.warning for invalid responses. Using warning level for invalid peer responses is appropriate as it's not a critical failure but worth noting.
168-190: Good batch duplicate detection optimization. Pre-fetching existing hashes in a single query before iterating reduces database round-trips compared to checking each transaction individually.
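In outline, that optimization looks like this (helper names hypothetical):

async function filterNewTxs<T extends { hash: string }>(
    txs: T[],
    getExistingHashes: (hashes: string[]) => Promise<Set<string>>,
): Promise<T[]> {
    // One query for all candidate hashes instead of one query per transaction
    const existing = await getExistingHashes(txs.map((t) => t.hash))
    return txs.filter((t) => !existing.has(t.hash))
}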
src/libs/l2ps/zk/BunPlonkWrapper.ts (4)
97-100: Key fix for Bun compatibility - good documentation. The critical singleThread: true option is well-documented and addresses the Bun worker thread crash issue. This is the core purpose of this module.
190-195: Good resource cleanup with curve termination. The finally block ensures the curve is terminated even on errors, preventing memory leaks from the elliptic curve library.
26-78: LGTM - faithful port of Keccak256Transcript. The Fiat-Shamir transcript implementation correctly accumulates polynomial commitments and scalars for challenge derivation, matching the snarkjs implementation.
255-322: LGTM - correct multi-round challenge computation. The challenge calculation follows the PLONK protocol specification with proper round sequencing (beta/gamma, alpha, xi, v, u).
src/libs/l2ps/L2PSConsensus.ts (3)
39-54: LGTM - well-structured result interface. The L2PSConsensusResult interface provides comprehensive information about the consensus operation including counts, affected accounts, and per-proof details.
112-171: LGTM - solid proof application flow. The applyPendingProofs method properly handles simulation mode, tracks statistics, deduplicates affected accounts, and only processes applied proofs when not simulating.
298-371: LGTM - well-designed L1 batch transaction creation. The batch transaction creation properly aggregates proof metadata, generates a deterministic hash, and creates a visible L1 record without exposing actual transaction content.
src/libs/l2ps/L2PSTransactionExecutor.ts (5)
30-41: LGTM - comprehensive execution result interface. The L2PSExecutionResult interface provides all necessary fields for tracking execution outcomes including GCR edits, affected accounts, and identifiers.
57-77: Good async initialization with promise-based deduplication. The initPromise pattern prevents redundant initialization attempts when multiple callers invoke init() concurrently.
230-297: Good implementation with proper type validation. The native transaction handler properly validates amount type and positivity before processing, and generates appropriate GCR edits for balance transfers.
391-396: Good: update result check now implemented. The code now checks result.affected === 0 and logs a warning, addressing the previous concern about silent success on non-existent transactions.
309-323: Switch cases now properly wrapped in blocks. The switch cases use block scoping (case "balance": { ... }) which prevents variable hoisting issues flagged in previous reviews.
…y and maintainability
Actionable comments posted: 1
Fix all issues with AI Agents 🤖
In @src/libs/l2ps/L2PSConsensus.ts:
- Around line 420-428: The dynamic imports of "@/model/datasource" and
"@/model/entities/L2PSProofs" are inside the for (const proof of
proofsToRollback) loop (used to build repo, ds, and proofRepo), causing repeated
module loading; move these imports and the repo/ds/proofRepo initialization
outside the loop (e.g., import datasource and L2PSProof once before iterating
and create proofRepo once) and then reuse proofRepo.update(proof.id, {...})
inside the loop to avoid redundant dynamic imports.
♻️ Duplicate comments (6)
scripts/send-l2-batch.ts (3)
74-140: Add bounds checking for all value-consuming flags. The pattern argv[i + 1] followed by i++ is used for many flags without verifying that i + 1 < argv.length. If a flag is the last argument (e.g., --node without a value), argv[i + 1] returns undefined, leading to silent failures or crashes downstream.
🔎 Proposed fix: guard each value-consuming flag
  case "--node":
+     if (i + 1 >= argv.length) {
+         throw new Error("--node requires a value")
+     }
      options.nodeUrl = argv[i + 1]
      i++
      break
  case "--uid":
+     if (i + 1 >= argv.length) {
+         throw new Error("--uid requires a value")
+     }
      options.uid = argv[i + 1]
      i++
      break
Apply similar guards to all flags: --config, --key, --iv, --mnemonic, --mnemonic-file, --from, --to, --value, --data, --count.
121-127: Add NaN validation for --count argument. Number.parseInt returns NaN for non-numeric input, and since NaN < 1 evaluates to false, invalid input like --count abc silently sets count to NaN instead of throwing an error.
🔎 Proposed fix
  case "--count":
      options.count = Number.parseInt(argv[i + 1], 10)
      i++
-     if (options.count < 1) {
+     if (!Number.isFinite(options.count) || options.count < 1) {
          throw new Error("--count must be at least 1")
      }
      break
318-382: CRITICAL: Nonce is never incremented—all transactions after the first will be rejected. currentNonce is initialized once at line 318 and passed to each transaction at line 348, but it is never incremented inside the loop. Every transaction after the first will have the same nonce and will be rejected by the node.
🔎 Proposed fix
      // Small delay between transactions to avoid nonce conflicts
      if (i < options.count - 1) {
          await new Promise(resolve => setTimeout(resolve, 500))
      }
+
+     // Increment nonce for next transaction
+     currentNonce++
  }
src/libs/l2ps/L2PSConsensus.ts (3)
194-205: Rollback uses a single mockTx for all edits, potentially mismatching account context. This issue was previously flagged and remains unresolved. When an edit fails (line 227), rollbackEdits is called with the mockTx created for that failed edit. Inside rollbackEdits, this same mockTx is reused to roll back all previous successful edits (line 202), even though those edits may have been applied with different account contexts.
If HandleGCR.apply relies on mockTx.content.from or .to for authorization or accounting, this causes incorrect rollback semantics.
🔎 Recommended fix from previous review
Recreate the appropriate mockTx for each edit being rolled back:
  private static async rollbackEdits(
      proof: L2PSProof,
      editResults: GCRResult[],
-     mockTx: any
+     _lastMockTx: any
  ): Promise<void> {
      for (let i = editResults.length - 2; i >= 0; i--) {
          if (editResults[i].success) {
-             const rollbackEdit = { ...proof.gcr_edits[i], isRollback: true }
-             await HandleGCR.apply(rollbackEdit, mockTx, true, false)
+             const originalEdit = proof.gcr_edits[i]
+             const editAccount = "account" in originalEdit
+                 ? (originalEdit.account as string)
+                 : proof.affected_accounts[0] || ""
+             const rollbackEdit = { ...originalEdit, isRollback: true }
+             const mockTx = this.createMockTx(proof, editAccount)
+             await HandleGCR.apply(rollbackEdit, mockTx as any, true, false)
          }
      }
  }
Also applies to: 227-227
380-392: Rollback and stats never see applied proofs because getProofs is called with an empty UID. This critical issue was previously flagged and remains unresolved. Lines 383-387 call L2PSProofManager.getProofs("", "applied", ...) assuming "" means "all L2PS networks", but getProofs actually filters by l2ps_uid === "". Unless you have a network with an empty UID, this returns zero proofs, so:
- No proofs are rolled back on block failure.
- Consensus rollback fails silently.
🔎 Recommended fix from previous review
Introduce a dedicated getAppliedProofsForBlock helper in L2PSProofManager:
  static async rollbackProofsForBlock(blockNumber: number): Promise<void> {
      try {
-         // Get proofs that were applied in this block
-         const appliedProofs = await L2PSProofManager.getProofs(
-             "", // all L2PS networks
-             "applied",
-             1000
-         )
-
-         // Filter by block number and rollback in reverse order
-         const proofsToRollback = appliedProofs
-             .filter(p => p.applied_block_number === blockNumber)
-             .reverse()
+         // Get proofs that were applied in this block (already ordered for rollback)
+         const proofsToRollback = await L2PSProofManager.getAppliedProofsForBlock(
+             blockNumber,
+             1000
+         )
448-449: Stats retrieval also affected by empty UID issue. This method has the same critical issue as rollbackProofsForBlock: calling getProofs("", "applied", ...) filters for l2ps_uid === "" and returns no proofs. As a result, block stats always report zero applied proofs, total edits, and affected accounts.
🔎 Recommended fix from previous review
Use the same dedicated helper as suggested for rollback:
  static async getBlockStats(blockNumber: number): Promise<{
      proofsApplied: number
      totalEdits: number
      affectedAccounts: number
  }> {
-     const appliedProofs = await L2PSProofManager.getProofs("", "applied", 10000)
-     const blockProofs = appliedProofs.filter(p => p.applied_block_number === blockNumber)
+     const blockProofs = await L2PSProofManager.getAppliedProofsForBlock(blockNumber, 10000)
      return {
          proofsApplied: blockProofs.length,
          totalEdits: blockProofs.reduce((sum, p) => sum + p.gcr_edits.length, 0),
          affectedAccounts: new Set(blockProofs.flatMap(p => p.affected_accounts)).size
      }
  }
🧹 Nitpick comments (2)
src/libs/l2ps/L2PSConcurrentSync.ts (1)
186-189: Consider extracting the error message pattern and preserving stack traces. The error message extraction pattern error instanceof Error ? error.message : String(error) is repeated 5 times across this file (lines 62, 187, 220, 227, 277). Additionally, extracting only the message loses the stack trace, which can hinder debugging in production environments.
Recommended improvements
1. Extract to a utility function:
Add to a shared utilities file:
export function extractErrorMessage(error: unknown): string {
    return error instanceof Error ? error.message : String(error)
}

export function formatErrorForLogging(error: unknown): string {
    if (error instanceof Error) {
        return `${error.message}\n${error.stack || ''}`
    }
    return String(error)
}
2. Then simplify this file's error handling while preserving stack traces:
- const message = error instanceof Error ? error.message : String(error)
- log.error(`[L2PS Sync] Failed to batch check duplicates: ${message}`)
+ log.error(`[L2PS Sync] Failed to batch check duplicates: ${formatErrorForLogging(error)}`)
This approach reduces duplication and provides richer debugging context when errors occur in production.
scripts/send-l2-batch.ts (1)
240-276: Consider documenting or validating type assertions for SDK interop. The functions use type assertions (as unknown as Transaction["content"]["data"]) at lines 255 and 271 to construct transaction payloads in the format expected by the DemoSDK. While these appear intentional for SDK integration, they bypass TypeScript's type checking. If the SDK's expectations change or the payload structure is incorrect, runtime errors could occur.
Consider:
- Adding inline comments explaining why these assertions are safe
- Adding runtime validation (e.g., schema checks) before the assertions if the SDK contract is unclear (see the sketch below)
- Or confirming with SDK documentation that the tuple shapes ["native", {...}] and ["l2psEncryptedTx", payload] match the expected API contract
📜 Review details
📒 Files selected for processing (4)
- scripts/send-l2-batch.ts
- src/libs/l2ps/L2PSConcurrentSync.ts
- src/libs/l2ps/L2PSConsensus.ts
- src/libs/l2ps/L2PSHashService.ts
🚧 Files skipped from review as they are similar to previous changes (1)
- src/libs/l2ps/L2PSHashService.ts
🧰 Additional context used
🧬 Code graph analysis (1)
src/libs/l2ps/L2PSConcurrentSync.ts (1)
- src/utilities/logger.ts (1)
  - error (125-132)
🔇 Additional comments (4)
src/libs/l2ps/L2PSConcurrentSync.ts (3)
60-63: LGTM: Consistent error message extraction. The error handling properly extracts string messages for logging and gracefully handles peer failures without breaking the parallel discovery process. The debug log level is appropriate for transient peer communication issues.
219-221: LGTM: Appropriate error propagation strategy. The error handling consistently extracts messages and applies the correct propagation strategy for each context:
- Line 221: Logs and continues processing remaining transactions (appropriate for bulk operations)
- Line 229: Logs and rethrows to signal sync failure (appropriate for critical operations)
- Line 278: Logs and continues with remaining peers (appropriate for fire-and-forget broadcasts)
The error handling standardization improves code consistency across all L2PS sync operations.
Also applies to: 226-229, 275-278
120-120: LGTM: Improved logging clarity. The warning logs now use consistent formatting with template literals, making it easier to identify which peer and L2PS UID failed validation during sync operations.
Also applies to: 152-152
scripts/send-l2-batch.ts (1)
284-415: Main execution flow is well-structured. The script's main flow demonstrates good practices:
- Comprehensive error handling with stack traces and non-zero exit codes
- Clear console output with emoji indicators for readability
- Optional status polling via --wait flag
- Proper cleanup and summary reporting
The initialization, transaction loop, and error handling are logically organized.
const repo = await (await import("@/model/datasource")).default.getInstance()
const ds = repo.getDataSource()
const proofRepo = ds.getRepository((await import("@/model/entities/L2PSProofs")).L2PSProof)

await proofRepo.update(proof.id, {
    status: "pending",
    applied_block_number: null,
    processed_at: null
})
🛠️ Refactor suggestion | 🟠 Major
Move dynamic imports outside the loop.
Lines 420-422 perform dynamic imports of @/model/datasource and @/model/entities/L2PSProofs inside the for (const proof of proofsToRollback) loop. This causes the same modules to be imported repeatedly for each proof, which is inefficient.
🔎 Proposed refactor
static async rollbackProofsForBlock(blockNumber: number): Promise<void> {
try {
// Get proofs that were applied in this block
const appliedProofs = await L2PSProofManager.getProofs(
"", // all L2PS networks
"applied",
1000
)
// Filter by block number and rollback in reverse order
const proofsToRollback = appliedProofs
.filter(p => p.applied_block_number === blockNumber)
.reverse()
log.info(`[L2PS Consensus] Rolling back ${proofsToRollback.length} proofs for block ${blockNumber}`)
+ // Import repository dependencies once before the loop
+ const repo = await (await import("@/model/datasource")).default.getInstance()
+ const ds = repo.getDataSource()
+ const proofRepo = ds.getRepository((await import("@/model/entities/L2PSProofs")).L2PSProof)
+
for (const proof of proofsToRollback) {
// Rollback each edit in reverse order
for (let i = proof.gcr_edits.length - 1; i >= 0; i--) {
const edit = proof.gcr_edits[i]
const rollbackEdit = { ...edit, isRollback: true }
const editAccount = 'account' in edit ? edit.account as string : proof.affected_accounts[0] || ''
const mockTx = {
hash: proof.transactions_hash,
content: {
type: "l2ps",
from: editAccount,
to: editAccount,
timestamp: Date.now()
}
}
await HandleGCR.apply(rollbackEdit, mockTx as any, true, false)
}
// Reset proof status to pending
- const repo = await (await import("@/model/datasource")).default.getInstance()
- const ds = repo.getDataSource()
- const proofRepo = ds.getRepository((await import("@/model/entities/L2PSProofs")).L2PSProof)
-
await proofRepo.update(proof.id, {
status: "pending",
applied_block_number: null,
processed_at: null
})
}🤖 Prompt for AI Agents
In @src/libs/l2ps/L2PSConsensus.ts around lines 420-428, The dynamic imports of
"@/model/datasource" and "@/model/entities/L2PSProofs" are inside the for (const
proof of proofsToRollback) loop (used to build repo, ds, and proofRepo), causing
repeated module loading; move these imports and the repo/ds/proofRepo
initialization outside the loop (e.g., import datasource and L2PSProof once
before iterating and create proofRepo once) and then reuse
proofRepo.update(proof.id, {...}) inside the loop to avoid redundant dynamic
imports.
…mproved clarity and consistency
Actionable comments posted: 2
Fix all issues with AI Agents 🤖
In @src/libs/l2ps/L2PSProofManager.ts:
- Around line 314-331: getProofs currently always sets where.l2ps_uid = l2psUid,
so passing an empty string (as in L2PSConsensus.rollbackProofsForBlock) filters
to only empty UIDs instead of returning all proofs; change getProofs to only add
the l2ps_uid predicate when l2psUid is non-empty/defined (e.g., if (l2psUid)
where.l2ps_uid = l2psUid) or make l2psUid optional/nullable and check for
undefined, or alternatively add a separate method like getProofsForAllNetworks
and have rollbackProofsForBlock call that—update the function signature/uses of
L2PSProofManager.getProofs and L2PSConsensus.rollbackProofsForBlock accordingly.
In @src/libs/l2ps/parallelNetworks.ts:
- Line 252: You're setting l2psJoinedUids on the function object instead of the
shared state instance; replace the assignment getSharedState.l2psJoinedUids =
l2psJoinedUids with an invocation on the shared state instance (call
getSharedState() and set its l2psJoinedUids property), e.g. use
getSharedState().l2psJoinedUids = l2psJoinedUids and ensure any typing/null
checks around getSharedState() are preserved.
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
src/libs/l2ps/parallelNetworks.ts (1)
272-283: Critical: getSharedState must be called as a function. getSharedState is assigned without invoking it. This will fail at runtime when attempting to access sharedState.signingAlgorithm on line 274, as sharedState will be a function reference rather than the shared state object.
🔎 Proposed fix
- const sharedState = getSharedState
+ const sharedState = getSharedState()
  const signature = await ucrypto.sign(
      sharedState.signingAlgorithm,
      new TextEncoder().encode(JSON.stringify(encryptedTx.content)),
  )
♻️ Duplicate comments (5)
src/libs/l2ps/L2PSConsensus.ts (3)
194-205: Rollback uses a single mockTx for all edits, potentially mismatching account context. The rollbackEdits method receives mockTx from the caller but uses it for all rollback operations. If HandleGCR.apply relies on mockTx.content.from/to for authorization, earlier edits may be rolled back with the wrong account context.
Consider recreating the mockTx per edit similar to applyGCREdits:
  private static async rollbackEdits(
      proof: L2PSProof,
      editResults: GCRResult[],
-     mockTx: any
+     _lastMockTx: any
  ): Promise<void> {
      for (let i = editResults.length - 2; i >= 0; i--) {
          if (editResults[i].success) {
+             const edit = proof.gcr_edits[i]
+             const editAccount = 'account' in edit ? edit.account as string : proof.affected_accounts[0] || ''
+             const mockTx = this.createMockTx(proof, editAccount)
              const rollbackEdit = { ...proof.gcr_edits[i], isRollback: true }
-             await HandleGCR.apply(rollbackEdit, mockTx, true, false)
+             await HandleGCR.apply(rollbackEdit, mockTx as any, true, false)
          }
      }
  }
380-430: Rollback and stats use empty UID which returns zero proofs. rollbackProofsForBlock and getBlockStats call L2PSProofManager.getProofs("", "applied", ...). Since getProofs filters by l2ps_uid === "", this returns zero proofs unless you have a network with an empty UID. Additionally, dynamic imports at lines 420-422 are inside the for loop, causing repeated module loading per proof.
Both issues were flagged in prior reviews. Consider:
- Adding a getAppliedProofsForBlock(blockNumber) helper in L2PSProofManager
- Moving imports outside the loop
443-456: getBlockStats has the same empty UID issue. Same problem as rollbackProofsForBlock - calling getProofs("") will return zero proofs.
src/libs/l2ps/L2PSProofManager.ts (1)
228-234: Nonce edits missing account validation. The comment says "Balance and nonce edits require account field", but only balance is checked. A malformed nonce edit without account would pass validation.
🔎 Proposed fix
  for (const edit of proof.gcr_edits) {
-     if (!edit.type || (edit.type === 'balance' && !('account' in edit))) {
+     if (!edit.type) {
+         log.warning(`[L2PS ProofManager] Proof ${proof.id} has invalid GCR edit`)
+         return false
+     }
+     if ((edit.type === 'balance' || edit.type === 'nonce') && !('account' in edit)) {
          log.warning(`[L2PS ProofManager] Proof ${proof.id} has invalid GCR edit`)
          return false
      }
  }
src/libs/l2ps/L2PSTransactionExecutor.ts (1)
136-143: Message claims "Proof created" even when no proof exists (zero-edit case). When gcrEdits.length === 0 and simulate = false, line 132's condition is false, so no proof is created. However, line 140 still reports "Proof created with 0 GCR edits", which is misleading.
🔎 Proposed fix
  return {
      success: true,
      message: simulate
          ? `Validated: ${gcrEdits.length} GCR edits would be generated`
-         : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`,
+         : gcrEdits.length === 0
+             ? "No GCR edits generated; transaction recorded without proof"
+             : `Proof created with ${gcrEdits.length} GCR edits (will apply at consensus)`,
      gcr_edits: gcrEdits,
      affected_accounts: [...new Set(affectedAccounts)]
  }
🧹 Nitpick comments (7)
src/libs/l2ps/zk/BunPlonkWrapper.ts (3)
85-87: Consider including array index in debug output. The log message shows "v: " for all 5 challenge values without indicating which index is being logged, making debugging harder.
🔎 Suggested fix
  for (let i = 1; i < 6; i++) {
-     logger.debug("v: " + Fr.toString(challenges.v[i], 16))
+     logger.debug(`v[${i}]: ` + Fr.toString(challenges.v[i], 16))
  }
108-111: Enhance error message with expected vs actual counts. When public signal count validation fails, including the expected and actual values aids debugging without leaking sensitive information.
🔎 Suggested fix
  if (publicSignals.length !== vk_verifier.nPublic) {
-     if (logger) logger.error("Invalid number of public inputs")
+     if (logger) logger.error(`Invalid number of public inputs: expected ${vk_verifier.nPublic}, got ${publicSignals.length}`)
      return false
  }
186-189: Inconsistent error logging pattern. The function accepts an optional logger parameter and uses it throughout, but the catch block uses console.error directly instead. This breaks logging consistency and may bypass structured logging pipelines.
🔎 Suggested fix
  } catch (error) {
      const message = error instanceof Error ? error.message : ((error as any)?.message || String(error))
-     console.error("PLONK Verify error:", message)
+     if (logger) {
+         logger.error(`PLONK Verify error: ${message}`)
+     } else {
+         console.error("PLONK Verify error:", message)
+     }
      return false
scripts/send-l2-batch.ts (1)
128-130: Add break after process.exit(0) to satisfy linter. While the code after process.exit(0) is unreachable, the static analyzer flags this as a fallthrough case. Adding an explicit break statement improves code clarity and silences the linter warning.
🔎 Proposed fix
  case "--help":
      printUsage()
      process.exit(0)
+     break
  default:
src/libs/l2ps/L2PSProofManager.ts (1)
201-212: Unused blockNumber parameter acknowledged but not filtered. The parameter is accepted but explicitly not used (comment says "reserved for future use"). This is acceptable as a placeholder, but consider documenting this behavior in the JSDoc or using the _blockNumber naming convention to signal intent.
🔎 Suggested improvement
- static async getProofsForBlock(blockNumber: number): Promise<L2PSProof[]> {
+ static async getProofsForBlock(_blockNumber: number): Promise<L2PSProof[]> {
src/libs/l2ps/L2PSTransactionExecutor.ts (2)
339-371: Consider handling potential BigInt conversion errors. Lines 360-362 convert values to BigInt without validation. If tx.content.amount, tx.content.nonce, or tx.content.timestamp contain non-numeric strings, this will throw.
🔎 Defensive conversion
  const l2psTx = txRepo.create({
      l2ps_uid: l2psUid,
      hash: tx.hash,
      encrypted_hash: encryptedHash || null,
      l1_batch_hash: l1BatchHash,
      batch_index: batchIndex,
      type: tx.content.type,
      from_address: tx.content.from as string,
      to_address: tx.content.to as string,
-     amount: BigInt(tx.content.amount || 0),
-     nonce: BigInt(tx.content.nonce || 0),
-     timestamp: BigInt(tx.content.timestamp || Date.now()),
+     amount: this.safeBigInt(tx.content.amount, 0n),
+     nonce: this.safeBigInt(tx.content.nonce, 0n),
+     timestamp: this.safeBigInt(tx.content.timestamp, BigInt(Date.now())),
      status: "pending",
      content: tx.content as Record<string, any>,
      execution_message: null
  })
Add a helper:
  private static safeBigInt(value: unknown, fallback: bigint): bigint {
      try {
          if (value === undefined || value === null) return fallback
          return BigInt(value as string | number | bigint)
      } catch {
          return fallback
      }
  }
468-485: Missing init() call in getNetworkStats. Unlike other methods that call await this.init() before accessing the datasource, getNetworkStats directly accesses Datasource.getInstance(). While this likely works because L2PSProofManager.getStats initializes its own repository, consistency would be better.
🔎 Suggested improvement
  static async getNetworkStats(l2psUid: string): Promise<{
      totalTransactions: number
      pendingProofs: number
      appliedProofs: number
  }> {
+     await this.init()
      const dsInstance = await Datasource.getInstance()
📜 Review details
📒 Files selected for processing (9)
- scripts/send-l2-batch.ts
- src/libs/l2ps/L2PSBatchAggregator.ts
- src/libs/l2ps/L2PSConcurrentSync.ts
- src/libs/l2ps/L2PSConsensus.ts
- src/libs/l2ps/L2PSHashService.ts
- src/libs/l2ps/L2PSProofManager.ts
- src/libs/l2ps/L2PSTransactionExecutor.ts
- src/libs/l2ps/parallelNetworks.ts
- src/libs/l2ps/zk/BunPlonkWrapper.ts
🧰 Additional context used
🧬 Code graph analysis (5)
src/libs/l2ps/L2PSProofManager.ts (3)
- src/libs/crypto/hashing.ts (1)
  - Hashing (15-26)
- src/model/entities/L2PSProofs.ts (1)
  - L2PSProofStatus (27-31)
- src/utilities/logger.ts (1)
  - error (125-132)
scripts/send-l2-batch.ts (2)
- src/utilities/logger.ts (1)
  - error (125-132)
- src/libs/blockchain/transaction.ts (1)
  - Transaction (44-538)
src/libs/l2ps/zk/BunPlonkWrapper.ts (1)
- src/utilities/logger.ts (1)
  - error (125-132)
src/libs/l2ps/L2PSHashService.ts (1)
- src/utilities/logger.ts (1)
  - error (125-132)
src/libs/l2ps/parallelNetworks.ts (2)
- src/utilities/logger.ts (1)
  - error (125-132)
- src/utilities/sharedState.ts (1)
  - getSharedState (276-278)
🪛 Biome (2.1.2)
scripts/send-l2-batch.ts
[error] 128-130: This case is falling through to the next case.
Add a break or return statement to the end of this case to prevent fallthrough.
(lint/suspicious/noFallthroughSwitchClause)
🔇 Additional comments (21)
src/libs/l2ps/zk/BunPlonkWrapper.ts (4)
190-195: Good practice: Curve termination in finally block. Properly terminating the curve in the finally block prevents memory leaks from the cryptographic library's internal state, especially important for long-running services.
240-253: Good security practice: Curve point validation. Validating that all G1 points in the proof are valid curve points is essential to prevent invalid curve attacks. This check ensures the proof elements are well-formed before proceeding with verification.
255-322: LGTM: Fiat-Shamir challenge computation. The challenge generation correctly implements the PLONK protocol's Fiat-Shamir transform with proper transcript accumulation across rounds. The 1-indexed v array matches the mathematical notation.
442-456: LGTM: Final pairing verification. The pairing check correctly implements the PLONK verification equation, combining the opening proofs (Wxi, Wxiw) with the accumulated verification elements (E, F) for the final bilinear pairing equality check.
src/libs/l2ps/L2PSHashService.ts (1)
176-177: LGTM! Improved error handling consistency. The consistent error message extraction pattern and the change from log.warn to log.warning improve code quality and align with error handling patterns across the L2PS subsystem.
Also applies to: 210-211, 264-265, 318-320, 328-330
src/libs/l2ps/L2PSConcurrentSync.ts (1)
7-14: LGTM! Excellent refactoring improves maintainability. The introduction of getErrorMessage and the extraction of four focused helper functions (getPeerMempoolInfo, getLocalMempoolInfo, fetchPeerTransactions, processSyncTransactions) significantly improves code organization and readability. The consistent error handling pattern aligns well with the broader L2PS enhancements.
Also applies to: 93-190, 219-241
src/libs/l2ps/L2PSConsensus.ts (5)
1-24: LGTM - Well-documented module header. Clear documentation of the L2PS consensus flow and module purpose.
25-55: LGTM - Interface definitions are well-structured. The ProofResult and L2PSConsensusResult interfaces provide comprehensive typing for consensus outcomes.
66-80: LGTM - Transaction hash collection logic. Proper fallback handling when transaction_hashes is missing.
112-171: LGTM - Main consensus application flow. The applyPendingProofs method properly:
- Handles empty proof case
- Tracks success/failure counts
- Deduplicates affected accounts
- Respects simulation mode
298-372: LGTM - L1 batch transaction creation. Good use of deterministic hashing with sorted keys for reproducibility. The batch payload structure appropriately hides transaction content while providing metadata.
src/libs/l2ps/L2PSProofManager.ts (4)
30-44: LGTM - Deterministic stringify utility. Proper recursive key sorting ensures consistent hashing after PostgreSQL JSONB round-trips.
71-98: LGTM - Lazy initialization with race protection. The init/initPromise pattern correctly prevents duplicate initialization.
111-172: LGTM - Proof creation with deterministic hashing. The createProof method correctly uses deterministicStringify for hash generation, addressing the previous non-determinism concern.
336-357: LGTM - Stats query implementation. Proper use of query builder with cloning for parallel count queries.
src/libs/l2ps/L2PSTransactionExecutor.ts (6)
1-26: LGTM - Well-documented unified state architecture. Clear explanation of the L2PS privacy model where state changes are applied to L1 via ZK proofs at consensus time.
49-77: LGTM - Repository initialization with race protection. Same safe init pattern as L2PSProofManager.
230-297: LGTM - Native transaction handling with proper validation. Good improvements:
- Type check for amount (typeof amount !== 'number' || !Number.isFinite(amount))
- Balance validation against L1 state
- Proper GCR edit generation for sender/receiver
302-334: LGTM - GCR edit validation. Previous unused repo issue is now fixed - uses init() instead. Switch cases properly use block scopes.
376-397: LGTM - Transaction status update with affected row check. Previous issue addressed - now properly checks result.affected === 0 and logs a warning.
402-439: LGTM - Query methods for transactions. Proper pagination support and compound where clause for account transactions.
static async getProofs(
    l2psUid: string,
    status?: L2PSProofStatus,
    limit: number = 100
): Promise<L2PSProof[]> {
    const repo = await this.getRepo()

    const where: any = { l2ps_uid: l2psUid }
    if (status) {
        where.status = status
    }

    return repo.find({
        where,
        order: { created_at: "DESC" },
        take: limit
    })
}
getProofs always filters by l2ps_uid, causing empty string to match only empty UIDs.
When called with l2psUid = "" (as done in L2PSConsensus.rollbackProofsForBlock), line 321 sets where.l2ps_uid = "", which only returns proofs with an empty UID rather than "all proofs".
Consider making the filter conditional or adding a dedicated method for cross-network queries:
static async getProofs(
- l2psUid: string,
+ l2psUid?: string,
status?: L2PSProofStatus,
limit: number = 100
): Promise<L2PSProof[]> {
const repo = await this.getRepo()
- const where: any = { l2ps_uid: l2psUid }
+ const where: any = {}
+ if (l2psUid) {
+ where.l2ps_uid = l2psUid
+ }
if (status) {
where.status = status
}🤖 Prompt for AI Agents
In @src/libs/l2ps/L2PSProofManager.ts around lines 314-331, getProofs currently
always sets where.l2ps_uid = l2psUid, so passing an empty string (as in
L2PSConsensus.rollbackProofsForBlock) filters to only empty UIDs instead of
returning all proofs; change getProofs to only add the l2ps_uid predicate when
l2psUid is non-empty/defined (e.g., if (l2psUid) where.l2ps_uid = l2psUid) or
make l2psUid optional/nullable and check for undefined, or alternatively add a
separate method like getProofsForAllNetworks and have rollbackProofsForBlock
call that—update the function signature/uses of L2PSProofManager.getProofs and
L2PSConsensus.rollbackProofsForBlock accordingly.
            log.error(`[L2PS] Failed to load L2PS ${uid}: ${message}`)
        }
    }
    getSharedState.l2psJoinedUids = l2psJoinedUids
Critical: getSharedState must be called as a function.
getSharedState is imported as a function but not invoked here. This will set the property on the function object itself rather than on the shared state instance, causing state to not persist correctly.
🔎 Proposed fix
- getSharedState.l2psJoinedUids = l2psJoinedUids
+ getSharedState().l2psJoinedUids = l2psJoinedUids
🤖 Prompt for AI Agents
In @src/libs/l2ps/parallelNetworks.ts around line 252, You're setting
l2psJoinedUids on the function object instead of the shared state instance;
replace the assignment getSharedState.l2psJoinedUids = l2psJoinedUids with an
invocation on the shared state instance (call getSharedState() and set its
l2psJoinedUids property), e.g. use getSharedState().l2psJoinedUids =
l2psJoinedUids and ensure any typing/null checks around getSharedState() are
preserved.
…or improved clarity and consistency
Actionable comments posted: 1
Fix all issues with AI Agents 🤖
In @src/libs/l2ps/L2PSConcurrentSync.ts:
- Around line 108-114: The getLocalMempoolInfo function currently types
lastTimestamp as any; change its return type to Promise<{ count: number;
lastTimestamp: number | string }> and ensure the signature for
getLocalMempoolInfo(l2psUid: string) reflects lastTimestamp: number | string so
it accurately represents returning 0 (number) or localTxs[localTxs.length -
1].timestamp (string) from L2PSMempool.getByUID.
♻️ Duplicate comments (10)
src/libs/l2ps/L2PSBatchAggregator.ts (7)
450-451: Guard BigInt() conversions to prevent throwing. If (tx.encrypted_tx as any)?.content?.amount is a non-integer string (e.g., "10.5", "abc", "0x..."), BigInt() will throw a runtime error and kill the proof generation.
🔎 Proposed fix
  const zkTransactions = transactions.map((tx) => {
-     const amount = BigInt((tx.encrypted_tx as any)?.content?.amount || 0)
+     let amount = 0n
+     try {
+         const raw = (tx.encrypted_tx as any)?.content?.amount
+         amount = raw === undefined || raw === null ? 0n : BigInt(raw)
+     } catch {
+         amount = 0n
+     }
87-88: Fix documentation mismatch: cleanup age is 5 minutes, not 1 hour. The inline comment states "(1 hour)" but CLEANUP_AGE_MS is set to 5 * 60 * 1000 (5 minutes).
🔎 Proposed fix
- /** Cleanup interval - remove batched transactions older than this (1 hour) */
+ /** Cleanup interval - remove batched transactions older than this (5 minutes) */
  private readonly CLEANUP_AGE_MS = 5 * 60 * 1000 // 5 minutes - confirmed txs can be cleaned up quickly
379-379: Critical: getSharedState must be invoked as a function. getSharedState is a function but is not being called here. Line 402 attempts to access sharedState.keypair, which will fail at runtime because sharedState will be a function reference instead of an object.
🔎 Proposed fix
- const sharedState = getSharedState
+ const sharedState = getSharedState()
533-533: Critical: getSharedState must be invoked as a function. Same issue as line 379 - getSharedState is not invoked. Line 536 attempts to access sharedState.l2psBatchNonce, which will fail at runtime.
🔎 Proposed fix
- const sharedState = getSharedState
+ const sharedState = getSharedState()
550-550: Critical: getSharedState must be invoked as a function. Same issue - getSharedState is not invoked. Line 552 attempts to assign sharedState.l2psBatchNonce, which will fail at runtime.
🔎 Proposed fix
- const sharedState = getSharedState
+ const sharedState = getSharedState()
571-571: Critical: getSharedState must be invoked as a function. Same issue - getSharedState is not invoked. Line 597 attempts to access sharedState.keypair, which will fail at runtime.
🔎 Proposed fix
- const sharedState = getSharedState
+ const sharedState = getSharedState()
759-759: Critical: getSharedState must be invoked as a function. Attempting to access the .l2psJoinedUids property directly on the function reference instead of calling it first. This will return undefined instead of the actual count.
🔎 Proposed fix
- joinedL2PSCount: getSharedState.l2psJoinedUids?.length || 0,
+ joinedL2PSCount: getSharedState().l2psJoinedUids?.length || 0,
scripts/send-l2-batch.ts (2)
120-127: Validate that parsed count is a valid number.
Number.parseIntreturnsNaNfor non-numeric input (e.g.,--count abc). The current checkoptions.count < 1doesn't catchNaNbecauseNaN < 1evaluates tofalse.🔎 Proposed fix
case "--count": if (value) { options.count = Number.parseInt(value, 10) - if (options.count < 1) { + if (Number.isNaN(options.count) || options.count < 1) { throw new Error("--count must be at least 1") } } break
319-383: Critical: Nonce is never incremented - all transactions after the first will be rejected.
currentNonceis initialized once at line 319 but never incremented inside the loop. Every transaction will use the same nonce value, causing all but the first to be rejected by the node with a duplicate nonce error.🔎 Proposed fix
console.log(` ✅ Outer hash: ${subnetTx.hash}`) console.log(` ✅ Inner hash: ${innerTx.hash}`) // Small delay between transactions to avoid nonce conflicts if (i < options.count - 1) { await new Promise(resolve => setTimeout(resolve, 500)) } + + // Increment nonce for next transaction + currentNonce++ }src/libs/l2ps/L2PSConcurrentSync.ts (1)
116-132: Same error handling pattern concern as `getPeerMempoolInfo`.

Similar to the previous helper, `fetchPeerTransactions` doesn't wrap `peer.call()` in a try-catch block. While currently safe due to the outer try-catch in `syncL2PSWithPeer`, consider making this helper more defensive for consistency and reusability.
🧹 Nitpick comments (1)
src/libs/l2ps/L2PSConcurrentSync.ts (1)
134-188: Consider more robust duplicate detection.

Line 177: The duplicate detection relies on string matching (`result.error?.includes("already")`), which is fragile and can break if the error message format changes in `L2PSMempool.addTransaction`.

A more maintainable approach would be to:

- Have `addTransaction` return structured error codes (e.g., `{ success: false, errorCode: 'DUPLICATE_TRANSACTION' }`), as sketched below
- Check for specific error types rather than substring matching

This would improve maintainability and make the code less brittle to changes in error messages.
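As a sketch of the suggested direction (the type and code names below are hypothetical, not the existing `addTransaction` API):

```ts
// Hypothetical structured result type for L2PSMempool.addTransaction.
type AddTransactionResult =
    | { success: true }
    | { success: false; errorCode: "DUPLICATE_TRANSACTION" | "VALIDATION_FAILED"; error: string }

function isDuplicate(result: AddTransactionResult): boolean {
    // Compare against a stable code instead of matching error message substrings.
    return !result.success && result.errorCode === "DUPLICATE_TRANSACTION"
}
```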
📜 Review details
Configuration used: Organization UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (5)
- scripts/send-l2-batch.ts
- src/libs/l2ps/L2PSBatchAggregator.ts
- src/libs/l2ps/L2PSConcurrentSync.ts
- src/libs/l2ps/L2PSHashService.ts
- src/libs/l2ps/L2PSProofManager.ts
🧰 Additional context used
🧬 Code graph analysis (3)
src/libs/l2ps/L2PSProofManager.ts (3)
- src/libs/crypto/hashing.ts (1): Hashing (15-26)
- src/model/entities/L2PSProofs.ts (1): L2PSProofStatus (27-31)
- src/utilities/logger.ts (1): error (125-132)

scripts/send-l2-batch.ts (2)
- src/utilities/logger.ts (1): error (125-132)
- src/libs/blockchain/transaction.ts (1): Transaction (44-538)

src/libs/l2ps/L2PSHashService.ts (1)
- src/utilities/logger.ts (1): error (125-132)
🪛 Biome (2.1.2)
scripts/send-l2-batch.ts
[error] 131-133: This case is falling through to the next case.
Add a break or return statement to the end of this case to prevent fallthrough.
(lint/suspicious/noFallthroughSwitchClause)
🔇 Additional comments (7)
src/libs/l2ps/L2PSHashService.ts (1)
176-177: LGTM! Consistent error message extraction improves log clarity.

The standardized error handling pattern across all catch blocks extracts the error message consistently and logs it as a string. This improves observability and makes error logs easier to parse.
Also applies to: 210-211, 264-265, 318-320, 328-330
src/libs/l2ps/L2PSProofManager.ts (1)
34-44: LGTM! Deterministic hashing and consistent error handling.

The `deterministicStringify` helper ensures consistent hashing by sorting object keys alphabetically, which is crucial after PostgreSQL JSONB round-trips. Combined with the standardized error message extraction pattern, this improves both correctness and observability.

Also applies to: 123-125, 165-166, 253-254
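For context, a key-sorting stringify typically looks like the following (a generic sketch, not necessarily the PR's exact implementation):

```ts
// Recursively stringify with object keys sorted alphabetically, so that
// logically equal objects hash identically regardless of key insertion
// order (e.g., after a PostgreSQL JSONB round-trip reorders keys).
// Edge cases like undefined values are ignored in this sketch.
function deterministicStringify(value: unknown): string {
    if (value === null || typeof value !== "object") {
        return JSON.stringify(value)
    }
    if (Array.isArray(value)) {
        return `[${value.map(deterministicStringify).join(",")}]`
    }
    const obj = value as Record<string, unknown>
    const body = Object.keys(obj)
        .sort()
        .map(key => `${JSON.stringify(key)}:${deterministicStringify(obj[key])}`)
        .join(",")
    return `{${body}}`
}

console.log(deterministicStringify({ b: 1, a: 2 })) // {"a":2,"b":1}
console.log(deterministicStringify({ a: 2, b: 1 })) // {"a":2,"b":1} - same hash input
```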
src/libs/l2ps/L2PSConcurrentSync.ts (5)
7-14: LGTM! Good error handling standardization.

The `getErrorMessage` helper provides robust error message extraction with proper fallbacks. This is a clean approach to standardizing error handling across the module.
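Such a helper commonly follows this shape (shown as an assumption; the PR's exact fallbacks may differ):

```ts
// Extract a human-readable message from an unknown thrown value,
// falling back to serialization for non-Error throwables.
function getErrorMessage(error: unknown): string {
    if (error instanceof Error) return error.message
    if (typeof error === "string") return error
    try {
        return JSON.stringify(error)
    } catch {
        return String(error)
    }
}
```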
69-71: LGTM! Consistent error handling.

The error handling now uses the standardized `getErrorMessage` helper, improving consistency across the module.
93-106: Verify error handling pattern for peer RPC calls.

`getPeerMempoolInfo` doesn't wrap `peer.call()` in a try-catch, unlike `discoverL2PSParticipants` (line 53) and `exchangeL2PSParticipation` (line 275), which handle peer call failures gracefully. While the current usage in `syncL2PSWithPeer` is protected by an outer try-catch (line 214), this inconsistency could lead to unexpected behavior if the helper is reused elsewhere.

Consider wrapping the peer call in a try-catch to return 0 on any error, making the helper more defensive and consistent with the module's error handling patterns.
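A defensive variant might look like this (the `peer.call` method name, parameters, and response shape are assumptions for illustration):

```ts
// Hypothetical defensive helper: treat any peer RPC failure as an empty mempool
// instead of letting the rejection propagate to the caller.
async function getPeerMempoolInfoSafe(
    peer: { call: (method: string, params: Record<string, unknown>) => Promise<any> },
    l2psUid: string,
): Promise<number> {
    try {
        const response = await peer.call("l2psGetMempoolInfo", { l2psUid })
        return typeof response?.count === "number" ? response.count : 0
    } catch {
        return 0 // Unreachable or misbehaving peers contribute no transactions
    }
}
```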
217-239: LGTM! Well-structured refactoring.

The refactored `syncL2PSWithPeer` is much cleaner and more maintainable. Breaking the logic into focused helper functions improves readability and testability. The error handling is now consistent throughout.
285-287: LGTM! Consistent error handling.

The error handling now uses the standardized `getErrorMessage` helper, maintaining consistency with the rest of the refactoring.
```ts
async function getLocalMempoolInfo(l2psUid: string): Promise<{ count: number, lastTimestamp: any }> {
    const localTxs = await L2PSMempool.getByUID(l2psUid, "processed")
    return {
        count: localTxs.length,
        lastTimestamp: localTxs.length > 0 ? localTxs[localTxs.length - 1].timestamp : 0
    }
}
```
🧩 Analysis chain

🏁 Script executed:

```shell
# First, find the file and check the L2PSMempool usage
find . -type f -name "*L2PS*" | head -20
```

🏁 Script executed:

```shell
# Search for L2PSMempool class definition
rg "class L2PSMempool" -A 20
```

🏁 Script executed:

```shell
# Look at the getLocalMempoolInfo function and its context
cat -n src/libs/l2ps/L2PSConcurrentSync.ts | sed -n '100,130p'
```

🏁 Script executed:

```shell
# Find L2PSMempool getByUID method to understand return type
rg "getByUID" -B 2 -A 10
```

🏁 Script executed:

```shell
# Search for timestamp field in transaction objects
rg "timestamp" src/libs/l2ps/ -B 2 -A 2
```

🏁 Script executed:

```shell
# Check how since_timestamp is used in fetchPeerTransactions
rg "fetchPeerTransactions" -B 5 -A 15
```

🏁 Script executed:

```shell
# Get the full L2PSMempoolTx entity definition
cat -n src/model/entities/L2PSMempool.ts | head -80
```

🏁 Script executed:

```shell
# Check if timestamp is marked as optional or required in the entity
rg "@Column|timestamp" src/model/entities/L2PSMempool.ts -B 2 -A 2
```
Replace `any` with a proper return type definition.

Line 108: The return type uses `any` for `lastTimestamp`, which hides a type inconsistency. The function returns either `0` (number) when the array is empty or `localTxs[localTxs.length - 1].timestamp` (string, based on the `L2PSMempoolTx` entity definition). Replace `lastTimestamp: any` with `lastTimestamp: number | string` to explicitly reflect that the function returns different types depending on the condition.
🤖 Prompt for AI Agents
In @src/libs/l2ps/L2PSConcurrentSync.ts around lines 108-114, the
getLocalMempoolInfo function currently types lastTimestamp as any; change its
return type to Promise<{ count: number; lastTimestamp: number | string }> and
ensure the signature for getLocalMempoolInfo(l2psUid: string) reflects
lastTimestamp: number | string so it accurately represents returning 0 (number)
or localTxs[localTxs.length - 1].timestamp (string) from L2PSMempool.getByUID.
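Applied to the helper quoted above, the tightened signature would read as follows (same body; only the return type changes):

```ts
async function getLocalMempoolInfo(
    l2psUid: string,
): Promise<{ count: number; lastTimestamp: number | string }> {
    const localTxs = await L2PSMempool.getByUID(l2psUid, "processed")
    return {
        count: localTxs.length,
        // timestamp is a string on the entity; 0 is the empty-mempool sentinel
        lastTimestamp: localTxs.length > 0 ? localTxs[localTxs.length - 1].timestamp : 0,
    }
}
```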
…mponents for improved clarity and consistency
…tter clarity and maintainability
PR Type
Enhancement
Description
- Implemented L2PS Batch Aggregator service to collect and submit L2PS transactions to the main mempool
- Added nonce field and reference_block tracking for transaction ordering and consistency
- Enhanced L2PS mempool with batch operations, status lifecycle management, and cleanup utilities
- Improved transaction type handling for L2PS encrypted transactions and identity normalization
- Added comprehensive database entity support and index optimization for L2PS operations
Diagram Walkthrough
File Walkthrough
12 files
- New L2PS batch aggregator service implementation
- Enhanced L2PS mempool with batch operations and status lifecycle
- Initialize L2PS batch aggregator and parallel networks
- Add reference_block tracking for mempool transactions
- Add L2PS encrypted transaction type handling
- Update L2PS transaction type imports
- Implement robust identity normalization and validation
- Add sequence_number field and optimize indexes
- Change nonce column to bigint with nullable default
- Change nonce column to bigint with nullable default
- Update L2PS transaction type imports
- Update SerializedEncryptedObject type import

2 files
- Fix hex key file parsing and async encryption handling
- Skip GCR edit processing for non-GCR transactions

1 file
- Register L2PS entities in database datasource

Summary by CodeRabbit
New Features
Bug Fixes
Chores