Problem
Chat agent responses are very slow (30+ seconds) and sometimes fail to appear, despite existing loading indicators.
Root Cause Analysis
- AI conversation loops take excessive time depending on prompt complexity
- No timeout handling for long-running requests
- Network issues and provider delays are not handled gracefully
- Race conditions between the mutation response and session query refetches (see the sketch after this list)
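A minimal sketch of the suspected race, assuming the client fires a send-message mutation and then refetches the session: sendMessage, fetchSession, and the timings below are hypothetical stand-ins, not the actual client code.

```typescript
// Hypothetical stand-ins for the real client calls; timings are illustrative only.
const delay = (ms: number) => new Promise<void>((resolve) => setTimeout(resolve, ms));

let storedReply: string | null = null;

// Mutation: triggers AIConversationLoop on the server and persists the reply late.
async function sendMessage(text: string): Promise<void> {
  await delay(30_000); // simulate the slow AI loop (30+ seconds, per this issue)
  storedReply = `assistant reply to: ${text}`;
}

// Session query: returns whatever has been persisted so far.
async function fetchSession(): Promise<string | null> {
  await delay(100);
  return storedReply;
}

async function sendAndRefresh(text: string): Promise<void> {
  const mutation = sendMessage(text);         // not awaited before refetching
  const session = await fetchSession();       // resolves long before the mutation
  console.log("rendered session:", session);  // null: the reply never reaches the UI
  await mutation;                             // reply is persisted now, but nothing re-renders
}

sendAndRefresh("complex prompt");
```

If the session refetch wins this race, the loading indicator eventually clears but the assistant message is missing, which matches the "sometimes fail to appear" symptom.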
Steps to Reproduce
- Send a complex message that requires extensive AI processing
- Observe long wait times before the response appears
- Sometimes the response never appears, despite the loading indicators
Expected Behavior
- Responses within a reasonable time limit (<15 seconds for complex requests)
- Proper timeout handling with user feedback
- Reliable delivery of responses
Technical Details
- API: POST /datamachine/v1/chat with AIConversationLoop processing (a request sketch follows this list)
- Frontend: Loading states exist, but responses are still slow or fail to appear
- Backend: No request timeouts; potential race conditions
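For reference, a sketch of calling the chat endpoint with a client-side timeout via AbortController. The endpoint path comes from this issue; the payload fields, header, and 15-second budget are assumptions for illustration.

```typescript
// Sketch: POST to the chat endpoint, aborting if no response arrives in time.
async function postChat(message: string, sessionId: string): Promise<unknown> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), 15_000); // abort after 15 s

  try {
    const response = await fetch("/datamachine/v1/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message, session_id: sessionId }), // assumed payload shape
      signal: controller.signal,
    });
    if (!response.ok) {
      throw new Error(`Chat request failed: ${response.status}`);
    }
    return await response.json();
  } finally {
    clearTimeout(timer); // avoid aborting a request that already completed
  }
}
```

An aborted request rejects with an AbortError, which the UI can surface as a timeout message instead of leaving the loading indicator spinning indefinitely.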
Proposed Solutions
- Add configurable timeout limits for AI requests
- Implement retry logic for failed requests (a combined timeout-and-retry sketch follows this list)
- Optimize AI loop performance (caching, prompt optimization)
- Add server-side request queuing to prevent overload
- Better error propagation to frontend
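As a sketch of the first two proposed items (configurable timeouts and retry logic), here is a generic retry wrapper with exponential backoff around any request function. The parameter names, defaults, and the example payload are illustrative assumptions, not an existing API.

```typescript
// Sketch: configurable timeout plus retry with exponential backoff.
async function withRetry<T>(
  request: (signal: AbortSignal) => Promise<T>,
  { attempts = 3, timeoutMs = 15_000, baseDelayMs = 1_000 } = {},
): Promise<T> {
  let lastError: unknown;
  for (let attempt = 0; attempt < attempts; attempt++) {
    const controller = new AbortController();
    const timer = setTimeout(() => controller.abort(), timeoutMs);
    try {
      return await request(controller.signal); // success: return immediately
    } catch (error) {
      lastError = error; // remember the last failure for the final error message
    } finally {
      clearTimeout(timer);
    }
    if (attempt < attempts - 1) {
      // Exponential backoff before the next attempt: 1 s, 2 s, 4 s, ...
      await new Promise((resolve) => setTimeout(resolve, baseDelayMs * 2 ** attempt));
    }
  }
  throw new Error(`Chat request failed after ${attempts} attempts: ${String(lastError)}`);
}

// Usage sketch: the payload shape is assumed, not the actual client code.
withRetry((signal) =>
  fetch("/datamachine/v1/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message: "hello" }),
    signal,
  }).then((r) => (r.ok ? r.json() : Promise.reject(new Error(`HTTP ${r.status}`)))),
).then(
  (reply) => console.log("reply:", reply),
  (error) => console.error("giving up:", error), // propagate to the UI in the real client
);
```

Rejecting with a descriptive error after the final attempt gives the frontend something concrete to show the user, which ties into the error-propagation item above.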