sadiq-bd/ai-support-agent

🤖 AI Support Agent

A full-stack, edge-deployed AI support agent with a built-in human-in-the-loop dashboard. The AI automatically handles user queries via Telegram, but a human operator can seamlessly take over the conversation in real-time through a web interface powered by WebSockets.

Screenshot

Built on Cloudflare Workers with Hono, Durable Objects, and D1 — zero cold starts, globally distributed, and serverless.

Author: Sadiq Ahmed · sadiq@sadiq.is-a.dev


✨ Features

  • AI-Powered Auto-Replies — Incoming Telegram messages are answered by an LLM automatically (configurable via any OpenAI-compatible API).
  • Human-in-the-Loop Dashboard — Operator web UI to monitor all conversations and jump into any chat in real-time.
  • Real-Time WebSocket Sync — Messages from Telegram are broadcast instantly to the dashboard; operator replies are sent back to Telegram.
  • Hibernatable Durable Objects — WebSocket connections persist across DO evictions using Cloudflare's hibernation API, keeping costs near zero during idle periods.
  • Smart AI / Human Toggle — AI replies can be toggled globally, or set to activate only when the human operator is inactive (auto-detected via WebSocket presence).
  • Human-Like Typing Simulation — AI responses are delivered with randomized typing delays and typing... indicators to feel natural.
  • Chat Search & Pagination — Search across all conversations and paginate through message history.
  • Unread Message Indicators — Dashboard shows unseen messages with visual badges.
  • Mobile-Responsive Dashboard — The operator UI works on phones with a native-app-like experience.
  • Basic Auth Protected — Dashboard and API endpoints are secured behind HTTP Basic Authentication stored in D1.
  • Webhook Secret Validation — Telegram webhook requests are verified using a secret token.
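
The hibernation-friendly broadcast pattern behind the Durable Object feature can be sketched roughly as follows. This is a simplified illustration, not the actual `ChatHandler` code: in a real Worker, `ctx.acceptWebSocket(ws)` registers the socket with the runtime so the object can be evicted while connections stay open, and the `webSocketMessage()` / `webSocketClose()` hibernation handlers wake it back up on activity. The `WS` type below is a stand-in for Cloudflare's WebSocket.

```typescript
// Simplified sketch of a hibernatable Durable Object's fan-out logic.
// `WS` is a stand-in for Cloudflare's WebSocket type; the real ChatHandler
// would call this.ctx.acceptWebSocket(serverSocket) instead of accept().
type WS = { send(data: string): void };

class ChatHandlerSketch {
  private sockets = new Set<WS>();

  // Register a newly connected operator dashboard socket.
  accept(ws: WS): void {
    this.sockets.add(ws);
  }

  // Fan a payload out to every connected operator dashboard.
  broadcast(payload: unknown): string {
    const msg = JSON.stringify(payload);
    for (const ws of this.sockets) ws.send(msg);
    return msg;
  }
}
```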

🏗️ Architecture

┌──────────────┐     Webhook      ┌────────────────────────────────────────────┐
│   Telegram   │ ──────────────▶  │          Cloudflare Worker (Hono)          │
│    User      │ ◀──────────────  │                                            │
└──────────────┘   Bot API Reply  │  ┌─────────┐ ┌────────┐ ┌──────────────┐   │
                                  │  │ LLM API │ │   D1   │ │Durable Object│   │
                                  │  │ (Groq)  │ │(SQLite)│ │ (ChatHandler)│   │
                                  │  └─────────┘ └────────┘ └──────┬───────┘   │
                                  │                                │          │
                                  └────────────────────────────────┼──────────┘
                                                                   │
                                                              WebSocket
                                                                   │
                                                        ┌──────────▼──────────┐
                                                        │  Operator Dashboard │
                                                        │   (Browser / Web)   │
                                                        └─────────────────────┘

How It Works

  1. A Telegram user sends a message → Telegram delivers it to the Worker via webhook.
  2. The Worker upserts the user in D1, then decides whether to generate an AI reply based on the current state (use_ai_reply, operator activity).
  3. If AI is enabled, the Worker calls the configured LLM endpoint (OpenAI-compatible) with conversation history for context.
  4. The AI reply is sent back to Telegram with a simulated typing delay.
  5. Simultaneously, the incoming message (and AI response, if any) is broadcast via WebSocket to all connected operator dashboards.
  6. From the dashboard, a human operator can send a reply — which goes to the Durable Object → Telegram Bot API → user, and is logged in D1.
  7. When the operator's WebSocket disconnects, their last-active timestamp is recorded. The AI resumes auto-replying after a configurable inactivity window (default: 5 seconds).
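
The arbitration in steps 2 and 7 can be sketched as a small pure function. This is an illustrative reconstruction, not the project's actual code; the state keys and the 5-second default come from the README's runtime-state description.

```typescript
// Illustrative reconstruction of the AI/human arbitration described above.
// The state keys mirror the README's runtime state; the real logic lives in
// the Worker's webhook handler.
interface AppState {
  use_ai_reply: number;               // 1 = AI auto-replies enabled
  use_ai_only_when_inactive: number;  // 1 = defer to a present operator
  admin_last_active_at: number;       // epoch ms of last operator activity
}

const INACTIVITY_WINDOW_MS = 5_000;   // README default: 5 seconds

function shouldAiReply(state: AppState, now: number): boolean {
  if (!state.use_ai_reply) return false;             // AI globally disabled
  if (!state.use_ai_only_when_inactive) return true; // AI always answers
  // Otherwise the AI steps in only once the operator has gone quiet.
  return now - state.admin_last_active_at > INACTIVITY_WINDOW_MS;
}
```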

📁 Project Structure

ai-support-agent/
├── public/
│   ├── scripts/
│   │   └── chat.js              # Client-side WebSocket logic for the dashboard
│   └── robots.txt
├── scripts/
│   ├── models-gen.mjs           # Auto-generates models/index.ts barrel file
│   └── webhook-setup.js         # Interactive Telegram webhook registration
├── src/
│   ├── app.ts                   # Hono app entry — routes & webhook handler
│   ├── ChatHandler.ts           # Durable Object — WebSocket server & chat actions
│   ├── configs.ts               # All configurable constants (LLM, pagination, prompts)
│   ├── utils.ts                 # JSON helpers, base64, IP utilities
│   ├── database/
│   │   ├── factory.ts           # Drizzle ORM D1 connection factory
│   │   └── migrations/          # Drizzle-generated SQL migrations
│   ├── helpers/
│   │   ├── chats.ts             # Chat & message history queries
│   │   ├── common.ts            # askLLM(), chatServer(), pagination utilities
│   │   ├── states.ts            # App state (KV-like) get/set/init via D1
│   │   └── telegram.ts          # Telegram Bot API helpers (send, typing, humanReply)
│   ├── middlewares/
│   │   ├── BasicAuth.ts         # HTTP Basic Auth middleware (credentials in D1)
│   │   └── TgSecretCheck.ts     # Telegram webhook secret verification
│   ├── models/
│   │   ├── BasicAuthCredentials.ts
│   │   ├── ChatLog.ts
│   │   ├── ChatUser.ts
│   │   ├── State.ts
│   │   └── index.ts             # Auto-generated barrel export
│   ├── pages/
│   │   └── chat.tsx             # Hono JSX — operator dashboard HTML
│   └── types/
│       └── webhookupdate.ts     # Telegram webhook payload types
├── drizzle.config.ts
├── wrangler.jsonc               # Cloudflare Worker configuration
├── tsconfig.json
└── package.json

🛠️ Tech Stack

| Layer | Technology |
| --- | --- |
| Runtime | Cloudflare Workers |
| Framework | Hono (with JSX for server-rendered pages) |
| Database | Cloudflare D1 (SQLite) |
| ORM | Drizzle ORM |
| Real-Time | Durable Objects with Hibernatable WebSockets |
| AI / LLM | Any OpenAI-compatible API (default: Groq) |
| Bot Platform | Telegram Bot API (webhooks) |
| Package Mgr | Bun |

🚀 Getting Started

Prerequisites

  • Bun installed
  • A Cloudflare account with access to Workers and D1
  • A Telegram bot token (from @BotFather)

1. Clone & Install

git clone https://github.com/your-username/ai-support-agent.git
cd ai-support-agent
bun install

2. Create a D1 Database

bunx wrangler d1 create ai-support-agent

Copy the database_id from the output.

3. Configure wrangler.jsonc

Open wrangler.jsonc and fill in the placeholders:

{
    "d1_databases": [
        {
            "database_name": "ai-support-agent",
            "database_id": "<your-d1-database-uuid>"       // from step 2
        }
    ],
    "vars": {
        "LLM_API_KEY": "<your-llm-api-key>",              // e.g. Groq API key
        "TELEGRAM_TOKEN": "<your-telegram-bot-token>",     // from BotFather
        "TELEGRAM_WEBHOOK_SECRET": "<generated-later>"     // from step 5
    }
}

Tip: For production, use wrangler secret put instead of plaintext vars for sensitive values like LLM_API_KEY and TELEGRAM_TOKEN.

4. Run Database Migrations

Generate and apply the schema:

bun run db:generate     # generates SQL from Drizzle models
bun run db:migrate      # applies migrations locally

For production:

bunx wrangler d1 migrations apply ai-support-agent --remote

5. Set Up the Telegram Webhook

bun run webhook-set

This interactive script will:

  1. Ask for your bot token and worker endpoint (e.g., https://ai-support-agent.<your-subdomain>.workers.dev/webhook-x)
  2. Register the webhook with Telegram
  3. Output a webhook secret — copy this into wrangler.jsonc as TELEGRAM_WEBHOOK_SECRET

6. Add Dashboard Credentials

Insert a Basic Auth user directly into D1:

bunx wrangler d1 execute ai-support-agent --local --command \
  "INSERT INTO basic_auth_credentials (user, password) VALUES ('admin', 'your-secure-password');"

For production, use --remote instead of --local.

7. Start Development

bun run dev

The worker runs locally at http://localhost:8787.

  • Dashboard: http://localhost:8787/chats (requires Basic Auth)
  • Webhook: POST http://localhost:8787/webhook-x

8. Deploy to Production

bun run deploy

🔧 Configuration

All tunable constants live in src/configs.ts:

| Constant | Default | Description |
| --- | --- | --- |
| `LLM_ENDPOINT` | `https://api.groq.com/openai/v1/chat/completions` | OpenAI-compatible chat completions endpoint |
| `LLM_MODEL` | `openai/gpt-oss-120b` | Model identifier sent to the LLM API |
| `LLM_MAX_CHAT_HISTORY` | `15` | Max previous messages sent as context to the LLM |
| `LLM_SYSTEM_PROMPT` | (see file) | System prompt defining the AI agent's personality |
| `MAX_MESSAGE_HISTORY` | `20` | Messages per page in the dashboard |
| `MAX_CHAT_HISTORY` | `20` | Chats per page in the sidebar |
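
To illustrate how these constants plug into an OpenAI-compatible request, here is a hedged sketch. `buildLLMRequest` is a hypothetical helper (the real call lives in `askLLM()` in src/helpers/common.ts), and the system prompt below is a placeholder, not the one in src/configs.ts.

```typescript
// Hypothetical helper showing how the configs above map onto an
// OpenAI-compatible chat completions request body. Names are illustrative;
// the actual implementation is askLLM() in src/helpers/common.ts.
const LLM_MODEL = "openai/gpt-oss-120b";
const LLM_MAX_CHAT_HISTORY = 15;
const LLM_SYSTEM_PROMPT = "You are a helpful support agent."; // placeholder

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildLLMRequest(history: ChatMessage[]): {
  model: string;
  messages: ChatMessage[];
} {
  return {
    model: LLM_MODEL,
    messages: [
      { role: "system", content: LLM_SYSTEM_PROMPT },
      // Only the most recent LLM_MAX_CHAT_HISTORY turns are sent as context.
      ...history.slice(-LLM_MAX_CHAT_HISTORY),
    ],
  };
}
```

The resulting object would be POSTed as JSON to `LLM_ENDPOINT` with a `Bearer` token from `LLM_API_KEY`.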

Runtime State (Dynamic)

These are stored in D1 and can be toggled via the API:

| State Key | Default | Description |
| --- | --- | --- |
| `use_ai_reply` | `1` | `1` = AI auto-replies enabled, `0` = disabled |
| `use_ai_only_when_inactive` | `1` | `1` = AI replies only when the operator is offline |
| `admin_last_active_at` | — | Timestamp of last operator WebSocket activity |

Toggle AI via the API:

# Disable AI replies
curl -u admin:password "https://your-worker.dev/chat/options?use_ai_reply=0"

# Enable AI only when operator is inactive
curl -u admin:password "https://your-worker.dev/chat/options?use_ai_only_when_inactive=1"

📡 API Routes

| Method | Path | Auth | Description |
| --- | --- | --- | --- |
| POST | `/webhook-x` | TG Secret | Telegram webhook endpoint |
| GET | `/chats` | Basic Auth | Operator dashboard (HTML) |
| GET | `/chat/options` | Basic Auth | Toggle AI reply settings (query params) |
| GET | `/chat/socket` | Basic Auth | WebSocket upgrade for real-time chat |

WebSocket Actions

Once connected to /chat/socket, the client communicates via JSON messages:

| Action | Direction | Payload | Description |
| --- | --- | --- | --- |
| `loadChats` | Client → Server | `{ action, page?, limit?, search? }` | Fetch paginated chat list |
| `loadMessages` | Client → Server | `{ action, chat_id, page?, limit? }` | Fetch message history for a chat |
| `sendMessage` | Client → Server | `{ action, chat_user_id, chat_id, text }` | Send a reply to a Telegram user |
| `receiveMessage` | Server → Client | `{ action, data: { chat_id, message, ... } }` | Broadcast incoming Telegram message |
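
A minimal client-side sketch of this protocol might look like the following. The payload shapes follow the action table above; anything beyond those fields (and the helper name `encode`) is an assumption, not a quote from public/scripts/chat.js.

```typescript
// Sketch of the dashboard's client-side protocol (cf. public/scripts/chat.js).
// Payload shapes follow the WebSocket action table; names beyond it are
// illustrative assumptions.
type ClientAction =
  | { action: "loadChats"; page?: number; limit?: number; search?: string }
  | { action: "loadMessages"; chat_id: string; page?: number; limit?: number }
  | { action: "sendMessage"; chat_user_id: number; chat_id: string; text: string };

// Serialize an action for transmission over the socket.
function encode(msg: ClientAction): string {
  return JSON.stringify(msg);
}

// Usage against a live socket would look roughly like:
//   const ws = new WebSocket("wss://your-worker.dev/chat/socket");
//   ws.onopen = () => ws.send(encode({ action: "loadChats", page: 1 }));
//   ws.onmessage = (ev) => {
//     const msg = JSON.parse(ev.data); // e.g. a receiveMessage broadcast
//   };
```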

📐 Database Schema

Four tables managed by Drizzle ORM:

chat_users — Telegram users who have messaged the bot

| Column | Type | Notes |
| --- | --- | --- |
| `id` | INTEGER | Primary key, auto-increment |
| `chat_id` | TEXT | Unique, Telegram chat ID |
| `user_id` | TEXT | Telegram user ID |
| `username` | TEXT | Telegram username |
| `first_name` / `last_name` | TEXT | User display name |
| `last_message` | TEXT | Preview of last message |
| `last_message_at` | TEXT | Timestamp (epoch ms) |
| `last_message_is_client` | INTEGER | `1` = from user, `0` = reply |
| `last_message_seen_at` | TEXT | Null if unread |

chat_logs — Full message history

| Column | Type | Notes |
| --- | --- | --- |
| `id` | INTEGER | Primary key, auto-increment |
| `chat_user_id` | INTEGER | FK → `chat_users.id` |
| `message` | TEXT | User's message |
| `response` | TEXT | AI or operator reply |
| `is_ai_response` | INTEGER | `1` = AI-generated, `0` = human |
| `created_at` | TEXT | Timestamp (epoch ms) |

states — Key-value configuration store

| Column | Type | Notes |
| --- | --- | --- |
| `id` | INTEGER | Primary key |
| `key` | TEXT | Unique config key |
| `value` | TEXT | Config value |

basic_auth_credentials — Dashboard login credentials

| Column | Type | Notes |
| --- | --- | --- |
| `id` | INTEGER | Primary key |
| `user` | TEXT | Unique username |
| `password` | TEXT | Plaintext password |

📜 Available Scripts

| Command | Description |
| --- | --- |
| `bun run dev` | Start local development server (Wrangler) |
| `bun run deploy` | Deploy to Cloudflare Workers (minified) |
| `bun run db:generate` | Auto-generate model barrel + Drizzle SQL migrations |
| `bun run db:migrate` | Apply migrations to local D1 |
| `bun run webhook-set` | Interactive Telegram webhook setup |
| `bun run cf-typegen` | Generate TypeScript types for Cloudflare bindings |

🔒 Security Notes

  • Webhook verification — Every incoming Telegram request is validated against the TELEGRAM_WEBHOOK_SECRET header.
  • Dashboard auth — All operator endpoints (/chats, /chat/socket, /chat/options) require HTTP Basic Auth.
  • Secrets management — For production, use wrangler secret put instead of storing sensitive values as plaintext vars in wrangler.jsonc, where they risk being committed to version control or exposed in logs.
  • Passwords — basic_auth_credentials stores plaintext passwords. Hash them before any production use; hashing is planned but not yet implemented.

👤 Author

Sadiq Ahmed


📄 License

Copyright © 2026 Sadiq Ahmed. All rights reserved.

This project is open source. See LICENSE for details.
