SeeQL is an AI-powered conversational interface for querying databases using natural language. Built with Next.js and the Model Context Protocol (MCP), it lets users ask complex questions in plain English and receive rich, visual dashboards (Charts, Graphs, Tables, Metrics) instead of raw SQL results.
- Natural Language to SQL: Converts conversational questions into SQL queries automatically
- Dynamic Visualizations: Renders appropriate UI based on query results (stats, bar/pie charts, data tables)
- MCP Integration: Uses Model Context Protocol for secure database schema introspection and query execution
- Two-Stage Query Processing: Smart routing determines if a database query is needed before executing
- Large Dataset Handling: Efficiently handles large results with pagination, search, and print functionality
- Persistent Conversations: Thread-based chat history stored in SQLite
- Conversation Starters: Pre-built prompts for common queries
```
seeQL/
├── database/                # Database files and schema
│   ├── university.db        # Pre-seeded SQLite database with university ERP data
│   └── init.sql             # Database schema (30+ tables)
├── src/                     # Next.js application source
│   ├── app/
│   │   ├── api/
│   │   │   ├── chat/            # Main chat endpoint (NL-to-SQL pipeline)
│   │   │   ├── mcp/             # MCP SSE transport endpoints
│   │   │   ├── messages/        # Message persistence
│   │   │   ├── query-results/   # Large dataset retrieval
│   │   │   └── threads/         # Thread CRUD operations
│   │   ├── page.tsx             # Main application page
│   │   └── layout.tsx           # Root layout
│   ├── components/
│   │   ├── CustomAssistantMessage.tsx   # Renders openui-lang components
│   │   └── FullResultsModal.tsx         # Modal for large datasets
│   └── lib/
│       ├── db.ts            # SQLite database operations
│       ├── mcp.ts           # MCP server and tools
│       └── query-cache.ts   # Query result caching (legacy)
├── public/                  # Static assets
│   └── logo.png             # Application logo
├── package.json             # Next.js app configuration
└── .env.local               # Environment variables
```
- User Input: User submits a natural language question
- Schema Fetch: Backend retrieves database schema and relationships via MCP
- Query Decision: LLM determines if a SQL query is needed
- SQL Generation: If needed, LLM generates a SELECT query
- Query Execution: MCP executes the query in a secure sandbox
- UI Generation: LLM generates openui-lang code for visualization
- Rendering: React components render charts, tables, or metrics
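The two-stage flow above can be sketched roughly as follows. This is an illustrative sketch, not the actual SeeQL code: all names are assumptions, and the keyword heuristic stands in for the LLM routing call.

```typescript
// Hypothetical sketch of the two-stage pipeline; the real routing decision
// is made by the LLM, stubbed here with a keyword heuristic for illustration.
type RouteDecision = { needsQuery: boolean; reason: string };

// Stage 1: decide whether the question needs a database query at all.
function decideRoute(question: string): RouteDecision {
  const dataKeywords = ["how many", "list", "show", "top", "count", "average"];
  const needsQuery = dataKeywords.some((k) => question.toLowerCase().includes(k));
  return { needsQuery, reason: needsQuery ? "data retrieval" : "general chat" };
}

// Stage 2 runs only when a query is needed: generate SQL, execute it via MCP,
// then hand the rows to the UI-generation step.
async function handleQuestion(
  question: string,
  generateSql: (q: string) => Promise<string>,
  runQuery: (sql: string) => Promise<unknown[]>,
): Promise<{ rows: unknown[]; sql?: string }> {
  if (!decideRoute(question).needsQuery) return { rows: [] };
  const sql = await generateSql(question);
  return { rows: await runQuery(sql), sql };
}
```

Questions that need no data (greetings, follow-up clarifications) skip SQL generation entirely, which saves a model round trip.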
When query results exceed 50 rows:
- Full results are saved to the database
- A 20-row preview is sent to the LLM
- LLM generates a "View All" button
- Clicking the button opens `FullResultsModal` with:
  - Infinite scroll pagination (100 rows/page)
  - Server-side search across all columns
  - Print functionality for all data
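The threshold logic above can be pictured with a small helper. The constants match the row counts quoted in this section; the function and field names are assumptions for illustration.

```typescript
// Sketch of the large-result handling: results over 50 rows are persisted
// in full, and only a 20-row preview is sent to the LLM.
const PREVIEW_ROWS = 20;
const LARGE_RESULT_THRESHOLD = 50;

interface PreparedResult<T> {
  preview: T[];      // rows actually sent to the LLM
  totalRows: number; // full count, so the UI can offer "View All"
  isLarge: boolean;  // whether the full set was saved for FullResultsModal
}

function prepareForLlm<T>(rows: T[]): PreparedResult<T> {
  const isLarge = rows.length > LARGE_RESULT_THRESHOLD;
  return {
    preview: isLarge ? rows.slice(0, PREVIEW_ROWS) : rows,
    totalRows: rows.length,
    isLarge,
  };
}
```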
The application uses 6 MCP tools for database interaction:
| Tool | Description |
|---|---|
| `get_schema` | Returns the complete database schema (tables, columns, foreign keys) |
| `list_tables` | Lists all tables with column/row counts |
| `get_table_details` | Detailed info for a specific table, with sample data |
| `get_database_relationships` | All foreign key relationships |
| `get_sample_data` | Sample rows from a table |
| `run_query` | Executes SELECT/WITH queries with security validation |
Security: Only SELECT and WITH queries are allowed. Dangerous patterns (DROP, DELETE, UPDATE, etc.) are blocked.
The pre-seeded `university.db` contains a comprehensive University Management System:
| Category | Tables |
|---|---|
| Campus & Organization | campuses, departments, degree_programs, academic_terms, classrooms |
| Staff Management | staff, staff_positions_history, staff_salary_payouts, staff_attendance |
| Student Management | students, admissions, student_program_enrollments, student_transfers |
| Academics | sections, course_catalog, program_curriculum, staff_subject_allocations |
| Scheduling | timetable_slots, timetable_entries, student_attendance |
| Assessments | assessments, student_results |
| Fees & Finance | student_fee_invoices, student_fee_payments |
| Library | library_books, library_book_copies, library_memberships, library_loans |
| Transport | transport_routes, transport_stops, student_transport_enrollments |
| Other | study_materials, support_tickets |
Views: `vw_student_cgpa`, `vw_fee_outstanding`
| Layer | Technology |
|---|---|
| Frontend | React 19, Next.js 16, Tailwind CSS |
| UI Framework | @openuidev/react-ui, @openuidev/react-lang |
| AI/LLM | Google Gemini (via OpenAI-compatible API) |
| Database | SQLite (better-sqlite3) |
| Protocol | Model Context Protocol (MCP) |
| Streaming | Server-Sent Events (SSE) |
- Node.js 18+
- Google Gemini API key
Create `.env` in the root directory with:

```
# LLM API Key
LLM_API_KEY=your_llm_api_key

# LLM API Base URL (OpenAI-compatible)
LLM_BASE_URL=base_url_of_your_llm

# Path to SQLite database
DB_PATH=database/university.db
```

```bash
# Install dependencies
npm install

# Start development server
npm run dev
```

The application will be available at http://localhost:3000.
For a production build:

```bash
npm run build
npm start
```

Try these conversation starters:
- "How many students are currently enrolled? Break it down by department."
- "Show me the top 5 departments ranked by the number of students enrolled."
- "What is the current fee collection status? How much is pending vs paid?"
- "Give me an overview of all active courses grouped by department."
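Questions like these could also be sent to the chat endpoint programmatically with a plain `fetch`. The request body shape below is an assumption for illustration only; the actual contract lives in `src/app/api/chat`.

```typescript
// Hypothetical request builder for the chat endpoint; the body fields
// (threadId, message) are assumed, not confirmed from the source.
interface ChatRequestInit {
  method: string;
  headers: Record<string, string>;
  body: string;
}

function buildChatRequest(threadId: string, question: string): ChatRequestInit {
  return {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ threadId, message: question }),
  };
}

// Usage (responses stream back over SSE):
// await fetch("/api/chat", buildChatRequest("thread-1", "How many students are enrolled?"));
```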
The application uses two SQLite databases:
| Database | Location | Purpose |
|---|---|---|
| `university.db` | `database/` | Source data (read-only) |
| `chats.db` | Root directory | Chat threads, messages, and cached query results |

Tables in `chats.db`:

- `threads`: Chat conversation threads
- `messages`: Individual messages within threads
- `query_results`: Cached results for large-dataset queries
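The three tables might look roughly like the DDL below. This is a sketch only: the real schema is defined in `src/lib/db.ts`, and the column lists here are guesses.

```typescript
// Hypothetical chats.db schema; every column beyond the table names
// themselves is an assumption for illustration.
const CHAT_SCHEMA = `
CREATE TABLE IF NOT EXISTS threads (
  id TEXT PRIMARY KEY,
  title TEXT,
  created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS messages (
  id TEXT PRIMARY KEY,
  thread_id TEXT REFERENCES threads(id),
  role TEXT,       -- 'user' | 'assistant'
  content TEXT,
  created_at TEXT DEFAULT CURRENT_TIMESTAMP
);
CREATE TABLE IF NOT EXISTS query_results (
  id TEXT PRIMARY KEY,
  thread_id TEXT,
  result_json TEXT -- full rows backing the FullResultsModal
);
`;
```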
- Query Validation: Only SELECT and WITH queries allowed
- SQL Injection Protection: Blocks dangerous patterns (DROP, DELETE, UPDATE, etc.)
- Read-Only Database: Source database is opened in read-only mode
- No Comment Injection: Blocks SQL comments (`--`, `/* */`)
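Taken together, the checks above amount to something like the validator below. This is an illustrative sketch; the real validation lives in `src/lib/mcp.ts` and its exact rules may differ.

```typescript
// Sketch of read-only query validation: allow only SELECT/WITH statements,
// reject dangerous verbs and SQL comments. Regex lists are illustrative.
const BLOCKED = /\b(drop|delete|update|insert|alter|create|truncate|attach|pragma)\b/i;
const COMMENTS = /(--|\/\*)/;

function isQueryAllowed(sql: string): boolean {
  const trimmed = sql.trim();
  // Only SELECT or WITH (CTE) statements may run.
  if (!/^(select|with)\b/i.test(trimmed)) return false;
  // Reject dangerous verbs and SQL comments anywhere in the statement.
  if (BLOCKED.test(trimmed) || COMMENTS.test(trimmed)) return false;
  return true;
}
```

Note that naive pattern blocking can false-positive on legitimate string literals (e.g. a name containing `--`); a production validator would parse the statement rather than scan it.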
- Add support for multiple database drivers (PostgreSQL, MySQL, etc.)
- Implement IAM/auth-based access control for data retrieval
- Build a custom UI component library for better UI and more use case coverage
- Enhance the text-to-SQL module while reducing LLM dependency, so that smaller LLMs can be used in the future
- Introduce full-coverage unit testing setup
See LICENSE for details.
