# Requirements Advisor Client

MCP client web application for requirements management guidance. Connects to a remote MCP server and provides a chat interface with multi-LLM support.

## Features
- Multi-LLM Support: Claude, GPT-4o, and Gemini via LiteLLM
- MCP Integration: Connects to remote MCP server using Streamable HTTP transport
- Topic-Focused: Strict system prompts ensure responses stay on requirements management topics
- Chat Interface: Streamlit-based UI with conversation history
- Session Persistence: PostgreSQL/SQLite storage for chat history
- Docker Support: Multi-stage builds for development and production
- Comprehensive Testing: pytest with async support and coverage
## Architecture

```
┌─────────────────────┐      REST API      ┌─────────────────────┐
│ Streamlit Frontend  │ ←────────────────→ │  FastAPI Backend    │
│   (port 8501)       │                    │   (port 8000)       │
│ - Chat UI           │                    │ - MCP Client        │
│ - Session state     │                    │ - LLM Integration   │
│ - Custom styling    │                    │ - Session storage   │
└─────────────────────┘                    └─────────────────────┘
                                                      │
                                                      ▼
                                           ┌─────────────────────┐
                                           │     MCP Server      │
                                           │     (Railway)       │
                                           │   /mcp endpoint     │
                                           └─────────────────────┘
```
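The backend pairs the LLM with MCP tools: the model may request tool calls, the backend executes them against the MCP server, and results are fed back until the model produces a plain-text answer (bounded by `LLM_MAX_ITERATIONS`). A minimal, provider-agnostic sketch of such a loop — function names here are illustrative, not the project's actual API:

```python
from typing import Any, Callable

def run_with_tools(
    llm_call: Callable[[list[dict]], dict],
    execute_tool: Callable[[dict], str],
    messages: list[dict],
    max_iterations: int = 10,  # mirrors LLM_MAX_ITERATIONS
) -> str:
    """Call the LLM repeatedly, executing requested tools, until it answers in text."""
    for _ in range(max_iterations):
        reply = llm_call(messages)
        tool_calls = reply.get("tool_calls") or []
        if not tool_calls:
            return reply["content"]  # final answer, no more tools needed
        messages.append(reply)
        for call in tool_calls:
            # Feed each tool result back so the LLM can use it next turn
            messages.append(
                {"role": "tool", "name": call["name"], "content": execute_tool(call)}
            )
    raise RuntimeError("Exceeded max tool-calling iterations")
```

The iteration cap prevents a runaway model from looping on tool calls indefinitely; raising (rather than silently returning) surfaces the failure to the API layer.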
## Project Structure

```
requirements-advisor-client/
├── src/requirements_advisor_client/
│   ├── backend/                 # FastAPI application
│   │   ├── main.py              # API endpoints
│   │   ├── config.py            # Pydantic settings
│   │   ├── logging.py           # Loguru setup
│   │   ├── mcp_client.py        # MCP client class
│   │   ├── llm.py               # LiteLLM integration
│   │   ├── models.py            # Pydantic models
│   │   └── database.py          # SQLAlchemy setup
│   └── frontend/                # Streamlit application
│       ├── app.py               # Chat UI
│       ├── config.py            # Frontend settings
│       ├── styles.py            # CSS/branding
│       └── .streamlit/
│           └── config.toml      # Theme configuration
├── tests/                       # pytest test suite
├── Dockerfile                   # Multi-stage Docker build
├── docker-compose.yml           # Development setup
├── railway.toml                 # Railway deployment config
└── pyproject.toml               # Project configuration
```
## Prerequisites
- Python 3.11+
- uv (recommended) or pip
- API keys for at least one LLM provider
## Installation

```bash
# Clone the repository
git clone https://github.com/arthurfantaci/requirements-advisor-client.git
cd requirements-advisor-client

# Install dependencies
uv sync

# Install dev dependencies
uv sync --all-extras
```

Or with pip:

```bash
pip install -e ".[dev]"
```

## Configuration

Copy the example environment file and configure your API keys:
```bash
cp .env.example .env
```

Edit `.env` with your configuration:
```bash
# Required: At least one LLM API key
ANTHROPIC_API_KEY=sk-ant-...
OPENAI_API_KEY=sk-...   # Optional
GOOGLE_API_KEY=...      # Optional

# Optional: Override defaults
MCP_SERVER_URL=https://requirements-advisor-production.up.railway.app/mcp
DATABASE_URL=sqlite+aiosqlite:///./data/sessions.db
LOG_LEVEL=INFO
```

## Running the Application

### Option 1: Using uv
```bash
# Terminal 1: Start backend
uv run uvicorn requirements_advisor_client.backend.main:app --reload --port 8000

# Terminal 2: Start frontend
uv run streamlit run src/requirements_advisor_client/frontend/app.py --server.port 8501
```

### Option 2: Using Docker Compose
```bash
# Start both services
docker compose up --build

# Or run in detached mode
docker compose up -d --build
```

Open http://localhost:8501 in your browser.
## Testing

```bash
# Run all tests
uv run pytest

# Run with coverage report
uv run pytest --cov

# Run a specific test file
uv run pytest tests/backend/test_mcp_client.py
```

## Code Quality

```bash
# Run linter
uv run ruff check .

# Run formatter
uv run ruff format .

# Install pre-commit hooks
uv run pre-commit install
```

The codebase uses type hints throughout. Use your IDE's type checker or run:

```bash
uv run pyright src/
```

## Docker

```bash
# Build backend
docker build --target backend -t advisor-backend .

# Build frontend
docker build --target frontend -t advisor-frontend .
```
```bash
# Start all services
docker compose up

# Start with PostgreSQL (optional)
docker compose --profile postgres up

# Stop services
docker compose down
```

## Environment Variables

### Backend

| Variable | Description | Default |
|---|---|---|
| `MCP_SERVER_URL` | Remote MCP server URL | `https://requirements-advisor-production.up.railway.app/mcp` |
| `DATABASE_URL` | Database connection string | `sqlite+aiosqlite:///./data/sessions.db` |
| `BACKEND_HOST` | Server bind address | `0.0.0.0` |
| `BACKEND_PORT` | Server port | `8000` |
| `LOG_LEVEL` | Logging level | `INFO` |
| `LOG_JSON` | Output logs as JSON | `false` |
| `LLM_MAX_ITERATIONS` | Max tool-calling iterations per request | `10` |
| `ANTHROPIC_API_KEY` | Anthropic API key | - |
| `OPENAI_API_KEY` | OpenAI API key | - |
| `GOOGLE_API_KEY` | Google AI API key | - |
### Frontend

| Variable | Description | Default |
|---|---|---|
| `API_URL` | Backend API URL | `http://localhost:8000` |
## Deployment

Both services are configured for Railway deployment with the Dockerfile.

1. Create a new Railway project
2. Add a backend service from the repository root (build target: `backend`)
3. Add a frontend service from the repository root (build target: `frontend`)
4. Add a PostgreSQL database
5. Configure environment variables:
   - Backend: API keys + `DATABASE_URL=${{Postgres.DATABASE_URL}}`
   - Frontend: `API_URL=http://${{backend.RAILWAY_PRIVATE_DOMAIN}}:${{backend.PORT}}`
## API Endpoints

### `GET /health`

Health check endpoint.

```json
{
  "status": "healthy",
  "mcp_connected": true,
  "version": "0.1.0"
}
```

### `GET /tools`

List available MCP tools.

### `POST /chat`

Send a chat message.

Request body:

```json
{
  "message": "How do I write good requirements?",
  "provider": "gemini",
  "session_id": null,
  "history": []
}
```

Response:

```json
{
  "response": "Here are some best practices...",
  "session_id": "abc123",
  "tools_used": []
}
```
### `GET /history/{session_id}`
Get chat history for a session.
## Contributing
1. Fork the repository
2. Create a feature branch (`git checkout -b feature/amazing-feature`)
3. Install pre-commit hooks (`pre-commit install`)
4. Make your changes
5. Run tests (`pytest`)
6. Commit your changes (`git commit -m 'Add amazing feature'`)
7. Push to the branch (`git push origin feature/amazing-feature`)
8. Open a Pull Request
## License
MIT