Merged
25 changes: 25 additions & 0 deletions .github/workflows/test.yml
@@ -0,0 +1,25 @@
name: Test

on:
  push:
    branches: [master]
  pull_request:
    branches: [master]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install uv
        uses: astral-sh/setup-uv@v6
      - name: Set up Python
        run: uv python install
      - name: Install dependencies
        run: uv sync --locked
      - name: Lint
        run: uv run ruff check .
      - name: Type check
        run: uv run mypy --install-types --non-interactive .
      - name: Test
        run: uv run pytest -v --cov=bot tests/
2 changes: 1 addition & 1 deletion Dockerfile
@@ -1,5 +1,5 @@
FROM python:3.14-slim-bookworm
COPY --from=ghcr.io/astral-sh/uv:0.11.2 /uv /uvx /bin/

COPY . /app
WORKDIR /app
113 changes: 86 additions & 27 deletions README.md
@@ -1,60 +1,119 @@
# agentic-slackbot

A simple Slack bot that uses the [OpenAI Agents SDK](https://github.com/openai/openai-agents-python) to interact with [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) servers.

See also: [agentic-telegram-bot](https://github.com/John-Lin/agentic-telegram-bot) — a similar demo bot for Telegram.

## Features

- Channel @mention and DM support
- Thread-aware conversations (follow-ups stay in the same thread)
- Connects to any MCP server via `servers_config.json`
- Supports OpenAI, Azure OpenAI, and OpenAI-compatible proxy endpoints
- Per-conversation history with automatic truncation
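
The truncation feature can be pictured with a small, self-contained sketch. The helper name and the one-message-per-turn simplification here are illustrative only, not the bot's actual code (the real bot keeps the last `MAX_TURNS` turns per conversation):

```python
MAX_TURNS = 25  # mirrors the MAX_TURNS constant in bot/agent.py

def truncate_history(messages: list[dict], max_turns: int = MAX_TURNS) -> list[dict]:
    """Keep only the most recent max_turns entries (simplified: one message per turn)."""
    return messages[-max_turns:]

history = [{"role": "user", "content": str(i)} for i in range(30)]
trimmed = truncate_history(history)
# trimmed now holds the 25 most recent messages
```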

## Install Dependencies

```bash
uv sync
```

## Slack App Setup

1. Create a new Slack app at [api.slack.com/apps](https://api.slack.com/apps).
2. Enable **Socket Mode** and generate an app-level token (`xapp-...`).
3. Under **OAuth & Permissions**, add the following bot token scopes:
- `app_mentions:read`
- `chat:write`
- `im:history`
- `users:read`
4. Under **Event Subscriptions**, subscribe to:
- `app_mention`
- `message.im`
5. Install the app to your workspace and copy the bot token (`xoxb-...`).

## Environment Variables

Create a `.envrc` or `.env` file in the root directory:

```
export SLACK_BOT_TOKEN=""
export SLACK_APP_TOKEN=""
export OPENAI_API_KEY=""
export OPENAI_MODEL="gpt-4.1"
```

If you are using Azure OpenAI, set these instead:

```
export AZURE_OPENAI_API_KEY=""
export AZURE_OPENAI_ENDPOINT="https://<myopenai>.azure.com/"
export OPENAI_MODEL="gpt-4.1"
export OPENAI_API_VERSION="2025-03-01-preview"
```

If you are using an OpenAI-compatible proxy:

```
export OPENAI_PROXY_BASE_URL="https://my-proxy.example.com/v1"
export OPENAI_PROXY_API_KEY=""
```

Optional HTTP proxy for outbound requests:

```
export HTTP_PROXY=""
```

## MCP Server Configuration

Edit `servers_config.json` to add your MCP servers:

```json
{
"instructions": "Your custom system prompt here.",
"mcpServers": {
"my-server": {
"command": "uvx",
"args": ["my-mcp-server"]
}
}
}
```

For local MCP servers, use `uv --directory`:

```json
{
"instructions": "Your custom system prompt here.",
"mcpServers": {
"my-server": {
"command": "uv",
"args": ["--directory", "/path/to/my-server", "run", "my-entrypoint"]
}
}
}
```

## Running the Bot

```bash
uv run bot
```

## Docker

```bash
# Build the Docker image
docker build -t agentic-slackbot .

# Run the Docker container
docker run -d \
  --name slackbot \
  -e SLACK_BOT_TOKEN="" \
  -e SLACK_APP_TOKEN="" \
  -e OPENAI_API_KEY="" \
  -e OPENAI_MODEL="gpt-4.1" \
  -v /path/to/servers_config.json:/app/servers_config.json \
  agentic-slackbot
```

## Credit
16 changes: 15 additions & 1 deletion bot/agent.py
@@ -1,6 +1,7 @@
from __future__ import annotations

import asyncio
import collections
import logging
import os
from typing import Any
@@ -24,6 +25,7 @@
)

MAX_TURNS = 25
MAX_CHATS = 200
MCP_SESSION_TIMEOUT_SECONDS = 30.0


@@ -71,19 +73,31 @@ def __init__(
            mcp_servers=(mcp_servers if mcp_servers is not None else []),
        )
        self.name = name
        self._conversations: collections.OrderedDict[str, list[TResponseInputItem]] = (
            collections.OrderedDict()
        )
        self._locks: dict[str, asyncio.Lock] = {}

    def _evict_oldest(self) -> None:
        """Remove the least-recently-used chat when over MAX_CHATS."""
        while len(self._conversations) > MAX_CHATS:
            evicted_id, _ = self._conversations.popitem(last=False)
            self._locks.pop(evicted_id, None)

    def get_messages(self, chat_id: str) -> list[TResponseInputItem]:
        return self._conversations.get(chat_id, [])

    def set_messages(self, chat_id: str, messages: list[TResponseInputItem]) -> None:
        self._conversations[chat_id] = messages
        self._conversations.move_to_end(chat_id)
        self._evict_oldest()

    def append_user_message(self, chat_id: str, message: str) -> None:
        if chat_id not in self._conversations:
            self._conversations[chat_id] = []
        self._conversations[chat_id].append({"role": "user", "content": message})
        self._conversations.move_to_end(chat_id)
        self._evict_oldest()

    def truncate_history(self, chat_id: str) -> None:
        """Keep only the last MAX_TURNS turns of conversation history.
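
The eviction logic in `bot/agent.py` above is a standard LRU pattern built on `collections.OrderedDict`. A self-contained sketch of the same idea, independent of the bot code (class and method names here are illustrative):

```python
import collections

MAX_CHATS = 3  # small limit for illustration; the bot uses 200

class LRUStore:
    """Minimal least-recently-used store, mirroring the OrderedDict pattern above."""

    def __init__(self) -> None:
        self._data: collections.OrderedDict[str, list] = collections.OrderedDict()

    def set(self, key: str, value: list) -> None:
        self._data[key] = value
        self._data.move_to_end(key)          # mark as most recently used
        while len(self._data) > MAX_CHATS:
            self._data.popitem(last=False)   # drop the least recently used

store = LRUStore()
for k in ("C001", "C002", "C003", "C004"):
    store.set(k, [])
# C001 has been evicted; the three newest keys remain
```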
70 changes: 70 additions & 0 deletions tests/test_agent.py
@@ -1,5 +1,6 @@
from __future__ import annotations

import asyncio
from unittest.mock import create_autospec
from unittest.mock import patch

@@ -8,6 +9,7 @@
from openai import AsyncAzureOpenAI

from bot.agent import DEFAULT_INSTRUCTIONS
from bot.agent import MAX_CHATS
from bot.agent import MAX_TURNS
from bot.agent import OpenAIAgent

@@ -237,3 +239,71 @@ def test_empty_mcp_servers(self):
        config = {"mcpServers": {}}
        agent = OpenAIAgent.from_dict("test", config)
        assert agent.agent.mcp_servers == []


class TestChatEviction:
    def test_default_max_chats(self):
        assert MAX_CHATS == 200

    def test_evicts_oldest_chat_when_limit_exceeded(self, monkeypatch):
        monkeypatch.setattr("bot.agent.MAX_CHATS", 3)
        agent = OpenAIAgent(name="test")

        agent.set_messages("C001", [{"role": "user", "content": "a"}])
        agent.set_messages("C002", [{"role": "user", "content": "b"}])
        agent.set_messages("C003", [{"role": "user", "content": "c"}])
        assert len(agent._conversations) == 3

        agent.set_messages("C004", [{"role": "user", "content": "d"}])
        assert "C001" not in agent._conversations
        assert len(agent._conversations) == 3
        assert set(agent._conversations.keys()) == {"C002", "C003", "C004"}

    def test_updating_existing_chat_does_not_evict(self, monkeypatch):
        monkeypatch.setattr("bot.agent.MAX_CHATS", 2)
        agent = OpenAIAgent(name="test")

        agent.set_messages("C001", [{"role": "user", "content": "a"}])
        agent.set_messages("C002", [{"role": "user", "content": "b"}])
        agent.set_messages("C001", [{"role": "user", "content": "updated"}])
        assert len(agent._conversations) == 2
        assert agent.get_messages("C001")[0]["content"] == "updated"

    def test_append_to_new_chat_triggers_eviction(self, monkeypatch):
        monkeypatch.setattr("bot.agent.MAX_CHATS", 2)
        agent = OpenAIAgent(name="test")

        agent.set_messages("C001", [{"role": "user", "content": "a"}])
        agent.set_messages("C002", [{"role": "user", "content": "b"}])
        agent.append_user_message("C003", "c")

        assert "C001" not in agent._conversations
        assert len(agent._conversations) == 2

    def test_accessing_chat_refreshes_its_position(self, monkeypatch):
        monkeypatch.setattr("bot.agent.MAX_CHATS", 3)
        agent = OpenAIAgent(name="test")

        agent.set_messages("C001", [{"role": "user", "content": "a"}])
        agent.set_messages("C002", [{"role": "user", "content": "b"}])
        agent.set_messages("C003", [{"role": "user", "content": "c"}])

        # Refresh C001
        agent.set_messages("C001", [{"role": "user", "content": "refreshed"}])

        # C002 is now the oldest; adding C004 should evict C002
        agent.set_messages("C004", [{"role": "user", "content": "d"}])
        assert "C002" not in agent._conversations
        assert "C001" in agent._conversations

    def test_eviction_also_cleans_up_lock(self, monkeypatch):
        monkeypatch.setattr("bot.agent.MAX_CHATS", 2)
        agent = OpenAIAgent(name="test")

        agent._locks["C001"] = asyncio.Lock()
        agent.set_messages("C001", [{"role": "user", "content": "a"}])
        agent._locks["C002"] = asyncio.Lock()
        agent.set_messages("C002", [{"role": "user", "content": "b"}])

        agent.set_messages("C003", [{"role": "user", "content": "c"}])
        assert "C001" not in agent._locks
8 changes: 4 additions & 4 deletions tests/test_slack.py
@@ -43,7 +43,7 @@ async def test_mention_calls_agent_with_channel_key(self, bot):

        ack.assert_called_once()
        # Conversation key should be ts (no thread_ts means new conversation)
        bot.agent.run.assert_called_once_with("1234567890.123456", "[U123] what is the weather?")

    @pytest.mark.anyio
    async def test_mention_in_thread_uses_thread_ts(self, bot):
@@ -59,7 +59,7 @@ async def test_mention_in_thread_uses_thread_ts(self):

        await bot.handle_mention(event, say, ack)

        bot.agent.run.assert_called_once_with("1111111111.111111", "[U123] follow up")

    @pytest.mark.anyio
    async def test_mention_replies_in_thread(self, bot):
@@ -89,7 +89,7 @@ async def test_mention_strips_bot_mention(self, bot):

        await bot.handle_mention(event, say, ack)

        bot.agent.run.assert_called_once_with("1234567890.123456", "[U123] help me")


class TestHandleMessage:
@@ -107,7 +107,7 @@ async def test_dm_calls_agent(self, bot):

        await bot.handle_message(message, say, ack)

        bot.agent.run.assert_called_once_with("1234567890.123456", "[U123] hello")

    @pytest.mark.anyio
    async def test_non_dm_is_ignored(self, bot):
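
The assertions in tests/test_slack.py encode two conventions: the conversation key is `thread_ts` when the message is in a thread, otherwise `ts`, and the Slack user ID is prefixed to the message text. A sketch of those two rules as standalone helpers (the helper names are hypothetical, not the bot's actual functions):

```python
def conversation_key(event: dict) -> str:
    """Thread replies share the thread's key; top-level messages start a new conversation."""
    return event.get("thread_ts") or event["ts"]

def format_user_message(user_id: str, text: str) -> str:
    """Prefix the Slack user ID so the agent can tell speakers apart."""
    return f"[{user_id}] {text}"

key = conversation_key({"ts": "1234567890.123456"})
msg = format_user_message("U123", "what is the weather?")
```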