An AI-native MCP (Model Context Protocol) server for SOC operations with Splunk, featuring automated investigation tools, label harvesting, and DeepTempo integration capabilities.
This project extends the capabilities of livehybrid/splunk-mcp with SOC-specific enrichment tools and security controls designed for AI-driven security investigations via Claude Desktop and other MCP clients.
- Traditional SOC Workflows - IP pivoting, lateral movement detection, data exfiltration analysis
- AI-Native Investigation - Cross-platform correlation, attack timeline reconstruction
- Label Harvesting - Automatic discovery and mapping of Splunk field labels
- Production Security - Input validation, audit logging, output sanitization
- Multi-Mode Operation - SSE, STDIO, and API modes for flexible deployment
- Python 3.10 or higher
- Splunk Enterprise or Cloud instance
- pip (included with Python)
- Clone the repository:

```bash
git clone https://github.com/mando222/splunk-mcp-soc.git
cd splunk-mcp-soc
```

- Install dependencies:

Using pip (recommended):

```bash
pip install -r requirements.txt
```

Or with UV:

```bash
uv sync
```

Or with Poetry:

```bash
poetry install
```

- Configure environment variables:

Create a `.env` file:

```bash
SPLUNK_HOST=localhost
SPLUNK_PORT=8089
SPLUNK_USERNAME=admin
SPLUNK_PASSWORD=your-password
SPLUNK_SCHEME=https
VERIFY_SSL=false
```

- Test the connection:

```bash
python test_connection.py
```

- Run the MCP server:

```bash
# STDIO mode (for Claude Desktop)
python splunk_mcp.py stdio

# SSE mode (default)
python splunk_mcp.py

# API mode
python splunk_mcp.py api
```
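The mode selection above can be sketched roughly as follows. This is a hypothetical stand-in for the real entry point; `splunk_mcp.py` may parse its arguments differently:

```python
# Illustrative sketch: choose a server mode from argv, defaulting to SSE.
# The function name and error handling are assumptions, not project code.
def pick_mode(argv):
    mode = argv[1] if len(argv) > 1 else "sse"  # SSE is the documented default
    if mode not in ("sse", "api", "stdio"):
        raise SystemExit(f"unknown mode: {mode}")
    return mode
```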
- health_check - Verify Splunk connectivity and available apps
- ping - Check MCP server status
- current_user - Get authenticated user information
- list_users - List all Splunk users and roles
- list_indexes - List all accessible indexes
- get_index_info - Get detailed information about a specific index
- indexes_and_sourcetypes - Comprehensive index and sourcetype mapping
- search_splunk - Execute Splunk search queries with time ranges
- list_saved_searches - View saved searches
- list_kvstore_collections - List all KV store collections
- create_kvstore_collection - Create new collections
- delete_kvstore_collection - Remove collections
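As a rough illustration of what a `search_splunk`-style tool does before handing a query to Splunk, here is a hypothetical helper (the name and defaults are illustrative, not the project's actual code). Splunk's job API expects raw SPL to begin with the `search` command unless the query already starts a command pipeline with `|`:

```python
def build_search_request(query: str, earliest: str = "-24h", latest: str = "now") -> dict:
    """Normalize user SPL plus a time range into job parameters (illustrative)."""
    spl = query.strip()
    # Prepend "search" unless the query is already a full command pipeline.
    if not spl.lower().startswith(("search ", "|")):
        spl = f"search {spl}"
    return {"search": spl, "earliest_time": earliest, "latest_time": latest}
```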
- pivot_by_ip - Investigate all activity from a specific IP address
- find_lateral_movement - Detect lateral movement patterns
- calculate_data_exfiltration - Analyze and quantify data exfiltration
- build_attack_timeline - Construct chronological attack timelines
- correlate_with_deeptempo_finding - Cross-reference with DeepTempo findings
- enrich_ip_with_threat_intel - Enrich IPs with reputation data from multiple sources
- Queries AbuseIPDB, AlienVault OTX, and internal Splunk threat lists
- Provides reputation score, threat types, and confidence levels
- check_ioc_reputation - Quick reputation check for any IOC (IP, domain, hash, URL)
- Auto-detects IOC type and provides actionable verdict
- add_to_threat_list - Add confirmed IOCs to Splunk threat intelligence
- Supports expiration and automatic cleanup
- get_mitre_attack_context - Get detailed MITRE ATT&CK technique information
- Maps findings to tactics, techniques, and procedures
- Includes detection methods and mitigations
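The IOC type auto-detection mentioned above can be approximated with simple pattern checks. This is an illustrative heuristic, not the tool's actual implementation:

```python
import re

def detect_ioc_type(ioc: str) -> str:
    """Classify an indicator as ip, hash, url, or domain (illustrative heuristic)."""
    ioc = ioc.strip()
    if re.fullmatch(r"(?:\d{1,3}\.){3}\d{1,3}", ioc):
        return "ip"
    # MD5, SHA-1, or SHA-256, distinguished by hex-string length
    if re.fullmatch(r"[0-9a-fA-F]{32}|[0-9a-fA-F]{40}|[0-9a-fA-F]{64}", ioc):
        return "hash"
    if re.match(r"https?://", ioc):
        return "url"
    if re.fullmatch(r"(?:[a-zA-Z0-9-]+\.)+[a-zA-Z]{2,}", ioc):
        return "domain"
    return "unknown"
```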
- block_ip_address - Block malicious IPs at firewall/proxy level
- Temporary or permanent blocking
- Auto-unblock capability with configurable duration
- isolate_host - Quarantine compromised hosts from network
- Full, partial, or monitoring-only isolation levels
- Integrates with NAC and endpoint security tools
- create_incident_ticket - Auto-create tickets in ITSM platforms
- ServiceNow, Jira, or native Splunk incident tracking
- Automatic priority and SLA calculation
- send_alert_notification - Push alerts to communication channels
- Slack, Microsoft Teams, PagerDuty, email, SMS
- Severity-based routing
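Severity-based routing is essentially a lookup from severity to a list of channels. The routes below are examples only, not the project's configured mapping:

```python
# Hypothetical routing table; channel names are illustrative.
ROUTES = {
    "critical": ["pagerduty", "sms", "slack"],
    "high": ["slack", "email"],
    "medium": ["slack"],
    "low": ["email"],
}

def route_alert(severity: str) -> list:
    """Return the channels an alert of this severity would be pushed to."""
    return ROUTES.get(severity.lower(), ["email"])
```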
- detect_anomalies - Statistical anomaly detection on time-series data
- Z-score based detection with configurable sensitivity
- Identifies spikes, dips, and unusual patterns
- identify_rare_events - Find statistically rare occurrences
- Detects new processes, domains, or behaviors
- Useful for zero-day and APT detection
- baseline_normal_behavior - Establish behavioral profiles
- Learn normal patterns for users, hosts, or services
- Enables deviation-based threat detection
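A minimal z-score detector in the spirit of `detect_anomalies` might look like this. The sketch assumes a plain numeric series rather than Splunk search results, and `sensitivity` is the z-score threshold:

```python
from statistics import mean, stdev

def detect_anomalies(series, sensitivity=3.0):
    """Return indices whose z-score magnitude exceeds the sensitivity threshold."""
    mu, sigma = mean(series), stdev(series)
    if sigma == 0:
        return []  # constant series: nothing can be anomalous
    return [i for i, x in enumerate(series) if abs(x - mu) / sigma > sensitivity]
```

Lowering `sensitivity` flags more points; raising it flags only extreme spikes or dips.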
- harvest_labels - Discover field labels and schemas from Splunk indexes
- Configurable scope (all indexes, specific indexes, or CIM fields only)
- Returns field names, types, sample values, and metadata
- Supports filtering by index and time range
- get_field_summary - Get detailed information about a specific field
- Deep dive into field values, distribution, and relationships
- Useful for understanding individual field usage
- export_labels_to_deeptempo - Export labels in DeepTempo-compatible format
- Generic JSON structure that can be adapted to DeepTempo's needs
- Optional file export for integration workflows
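The "generic JSON structure" could look something like the envelope below. The keys are illustrative, not a documented DeepTempo schema:

```python
import json

def export_labels(fields):
    """Wrap harvested field metadata in a generic, adaptable JSON envelope."""
    payload = {
        "source": "splunk-mcp-soc",   # illustrative envelope keys
        "label_count": len(fields),
        "labels": fields,
    }
    return json.dumps(payload, indent=2)
```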
Query: "Show me all activity from IP 10.1.42.42"
Results:
- 65 total events discovered
- 47 unique destinations contacted
- 10+ lateral movement attempts detected
- 1.2 GB data exfiltration identified
Query: "Build attack timeline for 10.1.42.42 and correlate with DeepTempo"
Results:
- 32-day attack timeline reconstructed
- Initial compromise → lateral movement → exfiltration
- 12 similar incidents identified
- Complete MITRE ATT&CK mapping
Query: "Hunt for similar C2 beaconing patterns across all hosts"
Results:
- 3 additional compromised hosts found
- Common service account identified (jenkins_service)
- Botnet infrastructure mapped
Generate and ingest test security data:

```bash
# Generate test data
python generate_test_data.py

# Ingest into Splunk
python ingest_test_data.py your-password
```

This creates an `mcp_demo` index with 115 security events:
- 50 C2 beaconing events
- 40 authentication/lateral movement events
- 20 DNS tunneling events
- 5 data exfiltration events
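The event mix above can be reproduced with a trivial generator. This sketch only mirrors the counts; the real `generate_test_data.py` builds full event payloads:

```python
# Illustrative event mix matching the demo data breakdown above.
# Event-type names are assumptions, not the generator's actual labels.
EVENT_MIX = {
    "c2_beacon": 50,
    "auth_lateral": 40,
    "dns_tunneling": 20,
    "data_exfiltration": 5,
}

def generate_events():
    """Produce one stub event dict per entry in the demo mix."""
    return [
        {"event_type": etype, "seq": i}
        for etype, count in EVENT_MIX.items()
        for i in range(count)
    ]
```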
Run the test suite:

```bash
pytest tests/
```

Run with Docker:

- SSE Mode (default):

```bash
docker compose up -d mcp
```

- API Mode:

```bash
docker compose run --rm mcp python splunk_mcp.py api
```

- STDIO Mode:

```bash
docker compose run -i --rm mcp python splunk_mcp.py stdio
```

Run the test suite in Docker:

```bash
./run_tests.sh --docker
```

| Variable | Description | Default |
|---|---|---|
| `SPLUNK_HOST` | Splunk server hostname | `localhost` |
| `SPLUNK_PORT` | Splunk management port | `8089` |
| `SPLUNK_USERNAME` | Authentication username | `admin` |
| `SPLUNK_PASSWORD` | Authentication password | - |
| `SPLUNK_TOKEN` | Optional: use token instead of user/pass | - |
| `SPLUNK_SCHEME` | Connection scheme (http/https) | `https` |
| `VERIFY_SSL` | Enable SSL certificate verification | `true` |
| `FASTMCP_LOG_LEVEL` | Logging level | `INFO` |
| `SERVER_MODE` | Server mode (sse/api/stdio) | `sse` |
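Reading these variables with their documented defaults might look like the stdlib sketch below. The project actually uses python-decouple for this, and the function name here is illustrative:

```python
import os

def load_splunk_config():
    """Read connection settings from the environment with table defaults."""
    return {
        "host": os.environ.get("SPLUNK_HOST", "localhost"),
        "port": int(os.environ.get("SPLUNK_PORT", "8089")),
        "scheme": os.environ.get("SPLUNK_SCHEME", "https"),
        "verify_ssl": os.environ.get("VERIFY_SSL", "true").lower() == "true",
    }
```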
Add to your Claude Desktop config (`~/Library/Application Support/Claude/claude_desktop_config.json`):

```json
{
  "mcpServers": {
    "splunk-soc": {
      "command": "python",
      "args": [
        "/path/to/splunk-mcp-soc/splunk_mcp.py",
        "stdio"
      ],
      "env": {
        "SPLUNK_HOST": "localhost",
        "SPLUNK_PORT": "8089",
        "SPLUNK_USERNAME": "admin",
        "SPLUNK_PASSWORD": "your-password"
      }
    }
  }
}
```

| Document | Purpose |
|---|---|
| SETUP_INSTRUCTIONS.md | Detailed setup guide |
| DEMO_TOOLS_SPEC.md | Complete tool specifications |
| SOC_PLAYBOOKS.md | Investigation workflow examples |
| CONTRIBUTING.md | Development guidelines |
| DEMO_TESTING_GUIDE.md | Testing procedures |
```
Claude Desktop / MCP Client
    │
    ├── Splunk MCP Server (this project)
    │   ├── SOC Investigation Tools
    │   ├── Label Harvesting
    │   └── Splunk SDK Integration
    │
    └── DeepTempo MCP Server (separate)
        ├── Embedding Similarity Search
        ├── MITRE ATT&CK Mapping
        └── LogLM Analysis
```
- ✅ SSL/TLS support with configurable verification
- ✅ Token-based and credential-based authentication
- ✅ Environment variable configuration
- ✅ Input validation on all tools
- ✅ Audit logging support
- Never commit `.env` files
- Use `VERIFY_SSL=true` in production
- Rotate credentials regularly
- Monitor audit logs
- Use least-privilege Splunk accounts
Contributions are welcome! Please see CONTRIBUTING.md for guidelines.
This project is built upon livehybrid/splunk-mcp v0.3.0 and extends it with:
- SOC-specific investigation tools
- Label harvesting capabilities
- DeepTempo integration support
- Enhanced security controls
- FastMCP - MCP server framework
- Splunk SDK for Python - Splunk API client
- python-decouple - Configuration management
Apache License 2.0 - See LICENSE for details.
```bash
# Test Splunk connectivity
python test_connection.py

# Check logs
tail -f splunk_mcp.log
```

```bash
# Ingest test data
python ingest_test_data.py your-password
```

Verify in the Splunk UI:

```
index=mcp_demo | stats count by event_type
```

- Verify `.env` file exists with correct values
- Check Python version (3.10+ required)
- Ensure Splunk is accessible
- Review error logs
For issues and questions:
- Check documentation
- Review error logs
- Open an issue on GitHub
Built with FastMCP for AI-native security operations.