Like marimo algae drifting in crystal waters, your code flows and evolves: each cell a living sphere of computation, gently touching others, creating ripples of reactive change. In this digital ocean, data streams like currents, models grow like organic formations, and insights emerge naturally from the depths. Let your ML experiments flow freely, tracked and nurtured, as nature intended.
Marimo Flow combines reactive notebook development with AI-powered assistance and robust ML experiment tracking:
- 🤖 AI-First Development with MCP: Model Context Protocol (MCP) integration brings live documentation, code examples, and AI assistance directly into your notebooks - access up-to-date library docs for Marimo, Polars, Plotly, and more without leaving your workflow
- 🔄 Reactive Execution: Marimo's dataflow graph ensures your notebooks are always consistent - change a parameter and watch your entire pipeline update automatically
- 📊 Seamless ML Pipeline: MLflow integration tracks every experiment, model, and metric without breaking your flow
- 🎯 Interactive Development: Real-time parameter tuning with instant feedback and beautiful visualizations
This combination eliminates the reproducibility issues of traditional notebooks while providing AI-enhanced, enterprise-grade experiment tracking.
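To make the reactive-execution idea concrete, here is a toy dependency graph in plain Python: change one "cell" and every downstream cell re-runs automatically. This is only an illustrative sketch built on the stdlib, not marimo's actual engine (which derives the graph by statically analyzing cell code).

```python
# Toy illustration of reactive execution: a tiny dataflow graph where
# changing one cell re-runs its dependents in dependency order.
from graphlib import TopologicalSorter

# Each "cell" is (compute function, names of the cells it depends on).
cells = {
    "n":       (lambda: 10,                          []),
    "squared": (lambda n: n * n,                     ["n"]),
    "label":   (lambda squared: f"n^2 = {squared}",  ["squared"]),
}

def run_all(cells, overrides=None):
    """Execute all cells in topological order, applying any overrides."""
    graph = {name: deps for name, (_, deps) in cells.items()}
    values = {}
    for name in TopologicalSorter(graph).static_order():
        fn, deps = cells[name]
        if overrides and name in overrides:
            values[name] = overrides[name]  # user changed this cell
        else:
            values[name] = fn(*[values[d] for d in deps])
    return values

print(run_all(cells)["label"])                       # n^2 = 100
print(run_all(cells, overrides={"n": 3})["label"])   # n^2 = 9
```

Overriding `n` propagates through `squared` to `label` without any manual re-running, which is the consistency guarantee marimo provides at notebook scale.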
- Model Context Protocol Integration: Live documentation and AI assistance in your notebooks
- Context7 Server: Access up-to-date docs for any Python library without leaving marimo
- Marimo MCP Server: Specialized assistance for marimo patterns and best practices
- Local LLM Support: Ollama integration for privacy-focused AI code completion
- 🔄 Reactive Notebooks: Git-friendly `.py` notebooks with automatic dependency tracking
- 🔬 MLflow Tracking: Complete ML lifecycle management with model registry
- 🎯 Interactive Development: Real-time parameter tuning with instant visual feedback
- 💾 SQLite Backend: Lightweight, file-based storage for experiments
- 🐳 Docker Deployment: One-command setup with docker-compose
- 📦 Curated Snippets & Tutorials: 4 reusable snippet modules plus 15+ tutorial notebooks covering Polars, Plotly, Marimo UI patterns, RAG, and OpenVINO
- 📚 Comprehensive Docs: Built-in reference guides with 100+ code examples
- 🌐 GitHub Pages: Auto-deploy interactive notebooks with WASM
```bash
# Clone repository
git clone https://github.com/bjoernbethge/marimo-flow.git
cd marimo-flow

# Build and start services
docker compose -f docker/docker-compose.yaml up --build -d

# Access services
# Marimo: http://localhost:2718
# MLflow: http://localhost:5000

# View logs
docker compose -f docker/docker-compose.yaml logs -f

# Stop services
docker compose -f docker/docker-compose.yaml down
```

| Variant | Image Tag | Use Case |
|---|---|---|
| CPU | `ghcr.io/bjoernbethge/marimo-flow:latest` | No GPU (lightweight) |
| CUDA | `ghcr.io/bjoernbethge/marimo-flow:cuda` | NVIDIA GPUs |
| XPU | `ghcr.io/bjoernbethge/marimo-flow:xpu` | Intel Arc/Data Center GPUs |
```bash
# NVIDIA GPU (requires nvidia-docker)
docker compose -f docker/docker-compose.cuda.yaml up -d

# Intel GPU (requires Intel GPU drivers)
docker compose -f docker/docker-compose.xpu.yaml up -d
```

```bash
# Install dependencies
uv sync

# Start MLflow server (in background or separate terminal)
uv run mlflow server \
  --host 0.0.0.0 \
  --port 5000 \
  --backend-store-uri sqlite:///data/experiments/db/mlflow.db \
  --default-artifact-root ./data/experiments/artifacts \
  --serve-artifacts

# Start Marimo (in another terminal)
uv run marimo edit examples/
```

All notebooks live in `examples/` and can be opened with `uv run marimo edit examples/<file>.py`.
- `01_interactive_data_profiler.py`: DuckDB-powered data explorer with filters, previews, and interactive scatter plots for any local database.
- `02_mlflow_experiment_console.py`: Connect to an MLflow tracking directory, inspect experiments, and visualize metric trends inline with Altair.
- `03_pina_walrus_solver.py`: Toggle between baseline PINNs and the Walrus adapter to solve a Poisson equation with live training controls.
- `04_hyperparameter_tuning.py`: Optuna-based hyperparameter search for PINA/PyTorch models with MLflow tracking and interactive study settings.
- `05_model_registry.py`: Train, register, and promote MLflow models end-to-end, including stage transitions and inference checks.
- `06_production_pipeline.py`: Production-style pipeline featuring validation gates, training, registry integration, deployment steps, and monitoring hooks.
- `09_pina_live_monitoring.py`: Live training monitoring with real-time loss plotting, error analysis, and comprehensive visualization tools.
Additional learning material lives in examples/tutorials/ (15+ focused notebooks covering marimo UI patterns, Polars, Plotly, DuckDB, OpenVINO, RAG, and PYG) plus examples/tutorials/legacy/ for the retired 00β03 pipeline.
```
marimo-flow/
├── .claude/                        # Claude Code configuration
│   ├── Skills/                     # Domain-specific skills
│   │   ├── marimo/                 # Marimo notebook development
│   │   ├── mlflow/                 # MLflow tracking & GenAI
│   │   ├── pina/                   # Physics-informed neural networks
│   │   └── _integration/           # Cross-skill workflows
│   └── settings.json               # Hooks (format, lint, protection)
├── .vscode/
│   └── mcp.json                    # VS Code Copilot MCP config
├── .mcp.json                       # Claude Code MCP config
├── examples/                       # Production-ready marimo notebooks
│   ├── 01_interactive_data_profiler.py
│   ├── 02_mlflow_experiment_console.py
│   ├── 03_pina_walrus_solver.py
│   ├── 04_hyperparameter_tuning.py
│   ├── 05_model_registry.py
│   ├── 06_production_pipeline.py
│   ├── 09_pina_live_monitoring.py
│   └── tutorials/                  # Learning notebooks
│       ├── pina/                   # PINA tutorials (5 notebooks)
│       ├── mlflow/                 # MLflow tutorials (5 notebooks)
│       └── *.py                    # Marimo, Polars, Plotly patterns
├── src/marimo_flow/                # Installable package
│   ├── core/                       # PINA solvers, training, visualization
│   └── snippets/                   # Reusable chart/dataframe helpers
├── docs/                           # Reference documentation
│   ├── marimo-quickstart.md        # Marimo guide
│   ├── polars-quickstart.md        # Polars guide
│   ├── plotly-quickstart.md        # Plotly guide
│   ├── pina-quickstart.md          # PINA guide
│   └── integration-patterns.md     # Integration examples
├── data/
│   ├── mlflow/                     # MLflow storage
│   ├── artifacts/                  # Model artifacts
│   ├── db/                         # SQLite database
│   └── prompts/                    # Prompt templates
├── docker/                         # Docker configuration
├── pyproject.toml                  # Dependencies
└── README.md                       # This file
```
The marimo_flow package provides reusable components:
```python
# Chart and dataframe helpers
from marimo_flow.snippets import build_interactive_scatter, filter_dataframe

# PINA solver components
from marimo_flow.core import ModelFactory, ProblemManager, SolverManager
```

Tutorial notebooks in `examples/tutorials/` demonstrate these patterns with progressive examples for PINA, MLflow, and common visualization tasks.
The docs/ directory contains comprehensive LLM-friendly documentation for key technologies:
- Quick-start guides for Marimo, Polars, Plotly, and PINA
- Integration patterns and best practices
- Code examples and common workflows
Marimo Flow is AI-first with built-in Model Context Protocol (MCP) support for intelligent, context-aware development assistance.
Traditional notebooks require constant context-switching to documentation sites. With MCP:
- 📚 Live Documentation: Access up-to-date library docs directly in marimo
- 🤖 AI Code Completion: Context-aware suggestions from local LLMs (Ollama)
- 💡 Smart Assistance: Ask questions about libraries and get instant, accurate answers
- 🔄 Always Current: Documentation updates automatically, no more outdated tutorials
Access real-time documentation for any Python library:
```python
# Ask: "How do I use polars window functions?"
# Get: Current polars docs, code examples, best practices

# Ask: "Show me plotly 3D scatter plot examples"
# Get: Latest plotly API with working code samples
```

Supported Libraries:
- Polars, Pandas, NumPy - Data manipulation
- Plotly, Altair, Matplotlib - Visualization
- Scikit-learn, PyTorch - Machine Learning
- And 1000+ more Python packages
Get expert help with marimo-specific patterns:
```python
# Ask: "How do I create a reactive form in marimo?"
# Get: marimo form patterns, state management examples

# Ask: "Show me marimo UI element examples"
# Get: Complete UI component reference with code
```

Example 1: Learning New Libraries
```python
# You're exploring polars window functions
# Type: "polars rolling mean example"
# MCP returns: Latest polars docs + working code
import polars as pl

df.with_columns(
    pl.col("sales").rolling_mean(window_size=7).alias("7d_avg")
)
```

Example 2: Debugging
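For reference, the rolling mean the polars snippet computes can be sketched in plain Python (stdlib only; the window is shortened from 7 to 3 to keep the example readable, and leading positions are `None` until the window fills, matching polars' default):

```python
# Plain-Python equivalent of a rolling mean: average the last
# `window_size` values; None until enough history has accumulated.
def rolling_mean(values, window_size):
    out = []
    for i in range(len(values)):
        if i + 1 < window_size:
            out.append(None)  # window not yet full
        else:
            window = values[i + 1 - window_size : i + 1]
            out.append(sum(window) / window_size)
    return out

sales = [10, 20, 30, 40, 50]
print(rolling_mean(sales, window_size=3))  # [None, None, 20.0, 30.0, 40.0]
```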
```python
# Stuck on a plotly error?
# Ask: "Why is my plotly 3D scatter not showing?"
# Get: Common issues, solutions, and corrected code
```

Example 3: Best Practices
```python
# Want to optimize code?
# Ask: "Best way to aggregate in polars?"
# Get: Performance tips, lazy evaluation patterns
```

- Code Completion: Context-aware suggestions as you type (Ollama local LLM)
- Inline Documentation: Hover over functions for instant docs
- Smart Refactoring: AI suggests improvements based on current libraries
- Interactive Q&A: Chat with AI about your code using latest docs
MCP servers are pre-configured in .marimo.toml:
```toml
[mcp]
presets = ["context7", "marimo"]

[ai.ollama]
model = "gpt-oss:20b-cloud"
base_url = "http://localhost:11434/v1"
```

If you're running inside Docker, the same `mcp` block lives in `docker/.marimo.toml`, so both local and containerized sessions pick up identical presets.
You can extend functionality by adding custom MCP servers in .marimo.toml:
```toml
[mcp.mcpServers.your-custom-server]
command = "npx"
args = ["-y", "@your-org/your-mcp-server"]
```

Expose MLflow trace operations to MCP-aware IDEs/assistants (e.g., Claude Desktop, Cursor) by running:

```bash
mlflow mcp run
```

Run the command from an environment where `MLFLOW_TRACKING_URI` (or `MLFLOW_BACKEND_STORE_URI`/`MLFLOW_DEFAULT_ARTIFACT_ROOT`) points at your experiments. The server stays up until interrupted and can be proxied alongside Marimo/MLflow so every tool shares the same MCP context.
Learn More:
- Marimo MCP Guide - Official MCP documentation
- Model Context Protocol - MCP specification and resources
Marimo Flow includes full Claude Code support with domain-specific skills, MCP servers, and automated hooks.
| Server | Purpose | Config |
|---|---|---|
| marimo | Notebook inspection, debugging, linting | HTTP on port 2718 |
| mlflow | Trace search, feedback, evaluation | stdio via mlflow mcp run |
| context7 | Live library documentation | stdio via npx |
| serena | Semantic code search | stdio via uvx |
Start marimo MCP server:

```bash
# Install once (recommended)
uv tool install "marimo[lsp,recommended,sql,mcp]>=0.18.0"

# Start server
marimo edit --mcp --no-token --port 2718 --headless
```

Three specialized skills in `.claude/Skills/` provide expert guidance:
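Before pointing an assistant at the headless server, it can help to confirm something is actually listening. A small stdlib socket probe (hypothetical helper; 2718 is the port from the command above) does the job:

```python
# Probe a TCP port to check whether a server (e.g. the marimo MCP
# server on 2718) is accepting connections. Stdlib-only sketch.
import socket

def port_open(host: str, port: int, timeout: float = 1.0) -> bool:
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    # Prints True only while `marimo edit --mcp ...` is running.
    print(port_open("127.0.0.1", 2718))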
| Skill | Triggers | MCP Tools |
|---|---|---|
| marimo | `marimo`, `reactive notebook`, `mo.ui` | Notebook inspection, linting, context7 docs |
| mlflow | `mlflow`, `experiment tracking`, `genai tracing` | Trace search, feedback, evaluation, context7 docs |
| pina | `pina`, `pinns`, `pde solver`, `neural operator` | MLflow tracking, context7 docs |
Pre-resolved context7 library IDs (no lookup needed):
- `/marimo-team/marimo` - marimo docs (2,413 snippets)
- `/mlflow/mlflow` - mlflow docs (9,559 snippets)
- `/mathlab/pina` - PINA docs (2,345 snippets)
Cross-platform hooks in .claude/settings.json:
| Hook | Trigger | Action |
|---|---|---|
| SessionStart | Session begins | Start marimo MCP server |
| PostToolUse | Edit/Write `.py` files | Auto-format with ruff |
| PreToolUse | Edit `uv.lock` | Block (protection) |
MCP config for VS Code Copilot in .vscode/mcp.json:
```json
{
  "servers": {
    "marimo": { "type": "http", "url": "http://127.0.0.1:2718/mcp/server" },
    "mlflow": { "type": "stdio", "command": "mlflow", "args": ["mcp", "run"] }
  }
}
```

Docker setup (configured in `docker/docker-compose.yaml`):

- `MLFLOW_BACKEND_STORE_URI`: `sqlite:////app/data/experiments/db/mlflow.db`
- `MLFLOW_DEFAULT_ARTIFACT_ROOT`: `/app/data/experiments/artifacts`
- `MLFLOW_HOST`: `0.0.0.0` (allows external access)
- `MLFLOW_PORT`: `5000`
- `OLLAMA_BASE_URL`: `http://host.docker.internal:11434` (requires Ollama on host)
Local development:
- `MLFLOW_TRACKING_URI`: `http://localhost:5000` (default)
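Notebook code can stay agnostic about where it runs by reading `MLFLOW_TRACKING_URI` and falling back to the local default above. A minimal sketch (stdlib only; no mlflow import is needed just to resolve the value):

```python
# Resolve the MLflow tracking URI: prefer the environment variable
# (set in the Docker setup), fall back to the local-development default.
import os

def tracking_uri(default: str = "http://localhost:5000") -> str:
    return os.environ.get("MLFLOW_TRACKING_URI", default)

# Typical use at the top of a notebook:
#   mlflow.set_tracking_uri(tracking_uri())
print(tracking_uri())
```

The same value then works unchanged in both local and containerized sessions.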
The Docker container runs both services via docker/start.sh:
- Marimo: Port 2718 - Interactive notebook environment
- MLflow: Port 5000 - Experiment tracking UI
GPU Support: NVIDIA GPU support is enabled by default. Remove the deploy.resources section in docker-compose.yaml if running without GPU.
- scikit-learn `^1.5.2` - Machine learning library
- NumPy `^2.1.3` - Numerical computing
- pandas `^2.2.3` - Data manipulation and analysis
- PyArrow `^18.0.0` - Columnar data processing
- SciPy `^1.14.1` - Scientific computing
- matplotlib `^3.9.2` - Plotting library

- Polars `^1.12.0` - Lightning-fast DataFrame library
- DuckDB `^1.1.3` - In-process analytical database
- Altair `^5.4.1` - Declarative statistical visualization

- OpenAI `^1.54.4` - GPT API integration
- FastAPI `^0.115.4` - Modern web framework
- Pydantic `^2.10.2` - Data validation

- SQLAlchemy `^2.0.36` - SQL toolkit and ORM
- Alembic `^1.14.0` - Database migrations
- SQLGlot `^25.30.2` - SQL parser and transpiler

- Black `^24.10.0` - Code formatter
- Ruff `^0.7.4` - Fast Python linter
- pytest `^8.3.3` - Testing framework
- MyPy `^1.13.0` - Static type checker
- Experiments: `GET /api/2.0/mlflow/experiments/list`
- Runs: `GET /api/2.0/mlflow/runs/search`
- Models: `GET /api/2.0/mlflow/registered-models/list`

- Notebooks: `GET /` - File browser and editor
- Apps: `GET /run/<notebook>` - Run notebook as web app
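The MLflow endpoints above can be hit with nothing but the stdlib. The sketch below only builds the request (issuing it requires the MLflow server from the setup section to be running on `localhost:5000`):

```python
# Build a request against MLflow's REST API using the endpoint paths
# listed above. Only URL construction is exercised here; uncomment the
# urlopen call against a running tracking server.
import urllib.request
from urllib.parse import urljoin

def mlflow_request(base_url: str, path: str) -> urllib.request.Request:
    """Join the tracking-server base URL with an API path."""
    url = urljoin(base_url.rstrip("/") + "/", path.lstrip("/"))
    return urllib.request.Request(url, headers={"Accept": "application/json"})

req = mlflow_request("http://localhost:5000", "/api/2.0/mlflow/runs/search")
print(req.full_url)  # http://localhost:5000/api/2.0/mlflow/runs/search

# with urllib.request.urlopen(req) as resp:
#     print(resp.read())
```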
We welcome contributions! Please see our Contributing Guidelines for details on:
- Development setup and workflow
- Code standards and style guide
- Testing requirements
- Pull request process
Quick Start for Contributors:
- Fork the repository
- Create a feature branch: `git checkout -b feature-name`
- Make your changes following the coding standards
- Test your changes: `uv run pytest`
- Submit a pull request
See CONTRIBUTING.md for comprehensive guidelines.
See CHANGELOG.md for a detailed version history and release notes.
Current Version: 0.2.0
This project is licensed under the MIT License - see the LICENSE file for details.
Built with ❤️ using Marimo and MLflow