Introducing Middleman — The Precision Context Refiner for MCP Pipelines #705
JithunMethusahan started this conversation in Show and tell
Hey everyone 👋
I'm excited to share a tool I built to solve a problem that's been quietly draining tokens from every MCP-powered workflow: context bloat.
The Problem
When your AI agent fetches a Wikipedia article or scrapes a webpage via MCP, it receives the entire raw payload — often 15,000–25,000 tokens of noise for maybe 800 tokens of useful signal. You pay for all of it.
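To make the gap concrete, here is a quick back-of-the-envelope sketch using the figures above (the specific numbers are illustrative, not measurements from Middleman itself):

```python
# Rough illustration of the waste described above (numbers are hypothetical,
# taken from the 15k-25k raw vs ~800 useful range mentioned in the post).
raw_tokens = 20_000      # typical raw payload from a Wikipedia fetch or scrape
useful_tokens = 800      # the signal the agent actually needed
waste = raw_tokens - useful_tokens

print(f"wasted tokens per fetch: {waste} ({waste / raw_tokens:.0%} of the payload)")
```

At those rates, roughly 96% of every fetched payload is billed noise.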
What Middleman Does
Middleman is an MCP server that sits between your data source and your LLM. It intercepts the raw data, refines it using Gemini 1.5 Flash, and delivers a clean, XML-structured summary.
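The overall flow can be sketched like this. Note this is a minimal mock of the intercept-then-refine pattern, not Middleman's actual code: the function names are hypothetical, and the real server calls Gemini 1.5 Flash where the stub below just keeps the first sentence:

```python
def refine(raw: str) -> str:
    """Stand-in for the Gemini 1.5 Flash refinement step.

    Here we simply keep the first sentence to show the shape of the
    pipeline; the real server produces a structured summary instead.
    """
    summary = raw.split(".")[0].strip()
    return f"<summary>{summary}.</summary>"

def fetch_via_middleman(fetch_raw, url: str) -> str:
    raw = fetch_raw(url)   # upstream MCP tool returns the full raw payload
    return refine(raw)     # only the refined, XML-structured context reaches the LLM

# Usage with a fake data source standing in for a scraper/Wikipedia tool:
fake_source = lambda url: "MCP is a protocol. " + "noise " * 1000
print(fetch_via_middleman(fake_source, "https://example.org"))
# → <summary>MCP is a protocol.</summary>
```

The key design point is that the LLM never sees the raw payload at all; it only pays for the refined output.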
Real numbers:
Key Features
- Supports .txt, .log, and .md files
Get Started
Repo: https://github.com/JithunMethusahan/middleman
I'd love feedback from the community — especially if you test it with your own MCP stack. Happy to answer questions below!