From 369641f4d32559b457e5db9542e733e2848e271d Mon Sep 17 00:00:00 2001
From: Tue Haulund
Date: Wed, 1 Apr 2026 20:25:37 +0200
Subject: [PATCH 1/4] chore: update Replay team goals for Q2

---
 contents/teams/replay/objectives.mdx | 73 +++++++++++++++++++---------
 1 file changed, 51 insertions(+), 22 deletions(-)

diff --git a/contents/teams/replay/objectives.mdx b/contents/teams/replay/objectives.mdx
index 80da375456a8..dd1ac571b977 100644
--- a/contents/teams/replay/objectives.mdx
+++ b/contents/teams/replay/objectives.mdx
@@ -1,33 +1,62 @@
-## 🚀 Goal 1: Get Hayne rocketing
+## 🧠 Goal 1: Nail single-session on-demand summaries

-It's great to have Hayne on the team, let's make the time and priority to make it a success!
+Ship a reliable, on-demand AI summary for individual sessions. Users should be able to hit a button and get a useful natural-language summary of what happened in a session — key user actions, errors, frustration signals, and outcomes.

-## 👀 Goal 2: Make Replay low-maintenance 🔧
+- Wire up the rasterizer → AI pipeline end-to-end
+- Iterate on summary quality (prompt tuning, frame selection, context enrichment)
+- Support configurable customer-specific context (let customers provide their own domain context to make summaries more relevant)
+- Measure latency and cost per summary, and set targets for both

-* Clean up legacy features and finish outstanding migrations from old to new
-  * Move everyone to PostHog recorder
-  * Remove support for Blobby V1
-  * Remove LTS feature and related Celery tasks
-* Improve fidelity - let's squash the filtering bugs and visual inconsistencies that make Replay seem "rough" to use
-* Improve debugging facilities - it can be hard to reproduce issues locally or get our hands on the data we need, let's build the internal tooling needed to solve that
+## 🎬 Goal 2: Scale video export pipeline

-## ⌚ Goal 3: Replay everywhere!
+The rasterizer can render sessions to video — now make it production-ready at scale.
-* When a recording doesn't exist, explain why!
-* Make the view recording buttons consistent
-* Make the view recording buttons session aware
-* Out with modals, in with tabs!
-* Where else can we link to recordings from within other products?
-* External integrations - jam.dev? GitHub? ZenDesk? Linear? FreshDesk? Jira? Trello?
+- Harden the Temporal worker for reliability and throughput
+- Optimize resource usage (browser concurrency, memory, CPU)
+- Explore GPU acceleration for further speedup

-## 🏎️ Goal 4: Make Replay compliant
+## 🏷️ Goal 3: Build session categorizer

-* Let's shred some recordings 🤘
+Train a binary classifier that can label sessions as "interesting" or not. Use this model to power a categorizer that groups sessions by properties (rage clicks, errors, conversion flows, etc.) but only surfaces the ones the model flags as interesting.

-## 💰 Goal 5: How are our unit economics trending?
+- Extend the session summarizer (Goal 1) to also classify sessions as interesting/not — this becomes our training dataset
+- Train and evaluate the binary classifier
+- Build the categorizer layer on top: group by session properties, filter by interestingness
+- Run the classifier on a schedule

-How have recent infrastructure changes impacted our unit economics? Is our pricing still competitive?
+## 🖥️ Goal 4: Build new AI-native frontend for Replay

-## 👨‍🔬 Goal 6. Make "show me something interesting" tangible
+Reimagine the Replay UI around AI-first workflows. Move away from recency as the primary way to rank recordings — instead, leverage the session categorizer (Goal 3) to surface interesting recordings across categories.

-Customers want us to highlight interesting moments and sessions, but that means different things to different people. Can we generate a library of what Interesting means to our customers?
+- Replace the chronological recording list with category-driven views
+- Surface AI-generated summaries and labels directly in the list UI
+- Design the UX for exploring sessions by category rather than scrolling by time
+
+## 🔌 Goal 5: Basic MCP for Session Replay
+
+Expose Session Replay as MCP tools so AI agents (Claude Desktop, etc.) can search, retrieve, and summarize sessions programmatically.
+
+## 💰 Goal 6: Business model for AI features
+
+Figure out how to price and sustain AI-powered Replay features.
+
+- Map out costs per session for summaries, categorization, and video export
+- Identify break-even points at different usage tiers
+- Propose a pricing model that works for customers and for us
+
+## 🔒 Goal 7: Map out PII, data retention, and IP concerns
+
+Figure out the compliance, legal, and brand concerns around training future models on recording data.
+
+- What PII risks exist when sending session data to LLMs?
+- How do data retention policies interact with AI-generated artifacts?
+- What are the IP implications of training on customer recordings?
+- What do customers expect and what do we need to communicate?
+
+## 🗂️ Goal 8: Set up data labelling system for replays
+
+Build the infrastructure for labelling replay data at scale — needed to train and evaluate future models on session data.
+
+- Evaluate labelling tools / build internal tooling
+- Design the labelling workflow and quality controls
+- Start building a labelled dataset

From 306b80e310039740bc253ff65ad725207c5e9611 Mon Sep 17 00:00:00 2001
From: Tue Haulund
Date: Wed, 1 Apr 2026 22:00:43 +0200
Subject: [PATCH 2/4] Update contents/teams/replay/objectives.mdx

Co-authored-by: coryslater <25396141+fivestarspicy@users.noreply.github.com>
---
 contents/teams/replay/objectives.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/contents/teams/replay/objectives.mdx b/contents/teams/replay/objectives.mdx
index dd1ac571b977..6b3242050a0a 100644
--- a/contents/teams/replay/objectives.mdx
+++ b/contents/teams/replay/objectives.mdx
@@ -28,7 +28,7 @@ Train a binary classifier that can label sessions as "interesting" or not. Use t

 Reimagine the Replay UI around AI-first workflows. Move away from recency as the primary way to rank recordings — instead, leverage the session categorizer (Goal 3) to surface interesting recordings across categories.
-- Replace the chronological recording list with category-driven views
+- Augment the chronological recording list with category-driven views
 - Surface AI-generated summaries and labels directly in the list UI
 - Design the UX for exploring sessions by category rather than scrolling by time

From ce7b6a5ebcb4f09497eef4f61e58415dade1c9ae Mon Sep 17 00:00:00 2001
From: Tue Haulund
Date: Wed, 1 Apr 2026 22:01:02 +0200
Subject: [PATCH 3/4] Update contents/teams/replay/objectives.mdx

Co-authored-by: Paul D'Ambra
---
 contents/teams/replay/objectives.mdx | 1 +
 1 file changed, 1 insertion(+)

diff --git a/contents/teams/replay/objectives.mdx b/contents/teams/replay/objectives.mdx
index 6b3242050a0a..60a22c03a2e0 100644
--- a/contents/teams/replay/objectives.mdx
+++ b/contents/teams/replay/objectives.mdx
@@ -52,6 +52,7 @@ Figure out the compliance, legal, and brand concerns around training future mode
 - How do data retention policies interact with AI-generated artifacts?
 - What are the IP implications of training on customer recordings?
 - What do customers expect and what do we need to communicate?
+- What does s-tier consent look like?

 ## 🗂️ Goal 8: Set up data labelling system for replays

From ef73d56555e19c188e8d385c87c6526a4062acb3 Mon Sep 17 00:00:00 2001
From: Tue Haulund
Date: Wed, 1 Apr 2026 22:01:09 +0200
Subject: [PATCH 4/4] Update contents/teams/replay/objectives.mdx

Co-authored-by: Paul D'Ambra
---
 contents/teams/replay/objectives.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/contents/teams/replay/objectives.mdx b/contents/teams/replay/objectives.mdx
index 60a22c03a2e0..118c0473d39c 100644
--- a/contents/teams/replay/objectives.mdx
+++ b/contents/teams/replay/objectives.mdx
@@ -46,7 +46,7 @@

 ## 🔒 Goal 7: Map out PII, data retention, and IP concerns

-Figure out the compliance, legal, and brand concerns around training future models on recording data.
+Before we begin using machine learning, we need to figure out the compliance, legal, and brand concerns around training future models on recording data.

 - What PII risks exist when sending session data to LLMs?
 - How do data retention policies interact with AI-generated artifacts?