From 6e78b4d53ddfd7f1a749e1a755694042fb1e720b Mon Sep 17 00:00:00 2001 From: wsiemens Date: Wed, 1 Apr 2026 21:06:49 -0500 Subject: [PATCH 1/3] initial github for finally project --- .claude/commands/doc-review.md | 1 + .claude/settings.json | 4 +- .claude/settings.local.json | 11 ++ README.md | 78 ++++----- .../.claude-plugin/plugin.json | 5 + independent-reviewer/hooks/hooks.json | 15 ++ planning/MARKET_DATA_SUMMARY.md | 104 ------------ planning/PLAN.md | 152 +++++++++++++++++- 8 files changed, 221 insertions(+), 149 deletions(-) create mode 100644 .claude/commands/doc-review.md create mode 100644 .claude/settings.local.json create mode 100644 independent-reviewer/.claude-plugin/plugin.json create mode 100644 independent-reviewer/hooks/hooks.json delete mode 100644 planning/MARKET_DATA_SUMMARY.md diff --git a/.claude/commands/doc-review.md b/.claude/commands/doc-review.md new file mode 100644 index 00000000..e9627c72 --- /dev/null +++ b/.claude/commands/doc-review.md @@ -0,0 +1 @@ +Review the documentation file in the planning folder called $ARGUMENTS and add questions, clarifications or feedback to a new section at the end, along with any opportunities to simplify. 
\ No newline at end of file diff --git a/.claude/settings.json b/.claude/settings.json index aa06f43d..f0bf3a08 100644 --- a/.claude/settings.json +++ b/.claude/settings.json @@ -2,6 +2,8 @@ "enabledPlugins": { "frontend-design@claude-plugins-official": true, "context7@claude-plugins-official": true, - "playwright@claude-plugins-official": true + "playwright@claude-plugins-official": true, + "independent-reviewer@Wayne-Tools": true, + "code-simplifier@claude-plugins-official": true } } diff --git a/.claude/settings.local.json b/.claude/settings.local.json new file mode 100644 index 00000000..683b58c3 --- /dev/null +++ b/.claude/settings.local.json @@ -0,0 +1,11 @@ +{ + "permissions": { + "allow": [ + "WebSearch", + "WebFetch(domain:polygon.io)", + "WebFetch(domain:massive.com)", + "Bash(codex exec:*)", + "Skill(update-config)" + ] + } +} diff --git a/README.md b/README.md index 3f2582ae..e228ff54 100644 --- a/README.md +++ b/README.md @@ -1,62 +1,64 @@ # FinAlly — AI Trading Workstation -A visually stunning AI-powered trading workstation that streams live market data, simulates portfolio trading, and integrates an LLM chat assistant that can analyze positions and execute trades via natural language. - -Built entirely by coding agents as a capstone project for an agentic AI coding course. +A visually striking AI-powered trading workstation with live market data, simulated portfolio management, and an LLM chat assistant that can analyze positions and execute trades. Looks and feels like a Bloomberg terminal with an AI copilot. 
## Features -- **Live price streaming** via SSE with green/red flash animations -- **Simulated portfolio** — $10k virtual cash, market orders, instant fills -- **Portfolio visualizations** — heatmap (treemap), P&L chart, positions table -- **AI chat assistant** — analyzes holdings, suggests and auto-executes trades -- **Watchlist management** — track tickers manually or via AI -- **Dark terminal aesthetic** — Bloomberg-inspired, data-dense layout - -## Architecture - -Single Docker container serving everything on port 8000: - -- **Frontend**: Next.js (static export) with TypeScript and Tailwind CSS -- **Backend**: FastAPI (Python/uv) with SSE streaming -- **Database**: SQLite with lazy initialization -- **AI**: LiteLLM → OpenRouter (Cerebras inference) with structured outputs -- **Market data**: Built-in GBM simulator (default) or Massive API (optional) +- Live streaming prices with green/red flash animations +- Sparkline mini-charts built from the SSE stream +- $10,000 virtual cash to trade with (market orders, instant fill) +- Portfolio heatmap (treemap) and P&L chart +- AI chat assistant — ask questions, get analysis, execute trades via natural language ## Quick Start ```bash -# Clone and configure +# Copy and fill in your API key cp .env.example .env -# Add your OPENROUTER_API_KEY to .env -# Run with Docker -docker build -t finally . -docker run -v finally-data:/app/db -p 8000:8000 --env-file .env finally +# Start (macOS/Linux) +./scripts/start_mac.sh -# Open http://localhost:8000 +# Start (Windows PowerShell) +./scripts/start_windows.ps1 ``` +Open [http://localhost:8000](http://localhost:8000). 
+ ## Environment Variables | Variable | Required | Description | |---|---|---| -| `OPENROUTER_API_KEY` | Yes | OpenRouter API key for AI chat | -| `MASSIVE_API_KEY` | No | Massive (Polygon.io) key for real market data; omit to use simulator | +| `OPENROUTER_API_KEY` | Yes | OpenRouter key for LLM chat | +| `MASSIVE_API_KEY` | No | Polygon.io key for real market data (simulator used if absent) | | `LLM_MOCK` | No | Set `true` for deterministic mock LLM responses (testing) | -## Project Structure +## Architecture +Single Docker container on port 8000: + +- **Frontend**: Next.js (TypeScript), built as a static export, served by FastAPI +- **Backend**: FastAPI (Python/uv) +- **Database**: SQLite, volume-mounted at `db/finally.db` +- **Real-time**: Server-Sent Events (SSE) for price streaming +- **AI**: LiteLLM → OpenRouter (Cerebras inference) with structured outputs + +## Development + +```bash +# Backend +cd backend && uv sync --extra dev +uv run pytest -v + +# Frontend +cd frontend && npm install +npm run dev ``` -finally/ -├── frontend/ # Next.js static export -├── backend/ # FastAPI uv project -├── planning/ # Project documentation and agent contracts -├── test/ # Playwright E2E tests -├── db/ # SQLite volume mount (runtime) -└── scripts/ # Start/stop helpers -``` -## License +## Testing + +E2E tests use Playwright with `LLM_MOCK=true`: -See [LICENSE](LICENSE). 
+```bash +cd test && docker compose -f docker-compose.test.yml up +``` diff --git a/independent-reviewer/.claude-plugin/plugin.json b/independent-reviewer/.claude-plugin/plugin.json new file mode 100644 index 00000000..f2379ba7 --- /dev/null +++ b/independent-reviewer/.claude-plugin/plugin.json @@ -0,0 +1,5 @@ +{ + "name": "independent-reviewer", + "description": "Carry out an independent review of all changes since last commit", + "version": "1.0.0" +} \ No newline at end of file diff --git a/independent-reviewer/hooks/hooks.json b/independent-reviewer/hooks/hooks.json new file mode 100644 index 00000000..0cd1c9ba --- /dev/null +++ b/independent-reviewer/hooks/hooks.json @@ -0,0 +1,15 @@ +{ + "hooks": { + "Stop": [ + { + "hooks": [ + { + "type": "command", + "command": "codex exec \"Review changes since last commit and write changes to planning/REVIEW.md\"" + } + ] + } + ] + } +} + diff --git a/planning/MARKET_DATA_SUMMARY.md b/planning/MARKET_DATA_SUMMARY.md deleted file mode 100644 index ae518283..00000000 --- a/planning/MARKET_DATA_SUMMARY.md +++ /dev/null @@ -1,104 +0,0 @@ -# Market Data Backend — Summary - -**Status:** Complete, tested, reviewed, all issues resolved. - -## What Was Built - -A complete market data subsystem in `backend/app/market/` (8 modules, ~500 lines) providing live price simulation and real market data via a unified interface. 
- -### Architecture - -``` -MarketDataSource (ABC) -├── SimulatorDataSource → GBM simulator (default, no API key needed) -└── MassiveDataSource → Polygon.io REST poller (when MASSIVE_API_KEY set) - │ - ▼ - PriceCache (thread-safe, in-memory) - │ - ├──→ SSE stream endpoint (/api/stream/prices) - ├──→ Portfolio valuation - └──→ Trade execution -``` - -### Modules - -| File | Purpose | -|------|---------| -| `models.py` | `PriceUpdate` — immutable frozen dataclass (ticker, price, previous_price, timestamp, change, direction) | -| `interface.py` | `MarketDataSource` — abstract base class defining `start/stop/add_ticker/remove_ticker/get_tickers` | -| `cache.py` | `PriceCache` — thread-safe price store with version counter for SSE change detection | -| `seed_prices.py` | Realistic seed prices, per-ticker GBM params (drift/volatility), correlation groups | -| `simulator.py` | `GBMSimulator` (Geometric Brownian Motion with Cholesky-correlated moves) + `SimulatorDataSource` | -| `massive_client.py` | `MassiveDataSource` — REST polling client for Polygon.io via the `massive` package | -| `factory.py` | `create_market_data_source()` — selects simulator or Massive based on `MASSIVE_API_KEY` env var | -| `stream.py` | `create_stream_router()` — FastAPI SSE endpoint factory using version-based change detection | - -### Key Design Decisions - -- **Strategy pattern** — both data sources implement the same ABC; downstream code is source-agnostic -- **PriceCache as single point of truth** — producers write, consumers read; no direct coupling -- **GBM with correlated moves** — Cholesky decomposition of sector-based correlation matrix; tech stocks correlate at 0.6, finance at 0.5, cross-sector at 0.3 -- **Random shock events** — ~0.1% chance per tick per ticker of a 2-5% move for visual drama -- **SSE over WebSockets** — simpler, one-way push, universal browser support - -## Test Suite - -**73 tests, all passing.** 6 test modules in `backend/tests/market/`. 
- -| Module | Tests | Coverage | -|--------|-------|----------| -| test_models.py | 11 | models.py: 100% | -| test_cache.py | 13 | cache.py: 100% | -| test_simulator.py | 17 | simulator.py: 98% | -| test_simulator_source.py | 10 | (integration tests) | -| test_factory.py | 7 | factory.py: 100% | -| test_massive.py | 13 | massive_client.py: 56% (expected — API methods mocked) | - -Overall coverage: 84%. - -## Code Review & Fixes Applied - -A comprehensive code review identified 7 issues. All were resolved: - -1. **pyproject.toml build config** — added `[tool.hatch.build.targets.wheel] packages = ["app"]` -2. **Lazy imports removed** — `massive` is a core dependency; imports moved to top level -3. **SSE return type fixed** — `_generate_events` annotated as `AsyncGenerator[str, None]` -4. **Public `get_tickers()`** — added to `GBMSimulator` to avoid private attribute access -5. **Correlation constants cleaned up** — removed unused `DEFAULT_CORR`, consolidated into `CROSS_GROUP_CORR` -6. **Unused test imports removed** — `pytest`, `math`, `asyncio` cleaned from 4 test files -7. **Massive test mocks fixed** — `source._client` set in tests, patches target correct names - -## Demo - -A Rich terminal demo is available at `backend/market_data_demo.py`: - -```bash -cd backend -uv run market_data_demo.py -``` - -Displays a live-updating dashboard with all 10 tickers, sparklines, color-coded direction arrows, and an event log for notable price moves. Runs 60 seconds or until Ctrl+C. 
- -## Usage for Downstream Code - -```python -from app.market import PriceCache, create_market_data_source - -# Startup -cache = PriceCache() -source = create_market_data_source(cache) # Reads MASSIVE_API_KEY -await source.start(["AAPL", "GOOGL", "MSFT", ...]) - -# Read prices -update = cache.get("AAPL") # PriceUpdate or None -price = cache.get_price("AAPL") # float or None -all_prices = cache.get_all() # dict[str, PriceUpdate] - -# Dynamic watchlist -await source.add_ticker("TSLA") -await source.remove_ticker("GOOGL") - -# Shutdown -await source.stop() -``` diff --git a/planning/PLAN.md b/planning/PLAN.md index bc1811b3..2ba779f4 100644 --- a/planning/PLAN.md +++ b/planning/PLAN.md @@ -147,13 +147,25 @@ LLM_MOCK=false Both the simulator and the Massive client implement the same abstract interface. The backend selects which to use based on the environment variable. All downstream code (SSE streaming, price cache, frontend) is agnostic to the source. +### Daily Change % + +Both implementations expose a `prev_close` price per ticker, used to compute the "daily change %" shown in the watchlist: + +``` +change = current_price - prev_close +change_pct = (change / prev_close) * 100 +``` + +- **Massive API**: fetches `prev_close` once on startup (and when a new ticker is added) via `GET /v2/aggs/ticker/{ticker}/prev` — available on the free tier. The response field is `results[0].c` (previous day's closing price). +- **Simulator**: each ticker is seeded with a hardcoded `prev_close` that is slightly different from its starting price (e.g., ±0.5–1.5%), giving realistic-looking daily change values from the start. 
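+The daily-change formula above can be sketched as a small helper. This is illustrative only: the function name, rounding, and return shape are assumptions for the sketch, not taken from the codebase.

```python
def day_change(current_price: float, prev_close: float) -> tuple[float, float]:
    """Return (day_change, day_change_pct) relative to the previous close.

    Hypothetical helper illustrating the formula; not the project's actual code.
    """
    change = current_price - prev_close
    change_pct = (change / prev_close) * 100
    # Round to two decimals, matching the precision shown in the API examples.
    return round(change, 2), round(change_pct, 2)


# AAPL closed at 189.50 yesterday and now trades at 191.42:
print(day_change(191.42, 189.50))  # (1.92, 1.01)
```

+Both data sources only need to supply `prev_close` once per ticker; every subsequent tick can derive the daily change from the cached value.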
+ ### Simulator (Default) - Generates prices using geometric Brownian motion (GBM) with configurable drift and volatility per ticker - Updates at ~500ms intervals - Correlated moves across tickers (e.g., tech stocks move together) - Occasional random "events" — sudden 2-5% moves on a ticker for drama -- Starts from realistic seed prices (e.g., AAPL ~$190, GOOGL ~$175, etc.) +- Starts from realistic seed prices (e.g., AAPL ~$190, GOOGL ~$175, etc.) with a hardcoded `prev_close` per ticker - Runs as an in-process background task — no external dependencies ### Massive API (Optional) @@ -162,12 +174,13 @@ Both the simulator and the Massive client implement the same abstract interface. - Polls for the union of all watched tickers on a configurable interval - Free tier (5 calls/min): poll every 15 seconds - Paid tiers: poll every 2-15 seconds depending on tier +- Fetches `prev_close` per ticker via `GET /v2/aggs/ticker/{ticker}/prev` on startup and when new tickers are added - Parses REST response into the same format as the simulator ### Shared Price Cache - A single background task (simulator or Massive poller) writes to an in-memory price cache -- The cache holds the latest price, previous price, and timestamp for each ticker +- The cache holds the latest price, previous price, prev_close, and timestamp for each ticker - SSE streams read from this cache and push updates to connected clients - This architecture supports future multi-user scenarios without changes to the data layer @@ -228,7 +241,7 @@ All tables include a `user_id` column defaulting to `"default"`. This is hardcod **portfolio_snapshots** — Portfolio value over time (for P&L chart). Recorded every 30 seconds by a background task, and immediately after each trade execution. 
- `id` TEXT PRIMARY KEY (UUID) - `user_id` TEXT (default: `"default"`) -- `total_value` REAL +- `total_value` REAL — cash balance plus market value of all open positions - `recorded_at` TEXT (ISO timestamp) **chat_messages** — Conversation history with LLM @@ -253,6 +266,23 @@ All tables include a `user_id` column defaulting to `"default"`. This is hardcod |--------|------|-------------| | GET | `/api/stream/prices` | SSE stream of live price updates | +Each SSE event is a JSON object: +```json +{ + "ticker": "AAPL", + "price": 191.42, + "prev_price": 191.10, + "prev_close": 189.50, + "change": 0.32, + "change_pct": 0.17, + "day_change": 1.92, + "day_change_pct": 1.01, + "direction": "up", + "timestamp": "2026-03-31T12:00:00.000Z" +} +``` +`direction` is `"up"`, `"down"`, or `"unchanged"`. `change`/`change_pct` are tick-over-tick. `day_change`/`day_change_pct` are relative to `prev_close`. + ### Portfolio | Method | Path | Description | |--------|------|-------------| @@ -260,6 +290,58 @@ All tables include a `user_id` column defaulting to `"default"`. This is hardcod | POST | `/api/portfolio/trade` | Execute a trade: `{ticker, quantity, side}` | | GET | `/api/portfolio/history` | Portfolio value snapshots over time (for P&L chart) | +**GET `/api/portfolio`** response: +```json +{ + "cash": 7430.00, + "total_value": 9344.20, + "positions": [ + { + "ticker": "AAPL", + "quantity": 10, + "avg_cost": 185.00, + "current_price": 191.42, + "market_value": 1914.20, + "unrealized_pnl": 64.20, + "unrealized_pnl_pct": 3.47 + } + ] +} +``` +`total_value` = `cash` + sum of all `market_value`. Positions with `quantity = 0` are omitted. + +**POST `/api/portfolio/trade`** request: `{ticker, quantity, side}` where `side` is `"buy"` or `"sell"`. 
+ +Success response (`200`): +```json +{ + "ok": true, + "trade": { + "ticker": "AAPL", + "side": "buy", + "quantity": 10, + "price": 191.42, + "executed_at": "2026-03-31T12:00:00.000Z" + } +} +``` +Error response (`400`): +```json +{ + "ok": false, + "error": "Insufficient cash" +} +``` + +**GET `/api/portfolio/history`** response: +```json +[ + {"timestamp": "2026-03-31T11:30:00.000Z", "total_value": 10000.00}, + {"timestamp": "2026-03-31T12:00:00.000Z", "total_value": 12150.00} +] +``` +Ordered oldest-first. Used directly by the P&L chart. + ### Watchlist | Method | Path | Description | |--------|------|-------------| @@ -267,16 +349,71 @@ | POST | `/api/watchlist` | Add a ticker: `{ticker}` | | DELETE | `/api/watchlist/{ticker}` | Remove a ticker | +**GET `/api/watchlist`** response: +```json +[ + { + "ticker": "AAPL", + "price": 191.42, + "prev_price": 191.10, + "change": 0.32, + "change_pct": 0.17, + "direction": "up" + } +] +``` +If no price is available yet (e.g. ticker just added), `price` is `null` and other price fields are `null`. + +**POST `/api/watchlist`** request: `{ticker}`. Returns `201` on success: +```json +{"ok": true, "ticker": "PYPL"} +``` +Returns `400` if ticker already in watchlist, `422` if ticker is invalid/not found in price cache. + +**DELETE `/api/watchlist/{ticker}`** returns `200`: +```json +{"ok": true, "ticker": "PYPL"} +``` +Returns `404` if ticker not in watchlist. + ### Chat | Method | Path | Description | |--------|------|-------------| | POST | `/api/chat` | Send a message, receive complete JSON response (message + executed actions) | +**POST `/api/chat`** request: `{message: "Buy 5 shares of NVDA and add PYPL to my watchlist"}`. 
+ +Response (`200`): +```json +{ + "message": "Done — bought 5 shares of NVDA at $875.10.", + "trades_executed": [ + { + "ticker": "NVDA", + "side": "buy", + "quantity": 5, + "price": 875.10, + "ok": true + } + ], + "watchlist_changes_executed": [ + { + "ticker": "PYPL", + "action": "add", + "ok": true + } + ] +} +``` +`trades_executed` and `watchlist_changes_executed` are always present (empty arrays if none). Each item includes `ok: true/false` and an optional `error` string if the action failed validation. + ### System | Method | Path | Description | |--------|------|-------------| | GET | `/api/health` | Health check (for Docker/deployment) | +**GET `/api/health`** response (`200`): `{"status": "ok"}` + --- ## 9. LLM Integration @@ -290,7 +427,7 @@ There is an OPENROUTER_API_KEY in the .env file in the project root. When the user sends a chat message, the backend: 1. Loads the user's current portfolio context (cash, positions with P&L, watchlist with live prices, total portfolio value) -2. Loads recent conversation history from the `chat_messages` table +2. Loads the last 50 messages from `chat_messages` (capped to avoid context overflow) 3. Constructs a prompt with a system message, portfolio context, conversation history, and the user's new message 4. Calls the LLM via LiteLLM → OpenRouter, requesting structured output, using the cerebras-inference skill 5. Parses the complete structured JSON response @@ -309,14 +446,15 @@ The LLM is instructed to respond with JSON matching this schema: {"ticker": "AAPL", "side": "buy", "quantity": 10} ], "watchlist_changes": [ - {"ticker": "PYPL", "action": "add"} + {"ticker": "PYPL", "action": "add"}, + {"ticker": "NFLX", "action": "remove"} ] } ``` - `message` (required): The conversational text shown to the user - `trades` (optional): Array of trades to auto-execute. 
Each trade goes through the same validation as manual trades (sufficient cash for buys, sufficient shares for sells) -- `watchlist_changes` (optional): Array of watchlist modifications +- `watchlist_changes` (optional): Array of watchlist modifications. `action` is `"add"` or `"remove"`. ### Auto-Execution @@ -454,3 +592,5 @@ The container is designed to deploy to AWS App Runner, Render, or any container - Portfolio visualization: heatmap renders with correct colors, P&L chart has data points - AI chat (mocked): send a message, receive a response, trade execution appears inline - SSE resilience: disconnect and verify reconnection + +--- From 3961650d58c2ece8938b7afd1f6dd931276997c9 Mon Sep 17 00:00:00 2001 From: wsiemens <34421933+wsiemens@users.noreply.github.com> Date: Fri, 3 Apr 2026 19:11:02 -0500 Subject: [PATCH 2/3] "Update Claude PR Assistant workflow" From 35068c9f147a686a894c490bc4f7fc88983003fa Mon Sep 17 00:00:00 2001 From: wsiemens <34421933+wsiemens@users.noreply.github.com> Date: Fri, 3 Apr 2026 19:11:03 -0500 Subject: [PATCH 3/3] "Update Claude Code Review workflow"