2 changes: 1 addition & 1 deletion .github/workflows/claude.yml
@@ -46,5 +46,5 @@ jobs:
# Optional: Add claude_args to customize behavior and configuration
# See https://github.com/anthropics/claude-code-action/blob/main/docs/usage.md
# or https://code.claude.com/docs/en/cli-reference for available options
# claude_args: '--allowed-tools Bash(gh pr:*)'
# claude_args: '--allowed-tools Bash(gh pr *)'

13 changes: 13 additions & 0 deletions .gitignore
@@ -205,3 +205,16 @@ cython_debug/
marimo/_static/
marimo/_lsp/
__marimo__/

# FinAlly project-specific
# Runtime SQLite databases (schema lives in backend/app/db, data is volatile)
db/*.db
db/*.db-journal
backend/db/*.db
backend/db/*.db-journal

# Playwright MCP scratch data
.playwright-mcp/

# Ad-hoc screenshots saved at repo root
/finally-*.png
77 changes: 40 additions & 37 deletions README.md
@@ -1,62 +1,65 @@
# FinAlly — AI Trading Workstation

A visually stunning AI-powered trading workstation that streams live market data, simulates portfolio trading, and integrates an LLM chat assistant that can analyze positions and execute trades via natural language.
An AI-powered trading workstation that streams live market data, simulates a $10k portfolio, and ships with an LLM chat assistant that can analyze positions and execute approved trades via natural language.

Built entirely by coding agents as a capstone project for an agentic AI coding course.
Built as the capstone project for an agentic AI coding course. The full specification lives in [`planning/PLAN.md`](planning/PLAN.md).

## Features
## Stack

- **Live price streaming** via SSE with green/red flash animations
- **Simulated portfolio** — $10k virtual cash, market orders, instant fills
- **Portfolio visualizations** — heatmap (treemap), P&L chart, positions table
- **AI chat assistant** — analyzes holdings, suggests and auto-executes trades
- **Watchlist management** — track tickers manually or via AI
- **Dark terminal aesthetic** — Bloomberg-inspired, data-dense layout
- **Backend** — FastAPI (Python 3.12+, managed with `uv`), SQLite, SSE streaming
- **Frontend** — Next.js 15 + React 19, TypeScript, Tailwind CSS, TradingView Lightweight Charts, Recharts
- **AI** — LiteLLM → OpenRouter (`openrouter/openai/gpt-oss-120b`) with structured outputs
- **Market data** — In-process GBM simulator by default, optional Massive REST client

## Architecture
## Running Locally

Single Docker container serving everything on port 8000:
Configure environment variables at the repo root in `.env` (all optional):

- **Frontend**: Next.js (static export) with TypeScript and Tailwind CSS
- **Backend**: FastAPI (Python/uv) with SSE streaming
- **Database**: SQLite with lazy initialization
- **AI**: LiteLLM → OpenRouter (Cerebras inference) with structured outputs
- **Market data**: Built-in GBM simulator (default) or Massive API (optional)
```bash
OPENROUTER_API_KEY= # omit to run chat in deterministic mock mode
MASSIVE_API_KEY= # omit to run the built-in simulator
LLM_MOCK=false # true forces mock chat responses
SIMULATOR_SEED= # integer seed for reproducible price paths
```
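Per the backend README, `LLM_MOCK=true` forces mock chat, and an unset `OPENROUTER_API_KEY` also falls back to mock mode. A sketch of that fallback rule (an illustration of the described behavior, not the backend's actual code):

```python
import os


def use_mock_chat() -> bool:
    """Mock chat when LLM_MOCK=true, or when no OpenRouter key is configured."""
    if os.environ.get("LLM_MOCK", "").lower() == "true":
        return True
    return not os.environ.get("OPENROUTER_API_KEY")
```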

## Quick Start
Backend (serves the API on http://localhost:8000):

```bash
# Clone and configure
cp .env.example .env
# Add your OPENROUTER_API_KEY to .env
cd backend
uv sync --extra dev
uv run uvicorn app.main:app --reload
```

# Run with Docker
docker build -t finally .
docker run -v finally-data:/app/db -p 8000:8000 --env-file .env finally
Frontend (Next.js dev server on http://localhost:3000, proxies to the backend):

# Open http://localhost:8000
```bash
cd frontend
npm install
npm run dev
```

## Environment Variables
The backend lazily creates `backend/db/finally.db` on first request and seeds the default watchlist (AAPL, GOOGL, MSFT, AMZN, TSLA, NVDA, META, JPM, V, NFLX), a `default` user profile with $10,000 cash, and an initial portfolio snapshot.
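Lazy create-and-seed is typically a module-level guard that runs schema setup at most once per process. A generic, self-contained sketch of the pattern (not the repo's exact `ensure_initialized`, which also seeds the profile and snapshot tables):

```python
import sqlite3

_initialized = False

DEFAULT_WATCHLIST = (
    "AAPL", "GOOGL", "MSFT", "AMZN", "TSLA",
    "NVDA", "META", "JPM", "V", "NFLX",
)


def ensure_initialized(conn: sqlite3.Connection) -> None:
    """Create and seed the schema once per process; later calls are no-ops."""
    global _initialized
    if _initialized:
        return
    conn.execute("CREATE TABLE IF NOT EXISTS watchlist (ticker TEXT PRIMARY KEY)")
    conn.executemany(
        "INSERT OR IGNORE INTO watchlist (ticker) VALUES (?)",
        [(t,) for t in DEFAULT_WATCHLIST],
    )
    _initialized = True
```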

| Variable | Required | Description |
|---|---|---|
| `OPENROUTER_API_KEY` | Yes | OpenRouter API key for AI chat |
| `MASSIVE_API_KEY` | No | Massive (Polygon.io) key for real market data; omit to use simulator |
| `LLM_MOCK` | No | Set `true` for deterministic mock LLM responses (testing) |
## Tests

## Project Structure
```bash
cd backend
uv run --extra dev pytest # full suite
uv run --extra dev ruff check . # lint
```

## Repository Layout

```
finally/
├── frontend/ # Next.js static export
├── backend/ # FastAPI uv project
├── planning/ # Project documentation and agent contracts
├── test/ # Playwright E2E tests
├── db/ # SQLite volume mount (runtime)
└── scripts/ # Start/stop helpers
├── backend/ FastAPI app (app/), schema + seed (db/), pytest suite (tests/)
├── frontend/ Next.js app (app/, components/, lib/)
├── planning/ Project specification — PLAN.md is the source of truth
└── CLAUDE.md Agent instructions
```

Docker packaging, start/stop scripts, and end-to-end Playwright tests described in `planning/PLAN.md` are not yet in the repo.

## License

See [LICENSE](LICENSE).
5 changes: 4 additions & 1 deletion backend/README.md
@@ -10,7 +10,7 @@ FastAPI backend for the FinAlly AI Trading Workstation.
- `cache.py` - Thread-safe price cache
- `interface.py` - MarketDataSource abstract interface
- `simulator.py` - GBM-based market simulator
- `massive_client.py` - Massive/Polygon.io API client
- `massive_client.py` - Massive API client
- `factory.py` - Data source factory
- `stream.py` - SSE streaming endpoint
- `seed_prices.py` - Default ticker prices and parameters
@@ -39,7 +39,10 @@ uv run pytest -v

## Environment Variables

- `OPENROUTER_API_KEY` - Optional. If set, use live OpenRouter-backed chat. If not set, chat falls back to deterministic mock mode.
- `LLM_MOCK` - Optional. Set to `true` to force deterministic mock chat responses.
- `MASSIVE_API_KEY` - Optional. If set, use real market data from Massive API. If not set, use the built-in simulator.
- `SIMULATOR_SEED` - Optional. Integer RNG seed for deterministic simulator runs in tests.

## Development

69 changes: 69 additions & 0 deletions backend/app/chat_api.py
@@ -0,0 +1,69 @@
"""HTTP routes for chat: GET history, POST message."""

from __future__ import annotations

from typing import Any, Literal

from fastapi import APIRouter, Depends, HTTPException, status
from pydantic import BaseModel, Field

from . import db
from .llm import service as llm_service
from .state import AppState, get_state


class ChatPostRequest(BaseModel):
message: str = Field(..., min_length=1, max_length=4000)
allow_trade_execution: bool = False


class ChatMessageResponse(BaseModel):
id: str
role: Literal["user", "assistant"]
content: str
actions: dict[str, Any] | None
created_at: str


class ChatPostResponse(BaseModel):
user_message: ChatMessageResponse
assistant_message: ChatMessageResponse


router = APIRouter(prefix="/api/chat", tags=["chat"])


def _to_response(msg: db.ChatMessage) -> ChatMessageResponse:
return ChatMessageResponse(
id=msg.id,
role=msg.role,
content=msg.content,
actions=msg.actions,
created_at=msg.created_at,
)


@router.get("", response_model=list[ChatMessageResponse])
def get_history(limit: int = 50) -> list[ChatMessageResponse]:
if limit < 1 or limit > 500:
raise HTTPException(
status_code=status.HTTP_400_BAD_REQUEST,
detail="limit must be between 1 and 500",
)
return [_to_response(m) for m in db.chat.list_recent(limit=limit)]


@router.post("", response_model=ChatPostResponse)
async def post_message(
body: ChatPostRequest, state: AppState = Depends(get_state)
) -> ChatPostResponse:
user_row, assistant_row, _ = await llm_service.handle_user_message(
body.message,
state.price_cache,
state.market_source,
allow_trade_execution=body.allow_trade_execution,
)
return ChatPostResponse(
user_message=_to_response(user_row),
assistant_message=_to_response(assistant_row),
)
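`ChatPostRequest` rejects empty or oversized messages before the route body runs, and trade execution defaults to off. The same model in isolation, exercised directly with Pydantic:

```python
from pydantic import BaseModel, Field, ValidationError


class ChatPostRequest(BaseModel):
    message: str = Field(..., min_length=1, max_length=4000)
    allow_trade_execution: bool = False


ok = ChatPostRequest(message="How exposed am I to NVDA?")
assert ok.allow_trade_execution is False  # trades require explicit opt-in

try:
    ChatPostRequest(message="")  # fails min_length=1
except ValidationError:
    print("empty message rejected")
```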
37 changes: 37 additions & 0 deletions backend/app/db/__init__.py
@@ -0,0 +1,37 @@
"""FinAlly persistence layer.

A thin SQLite wrapper. No ORM — just module-level functions per table that take
and return plain dataclasses or primitives. The schema is created and seeded
lazily on first use via `ensure_initialized()`.
"""

from . import chat, positions, profile, snapshots, trades, watchlist
from .chat import ChatMessage
from .init import (
DEFAULT_CASH_BALANCE,
DEFAULT_USER_ID,
DEFAULT_WATCHLIST,
ensure_initialized,
reset_initialization_state,
)
from .positions import Position
from .snapshots import Snapshot
from .trades import Trade

__all__ = [
"ensure_initialized",
"reset_initialization_state",
"DEFAULT_USER_ID",
"DEFAULT_CASH_BALANCE",
"DEFAULT_WATCHLIST",
"ChatMessage",
"Position",
"Snapshot",
"Trade",
"chat",
"positions",
"profile",
"snapshots",
"trades",
"watchlist",
]
77 changes: 77 additions & 0 deletions backend/app/db/chat.py
@@ -0,0 +1,77 @@
"""Chat messages repository."""

from __future__ import annotations

import json
import uuid
from dataclasses import dataclass
from datetime import UTC, datetime
from typing import Any, Literal

from .connection import connect
from .init import DEFAULT_USER_ID

Role = Literal["user", "assistant"]


@dataclass(frozen=True)
class ChatMessage:
id: str
role: Role
content: str
actions: dict[str, Any] | None
created_at: str


def _now() -> str:
return datetime.now(UTC).isoformat()


def append_message(
role: Role,
content: str,
actions: dict[str, Any] | None = None,
user_id: str = DEFAULT_USER_ID,
) -> ChatMessage:
msg = ChatMessage(
id=str(uuid.uuid4()),
role=role,
content=content,
actions=actions,
created_at=_now(),
)
with connect() as conn:
conn.execute(
"INSERT INTO chat_messages (id, user_id, role, content, actions, created_at) "
"VALUES (?, ?, ?, ?, ?, ?)",
(
msg.id, user_id, msg.role, msg.content,
json.dumps(actions) if actions is not None else None,
msg.created_at,
),
)
return msg


def list_recent(limit: int = 50, user_id: str = DEFAULT_USER_ID) -> list[ChatMessage]:
"""Return up to `limit` most recent messages, oldest-first (chat-render order)."""
with connect() as conn:
rows = conn.execute(
"SELECT id, role, content, actions, created_at FROM chat_messages "
"WHERE user_id = ? ORDER BY created_at DESC LIMIT ?",
(user_id, limit),
).fetchall()

out: list[ChatMessage] = []
for r in reversed(rows):
actions = json.loads(r["actions"]) if r["actions"] else None
out.append(
ChatMessage(
id=r["id"],
role=r["role"],
content=r["content"],
actions=actions,
created_at=r["created_at"],
)
)
return out
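`list_recent` fetches the newest rows with `ORDER BY created_at DESC LIMIT ?`, then reverses them so the caller gets oldest-first render order. A self-contained demonstration of that pattern against an in-memory table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.row_factory = sqlite3.Row
conn.execute(
    "CREATE TABLE chat_messages (id TEXT, user_id TEXT, role TEXT, "
    "content TEXT, created_at TEXT)"
)
conn.executemany(
    "INSERT INTO chat_messages VALUES (?, ?, ?, ?, ?)",
    [
        ("1", "default", "user", "first", "2024-01-01T00:00:00+00:00"),
        ("2", "default", "assistant", "second", "2024-01-01T00:00:01+00:00"),
        ("3", "default", "user", "third", "2024-01-01T00:00:02+00:00"),
    ],
)

# Newest two rows, then reversed into chat-render order.
newest_two = conn.execute(
    "SELECT content FROM chat_messages WHERE user_id = ? "
    "ORDER BY created_at DESC LIMIT ?",
    ("default", 2),
).fetchall()
render_order = [r["content"] for r in reversed(newest_two)]
assert render_order == ["second", "third"]
```

Reversing in Python keeps the SQL simple: the `LIMIT` applies to the newest messages, which a subquery-based oldest-first query would otherwise need to express.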
39 changes: 39 additions & 0 deletions backend/app/db/connection.py
@@ -0,0 +1,39 @@
"""SQLite connection management."""

from __future__ import annotations

import os
import sqlite3
from collections.abc import Iterator
from contextlib import contextmanager
from pathlib import Path

DEFAULT_DB_PATH = "db/finally.db"
DB_PATH_ENV = "FINALLY_DB_PATH"


def get_db_path() -> str:
"""Resolve the SQLite file path. Honors $FINALLY_DB_PATH, else 'db/finally.db'."""
return os.environ.get(DB_PATH_ENV, DEFAULT_DB_PATH)


def _connect(path: str) -> sqlite3.Connection:
"""Open a connection with WAL + foreign keys enabled. Creates parent dirs as needed."""
if path != ":memory:":
Path(path).parent.mkdir(parents=True, exist_ok=True)
conn = sqlite3.connect(path, check_same_thread=False, isolation_level=None)
conn.row_factory = sqlite3.Row
conn.execute("PRAGMA foreign_keys = ON")
if path != ":memory:":
conn.execute("PRAGMA journal_mode = WAL")
return conn


@contextmanager
def connect(path: str | None = None) -> Iterator[sqlite3.Connection]:
"""Yield a connection scoped to a single operation. Closes on exit."""
conn = _connect(path or get_db_path())
try:
yield conn
finally:
conn.close()
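`_connect` passes `isolation_level=None`, which puts `sqlite3` in autocommit mode: each statement commits immediately, so the per-operation `connect()` scope never leaves an open transaction behind. A stripped-down illustration of that choice (omitting the WAL and foreign-key pragmas):

```python
import sqlite3
import tempfile
from collections.abc import Iterator
from contextlib import contextmanager
from pathlib import Path


@contextmanager
def connect(path: Path) -> Iterator[sqlite3.Connection]:
    # isolation_level=None -> autocommit: writes persist with no conn.commit()
    conn = sqlite3.connect(path, isolation_level=None)
    conn.row_factory = sqlite3.Row
    try:
        yield conn
    finally:
        conn.close()


db_path = Path(tempfile.mkdtemp()) / "demo.db"
with connect(db_path) as conn:
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.execute("INSERT INTO t VALUES (42)")  # no explicit commit

with connect(db_path) as conn:  # a fresh connection still sees the row
    assert conn.execute("SELECT x FROM t").fetchone()["x"] == 42
```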