
Textbook AI Assistant


An interactive AI tutor for textbooks using RAG (Retrieval-Augmented Generation). Ingests PDF or web sources into a ChromaDB vector store, then answers questions via a LlamaIndex chat engine (interactive CLI or Chainlit web UI).

(Screenshot: Chainlit web UI)

Features

  • Multi-source ingestion -- PDF files (with OCR and table extraction via Docling) and web URLs
  • Pluggable embeddings -- HuggingFace (local, free) or OpenAI
  • Configurable LLM -- Any OpenAI-compatible API (Ollama, OpenAI, vLLM, etc.)
  • Persistent vector store -- ChromaDB stores embeddings locally, so re-ingestion is skipped on subsequent runs
  • Dual interfaces -- Interactive CLI for quick questions, Chainlit web UI for a richer experience
  • Advanced retrieval -- MMR, default, or hybrid search modes with configurable top-k
  • Flexible configuration -- YAML defaults, user config overrides, environment variables, and CLI flags

Documentation

Full documentation is available at pierreexeter.github.io/textbook-AI-assistant.

  • Getting Started -- Install, pull a model, and ask your first question
  • Usage -- CLI flags, web UI, LLM and embedding provider switching
  • Configuration -- Full reference for all config fields
  • Architecture -- System overview, data flow, and design decisions
  • Contributing -- Dev setup, testing, code style, and extension guides

Quick Start

Requires Python 3.11+, uv, and Ollama (or any OpenAI-compatible API). See Getting Started for full details.

git clone https://github.com/PierreExeter/textbook-AI-assistant.git
cd textbook-AI-assistant
uv sync
cp .env.example .env
ollama pull llama3.2

Usage

CLI

# Ask questions about a PDF textbook
uv run textbook-ai textbook/attention-is-all-you-need.pdf

# Ask questions about a web page
uv run textbook-ai https://example.com/article

See Usage for all CLI flags and provider recipes.
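As an illustrative sketch of provider switching, pointing the assistant at a local Ollama server might look like the following. The `LLM__API_BASE` and `LLM__MODEL` key names are assumptions (only `SOURCE__PATH` is confirmed in this README); check the Usage and Configuration docs for the actual field names.

```shell
# Hypothetical example: route LLM calls to a local Ollama server via its
# OpenAI-compatible endpoint. Key names below are illustrative, not
# confirmed by this README -- verify against the Configuration reference.
export TEXTBOOK_AI_LLM__API_BASE=http://localhost:11434/v1
export TEXTBOOK_AI_LLM__MODEL=llama3.2

uv run textbook-ai textbook/attention-is-all-you-need.pdf
```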

Web UI

TEXTBOOK_AI_SOURCE__PATH=textbook/attention-is-all-you-need.pdf uv run chainlit run src/textbook_ai/chainlit_app.py

Configuration

Configuration is resolved in priority order: CLI flags > env vars > user YAML > defaults. Any field can be overridden via TEXTBOOK_AI_SECTION__KEY environment variables. See Configuration for the full reference.
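For example, following the `TEXTBOOK_AI_SECTION__KEY` pattern (only `SOURCE__PATH` is confirmed earlier in this README; the retrieval keys below are illustrative guesses for the search mode and top-k options mentioned under Features):

```shell
# Confirmed in this README:
TEXTBOOK_AI_SOURCE__PATH=textbook/attention-is-all-you-need.pdf

# Illustrative only -- assumed section/key names; verify against the
# Configuration reference before use:
TEXTBOOK_AI_RETRIEVAL__SEARCH_MODE=mmr
TEXTBOOK_AI_RETRIEVAL__TOP_K=5
```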

Security Notes

  • Never commit .env or API keys
  • The .gitignore excludes .env and the .textbook_ai_index/ data directory
  • When using Ollama locally, no API keys leave your machine

License

GPL-3.0

Contributing

Contributions are welcome! See the Contributing Guide for dev setup, testing, and code style.
