
rag_time

A simple chat UI for asking questions about a private codebase using locally run LLMs.

Technology: Python, Ollama, and Chainlit.

Getting started

Prerequisites

  1. Make sure you have Python 3.9 or later installed

  2. Download and install Ollama

  3. Pull the model:

    ollama pull llama3.2:3b
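
A quick way to confirm the first two prerequisites is a small check script. This is a sketch, not part of the repo — it only verifies the Python version and that the ollama binary is on the PATH:

```python
# Minimal prerequisite check (a sketch; the repo itself does not ship this).
import shutil
import sys

def python_ok(min_version=(3, 9)):
    """Return True if the running interpreter meets the minimum version."""
    return sys.version_info[:2] >= min_version

def ollama_installed():
    """Return True if the `ollama` binary is on the PATH."""
    return shutil.which("ollama") is not None

if __name__ == "__main__":
    print("Python 3.9+:", python_ok())
    print("Ollama on PATH:", ollama_installed())
```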

Run the Chat bot

  1. Create a Python virtual environment and activate it:

    python3 -m venv .venv && source .venv/bin/activate
  2. Install Python dependencies:

    pip install -r requirements.txt
  3. Clone an example repository for the chat bot to answer questions about:

    git clone https://github.com/discourse/discourse
  4. Set up the vector database:

    python ingest-code.py
  5. Start the chat bot:

    chainlit run main.py
  6. To exit the Python virtual environment after you are done, run:

    deactivate
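
Step 4 is where ingest-code.py reads the cloned repository and builds the vector database. The actual implementation lives in this repo; as a rough illustration, here is a hypothetical sketch of the chunking half of such a pipeline, assuming fixed-size character chunks with overlap (the real script's chunking strategy and vector store may differ):

```python
# Hypothetical sketch of the chunking stage of an ingestion pipeline like
# ingest-code.py: walk a repo and split source files into overlapping chunks.
# Chunk size, overlap, and file extensions are illustrative assumptions.
from pathlib import Path

def chunk_text(text, size=1000, overlap=200):
    """Split text into fixed-size chunks that overlap by `overlap` chars."""
    chunks = []
    step = size - overlap
    for start in range(0, max(len(text), 1), step):
        chunks.append(text[start:start + size])
    return chunks

def chunk_repo(repo_dir, extensions=(".py", ".rb", ".js")):
    """Yield (path, chunk) pairs for every matching source file."""
    for path in Path(repo_dir).rglob("*"):
        if path.is_file() and path.suffix in extensions:
            text = path.read_text(errors="ignore")
            for chunk in chunk_text(text):
                yield str(path), chunk
```

The other half of the pipeline — embedding each chunk and writing it to the vector store — is what the real script handles before the chat bot can retrieve relevant code at question time.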

Make it your own

Edit the .env file to point the chat bot at your own codebase and programming language.
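
The exact variable names are defined by the scripts in this repo, so check the shipped .env for the real keys. A hypothetical example, assuming the file holds the repository path and language:

```shell
# Hypothetical .env values - the actual keys are defined by this repo's
# scripts; consult the shipped .env file for the real names.
REPO_PATH=./discourse   # path to the codebase to ingest
LANGUAGE=ruby           # primary language of the codebase
```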