pSciComp/exoLLMTraining
LLM Training Exercise

This is an LLM training exercise, written in Python, that steps through the practical work required to train an LLM locally or on an HPC cluster.

Concretely, we use LoRA (QLoRA, to be precise) to fine-tune Apertus 8B on the Swiss Code of Obligations.

Admittedly, the fine-tuning is configured in a somewhat crude manner. The goal is not to build a tool we could actually sell to a lawyer, but to understand how to set up a training run properly.
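As a reminder of what LoRA does mathematically, the sketch below applies the low-rank update rule W' = W + (alpha/r) · B·A to a toy weight matrix. The dimensions and values are illustrative only and do not come from this repository; the actual exercises apply this idea to Apertus 8B via a quantised (QLoRA) setup.

```python
# Toy illustration of the LoRA update rule (hypothetical numbers).
# In LoRA, the frozen base weight W (d x k) is augmented by two small
# trainable matrices: B (d x r) and A (r x k), with rank r << d, k.

def matmul(X, Y):
    """Plain-Python matrix multiply for small nested lists."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*Y)]
            for row in X]

r = 1            # adapter rank
alpha = 2        # LoRA scaling numerator (scale = alpha / r)
W = [[1.0, 0.0],
     [0.0, 1.0]]             # frozen base weight (2 x 2)
B = [[0.5],
     [0.0]]                  # trainable down-projection (2 x r)
A = [[0.0, 1.0]]             # trainable up-projection (r x 2)

delta = matmul(B, A)         # low-rank update B @ A
scale = alpha / r
W_adapted = [[w + scale * d for w, d in zip(w_row, d_row)]
             for w_row, d_row in zip(W, delta)]
print(W_adapted)             # → [[1.0, 1.0], [0.0, 1.0]]
```

Only B and A are trained, so the number of trainable parameters scales with r rather than with the full weight dimensions; QLoRA additionally keeps the frozen W in a quantised (e.g. 4-bit) representation.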

Usage

This project contains various exercises defined under ./exercises. To get started, head over to Exercise 0, which will guide you through the initial setup.

Further exercises are categorised into:

Structure
Exercises specific to project structure and coding practices.

HPC
Exercises that focus on the usage of an HPC cluster (e.g. Slurm).

Exercises

Structure

HPC
