# adalora

Here are 9 public repositories matching this topic...


Elixir port of HuggingFace's PEFT (Parameter-Efficient Fine-Tuning) library. Implements LoRA, AdaLoRA, IA3, prefix tuning, prompt tuning, and 30+ state-of-the-art PEFT methods for efficient neural network adaptation. Built for the BEAM ecosystem with native Nx/Axon integration.

  • Updated Jan 1, 2026
  • Jupyter Notebook
NSIO_NeuralSearchIndexingOptimization

Optimizing the Differentiable Search Index (DSI) with data augmentation (Num2Word, Stopwords Removal, POS-MLM) and parameter-efficient fine-tuning (LoRA, QLoRA, AdaLoRA, ConvoLoRA), improving retrieval accuracy and efficiency while reducing memory and computational overhead. Evaluated on the MS MARCO dataset for scalable performance.

  • Updated Apr 3, 2025
  • Jupyter Notebook
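The first description above references AdaLoRA as implemented in HuggingFace's PEFT library. Below is a minimal sketch of what an AdaLoRA setup looks like with that API; the base model, hyperparameter values, and step counts are illustrative assumptions, not settings taken from either repository.

```python
# Minimal sketch: configuring AdaLoRA with HuggingFace's PEFT library.
# The base model and all hyperparameter values below are illustrative assumptions.
from transformers import AutoModelForSequenceClassification
from peft import AdaLoraConfig, TaskType, get_peft_model

base_model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2
)

# AdaLoRA starts each adapter at rank init_r and prunes the rank budget down
# toward an average of target_r between steps tinit and tfinal.
adalora_config = AdaLoraConfig(
    task_type=TaskType.SEQ_CLS,
    init_r=12,        # initial rank of every low-rank update
    target_r=4,       # average rank budget after pruning
    tinit=200,        # step at which budget pruning starts
    tfinal=1000,      # step at which the budget schedule ends
    deltaT=10,        # reallocate ranks every deltaT steps
    lora_alpha=32,
    lora_dropout=0.1,
    total_step=2000,  # total training steps (expected by recent PEFT releases)
)

model = get_peft_model(base_model, adalora_config)
model.print_trainable_parameters()  # only the adapter parameters are trainable

# During training, the rank allocator is advanced once per optimizer step:
# model.base_model.update_and_allocate(global_step)
```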
