This repository was archived by the owner on Sep 10, 2025. It is now read-only.
🚀 The feature, motivation and pitch
We would like to make `torchtune` an optional dependency. The first step towards that is to avoid importing `torchtune` unless it is actively used. To make this migration easier, let's move the top-level imports into the functions/classes that require them.
Here's an example where we delay imports:

torchchat/torchchat/usages/eval.py, lines 216 to 225 at 1384f7d
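The deferred-import pattern can be sketched as follows. This is an illustrative sketch, not actual torchchat code: the function names and the error message are hypothetical, and only the import placement reflects the pattern the issue describes.

```python
# Sketch of the deferred-import pattern: the optional dependency is
# imported inside the function that needs it, not at module top level,
# so importing this module succeeds even without torchtune installed.

def evaluate_model():
    # Hypothetical eval entry point that actually needs torchtune.
    try:
        import torchtune  # deferred import of the optional dependency
    except ImportError as exc:
        raise RuntimeError(
            "evaluation requires torchtune; install it with `pip install torchtune`"
        ) from exc
    return torchtune


def generate_text():
    # Hypothetical code path that never touches torchtune,
    # so it works whether or not torchtune is installed.
    return "generated"
```

With top-level imports removed, commands like `generate` no longer pay the import cost (or hard dependency) of `torchtune`; only the code paths that use it do.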
Task: Update all imports of `torchtune` in the repo, such that imports are only done when necessary.

To test your changes, run: `python torchchat.py generate llama3.2-1B`

Alternatives
No response
Additional context
No response
RFC (Optional)
No response