Providing Lora Adapter path #286
Unanswered · selectorseb asked this question in Q&A
Replies: 0 comments
How can I provide the path to my trained LoRA adapters?
When running server_vllm.py, what would be the equivalent of `--enable-lora --lora-modules function-lora=/root/functionary/OUTPUT_LORA` from vLLM, so I can attach the LoRA adapters to the model? I see the `--enable-lora` parameter as an option, but not `--lora-modules`.
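For reference, this is a sketch of how those two flags are passed to vLLM's own OpenAI-compatible server, which is the behavior the question is asking server_vllm.py to match. The model name below is a placeholder (not from the question); the adapter name and path are the ones quoted above.

```shell
# Sketch: vLLM's OpenAI-compatible server with a LoRA adapter attached.
# --lora-modules takes one or more name=path pairs; the registered name
# ("function-lora" here) is then usable as the model name in API requests.
python -m vllm.entrypoints.openai.api_server \
    --model <base-model>  \
    --enable-lora \
    --lora-modules function-lora=/root/functionary/OUTPUT_LORA
```

Replace `<base-model>` with the actual base model identifier; the open question is whether server_vllm.py exposes an equivalent of `--lora-modules` alongside its `--enable-lora` option.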