Replies: 1 comment
-
Or, could I ask how to use a model from the Hugging Face Transformers library on my local machine?
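For reference, here is a minimal, unofficial sketch of loading a Transformers model on a local machine, assuming `transformers` and `torch` are installed. `facebook/opt-125m` is only a placeholder checkpoint; a path to a local directory of downloaded weights can be passed to `from_pretrained` instead for fully offline use.

```python
# Minimal sketch (assumptions: transformers + torch installed,
# weights either downloaded on first call or already cached locally).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-125m"  # placeholder; a local directory path also works

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tokenize a prompt and generate a short continuation.
inputs = tokenizer("Hello, my name is", return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```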
-
I want to use the repository's model files, which implement the forward pass of the model, such as '/vllm/vllm/models/opt.py', but I don't know how to use them. May I ask how these default (model_name).py files are meant to be used?
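For reference, a minimal, unofficial sketch of how those per-model files are usually exercised. As far as I understand, they are not imported and called directly; the high-level `LLM` entry point selects the matching implementation (e.g. the OPT one) from the Hugging Face model config and runs its forward pass inside the engine. `facebook/opt-125m` below is only an example checkpoint.

```python
# Minimal sketch (assumptions: vLLM installed via `pip install vllm`;
# the engine dispatches to the repository's OPT model code based on the
# architecture declared in the checkpoint's config).
from vllm import LLM, SamplingParams

prompts = ["Hello, my name is"]
sampling_params = SamplingParams(temperature=0.8, top_p=0.95, max_tokens=64)

# Loading an OPT checkpoint routes generation through the OPT implementation.
llm = LLM(model="facebook/opt-125m")

outputs = llm.generate(prompts, sampling_params)
for output in outputs:
    print(output.outputs[0].text)
```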