Add optional OpenAI configuration options (OPENAI_BASE_URL, SMART_LLM, FAST_LLM) #864
Conversation
Vercel for Git: 1 Skipped Deployment
Commit: Add optional OpenAI configuration options (OPENAI_BASE_URL, SMART_LLM, FAST_LLM). Co-authored-by: egelhaus <[email protected]>
Walkthrough
The changes introduce dynamic configuration of the OpenAI model selection and API endpoint via environment variables. Hardcoded model names and base URLs are replaced with values read from environment variables, falling back to the previous defaults when they are not set.
Sequence Diagram(s)
```mermaid
sequenceDiagram
    participant User
    participant API_Controller
    participant OpenaiService
    participant OpenAI_API
    User->>API_Controller: Send chat/completion request
    API_Controller->>OpenaiService: Forward request
    OpenaiService->>OpenaiService: Select model (from env or default)
    OpenaiService->>OpenAI_API: Send request (with model, baseURL from env)
    OpenAI_API-->>OpenaiService: Return response
    OpenaiService-->>API_Controller: Return response
    API_Controller-->>User: Return response
```
I'd say the AI did a pretty good job.
@coderabbitai review
✅ Actions performed: Review triggered.
This PR implements the requested feature: optional OpenAI configuration options that enable using alternative OpenAI-compatible providers (such as OpenRouter) and, through them, models like Claude, Gemini, and others.
Changes Made
New Environment Variables
Added three new optional environment variables to .env.example (a sketch of the additions follows this list):
- OPENAI_BASE_URL: Configure alternative OpenAI-compatible endpoints
- SMART_LLM: Model for complex AI tasks (defaults to gpt-4.1)
- FAST_LLM: Model for simple/fast AI tasks (defaults to gpt-4o-mini)
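Roughly, the .env.example additions could look like the following; the variable names and defaults come from this PR, while the comments and layout are illustrative:

```env
# Optional: point the OpenAI client at any OpenAI-compatible endpoint.
# Leave unset to use the official OpenAI API.
OPENAI_BASE_URL=

# Optional: model for complex AI tasks (falls back to gpt-4.1).
SMART_LLM=

# Optional: model for simple/fast AI tasks (falls back to gpt-4o-mini).
FAST_LLM=
```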
Updated OpenAI Service
- Added baseURL support to the OpenAI client initialization
- Added getSmartModel() and getFastModel() helpers for dynamic model selection
- Model names are read from SMART_LLM and FAST_LLM, with fallbacks to the previous defaults (see the sketch below)
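As a minimal sketch of how the service could wire this together (the class and method names follow the description above, but the surrounding structure and the complete() helper are assumptions, not the actual diff):

```typescript
import OpenAI from 'openai';

export class OpenaiService {
  private client: OpenAI;

  constructor() {
    this.client = new OpenAI({
      apiKey: process.env.OPENAI_API_KEY,
      // Only override the endpoint when OPENAI_BASE_URL is set,
      // so the default OpenAI API keeps working unchanged.
      ...(process.env.OPENAI_BASE_URL
        ? { baseURL: process.env.OPENAI_BASE_URL }
        : {}),
    });
  }

  // Model for complex tasks, with a fallback to the previous default.
  getSmartModel(): string {
    return process.env.SMART_LLM || 'gpt-4.1';
  }

  // Model for simple/fast tasks, with a fallback to the previous default.
  getFastModel(): string {
    return process.env.FAST_LLM || 'gpt-4o-mini';
  }

  // Example call path: pick the model per request instead of hardcoding it.
  async complete(prompt: string, smart = false): Promise<string> {
    const response = await this.client.chat.completions.create({
      model: smart ? this.getSmartModel() : this.getFastModel(),
      messages: [{ role: 'user', content: prompt }],
    });
    return response.choices[0]?.message?.content ?? '';
  }
}
```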
Updated Controllers and Services
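For illustration only (the Express-style controller, its route handling, and the complete() helper are assumptions rather than the PR's actual code), call sites now delegate model selection to the service instead of passing a hardcoded model name:

```typescript
import { Request, Response } from 'express';
import { OpenaiService } from './openai.service';

export class ChatController {
  constructor(private readonly openaiService = new OpenaiService()) {}

  // The controller no longer knows about concrete model names;
  // it only signals whether the task needs the "smart" model.
  async chat(req: Request, res: Response): Promise<void> {
    const answer = await this.openaiService.complete(req.body.prompt, true);
    res.json({ answer });
  }
}
```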
Usage Examples
Default OpenAI Configuration
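For example, leaving the new variables unset keeps the current behaviour; only the API key is required and the defaults above apply:

```env
OPENAI_API_KEY=your-openai-key
# OPENAI_BASE_URL, SMART_LLM and FAST_LLM unset:
# requests go to the official OpenAI API using gpt-4.1 / gpt-4o-mini
```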
OpenRouter Configuration
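A plausible OpenRouter setup (the base URL and model identifiers are illustrative; check OpenRouter's model list for the exact slugs you want):

```env
OPENAI_API_KEY=your-openrouter-key
OPENAI_BASE_URL=https://openrouter.ai/api/v1
SMART_LLM=openai/gpt-4.1
FAST_LLM=openai/gpt-4o-mini
```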
Claude via OpenRouter
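Routing both model slots to Claude through OpenRouter might then look like this (model slugs are assumptions; use whichever Claude variants your account exposes):

```env
OPENAI_API_KEY=your-openrouter-key
OPENAI_BASE_URL=https://openrouter.ai/api/v1
SMART_LLM=anthropic/claude-3.5-sonnet
FAST_LLM=anthropic/claude-3-haiku
```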
Benefits
✅ Provider Flexibility: Use any OpenAI-compatible API (OpenRouter, Azure OpenAI, etc.)
✅ Cost Optimization: Choose appropriate models for task complexity
✅ Easy Configuration: Change providers without code modifications
✅ Backward Compatibility: Works with existing configurations
✅ Smart Defaults: Fallbacks ensure system works even without new variables
Implementation Details
The implementation maintains full backward compatibility - if the new environment variables are not set, the service falls back to the original hardcoded models.
Fixes #498.
Warning
Firewall rules blocked me from connecting to one or more addresses
I tried to connect to the following addresses, but was blocked by firewall rules:
binaries.prisma.sh
checkpoint.prisma.io
If you need me to access, download, or install something from one of these locations, you can either: