Description
Is your feature request related to a problem? Please describe.
I don't directly use any of the listed AI providers: OpenAI, Anthropic, DeepSeek, Gemini. I use OpenRouter, which, like many meta-providers, exposes an OpenAI-compatible API endpoint. Through OpenRouter I can access whichever LLMs I choose. In the past I've also used self-hosted apps like LiteLLM for the same purpose: acting as an API gateway, a single place to manage access to the underlying LLMs. I would like to be able to use those same LLMs in this way with the AI Assistant in Pulse.
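For context, here's a minimal sketch of what "OpenAI-compatible" means in practice: the request below targets OpenRouter's base URL but uses the standard OpenAI chat-completions route and payload shape. The model slug and environment variable name are just examples, not anything Pulse defines.

```go
package main

import (
	"bytes"
	"fmt"
	"io"
	"net/http"
	"os"
)

func main() {
	// OpenRouter serves the same /chat/completions route and request
	// shape as the OpenAI API; only the base URL and API key differ.
	body := []byte(`{
		"model": "openai/gpt-4o-mini",
		"messages": [{"role": "user", "content": "Hello"}]
	}`)

	req, err := http.NewRequest("POST",
		"https://openrouter.ai/api/v1/chat/completions",
		bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+os.Getenv("OPENROUTER_API_KEY"))
	req.Header.Set("Content-Type", "application/json")

	resp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// The response body follows the OpenAI chat-completion format.
	out, _ := io.ReadAll(resp.Body)
	fmt.Println(string(out))
}
```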
Describe the solution you'd like
Add another option under the AI provider configuration for user-defined OpenAI-compatible endpoints. The user supplies the base URL (in my case it'd be "https://openrouter.ai/api/v1") and an API key, and from there it should function the same as if the user had selected the OpenAI provider directly. See the sketch below.
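To illustrate the idea, here's a hypothetical sketch (the names `ProviderConfig` and `endpoint` are mine, not Pulse's actual code): the existing OpenAI client logic should only need a configurable base URL, so the same request/response code path could serve OpenAI, OpenRouter, LiteLLM, or any other compatible gateway.

```go
// Hypothetical sketch only; assumes Pulse's OpenAI provider builds
// request URLs from a fixed base that could instead be user-supplied.
package provider

type ProviderConfig struct {
	BaseURL string // e.g. "https://api.openai.com/v1" or "https://openrouter.ai/api/v1"
	APIKey  string
}

// endpoint builds the chat-completions URL from whatever base the user
// configured, so one code path handles OpenAI and meta-providers alike.
func endpoint(c ProviderConfig) string {
	return c.BaseURL + "/chat/completions"
}
```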
Describe alternatives you've considered
I tried adding the OpenRouter endpoint to the Ollama field but couldn't get it to work, which isn't surprising: Ollama's native API differs from the OpenAI-compatible format.
Additional context
It seems like supporting meta-providers in this way would reduce the need for the team to keep adding individual providers to the app.