Conversation

@makelinux (Contributor)
Allow users to specify the inference model through the INFERENCE_MODEL
environment variable instead of hardcoding it, with fallback to
ollama/llama3.2:3b if not set.

Signed-off-by: Costa Shulyupin <[email protected]>
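A minimal sketch of what this change describes, assuming a plain `os.environ` lookup (the helper name `get_inference_model` and constant name are illustrative, not necessarily the identifiers used in the actual diff):

```python
import os

# Previous hardcoded default, now only a fallback.
DEFAULT_INFERENCE_MODEL = "ollama/llama3.2:3b"

def get_inference_model() -> str:
    """Return the model id from the INFERENCE_MODEL environment
    variable, falling back to the default when it is unset."""
    return os.environ.get("INFERENCE_MODEL", DEFAULT_INFERENCE_MODEL)
```

With this pattern, `INFERENCE_MODEL=ollama/llama3.1:8b llama stack run ...` would override the model without editing any source.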

@meta-cla (bot) added the "CLA Signed" label on Dec 13, 2025.
@leseb leseb merged commit 2b85600 into llamastack:main Dec 15, 2025
6 checks passed
r-bit-rry pushed a commit to r-bit-rry/llama-stack that referenced this pull request Dec 23, 2025
