
Conversation

@leseb commented Oct 6, 2025 (Collaborator)

llama-stack 0.3.0 (with llamastack/llama-stack#3625) forces us to use "llama stack run"; the server module no longer starts the server.

Signed-off-by: Sébastien Han <[email protected]>
The review comment below refers to this startup-script snippet from the diff:

if python -c "
# Determine which CLI to use based on llama-stack version
VERSION_CODE=$(python -c "
Can we make the script itself a shell HERE document, like:

read -r -d '' detect_version_script <<EOT
....
EOT

VERSION_CODE=$(python -c "$detect_version_script")

That makes formatting etc. easier, I guess. You could also use single quotes around EOT to avoid variable expansion.
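
For illustration only, a minimal sketch of that suggestion, assuming python and the llama-stack package are available in the image; the variable names, output markers, and the 0.3.0 boundary below are illustrative, not the operator's actual code:

# Hypothetical sketch: load the detection script via a single-quoted HERE
# document (so the shell performs no expansion on the Python body), then run
# it with python. Note that read -d '' exits non-zero at EOF, which is expected.
read -r -d '' detect_version_script <<'EOT'
import importlib.metadata

# Distribution name assumed; 0.3.0 is the boundary discussed in this PR.
version = importlib.metadata.version("llama-stack")
major, minor = (int(p) for p in version.split(".")[:2])
print("new-cli" if (major, minor) >= (0, 3) else "server-module")
EOT

VERSION_CODE=$(python -c "$detect_version_script")
echo "detected llama-stack startup mode: $VERSION_CODE"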

Btw, I'm not super happy with having shell scripts that call Python code embedded in Go binaries, including hard-coded version numbers.

Also, I wonder why this isn't part of the distribution image, as the distribution image knows best how to start itself up?

A follow-up comment on the same review thread (Collaborator):

It's OK if we skip the HERE document for now, as it doesn't make this hack much prettier anyway :)

@rhuss left a comment (Collaborator):

The particular change looks good to me, but I really wonder why we need to calculate the start command in the controller instead of letting the distribution image start up on its own, since it knows best how to start itself?

@leseb commented Oct 13, 2025 (Collaborator, Author)

> The particular change looks good to me, but I really wonder why we need to calculate the start command in the controller instead of letting the distribution image start up on its own, since it knows best how to start itself?

This handles the case where we use a ConfigMap to store the run.yaml: when that happens, we need to override the entrypoint to pass the path of the run.yaml file.
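
Concretely, and only as a sketch (the mount path, the old server-module path, and its config flag are assumptions for illustration, not taken from the operator), the overridden start command differs between releases roughly like this:

# run.yaml assumed to be mounted from the ConfigMap at /etc/llama-stack/run.yaml

# llama-stack >= 0.3.0: only the "llama stack run" CLI entrypoint works
# (see llamastack/llama-stack#3625)
llama stack run /etc/llama-stack/run.yaml

# pre-0.3.0: the server module could still be invoked directly
# (module path and --config flag assumed for illustration)
python -m llama_stack.distribution.server.server --config /etc/llama-stack/run.yaml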

@rhuss commented Oct 13, 2025 (Collaborator)

I see the use case, but I don't think this is the only or best solution. Alternatively, the startup script (or heck, even lls run itself) could check for a run.yaml mounted somewhere and pick it up if present. That startup script can perfectly well live within the distribution image. The entrypoint should be stable and take no arguments (maybe the location of the run.yaml, if we want to make it explicit). That way we wouldn't have to change the operator, only the distribution image's startup script, when there are changes like these.
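
For what it's worth, a sketch of what such an in-image entrypoint could look like (the paths, the LLAMA_STACK_CONFIG variable, and the fallback location are assumptions, not an existing script in the distribution image):

#!/usr/bin/env sh
# Hypothetical entrypoint baked into the distribution image: prefer a mounted
# run.yaml if one is present, otherwise fall back to the image's bundled config.
MOUNTED_CONFIG="${LLAMA_STACK_CONFIG:-/etc/llama-stack/run.yaml}"

if [ -f "$MOUNTED_CONFIG" ]; then
    echo "Using mounted run.yaml at $MOUNTED_CONFIG"
    exec llama stack run "$MOUNTED_CONFIG"
else
    echo "No mounted run.yaml found, falling back to the image default"
    exec llama stack run /opt/llama-stack/run.yaml
fi

With something like this in the image, the operator would only need to mount the ConfigMap; the image itself decides how to start.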

Not something that should be fixed here though, so fine with this PR.

@VaishnaviHire (Collaborator) commented:

/lgtm

+1 for moving it to the distribution image.

@nathan-weinberg left a comment (Collaborator):

+1 as well; if we can open a tracker for that, I can take a look at doing it.

@leseb merged commit 90365e7 into llamastack:main on Oct 16, 2025
6 checks passed
@leseb deleted the new-cli branch on October 16, 2025 at 14:58
@leseb commented Oct 16, 2025 (Collaborator, Author)

> +1 as well; if we can open a tracker for that, I can take a look at doing it.

Do you mind doing it for me, please, @nathan-weinberg? I can work with you on what needs to be done. Thanks for your help!

VaishnaviHire pushed a commit to VaishnaviHire/llama-stack-k8s-operator that referenced this pull request Oct 17, 2025
(cherry picked from commit 90365e7)