README.md (2 additions, 3 deletions)

@@ -80,11 +80,10 @@ spec:
   replicas: 1
   server:
     distribution:
-      name: ollama
+      name: starter
     containerSpec:
       port: 8321
       env:
-      - name: INFERENCE_MODEL
+      - name: OLLAMA_INFERENCE_MODEL
         value: "llama3.2:1b"
       - name: OLLAMA_URL
         value: "http://ollama-server-service.ollama-dist.svc.cluster.local:11434"
config/samples/_v1alpha1_llamastackdistribution.yaml (2 additions, 2 deletions)

@@ -7,13 +7,13 @@ spec:
   server:
     containerSpec:
       env:
-      - name: INFERENCE_MODEL
+      - name: OLLAMA_INFERENCE_MODEL
         value: 'llama3.2:1b'
       - name: OLLAMA_URL
         value: 'http://ollama-server-service.ollama-dist.svc.cluster.local:11434'
     name: llama-stack
     distribution:
-      name: ollama
+      name: starter
   # Uncomment the storage section to use persistent storage
   # storage: {} # Will use default size of 10Gi and default mount path of /.llama
   # Or specify custom values:
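
The commented storage block above can be enabled directly. A minimal sketch of the two variants described in the comments, using the default noted here and the explicit values that appear in the new example-withoutconfigmpa.yaml sample later in this PR:

```yaml
# Variant 1: defaults (size 10Gi, mounted at /.llama)
storage: {}

# Variant 2: explicit values (matching the new sample in this PR)
storage:
  size: "10Gi"
  mountPath: "/home/lls/.lls"
```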
config/samples/example-with-configmap.yaml (5 additions, 7 deletions)

Review comment (Collaborator): If #156 merges before this PR, this file can be deleted.

@@ -6,7 +6,7 @@ data:
   run.yaml: |
     # Llama Stack Configuration
     version: '2'
-    image_name: ollama
+    image_name: starter
     apis:
     - inference
     providers:
@@ -25,19 +25,17 @@ data:
 apiVersion: llamastack.io/v1alpha1
 kind: LlamaStackDistribution
 metadata:
-  name: llamastack-with-config
+  name: llamastack-with-userconfig
 spec:
   replicas: 1
   server:
     distribution:
-      name: ollama
+      name: starter
     containerSpec:
       port: 8321
       env:
-      - name: INFERENCE_MODEL
-        value: "llama3.2:1b"
-      - name: OLLAMA_URL
-        value: "http://ollama-server-service.ollama-dist.svc.cluster.local:11434"
+      - name: OLLAMA_EMBEDDING_MODEL
+        value: all-minilm:l6-v2
     userConfig:
       configMapName: llama-stack-config
       # configMapNamespace: "" # Optional - defaults to the same namespace as the CR
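
For context, the userConfig block points at the ConfigMap defined at the top of this same sample file. A trimmed sketch of the pairing, with the providers section of run.yaml elided since the diff above truncates it:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: llama-stack-config  # must match spec.server.userConfig.configMapName
data:
  run.yaml: |
    # Llama Stack Configuration
    version: '2'
    image_name: starter
    apis:
    - inference
    # providers: ... (truncated in the diff above)
```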
config/samples/example-withoutconfigmpa.yaml (19 additions, 0 deletions)
Review comment (Collaborator): After #156 is merged, this file will have the same properties as example-with-custom-config.yaml, and example-with-configmap will be removed. If this one goes through first I'll update the name:
config/samples/example-withoutconfigmpa.yaml -> config/samples/example-withoutconfigmap.yaml

Reply (Contributor, Author): I saw #156 is closed; is any action still needed for this PR?
@@ -0,0 +1,19 @@
+---
+apiVersion: llamastack.io/v1alpha1
+kind: LlamaStackDistribution
+metadata:
+  name: llamastack-without-userconfig
+spec:
+  replicas: 1
+  server:
+    distribution:
+      name: starter
+    containerSpec:
+      env:
+      - name: OLLAMA_INFERENCE_MODEL
+        value: "llama3.2:1b"
+      - name: OLLAMA_URL
+        value: "http://ollama-server-service.ollama-dist.svc.cluster.local:11434"
+    storage:
+      size: "10Gi" # Optional - defaults to 10Gi
+      mountPath: "/home/lls/.lls" # Optional - defaults to /.llama
distributions.json (4 additions, 7 deletions)

@@ -1,9 +1,6 @@
 {
-  "starter": "docker.io/llamastack/distribution-starter:latest",
-  "ollama": "docker.io/llamastack/distribution-ollama:latest",
-  "bedrock": "docker.io/llamastack/distribution-bedrock:latest",
-  "remote-vllm": "docker.io/llamastack/distribution-remote-vllm:latest",
-  "tgi": "docker.io/llamastack/distribution-tgi:latest",
-  "together": "docker.io/llamastack/distribution-together:latest",
-  "vllm-gpu": "docker.io/llamastack/distribution-vllm-gpu:latest"
+  "starter": "docker.io/llamastack/distribution-starter:latest",
+  "remote-vllm": "docker.io/llamastack/distribution-remote-vllm:latest",
+  "meta-reference-gpu": "docker.io/llamastack/distribution-meta-reference-gpu:latest",
+  "postgres-demo": "docker.io/llamastack/distribution-postgres-demo:latest"
 }
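
Presumably the operator uses this map to resolve spec.server.distribution.name to a container image; this PR does not show that resolution code, so treat the mapping below as an assumption. Under that assumption, only the four remaining names are valid in a CR:

```yaml
spec:
  server:
    distribution:
      # assumed to resolve via distributions.json to
      # docker.io/llamastack/distribution-starter:latest
      name: starter
      # other names still present in distributions.json:
      # remote-vllm, meta-reference-gpu, postgres-demo
```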