Commit 5119e63: Update quick-start-with-mcp.md
Parent: 3a045ca

1 file changed: +4 -15 lines

docs/llama-nexus/mcp/quick-start-with-mcp.md

Lines changed: 4 additions & 15 deletions
````diff
@@ -17,10 +17,10 @@ This tutorial shows how to set up real-time weather functionality with Llama-Nex
 ## 1. Set Up Your MCP Server
 
 ```bash
-curl -LO https://github.com/decentralized-mcp/gaia-mcp-servers/releases/download/0.6.0/gaia-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
-tar xvf gaia-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
+curl -LO https://github.com/cardea-mcp/cardea-mcp-servers/releases/download/0.7.0/cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
+tar xvf cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
 ```
-> Download for your platform: https://github.com/decentralized-mcp/gaia-mcp-servers/releases/tag/0.6.0
+> Download for your platform: https://github.com/cardea-mcp/cardea-mcp-servers/releases/tag/0.7.0
 
 
 Set the environment variables:
````
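On a fresh machine, the updated download step from this hunk can be sketched as follows; the tarball URL and binary name are taken directly from the diff, while the `test -x` check is an added assumption that the archive unpacks the server binary into the current directory.

```shell
# Fetch and unpack the 0.7.0 release (URL exactly as in the updated doc).
curl -LO https://github.com/cardea-mcp/cardea-mcp-servers/releases/download/0.7.0/cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
tar xvf cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz

# Sanity check (assumption: the weather server binary lands in the working
# directory, as the later "./cardea-weather-mcp-server" step implies).
test -x ./cardea-weather-mcp-server && echo "binary ready"
```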

````diff
@@ -33,7 +33,7 @@ export LLAMA_LOG=debug
 Run the MCP server (accessible from external connections):
 
 ```bash
-./gaia-weather-mcp-server --transport stream-http --socket-addr 0.0.0.0:8002
+./cardea-weather-mcp-server --transport stream-http --socket-addr 0.0.0.0:8002
 ```
 
 **Important**: Ensure port 8002 is open in your firewall/security group settings if you're running on a cloud machine.
````
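The firewall note in this hunk can be verified with a quick reachability probe run from a machine other than the server itself. This is a hedged sketch: the bash `/dev/tcp` pseudo-device is a bash-specific feature, and `YOUR_SERVER_IP` is a placeholder for your cloud machine's public address.

```shell
#!/usr/bin/env bash
# Probe the MCP server port from outside the host (replace the placeholder).
HOST="YOUR_SERVER_IP"

# bash can open TCP connections through the /dev/tcp pseudo-device.
if (exec 3<>"/dev/tcp/${HOST}/8002") 2>/dev/null; then
  echo "port 8002 reachable"
else
  echo "port 8002 blocked: check firewall / security group rules"
fi
```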
````diff
@@ -92,17 +92,6 @@ curl --location 'http://localhost:9095/admin/servers/register' \
 }'
 ```
 
-Register an embedding API server for the `/embeddings` endpoint:
-
-```bash
-curl --location 'http://localhost:9095/admin/servers/register' \
---header 'Content-Type: application/json' \
---data '{
-"url": "https://0x448f0405310a9258cd5eab5f25f15679808c5db2.gaia.domains",
-"kind": "embeddings"
-}'
-```
-
 ## 3. Test the Setup
 
 Test the inference server by requesting the `/chat/completions` API endpoint:
````
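A hedged sketch of that test request follows. The 9095 port is taken from the `admin/servers/register` calls in this diff; the `/v1/chat/completions` path and the request body are assumptions based on Llama-Nexus exposing an OpenAI-compatible API, since the actual test section is not shown in this hunk.

```shell
# Write a minimal chat request body (message content is illustrative only).
cat > /tmp/chat-request.json <<'EOF'
{
  "messages": [
    {"role": "user", "content": "What is the weather like in Singapore?"}
  ]
}
EOF
echo "request body written"

# Send it to Llama-Nexus (assumed OpenAI-compatible path; port taken from
# the admin/servers/register calls above):
# curl http://localhost:9095/v1/chat/completions \
#   --header 'Content-Type: application/json' \
#   --data @/tmp/chat-request.json
```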
