@@ -17,10 +17,10 @@ This tutorial shows how to set up real-time weather functionality with Llama-Nex
## 1. Set Up Your MCP Server

```bash
- curl -LO https://github.com/decentralized-mcp/gaia-mcp-servers/releases/download/0.6.0/gaia-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
- tar xvf gaia-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
+ curl -LO https://github.com/cardea-mcp/cardea-mcp-servers/releases/download/0.7.0/cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
+ tar xvf cardea-mcp-servers-unknown-linux-gnu-x86_64.tar.gz
```
- > Download for your platform: https://github.com/decentralized-mcp/gaia-mcp-servers/releases/tag/0.6.0
+ > Download for your platform: https://github.com/cardea-mcp/cardea-mcp-servers/releases/tag/0.7.0

Set the environment variables:

@@ -33,7 +33,7 @@ export LLAMA_LOG=debug
Run the MCP server (accessible from external connections):

```bash
- ./gaia-weather-mcp-server --transport stream-http --socket-addr 0.0.0.0:8002
+ ./cardea-weather-mcp-server --transport stream-http --socket-addr 0.0.0.0:8002
```

**Important**: Ensure port 8002 is open in your firewall/security group settings if you're running on a cloud machine.
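If you need to open the port by hand, the following is a minimal sketch; it assumes an Ubuntu host using `ufw` and that netcat (`nc`) is installed, and `YOUR_SERVER_IP` is a placeholder for the machine's public address (cloud security-group rules, if any, still have to be adjusted in your provider's console):

```bash
# Allow inbound TCP traffic on port 8002 for the MCP server (assumes ufw)
sudo ufw allow 8002/tcp

# From another machine, confirm the port is reachable
# (YOUR_SERVER_IP is a placeholder for the server's public address)
nc -zv YOUR_SERVER_IP 8002
```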
@@ -92,17 +92,6 @@ curl --location 'http://localhost:9095/admin/servers/register' \
}'
```

- Register an embedding API server for the `/embeddings` endpoint:
-
- ```bash
- curl --location 'http://localhost:9095/admin/servers/register' \
- --header 'Content-Type: application/json' \
- --data '{
-     "url": "https://0x448f0405310a9258cd5eab5f25f15679808c5db2.gaia.domains",
-     "kind": "embeddings"
- }'
- ```
-

## 3. Test the Setup

Test the inference server by requesting the `/chat/completions` API endpoint:
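The request below is only a sketch of such a test, not the project's own example: the `/v1` path prefix, the model name, and the prompt are assumptions and should be adjusted to match the chat server registered with Llama-Nexus:

```bash
# Ask a weather question so the gateway can route the tool call to the weather MCP server.
# The /v1 prefix and the model name are assumptions; adjust them to your deployment.
curl --location 'http://localhost:9095/v1/chat/completions' \
--header 'Content-Type: application/json' \
--data '{
    "model": "Llama-3.2-3B-Instruct",
    "messages": [
        {"role": "user", "content": "What is the weather like in Singapore right now?"}
    ]
}'
```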