Hi people,

The error persists on both the released and pre-release versions of 1.3.7, and I keep getting it from simple chat requests and tool calls. I assume there is some mismatch between how Continue reads the output and what Ollama is emitting:

```
Error parsing Ollama response: TypeError: Cannot destructure property 'role' of 'res.message' as it is undefined. {"id":"chatcmpl-294","object":"chat.completion","created":1757714568,"model":"llama3.1:8b","system_fingerprint":"fp_ollama","choices":[{"index":0,"message":{"role":"assistant","content":"I'm doing well, thanks for asking. How can I assist you today? Do you have any questions or need help with something specific?"},"finish_reason":"stop"}],"usage":{"prompt_tokens":362,"completion_tokens":29,"total_tokens":391}}
```
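For context, my reading of the TypeError is that Continue parses this endpoint as Ollama's native `/api/chat` shape (top-level `message`), while my rewrite returns the OpenAI `/v1/chat/completions` shape (`message` nested under `choices`). Here is a minimal sketch of the mismatch as I understand it; the interface names are mine, not Continue's actual types:

```typescript
// Shape A: Ollama's native /api/chat response ("message" at the top level).
interface OllamaChatResponse {
  model: string;
  message: { role: string; content: string };
  done: boolean;
}

// Shape B: the OpenAI-compatible /v1/chat/completions response
// ("message" nested inside "choices", as in the payload above).
interface OpenAIChatResponse {
  id: string;
  choices: {
    index: number;
    message: { role: string; content: string };
    finish_reason: string;
  }[];
}

// Abridged version of the body my proxy returns:
const body =
  '{"id":"chatcmpl-294","object":"chat.completion","choices":' +
  '[{"index":0,"message":{"role":"assistant","content":"..."},"finish_reason":"stop"}]}';

// A parser written against shape A finds res.message undefined on shape B,
// and destructuring "role" from it throws exactly the TypeError above.
const res = JSON.parse(body) as OllamaChatResponse;
const { role } = res.message; // TypeError: Cannot destructure property 'role' ...
```

If that reading is right, the rewrite below and Continue's parser simply disagree about which API flavor this path speaks.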
Ollama runs as a remote instance behind an nginx reverse proxy with bearer-token auth (hopefully this doesn't reshape Ollama's output). The parts of the config that Continue hits are:
```nginx
# Forward OpenAI-compatible /v1/api/* paths to Ollama
location ~ ^/v1/api/chat$ {
    rewrite ^/v1/api/chat$ /v1/chat/completions break;
    proxy_pass http://ollama_backend;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_connect_timeout 300s;
    proxy_send_timeout 300s;
    proxy_read_timeout 300s;
    proxy_buffering off;
    proxy_request_buffering off;
}

location ~ ^/v1/api/show$ {
    rewrite ^/v1/api/show$ /api/show break;
    proxy_pass http://ollama_backend;
    proxy_set_header Host $host;
    proxy_set_header X-Real-IP $remote_addr;
    proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    proxy_set_header X-Forwarded-Proto $scheme;
    proxy_connect_timeout 300s;
    proxy_send_timeout 300s;
    proxy_read_timeout 300s;
    proxy_buffering off;
    proxy_request_buffering off;
}
```
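For what it's worth, this is roughly how I sanity-check what the proxy returns through that rewrite; the host name and token below are placeholders for my real values:

```typescript
// Hypothetical host and token; substitute real values.
const resp = await fetch("https://ollama.example.com/v1/api/chat", {
  method: "POST",
  headers: {
    Authorization: "Bearer <token>",
    "Content-Type": "application/json",
  },
  body: JSON.stringify({
    model: "llama3.1:8b",
    messages: [{ role: "user", content: "Hi" }],
    stream: false,
  }),
});

// With the rewrite above, the top-level keys are the OpenAI ones
// ("id", "object", "choices", ...), not Ollama's native "message".
const data = await resp.json();
console.log(Object.keys(data));
```

That this prints `choices` rather than `message` is what makes me suspect the path mapping rather than nginx buffering.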
Am I doing something wrong here? Do I need to reconfigure the Continue mapping for my Ollama instance? I don't see anyone else having this issue, so I wonder whether it's something on my end.
Thanks in advance!