
Conversation

@matt-bernstein

Sharing some local changes I made to keep using loom with OpenAI models now that the chat completions endpoint does not support logprobs [1]. The changes hardcode a non-chat model (davinci) where needed and update the API calls to the latest version of the openai client library.

Is anyone else still using openai models for loom? I notice there's a feature branch (metaprocesses) with new code using openai.

[1] https://community.openai.com/t/logprobs-are-missing-from-the-chat-endpoints/289514
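A minimal sketch of the workaround described above, assuming the v1+ `openai` Python client. The helper function and its defaults are hypothetical (not from the PR's actual diff); the model name "davinci" comes from the PR description, and the live API call is shown commented out since it needs an `OPENAI_API_KEY`:

```python
# Hypothetical helper illustrating the PR's approach: route requests to a
# non-chat model via the legacy completions endpoint, which still returns
# logprobs, instead of the chat completions endpoint, which does not.

def build_completion_kwargs(prompt, n=1, temperature=1.0, logprobs=5):
    """Assemble keyword arguments for client.completions.create (openai>=1.0)."""
    return {
        "model": "davinci",        # hardcoded non-chat model, per the PR description
        "prompt": prompt,          # completions take a raw prompt, not chat messages
        "n": n,                    # number of continuations to sample
        "temperature": temperature,
        "logprobs": logprobs,      # request top-k logprobs per generated token
    }

# Live usage with the v1 client (requires OPENAI_API_KEY in the environment):
# from openai import OpenAI
# client = OpenAI()
# response = client.completions.create(**build_completion_kwargs("Once upon a time"))
# top_logprobs = response.choices[0].logprobs.top_logprobs

print(build_completion_kwargs("Once upon a time"))
```

The older `openai.Completion.create(...)` module-level call was removed in v1 of the client, so syntax updates like `client.completions.create(...)` are required alongside the model change.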
