return usage from Anthropic OpenAI adapter #6238
base: main
Conversation
@@ -223,6 +235,15 @@ export class AnthropicApi implements BaseLlmApi {
          lastToolUseName = value.content_block.name;
        }
        break;
      case "message_start":
        usage.prompt_tokens = value.message.usage.input_tokens;
        usage.prompt_tokens_details = {
Type Error: The code attempts to assign prompt_tokens_details to the usage object, but this property is not defined in the OpenAI CompletionUsage type. The CompletionUsage type only includes completion_tokens, prompt_tokens, and total_tokens. This assignment will cause a TypeScript error and may lead to runtime issues when trying to access undefined properties.
😱 Found 1 issue. Time to roll up your sleeves! 😱
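Below is a minimal sketch of how the streamed usage could be accumulated into an OpenAI-style object while keeping the assignment type-safe. The `ExtendedCompletionUsage` type and the event interfaces are illustrative assumptions (a local widening of OpenAI's `CompletionUsage`), not the PR's actual code.

```typescript
// Hypothetical local type: mirrors OpenAI's CompletionUsage and adds the
// optional prompt_tokens_details field for SDK versions that do not declare it.
interface ExtendedCompletionUsage {
  prompt_tokens: number;
  completion_tokens: number;
  total_tokens: number;
  prompt_tokens_details?: { cached_tokens?: number };
}

// Simplified view of the Anthropic stream events involved in usage reporting.
interface MessageStartEvent {
  type: "message_start";
  message: {
    usage: { input_tokens: number; cache_read_input_tokens?: number };
  };
}
interface MessageDeltaEvent {
  type: "message_delta";
  usage: { output_tokens: number };
}
type UsageEvent = MessageStartEvent | MessageDeltaEvent;

function accumulateUsage(usage: ExtendedCompletionUsage, event: UsageEvent): void {
  switch (event.type) {
    case "message_start":
      // Input token counts are reported once, at the start of the message.
      usage.prompt_tokens = event.message.usage.input_tokens;
      usage.prompt_tokens_details = {
        cached_tokens: event.message.usage.cache_read_input_tokens ?? 0,
      };
      break;
    case "message_delta":
      // Output token counts arrive as a running total on message_delta events.
      usage.completion_tokens = event.usage.output_tokens;
      usage.total_tokens = usage.prompt_tokens + usage.completion_tokens;
      break;
  }
}
```

Widening the usage type locally (or using an OpenAI SDK version whose `CompletionUsage` already declares `prompt_tokens_details`) would resolve the type error flagged above without changing runtime behavior.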
Description
Adds Anthropic support for returning usage data from the OpenAI adapters (chat stream only for now). Includes a test that usage data is returned, which we can incrementally extend to other providers as we add support for them.
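For context, here is a hedged, consumer-level sketch of what the change enables. The `ChatStreamChunk` shape below is a simplified placeholder rather than the adapter's exact chunk type.

```typescript
// Simplified placeholder type; the real adapter yields OpenAI-format chunks.
interface ChatStreamChunk {
  usage?: {
    prompt_tokens: number;
    completion_tokens: number;
    total_tokens: number;
  };
}

async function logStreamUsage(stream: AsyncIterable<ChatStreamChunk>): Promise<void> {
  for await (const chunk of stream) {
    // With this change, the Anthropic adapter's chat stream also yields
    // OpenAI-format usage data alongside the content chunks.
    if (chunk.usage) {
      console.log(
        `prompt=${chunk.usage.prompt_tokens} ` +
          `completion=${chunk.usage.completion_tokens} ` +
          `total=${chunk.usage.total_tokens}`,
      );
    }
  }
}
```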
Checklist
Tests
Tests are included in main.test.ts and can later be extended to check for usage in other providers.
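A rough sketch of the kind of shared assertion such a test could make; the runner import, the `streamChat` helper, and the chunk shape are assumptions for illustration, not the exact contents of main.test.ts.

```typescript
import { expect, test } from "vitest"; // assumption: a vitest-style runner

interface UsageChunk {
  usage?: { prompt_tokens: number; completion_tokens: number };
}

// Hypothetical helper standing in for the provider-specific chat stream.
declare function streamChat(prompt: string): AsyncIterable<UsageChunk>;

test("chat stream reports usage", async () => {
  let usage: UsageChunk["usage"];

  for await (const chunk of streamChat("Say hello")) {
    if (chunk.usage) {
      usage = chunk.usage;
    }
  }

  // The same assertions can be reused as other providers add usage support.
  expect(usage).toBeDefined();
  expect(usage?.prompt_tokens).toBeGreaterThan(0);
  expect(usage?.completion_tokens).toBeGreaterThan(0);
});
```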