[Platform] Standardise token usage #311


Open · wants to merge 4 commits into main

Conversation

@junaidbinfarooq (Contributor) commented Aug 13, 2025

| Q            | A    |
|--------------|------|
| Bug fix?     | no   |
| New feature? | yes  |
| Docs?        | yes  |
| Issues       | #208 |
| License      | MIT  |

Changes proposed:

This PR aims to standardize token usage extraction and its availability in the AI bundle. Essentially, it:

  • Makes token usage explicit by adding a dedicated DTO for token information
  • Populates the token usage DTO inside the different token output processors and then adds it to the metadata object
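As a rough sketch of the proposed flow (the property names beyond `promptTokens`, the `jsonSerialize()` shape, and the metadata call are assumptions for illustration; the actual DTO in the PR may differ):

```php
<?php

// Hypothetical sketch of the proposed TokenUsage DTO; property names
// other than promptTokens are assumptions for illustration.
final class TokenUsage implements \JsonSerializable
{
    public function __construct(
        public ?int $promptTokens = null,
        public ?int $completionTokens = null,
        public ?int $totalTokens = null,
    ) {
    }

    public function jsonSerialize(): array
    {
        return [
            'prompt_tokens' => $this->promptTokens,
            'completion_tokens' => $this->completionTokens,
            'total_tokens' => $this->totalTokens,
        ];
    }
}

// A token output processor would then extract the counts from the raw
// provider response and attach the DTO to the result metadata, e.g.:
// $output->result->getMetadata()->add('token_usage', new TokenUsage(12, 34, 46));
```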

@carsonbot carsonbot added Feature New feature Platform Issues & PRs about the AI Platform component Status: Needs Review labels Aug 13, 2025
$result = $agent->call($messages, [
    'max_tokens' => 500, // specific options just for this call
]);

$metadata = $result->getMetadata();
if (null === $tokenUsage = $result->getTokenUsage()) {
Contributor

Can this happen?

@junaidbinfarooq (Contributor Author) commented Aug 13, 2025

Yes, if you mean null === $tokenUsage: it can happen when token_usage is set to false in the config, as far as I can see.
Edit: Here it is unnecessary.

final class TokenUsage implements \JsonSerializable
{
    public function __construct(
        public ?int $promptTokens = null,
Contributor

Suggested change:
-        public ?int $promptTokens = null,
+        public ?int $prompt = null,

Maybe remove the Token suffix everywhere here? It feels superfluous.

@junaidbinfarooq (Contributor Author) commented Aug 13, 2025

No problem.
I would not have kept the suffix if the object were simply named something like Tokens, but I thought TokenUsage was a better name. Or perhaps the DTO should be named Tokens.

Contributor

To be discussed with @chr-hertel

Contributor Author

I did remove the suffixes BTW.

Contributor

Thanks, not sure now that this was the right decision 😅

Contributor Author

TokenUsage for the class name and tokens as the suffixes for the property names sounds more logical to me.
Or conversely, Tokens for the class name and no suffixes for the property names would also be a good option.

Contributor

TokenUsage and tokens as suffix sound best to me

Contributor Author

🤦‍♂️ 😄

@junaidbinfarooq junaidbinfarooq force-pushed the feat/standardize-token-usage branch 2 times, most recently from 7a34c26 to d771023 Compare August 13, 2025 21:33
@junaidbinfarooq junaidbinfarooq force-pushed the feat/standardize-token-usage branch 2 times, most recently from 75e98c8 to 048ac63 Compare August 13, 2025 21:49
@chr-hertel (Member) left a comment

Sorry, but we need to slim this down a bit. In general, smaller PRs that build on top of each other are preferable to me, so let's break it down like you did:

Makes token usage explicit
I like having the value object, we should definitely have that 👍
But I'm not in favor of having getTokenUsage() directly on the result, since the result is a high-level abstraction and token usage is quite specific to remote inference of GenAI models - not all models have that. Let's keep it in the metadata, please.

Adds support to automatically register token usage extractors so that the token usage information is available in the result when a model is invoked
That's a bit misleading, since token usage was already sitting on an extension point => OutputProcessor, and your design decision to use that extension point to add another extension point doesn't really resonate with me. Why is the new extension point needed? It feels like we're just adding them because we can.

Makes the token usage auto-registration configurable
For me this is too early; I don't think there is a use case yet, do you? How often do we see someone bring their own token usage extractor?


All in all, let's start here with the TokenUsage value object in the metadata class - that's an easy step where we see the same picture 👍

Edit: another PR could fix issue #208 by registering the corresponding output processor based on the config setting.

@junaidbinfarooq (Contributor Author)

@chr-hertel

> Makes token usage explicit
> I like having the value object, we should definitely have that 👍
> But I'm not in favor of having the getTokenUsage() directly on the result since the result is a high-level abstraction and token usage is quite specific to remote inference of GenAI models - not all models have that. Let's keep it in the metadata please.

Hmm, I initially thought about keeping it in the metadata, but having it inside the Result object also made sense, since an API should send us this information. So far, I have seen such information in the responses of whatever APIs I have tried, but perhaps you are right, and other models out there do not provide it.

> Adds support to automatically register token usage extractors so that the token usage information is available in the result when a model is invoked
> That's a bit misleading since token usage was already sitting on an extension point => OutputProcessor and your design decision to use that extension point to add another extension point doesn't really resonate with me. Why is the new extension point needed? It feels like we're just adding them because we can.

Nope, the idea was instead to introduce an extension point specific to token usage extraction and exposure. This would also make it simpler for bridges to provide their own implementations of the extractor interface if needed.

> Makes the token usage auto-registration configurable
> For me this is too early, I don't think there is a use case yet, do you? How often do we see the case for someone to bring their own token usage extractor.

So far, in this repository, I have found three such extractors (currently TokenOutputProcessor classes), including the one in the new Google Vertex AI bridge, and all three of them perform the extraction somewhat differently. As you also rightly pointed out above, token usage is specific to GenAI models, and not all of them have it. Nor will all of them expose the data in a similar fashion.

@junaidbinfarooq (Contributor Author)

@OskarStark @chr-hertel
I have reverted the previous changes and kept only the token usage information. I have also edited the PR description.
Please review the changes.

@@ -70,8 +72,11 @@ public function testItAddsRemainingTokensToMetadata()
    $processor->processOutput($output);

    $metadata = $output->result->getMetadata();
    $tokenUsage = $metadata->get('token_usage');
Contributor

What about having a dedicated setTokenUsage/getTokenUsage on the metadata object?

Contributor Author

Not sure about this.
The Metadata object already has a lot of methods, and each of them is generic in nature as far as I can see; none of them concerns any specific model characteristic.
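For context, the trade-off being discussed can be sketched with a minimal generic metadata bag (all names here are hypothetical, not the actual Metadata API):

```php
<?php

// Minimal sketch of a generic key-value metadata bag, illustrating the
// two access styles under discussion. Names are hypothetical.
final class Metadata
{
    /** @var array<string, mixed> */
    private array $data = [];

    public function add(string $key, mixed $value): void
    {
        $this->data[$key] = $value;
    }

    public function get(string $key, mixed $default = null): mixed
    {
        return $this->data[$key] ?? $default;
    }
}

$metadata = new Metadata();
$metadata->add('token_usage', ['total_tokens' => 46]);

// Generic access, as the PR keeps it:
$tokenUsage = $metadata->get('token_usage');

// A dedicated getTokenUsage() would just wrap this lookup, trading the
// bag's genericity for type safety on one specific key.
```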

@junaidbinfarooq junaidbinfarooq force-pushed the feat/standardize-token-usage branch from e33ba4a to 069b4c3 Compare August 15, 2025 18:24
@chr-hertel (Member) left a comment

almost done i'd say - thanks already!

- Makes token usage explicit
- Adds support to automatically register token usage extractors so that the token usage information is available in the result when a model is invoked
- Makes the token usage auto-registration configurable
- Makes token usage explicit by adding a dedicated dto for token information
- Populates the token usage dto inside different token output processors and then adds it to metadata object
@junaidbinfarooq (Contributor Author)

> almost done i'd say - thanks already!

I am working on autoconfiguring token usage processors inside the AI bundle and based that branch on this one.
A PR will arrive after this one is finished.

@chr-hertel (Member) left a comment

Good to merge from my end - @OskarStark?
