
Missing cached token information for OpenAI #668

Open
golbin opened this issue Oct 26, 2024 · 1 comment
Comments

golbin (Contributor) commented Oct 26, 2024

The OpenAI usage response includes the following information, but at the moment only the Anthropic service stores cached-token details:

"usage": {
  "prompt_tokens": 2006,
  "completion_tokens": 300,
  "total_tokens": 2306,
  "prompt_tokens_details": {
    "cached_tokens": 1920
  },
  "completion_tokens_details": {
    "reasoning_tokens": 0
  }
}

I noticed that a model for this is already being prepared, as shown below. It might be better to move this into the OpenAI service as well:

class TokenDetails(BaseModel):
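To illustrate the shape being proposed, here is a minimal sketch of how the `usage` payload above could be parsed, including the cached-token details. The project snippet uses a Pydantic `BaseModel`; this sketch uses stdlib dataclasses for self-containment, and all class, field, and function names beyond those shown in the payload are assumptions, not the project's actual API:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class TokenDetails:
    # Field names mirror the OpenAI `usage` payload above.
    cached_tokens: Optional[int] = None
    reasoning_tokens: Optional[int] = None


@dataclass
class Usage:
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int
    prompt_tokens_details: Optional[TokenDetails] = None
    completion_tokens_details: Optional[TokenDetails] = None


def parse_usage(payload: dict) -> Usage:
    """Build a Usage record from a raw OpenAI `usage` dict (hypothetical helper)."""
    return Usage(
        prompt_tokens=payload["prompt_tokens"],
        completion_tokens=payload["completion_tokens"],
        total_tokens=payload["total_tokens"],
        prompt_tokens_details=TokenDetails(**payload.get("prompt_tokens_details", {})),
        completion_tokens_details=TokenDetails(**payload.get("completion_tokens_details", {})),
    )
```

With the payload from this issue, `parse_usage(...).prompt_tokens_details.cached_tokens` would yield `1920`, which is the value currently being dropped for OpenAI.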

vipyne (Member) commented Oct 28, 2024

Thanks @golbin! Would you be up for making a PR with the changes you envision?
