The OpenAI usage data includes the following information, but only the Anthropic services store cached token details:
"usage": { "prompt_tokens": 2006, "completion_tokens": 300, "total_tokens": 2306, "prompt_tokens_details": { "cached_tokens": 1920 }, "completion_tokens_details": { "reasoning_tokens": 0 } }
I noticed that it's being prepared as shown below; it might be better to move this to the OpenAI service:
pipecat/src/pipecat/services/openai_realtime_beta/events.py, line 367 at ca15d97
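Roughly what I have in mind is something like the sketch below. The class and field names are assumptions based on the JSON above, not the actual definitions around line 367, and the real Realtime event models may use different names:

```python
# Hedged sketch only: one way the OpenAI-side usage model could keep the
# cached-token details the API already returns. These pydantic models are
# illustrative and are not copied from events.py.
from typing import Optional

from pydantic import BaseModel


class PromptTokensDetails(BaseModel):
    cached_tokens: int = 0


class CompletionTokensDetails(BaseModel):
    reasoning_tokens: int = 0


class Usage(BaseModel):
    prompt_tokens: int
    completion_tokens: int
    total_tokens: int
    # Keep the nested detail blocks so the OpenAI service can surface
    # cached tokens the same way the Anthropic services already do.
    prompt_tokens_details: Optional[PromptTokensDetails] = None
    completion_tokens_details: Optional[CompletionTokensDetails] = None


usage = Usage.model_validate(
    {
        "prompt_tokens": 2006,
        "completion_tokens": 300,
        "total_tokens": 2306,
        "prompt_tokens_details": {"cached_tokens": 1920},
        "completion_tokens_details": {"reasoning_tokens": 0},
    }
)
print(usage.prompt_tokens_details.cached_tokens)  # -> 1920
```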
Thanks @golbin! Would you be up for making a PR with the changes you envision?