Feature Description

Summary

I am requesting a new feature for the useCompletion hook in the Vercel AI SDK: support for an onStepFinish callback, similar to the one available in the backend generateText function. This enhancement would let developers handle intermediate steps and results directly within the hook, providing a more seamless and interactive development experience.
Current Behavior
Currently, the backend generateText function has an onStepFinish callback, as follows:

```ts
const result = await generateText({
  model: yourModel,
  maxSteps: 10,
  onStepFinish({ text, toolCalls, toolResults, finishReason, usage }) {
    // your own logic, e.g., for saving or using the step
  },
  // ...
});
```
However, the corresponding frontend hooks do not expose this functionality: developers cannot currently access intermediate step data such as text, toolCalls, toolResults, finishReason, or usage within the hook.
Use Cases
Proposed Behavior
Add an onStepFinish option to the frontend hook, allowing developers to define a callback function that receives the same parameters as the backend implementation. The enhanced hook might look like this:
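As a minimal sketch of what that could look like: the onStepFinish option does not exist in useCompletion today, so the StepData type, the CompletionStepOptions interface, and the runSteps stand-in for the hook's internals below are all hypothetical; only the field names mirror what generateText's backend callback receives.

```typescript
// Hypothetical shape of the proposed option; the fields mirror what
// generateText's onStepFinish receives. None of this exists in the SDK yet.
interface StepData {
  text: string;
  toolCalls: unknown[];
  toolResults: unknown[];
  finishReason: string;
  usage: { promptTokens: number; completionTokens: number };
}

interface CompletionStepOptions {
  onStepFinish?: (step: StepData) => void;
}

// Stand-in for the hook's internals: invoke the callback once per
// intermediate step as the stream progresses.
function runSteps(options: CompletionStepOptions, steps: StepData[]): void {
  for (const step of steps) {
    options.onStepFinish?.(step);
  }
}

// Usage in the style the proposal suggests:
const finishReasons: string[] = [];
runSteps(
  {
    onStepFinish({ finishReason, usage }) {
      finishReasons.push(finishReason);
      console.log(`step finished (${finishReason}), tokens:`, usage);
    },
  },
  [
    {
      text: "partial",
      toolCalls: [],
      toolResults: [],
      finishReason: "tool-calls",
      usage: { promptTokens: 10, completionTokens: 5 },
    },
    {
      text: "final answer",
      toolCalls: [],
      toolResults: [],
      finishReason: "stop",
      usage: { promptTokens: 12, completionTokens: 20 },
    },
  ],
);
```

In the real hook, runSteps would be replaced by the stream parser invoking the user-supplied callback whenever a step boundary arrives over the wire.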
Implementation Details

- Add an onStepFinish parameter to the frontend hook.
- Pass the onStepFinish function to the underlying API route and ensure it triggers for each intermediate step.
- Expose the same data (text, toolCalls, toolResults, finishReason, usage) as in the backend implementation.
Example Use Case
Consider an application where intermediate responses are shown to users before the final completion is ready. With the onStepFinish feature, the frontend hook could be used to:

- Display partial responses as they are generated.
- Log usage metrics or tool calls for analytics.
- Update the UI dynamically without waiting for the final result.
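A step handler covering those use cases could be sketched as follows. The StepData shape here only mirrors the backend callback's fields, and handleStep and the mock steps are illustrative stand-ins, not SDK APIs.

```typescript
// Hypothetical per-step data, mirroring generateText's onStepFinish arguments.
interface StepData {
  text: string;
  toolCalls: { toolName: string }[];
  finishReason: string;
  usage: { promptTokens: number; completionTokens: number };
}

const partials: string[] = []; // partial responses to render as they arrive
let totalTokens = 0;           // running usage total for analytics

function handleStep(step: StepData): void {
  // Display partial responses as they are generated.
  partials.push(step.text);
  // Log usage metrics for analytics.
  totalTokens += step.usage.promptTokens + step.usage.completionTokens;
  // Log tool calls for analytics.
  for (const call of step.toolCalls) {
    console.log(`tool used: ${call.toolName}`);
  }
}

// Mock steps standing in for what the stream would deliver:
handleStep({
  text: "Looking up the weather...",
  toolCalls: [{ toolName: "getWeather" }],
  finishReason: "tool-calls",
  usage: { promptTokens: 20, completionTokens: 8 },
});
handleStep({
  text: "It is 21 degrees and sunny.",
  toolCalls: [],
  finishReason: "stop",
  usage: { promptTokens: 30, completionTokens: 12 },
});
```

Wired up as an onStepFinish callback, the UI could render partials and report totalTokens without waiting for the final result.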
Additional context
It would also be cool to see onFinish within useChat.
I have a command+K-like feature, similar to Cursor's. Because it wasn't really a chat, I went with useCompletion, which works well for most of my use cases such as translation and autocomplete, but I wanted to add some tool calls for one of the APIs and then log the steps to the front-end. I might just stream it with the data option for now.