Feature Description
Right now, we have to choose between using useAssistant() to access the code interpreter but losing access to all our local tools, or having access to all tools via useChat() but no access to the code interpreter. It would be super powerful if both were available within one setup. Preferably, the useAssistant() hook would allow incorporating tools the same way useChat() does.
Use Cases
Tools can be designed to fetch large datasets from APIs and provide them to agents, which is a great feature. However, if the response is too large, AI API rate limits are reached quickly and the agents/bots crash. Instead of pulling the data into a bot's context, it makes more sense to upload it as files and process it in a sandbox environment such as the code interpreter, keeping an agent's context/memory clean and efficiently used.
Additional context
Right now we are running a sub-agent independently of the Vercel AI SDK, built plainly with the OpenAI library, in order to use the code interpreter. This sub-agent is provided to the "main" Vercel AI agent as a tool. We need the code interpreter to process large datasets that would immediately exceed the main agent's token limit if pulled into its context.
Our fetch tools detect whether a response is too large for the main agent's token limit; if so, they upload the data as files to the sub-agent thread (an OpenAI Assistant) and respond with the OpenAI fileId for reference. The main agent then uses this reference to instruct the sub-agent to process that data.
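The size check our fetch tools perform can be sketched roughly as follows. This is a minimal illustration, not our actual implementation: the helper names (`estimateTokens`, `routeFetchResult`) and the ~4-characters-per-token heuristic are assumptions made for the sketch. In the real tool, the "file" branch would upload the payload with `openai.files.create({ file, purpose: "assistants" })` and return the resulting fileId to the main agent.

```typescript
// Rough token estimate: ~4 characters per token for typical English text.
// This is a heuristic, not a real tokenizer.
export function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

// Decide whether a fetched payload should be inlined into the main agent's
// context or uploaded as a file for the code-interpreter sub-agent.
export function routeFetchResult(
  payload: string,
  tokenLimit: number,
): { type: "inline"; data: string } | { type: "file"; reason: string } {
  if (estimateTokens(payload) <= tokenLimit) {
    // Small enough: hand the data directly to the main agent.
    return { type: "inline", data: payload };
  }
  // Too large: the real tool would upload the payload here
  // (e.g. openai.files.create) and return the fileId instead of the data.
  return {
    type: "file",
    reason: `payload ~${estimateTokens(payload)} tokens exceeds limit of ${tokenLimit}`,
  };
}
```

The main agent only ever sees either the inline data or a short file reference, which is what keeps its context within the token limit.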