
await the generation in the no cache branch #7233

Closed
wants to merge 1 commit

Conversation

kevinnayar

Fixes #7172

@dosubot dosubot bot added the size:XS This PR changes 0-9 lines, ignoring generated files. label Nov 19, 2024

vercel bot commented Nov 19, 2024

The latest updates on your projects. Learn more about Vercel for Git ↗︎

| Name | Status | Preview | Updated (UTC) |
| --- | --- | --- | --- |
| langchainjs-docs | ✅ Ready (Inspect) | Visit Preview | Nov 19, 2024 6:31pm |
| langchainjs-api-refs | ⬜️ Ignored (Inspect) | | Nov 19, 2024 6:31pm |

1 Skipped Deployment

@dosubot dosubot bot added the auto:bug Related to a bug, vulnerability, unexpected error with an existing feature label Nov 19, 2024
@jacoblee93
Collaborator

Thanks for the PR!

How exactly does this fix the linked issue?

@kevinnayar
Author

Oh, I'm sorry, I attached the wrong issue, and I can't seem to locate the one I was viewing earlier and trying to address. The attached issue relates to ChatGoogleGenerativeAI and withStructuredOutput, but the one I was referencing involved ChatVertexAI. I'll keep digging for it.

Without awaiting the return on this method, the response when calling model.withStructuredOutput(...) is consistently the following error:

TypeError: Cannot read properties of undefined (reading 'text')

In the call stack sequence invoke -> generatePrompt -> generate, the last method's result doesn't get returned, which results in this error. I believe this is because generate's return type is a union of promise and non-promise values, and it should therefore be awaited.

generate(...): Promise<{
    generations: never[][];
    llmOutput: any;
} | {
    generations: any[];
    llmOutput: {};
}>
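To illustrate the failure mode being described, here is a minimal, hypothetical sketch of the call chain. The names (invoke, generatePrompt, generate) mirror the discussion, but the bodies are illustrative stand-ins, not the actual langchainjs source: if an intermediate method calls generate() without returning its result, the caller resolves to undefined and reading `.text` throws the exact TypeError quoted above.

```typescript
// Illustrative types and names only; not the real langchainjs implementation.
type Generation = { text: string };

async function generate(): Promise<{ generations: Generation[][] }> {
  return { generations: [[{ text: '{"ok": true}' }]] };
}

// Buggy variant: calls generate() but never returns its result,
// so generatePromptBuggy() resolves to undefined.
async function generatePromptBuggy(): Promise<any> {
  generate(); // missing `return` (or `return await`)
}

// Fixed variant: the result is returned to the caller.
async function generatePromptFixed(): Promise<any> {
  return await generate();
}

async function invoke(prompt: () => Promise<any>): Promise<string> {
  const result = await prompt();
  // If `result` is undefined, this line reproduces:
  // TypeError: Cannot read properties of undefined (reading 'text')
  return result.generations[0][0].text;
}
```

With this sketch, `invoke(generatePromptFixed)` resolves normally, while `invoke(generatePromptBuggy)` rejects with the TypeError, which is the symptom reported here.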

@jacoblee93
Collaborator

Gotcha. The only difference the await makes here after a return is in the call stack, if an error were to occur:

https://stackoverflow.com/questions/38708550/difference-between-return-await-promise-and-return-promise

@jacoblee93 jacoblee93 closed this Nov 21, 2024
@kevinnayar
Author

Yep, I understand what await does. I did think it was odd that this minor edit to the dist file resolves the issue; the issue does persist without it. I'll do a bit more debugging and file a more meaningful bug report first. Thanks.

Successfully merging this pull request may close these issues: Gemini JSON Mode Not Working