Feat: IBM watsonx.ai Chat integration #16589
Conversation
llama-index-integrations/llms/llama-index-llms-ibm/llama_index/llms/ibm/base.py
async def acomplete(
    self, prompt: str, formatted: bool = False, **kwargs: Any
) -> CompletionResponse:
    return self.complete(prompt, formatted=formatted, **kwargs)
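Until native async client methods land, a common interim pattern is to offload the blocking call to a worker thread so the event loop is not blocked while awaiting. This is only an illustrative sketch with a stand-in `complete`, not the integration's actual code:

```python
import asyncio


class SyncOnlyLLM:
    """Stand-in for an LLM whose underlying client exposes only blocking calls."""

    def complete(self, prompt: str, formatted: bool = False, **kwargs) -> str:
        # Pretend this performs a blocking HTTP request to the model endpoint.
        return f"echo: {prompt}"

    async def acomplete(self, prompt: str, formatted: bool = False, **kwargs) -> str:
        # Run the blocking call in a worker thread so the event loop stays
        # responsive, instead of invoking self.complete() directly on the loop.
        return await asyncio.to_thread(
            self.complete, prompt, formatted=formatted, **kwargs
        )


result = asyncio.run(SyncOnlyLLM().acomplete("hi"))
```

Simply delegating to the sync method (as the snippet above does) keeps the API surface complete, but it blocks the running loop; `asyncio.to_thread` is the lighter stopgap until true async client methods arrive.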
there's no async client or async client methods?
Some of these are still under construction in the source package, so they will be delivered soon (as the next step).
Great!
This looks fine to me, just a note about some pydantic v1 vs v2 changes
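For context on that note, these are the pydantic v1 vs v2 renames integrations most often trip over. A minimal sketch using the v2 API (pydantic is assumed to be installed; `ChatParams` and its field are invented for illustration):

```python
from pydantic import BaseModel, Field, field_validator


class ChatParams(BaseModel):
    # Hypothetical parameter model for illustration only.
    max_new_tokens: int = Field(default=64, gt=0)

    # v1 spelling: @validator("max_new_tokens")
    # v2 spelling: @field_validator(...) plus an explicit @classmethod
    @field_validator("max_new_tokens")
    @classmethod
    def check_positive(cls, v: int) -> int:
        return v


params = ChatParams()
# v1 spelling: params.dict()  /  v2 spelling: params.model_dump()
dumped = params.model_dump()
```

`validator` → `field_validator` and `.dict()` → `.model_dump()` are the usual culprits; which spelling applies here depends on which pydantic major version llama-index pins at the time of the PR.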
Description
Please include a summary of the change and which issue is fixed. Please also include relevant motivation and context. List any dependencies that are required for this change.
Summary:
The purpose of these changes is to add Chat API support to the IBM watsonx.ai integration. The new chat methods accept the `tool_choice` and `tool_choice_option` options. Methods introduced/changed: `chat()`, `stream_chat()`.
FYI @Mateusz-Switala @LukaszCmielowski
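Since `tool_choice` and `tool_choice_option` are alternative ways to steer tool calling (name a specific tool vs. pass a general policy), a natural shape for the kwargs plumbing is a small validating builder. A hypothetical sketch, not the PR's actual code; the helper name and the mutual-exclusion rule are assumptions:

```python
from typing import Any, Dict, Optional


def build_chat_kwargs(
    tool_choice: Optional[Dict[str, Any]] = None,
    tool_choice_option: Optional[str] = None,
) -> Dict[str, Any]:
    """Hypothetical helper: merge tool-choice settings into chat kwargs.

    tool_choice names a specific tool to call; tool_choice_option passes a
    general policy string such as "auto". Assumed mutually exclusive here.
    """
    if tool_choice is not None and tool_choice_option is not None:
        raise ValueError("Provide either tool_choice or tool_choice_option, not both.")
    kwargs: Dict[str, Any] = {}
    if tool_choice is not None:
        kwargs["tool_choice"] = tool_choice
    if tool_choice_option is not None:
        kwargs["tool_choice_option"] = tool_choice_option
    return kwargs


chat_kwargs = build_chat_kwargs(
    tool_choice={"type": "function", "function": {"name": "add"}}
)
```

Validating the two options up front turns a confusing server-side rejection into an immediate, readable client-side error.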
New Package?
Did I fill in the `tool.llamahub` section in the `pyproject.toml` and provide a detailed README.md for my new integration or package?
Version Bump?
Did I bump the version in the `pyproject.toml` file of the package I am updating? (Except for the `llama-index-core` package)
Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Your pull-request will likely not be merged unless it is covered by some form of impactful unit testing.
Suggested Checklist:
I ran `make format; make lint` to appease the lint gods