Add embedding integration with Intel Gaudi in llama-index-embeddings-gaudi #16521
Conversation
class GaudiHuggingFaceEmbeddings(HuggingFaceEmbeddings):
This is kind of unintuitive -- can we implement the actual embedding class from llama-index directly? No need to involve langchain (you could subclass BaseEmbedding or the HuggingFaceEmbedding class from llama-index).
Okay, I have just rewritten the implementation to be based on llama-index's BaseEmbedding. @logan-markewich
I also removed the Instruct class from the original implementation, since this PR focuses on embeddings.
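For context, a llama-index BaseEmbedding subclass normally implements private hooks such as _get_query_embedding and _get_text_embedding, which the public API dispatches to. The sketch below illustrates that pattern with a local stand-in base class so it runs without llama-index installed (the real base lives in llama_index.core); the GaudiEmbedding class name, default model name, and placeholder vectors are illustrative assumptions, not the PR's actual implementation.

```python
from abc import ABC, abstractmethod
from typing import List


class BaseEmbedding(ABC):
    # Stand-in for llama-index's BaseEmbedding, used here only so this
    # sketch is self-contained; the real class is a Pydantic model with
    # async counterparts as well.
    @abstractmethod
    def _get_query_embedding(self, query: str) -> List[float]: ...

    @abstractmethod
    def _get_text_embedding(self, text: str) -> List[float]: ...

    def get_text_embedding(self, text: str) -> List[float]:
        # Public API dispatches to the subclass hook.
        return self._get_text_embedding(text)


class GaudiEmbedding(BaseEmbedding):
    # Hypothetical shape of the rewritten class; a real implementation
    # would load the model (e.g. via optimum-habana) and run inference
    # on the Gaudi HPU device instead of returning placeholders.
    def __init__(self, model_name: str = "BAAI/bge-small-en-v1.5"):
        self._model_name = model_name

    def _embed(self, sentence: str) -> List[float]:
        # Placeholder vector; a real version would tokenize, run the
        # model on HPU, then pool and normalize hidden states.
        return [float(len(sentence)), 0.0]

    def _get_query_embedding(self, query: str) -> List[float]:
        return self._embed(query)

    def _get_text_embedding(self, text: str) -> List[float]:
        return self._embed(text)


emb = GaudiEmbedding()
vec = emb.get_text_embedding("hello")
```

The point of the pattern is that only the two private hooks are Gaudi-specific; everything else (batching, async, callbacks) comes from the base class.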
Thank you!
@logan-markewich Could you please review my updates when you have a chance? I'd appreciate it if the PR could be merged once the issues are resolved. Thanks so much!
Description
This PR adds Intel Gaudi support to the list of llama_index embedding integrations. This enables users to run embedding models locally on Intel Gaudi using llama_index modules.
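Usage would presumably mirror other local embedding integrations. A hedged sketch follows; the import path llama_index.embeddings.gaudi and the GaudiEmbedding class name are inferred from the package name in the PR title and llama-index packaging conventions, not confirmed by this page, and the model name is an arbitrary example.

```python
# Hypothetical usage; import path and class name are assumptions.
try:
    from llama_index.embeddings.gaudi import GaudiEmbedding  # assumed path
except ImportError:
    # Package (or Gaudi hardware) not available in this environment.
    GaudiEmbedding = None

if GaudiEmbedding is not None:
    embed_model = GaudiEmbedding(model_name="BAAI/bge-small-en-v1.5")
    vector = embed_model.get_text_embedding("Intel Gaudi runs this locally.")
    print(len(vector))
```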
Fixes # (issue)
N/A
New Package?
Did I fill in the tool.llamahub section in the pyproject.toml and provide a detailed README.md for my new integration or package?
Version Bump?
Did I bump the version in the pyproject.toml file of the package I am updating? (Except for the llama-index-core package)
Type of Change
Please delete options that are not relevant.
How Has This Been Tested?
Your pull-request will likely not be merged unless it is covered by some form of impactful unit testing.
Suggested Checklist:
Ran make format; make lint to appease the lint gods