I was thinking of implementing something like `ollama pull`. For example, `model_pull()` would take either a model name or a local (absolute or relative) path as input.
It would also be useful to add a function that lists all locally available models and the functions they can be used with. When listing local/offline models, we could repeat the message about local inference (#23).
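A minimal sketch of what such a listing function could look like, assuming the package keeps downloaded models as subdirectories of a single cache directory (the `~/.cache/models` default and the function name `list_local_models` are placeholders, not existing API):

```python
from pathlib import Path

def list_local_models(cache_dir: str = "~/.cache/models") -> list[str]:
    """List model directories found under the local cache.

    `cache_dir` is a hypothetical default; the real package would use
    whatever cache location it standardizes on.
    """
    root = Path(cache_dir).expanduser()
    if not root.is_dir():
        return []
    # Treat each immediate subdirectory as one locally available model.
    return sorted(p.name for p in root.iterdir() if p.is_dir())
```

The listing step is also a natural place to print the local-inference note mentioned above, since it is the moment the user is explicitly working offline.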
Some folks might download models onto their computers via paths different from those that {transformers} or other LLM modules use.
To avoid downloading a model multiple times, some users might prefer to provide a path to their model instead.
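The name-or-path dispatch described above could be sketched roughly like this. This is only an illustration of the proposed behavior, not existing code: `download` stands in for whatever backend would actually fetch the weights, and the cache layout is an assumption.

```python
from pathlib import Path

def model_pull(model: str, cache_dir: str = "~/.cache/models", download=None) -> Path:
    """Resolve `model` to a local directory, downloading only if needed.

    If `model` is an existing (absolute or relative) path, reuse it as-is,
    so a model already on disk is never downloaded a second time.
    Otherwise treat `model` as a model name and fetch it into the cache.
    """
    candidate = Path(model).expanduser()
    if candidate.exists():
        # User-supplied path: no network access, no duplicate copy.
        return candidate.resolve()
    # Map a name like "org/model" to a single cache subdirectory.
    target = Path(cache_dir).expanduser() / model.replace("/", "--")
    if not target.exists():
        if download is None:
            raise RuntimeError(f"no downloader configured for {model!r}")
        download(model, target)  # placeholder for the real fetch step
    return target
```

Checking the cache before calling the downloader gives `ollama pull`-like idempotence: pulling the same name twice downloads once.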