Hi.
As I said, I tested 2.0.6 and 2.1.0 this weekend, and both work great. No problems!

Now about the LLM: Ollama is very slow on a VPS with no GPU. Could you add support for the Groq API? https://console.groq.com/docs/models https://console.groq.com/docs/api-reference#chat It is free for now; only registration is required.

Thanks for a great project!
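For context, the Groq docs linked above describe an OpenAI-compatible chat completions endpoint, so a minimal client call could look like the sketch below. This is only an illustration of the API shape, not reNgine-ng code; the model name and the `GROQ_API_KEY` environment variable are assumptions.

```python
# Minimal sketch of calling Groq's OpenAI-compatible chat completions
# endpoint (see https://console.groq.com/docs/api-reference#chat).
# Model name and env var are illustrative assumptions.
import json
import os
import urllib.request

GROQ_URL = "https://api.groq.com/openai/v1/chat/completions"

def build_request(prompt: str, model: str = "llama3-70b-8192") -> urllib.request.Request:
    """Build the HTTP request for a single chat completion."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Authorization": f"Bearer {os.environ.get('GROQ_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        GROQ_URL, data=json.dumps(payload).encode(), headers=headers
    )

# Only attempt the network call when a key is actually configured.
if os.environ.get("GROQ_API_KEY"):
    with urllib.request.urlopen(build_request("Hello")) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```

Since the endpoint follows the OpenAI request/response schema, existing OpenAI-style client code could likely be reused by pointing the base URL at Groq.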
Glad to hear that 😉

> Now about LLM. ollama is very slow on VPS, no GPU. Can you add Groq API? It is free for now, only need registration.

At the moment our goal is to stabilize reNgine-ng, but we have added your request to the backlog and will get back to you later.

If you think you are able to do this and submit a PR, go for it!