Releases: TabbyML/tabby
v0.21.0-rc.3
v0.21.0-rc.2
nightly
fix: should check local model before resolving model id (#3470)
- fix: should check local model before resolving model id
- chore: add change log
- chore: add github id for change log
v0.21.0-rc.1
v0.21.0-rc.0
[email protected]
chore(intellij): bump intellij plugin version to 1.8.6.
v0.20.0
🚀 Features
- Search results can now be edited directly.
- Allow switching backend chat models in Answer Engine.
- Added a connection test button in the System tab to test the connection to the backend LLM server.
🧰 Fixes and Improvements
- Optimized CR-LF inference in code completion. (#3279)
- Bumped llama.cpp version to b3995.
Full Changelog: v0.19.0...v0.20.0
v0.20.0-rc.3
Release 0.20.0-rc.3. Generated by cargo-workspaces.
v0.20.0-rc.2
v0.20.0-rc.1