
Releases: TabbyML/tabby

v0.21.0-rc.3 (Pre-release)

27 Nov 08:34

v0.21.0-rc.2 (Pre-release)

26 Nov 08:19

nightly (Pre-release)

08 Sep 01:39 Β· b6b8d60
fix: should check local model before resolving model id (#3470)

* fix: should check local model before resolving model id
* chore: add change log
* chore: add github id for change log

v0.21.0-rc.1 (Pre-release)

21 Nov 12:12

v0.21.0-rc.0 (Pre-release)

19 Nov 04:53

[email protected]

14 Nov 04:49
chore(intellij): bump intellij plugin version to 1.8.6.

v0.20.0

10 Nov 06:19

πŸš€ Features

  • Search results can now be edited directly.
  • Allow switching backend chat models in Answer Engine.
  • Added a connection test button in the System tab to test the connection to the backend LLM server.

🧰 Fixes and Improvements

  • Optimized CR-LF inference in code completion. (#3279)
  • Bumped llama.cpp version to b3995.

Full Changelog: v0.19.0...v0.20.0

v0.20.0-rc.3 (Pre-release)

08 Nov 23:17

v0.20.0-rc.2 (Pre-release)

07 Nov 01:52

v0.20.0-rc.1 (Pre-release)

05 Nov 16:57