Releases · TabbyML/tabby
nightly
feat(ui): integrate author for github issues/PRs (#3500)
- feat(ui): display information of pr/issue author
- update: add hover card
- update: mock
- update: fetch user by email
- [autofix.ci] apply automated fixes

Co-authored-by: autofix-ci[bot] <114827586+autofix-ci[bot]@users.noreply.github.com>
v0.21.2-rc.2
v0.21.1
⚠️ Notice
- This is a patch release; please also check the full release notes for 0.21.
🧰 Fixes and Improvements
- Fixed the GitLab context provider.
v0.21.1-rc.0
v0.21.0
⚠️ Notice
- Due to changes in the indexing format, the `~/.tabby/index` directory will be automatically removed before any further indexing jobs run. Indexing jobs are expected to run again from scratch (rather than incrementally) after the upgrade.
🚀 Features
- Support connecting to a llamafile model backend (see the config sketch after this list).
- Display the Open / Closed state for issues and pull requests in the Answer Engine context card.
- Support deleting the entire thread in Answer Engine.
- Add rate limiter options for HTTP-powered model backends (also sketched below).
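As a rough illustration of the two backend-related items above, the sketch below wires a locally running llamafile server into Tabby's HTTP model backend and attaches a rate limit to it in `~/.tabby/config.toml`. The section layout follows Tabby's HTTP backend configuration, but the specific `kind` values, the llamafile endpoint, and the `rate_limit.request_per_minute` key are assumptions here; check the model configuration docs for your release before copying.

```toml
# Hedged sketch only: point Tabby at a llamafile server through the HTTP model
# backend and cap request throughput with the new rate limiter options.
# Key names and `kind` values below are assumptions; verify against the docs.

[model.completion.http]
kind = "llama.cpp/completion"              # llamafile embeds a llama.cpp-style server
api_endpoint = "http://localhost:8080"     # assumed default llamafile port
prompt_template = "<PRE> {prefix} <SUF>{suffix} <MID>"  # FIM template; depends on the model loaded

[model.chat.http]
kind = "openai/chat"                       # llamafile also exposes an OpenAI-compatible API
model_name = "local-llamafile"             # placeholder name
api_endpoint = "http://localhost:8080/v1"
# Rate limiter option for HTTP-powered backends (new in 0.21 per the notes above);
# the exact key name is an assumption.
rate_limit.request_per_minute = 600
```

Since llamafile bundles a llama.cpp-compatible HTTP server as well as an OpenAI-compatible API, either family of `kind` values could plausibly apply; treat the split shown here as one possible arrangement rather than the documented one.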
🧰 Fixes and Improvements
- Fixed a panic that occurred when specifying a local model (#3464).
- Add pagination to Answer Engine threads.
- Fix Vulkan binary distributions.
- Improve the retry logic for chunk embedding computation in the indexing job.
💫 New Contributors
- @emmanuel-ferdman made their first contribution in #3459
Full Changelog: v0.20.0...v0.21.0
v0.21.0-rc.7
v0.21.0-rc.6
v0.21.0-rc.5
v0.21.0-rc.4
[email protected]
chore(intellij): bump intellij plugin version to 1.9.0.