Tabby is a self-hosted AI coding assistant, offering an open-source and on-premises alternative to GitHub Copilot. Its key features include:

- A self-contained server, with no need for a DBMS or cloud service.
- An OpenAPI interface that makes it easy to integrate with existing infrastructure.
- Support for consumer-grade GPUs.
The easiest way to start a Tabby server is by using the following Docker command:
```shell
docker run -it \
  --gpus all -p 8080:8080 -v $HOME/.tabby:/data \
  tabbyml/tabby \
  serve --model TabbyML/SantaCoder-1B --device cuda
```
For additional options (e.g. inference type, parallelism), please refer to the documentation at https://tabbyml.github.io/tabby.
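Once the server is running, you can exercise its HTTP API directly. The sketch below is hedged: the `/v1/completions` endpoint path and the payload shape are assumptions based on Tabby's OpenAPI interface, so verify them against the docs served by your own instance.

```shell
# Assumed request shape for Tabby's completion endpoint; check your
# instance's OpenAPI docs before relying on these field names.
PAYLOAD='{"language": "python", "segments": {"prefix": "def fib(n):"}}'

# Validate the payload locally before sending it anywhere
echo "$PAYLOAD" | python3 -m json.tool > /dev/null && echo "payload ok"

# Send it to a running server (uncomment once the server is up):
# curl -s -X POST http://localhost:8080/v1/completions \
#   -H 'Content-Type: application/json' -d "$PAYLOAD"
```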
To build Tabby from source, first clone the repository:

```shell
git clone --recurse-submodules https://github.com/TabbyML/tabby
cd tabby
```
Set up the Rust environment by following this tutorial.
Install the required dependencies:
```shell
# For macOS
brew install protobuf

# For Ubuntu / Debian
apt-get install protobuf-compiler libopenblas-dev
```
Then build the project:

```shell
cargo build
```
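Before (or after) building, a quick sanity check that the prerequisites are on your `PATH` can save a failed compile. This is a minimal sketch; the `serve` flags in the commented-out line simply mirror the Docker example above and are assumptions for a local run.

```shell
# Confirm the build prerequisites are installed and on PATH
for tool in cargo protoc; do
  command -v "$tool" > /dev/null && echo "$tool found" || echo "$tool missing"
done

# Launch the locally built server with the same flags as the Docker example
# (uncomment once the build succeeds; flags are assumptions):
# cargo run --release -- serve --model TabbyML/SantaCoder-1B --device cuda
```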
... and don't forget to submit a Pull Request!