
Commit 984d522

docs: add execution permission to the llama-server when running tabby on Linux (#2549)

The user should grant execute permission to the `llama-server` binary when running Tabby from the Linux standalone install. Otherwise, the application will crash with a permission error:

```sh
The application panicked (crashed).
Message:  Failed to start llama-server <embedding> with command Command { std: "/home/<user>/tabby/dist/tabby/llama-server" "-m" "/home/<user>/.tabby/models/TabbyML/Nomic-Embed-Text/ggml/model.gguf" "--cont-batching" "--port" "30888" "-np" "1" "--log-disable" "--ctx-size" "4096" "-ngl" "9999" "--embedding" "--ubatch-size" "4096", kill_on_drop: true }: Permission denied (os error 13)
Location: crates/llama-cpp-server/src/supervisor.rs:80
```
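The fix amounts to granting execute permission to both binaries shipped in the standalone archive. A minimal sketch of the steps, using an illustrative `/tmp/tabby-demo` path in place of a real install location:

```shell
# Simulate the unzipped standalone layout (illustrative path,
# not a real Tabby install), then grant execute permission to
# both binaries, as the updated docs instruct.
mkdir -p /tmp/tabby-demo/dist/tabby
touch /tmp/tabby-demo/dist/tabby/tabby /tmp/tabby-demo/dist/tabby/llama-server
chmod +x /tmp/tabby-demo/dist/tabby/tabby /tmp/tabby-demo/dist/tabby/llama-server

# Verify: the execute bit is now set on llama-server.
test -x /tmp/tabby-demo/dist/tabby/llama-server && echo "llama-server is executable"
```

Without the execute bit on `llama-server`, spawning it fails with `Permission denied (os error 13)`, which is exactly the panic shown above.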
Srkl authored Jun 29, 2024
1 parent 6ccbcdc commit 984d522
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions website/docs/quick-start/installation/linux/index.mdx
@@ -31,7 +31,7 @@ Running Tabby on Linux using Tabby's standalone executable distribution.
## Find the Linux executable file
* Unzip the file you downloaded. The `tabby` executable will be in a subdirectory of dist.
* Change to this subdirectory or relocate `tabby` to a folder of your choice.
-* Make it executable: `chmod +x tabby`
+* Make it executable: `chmod +x tabby llama-server`

Run the following command:
@@ -46,4 +46,4 @@ You can choose different models, as shown in [the model registry](https://tabby.
You should see a success message similar to the one in the screenshot below. After that, you can visit http://localhost:8080 to access your Tabby instance.
<div align="left">
<img src={successImage} alt="Linux running success" style={{ width: 800 }} />
-</div>
+</div>
