After setting up Tabby with the Docker Compose file from the quick start, I wanted to look into implementing chat completion for my favourite editor, Neovim. I wanted to use codecompanion.nvim for this, as that plugin makes it easy to specify new adapters. However, I got stuck figuring out the HTTP API for sending basic chat completion requests. Can someone please help me make this first step?
Replies: 2 comments
The chat completions API is an OpenAI-like interface. Could you try the following curl command (with your bearer token):
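(A minimal sketch of such a request, assuming Tabby is served at http://localhost:8080 as in the quick-start compose file; the token below is a placeholder for your own value.)

```sh
curl -X POST http://localhost:8080/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_TOKEN" \
  -d '{
    "messages": [
      { "role": "user", "content": "Say hello." }
    ]
  }'
```

Depending on your setup, you may also need to include a "model" field in the JSON body, as with other OpenAI-compatible endpoints.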
Great, thank you, I managed to configure codecompanion.nvim for chat usage with this configuration:

```lua
{
  "olimorris/codecompanion.nvim",
  dependencies = {
    "nvim-lua/plenary.nvim",
    "nvim-treesitter/nvim-treesitter",
  },
  config = function()
    require("codecompanion").setup({
      strategies = {
        chat = { adapter = "tabby" },
        inline = { adapter = "tabby" },
      },
      adapters = {
        tabby = function()
          return require("codecompanion.adapters").extend("openai_compatible", {
            env = {
              url = "http://localhost:8080", -- optional: default value is ollama url http://127.0.0.1:11434
              api_key = "auth_881a958cbb454e1aa87a3fa17e8e9649", -- optional: if your endpoint is authenticated
              chat_url = "/v1/chat/completions", -- optional: default value, override if different
            },
            schema = {
              model = {
                default = "Qwen2-1.5B-Instruct",
                choices = { "Qwen2-1.5B-Instruct" },
              },
            },
          })
        end,
      },
    })
  end,
}
```

One problem was that Tabby doesn't provide a