Merge pull request #79 from better629/main
update open_llm config
better629 authored Feb 18, 2024
2 parents 2987e6c + f0d129c commit f90b218
Showing 2 changed files with 8 additions and 8 deletions.
10 changes: 5 additions & 5 deletions src/en/guide/tutorials/integration_with_open_llm.md
@@ -249,12 +249,12 @@ Such as LLaMA-Factory, FastChat, vllm openai compatible interface

```diff
 llm:
-  llm_type: 'open_llm'
-  base_url: 'http://106.75.10.65:8001/v1'
+  api_type: 'open_llm'
+  base_url: 'http://106.75.10.xxx:8000/v1'
   model: 'llama2-13b'
```
- The complete route of the openapi chat interface is `http://0.0.0.0:8000/v1/chat/completions`; `base_url` only needs to be configured to `http://0.0.0.0:8000/v1`, and the openai sdk fills in the remaining part itself.
+ The complete route of the openapi chat interface is `http://0.0.0.0:8000/v1/chat/completions`; `base_url` only needs to be configured to `http://0.0.0.0:8000/v1`, and the openai sdk fills in the remaining part itself. `model` is the actual value of the request parameter `model`.
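To illustrate the route completion described above, here is a minimal sketch: the sdk appends the `/chat/completions` path to the configured `base_url`. The `build_chat_url` helper is hypothetical and only mimics this joining; it is not MetaGPT or openai sdk code.

```python
def build_chat_url(base_url: str) -> str:
    """Append the chat completions path to a configured base_url,
    mirroring the route completion the openai sdk performs."""
    # Normalize a possible trailing slash so the join is stable.
    return base_url.rstrip("/") + "/chat/completions"

# With the base_url configured above:
print(build_chat_url("http://0.0.0.0:8000/v1"))
# -> http://0.0.0.0:8000/v1/chat/completions
```

This is why only the `/v1` prefix goes into the config: the endpoint path is a fixed suffix supplied by the client.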

#### ollama api interface

@@ -264,12 +264,12 @@ Such as model services deployed through ollama

```diff
 llm:
-  llm_type: 'ollama'
+  api_type: 'ollama'
   base_url: 'http://127.0.0.1:11434/api'
   model: 'llama2'
```

- The complete route of the ollama chat interface is `http://127.0.0.1:11434/api/chat`; `base_url` only needs to be configured to `http://127.0.0.1:11434/api`, and the remaining part is filled in by `OllamaGPTAPI`.
+ The complete route of the ollama chat interface is `http://127.0.0.1:11434/api/chat`; `base_url` only needs to be configured to `http://127.0.0.1:11434/api`, and the remaining part is filled in by `OllamaLLM`. `model` is the actual value of the request parameter `model`.
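For reference, a request body for the `/api/chat` route above can be built as follows. This is a sketch assuming the standard ollama chat fields (`model`, `messages`, `stream`); `build_ollama_chat_body` is a hypothetical helper, not part of `OllamaLLM`.

```python
import json

def build_ollama_chat_body(model: str, prompt: str) -> str:
    """Build the JSON body for ollama's /api/chat endpoint."""
    body = {
        # `model` is the same value configured in the yaml above.
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        # Disable streaming so the server returns a single response object.
        "stream": False,
    }
    return json.dumps(body)

# e.g. POST this body to http://127.0.0.1:11434/api/chat
print(build_ollama_chat_body("llama2", "hi there"))
```

The configured `model` value is passed through verbatim, which is why it must match a model name the ollama server actually has pulled.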

## Optional, repair LLM output

6 changes: 3 additions & 3 deletions src/zh/guide/tutorials/integration_with_open_llm.md
@@ -251,11 +251,11 @@ curl -X POST http://localhost:11434/api/chat -d '{
```diff
 llm:
   api_type: open_llm
-  base_url: 'http://106.75.10.65:8001/v1'
+  base_url: 'http://106.75.10.xxx:8000/v1'
   model: 'llama2-13b'
```
- The complete route of the openapi chat interface is `http://0.0.0.0:8000/v1/chat/completions`; `base_url` only needs to be configured to `http://0.0.0.0:8000/v1`, and the openai sdk fills in the rest.
+ The complete route of the openapi chat interface is `http://0.0.0.0:8000/v1/chat/completions`; `base_url` only needs to be configured to `http://0.0.0.0:8000/v1`, and the openai sdk fills in the rest. `model` is the actual value of the request parameter `model`.

#### ollama api interface

@@ -270,7 +270,7 @@ llm:
model: 'llama2'
```

- The complete route of the ollama chat interface is `http://127.0.0.1:11434/api/chat`; `base_url` only needs to be configured to `http://127.0.0.1:11434/api`, and the remaining part is filled in by `OllamaGPTAPI`.
+ The complete route of the ollama chat interface is `http://127.0.0.1:11434/api/chat`; `base_url` only needs to be configured to `http://127.0.0.1:11434/api`, and the remaining part is filled in by `OllamaLLM`. `model` is the actual value of the request parameter `model`.

## Optional, repair LLM output

