Merge pull request #55 from justinh-rahb/patch-1
tjbck authored May 1, 2024
2 parents 886f17b + fd2f957 commit 39b40db
Showing 2 changed files with 12 additions and 41 deletions.
49 changes: 10 additions & 39 deletions docs/getting-started/index.md
@@ -108,7 +108,13 @@ When using Docker to install Open WebUI, make sure to include the `-v open-webui
```bash
docker run -d -p 3000:8080 -e OLLAMA_BASE_URL=https://example.com -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
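Before starting the container, it can help to confirm that the external Ollama server named in `OLLAMA_BASE_URL` is actually reachable from the Docker host. The sketch below is illustrative rather than part of the official steps; `https://example.com` is the placeholder URL from the command above.

```bash
# Query Ollama's model-list endpoint; a JSON response means the server is reachable.
# Replace the placeholder URL with your real OLLAMA_BASE_URL.
curl -fsS https://example.com/api/tags
```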
### Installing Both Open WebUI and Ollama Together
- **To run Open WebUI with Nvidia GPU support**, use this command:
```bash
docker run -d -p 3000:8080 --gpus all --add-host=host.docker.internal:host-gateway -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:cuda
```
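To confirm the container can actually see the GPU, a quick check is to run `nvidia-smi` inside it. This is a hedged sketch: it assumes the NVIDIA Container Toolkit is installed and mounts the driver utilities into containers started with `--gpus all`.

```bash
# List the GPUs visible inside the running Open WebUI container.
docker exec open-webui nvidia-smi
```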
### Installing Ollama
- **With GPU Support**, use this command:
@@ -122,44 +128,9 @@ When using Docker to install Open WebUI, make sure to include the `-v open-webui
```bash
docker run -d -p 3000:8080 -v ollama:/root/.ollama -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:ollama
```
- After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
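If the page does not load, the standard Docker commands below are a quick way to verify that the container started and to inspect its logs. This is a general troubleshooting sketch, not an additional installation step.

```bash
# Confirm the container is running and see which ports are published.
docker ps --filter name=open-webui

# Follow the container logs to spot startup errors.
docker logs -f open-webui
```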
### GPU Support
#### Nvidia CUDA
To run Ollama with Nvidia GPU support, use the NVIDIA Container Toolkit (formerly nvidia-docker) for GPU access and set the appropriate environment variables for CUDA support:
```bash
docker run -d -p 3000:8080 \
--gpus all \
--add-host=host.docker.internal:host-gateway \
--volume open-webui:/app/backend/data \
--name open-webui \
--restart always \
ghcr.io/open-webui/open-webui:main
```
#### AMD ROCm
To run Ollama with AMD GPU support, set the `HSA_OVERRIDE_GFX_VERSION` environment variable and ensure the Docker container can access the GPU:
```bash
docker run -d -p 3000:8080 \
-e HSA_OVERRIDE_GFX_VERSION=11.0.0 \
--device /dev/kfd \
--device /dev/dri \
--group-add video \
--add-host=host.docker.internal:host-gateway \
--volume open-webui:/app/backend/data \
--name open-webui \
--restart always \
ghcr.io/open-webui/open-webui:main
```
Replace `HSA_OVERRIDE_GFX_VERSION=11.0.0` with the version appropriate for your AMD GPU model as described in the earlier sections. This command ensures compatibility and optimal performance with AMD GPUs.
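To find the right value for your card, one approach is to read the GPU's `gfx` ISA target with ROCm's `rocminfo` and translate it into a major.minor.patch override (for example, `gfx1030` is commonly mapped to `10.3.0` and `gfx1100` to `11.0.0`). This is an illustrative sketch and assumes the ROCm tools are installed on the host.

```bash
# Print the ISA target(s) of the installed AMD GPU, e.g. "gfx1030".
rocminfo | grep -i "gfx"
```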
After installation, you can access Open WebUI at [http://localhost:3000](http://localhost:3000). Enjoy! 😄
#### Open WebUI: Server Connection Error
### Open WebUI: Server Connection Error
Encountering connection issues between the Open WebUI Docker container and the Ollama server? This problem often arises because distro-packaged versions of Docker—like those from the Ubuntu repository—do not support the `host.docker.internal` alias for reaching the host directly. Inside a container, referring to `localhost` or `127.0.0.1` typically points back to the container itself, not the host machine.
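One commonly used workaround is to run the container with host networking, so that `localhost` inside the container refers to the host machine itself. The sketch below is illustrative and assumes Ollama is listening on its default port 11434; note that `--network=host` bypasses port publishing, so Open WebUI is then reached on port 8080 rather than 3000.

```bash
# Share the host's network namespace and point Open WebUI at the local Ollama API.
docker run -d --network=host \
  -v open-webui:/app/backend/data \
  -e OLLAMA_BASE_URL=http://127.0.0.1:11434 \
  --name open-webui \
  --restart always \
  ghcr.io/open-webui/open-webui:main
```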
@@ -183,7 +154,7 @@ For more details on networking in Docker and addressing common connectivity issu
```bash
docker compose up -d --build
```
- **For GPU Support:** Use an additional Docker Compose file:
- **For Nvidia GPU Support:** Use an additional Docker Compose file:
```bash
docker compose -f docker-compose.yaml -f docker-compose.gpu.yaml up -d --build
```
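Before bringing the stack up, `docker compose config` can be used to preview the merged result of the two files and confirm that the GPU settings from the override file are applied as expected. This is a general Compose tip rather than a step from the guide.

```bash
# Render the merged configuration from both Compose files without starting anything.
docker compose -f docker-compose.yaml -f docker-compose.gpu.yaml config
```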
4 changes: 2 additions & 2 deletions docs/tutorial/images.md
@@ -20,7 +20,7 @@ Open WebUI supports image generation through the **AUTOMATIC1111** [API](https:/
3. For a Docker installation of WebUI with the environment variables preset, use the following command:
```
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e AUTOMATIC1111_BASE_URL=http://host.docker.internal:7860/ -e IMAGE_GENERATION_ENABLED=True -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e AUTOMATIC1111_BASE_URL=http://host.docker.internal:7860/ -e ENABLE_IMAGE_GENERATION=True -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
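Before configuring Open WebUI, it can be worth confirming that the AUTOMATIC1111 API is reachable from the host. A minimal sketch, assuming AUTOMATIC1111 was launched with its API enabled (the `--api` flag) and is listening on port 7860.

```bash
# List the models known to AUTOMATIC1111; a JSON array means the API is up.
curl -fsS http://localhost:7860/sdapi/v1/sd-models
```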

### Configuring Open WebUI
@@ -49,7 +49,7 @@ ComfyUI provides an alternative interface for managing and interacting with imag
3. For a Docker installation of WebUI with the environment variables preset, use the following command:
```
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e COMFYUI_BASE_URL=http://host.docker.internal:7860/ -e IMAGE_GENERATION_ENABLED=True -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
docker run -d -p 3000:8080 --add-host=host.docker.internal:host-gateway -e COMFYUI_BASE_URL=http://host.docker.internal:7860/ -e ENABLE_IMAGE_GENERATION=True -v open-webui:/app/backend/data --name open-webui --restart always ghcr.io/open-webui/open-webui:main
```
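Similarly, ComfyUI exposes a small HTTP API that can be used to verify the server is up before pointing `COMFYUI_BASE_URL` at it. A hedged sketch, assuming ComfyUI is listening on the address and port used in the command above.

```bash
# ComfyUI returns basic system information from this endpoint when it is running.
curl -fsS http://localhost:7860/system_stats
```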

### Configuring Open WebUI