An all-in-one Docker Compose configuration that provides access to local and external LLMs through multiple chat interfaces.
- Caddy: Acts as the central entrypoint for the whole stack
- Ollama: Serves local LLM models
- LiteLLM: OpenAI-compatible API proxy for Ollama-served local models and upstream provider models
- Multiple ChatGPT-style web interfaces for interacting with the LLM models
Available models:
- Local
  - local-mistral
  - local-mixtral-8x7b
  - local-llama3-8b
- OpenAI
  - openai-gpt-3.5-turbo
  - openai-gpt-4-turbo
  - openai-gpt-4o
- Google
  - google-gemini-1.5-pro
- Anthropic
  - anthropic-claude-3-sonnet
  - anthropic-claude-3-opus
- Groq
  - groq-llama3-70b
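The prefixed names above are the aliases under which LiteLLM exposes each backend. As a rough sketch of how such a mapping could look in a LiteLLM proxy `config.yaml` (the `model_list`/`litellm_params` keys and the `os.environ/` syntax follow LiteLLM's config format; the Ollama hostname and the exact entries in this repo are assumptions):

```yaml
model_list:
  # Local model served by the Ollama container (hostname "ollama" assumed)
  - model_name: local-mistral
    litellm_params:
      model: ollama/mistral
      api_base: http://ollama:11434
  # Upstream provider, keyed via the .env file
  - model_name: openai-gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
```

Every alias, local or upstream, is then reachable through the same OpenAI-compatible endpoint.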
Prerequisites:
- Docker
- Docker Compose
- Git
- Clone this repository
- Copy the default config:
  ```shell
  cp default.env .env
  ```
- Edit `.env` and add the relevant API keys
- Start the Docker Compose configuration:
  ```shell
  docker-compose up
  ```
- Access the Caddy webserver at http://localhost:3000
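Once the stack is up, the LiteLLM proxy speaks the standard OpenAI chat-completions protocol, so any OpenAI-compatible client can point at it. A minimal stdlib-only sketch (the base URL and the `/v1` path prefix are assumptions based on the default Caddy port above; adjust them to your routing):

```python
import json
import urllib.request

BASE_URL = "http://localhost:3000/v1"  # assumed: Caddy routes /v1 to LiteLLM

def build_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-style chat-completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def chat(model: str, prompt: str) -> str:
    """POST the payload and return the assistant's reply."""
    req = urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(build_request(model, prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.loads(resp.read())
    return body["choices"][0]["message"]["content"]
```

For example, `chat("local-mistral", "Hello!")` would be routed by LiteLLM to the Ollama container, while `chat("openai-gpt-4o", "Hello!")` would go upstream using the key from `.env`.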