A simple Windows batch script that automates the installation and configuration of Ollama with CORS enabled for client-side applications.
Running Large Language Models (LLMs) locally offers several advantages:
- 🔒 Complete privacy - your data never leaves your machine
- 💨 Fast response times - no internet latency
- 💰 No API costs or usage limits
- 🔌 Works offline
By default, however, Ollama's CORS (Cross-Origin Resource Sharing) settings prevent web applications from calling it directly: requests from pages served on other origins are rejected. This creates friction for anyone trying to interact with Ollama through a browser-based application.
- 📥 Downloads the official Ollama installer
- 💿 Performs a silent installation
- ⚙️ Configures CORS settings automatically
- 🤖 Downloads the llama3 model
- 🚀 Starts Ollama with the correct settings
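Internally, the steps above boil down to a handful of commands. The sketch below shows the general shape; the installer URL, the `/SILENT` flag, and the exact sequencing are assumptions based on Ollama's published Windows installer, not an excerpt from `install-ollama.bat` itself:

```bat
@echo off
REM Sketch of the automated steps (assumed URL and flags; check the
REM actual install-ollama.bat before relying on these).

REM 1. Download the official installer (URL is an assumption).
curl -L -o OllamaSetup.exe https://ollama.com/download/OllamaSetup.exe

REM 2. Silent install.
OllamaSetup.exe /SILENT

REM 3. Allow all origins; /M writes a system (machine-level) variable,
REM    which is why the script needs administrator privileges.
setx OLLAMA_ORIGINS "*" /M

REM 4. Pull the base model and start the server.
ollama pull llama3
ollama serve
```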
- Download `install-ollama.bat`
- Right-click and select "Run as administrator"
- Wait for the installation to complete
- That's it! Your Ollama installation is now configured for web access
The script:
- Sets `OLLAMA_ORIGINS="*"` as a system environment variable
- Manages Ollama processes to ensure a clean configuration
- Installs and configures everything with minimal user interaction
This configuration allows any webpage to access your local Ollama instance. While this enables easy integration with web applications, be aware of the security implications. Only use this setup if you trust the web applications accessing Ollama.
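If you want web access without opening Ollama to every site, `OLLAMA_ORIGINS` also accepts a comma-separated list of specific origins instead of `"*"`. A minimal example (the URLs are hypothetical placeholders for your own app's origin):

```bat
REM Allow only a specific origin instead of all of them
REM (https://myapp.example.com is a hypothetical placeholder).
setx OLLAMA_ORIGINS "https://myapp.example.com" /M

REM Multiple origins are comma-separated, e.g.:
REM setx OLLAMA_ORIGINS "http://localhost:3000,https://myapp.example.com" /M
```

Note that a running Ollama instance must be restarted before it picks up the new variable.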
- Windows 10 or 11
- Administrator privileges (for installation)
- ~5GB free space (for Ollama and base model)
If you encounter any issues:
- Check that you ran the script as administrator
- Verify Ollama is running (`netstat -ano | findstr :11434`)
- Check that the model was downloaded (`ollama list`)
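A quick way to confirm the CORS change took effect is to send a request with an explicit `Origin` header; Ollama rejects requests from disallowed origins, so a successful response suggests the `"*"` setting is active. The origin value below is arbitrary:

```bat
REM Windows 10 and later ship curl.exe; the Origin value is arbitrary.
curl -i http://localhost:11434/api/tags -H "Origin: https://example.com"

REM A 200 response with a JSON model list means cross-origin access works;
REM a 403 means the OLLAMA_ORIGINS setting has not taken effect yet
REM (open a new terminal or restart Ollama and try again).
```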