
🚀 Local DeepSeek-R1 Power with Ollama!

Released by @warmshao on 28 Jan, 12:52 · commit 0c9cb9b

Hey everyone,

We've just rolled out a new release packed with awesome updates:

  1. Browser-Use Upgrade: We're now fully compatible with the latest browser-use release, version 0.1.29! 🎉
  2. Local Ollama Integration: Get ready for completely local and private AI with support for the incredible deepseek-r1 model via Ollama! 🏠

Before You Dive In:

  • Update Code: Don't forget to `git pull` to grab the latest code changes.
  • Reinstall Dependencies: Run `pip install -r requirements.txt` to ensure all your dependencies are up to date (the snippet below puts both steps together).
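
If it helps, here's the full update sequence in one place: a minimal sketch that assumes you're working from an existing clone and installing into whatever Python environment you set up originally (the virtualenv line is purely illustrative).

```bash
# From the root of your existing clone
git pull

# Optional: activate the environment you originally installed into
# (a virtualenv is shown here only as an example)
source .venv/bin/activate

# Refresh dependencies to pick up the browser-use 0.1.29 bump
pip install -r requirements.txt
```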

Important Notes on deepseek-r1:

  • Model Size Matters: We've found that deepseek-r1:14b and larger models work exceptionally well! Smaller models may not provide the best experience, so we recommend sticking with the larger options (the snippet below shows how to pull one ahead of time). 🤔
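
To pre-download one of the larger variants before launching anything, you can pull it explicitly. The 14b and 32b tags shown below are published in the Ollama library; check the library page for other sizes that fit your hardware.

```bash
# Pull the recommended 14B variant (or a larger one if your hardware allows)
ollama pull deepseek-r1:14b
# ollama pull deepseek-r1:32b

# Confirm which models are available locally
ollama list
```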

How to Get Started with Ollama and deepseek-r1:

  1. Install Ollama: Head over to https://ollama.com and download/install Ollama on your system. 💻
  2. Run deepseek-r1: Open your terminal and run `ollama run deepseek-r1:14b` (or a larger model if you prefer).
  3. WebUI Setup: Launch the WebUI following the instructions. Here's a crucial step: Uncheck "Use Vision" and set "Max Actions per Step" to 1. ✅
  4. Enjoy! You're now all set to experience the power of local deepseek-r1. Have fun! 🥳 (A condensed command-line version of these steps follows below.)
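
For reference, here's the same flow condensed into a terminal-first sketch. The `webui.py` entry point, the `--ip`/`--port` flags, and the exact labels of the WebUI settings are assumptions based on the project README, so adjust them to match your checkout.

```bash
# 1. Grab and test the model locally (pulls it on first run; type /bye to exit the chat)
ollama run deepseek-r1:14b

# 2. Launch the WebUI from the repo root
#    (entry point and flags taken from the project README; adjust if yours differ)
python webui.py --ip 127.0.0.1 --port 7788

# 3. In the WebUI settings, select Ollama as the provider with model deepseek-r1:14b,
#    uncheck "Use Vision", and set "Max Actions per Step" to 1.
```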

Happy Chinese New Year! 🏮