Merge pull request #16 from jonfairbanks/develop
Further fixes for Docker + Windows
jonfairbanks authored Feb 28, 2024
2 parents e2acb78 + 1bb8dfb commit 09b38a8
Showing 5 changed files with 24 additions and 12 deletions.
.dockerignore: 3 changes (2 additions, 1 deletion)

@@ -14,4 +14,5 @@ logo.*
 *.sample
 .env*
 Dockerfile
-docker-compose.yml
\ No newline at end of file
+docker-compose.yml
+*.log
Dockerfile: 10 changes (8 additions, 2 deletions)

@@ -33,6 +33,12 @@ USER appuser
 # Install application into container
 COPY . .
 
+# Expose the Streamlit port
+EXPOSE 8501
+
+# Setup a health check against Streamlit
+HEALTHCHECK CMD curl --fail http://localhost:8501/_stcore/health
+
 # Run the application
-ENTRYPOINT ["python", "-m", "streamlit"]
-CMD ["run", "main.py"]
+ENTRYPOINT [ "python", "-m", "streamlit" ]
+CMD ["run", "main.py", "--server.port=8501", "--server.address=0.0.0.0"]
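The new HEALTHCHECK polls Streamlit's built-in health endpoint, and binding to 0.0.0.0 is what makes the app reachable from outside the container. The same probe the Dockerfile runs with curl can be sketched from the host in Python; the function name and timeout below are illustrative, not part of the repository:

```python
import urllib.request
import urllib.error


def streamlit_healthy(url: str, timeout: float = 2.0) -> bool:
    """Return True if the Streamlit health endpoint answers with HTTP 200.

    Mirrors the Dockerfile's `curl --fail` check; any connection error or
    non-2xx response counts as unhealthy.
    """
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False
```

Against a running container, `streamlit_healthy("http://localhost:8501/_stcore/health")` should return True shortly after startup; with nothing listening it returns False instead of raising.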
components/page_state.py: 11 changes (6 additions, 5 deletions)

@@ -21,15 +21,16 @@ def set_initial_state():
         try:
             models = get_models()
             st.session_state["ollama_models"] = models
-        except Exception as err:
-            logs.log.warn(
-                f"Warning: Initial loading of Ollama models failed. You might be hosting Ollama somewhere other than localhost. -- {err}"
-            )
+        except Exception:
+            st.session_state["ollama_models"] = []
+            pass
 
     if "selected_model" not in st.session_state:
-        st.session_state["selected_model"] = st.session_state["ollama_models"][0]
+        try:
+            st.session_state["selected_model"] = st.session_state["ollama_models"][0]
+        except Exception:
+            st.session_state["selected_model"] = None
+            pass
 
     if "messages" not in st.session_state:
         st.session_state["messages"] = [
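The page_state.py change swaps a logged warning for silent fallbacks: each lookup that can fail now degrades to an empty list or None instead of crashing when Ollama is unreachable. The pattern behind the selected_model block can be isolated as a small helper (the name `first_or_none` is hypothetical, not in the repository):

```python
def first_or_none(options):
    """Return the first element of a sequence, or None when the sequence
    is empty or not indexable -- the same fallback the diff applies to
    st.session_state["ollama_models"][0]."""
    try:
        return options[0]
    except (IndexError, TypeError, KeyError):
        return None
```

So `first_or_none([])` and `first_or_none(None)` both yield None, while `first_or_none(["llama2:latest"])` yields the first model name.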
docker-compose.yml: 8 changes (5 additions, 3 deletions)

@@ -3,18 +3,20 @@ services:
   local-rag:
     container_name: local-rag
     image: jonfairbanks/local-rag
-    network_mode: host
     restart: unless-stopped
     environment:
       - TZ=America/Los_Angeles
+    ports:
+      - '8501:8501/tcp'
     volumes:
-      - .:/home/appuser:rw
+      - ./data:/home/appuser/data:rw
     deploy:
       resources:
         reservations:
           devices:
             - driver: nvidia
               device_ids: ['0']
-              capabilities: [gpu]
+              capabilities: [gpu]
+
 volumes:
   local-rag: {}
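Since the service now publishes port 8501 explicitly instead of relying on host networking, readiness after `docker compose up` can be verified with a plain TCP probe against the mapped port. A minimal sketch, assuming the default mapping; the function name and timing values are illustrative:

```python
import socket
import time


def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until a TCP connection to host:port succeeds or timeout expires."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        try:
            # create_connection succeeds as soon as something is listening
            with socket.create_connection((host, port), timeout=1.0):
                return True
        except OSError:
            time.sleep(0.5)
    return False
```

Typical use would be `wait_for_port("127.0.0.1", 8501)` before opening the UI; it returns False rather than raising if the container never comes up.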
docs/todo.md: 4 changes (3 additions, 1 deletion)

@@ -17,6 +17,7 @@ Although not final, items are generally sorted from highest to lowest priority.
 - [ ] Websites
 - [x] Export Data (Chat History, ...)
 - [x] Docker Support
+- [x] Windows Support
 - [ ] Extract Metadata and Load into Index
 - [ ] Parallelize Document Embeddings
 - [ ] Swap to OpenAI compatible endpoints

@@ -31,6 +32,7 @@ Although not final, items are generally sorted from highest to lowest priority.
 - [x] Show Loaders in UI (File Uploads, Conversions, ...)
 - [x] View and Manage Imported Files
 - [x] About Tab in Sidebar w/ Resources
+- [x] Enable Caching
 - [ ] Allow Users to Set LLM Settings
 - [x] System Prompt
 - [ ] Chat Mode

@@ -56,7 +58,7 @@ Although not final, items are generally sorted from highest to lowest priority.
 
 ### Known Issues & Bugs
 
-- [ ] **HIGH PRIORITY:** Upon sending a Chat message, the File Processing expander appears to re-run itself (seems something is not using state correctly)
+- [x] Upon sending a Chat message, the File Processing expander appears to re-run itself (seems something is not using state correctly)
 - [ ] Refreshing the page loses all state (expected Streamlit behavior; need to implement local-storage)
 - [x] Files can be uploaded before Ollama config is set, leading to embedding errors
 - [x] Assuming Ollama is hosted on localhost, Models are automatically loaded and selected, but the dropdown does not render the selected option
