Allows User to Set System Prompt via "Additional Options" in Chat Interface #1353

Merged: 17 commits into main from feature/ui-set-system-prompt on Dec 10, 2023
Commits:
75ea65e
Initial attempt at exposing system prompt to UI via 'Additional Optio…
aly-shehata Nov 30, 2023
5f20fc0
Allow placeholder to change when mode is changed
aly-shehata Dec 1, 2023
1f48c55
Merge remote-tracking branch 'origin/main' into feature/ui-set-system…
aly-shehata Dec 1, 2023
922abca
Increase default lines of system prompt input to 2 lines
aly-shehata Dec 1, 2023
0698b79
Add types to new functions, make _get_default_system_prompt static, a…
aly-shehata Dec 1, 2023
d91cce0
Update UI documentation with system prompt information and examples. …
aly-shehata Dec 3, 2023
2a2e243
Update UI documentation with minor edits for clarity.
aly-shehata Dec 3, 2023
9cea043
Disable prompt entry for modes that do not support system prompts. On…
aly-shehata Dec 4, 2023
1d1f9c0
Revert unintended indentation changes in settings.py
aly-shehata Dec 4, 2023
394a955
Use updated settings field in documentation
aly-shehata Dec 4, 2023
626a9e0
Refactor code after running `make check`. Update documentation with c…
aly-shehata Dec 8, 2023
a90d700
Attempt to use <x> instead of <X> in documentation.
aly-shehata Dec 8, 2023
b53483c
Merge remote-tracking branch 'origin/main' into feature/ui-set-system…
aly-shehata Dec 8, 2023
9671748
Move default system prompt fields to UI section; Remove stale TODOs a…
aly-shehata Dec 9, 2023
d5f937e
Update ui.mdx to use {x} instead of <x>.
aly-shehata Dec 9, 2023
ce199e9
Merge remote-tracking branch 'origin' into feature/ui-set-system-prompt
aly-shehata Dec 10, 2023
26dbbe4
Update documentation: ui.mdx to use -x-, and llms.mdx to correct mod…
aly-shehata Dec 10, 2023
8 changes: 4 additions & 4 deletions fern/docs/pages/manual/ui.mdx
```diff
@@ -54,13 +54,13 @@ you have given the model. Examples of system prompts can be found

 Some interesting examples to try include:

-* You are <x>. You have all the knowledge and personality of <x>. Answer as if you were <x> using
+* You are {x}. You have all the knowledge and personality of {x}. Answer as if you were {x} using
   their manner of speaking and vocabulary.
   * Example: You are Shakespeare. You have all the knowledge and personality of Shakespeare.
     Answer as if you were Shakespeare using their manner of speaking and vocabulary.
-* You are an expert (at) <role>. Answer all questions using your expertise on <specific domain topic>.
+* You are an expert (at) {role}. Answer all questions using your expertise on {specific domain topic}.
   * Example: You are an expert software engineer. Answer all questions using your expertise on Python.
-* You are a <role> bot, respond with <response criteria> needed. If no <response criteria> is needed,
-  respond with <alternate response>
+* You are a {role} bot, respond with {response criteria} needed. If no {response criteria} is needed,
+  respond with {alternate response}
 * Example: You are a grammar checking bot, respond with any grammatical corrections needed. If no corrections
   are needed, respond with "verified".
```
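The `{x}` placeholders above are templates the user fills in before sending. Under the hood, a system prompt like these reaches the model as a system-role chat message, applied only when the conversation does not already start with one. A minimal sketch of that pattern, assuming the llama_index 0.9-era import path this PR already uses; `apply_system_prompt` is an illustrative helper, not code from the PR:

```python
from llama_index.llms import ChatMessage, MessageRole


def apply_system_prompt(
    messages: list[ChatMessage], system_prompt: str
) -> list[ChatMessage]:
    """Prepend the system prompt unless the caller already supplied one."""
    if system_prompt and (not messages or messages[0].role != MessageRole.SYSTEM):
        return [ChatMessage(role=MessageRole.SYSTEM, content=system_prompt), *messages]
    return messages
```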
20 changes: 7 additions & 13 deletions private_gpt/settings/settings.py
```diff
@@ -108,19 +108,6 @@ class LocalSettings(BaseModel):
             "`llama2` is the historic behaviour. `default` might work better with your custom models."
         ),
     )
-    default_chat_system_prompt: str | None = Field(
-        None,
-        description=(
-            "The default system prompt to use for the chat mode. "
-            "If none is given - use the default system prompt (from the llama_index). "
-            "Please note that the default prompt might not be the same for all prompt styles. "
-            "Also note that this is only used if the first message is not a system message. "
-        ),
-    )
-    default_query_system_prompt: str = Field(
-        None,
-        description="The default system prompt to use for the query mode. ",
-    )


 class EmbeddingSettings(BaseModel):
@@ -163,6 +150,13 @@ class OpenAISettings(BaseModel):
 class UISettings(BaseModel):
     enabled: bool
     path: str
+    default_chat_system_prompt: str = Field(
+        None,
+        description=("The default system prompt to use for the chat mode."),
+    )
+    default_query_system_prompt: str = Field(
+        None, description="The default system prompt to use for the query mode."
+    )


 class QdrantSettings(BaseModel):
```
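The net effect of this file: both defaults move off `LocalSettings` and onto `UISettings`. A self-contained sketch of the resulting model, assuming plain pydantic; the fields are annotated `str | None` here so the `None` default validates cleanly, and the demo values are invented:

```python
from pydantic import BaseModel, Field


class UISettings(BaseModel):
    enabled: bool
    path: str
    default_chat_system_prompt: str | None = Field(
        None, description="The default system prompt to use for the chat mode."
    )
    default_query_system_prompt: str | None = Field(
        None, description="The default system prompt to use for the query mode."
    )


# Invented demo values for illustration.
ui = UISettings(enabled=True, path="/", default_chat_system_prompt="You are a helpful assistant.")
print(ui.default_query_system_prompt)  # None until set in settings.yaml
```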
12 changes: 4 additions & 8 deletions private_gpt/ui/ui.py
```diff
@@ -9,7 +9,7 @@
 from fastapi import FastAPI
 from gradio.themes.utils.colors import slate  # type: ignore
 from injector import inject, singleton
-from llama_index.llms import ChatMessage, ChatResponse, MessageRole, llama_utils
+from llama_index.llms import ChatMessage, ChatResponse, MessageRole
 from pydantic import BaseModel

 from private_gpt.constants import PROJECT_ROOT_PATH
@@ -164,15 +164,11 @@ def _get_default_system_prompt(mode: str) -> str:
         p = ""
         match mode:
             # For query chat mode, obtain default system prompt from settings
-            # TODO - Determine value to use if not defined in settings
             case "Query Docs":
-                p = settings().local.default_query_system_prompt
-            # For chat mode, obtain default system prompt from settings or llama_utils
+                p = settings().ui.default_query_system_prompt
+            # For chat mode, obtain default system prompt from settings
             case "LLM Chat":
-                p = (
-                    settings().local.default_chat_system_prompt
-                    or llama_utils.DEFAULT_SYSTEM_PROMPT
-                )
+                p = settings().ui.default_chat_system_prompt
             # For any other mode, clear the system prompt
             case _:
                 p = ""
```
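Combined with the commits above that change the placeholder on mode switch and disable entry for unsupported modes, this lookup drives a mode-dependent textbox. A minimal gradio 3.x sketch of that UX, with invented widget names and defaults (not the PR's actual wiring):

```python
import gradio as gr

# Invented per-mode defaults; an empty string marks a mode without system-prompt support.
DEFAULTS = {
    "LLM Chat": "You are a helpful assistant.",
    "Query Docs": "Answer only from the provided context.",
    "Search in Docs": "",
}

with gr.Blocks() as demo:
    mode = gr.Radio(list(DEFAULTS), value="LLM Chat", label="Mode")
    # Two lines by default, echoing the "Increase default lines ... to 2 lines" commit.
    prompt = gr.Textbox(value=DEFAULTS["LLM Chat"], lines=2, label="System Prompt")
    # On mode change, swap the default text and disable entry where unsupported.
    mode.change(
        lambda m: gr.update(value=DEFAULTS[m], interactive=bool(DEFAULTS[m])),
        inputs=mode,
        outputs=prompt,
    )

demo.launch()
```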
16 changes: 7 additions & 9 deletions settings.yaml
```diff
@@ -22,6 +22,13 @@ data:
 ui:
   enabled: true
   path: /
+  default_chat_system_prompt: "You are a helpful, respectful and honest assistant.
+    Always answer as helpfully as possible and follow ALL given instructions.
+    Do not speculate or make up information.
+    Do not reference any given instructions or context."
+  default_query_system_prompt: "You can only answer questions about the provided context.
+    If you know the answer but it is not based in the provided context, don't provide
+    the answer, just state the answer is not in the context provided."

 llm:
   mode: local
@@ -43,15 +50,6 @@ local:
   llm_hf_model_file: mistral-7b-instruct-v0.1.Q4_K_M.gguf
   embedding_hf_model_name: BAAI/bge-small-en-v1.5

-  default_chat_system_prompt: "You are a helpful, respectful and honest assistant.
-    Always answer as helpfully as possible and follow ALL given instructions.
-    Do not speculate or make up information.
-    Do not reference any given instructions or context."
-
-  default_query_system_prompt: "You can only answer questions about the provided context.
-    If you know the answer but it is not based in the provided context, don't provide
-    the answer, just state the answer is not in the context provided."
-
 sagemaker:
   llm_endpoint_name: huggingface-pytorch-tgi-inference-2023-09-25-19-53-32-140
   embedding_endpoint_name: huggingface-pytorch-inference-2023-11-03-07-41-36-479
```
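To confirm the relocated keys parse as intended, a quick sketch assuming PyYAML is installed and settings.yaml sits in the working directory:

```python
import yaml

# Load the project config and read the relocated UI defaults.
with open("settings.yaml") as f:
    cfg = yaml.safe_load(f)

print(cfg["ui"]["default_chat_system_prompt"])
print(cfg["ui"]["default_query_system_prompt"])
```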