feat(prompts!): prompt template management #5724
Draft
mikeldking wants to merge 92 commits into main from prompts
+26,676 −3,680
Conversation
dosubot (bot) added the size:M label (This PR changes 30-99 lines, ignoring generated files) on Dec 12, 2024
mikeldking added the feature branch label (a feature branch that consolidates multiple features into a single commit on main) on Dec 12, 2024
RogerHYang requested changes on Dec 12, 2024
blocking feature branch
mikeldking force-pushed the prompts branch 4 times, most recently from 6da6bdc to 3c3c5e0 on December 28, 2024 at 07:57
* wip * WIP * WIP * add the prompt type * add initial listing * feat: temporary prompt * final changes * add aria role
…rompt specific playground (#5748)
* feat(prompts): Add prompt selection combobox to playground instances
* Implement prompt/:id/playground route and add link from prompt page
* Rename prompt combobox components for clarity
* feat(prompts): Implement SavePromptForm for creating new prompts * Rename button to 'save'
* Add prompt table migrations
* Fix typo
* Update migration test
* Address migration feedback
* Use default names for constraints
* Add source prompt id
* cleanup
* feat(prompts): rudimentary prompt details page #5741
* Update app/src/pages/prompt/PromptTabContent.tsx
Co-authored-by: Anthony Powell <[email protected]>
---------
Co-authored-by: Anthony Powell <[email protected]>
* Add PromptVersion query
* Update src/phoenix/server/api/types/PromptVersion.py
* reorganize
* cleanup
* Update the schema
---------
Co-authored-by: Mikyo King <[email protected]>
* refactor(prompts): URL-based routing and loader sharing for prompts
  - /prompts/:promptId has been converted to a layout component
  - sub-routes of :promptId share loader data with promptLoader
  - prompt detail page tabs perform navigations
* feat(prompts): Add query fragment to prompt versions sub-route
* feat(prompts): Implement routing, loaders, and fragments for prompt version list
* Add PromptVersion to node query
* fix imports
* Ensure all unset fields are set
* fix comma
* Distinguish prompt version details from one another
* Move truncate component
* Consolidate prompt tab nav logic
---------
Co-authored-by: Dustin Ngo <[email protected]>
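For readers unfamiliar with the layout-route pattern this refactor describes, here is a minimal sketch using React Router's data APIs. The route id, paths, component names, and loader body are illustrative assumptions rather than the actual Phoenix code.

```tsx
// Minimal sketch: a layout route whose sub-routes share the parent loader's data.
// Route ids, paths, and the fetch endpoint are hypothetical.
import React from "react";
import {
  createBrowserRouter,
  Outlet,
  RouterProvider,
  useRouteLoaderData,
} from "react-router-dom";
import type { LoaderFunctionArgs } from "react-router-dom";

type Prompt = { id: string; name: string; description?: string };

// Stand-in for promptLoader; the real one presumably fetches via GraphQL.
async function promptLoader({ params }: LoaderFunctionArgs): Promise<Prompt> {
  const response = await fetch(`/v1/prompts/${params.promptId}`); // assumed endpoint
  return response.json();
}

// Layout component for /prompts/:promptId; tab sub-routes render into the <Outlet />.
function PromptLayout() {
  const prompt = useRouteLoaderData("prompt") as Prompt;
  return (
    <section>
      <h1>{prompt.name}</h1>
      <Outlet />
    </section>
  );
}

// Sub-routes reuse the parent route's loader data instead of refetching.
function PromptVersionsTab() {
  const prompt = useRouteLoaderData("prompt") as Prompt;
  return <p>Versions of {prompt.name}</p>;
}

const router = createBrowserRouter([
  {
    id: "prompt",
    path: "/prompts/:promptId",
    loader: promptLoader,
    element: <PromptLayout />,
    children: [
      { index: true, element: <p>Overview</p> },
      { path: "versions", element: <PromptVersionsTab /> },
    ],
  },
]);

export function App() {
  return <RouterProvider router={router} />;
}
```

Because sub-routes read the parent route's data via useRouteLoaderData, switching tabs is just a navigation inside the layout and does not refetch the prompt.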
…5780)
* feat(prompts): Render model invocation params in prompt details view
* Use fragment in PromptInvocationParameters
* temporarily use component styles for lists until we can migrate
* fix scrolling
---------
Co-authored-by: Mikyo King <[email protected]>
* feat(prompts): Render prompt messages on prompt detail view
* Remove unused import
* Implement copy and paste button on prompt chat messages
* Render chat messages on prompt version details page
* Redirect to latest version when clicking versions tab
  Additionally highlight active version in versions list
* Update example data to match template language
* Fix prompt details page width, prompt versions borders / scroll
* Distinguish readOnly TemplateEditor styling from normal TemplateEditor
* Refactor active prompt id into prop
* Replace custom button with styled anchor
* Rename node in promptVersion query
* Flesh out PromptVersionTemplate type
* Return GraphQL objects with the correct type
* Use new types in node query
* Decouple pydantic models and gql types
* Rebuild gql schema
* Rework model names
* Update gql schema
* Propagate name into schema
* Incorporate feedback
* Update schema
* adjust UI to new schema
* cleanup
* Remove `hub` naming and clean up type annotations
---------
Co-authored-by: Mikyo King <[email protected]>
* chore: fix lint errors * cleanup * cleanup
* feat(prompts): preview of last 5 versions * feat(prompts): preview of last 5 versions * rename
* feat(prompts): graphql types for tools, output_schema * fix types
* feat(prompts): Implement prompts as code examples beneath prompt
  - python prompts as code
  - typescript prompts as code
* Implement typescript language formatter for prompts
* Align emitted code with openai documentation sample code
* Minor UX tweaks wrt code blocks and version details page
* Reduce tab size, update prompt mock data
* Make code card collapsible
* Make prompt code snippets conform to new schemas
* Tweak comment
* Refactor templates into lodash templates
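As a rough illustration of the "prompts as code" and lodash-template commits above, the sketch below renders an OpenAI-style Python snippet from a prompt version. The template text, field names, and the promptToOpenAISnippet helper are hypothetical, and the JSON.stringify formatting is a simplification of whatever the real formatter emits.

```typescript
// Hypothetical sketch: emit an OpenAI-style Python snippet from a prompt version
// using a lodash template.
import { template } from "lodash";

interface PromptVersionLike {
  modelName: string;
  messages: Array<{ role: string; content: string }>;
}

// The template is compiled once; each prompt version just supplies data.
const pythonSnippetTemplate = template(
  [
    "from openai import OpenAI",
    "",
    "client = OpenAI()",
    "completion = client.chat.completions.create(",
    '    model="<%= modelName %>",',
    "    messages=<%= JSON.stringify(messages, null, 4) %>,",
    ")",
  ].join("\n")
);

export function promptToOpenAISnippet(version: PromptVersionLike): string {
  return pythonSnippetTemplate(version);
}

// Example usage:
// promptToOpenAISnippet({
//   modelName: "gpt-4o",
//   messages: [{ role: "user", content: "Hello {{ name }}" }],
// });
```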
* Add prompt to version tag payload * Build gql schema * Separate delete and set payloads * Add Prompt to delete payload as well
* Resolve `source` prompt for cloned prompts * Build gql schema
* refactor(components): use storybook builtins for theming * reinstall * cleanup * add descriptions
* Properly handle global id * Use Prompt object directly
* Add patchPromptDescription mutation * Update gql schema * Rename PatchPromptDescription -> PatchPrompt * Ruff 🐶 * Build GQL schema
Co-authored-by: Mikyo King <[email protected]>
* Resolve versions on Prompt * Build GQL schema * Scope PromptVersion to Prompt
* feat(prompts): Break JSONPromptMessage content into list of parts
* Port PromptChatMessage component queries to pull in message content parts
* Bump openapi schema
* Update prompt input schemas, fix prompt tests
* Bump openapi schema
* Refactor chat template types to use new prompt message schema
* Bump uv lock
* Fix auth test
* Rename Image model to ImageResult
* Update comment
* Normalize depth across all content part types
* Fix auth test
* Fix discrepancies between ContentParts with Pydantic, GQL Input, and GQL Output
  - Pydantic discriminated union was too strict; the type field needs to be populated with defaults during validation since it is not provided by input
  - Strawberry input types need to default to UNSET when using the OneOf decorator arg
  - unlike Strawberry input types, Strawberry output types need to match the same shape as the pydantic models so that they can be automatically converted from the ORM model
* Use new gql query types in FE
* add unit tests for third-party SDKs
* fix typo
* fix type check
* build openapi schema
* fix relay
* fix ci for evals pkg
* disable fail fast for evals
* revert changes to requirements
---------
Co-authored-by: Roger Yang <[email protected]>
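The discriminated-union notes above concern Pydantic and Strawberry on the Python side. Since the examples in this section are written in TypeScript, the sketch below is only a loose analogue of the underlying idea: default a missing discriminator before narrowing a tagged union. All names are hypothetical.

```typescript
// Loose TypeScript analogue of defaulting a missing discriminator, not the actual
// Pydantic/Strawberry fix. Part shapes and the helper are hypothetical.
type TextContentPart = { type: "text"; text: string };
type ImageContentPart = { type: "image"; url: string };
type ContentPart = TextContentPart | ImageContentPart;

// Inputs may omit the `type` tag, the way the GraphQL input objects did.
type ContentPartInput =
  | { type?: "text"; text: string }
  | { type?: "image"; url: string };

function normalizeContentPart(input: ContentPartInput): ContentPart {
  // Populate the discriminator when the input did not provide it, mirroring
  // "populate type field with defaults during validation".
  if ("text" in input) {
    return { type: "text", text: input.text };
  }
  return { type: "image", url: input.url };
}

// An input without a `type` still normalizes into a well-tagged part.
const part = normalizeContentPart({ text: "Hello" }); // { type: "text", text: "Hello" }
```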
…ntent prompts (#6084)
* feat(prompts): Save and display playground instances as multi-part content prompts
* Improve commenting
* Rename file
* Fix text normalization inconsistencies, add tests
* Hide chat part disclosures when there is only one text child
* Align tool result id with tool result content
* Improve comments
* More comments
…6091)
This prevents data loss from re-parenting the playground on a new route.
Before: /prompts/1234/playground -> click dataset -> /playground/datasets/9876 (causes remount, data loss)
After: /prompts/1234/playground -> click dataset -> /prompts/1234/playground?datasetId="9876" (no remount, no data loss, just state updates)
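A minimal sketch of the search-param approach described above, assuming React Router's useSearchParams hook; the component and prop names are illustrative.

```tsx
// Sketch: select a dataset by updating ?datasetId=... on the current route rather
// than navigating to a different route tree.
import React from "react";
import { useSearchParams } from "react-router-dom";

export function PlaygroundDatasetPicker({ datasetIds }: { datasetIds: string[] }) {
  const [searchParams, setSearchParams] = useSearchParams();
  const selectedDatasetId = searchParams.get("datasetId");

  // Only the query string changes, so the playground component under
  // /prompts/:promptId/playground never unmounts and its in-progress state survives.
  const onSelectDataset = (datasetId: string) => {
    const next = new URLSearchParams(searchParams);
    next.set("datasetId", datasetId);
    setSearchParams(next);
  };

  return (
    <ul>
      {datasetIds.map((id) => (
        <li key={id}>
          <button onClick={() => onSelectDataset(id)}>
            {id === selectedDatasetId ? `✓ ${id}` : id}
          </button>
        </li>
      ))}
    </ul>
  );
}
```

Because React Router keeps the same element mounted when only search params change, the state update replaces what used to be a full remount.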
* fix(components): fix button size and variance props * remove varianceProps interface
* Add GET /prompts route * Use proper ORM class
* feat(prompts): Update prompt combobox after save
  Also reset dirty state on save
* Block page navigation when any instance is dirty
* Use native beforeunload dialog
* Aggressively re-fetch prompt combobox
  This allows the prompt combobox to repopulate when new prompts are created, but it also causes it to overfetch whenever you change prompts
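To illustrate the "aggressively re-fetch" tradeoff, here is a rough hook that refetches the prompt list whenever a fetch key changes. It assumes a plain REST endpoint and response envelope (the real app fetches through GraphQL/Relay), so treat the path and types as placeholders.

```typescript
// Sketch: refetch the prompt list every time `fetchKey` changes.
import { useEffect, useState } from "react";

interface PromptSummary {
  id: string;
  name: string;
}

export function usePromptList(fetchKey: number): PromptSummary[] {
  const [prompts, setPrompts] = useState<PromptSummary[]>([]);

  useEffect(() => {
    let cancelled = false;
    async function load() {
      const response = await fetch("/v1/prompts"); // endpoint path is an assumption
      const body = (await response.json()) as { data: PromptSummary[] }; // assumed envelope
      if (!cancelled) {
        setPrompts(body.data);
      }
    }
    load();
    return () => {
      cancelled = true;
    };
  }, [fetchKey]);

  return prompts;
}

// Usage: bump the key when the combobox opens, e.g. setFetchKey((k) => k + 1).
```

Bumping the key on every open (or prompt change) keeps newly created prompts visible at the cost of extra requests, which is the overfetching the commit message calls out.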
* feat(prompts): cleanup versions ui * cleanup * cleanup styles
* feat(prompts): Display basic LLM details on Prompt Details pages * Remove extraneous export
* feat: date range picker * WIP * cleanup * time field * cleanup
…ema for code snippets (#6132)
* feat(prompts): Convert from Prompt message schema to LLM Provider schema for code snippets
* Update prompt code snippet empty state
* Reuse prompt utils in promptToOpenai conversion and back
* Allow tag updates along with version creation * Build gql schema * Update relay artifacts
…ia (#6143)
* feat(components): Port RadioGroup from react aria
* Add another radio group story
* Fix width of radio group
* Fix layout issues with RadioGroups inside of overflow
* Replace deprecated usage of arizeai RadioGroup
* Make L radio group variant consistent with old radio group
* Rename RadioGroup dir to Radio
* Implement ToggleButtonGroup
* Convert RadioGroups to ToggleButtonGroups
* Update Radio component styles and stories
* Adjust styles for ToggleButtonGroup and RadioGroup
* Use more css vars
* Add label color and version metadata * Update gql schema * Add color column to the right table * Add metadata to Prompt as well
* Add prompt label node and mutations
* Add PromptLabel to node interface
* Add prompts resolver on PromptLabel node
* Add prompt label mutations mixin and build gql schema
* Add prompt labels query
* Delete needless comment
* Address feedback
* Build gql schema
Labels: feature branch, size:M
resolves #5582