snippet refactor #85

Closed · wants to merge 2 commits

Conversation

@C-Loftus (Owner) commented Jul 9, 2024

@jaresty made some changes to support snippets in a new PR

  • I think snip should be a prompt, not an insertion modifier:
    `model snip generate a pandas dataclass` or `model snip generate a class`.

  • I think we shouldn't really have an insertion modifier list since it adds more complexity

    • My number one goal is to improve discoverability right now. I think this hurts that.
  • I kept some things that were removed in the other PR and that I mentioned in my last comment.

In general, I'm also not clear on how to use this. How do you have it insert a snippet the way Talon inserts snippets, where you can jump between sections? I don't find the prompt performs that well, but maybe I'm doing something wrong.

@jaresty (Collaborator) commented Jul 9, 2024

I prefer this to be an insertion modifier because it can be used with various kinds of prompts. Having it be its own prompt makes it really limited, since you aren't able to use it with all of the prompts that you already have. The way I was using it seemed to be working really well for me. Did you try that PR?

@jaresty (Collaborator) commented Jul 9, 2024

For what it's worth, that was using it with `model please`.

@C-Loftus (Owner, Author) commented Jul 9, 2024

`model snip <user.text>` in this PR is equivalent to `model please <user.text> snip`. It is the same thing, just less verbose.

Regardless, I tried it with the other PR. In both cases I am not finding it to work well. Does it insert with blanks like a Talon snippet for you? I am finding that it just generates code like it would for a normal `model please` request. The advantage of a Talon snippet in VS Code would be the ability to jump between sections, but I am not finding this does that.
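For reference, what I would expect a snippet-style result to look like is a body with VS Code-style tab stops, roughly like the sketch below (illustrative only, not actual model output; the names are made up):

```
# Illustrative only: a dataclass emitted as a snippet body rather than
# plain code. $1 / ${2:placeholder} / $0 are VS Code-style tab stops
# the editor can jump between after insertion.
@dataclass
class ${1:ClassName}:
    ${2:field_name}: ${3:str} = $0
```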

Both images are results of the other PR. Same sort of results with this PR.

`model please generate a dataclass snip`:
[image]

`model please generate a while loop snip`:
[image]

C-Loftus marked this pull request as draft July 9, 2024 23:00
@jaresty (Collaborator) commented Jul 9, 2024

It is generating with placeholders for me:
[image]

@jaresty (Collaborator) commented Jul 9, 2024

Did you try using `snip next`?

@jaresty (Collaborator) commented Jul 9, 2024

I want to use it with `model answer`, which this grammar would not allow for.

@C-Loftus (Owner, Author) commented Jul 9, 2024

For me, I now find it generates snippet placeholders around 25% of the time for simple syntax-based requests like generating a for loop, but anything more complicated doesn't seem to work.

@jaresty (Collaborator) commented Jul 9, 2024

I was getting a much higher success rate than that 🤔, and for more complex results as well.

@C-Loftus (Owner, Author) commented Jul 9, 2024

Could you give a few example outputs with the prompt you used?

@jaresty (Collaborator) commented Jul 9, 2024

"model answer snip"

(applied to `# generate a data class`)
[image]

@jaresty (Collaborator) commented Jul 9, 2024

Maybe this is because I'm using gpt-4o.

@jaresty (Collaborator) commented Jul 10, 2024

`model answer snip` with a code block alone turns it into a snippet that allows me to edit just the values I might want to, using `snip next`.

[image]

@jaresty (Collaborator) commented Jul 10, 2024

I could definitely see adding a `model paste snip` that does the same for you.

C-Loftus closed this Jul 10, 2024
@C-Loftus (Owner, Author) commented Jul 10, 2024
OK, seems to be a gpt-4o difference. Closing this.

@jaresty (Collaborator) commented Jul 10, 2024

I could get behind this being a custom prompt if it also considered the current selection (like `model answer`). I think there are two usages:

  1. With a text prompt, spoken
  2. Operating on the selection or the clipboard

@jaresty (Collaborator) commented Jul 10, 2024

I'm warming up to this direction. Can we reopen it?

@jaresty (Collaborator) commented Jul 10, 2024

I think I'd prefer it to behave more like a regular prompt, though. Maybe we can defer solving how to add additional context until we do it for other prompts as well.

C-Loftus reopened this Jul 10, 2024
@C-Loftus (Owner, Author) commented Jul 10, 2024

Reopened. I do think this generally fits my goals for the format of things better (i.e. prompts that are easy to follow, a grammar that is somewhat similar to Cursorless, and a limited number of helper fns, special cases, error handling, and extra Talon lists).

However, I am not sure how to make it so you can also use `model answer` with it. I don't have a lot of bandwidth at the moment to play around with things, unfortunately.

@jaresty (Collaborator) commented Jul 10, 2024

Here's another flavor of this that optimizes for the `model answer` use case and makes this operate more like other prompts: #86

@jaresty (Collaborator) commented Jul 10, 2024

I think what's actually missing here may be that you aren't using the Cursorless insertion method to paste, unless I missed it: `actions.user.insert_snippet(text)`.
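Roughly what I have in mind, as a sketch only: the action name `gpt_insert_snippet` and the way the model response is obtained are placeholder assumptions, not code from either PR; the one real call here is the community `actions.user.insert_snippet`.

```python
from talon import Module, actions

mod = Module()


@mod.action_class
class Actions:
    def gpt_insert_snippet(gpt_response: str):
        """Insert a GPT response as an editable snippet (sketch only)."""
        # Assumes the prompt asked the model to emit $1 / ${2:placeholder} / $0
        # style tab stops; handing that body to insert_snippet lets you jump
        # between the blanks instead of pasting plain text.
        actions.user.insert_snippet(gpt_response)
```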

C-Loftus closed this Jul 12, 2024