snippet refactor #85
Conversation
I prefer this to be an insertion modifier because it can be used with various kinds of prompts. Having it be its own prompt makes it really limited, since you aren't able to use it with all of the prompts that you already have. The way I was using it seemed to be working really well for me. Did you try that PR?
For what it's worth, that was using it with model please.
Regardless, I tried it with the other PR. In both cases I am not finding it to work well. Does it insert with blanks like a Talon snippet for you? I am finding that it just generates code as it would for a normal model please request. The advantage of a Talon snippet in VSCode would be the ability to jump between sections, but I am not finding this does that. Both images are results of the other PR; same sort of results with this PR.
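For concreteness, the "blanks" being discussed here are VS Code-style snippet tabstops (`$1`, `$0`, or `${1:placeholder}`), which are what let you jump between sections after insertion. A minimal sketch (the helper name `has_tabstops` is hypothetical, not part of this repo) that checks whether model output actually contains them:

```python
import re

# VS Code-style snippet tabstops: $1, $0, or ${1:placeholder}.
TABSTOP_RE = re.compile(r"\$(?:\d+|\{\d+(?::[^}]*)?\})")

def has_tabstops(text: str) -> bool:
    """Return True if text contains snippet tabstops an editor can jump between."""
    return bool(TABSTOP_RE.search(text))

# A for-loop snippet body with jumpable sections, as discussed above:
FOR_LOOP_SNIPPET = "for ${1:item} in ${2:collection}:\n    ${0:pass}"

print(has_tabstops(FOR_LOOP_SNIPPET))                  # True: has blanks to jump between
print(has_tabstops("for i in range(10):\n    pass"))   # False: plain generated code
```

A check like this is one way to tell whether the model actually emitted a snippet with placeholders or just ordinary code.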
Did you try using snip next?
I want to use it with model answer, which this grammar would not allow for.
For me, it now generates snippet placeholders around 25% of the time for simple syntax-based requests, like generating a for loop. But anything more complicated doesn't seem to work.
I was getting a much higher success rate than that 🤔, and for more complex results as well.
Could you give a few example outputs with the prompt you used?
Maybe this is because I'm using gpt-4o
I could definitely see adding a
Ok, seems to be a gpt-4o difference. Closing this.
I could get behind this being a custom prompt if it also considered the current selection (like model answer). I think there are two usages:
I'm warming up to this direction. Can we reopen it?
I think I'd prefer it behave more like a regular prompt, though; maybe we can defer adding additional context until we do it for other prompts as well.
Reopened. I do think this generally fits my goals for the format of things better (i.e. prompts that are easy to follow, a grammar that is somewhat similar to Cursorless, and limited amounts of helper fns/special cases, error handling/extra Talon lists). However, I am not sure how to make it so you can also use model answer with it. I don't have a lot of bandwidth at the moment to play around with things, unfortunately.
Here's another flavor of this that optimizes for the model answer use case and makes this operate more like other prompts: #86 |
I think what's actually missing here may be that you aren't using the Cursorless insertion method to paste, unless I missed it.
@jaresty made some changes to support snippets in a new PR
I think snip should be a prompt, not an insertion modifier: model snip generate a pandas dataclass or model snip generate a class.
I think we shouldn't really have an insertion modifier list, since it adds more complexity.
Kept some things that were removed in the other PR that I mentioned in my last comment.
In general, I am also not clear on how to use this. How do you have it insert a snippet the way Talon inserts snippets, where you can jump between sections? I don't find the prompt performs that well, but maybe I am doing something wrong.