Interesting Solution I Found to this Problem #12
severian42 started this conversation in General
So I've been fascinated by this problem of Misguided Attention for a few weeks. I am trying to build an inference algorithm to help LLMs address that issue, but in the process, I found a cool short-term fix I call "Mindful Attention" using just prompt engineering.
I wanted to address this head-on by encouraging LLMs to slow down, focus, and engage directly with the input, free of assumptions. That is the core of Mindful Attention: a prompt designed to steer models away from over-generalization and back into the moment.
If you want to see this mindful approach in action, check out the LLM I've set up for testing: Mindful LLM. It counteracts these issues about 90% of the time, and the results are pretty cool.
Full prompt below. I admit it's quite verbose, but it's the most effective version I've landed on yet. I'm also working on a smaller variant that can be appended to any system prompt to harness Mindful Attention.
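If you'd rather wire the prompt into your own stack than use the hosted test bot, here's a minimal sketch of how it can be attached as a system message. This assumes an OpenAI-compatible chat API; `MINDFUL_ATTENTION_PROMPT` is just a placeholder for the full prompt at the end of this post, and the model name is illustrative, not a recommendation.

```python
# Minimal sketch: prepend the Mindful Attention prompt as the system message.
# MINDFUL_ATTENTION_PROMPT is a placeholder for the full prompt shared below;
# the client and model name are illustrative assumptions, not part of the method.
from openai import OpenAI

MINDFUL_ATTENTION_PROMPT = """<paste the full Mindful Attention prompt here>"""

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def mindful_chat(user_message: str, model: str = "gpt-4o") -> str:
    """Send a user message with the Mindful Attention prompt as the system prompt."""
    response = client.chat.completions.create(
        model=model,
        messages=[
            {"role": "system", "content": MINDFUL_ATTENTION_PROMPT},
            {"role": "user", "content": user_message},
        ],
    )
    return response.choices[0].message.content


# Illustrative test: a classic misguided-attention trap, where models tend to
# pattern-match to the famous riddle instead of reading what is actually written.
print(mindful_chat(
    "A man and his son are in a car accident. The surgeon, who is the boy's "
    "father, says 'I can operate on him.' How is this possible?"
))
```

The same `messages` structure drops into most other chat endpoints, so the prompt text itself is the only piece you really need to carry over.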