

Experimentation with DistillCARP #14

Open
wants to merge 16 commits into base: main
Conversation

LouisCastricato
Collaborator

DistillCOMET shows a lot of success in conditioning its commonsense model on the knowledge provided by GPT-3.

I'll be experimenting with applying a similar approach to CARP, prompting GPT-NeoX to generate critiques for a given story. For every element of the story-critique dataset I'll generate 3-6 critiques and use them to perform prompt softening during training.

The hope is that this will bring the training objective closer to the inference objective.
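Roughly, the critique-generation step could look like the sketch below. This is a minimal sketch, assuming the EleutherAI/gpt-neox-20b checkpoint through HuggingFace transformers; the prompt template, sampling settings, and the `generate_critiques` helper are illustrative placeholders rather than the final pipeline.

```python
# Sketch: sample 3-6 candidate critiques per story with GPT-NeoX.
# Assumes the EleutherAI/gpt-neox-20b checkpoint via HuggingFace transformers;
# prompt template and sampling parameters are placeholders, not final choices.
import random
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained("EleutherAI/gpt-neox-20b", device_map="auto")

def generate_critiques(story: str, n_min: int = 3, n_max: int = 6) -> list[str]:
    """Sample several candidate critiques for one story passage."""
    n = random.randint(n_min, n_max)
    prompt = f"Story: {story}\nCritique:"  # hypothetical template
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(
        **inputs,
        do_sample=True,
        top_p=0.9,
        temperature=0.8,
        max_new_tokens=64,
        num_return_sequences=n,
        pad_token_id=tokenizer.eos_token_id,
    )
    # Strip the prompt tokens and keep only the generated critique text.
    prompt_len = inputs["input_ids"].shape[1]
    return [
        tokenizer.decode(o[prompt_len:], skip_special_tokens=True).strip()
        for o in outputs
    ]
```

The sampled critiques for each story element would then be collected alongside the original story-critique pairs and fed into the prompt-softening step during training.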

@LouisCastricato
Collaborator Author

Needs to be updated to conform to the new trainer architecture.

@CLAassistant

CLAassistant commented Apr 23, 2023

CLA assistant check
All committers have signed the CLA.
