
Can it work with gpt-3.5-turbo? #3

Open
Louvivien opened this issue May 2, 2023 · 3 comments

Comments

@Louvivien

Hi,
Thank you for this code.
I have tried changing the model to gpt-3.5-turbo and raising token_limit to 4000, but after the second question I ask I keep getting this message: Warning: Conversation history cleared due to reaching the token limit. Please rephrase your query.
Is there a way to make it work with gpt-3.5-turbo?
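The warning suggests the app wipes the entire conversation once the prompt exceeds token_limit. A gentler workaround is to drop only the oldest turns until the history fits again. A minimal sketch, assuming messages are simple dicts with a "content" key and using a rough ~4-characters-per-token estimate (the project's actual token counting may differ, e.g. via tiktoken):

```python
# Sketch: trim the oldest turns instead of clearing the whole history.
# Assumptions (not from the project's code): messages are dicts with a
# "content" key, and ~4 characters per token is a good-enough estimate.

def estimate_tokens(text: str) -> int:
    """Very rough token estimate (~4 chars per token for English)."""
    return max(1, len(text) // 4)

def trim_history(messages: list, token_limit: int = 4000) -> list:
    """Drop the oldest messages until the total fits under token_limit."""
    trimmed = list(messages)
    while trimmed and sum(estimate_tokens(m["content"]) for m in trimmed) > token_limit:
        trimmed.pop(0)  # remove the oldest turn first
    return trimmed

history = [{"content": "x" * 8000}, {"content": "short question"}]
kept = trim_history(history, token_limit=100)
```

With this approach the most recent exchange survives, so the model keeps some context instead of starting from scratch.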

@alfiedennen
Owner

You would need to reduce the size of the embedding chunks you create and limit how many embeddings are included in each query.
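One way to act on this advice: rank the embedded chunks by cosine similarity to the query and keep only the top few that fit a context budget. A minimal sketch with illustrative names (not the project's actual code), assuming chunks are (text, vector) pairs and the same rough ~4-chars-per-token estimate:

```python
# Sketch: cap how many embedded chunks go into the prompt.
# Assumptions: each chunk is a (text, vector) pair, vectors are plain
# lists of floats, and ~4 chars/token is an acceptable estimate.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def select_chunks(query_vec, chunks, max_chunks=3, token_budget=1500):
    """Return the most similar chunk texts, capped by count and budget."""
    ranked = sorted(chunks, key=lambda c: cosine(query_vec, c[1]), reverse=True)
    picked, used = [], 0
    for text, _vec in ranked[:max_chunks]:
        cost = max(1, len(text) // 4)  # rough token estimate
        if used + cost > token_budget:
            break
        picked.append(text)
        used += cost
    return picked

chunks = [("about cats", [1.0, 0.0]), ("about dogs", [0.0, 1.0])]
best = select_chunks([1.0, 0.1], chunks, max_chunks=1)
```

Lowering max_chunks and token_budget leaves more of the 4k-token window free for the question and the model's answer, which is the trade-off gpt-3.5-turbo forces compared with larger-context models.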

@Louvivien
Author

Thanks, any insights on how I can do that?

@fpartous

fpartous commented Jun 9, 2023

I'm interested in this as well, at least until I have GPT-4 API access.
