
Allow use of gpt-4-32k model #87

Open
archvalmiki opened this issue Jun 9, 2023 · 1 comment

Comments
@archvalmiki

Currently, specifying gpt-4-32k as the model doesn't work. Allow this option to be used in conjunction with max_tokens set to 32,768.

@yigitkonur

You could try editing the plugin's own code to remove the places where the max_tokens parameter is set. This parameter is no longer mandatory; if you don't need to limit the model's output, omitting max_tokens should solve this problem.
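The suggestion above can be sketched as follows. This is a minimal illustration, not the plugin's actual code: it assumes an OpenAI-style chat-completions request body, and the helper name `build_chat_payload` is hypothetical. The point is simply that `max_tokens` is an optional field, so it can be included only when a cap is actually wanted.

```python
def build_chat_payload(model, messages, max_tokens=None):
    """Build an OpenAI-style chat-completions request body.

    Leaving max_tokens out of the payload (rather than hard-coding it)
    lets the API fall back to the model's own context limit, which is
    what the workaround above relies on.
    """
    payload = {"model": model, "messages": messages}
    if max_tokens is not None:
        # Only cap the response when the caller explicitly asks for it.
        payload["max_tokens"] = max_tokens
    return payload


# With no cap: the key is simply absent from the request body.
uncapped = build_chat_payload("gpt-4-32k", [{"role": "user", "content": "hi"}])

# With an explicit cap, e.g. the 32,768 requested in this issue.
capped = build_chat_payload(
    "gpt-4-32k", [{"role": "user", "content": "hi"}], max_tokens=32768
)
```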

2 participants