Conversation

@Aminehassou Aminehassou commented Jan 10, 2024

This bug was caused by us not enforcing the 512-token limit after joining our list into a single string. As a result, llamacpp returned a max-token-limit error for any article whose content exceeded 512 tokens. This is now fixed.
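A minimal sketch of the fix described above, with hypothetical helper names (the PR itself does not show the code): the point is that the token limit must be applied to the *joined* string, since individually short pieces can still combine into text longer than the model's 512-token window.

```python
MAX_TOKENS = 512  # limit that llamacpp enforced on the input


def tokenize(text: str) -> list[str]:
    # Stand-in whitespace tokenizer for illustration only; the real
    # code would use the llama.cpp model's own tokenizer.
    return text.split()


def prepare_article_text(sections: list[str], max_tokens: int = MAX_TOKENS) -> str:
    # Join first, then truncate the combined token sequence. Truncating
    # each section separately would not prevent the joined string from
    # exceeding the model's limit.
    joined = " ".join(sections)
    tokens = tokenize(joined)
    return " ".join(tokens[:max_tokens])
```

With this in place, an article of any length is cut down to at most 512 tokens before being handed to the model, so the max-token-limit error no longer occurs.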
