docs: Add LLM messageHistory #108
Conversation
- `modelSource` - A string that specifies the location of the model binary. For more information, see the [loading models](../fundamentals/loading-models.md) page.
- `tokenizerSource` - A URL to the binary file that contains the tokenizer.
- `systemPrompt` - Often used to tell the model what its purpose is, for example: "Be a helpful translator".
- `messageHistory` - An array of `MessageType` objects that represent the conversation history.
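For illustration, here is a minimal sketch of what a `messageHistory` array might look like. The exact shape of `MessageType` (a `role`/`content` pair) is an assumption for this sketch, not taken from the diff above; check the library's exported type.

```typescript
// Hypothetical MessageType shape -- an assumption for illustration;
// the real library may export a different definition.
type MessageType = { role: 'user' | 'assistant'; content: string };

// Prior turns passed to the model so it can resume the conversation
// with full context.
const messageHistory: MessageType[] = [
  { role: 'user', content: 'Translate "good morning" to Spanish.' },
  { role: 'assistant', content: '"Buenos días."' },
];

console.log(messageHistory.length); // 2
```

Passing prior turns this way is what lets the model pick up an earlier conversation rather than starting from a blank context.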
Can we maybe also add here that it can be used to provide context to the model? I feel like this can be misunderstood.
Hi everyone, first of all, thanks so much for this amazing tool. Second, I'm here to report a bug related to `messageHistory`. I've tested it thoroughly, and `messageHistory` appears to be unstable: when it is empty, the model works perfectly, but when it has elements, the model's output becomes incoherent. I've tried removing `messageHistory` and it works fine, but without context it can't resume the conversation. An example of the `messageHistory` I'm passing: … Do you have any advice on how to properly resume the conversation?
I also think that `useLLM` should accept a deps array to make it stable. Example: since …
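The instability the comment above describes can come from referential equality: an inline array literal creates a new reference on every render, which can re-trigger anything inside the hook that depends on it. This sketch (outside React, with a hypothetical `stableHistory` cache standing in for a `useMemo`-style deps comparison) shows the difference; the `Message` shape is an assumption.

```typescript
// Assumed message shape for illustration only.
type Message = { role: 'user' | 'assistant'; content: string };

// Simulate two consecutive renders that build the same history inline:
// each call produces a brand-new array reference.
const render = (): Message[] => [{ role: 'user', content: 'Hi' }];
const a = render();
const b = render();
console.log(a === b); // false -- a new reference every render

// A useMemo-style cache keyed on deps returns the SAME reference
// while the deps are unchanged, so downstream effects do not re-fire.
const cache = new Map<string, Message[]>();
const stableHistory = (deps: string): Message[] => {
  if (!cache.has(deps)) cache.set(deps, [{ role: 'user', content: deps }]);
  return cache.get(deps)!;
};
console.log(stableHistory('Hi') === stableHistory('Hi')); // true
```

In a component, wrapping the history in `useMemo` (or accepting a deps array in the hook itself, as suggested) would give the same stable-reference behavior.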
Description
Add LLM messageHistory documentation
Type of change
Checklist