
docs: Add LLM messageHistory#108

Merged
jakmro merged 2 commits into v0.3.0-docs from
@jakmro/docs-llm-messageHistory
Feb 26, 2025

Conversation

Contributor

@jakmro jakmro commented Feb 25, 2025

Description

Add LLM messageHistory documentation

Type of change

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • Documentation update (improves or adds clarity to existing documentation)

Checklist

  • I have performed a self-review of my code
  • I have commented my code, particularly in hard-to-understand areas
  • I have updated the documentation accordingly
  • My changes generate no new warnings

@jakmro jakmro requested a review from chmjkb February 25, 2025 11:57
Comment thread on docs/docs/hookless-api/LLMModule.md (Outdated)
- `modelSource` - A string that specifies the location of the model binary. For more information, see the [loading models](../fundamentals/loading-models.md) page.
- `tokenizerSource` - URL to the binary file that contains the tokenizer
- `systemPrompt` - Often used to tell the model what its purpose is, for example: "Be a helpful translator"
- `messageHistory` - An array of `MessageType` objects that represent the conversation history.
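The parameters above can be sketched in code. This is a minimal illustration, not library code: the exact `MessageType` shape is assumed from the description, and the `useLLM` call is shown only as a comment.

```typescript
// Assumed shape of MessageType, inferred from the parameter description above.
type MessageType = { role: 'user' | 'assistant'; content: string };

// A conversation history the model can resume from.
const messageHistory: MessageType[] = [
  { role: 'user', content: 'Translate "good morning" to Spanish.' },
  { role: 'assistant', content: '"Buenos días."' },
];

// Passed alongside the other options (hook call shown for illustration only):
// const llama = useLLM({ modelSource, tokenizerSource, systemPrompt, messageHistory });
```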
Collaborator
Can we maybe also add here that it can be used to provide context to the model? I feel like this can be misunderstood

@jakmro jakmro merged commit 642dfe5 into v0.3.0-docs Feb 26, 2025
@jakmro jakmro deleted the @jakmro/docs-llm-messageHistory branch February 26, 2025 08:57

kuriel-dev commented May 3, 2025

Hi everyone, first of all, thanks so much for this amazing tool.

Second, I’m here to report a bug related to the messageHistory.
(I’ve made sure this is set as stable as possible before reporting.)

I’ve thoroughly tested it, and the messageHistory itself should be stable, because:

  1. I’m loading a chat from a phone database.
  2. I’m passing the messageHistory (mapped) to useLLM().
  3. This happens only once. (I pass messageHistory a single time.)

When messageHistory is empty, the AI works perfectly.

But when messageHistory has elements, the AI goes crazy.

I’ve tried removing messageHistory and it works fine, but without context, it can’t resume the conversation.

const llama = useLLM({
  modelSource: modelSource,
  tokenizerSource: tokenizerSource,
  // messageHistory: messageHistory,
});

An example of the messageHistory I'm passing:

    [
      {"content": "Hola cómo estás?", "role": "user"}, 
      {"content": "Estoy bien, gracias. ¿En qué puedo ayudarte?", "role": "assistant"}
    ]

Do you have any advice on how to properly resume the conversation?
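The "mapped" step described in point 2 above might look like the following minimal sketch. The `ChatRow` shape and the `toMessageHistory` helper are hypothetical names for illustration, not part of the library.

```typescript
// Hypothetical database row shape; adjust to your actual schema.
type ChatRow = { sender: 'me' | 'bot'; text: string };

// Assumed message shape, matching the array shown above.
type MessageType = { role: 'user' | 'assistant'; content: string };

// Map stored rows once, before handing them to useLLM.
function toMessageHistory(rows: ChatRow[]): MessageType[] {
  return rows.map((row) => ({
    role: row.sender === 'me' ? 'user' : 'assistant',
    content: row.text,
  }));
}

const history = toMessageHistory([
  { sender: 'me', text: 'Hola cómo estás?' },
  { sender: 'bot', text: 'Estoy bien, gracias. ¿En qué puedo ayudarte?' },
]);
// history now matches the example array shown above.
```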

@kuriel-dev

I also think that useLLM should accept a deps array, to keep it stable:

const llama = useLLM({
  modelSource,      // the modelSource
  tokenizerSource,  // the tokenizerSource
}, [deps]); // <-- deps array to keep the hook in sync

Example:

const llama = useLLM({
  modelSource,
  tokenizerSource,
  messageHistory,
}, [messageHistory]);

since llama.setMessageHistory does not exist. (Exposing such a setter could be another good option.)
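A small illustration of why a deps array (or memoization) matters here: identity, not contents, is what React-style comparisons see, so a history array that is rebuilt on every render always looks "new". This is a plain TypeScript sketch; `useLLM` itself is not called, and the `useMemo` line is shown only as a comment.

```typescript
type MessageType = { role: 'user' | 'assistant'; content: string };

// Two arrays with identical contents...
const historyA: MessageType[] = [{ role: 'user', content: 'Hola cómo estás?' }];
const historyB: MessageType[] = [{ role: 'user', content: 'Hola cómo estás?' }];

// ...are still different objects, so an identity check reports a change.
const sameContents = JSON.stringify(historyA) === JSON.stringify(historyB); // true
const sameIdentity = historyA === historyB; // false

// Inside a component, React's useMemo can keep the reference stable across renders:
// const messageHistory = useMemo(() => toMessageHistory(rows), [rows]);
```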


3 participants