Model download improvements / Background download #77

@mqtik

Description

Problem description

If the user opens a screen where useLLM is used but has not downloaded the model before, the download starts immediately. Since some models are quite heavy, the user is unlikely to wait for the whole download to finish; they may switch to another app, or even leave the phone on the table and walk away, letting the screen turn off.
I'm experiencing issues with this: when the user comes back to the app to check the status, the progress is stuck at, say, 12%. Either the download stops at some point, or the progress callback simply stops updating; it may be the latter.
Other times, the model has already been downloaded, but isModelReady is false and error is "Model and tokenizer already loaded".
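One way to work around the "already loaded" state described above is to treat that particular rejection as success rather than an error. This is a minimal sketch, assuming a hypothetical loadModel callback and matching on the error text; none of these names are part of the actual react-native-executorch API.

```typescript
// Hedged sketch: wrap a model-load call so that a benign
// "already loaded" rejection is treated as success instead of
// leaving the app with error set and isModelReady stuck at false.
// `loadModel` and the matched error text are assumptions.
type LoadResult = { ready: boolean; error: string | null };

async function safeLoad(
  loadModel: () => Promise<void>
): Promise<LoadResult> {
  try {
    await loadModel();
    return { ready: true, error: null };
  } catch (e) {
    const message = e instanceof Error ? e.message : String(e);
    // If the native side reports the model is already resident,
    // report ready instead of propagating an error state.
    if (/already loaded/i.test(message)) {
      return { ready: true, error: null };
    }
    return { ready: false, error: message };
  }
}
```

Matching on an error string is fragile; a dedicated "already loaded" error code from the library would make this guard unnecessary.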

Proposed solution

Steps to reproduce the download issue with the current hook implementation:

  1. Open a screen where useLLM is used
  2. Go into another app for a while
  3. Come back

The loaded issue is trickier: it happened after I had loaded the model and had a few conversations with it. After several app restarts, the hook tried to start the download again but got stuck at 0%. After further restarts, it re-downloaded the entire (previously downloaded) model.
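A background-friendly downloader could avoid both the stuck-progress and the full re-download symptoms by checking what is already on disk before fetching. The sketch below shows the core decision only: given the bytes already present locally and the remote Content-Length, it decides whether to skip, resume with an HTTP Range header, or restart. All names here are illustrative and not part of react-native-executorch.

```typescript
// Hedged sketch of resume logic for a restartable model download.
// A partial file that matches the remote size is reused as-is;
// a shorter one is resumed via an HTTP Range request; anything
// inconsistent is thrown away and re-downloaded from scratch.
type DownloadPlan =
  | { action: "skip" }
  | { action: "resume"; rangeHeader: string }
  | { action: "restart" };

function planDownload(
  localBytes: number,
  remoteBytes: number
): DownloadPlan {
  if (remoteBytes <= 0 || localBytes > remoteBytes) {
    // Unknown remote size or corrupt partial file: start over.
    return { action: "restart" };
  }
  if (localBytes === remoteBytes) {
    // Fully downloaded earlier; nothing to fetch.
    return { action: "skip" };
  }
  if (localBytes === 0) {
    return { action: "restart" };
  }
  // Ask the server for only the missing tail of the file.
  return { action: "resume", rangeHeader: `bytes=${localBytes}-` };
}
```

Running this check whenever the app returns to the foreground (e.g. via React Native's AppState listener) would let a download interrupted by backgrounding pick up where it left off instead of restarting or appearing frozen.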

Alternative solutions

No response

Benefits to React Native ExecuTorch

Usability. Users tend to get on with their lives while a model downloads, so the download should survive the app being backgrounded.

Additional context

No response
