Trying to run it on a low VRAM system #283

@mercurial-moon

Description

Hi,

I was testing your example_browserclient tutorial and it works well on an 8GB VRAM system; memory usage is around 3.5 to 4GB when loading the tiny.en and large-v3 models together, as your setup requires.

I'm trying to run it on a more resource-constrained system with 2GB of VRAM, so I had some questions.

  1. Does your library support the turbo model? From what I've heard, it's faster than large-v3 with only a minor reduction in accuracy.
  2. Is it possible to run example_browserclient with only one model, namely turbo? Is this allowed? Would it cause accuracy issues, or simply not work? I see that in your AudioToTextRecorder you pass in two models, tiny.en and large-v2.
  3. Do you plan to support Nvidia Parakeet? From what I understand, it was designed for streaming use cases, though I'm not sure whether streaming is the same as real-time.
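Concretely, for question 2, I imagine pointing both model slots at the same checkpoint so only one model gets loaded. A sketch, assuming RealtimeSTT's `model`, `realtime_model_type`, and `enable_realtime_transcription` parameters (names taken from the project's README; `"large-v3-turbo"` is a hypothetical model name, so please check which names faster-whisper actually accepts):

```python
# Hypothetical single-model setup: use the same checkpoint for both the
# realtime preview pass and the final transcription pass, so the library
# would only need to keep one model in VRAM.
recorder_kwargs = {
    "model": "large-v3-turbo",                # final-pass model (name is a guess)
    "enable_realtime_transcription": True,
    "realtime_model_type": "large-v3-turbo",  # reuse the same model for previews
}

# With the library installed, this would become:
# from RealtimeSTT import AudioToTextRecorder
# recorder = AudioToTextRecorder(**recorder_kwargs)
```

Is that roughly how a single-model configuration would look, or does the library assume the two slots hold different models?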
