Models should run on NPU  #8

@hrishikeshp7

Description

Lately, using the app for even a few quick searches causes the device to heat up and the battery to drain very quickly. I think this is happening because the model runs on either the CPU or the GPU.

Inference would likely be better optimised if it ran on the NPU, especially on Qualcomm devices.

Can this be done? Is it doable?
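For reference, one common way to do this is to hand the model to a runtime that can delegate to the Qualcomm NPU and fall back to the CPU when the accelerator is unavailable. The sketch below assumes an ONNX Runtime-style setup (an assumption — the issue doesn't state which inference stack the app uses) and only shows the provider-selection logic; the provider names follow ONNX Runtime's conventions.

```python
# Hypothetical sketch: pick an execution-provider order so inference prefers
# the Qualcomm NPU (QNN), then Android NNAPI, then falls back to the CPU.
# The app's actual inference stack is unknown; the names below follow ONNX
# Runtime's provider naming and are used here only for illustration.

PREFERRED_ORDER = [
    "QNNExecutionProvider",    # Qualcomm AI Engine (Hexagon NPU)
    "NnapiExecutionProvider",  # Android NNAPI; may route to NPU/DSP
    "CPUExecutionProvider",    # guaranteed fallback
]

def select_providers(available):
    """Return the preferred providers that are actually available on this
    device, in priority order, always keeping the CPU fallback last."""
    chosen = [p for p in PREFERRED_ORDER if p in available]
    if "CPUExecutionProvider" not in chosen:
        chosen.append("CPUExecutionProvider")
    return chosen
```

With ONNX Runtime, the resulting list could then be passed as the `providers` argument when creating an `InferenceSession`, so the model runs on the NPU where supported and silently falls back to the CPU elsewhere.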
