How much vRAM should I have for fine tuning DeBERTa v2 xxlarge? #10803

Description

@ngoquanghuy99

I'm fine-tuning DeBERTa v2 xxlarge (1.5B parameters) on an Nvidia Tesla T4 (16 GB of vRAM), and training fails with "CUDA out of memory".
How much vRAM is enough?
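For scale, a back-of-the-envelope estimate (a sketch of the usual fp32 + Adam accounting, not an official figure for this model): full fine-tuning keeps roughly four parameter-sized buffers resident — the weights, the gradients, and Adam's two moment buffers — before any activation memory is counted.

```python
def estimate_finetune_vram_gb(n_params: float, bytes_per_param: int = 4) -> float:
    """Rough lower bound (in GiB) for full fine-tuning with Adam in fp32.

    Counts only weights, gradients, and Adam's two moment buffers
    (exp_avg and exp_avg_sq); activations, CUDA context, and
    fragmentation overhead all come on top of this.
    """
    weights = n_params * bytes_per_param
    grads = n_params * bytes_per_param
    adam_states = 2 * n_params * 4  # both moment buffers kept in fp32
    return (weights + grads + adam_states) / 1024**3

# DeBERTa v2 xxlarge: ~1.5B parameters
print(round(estimate_finetune_vram_gb(1.5e9), 1))  # ~22.4 GiB before activations
```

By this estimate the optimizer state alone already exceeds a T4's 16 GB, which is consistent with the OOM above; techniques such as gradient checkpointing, mixed precision, or optimizer-state sharding (e.g. DeepSpeed ZeRO) reduce some of these terms.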
@LysandreJik
