ScanningWindowInferer accumulates gradient data and inference fails with out of memory #1420

@aihsani

Description

Bug Description
The documentation does not explain that the MONAI inferers accumulate backpropagation information with each additional batch. The name "inferer" implies that no gradient information should be stored.

It took a few hours of detailed debugging to find the root cause, which makes for a difficult user experience.

To Reproduce
Write a simple inference program using ScanningWindowInferer with inference performed on the GPU: memory usage grows with each batch until inference fails with an out-of-memory error. On the CPU, the gradient accumulation instead drastically slows down inference.

Expected behavior
The documentation should state that the inferers need to be wrapped in torch.no_grad(), or the current inferer classes should be renamed to ScanningWindowForwardPropagator and wrapped by a ScanningWindowInferer that internally uses torch.no_grad().
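The suggested workaround can be sketched with plain PyTorch (a minimal illustration using a stand-in torch.nn.Conv2d model rather than MONAI's actual inferer API): wrapping the forward pass in torch.no_grad() prevents autograd from recording the computation graph, so memory stays flat across windows and batches.

```python
import torch

# Stand-in for a segmentation network driven by an inferer
# (hypothetical minimal model, not MONAI's API).
model = torch.nn.Conv2d(1, 1, kernel_size=3, padding=1)
model.eval()

image = torch.randn(1, 1, 8, 8)

# Without no_grad(), the output carries autograd state, and the
# intermediate activations for every window are kept alive --
# this is the memory growth described above.
with_grad = model(image)

# With no_grad(), no graph is built, so nothing accumulates.
with torch.no_grad():
    no_grad_out = model(image)

print(with_grad.requires_grad)    # True
print(no_grad_out.requires_grad)  # False
```

Calling model.eval() alone is not enough: eval mode only changes layer behavior (dropout, batch norm) and does not disable graph construction, which is why the explicit torch.no_grad() wrapper matters here.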

Metadata

Labels: enhancement (New feature or request)