[Issue]: hipblas status not supported on certain shape pytorch matmul #3550

@ZJLi2013

Description

Problem Description

    ray_directions = torch.matmul(K, C).squeeze(-1)
    ray_directions = ray_directions.reshape(*B, 3)

    [HIPBLAS DEBUG] Original intrinsics shape: torch.Size([1, 98, 378, 504, 3, 3]), dtype: torch.float32
    [HIPBLAS DEBUG] Original coordinates shape: torch.Size([1, 98, 378, 504, 3]), dtype: torch.float32
    [HIPBLAS DEBUG] Flattened K shape: torch.Size([18670176, 3, 3]), dtype: torch.float32
    [HIPBLAS DEBUG] Flattened C shape: torch.Size([18670176, 3, 1]), dtype: torch.float32
    [HIPBLAS DEBUG] Batch dimensions B: torch.Size([1, 98, 378, 504]), Total N: 18670176

    ray_directions = torch.matmul(K, C).squeeze(-1)  # (N, 3) - uses 2D GEMM
                     ^^^^^^^^^^^^^^^^^^
RuntimeError: CUDA error: HIPBLAS_STATUS_NOT_SUPPORTED when calling `HIPBLAS_STATUS_NOT_SUPPORTED`

Any suggestions?
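One possible workaround, since `C` ends in a singleton dimension: the batched 3×3 matrix-vector product can be rewritten as a broadcast multiply-and-reduce, which never dispatches to a batched hipBLAS GEMM at all; alternatively, splitting the 18,670,176-element batch dimension into chunks may help if the failure is triggered by the very large batch count (an assumption, not confirmed). This is a sketch under those assumptions; the helper names and the chunk size are illustrative, not part of any API:

```python
import torch

def matmul_mv_broadcast(K: torch.Tensor, C: torch.Tensor) -> torch.Tensor:
    """Equivalent to torch.matmul(K, C).squeeze(-1) for K: (N, 3, 3),
    C: (N, 3, 1), computed without a batched GEMM.

    result[n, i] = sum_j K[n, i, j] * C[n, j, 0]
    """
    # C.transpose(-1, -2) has shape (N, 1, 3); broadcasting against K
    # gives (N, 3, 3), and reducing over the last dim yields (N, 3).
    return (K * C.transpose(-1, -2)).sum(-1)

def matmul_chunked(K: torch.Tensor, C: torch.Tensor,
                   chunk: int = 1 << 20) -> torch.Tensor:
    """Same result via torch.matmul on smaller batch slices, in case the
    total batch count is what trips HIPBLAS_STATUS_NOT_SUPPORTED
    (assumption -- chunk size chosen arbitrarily)."""
    outs = [torch.matmul(K[i:i + chunk], C[i:i + chunk]).squeeze(-1)
            for i in range(0, K.shape[0], chunk)]
    return torch.cat(outs, dim=0)
```

Both return a `(N, 3)` tensor that can be reshaped with `.reshape(*B, 3)` exactly as in the original code; for tiny 3×3 matrices the broadcast version may even be faster than a batched GEMM, since the per-matrix work is dominated by launch overhead.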

Operating System

Ubuntu 22.04

CPU

AMD

GPU

MI300

ROCm Version

ROCm 6.4.3

ROCm Component

No response

Steps to Reproduce

No response

(Optional for Linux users) Output of /opt/rocm/bin/rocminfo --support

No response

Additional Information

No response
