Found the following error:
AttributeError: 'RoPEAttention' object has no attribute 'Weighted_GAP'
Originating from:
FSSAM/sam2/modeling/sam/transformer.py, line 455
The RoPEAttention class inherits from the Attention class (defined earlier in the same file), which in turn inherits from torch.nn.Module. None of these classes defines an attribute or method named Weighted_GAP; the name appears only at the call site inside RoPEAttention.
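The name Weighted_GAP commonly refers to masked global average pooling in few-shot segmentation codebases (e.g. PFENet), which FSSAM appears to build on. A sketch of that helper, defined as a free function that could be added to transformer.py (or imported) to resolve the error, is below; the exact signature FSSAM expects is an assumption:

```python
import torch
import torch.nn.functional as F

def Weighted_GAP(supp_feat: torch.Tensor, mask: torch.Tensor) -> torch.Tensor:
    """Masked global average pooling (PFENet-style; signature assumed).

    Averages `supp_feat` over the spatial positions weighted by `mask`,
    producing one prototype vector per image.

    supp_feat: (B, C, H, W) feature map
    mask:      (B, 1, H, W) soft or binary foreground mask
    returns:   (B, C, 1, 1) pooled prototype
    """
    supp_feat = supp_feat * mask  # zero out background features
    feat_h, feat_w = supp_feat.shape[-2:]
    # avg_pool2d over the full map times H*W gives the spatial sum;
    # the small epsilon guards against division by zero for empty masks
    area = F.avg_pool2d(mask, (feat_h, feat_w)) * feat_h * feat_w + 5e-4
    supp_feat = F.avg_pool2d(supp_feat, (feat_h, feat_w)) * feat_h * feat_w / area
    return supp_feat
```

If RoPEAttention calls it as `self.Weighted_GAP(...)`, it would instead need to be bound as a method on the class (or the call changed to the free function above).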