Write tests for .log_abs_det_jacobian() methods #110

@fritzo

Description

I think we could do this by numerically approximating the Jacobian via finite differences. Alternatively, we could stack a bunch of grad() calls and compute the determinant, as in this suggestion:

def test_jacobian(self):
    from torch.autograd import grad
    t = MyTransform()
    for x in my_test_points:
        x = x.requires_grad_()  # grad() needs x to track gradients
        y = t(x)
        # Build the Jacobian one row at a time, differentiating each
        # output element with respect to the input.
        jacobian = torch.stack([grad(yi, [x], create_graph=True)[0] for yi in y])
        # potrf (Cholesky) assumes the Jacobian is symmetric positive
        # definite; for J = U^T U, det(J) = prod(diag(U))**2.
        det_jacobian = torch.potrf(jacobian).diag().prod() ** 2
        log_abs_det_jacobian = det_jacobian.abs().log()
        self.assertEqual(t.log_abs_det_jacobian(x, y), log_abs_det_jacobian)

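Here is a sketch of the finite-difference alternative mentioned above; the helper name and the step size eps are illustrative, not part of any existing API:

import torch

def numeric_log_abs_det_jacobian(t, x, eps=1e-4):
    # Central-difference approximation of the Jacobian of t at x,
    # assuming x is a 1-d tensor and t maps R^n to R^n.
    n = x.numel()
    jacobian = torch.zeros(n, n)
    for i in range(n):
        dx = torch.zeros(n)
        dx[i] = eps
        jacobian[:, i] = (t(x + dx) - t(x - dx)) / (2 * eps)
    return torch.det(jacobian).abs().log()
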
It would be nice to make this test generic, like the other tests in TestTransforms; a possible shape is sketched below.
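
A generic version might look like the following; TRANSFORM_TEST_CASES is a hypothetical fixture of (transform, test_points) pairs, not an existing attribute of TestTransforms:

def test_jacobian(self):
    # Assumed module-level list of (transform, test_points) pairs,
    # one entry per Transform subclass under test.
    for transform, test_points in TRANSFORM_TEST_CASES:
        for x in test_points:
            y = transform(x)
            expected = numeric_log_abs_det_jacobian(transform, x)  # helper above
            self.assertEqual(transform.log_abs_det_jacobian(x, y), expected)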

Note that this might not work for StickBreakingTransform, whose Jacobian is non-square because the source and target spaces have different dimensions.
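
One possible workaround, sketched under the assumption that the rectangular Jacobian has full column rank, is the Gram determinant: for an m x n Jacobian J with m >= n, the volume-change factor is sqrt(det(J^T J)), so its log is 0.5 * logdet(J^T J):

import torch

def log_abs_det_rectangular(jacobian):
    # For an m x n Jacobian with m >= n and full column rank, the
    # log volume-change factor is 0.5 * logdet(J^T J).
    gram = jacobian.t().mm(jacobian)
    return 0.5 * torch.logdet(gram)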
