I think we could do this by numerically approximating the Jacobian via finite differences (a sketch of that variant follows the code below). Alternatively, we could stack a bunch of grad() calls and compute the determinant, as in this suggestion:
from torch.autograd import grad

def test_jacobian(self):
    t = MyTransform()
    for x in my_test_points:
        x = x.requires_grad_()  # grad() needs x in the autograd graph
        y = t(x)
        # Stack one gradient per output element to form the Jacobian.
        jacobian = torch.stack([grad(yi, [x], create_graph=True)[0] for yi in y])
        # Determinant via Cholesky; this only works when the Jacobian is
        # symmetric positive-definite.
        det_jacobian = torch.potrf(jacobian).diag().prod() ** 2
        log_abs_det_jacobian = torch.abs(det_jacobian).log()
        self.assertEqual(t.log_abs_det_jacobian(x, y), log_abs_det_jacobian)

It would be nice to make this test generic like the other tests in TestTransforms.
Note that this might not work for StickBreakingTransform, because the determinant computation would be over a non-square matrix: the source and target spaces have different dimensions. One possible workaround is sketched below.
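One standard generalization for a rectangular Jacobian (my suggestion here, not part of the original proposal) is the Gram determinant: for an m x n Jacobian J with m >= n, use 0.5 * log det(J^T J), which reduces to log |det J| in the square case. A minimal sketch, with log_abs_det_rectangular as a hypothetical helper name:

import torch

def log_abs_det_rectangular(jacobian):
    # For an (m x n) Jacobian with m >= n, the change-of-variables
    # factor generalizes to 0.5 * log det(J^T J).
    gram = jacobian.t() @ jacobian
    return 0.5 * torch.logdet(gram)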