I've tried two approaches; both still produce a torch.Tensor and output the correct values, but I still only pass 1/4 tests.
Here are the two ways:
def relu(x: torch.Tensor) -> torch.Tensor:
    # return torch.as_tensor(list(map(lambda n: n if n >= 0.0 else -0.0, x)))
    return torch.as_tensor([n if n >= 0.0 else -0.0 for n in x])
The error message I get is confusing. I think it means my implementation isn't compatible with multi-dimensional tensors — or does this error mean something else?
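Poking at it in the interpreter, this is my best guess at a minimal repro (my understanding, possibly wrong): iterating a 2-D tensor yields rows, not scalars, so `n >= 0.0` produces a boolean tensor that can't be used in an `if`. A fully vectorized version avoids the Python loop entirely:

```python
import torch

x = torch.tensor([[-1.0, 2.0], [3.0, -4.0]])

# Iterating a 2-D tensor yields rows (1-D tensors), not scalars.
row = next(iter(x))
cond = row >= 0.0  # a boolean tensor: tensor([False, True])
# `if cond:` would raise "Boolean value of Tensor with more than
# one value is ambiguous", because truthiness of a multi-element
# boolean tensor is undefined.

# A vectorized ReLU works on any shape (no Python-level branching):
def relu(x: torch.Tensor) -> torch.Tensor:
    return torch.clamp(x, min=0.0)

print(relu(x))  # tensor([[0., 2.], [3., 0.]])
```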
Why does it say element 0 of tensors does not require grad and does not have a grad_fn, yet this throws an error? It reads like the error cancels itself out within the same sentence.
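Here is a small experiment I ran to try to understand the grad error (assuming the grader calls .backward() on the output): rebuilding the result as a Python list seems to detach it from autograd, whereas staying inside tensor ops keeps a grad_fn.

```python
import torch

x = torch.tensor([-1.0, 2.0], requires_grad=True)

# Rebuilding a tensor from plain Python floats creates a brand-new
# leaf tensor with no connection to x, so it has no grad_fn:
y = torch.as_tensor([float(v) for v in x])
print(y.requires_grad)  # False -> backward() through y fails

# Staying inside tensor operations keeps the autograd graph intact:
z = torch.clamp(x, min=0.0)
z.sum().backward()
print(x.grad)  # tensor([0., 1.])
```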
🧪 Testing: Implement ReLU (Easy)
──────────────────────────────────────────────────
✅ [1/4] Basic values (1.6ms)
💥 [2/4] 2-D tensor
RuntimeError: Boolean value of Tensor with more than one value is ambiguous
return torch.as_tensor([n if n >= 0.0 else -0.0 for n in x])
^^^^^^^^
RuntimeError: Boolean value of Tensor with more than one value is ambiguous
💥 [3/4] Gradient check
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
return Variable._execution_engine.run_backward( # Calls into the C++ engine to run the backward pass
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
RuntimeError: element 0 of tensors does not require grad and does not have a grad_fn
💥 [4/4] Performance
RuntimeError: Boolean value of Tensor with more than one value is ambiguous
return torch.as_tensor([n if n >= 0.0 else -0.0 for n in x])
^^^^^^^^
RuntimeError: Boolean value of Tensor with more than one value is ambiguous
──────────────────────────────────────────────────
📊 1/4 tests passed.
Keep going! Use hint("relu") if you're stuck.
So I opened the interpreter and typed the solution to the Relu challenge in manually, with fixed arguments. That then gave me this error:

Surely there's more than one way to do this challenge without copying the solution verbatim.
Another thing I find confusing: the solution gets a green tick on the 2-D tensor test, but the output isn't 2-D — isn't it a 1-D tensor?
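For reference, this is how I've been checking dimensionality (maybe I'm misreading the tick). A list comprehension over the flattened tensor definitely produces a 1-D result:

```python
import torch

x = torch.tensor([[-1.0, 2.0], [3.0, -4.0]])
print(x.ndim, x.shape)  # 2 torch.Size([2, 2])

# Rebuilding from a flat Python list drops the original 2-D shape:
flat = torch.as_tensor([float(v) for v in x.flatten()])
print(flat.ndim, flat.shape)  # 1 torch.Size([4])
```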
