The error comes from the fact that the tensordict should have an empty batch size, like the env's. "Shape" and "batch size" are synonymous here, but I agree the error message is misleading.
@vmoens, thanks for the tip. With `tdact = TensorDict({"action": torch.tensor([1])}, batch_size=torch.Size([]))`, the code snippet no longer throws that error. I agree that the "error" message is confusing, since the condition is not really an error. What is the reason for that check to be there?
Because we want to enforce that the leading dims of the tensors match the environment batch size (and therefore comply with the specs).
If your tensordict has a batch size of [1], it means that all of its contents have a leading dimension of 1, and hence that you're running an environment that has a batch size (even if it's just 1).
Once you start executing envs in parallel, we build circular buffers to pass data across the workers, using the env specs to determine the buffer shapes. If the data is not shaped according to these conventions, it's easy to break things, since the data won't fit in the buffers.
Describe the bug
What is `env.shape`? It does not appear to be defined anywhere.
To Reproduce
```
Traceback (most recent call last):
  File ...
RuntimeError: Expected a tensordict with shape==env.shape, got torch.Size([1]) and torch.Size([])
```
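For reference, a sketch of the kind of check that produces this message (the function name and signature are hypothetical, not torchrl's actual code): the tensordict passed to `env.step` is compared against the env's batch size, and a mismatch raises the error shown above.

```python
import torch

def check_batch_size(td_batch_size: torch.Size, env_batch_size: torch.Size) -> None:
    # Hypothetical reconstruction of the check behind the reported error:
    # the tensordict passed to env.step must match the env's batch size.
    if td_batch_size != env_batch_size:
        raise RuntimeError(
            f"Expected a tensordict with shape==env.shape, "
            f"got {td_batch_size} and {env_batch_size}"
        )

# A tensordict with batch_size [1] against an unbatched env ([]) trips the check.
try:
    check_batch_size(torch.Size([1]), torch.Size([]))
except RuntimeError as err:
    print(err)

# An empty batch size passes, matching the fix discussed above.
check_batch_size(torch.Size([]), torch.Size([]))
```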
Expected behavior
`env.step` should either accept the tensordict or raise an error message that refers to attributes that actually exist (the env's `batch_size` rather than the undefined `env.shape`).
Screenshots
No screenshots; the stack trace is included above as text.
System info
Additional context
Reason and Possible fixes