
[BugFix] make sure the params of exploration-wrapper is float #1700

Merged · 2 commits · Nov 15, 2023

Conversation

FrankTianTT (Contributor)

Description

We should make sure the params of the exploration wrapper are floats; otherwise the param decay may be abnormal.

Motivation and Context

If the user passes an int into the wrapper (e.g. 1 instead of 1.0, like stupid me), the params will always be 0 and will not change as step() is called:

  import torch
  from tensordict.nn import TensorDictModule
  from torchrl.modules import OrnsteinUhlenbeckProcessWrapper

  policy = TensorDictModule(
      module=torch.nn.Linear(4, 4, bias=False),
      in_keys=["observation"],
      out_keys=["action"],
  )
  p = OrnsteinUhlenbeckProcessWrapper(
      policy,
      eps_init=1,  # int, not float
  )
  for i in range(10):
      p.step()
      print(p.eps.item())

The result is:

0
0
0
0
0
0
0
0
0
0

If we pass a float, the wrapper works normally:

  policy = TensorDictModule(
      module=torch.nn.Linear(4, 4, bias=False),
      in_keys=["observation"],
      out_keys=["action"],
  )
  p = OrnsteinUhlenbeckProcessWrapper(
      policy,
      eps_init=1.0,  # float
  )
  for i in range(10):
      p.step()
      print(p.eps.item())

The result is:

0.9991000294685364
0.9982000589370728
0.9973000884056091
0.9964001178741455
0.9955001473426819
0.9946001768112183
0.9937002062797546
0.992800235748291
0.9919002652168274
0.9910002946853638

The cause of this bug is that the dtype of eps_init is used as the dtype of self.eps, so an integer eps_init yields an integer eps buffer and every fractional decay value is truncated to 0.
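
For illustration, here is a minimal sketch of the truncation, assuming the wrapper's defaults of eps_end=0.1 and annealing_num_steps=1000 (which match the decrement of 0.0009 per step seen above); the exact buffer update inside torchrl may differ:

  import torch

  eps_init, eps_end, annealing_num_steps = 1, 0.1, 1000

  # The eps buffer is built from eps_init, so it inherits eps_init's
  # dtype -- int64 here.
  eps = torch.tensor([eps_init])
  decrement = (eps_init - eps_end) / annealing_num_steps  # 0.0009

  # In-place assignment casts the float result back to the buffer's
  # dtype, truncating 0.9991 down to 0:
  eps[0] = max(eps.item() - decrement, eps_end)
  print(eps.item())  # 0

  # With a float eps_init the buffer is float32 and the decay survives:
  eps = torch.tensor([float(eps_init)])
  eps[0] = max(eps.item() - decrement, eps_end)
  print(eps.item())  # 0.9991000294685364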

Types of changes

What types of changes does your code introduce? Remove all that do not apply:

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds core functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)
  • Documentation (update in the documentation)
  • Example (update in the folder of examples)

Checklist

Go over all the following points, and put an x in all the boxes that apply.
If you are unsure about any of these, don't hesitate to ask. We are here to help!

  • I have read the CONTRIBUTION guide (required)
  • My change requires a change to the documentation.
  • I have updated the tests accordingly (required for a bug fix or a new feature).
  • I have updated the documentation accordingly.


pytorch-bot bot commented Nov 15, 2023

🔗 Helpful Links

🧪 See artifacts and rendered test results at hud.pytorch.org/pr/pytorch/rl/1700

Note: Links to docs will display an error until the docs builds have been completed.

❌ 2 New Failures, 6 Unrelated Failures

As of commit 09878b1 with merge base 0a38cbc:


👉 Rebase onto the `viable/strict` branch to avoid these failures

This comment was automatically generated by Dr. CI and updates every 15 minutes.

@facebook-github-bot added the CLA Signed label Nov 15, 2023
@vmoens (Contributor) left a comment:

Fantastic thanks a million

@vmoens vmoens merged commit 44bd026 into pytorch:main Nov 15, 2023
@vmoens vmoens added the bug Something isn't working label Nov 15, 2023
@FrankTianTT FrankTianTT deleted the fix-exploration branch November 18, 2023 15:23