
Deepbots GoalEnv #79

Open · tsampazk opened this issue Mar 4, 2021 · 11 comments

Labels: enhancement (New feature or request)

Comments
tsampazk commented Mar 4, 2021

No description provided.

tsampazk added the enhancement (New feature or request) label Mar 4, 2021

tsampazk commented Dec 5, 2021

@KelvinYang0320 Are you interested in taking a look at gym's GoalEnv and what we need to do in order to integrate it with deepbots?


KelvinYang0320 commented Dec 5, 2021

@tsampazk Yes, I will spend some time looking into GoalEnv.

KelvinYang0320 self-assigned this Dec 5, 2021

tsampazk commented Dec 5, 2021

@KelvinYang0320 Thank you!

KelvinYang0320 linked a pull request Dec 8, 2021 that will close this issue
KelvinYang0320 commented

@ManosMagnus @tsampazk
GoalEnv was removed from Gym at the end of last year.
I think there are two alternative solutions:

  1. Deepbots can still inherit from GoalEnv via `pip install gym-robotics`, but users will then need to install `gym-robotics` when installing deepbots.
  2. We can copy/modify their GoalEnv and integrate it with deepbots without requiring `gym-robotics` (a rough sketch of what that could look like follows below):
    https://github.com/Farama-Foundation/Gym-Robotics/blob/main/gym_robotics/core.py

What do you think?
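For reference, here is a minimal sketch of what a vendored GoalEnv base class could look like if we went with option 2. It is loosely modelled on the linked gym_robotics/core.py, but the exact checks, signatures and error types there may differ:

```python
from abc import abstractmethod

import gym
from gym import error, spaces


class GoalEnv(gym.Env):
    """Goal-based environment: observations are dicts with 'observation',
    'achieved_goal' and 'desired_goal' keys, and the reward is a pure
    function of the achieved and desired goals."""

    def reset(self, **kwargs):
        # Concrete envs override reset() and call super().reset(**kwargs)
        # so this sanity check runs against their observation space.
        if not isinstance(self.observation_space, spaces.Dict):
            raise error.Error("GoalEnv requires a gym.spaces.Dict observation space")
        for key in ("observation", "achieved_goal", "desired_goal"):
            if key not in self.observation_space.spaces:
                raise error.Error(f"Missing '{key}' key in the Dict observation space")

    @abstractmethod
    def compute_reward(self, achieved_goal, desired_goal, info):
        """Reward as a function of goals only, so it can be recomputed for
        relabelled goals (e.g. by HER)."""
        raise NotImplementedError
```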


tsampazk commented Jun 23, 2022

This is quite tricky, I think. On the one hand, we would very much prefer not to introduce any additional dependencies to the framework, especially for a library that is (or will be) used only in fairly specific use cases rather than universally. On the other hand, I don't think it would be proper to just copy the existing GoalEnv from the gym-robotics library.

@KelvinYang0320 is this actually needed for the panda example? Or any other current example?
Edit: I think I remember that some implementations of RL algorithms require GoalEnv and that's why we started working on it, right?


KelvinYang0320 commented Jun 23, 2022

> @KelvinYang0320 is this actually needed for the panda example? Or any other current example?

I think GoalEnv is quite useful for our robotic tasks, as I mentioned here.
I have made a Panda-GoalEnv draft PR to check these features:

  1. Goal-compatible observation space
    • observation, desired_goal, and achieved_goal
  2. compute_reward(achieved_goal, desired_goal, info)
    • Our get_reward(action) is actually not that suitable for robotics control tasks. (A rough sketch of both features follows below.)

Another example is that Panda-gym v2.0.1 works with gym-robotics.GoalEnv.
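To make those two features concrete, a rough, hypothetical sketch of a goal-compatible observation space and a compute_reward for a simple 3D reaching task could look like this (the sizes, threshold and sparse-reward choice are illustrative, not necessarily what the Panda PR does):

```python
import numpy as np
from gym import spaces

GOAL_SIZE = 3              # x, y, z of the target position (illustrative)
OBS_SIZE = 10              # e.g. joint positions/velocities (illustrative)
DISTANCE_THRESHOLD = 0.05  # metres (illustrative)

# Goal-compatible observation space with the three required keys.
observation_space = spaces.Dict({
    "observation":   spaces.Box(-np.inf, np.inf, shape=(OBS_SIZE,), dtype=np.float64),
    "achieved_goal": spaces.Box(-np.inf, np.inf, shape=(GOAL_SIZE,), dtype=np.float64),
    "desired_goal":  spaces.Box(-np.inf, np.inf, shape=(GOAL_SIZE,), dtype=np.float64),
})


def compute_reward(achieved_goal, desired_goal, info):
    # Sparse reward: 0 when the achieved goal is within the threshold of the
    # desired goal, -1 otherwise. Vectorized over batches of goals, which is
    # what HER-style goal relabelling relies on.
    distance = np.linalg.norm(achieved_goal - desired_goal, axis=-1)
    return -(distance > DISTANCE_THRESHOLD).astype(np.float64)
```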

> Edit: I think I remember that some implementations of RL algorithms require GoalEnv and that's why we started working on it, right?

Yes, HER requires the environment to inherit from gym.GoalEnv in SB3.
However, it seems that they are fixing that now.
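For context, in recent SB3 versions HER is used via HerReplayBuffer together with a dict-observation environment rather than via gym.GoalEnv inheritance; a minimal usage sketch (the environment id here is a placeholder) might look like:

```python
import gym
from stable_baselines3 import SAC, HerReplayBuffer

# "MyGoalEnv-v0" is a placeholder for a registered goal-based environment that
# exposes observation/achieved_goal/desired_goal and a compute_reward() method.
env = gym.make("MyGoalEnv-v0")

model = SAC(
    "MultiInputPolicy",
    env,
    replay_buffer_class=HerReplayBuffer,
    replay_buffer_kwargs=dict(
        n_sampled_goal=4,
        goal_selection_strategy="future",
    ),
    verbose=1,
)
model.learn(total_timesteps=10_000)
```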

tsampazk commented

Alright, thanks for the information!

> 1. Deepbots can still inherit from GoalEnv via `pip install gym-robotics`, but users will then need to install `gym-robotics` when installing deepbots.

I thought about the additional dependency on gym-robotics. Using it would mean inheriting from some gym-robotics class that provides GoalEnv. I will need some time to look into this in more detail, so I think we can put it on hold for a while and focus on the fixes regarding deepworlds, unless you have a clear idea of what we could do specifically, in which case we can discuss it.
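To have something concrete to discuss, here is a purely hypothetical sketch of what that extra inheritance could look like, assuming deepbots exposes a RobotSupervisorEnv-style base class (the actual class name and import path may differ across versions) and GoalEnv comes from the linked gym_robotics.core module:

```python
from deepbots.supervisor import RobotSupervisorEnv  # hypothetical import path
from gym_robotics.core import GoalEnv


class RobotGoalSupervisorEnv(RobotSupervisorEnv, GoalEnv):
    """Hypothetical deepbots base class for goal-based tasks."""

    def get_reward(self, action):
        # Keep the existing deepbots get_reward() entry point, but delegate
        # to the goal-based reward so HER-style relabelling stays possible.
        obs = self.get_observations()
        return self.compute_reward(obs["achieved_goal"], obs["desired_goal"], {})

    def compute_reward(self, achieved_goal, desired_goal, info):
        # Implemented per task by the concrete environment.
        raise NotImplementedError
```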

KelvinYang0320 commented

@tsampazk I agree to put this on hold. 👌
Check openai/gym#2456 about the robotics environments:

> My goal is to make all the environments inside Gym itself good general-purpose benchmarks that someone new to the field of reinforcement learning can look at and say "okay, these are the things everyone uses that I should play with and understand."

tsampazk commented

Yeah, I skimmed through that issue and discussion and followed a couple of links, but it started to look like a long rabbit hole of links and discussions, so I stopped myself. :P

That quote is a really good summary of gym's mission, but I'm not exactly sure what it means for us, being a framework for a robotic simulator. I guess we'll see in the future.

KelvinYang0320 commented

@tsampazk I think we can move this issue to Pre-release 0.2.1-dev0 or close it. What do you think?

tsampazk commented

Yes @KelvinYang0320, it seems to still be an open issue; I think we can move it to the pre-release.
