
Could you please provide sim2sim code for deploying the H1 policy? #48

Open

dbdxnuliba opened this issue Jan 13, 2025 · 5 comments

Comments

@dbdxnuliba

dbdxnuliba commented Jan 13, 2025

Could you please provide sim2sim (MuJoCo) code for deploying the H1 policy?

@tesslerc
Collaborator

We don't have any plans to integrate MuJoCo. However, if you're interested in adding support, we'd happily accept a PR.

@dbdxnuliba
Author

Thanks for your reply. Can we train a policy for H1 using only VR data as input, i.e., only the left wrist, right wrist, and head poses? If so, what is the corresponding command?

@tesslerc
Collaborator

That's not supported "out of the box", but it should be fairly simple to adapt.
If you look at the mimic reward, it takes the full target pose and compares it with the full current pose, for both rotations and translations.

You can adapt it to consider only the 3 joints you're interested in by selecting the 3 corresponding indices before computing the mean-over-joints.
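A minimal sketch of that idea (not the actual ProtoMotions reward code; the body indices, tensor shapes, and `exp(-k * err)` shaping are illustrative assumptions):

```python
import torch

# Hypothetical indices of the head, left wrist, and right wrist in the
# robot's body ordering; look up the real ones for your config.
TRACKED_BODIES = [0, 1, 2]

def three_point_mimic_reward(target_pos, current_pos, target_rot, current_rot):
    """Positions: (batch, num_bodies, 3). Rotations: quaternions (batch, num_bodies, 4)."""
    # Select only the tracked bodies *before* averaging over joints.
    pos_err = (target_pos[:, TRACKED_BODIES] - current_pos[:, TRACKED_BODIES]).norm(dim=-1)
    # Quaternion similarity via |dot|: 1.0 when orientations match.
    dot = (target_rot[:, TRACKED_BODIES] * current_rot[:, TRACKED_BODIES]).sum(dim=-1).abs()
    rot_err = 1.0 - dot
    # Mean over the 3 tracked bodies instead of the full body.
    return torch.exp(-5.0 * pos_err.mean(dim=-1)) * torch.exp(-2.0 * rot_err.mean(dim=-1))
```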

I'd suggest mixing in some full-body data through AMP. The discriminator will help fine-tune the overall posture to be more stable and realistic, while the 3-point tracking will help ensure it matches the target coordinates.
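A sketch of how the two signals could be combined, assuming the least-squares discriminator reward from the AMP paper; the weights and function names are illustrative, not ProtoMotions code:

```python
import torch

def combined_reward(tracking_reward, disc_output, w_track=0.7, w_amp=0.3):
    """tracking_reward: (batch,); disc_output: raw discriminator score on policy transitions."""
    # AMP-style least-squares reward: max(0, 1 - 0.25 * (d - 1)^2).
    amp_reward = torch.clamp(1.0 - 0.25 * (disc_output - 1.0) ** 2, min=0.0)
    # Weighted mix of 3-point tracking and full-body style.
    return w_track * tracking_reward + w_amp * amp_reward
```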

@dbdxnuliba
Author

dbdxnuliba commented Jan 25, 2025

Thank you for your reply.
I have three questions.
Q1:
In the README, you mention "Unitree H1 humanoid with end-effector and head joints made visible".
Does this mean that when training the H1, the motion file (the Unitree H1's .npy file) only contains the poses of the head and end effectors? Is it correct that it does not require the pose information of the whole-body joints and the floating base as input?

Q2:
h1_punch.npy contains several files.

[screenshots of the files contained in h1_punch.npy]

What is the content of each file? Is it the joint angles, the position of the floating base, or only the pose information of the end effectors and the head? Could you please tell me?

Q3:
When we want to train the H1 on other motions, how can we generate such a compressed .npy motion file that meets ProtoMotions' format requirements for H1 training?

Many thanks

@tesslerc
Collaborator

  1. The motion file contains the coordinates and rotations of all joints. In this modification you also have the head, toe, and wrist positions. Since they aren't actuated, in the original config you had to infer the wrist position from the elbow position and orientation.
  2. The numpy file is a dictionary. Look at how we load and parse it in the H1 motion lib file; a quick way to inspect such a file is sketched below.
  3. https://github.com/NVlabs/ProtoMotions/blob/main/data/scripts/convert_h1_to_isaac.py is an example of how we converted motions for our demonstrations.
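A quick way to inspect the dictionary structure of such a file, assuming the .npy stores a pickled dict as described in point 2 (the path is illustrative):

```python
import numpy as np

# np.save on a dict produces a 0-d object array; .item() recovers the dict.
data = np.load("h1_punch.npy", allow_pickle=True).item()
for key, value in data.items():
    shape = getattr(value, "shape", None)
    print(f"{key}: {type(value).__name__}, shape={shape}")
```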
