
Question about saving .npy results and reach task #60

Open
Mael-zys opened this issue Feb 24, 2025 · 4 comments
@Mael-zys

Hi, thanks for this great work!

I'm currently using ProtoMotions V1 with the pretrained model.

python phys_anim/eval_agent.py +robot=smpl +backbone=isaacgym \
+motion_file=data/motions/smpl_humanoid_walk.npy \
+checkpoint=./data/pretrained_models/MaskedMimic/last.ckpt \
+opt=masked_mimic/tasks/path_follower +force_flat_terrain=True

But I cannot find where to save the generated motion as a .npy file. Could you give me some suggestions about this?

Another question: when I try the "reach" task, the humanoid always falls down and takes a long time to stand up. How can I solve this problem?

Thank you for your help!

@Yu-Jiale

Hi! I'm trying to use the pre-trained MaskedMimic and full-body tracker, but I haven't found where the pre-trained models are. Could you please tell me? Thanks!

@tesslerc
Collaborator

@Mael-zys yesterday we released v2.1.
One added feature is recording the motions alongside the videos. See here.

When you press "L" it will start recording. Pressing "L" again will stop and save.
It saves the video and motion file in the same directory with similar names.
The motion file is simply a dictionary of per-frame translations and rotations for the humanoid, following the joint ordering in the config file, with quaternions in xyzw format.
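To illustrate that layout, here is a minimal sketch of loading such a dictionary and shifting its quaternions from xyzw to wxyz (the tensor shapes and joint count below are hypothetical stand-ins; your recording's actual shapes depend on the humanoid and clip length):

```python
import torch

# Hypothetical stand-in for torch.load("recording.pt"): 2 frames, 24 joints.
motion_data = {
    "global_translation": torch.zeros(2, 24, 3),  # per-frame xyz positions
    "global_rotation": torch.tensor([0.0, 0.0, 0.0, 1.0]).expand(2, 24, 4),  # xyzw identity
}

# Many USD/Isaac APIs expect wxyz; rolling the last axis by 1 converts xyzw -> wxyz.
wxyz = motion_data["global_rotation"].roll(shifts=1, dims=-1)
print(wxyz[0, 0])  # tensor([1., 0., 0., 0.])
```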

Here's example code for loading the recording for the SMPL humanoid with IsaacLab and kinematically replaying it.
Keep in mind that this code is a bit messy and probably not the ideal way to replay motions.

from isaaclab.app import AppLauncher

# Launch the simulation app before importing any other Isaac modules.
app_launcher = AppLauncher({"headless": False})
simulation_app = app_launcher.app

import omni
import torch
from pxr import UsdGeom
import isaaclab.sim as sim_utils
from isaacsim.core.prims import XFormPrim
from isaaclab.utils.assets import ISAAC_NUCLEUS_DIR


# Reference the SMPL humanoid asset into the stage.
stage = omni.usd.get_context().get_stage()
UsdGeom.Xform.Define(stage, "/World/Xform/char0")
stage.GetPrimAtPath("/World/Xform/char0").GetReferences().AddReference("protomotions/data/assets/usd/smpl_humanoid.usda")

fps = 30
_file = "output/renderings/CMU-smpl-88-88_08_poses/fbt_transformer_15_steps_1gpu-2025-02-20-14-28-20.pt"
motion_data = torch.load(_file)

global_translation = motion_data["global_translation"]
global_rotation = motion_data["global_rotation"]

num_frames = global_rotation.shape[0]
print(f"Will visualize {num_frames} frames")

stage.SetEndTimeCode(num_frames)
stage.SetFramesPerSecond(fps)
stage.SetTimeCodesPerSecond(fps)

# One view over all of the humanoid's rigid bodies.
p_xform = XFormPrim(prim_paths_expr="/World/Xform/char0/bodies/*")

# Ground plane and dome light for the scene.
cfg_ground = sim_utils.GroundPlaneCfg(size=(2.0e6, 2.0e6))
cfg_ground.func("/World/ground", cfg_ground)
cfg_light = sim_utils.DomeLightCfg(
    intensity=750.0,
    texture_file=f"{ISAAC_NUCLEUS_DIR}/Materials/Textures/Skies/PolyHaven/kloofendal_43d_clear_puresky_4k.hdr",
)
cfg_light.func("/World/Light", cfg_light)

# The recording stores quaternions as xyzw; the USD APIs expect wxyz.
global_rotation_reordered = global_rotation.roll(shifts=1, dims=-1)


def play():
    # Kinematically set every body pose, frame by frame.
    for f in range(num_frames):
        body_position = global_translation[f]
        body_rotation = global_rotation_reordered[f]
        p_xform.set_world_poses(body_position.cpu(), body_rotation.cpu())
        simulation_app.update()


# Loop the replay until the window is closed.
while simulation_app.is_running():
    play()

simulation_app.close()

@tesslerc
Collaborator

@Yu-Jiale , we expect to release the models next week.
The update to v2 broke compatibility with the old models. The new models are trained; we need to evaluate them to make sure they meet our expectations, and then we will release them.

@Yu-Jiale

@tesslerc Good job! Thanks for the open-source code!
