
I modified the xacro model and added a kinect camera, and there are currently two issues #231

Open
TalentLeshen-DLUT opened this issue Nov 30, 2024 · 11 comments

Comments

@TalentLeshen-DLUT

I modified the xacro model in the ROS 2 Humble 2.1.1 release you provided, adding part of my own model and an Azure Kinect camera model. I can successfully drive the robot arm with the mock setup and RViz2, but when I try to use Gazebo, it fails to open.
I integrated https://github.com/chenshuxiao/azure-kinect-driver-ros2-humble?tab=readme-ov-file and vision_opencv into the project. When I start the mock setup and RViz2, it reports a problem with the TF tree, but I don't know how to fix it.
Could you give me some advice?
[Screenshot from 2024-11-30 17-01-32]
[Screenshot from 2024-11-30 17-01-41]
[Screenshot from 2024-11-30 17-14-13]
[Screenshot from 2024-11-30 17-15-00]

@mhubii
Member

mhubii commented Nov 30, 2024

hi @TalentLeshen-DLUT. Could you share a repository of this integration?

@TalentLeshen-DLUT
Author

> hi @TalentLeshen-DLUT. Could you share a repository of this integration?

Yes, you can see my project at the link below. I'm looking forward to your advice. https://github.com/TalentLeshen-DLUT/iiwa14_with_Azure_Kinect

@mhubii
Member

mhubii commented Dec 2, 2024

@TalentLeshen-DLUT
Author

Thanks for your answer. I am currently trying to modify the Gazebo-related content. Looking at the visual integration aspect, I found that I need to add a sensors_3d.yaml file, just like lines 47-51 of https://github.com/moveit/moveit2_tutorials/blob/main/doc/examples/perception_pipeline/launch/perception_pipeline_demo.launch.py:

```python
.sensors_3d(
    file_path=os.path.join(
        get_package_share_directory("moveit2_tutorials"),
        "config/sensors_3d.yaml",
    )
)
```

but I'm not sure where to make the changes.
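For illustration, here is a minimal sketch of how such a sensors_3d.yaml could be passed through MoveItConfigsBuilder in a launch file. The package and robot names ("my_iiwa14_moveit_config", "iiwa14") are placeholders, not names from this repository, and would need to be replaced with your own MoveIt configuration package:

```python
import os

from ament_index_python.packages import get_package_share_directory
from launch import LaunchDescription
from launch_ros.actions import Node
from moveit_configs_utils import MoveItConfigsBuilder


def generate_launch_description():
    # Hypothetical robot/package names; replace with your own MoveIt config package.
    moveit_config = (
        MoveItConfigsBuilder("iiwa14", package_name="my_iiwa14_moveit_config")
        .sensors_3d(
            file_path=os.path.join(
                get_package_share_directory("my_iiwa14_moveit_config"),
                "config/sensors_3d.yaml",
            )
        )
        .to_moveit_configs()
    )

    # The resulting parameters (including the 3D sensor config) are passed to move_group.
    move_group_node = Node(
        package="moveit_ros_move_group",
        executable="move_group",
        output="screen",
        parameters=[moveit_config.to_dict()],
    )

    return LaunchDescription([move_group_node])
```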

@mhubii
Member

mhubii commented Dec 4, 2024

hi @TalentLeshen-DLUT, I've been writing a quick demo for how to integrate a camera into Gazebo and will share it shortly. Out of curiosity, how are the Azure Kinect and Gazebo related? As in, are you trying to have the Azure Kinect mesh inside Gazebo? Thank you!

@TalentLeshen-DLUT
Author

> hi @TalentLeshen-DLUT, I've been writing a quick demo for how to integrate a camera into Gazebo and will share it shortly. Out of curiosity, how are the Azure Kinect and Gazebo related? As in, are you trying to have the Azure Kinect mesh inside Gazebo? Thank you!

I currently don't think there's much difference between the Azure Kinect DK and other cameras, apart from the relevant topics, as shown in the figure. By bringing up the camera and the robot and then publishing the transform between the camera frame and the robot frame, the integration of the two should be achievable. I wanted to put the Kinect into Gazebo, but since I am only just learning and using ROS 2, I found it difficult to do so.
[Screenshot from 2024-12-05 10-29-08]
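For reference, connecting the camera's TF tree to the robot's usually comes down to publishing a single static transform between a robot link and the camera's base frame. Below is a minimal launch sketch; the frame names ("lbr_link_ee", "camera_base") and the offset values are placeholders for the actual mounting in this project, not values taken from the repository:

```python
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    # Hypothetical frame names and offsets; adjust to where the camera is
    # actually mounted relative to the robot.
    camera_tf = Node(
        package="tf2_ros",
        executable="static_transform_publisher",
        arguments=[
            "--x", "0.0", "--y", "0.0", "--z", "0.05",
            "--roll", "0.0", "--pitch", "0.0", "--yaw", "0.0",
            "--frame-id", "lbr_link_ee",
            "--child-frame-id", "camera_base",
        ],
        output="screen",
    )
    return LaunchDescription([camera_tf])
```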

@mhubii
Member

mhubii commented Dec 5, 2024

no worries. I hope to post a demo here soon. Until then: looking through your repo, you add tags to existing xacro files. Xacro files can be thought of as a programmatic way to combine XML files (think of import in Python or #include in C++). Meaning, when you wish to create a new xacro file to describe your own robot, you would usually use xacro:include to do so. In your case, you would use xacro:include to include the iiwa and to include the Azure Kinect. Everything Gazebo-related (such as a camera plugin) goes inside gazebo tags; Gazebo uses these tags to generate SDF files. Finally, you have to add joints / links as necessary to combine the different includes.
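To illustrate that structure, here is a minimal, untested xacro sketch. The file names, macro usage, link names, and plugin settings are assumptions made for illustration; they need to be adapted to the actual lbr_description and camera description files and to the Gazebo version in use:

```xml
<?xml version="1.0"?>
<robot name="iiwa14_with_kinect" xmlns:xacro="http://www.ros.org/wiki/xacro">
  <!-- Include the robot and camera descriptions (paths are illustrative). -->
  <xacro:include filename="$(find lbr_description)/urdf/iiwa14/iiwa14_description.xacro"/>
  <xacro:include filename="$(find my_kinect_description)/urdf/azure_kinect.xacro"/>
  <!-- The included files typically define macros that must then be instantiated here
       (the macro names depend on the actual description packages). -->

  <!-- Attach the camera to the robot with a fixed joint; adjust parent link and offset. -->
  <joint name="camera_mount_joint" type="fixed">
    <parent link="lbr_link_ee"/>
    <child link="camera_base"/>
    <origin xyz="0 0 0.05" rpy="0 0 0"/>
  </joint>

  <!-- Everything Gazebo-related goes inside gazebo tags. -->
  <gazebo reference="camera_base">
    <sensor name="kinect_depth" type="depth">
      <update_rate>30</update_rate>
      <camera>
        <horizontal_fov>1.2</horizontal_fov>
        <image>
          <width>640</width>
          <height>576</height>
        </image>
        <clip>
          <near>0.25</near>
          <far>5.0</far>
        </clip>
      </camera>
      <!-- Plugin name/filename assume Gazebo Classic with gazebo_ros_pkgs. -->
      <plugin name="kinect_camera_controller" filename="libgazebo_ros_camera.so">
        <ros>
          <namespace>/camera</namespace>
        </ros>
        <frame_name>camera_base</frame_name>
      </plugin>
    </sensor>
  </gazebo>
</robot>
```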

@TalentLeshen-DLUT
Author

@mhubii Thanks for your answers. I think I now have a better understanding of how to integrate the Kinect into Gazebo. I'm looking forward to seeing your demo soon.

@TalentLeshen-DLUT
Author

@mhubii Hello, in my previous work I implemented the connection between the motorized spindle, Beckhoff, and the robot control cabinet, which enabled me to control the start/stop and the speed of the motorized spindle through Sunrise Workbench. Start/stop is achieved by sending a DC +24 V digital signal, and the speed is set by sending a continuous DC 0-10 V analog signal. Now I want to control the motorized spindle from ROS 2 through FRI 1.15, but I have no clue how. Could you please give me some advice on how I should modify the FRI side to achieve my goal?

@TalentLeshen-DLUT
Author

I looked through the KUKA Fast Robot Interface C++ SDK (version 1.15). I think that by calling

  1. void KUKA::FRI::LBRCommand::setBooleanIOValue(const char *, const bool)
  2. void KUKA::FRI::LBRCommand::setDigitalIOValue(const char *, const unsigned long long)
  3. void KUKA::FRI::LBRCommand::setAnalogIOValue(const char *, const double)

I can make the robot receive a boolean and an analog value from ROS 2 (and from the simulation) to control the start/stop and the speed of the motorized spindle. But I have no idea how to get ROS 2 to send the messages.
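As a rough sketch of the ROS 2 side only: a minimal rclpy node that publishes a start/stop flag and a speed setpoint on two hypothetical topics. The topic names and message types are assumptions, not an existing interface of this repository; a separate component on the FRI side would still have to subscribe to them and forward the values via the setBooleanIOValue / setAnalogIOValue calls listed above (which, per the maintainer's reply below, is not yet supported in the stack).

```python
import rclpy
from rclpy.node import Node
from std_msgs.msg import Bool, Float64


class SpindleCommandPublisher(Node):
    """Publishes a start/stop flag and a speed setpoint for the spindle.

    The topic names are hypothetical; an FRI-side node would need to
    subscribe to them and map the values onto the robot's IO commands.
    """

    def __init__(self):
        super().__init__("spindle_command_publisher")
        self.start_stop_pub = self.create_publisher(Bool, "spindle/start_stop", 10)
        self.speed_pub = self.create_publisher(Float64, "spindle/speed", 10)
        self.timer = self.create_timer(0.1, self.publish_commands)

    def publish_commands(self):
        # Example values: spindle on, analog output corresponding to 5.0 V.
        self.start_stop_pub.publish(Bool(data=True))
        self.speed_pub.publish(Float64(data=5.0))


def main():
    rclpy.init()
    node = SpindleCommandPublisher()
    try:
        rclpy.spin(node)
    finally:
        node.destroy_node()
        rclpy.shutdown()


if __name__ == "__main__":
    main()
```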

@mhubii
Member

mhubii commented Dec 16, 2024

Hi @TalentLeshen-DLUT. Yes, that needs to be looked into and added here. Would it be okay to continue this discussion in #198? Thank you!
