vacuumdemo.mp4
Source code for a demo project that showcases how to work with Apple Vision Pro using the RealityKit and ARKit APIs.
Here I showcase how to:
- Use ARKit for head tracking, hand tracking, and scene understanding (see the tracking sketch after this list)
- Load & play sounds (see the collision & sound sketch below)
- Process collisions (see the collision & sound sketch below)
- Work with underlying mesh data using MTLBuffers (see the mesh-buffer sketch below)
- I always enjoy working with Injection, and on Apple Vision Pro it feels like you are a mage: you change the code and something changes in front of you while you are wearing the device
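
Below is a minimal sketch of what hand and head tracking can look like with the visionOS ARKit APIs. The `TrackingExample` type and its method names are illustrative only, not the project's actual code; running the providers on device also requires the hand-tracking usage description in Info.plist and an immersive-space context.

```swift
import ARKit
import QuartzCore
import simd

// Illustrative wrapper around an ARKit session with hand and world tracking.
final class TrackingExample {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    let worldTracking = WorldTrackingProvider()

    func start() async throws {
        // Run both providers in one session.
        try await session.run([handTracking, worldTracking])

        // Stream hand anchor updates as they arrive.
        for await update in handTracking.anchorUpdates {
            let anchor = update.anchor
            guard anchor.isTracked,
                  let tip = anchor.handSkeleton?.joint(.indexFingerTip) else { continue }
            // World-space transform of the index fingertip.
            let tipTransform = anchor.originFromAnchorTransform * tip.anchorFromJointTransform
            print("\(anchor.chirality) index tip at \(tipTransform.columns.3)")
        }
    }

    // "Head tracking": query the device (head) anchor for the current time.
    func headPose() -> simd_float4x4? {
        worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime())?
            .originFromAnchorTransform
    }
}
```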
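
Collision processing and sound playback can be combined in RealityKit roughly as in the sketch below: subscribe to collision events and play a one-shot sound on the entity that was hit. This is an assumed, simplified setup, and `"hit.wav"` is a placeholder asset name, not a file from this repo.

```swift
import SwiftUI
import RealityKit

// Sketch: a RealityView that plays a sound whenever a collision begins.
struct CollisionSoundView: View {
    @State private var subscription: EventSubscription?

    var body: some View {
        RealityView { content in
            // A simple dynamic box that can collide with other entities.
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            box.generateCollisionShapes(recursive: false)
            box.components.set(PhysicsBodyComponent(mode: .dynamic))
            content.add(box)

            // Placeholder audio asset; replace with a resource from your bundle.
            let sound = try? await AudioFileResource(named: "hit.wav")

            // Subscribe to collision-began events anywhere in the scene.
            subscription = content.subscribe(to: CollisionEvents.Began.self) { event in
                guard let sound else { return }
                // Play the sound from the first entity involved in the collision.
                _ = event.entityA.playAudio(sound)
            }
        }
    }
}
```

Keeping the returned `EventSubscription` in state keeps the handler alive for the lifetime of the view.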
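
For the mesh data, here is a sketch (assumed names, not the project's code) of reading raw vertex positions from a scene-reconstruction `MeshAnchor` through its underlying MTLBuffer. It assumes the vertex format is three packed floats per vertex; such anchors stream from a `SceneReconstructionProvider`'s `anchorUpdates`.

```swift
import ARKit
import Metal
import simd

// Read vertex positions out of the MTLBuffer backing a MeshAnchor's geometry.
func vertexPositions(of anchor: MeshAnchor) -> [SIMD3<Float>] {
    let vertices = anchor.geometry.vertices   // GeometrySource backed by an MTLBuffer
    let buffer = vertices.buffer              // the raw Metal buffer
    var result: [SIMD3<Float>] = []
    result.reserveCapacity(vertices.count)

    // Walk the buffer using the source's offset and stride.
    // Assumption: each vertex is three tightly packed Floats (x, y, z).
    let base = buffer.contents().advanced(by: vertices.offset)
    for i in 0..<vertices.count {
        let pointer = base.advanced(by: i * vertices.stride)
            .assumingMemoryBound(to: (Float, Float, Float).self)
        let (x, y, z) = pointer.pointee
        result.append(SIMD3(x, y, z))
    }
    return result
}
```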
I wish I had an open-source example like this before. Happy to share it with the community.