
AR Mode in Shot Generator

Charles Forman edited this page Apr 26, 2020 · 1 revision


*(Screenshot: AR Mode in Shot Generator)*

AR Mode is a way for people to view and interact with their Shot Generator scenes in 3D, using their phone as a 3D virtual viewport. The functionality is similar to VR, except that in AR the user has no controllers, only their phone. So the phone functions as both the viewport and the controller. AR is great because everyone has a mobile phone, while few people have VR headsets. Additionally, no one gets sick using their mobile phone, whereas a good percentage of VR users experience motion sickness.

This is good for getting a sense of the scene, creating a scene from scratch and manipulating an existing scene.

Basic Usage

The user connects to the SG scene just as another VR user would. On their phone, they tap "Enter AR". This enters an AR context where the Shot Generator scene is fully overlaid on the camera view, and they can freely move their viewport through the world. In the viewport, the center of the screen is the "cursor", or selection ray. So they can point at a place on the ground and press the teleport button to teleport there, or move the cursor over an object and hold the Select button to pick the object up, move it to a new location, and drop it.

The interface buttons that the user needs to interact with the scene are overlaid on the view. The user touches / holds those interface buttons to perform various tasks.

Technical Explanation

Once the AR context is started, the entire world is overlaid on the real world. Normally in AR, objects are composited over the camera's image to make it appear that the 3D objects are in the real world. In our case, we do not want this. We only want to display the virtual world. We are not really augmenting existing reality, we are using reality to provide a window into the virtual world through the phone's camera/IMU tracking and display.

World Scale

Changing the world's scale gives the user better control for certain tasks.

Moving the phone around in AR is surprisingly accurate. This means you can do tasks like selecting IK control points to make very fine adjustments to a pose. In AR, the user's physical range of motion is a little less than in VR. For example, in VR you may be in an open room, but in AR you are probably sitting at a desk. Therefore we can scale the world down so the user's movements cover more virtual distance. At 100%, the world is 1:1, so 1 m = 1 m. At 50%, 0.5 m = 1 m. In mini mode, the scale is 1/6th, so 0.1666 m = 1 m.
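The scale ratios above reduce to a single division. A minimal sketch, with an assumed helper name (not from the codebase), showing how a physical phone movement maps to virtual distance at each scale:

```javascript
// Hypothetical helper: convert physical phone movement (meters) into
// virtual-world distance at the current world scale.
// worldScale 1.0 = 100% (1:1), 0.5 = 50%, 1/6 = mini mode.
function physicalToVirtual(physicalMeters, worldScale) {
  return physicalMeters / worldScale;
}

const MINI_SCALE = 1 / 6;
console.log(physicalToVirtual(1.0, 1.0));        // 1 m of movement = 1 m at 100%
console.log(physicalToVirtual(0.5, 0.5));        // 0.5 m of movement = 1 m at 50%
console.log(physicalToVirtual(1 / 6, MINI_SCALE)); // ~0.1666 m of movement = 1 m in mini mode
```

The same factor would scale the rendered scene itself, so the phone's tracked motion and the shrunken world stay consistent.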

Simplicity of use / UI Complexity

Because we have to put all of the buttons on top of the viewport, we have to be very thoughtful about the design and simplicity of the interface. Therefore, the functionality of AR will be less than that of VR. This is acceptable because the user also has Shot Generator on their desktop computer, where fine adjustments can be made.

Therefore, we don't need to include parameter settings for each object. For example, the user could not add an object and change it to a chair in AR. They would add the object in AR, then change it to a chair in Shot Generator on their desktop. This could change in the future, but I think it's important to keep the UI as clean and simple as possible.

Interface Buttons:

- add object
- scale world
- select
- teleport
- place camera
- settings
- directional arrows
