These are the results of a brainstorming session in Dresden. The goal was to come up with ideas on how to use device gestures such as rotation or stacking for contact/overlay Augmented Reality (abbreviated as cAR in the rest of this document), specifically for Active Reading. I have also included some very rough sketches that we created during this meeting.

Towards a possible definition of cAR

  • a particular kind of optical see-through AR
  • where the device is directly and closely applied to a reference object or surface
  • and the augmentation is relative to this object's coordinate system, not to world coordinates (a short coordinate sketch follows this list)
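
A minimal sketch of the coordinate-system point, assuming the tracking system reports both the reference object's and the device's pose as 4×4 homogeneous transforms in world coordinates (hypothetical data, names chosen only for illustration). The augmentation is anchored by expressing the device pose in the object's local frame instead of the world frame:

```python
import numpy as np

def pose_in_reference_frame(world_T_object, world_T_device):
    """Express the device pose relative to the reference object.

    Both arguments are 4x4 homogeneous transforms from local to world
    coordinates (hypothetical tracking output). The result no longer
    depends on where the object sits in the world, so the overlay stays
    attached to the object even if the object itself is moved.
    """
    object_T_world = np.linalg.inv(world_T_object)
    return object_T_world @ world_T_device

# Example: object rotated 90 degrees about z and shifted in the world.
world_T_object = np.array([[0, -1, 0, 1.0],
                           [1,  0, 0, 2.0],
                           [0,  0, 1, 0.0],
                           [0,  0, 0, 1.0]])
world_T_device = np.eye(4)
world_T_device[:3, 3] = [1.0, 2.5, 0.0]   # device slightly offset from the object in world coordinates

print(pose_in_reference_frame(world_T_object, world_T_device))
```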

General Interaction Vocabulary for cAR

  • (multi-) touch
  • pen interaction
  • make/end contact with surface
  • translation
      • device on reference surface (relative motion)
      • device and reference object together (without relative motion)
  • rotation
      • device on reference surface (relative motion)
      • device and reference object together (without relative motion)
  • shaking
      • device on reference surface (relative motion)
      • device and reference object together (without relative motion)
  • flipping (breaks contact with the surface; rotation of the device around one of the axes parallel to the screen)
  • stacking
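
One way to make this vocabulary concrete is a small data model; a minimal sketch with hypothetical names, assuming each recognized gesture is reported together with its spatial configuration:

```python
from dataclasses import dataclass
from enum import Enum, auto

class Gesture(Enum):
    TOUCH = auto()
    PEN = auto()
    CONTACT_CHANGE = auto()   # make/end contact with the surface
    TRANSLATION = auto()
    ROTATION = auto()
    SHAKING = auto()
    FLIPPING = auto()
    STACKING = auto()

class Configuration(Enum):
    ON_REFERENCE_SURFACE = auto()     # device moves relative to the surface
    WITH_REFERENCE_OBJECT = auto()    # device and object move together
    INDEPENDENT = auto()              # no reference object involved

@dataclass
class GestureEvent:
    gesture: Gesture
    configuration: Configuration
    timestamp: float

# Example: the device is shaken while resting on the reference object.
event = GestureEvent(Gesture.SHAKING, Configuration.ON_REFERENCE_SURFACE, timestamp=12.7)
```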

Device Gestures for Active Reading

Horizontal Rotation

  • It is very important to show complete lines of text to improve the reading experience whenever possible.
  • This limits the usefulness of rotation for interaction; the orientation is instead used to adapt to different page widths.
  • The user interface should adapt when a rotation between portrait and landscape mode occurs:
      • Menus should be designed to be suitable for both modes and change their position accordingly.
      • Since pages are wider in landscape mode, the extra width should not be used for additional UI elements (a possible layout rule is sketched after this list).
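
A minimal sketch of such a layout rule, assuming a hypothetical screen layout with a fixed-size menu that moves to the side in landscape mode, while all remaining width goes to the page so complete lines of text stay visible:

```python
def layout_for_orientation(screen_w, screen_h, menu_size=80):
    """Return hypothetical layout rectangles (x, y, w, h) for menu and page.

    In portrait mode the menu sits at the bottom; in landscape mode it moves
    to the left edge. In both cases the remaining width goes entirely to the
    page rather than to additional UI elements.
    """
    landscape = screen_w > screen_h
    if landscape:
        menu = (0, 0, menu_size, screen_h)                       # menu on the left
        page = (menu_size, 0, screen_w - menu_size, screen_h)    # page gets the extra width
    else:
        menu = (0, screen_h - menu_size, screen_w, menu_size)    # menu at the bottom
        page = (0, 0, screen_w, screen_h - menu_size)
    return {"menu": menu, "page": page}

print(layout_for_orientation(768, 1024))   # portrait tablet
print(layout_for_orientation(1024, 768))   # landscape tablet
```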

Shaking

  • Four spatial configurations of shaking can be defined:
      • a) shaking the device on the reference object
      • b) shaking the device together with the reference object
      • c) shaking the reference object (instead of the device)
      • d) shaking the device independently of a reference object
  • However, b) and c) might be unsuited because the reference object may be too heavy to be moved or may even be fixed in place.
  • Furthermore, a) and d) could be hard to distinguish, both for the user and for the system.
  • We therefore propose to concentrate on a), shaking the device on the reference object.
  • Shaking could be used for canceling actions or closing overlay windows, employing the metaphor of getting rid of something unwanted (see figure 1).
  • Alternatively, synchronized shaking could be used to pair two devices. However, we think pairing should be easier/automatic (see also stacking).
  • In other contexts, shaking could be used to arrange elements, e.g., the nodes of a graph.
  • Shaking can easily happen by accident; therefore, whatever function shaking is mapped to should either
      • support some form of undo, or
      • allow the user to easily get back to the previous state (a detection sketch follows this list).
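
A minimal sketch of how shaking could be detected and kept safely undoable, assuming hypothetical accelerometer samples (timestamp plus acceleration magnitude with gravity removed) and a simple threshold-plus-count rule; the thresholds are illustrative, not tuned values:

```python
from collections import deque

class ShakeDetector:
    """Report a shake when enough strong acceleration peaks occur
    within a short sliding time window."""

    def __init__(self, peak_threshold=15.0, min_peaks=4, window_s=1.0):
        self.peak_threshold = peak_threshold    # m/s^2, gravity already removed (assumption)
        self.min_peaks = min_peaks
        self.window_s = window_s
        self.peaks = deque()

    def add_sample(self, t, magnitude):
        if magnitude >= self.peak_threshold:
            self.peaks.append(t)
        # Drop peaks that fell out of the sliding window.
        while self.peaks and t - self.peaks[0] > self.window_s:
            self.peaks.popleft()
        return len(self.peaks) >= self.min_peaks

def on_shake(overlay_stack, undo_stack):
    """Close the topmost overlay, but remember it so the action can be undone,
    since shaking can easily happen by accident."""
    if overlay_stack:
        undo_stack.append(overlay_stack.pop())
```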

Sketch: visualization of reference targets

Stacking

  • We propose to use stacking mainly for pairing two devices to allow synchronization of annotations, bookmarks, etc.
  • To allow easy stacking, the device could have some notches or magnets.
  • We think that interaction should primarily concentrate on the user's own device and that this device should be on top.
  • Filtering needs to be possible (a possible data model is sketched after this list):
      • source: get, put, sync
      • category of items: annotations, highlights, bookmarks
  • Selection could support different techniques:
      • tapping to select individual items
      • lasso selection and two-finger rectangular selection for groups of items
      • selection of everything, everything in view, or everything on a specific page
  • As holding stacked displays and a reference object at the same time can be difficult, we propose some kind of menu that can be operated with the thumb if necessary.
      • We thought about using a radial menu at the thumb's position but prefer a slide-in menu (see figure 2).
  • There should be a clear indication of whether an item belongs to me or to the other device.
      • Primarily, we should show the items on the device they are on.
      • If this is not clear enough to the user, we could tint the items, e.g., blue for my items and red for the other's items.
      • In that case, different colors of items, e.g., differently colored highlights, could be displayed as different shades of the owner's color (see figure 2).
  • Problems we see with stacking are:
      • We question whether a user can hold two connected devices and the reference object at the same time. Most likely, the user will prefer a table as support.
      • Apart from perhaps exploiting the "3D"/parallax effect, we are not sure that stacking is a real advantage compared to "normal" wireless synchronization/sharing.
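
A minimal sketch of the filtering idea for stacked synchronization, with hypothetical item and filter types; here "get" pulls matching items from the other device, "put" pushes them, and "sync" does both:

```python
from dataclasses import dataclass
from enum import Enum

class Category(Enum):
    ANNOTATION = "annotation"
    HIGHLIGHT = "highlight"
    BOOKMARK = "bookmark"

@dataclass(frozen=True)
class Item:
    owner: str          # device the item originally belongs to
    category: Category
    page: int
    content: str

def matching(items, categories, pages=None):
    """Select the items that pass the category (and optional page) filter."""
    return {i for i in items
            if i.category in categories and (pages is None or i.page in pages)}

def sync(mine, theirs, categories, direction="sync", pages=None):
    """Apply a stacking transfer: 'get', 'put', or bidirectional 'sync'."""
    if direction in ("get", "sync"):
        mine |= matching(theirs, categories, pages)
    if direction in ("put", "sync"):
        theirs |= matching(mine, categories, pages)
    return mine, theirs

# Example: pull only the other device's bookmarks onto my device.
mine = {Item("me", Category.HIGHLIGHT, 3, "important passage")}
theirs = {Item("other", Category.BOOKMARK, 7, "chapter start")}
mine, theirs = sync(mine, theirs, {Category.BOOKMARK}, direction="get")
```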

Sketch: visualization of reference targets

Flipping

  • As suggested earlier, we think that flipping might be used as a mode switch between reading/annotation and browsing/looking up words (a detection sketch follows this list).
  • However, we are unsure whether this is helpful:
      • Mode switches can probably be accomplished more easily through touch buttons or a menu.
      • Flipping breaks contact with the reference object, forcing the user to find the original position again afterwards.
      • The mode switch itself may not be necessary (or desirable), depending on individual preferences and workflow.
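
A minimal sketch of how a flip could trigger the mode switch, assuming a hypothetical gravity vector in device coordinates with z pointing out of the screen, so that the sign of its z-component tells whether the screen faces up or down:

```python
READING = "reading/annotation"
BROWSING = "browsing/looking up words"

class FlipModeSwitch:
    """Toggle between two modes whenever the device is turned over."""

    def __init__(self):
        self.mode = READING
        self.face_up = True

    def on_gravity(self, gx, gy, gz):
        """gz > 0 is assumed to mean the screen faces up (device coordinates)."""
        face_up = gz > 0
        if face_up != self.face_up:          # orientation changed: a flip happened
            self.face_up = face_up
            self.mode = BROWSING if self.mode == READING else READING
        return self.mode

switch = FlipModeSwitch()
print(switch.on_gravity(0.0, 0.0, -9.81))   # flipped face down -> browsing mode
print(switch.on_gravity(0.0, 0.0, 9.81))    # flipped back -> reading mode
```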