Sublimate: State-Changing Virtual and Physical Rendering to Augment Interaction with Shape Displays, 2013
My Role: Interaction Concept Discussion, Graphics Implementation. Collaborators: Daniel Leithinger, Sean Follmer, Alex Olwal, Akimitsu Hogge, Hiroshi Ishii
Recent research in 3D user interfaces pushes towards immersive graphics and actuated shape displays. The Sublimate project explores a hybrid of these directions, using sublimation and deposition as metaphors for transitions between physical and virtual states. We discuss how digital models, handles, and controls can be rendered either as virtual 3D graphics or as dynamic physical shapes, and how user interfaces can rapidly and fluidly switch between these representations.
Our vision of Sublimate is a human-computer interface with the ability to computationally control both virtual graphics and physical matter. An object rendered through this system can rapidly change its visual appearance, physical shape, position, and material properties, such as density. While such a system does not currently exist and might be physically impossible to build even in the future, we aim to create interfaces that appear perceptually similar to the user through a mix of actuated shape displays and spatially co-located 3D graphics.
A Sublimate interface renders data as a solid object through a shape display or as spatial 3D graphics through an AR display. We refer to the transition from shape output to 3D graphics as "sublimation," and the transition from 3D graphics to shape output as "deposition." The system can also render partially sublimated objects, which consist of both tangible physical and intangible graphical portions.
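One way to picture these transitions is as driving the two output channels in opposite directions: as the pins retract, the graphics fade in, and vice versa. The sketch below is our own illustration of that idea (the linear cross-fade, the `RenderState` names, and the function signatures are assumptions, not the paper's implementation):

```python
from enum import Enum, auto

class RenderState(Enum):
    PHYSICAL = auto()  # solid shape rendered on the shape display
    VIRTUAL = auto()   # intangible 3D graphics on the AR display only
    PARTIAL = auto()   # partially sublimated: pins plus graphics

def render_targets(height_map, t):
    """Interpolate a sublimation/deposition transition.

    t = 0.0 -> fully deposited (pins fully raised, graphics invisible)
    t = 1.0 -> fully sublimated (pins lowered, graphics fully opaque)
    Returns target pin heights and an opacity for the AR overlay.
    """
    pin_heights = [[h * (1.0 - t) for h in row] for row in height_map]
    graphics_alpha = t
    return pin_heights, graphics_alpha

def state_of(t):
    """Classify a transition parameter into a rendering state."""
    if t <= 0.0:
        return RenderState.PHYSICAL
    if t >= 1.0:
        return RenderState.VIRTUAL
    return RenderState.PARTIAL
```

A partially sublimated object then simply corresponds to an intermediate `t`, where both the pins and the graphics contribute to the percept.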
To explore this space, we developed two systems that integrate actuated shape displays and augmented reality (AR) for co-located physical shapes and 3D graphics. Our spatial optical see-through display provides a single user with head-tracked stereoscopic augmentation, whereas our hand-held devices enable multi-user interaction through video see-through AR. We describe interaction techniques and applications that explore 3D interaction for these new modalities.
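Co-locating the two renderings requires a consistent mapping from the shape display's pin grid to world space, so that both the head-tracked and the hand-held AR views can draw graphics at the physical pin tips. A minimal sketch of such a mapping (the function name, the pin pitch parameter, and the origin convention are illustrative assumptions, not details from the paper):

```python
def pin_world_position(col, row, height, pitch=0.005, origin=(0.0, 0.0, 0.0)):
    """Map a pin's grid index (col, row) and its extension height
    to a world-space point at the pin tip, so AR graphics can be
    drawn spatially co-located with the physical shape output.

    pitch  -- center-to-center pin spacing in meters (assumed value)
    origin -- world-space position of pin (0, 0) at zero height
    """
    x = origin[0] + col * pitch
    y = origin[1] + row * pitch
    z = origin[2] + height
    return (x, y, z)
```

Each AR renderer would then transform these world-space points through its own tracked camera pose, which is what allows a single physical shape to be augmented consistently for the head-tracked user and for multiple hand-held viewers at once.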