HoloVision: asynchronous mixed reality groupware to support physical work tasks
University of New Brunswick
Software that supports collaboration (groupware) is becoming ubiquitous in the workplace. The ability to share documents, images, and videos, or to hold face-to-face conversations almost anywhere and at any time, has transformed the workplace and increased productivity. At the same time, mixed reality devices are beginning to gain traction as viable platforms for producing and consuming new forms of information, such as spatial data and in-situ 3D objects. Current groupware systems focus largely on supporting virtual work: information work that manipulates content displayable on a screen. However, many work tasks are physical in nature; they require manipulating, repairing, and assembling physical objects. The ability to share these new forms of information offers new and relatively underexplored opportunities for collaboration around physical work tasks. Drawing inspiration from previous research in the field, as well as from 3D guidance techniques in video games, the mixed reality groupware system “HoloVision” was created to take advantage of this new collaboration medium and to provide a proof of concept for the design of future asynchronous mixed reality groupware systems. In this thesis, I document the background research, design, development, and testing of HoloVision as a novel mixed reality groupware system that combines speech-to-text, gaze, video, and spatial data in a way that allows a user to intuitively author persistent “hypermedia” annotations that asynchronously guide future users in both collocated and remote workspaces.