An overview of the DualStream system. (A & B) An illustration of how the front and rear cameras of a mobile device can be used to share information about self and surroundings with remote collaborators. The captured information is used to create 3D Holograms, Spatial Video Feeds, and Environment Snapshots. (C) DualStream being used to share information about a car's engine, while viewing a remote expert as if they were in the same location. (D) The expert viewing the shared 3D hologram of the car, and a 2D snapshot of the surrounding remote environment.


In-person human interaction relies on our spatial perception of each other and our surroundings. Current remote communication tools partially address each of these aspects. Video calls convey real user representations but without spatial interactions. Augmented and Virtual Reality (AR/VR) experiences are immersive and spatial but often use virtual environments and characters instead of real-life representations. Bridging these gaps, we introduce DualStream, a system for synchronous mobile AR remote communication that captures, streams, and displays spatial representations of users and their surroundings. DualStream supports transitions between user and environment representations with different levels of visuospatial fidelity, as well as the creation of persistent shared spaces using environment snapshots. We demonstrate how DualStream can enable spatial communication in real-world contexts, and support the creation of blended spaces for collaboration. A formative evaluation of DualStream revealed that users valued the ability to interact spatially and move between representations, and could see DualStream fitting into their own remote communication practices in the near future. Drawing from these findings, we discuss new opportunities for designing more widely accessible spatial communication tools, centered around the mobile phone.

Preprint. To appear in the 2023 IEEE International Symposium on Mixed and Augmented Reality (ISMAR).

Overview of the project

DualStream was developed as part of the Shared Reality Project, a collaboration between ATLAS and Ericsson Research. An overview video of the project can be viewed below:

DualStream Video Figure