in|situ| Lab

Adaptive collaboration tools for 2D and 3D interaction in large visualization systems

Master-level internship at in|situ|

Advisors: Cédric Fleury


The goal of this internship is to enable remote users located in front of 2D wall-sized displays or in 3D immersive virtual reality systems to collaborate even if they do not have the same interaction capabilities. This internship will take place in the context of the DIGISCOPE project. DIGISCOPE is an “Equipment of Excellence” funded by the French national research agency (ANR). Its goal is to create a high-performance visualization infrastructure for collaborative interaction with extremely large datasets and computations. This infrastructure consists of 10 interconnected interactive spaces, including large virtual reality systems, 3D display devices, wall-sized displays and interactive surfaces.


More and more large data sets (scientific data, CAD models, etc.) have to be visualized by remote experts (see Figure 1). These data sets are complex to visualize and require special visualization systems such as high-resolution wall-sized displays or 3D immersive virtual reality systems. Remote collaboration enables experts to combine their expertise in data analysis with the different features of their visualization systems. For example, some experts can have a 3D view of the data in a CAVE, while others have a wider view of the data, or a view of the data at a different point in time, on a wall-sized display.

Even if the remote experts do not use the same visualization system and, consequently, do not have the same interaction capabilities (2D, 3D, tactile, etc.), they still need to be able to interact together and to understand what the others are doing [3]. For example, users in front of a wall-sized display can interact using 2D devices, while users in an immersive virtual reality system use 3D interaction techniques. In this particular case, users have to be able to interact together on shared data with both 2D and 3D techniques at the same time. Collaborative interaction involves direct manipulation of shared content by several users [1], but also communication among users: showing something to the others [2], annotating the data, sharing viewpoints on the data, etc. The main challenge of this internship is to enable all the users to interact together by leveraging the particular interaction capabilities of each physical system. Designing for such asymmetric capabilities will lead to new collaboration strategies for large interactive spaces.
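To make the 2D/3D asymmetry concrete, here is a minimal sketch, in the spirit of the asymmetric 2D pointer/3D ray of [2], of how a 2D cursor on a wall-sized display could be turned into a ray in the shared 3D scene, so that immersive users can see what a wall user is pointing at. The function name, the scene setup and the assumption that the user's head position is known (tracked or estimated) are illustrative, not taken from the actual systems.

```python
# Sketch: map a 2D cursor on a wall-sized display to a world-space 3D ray,
# so CAVE users can see what a wall user points at. All names are illustrative.
import numpy as np

def cursor_to_ray(cursor_px, screen_res, wall_origin, wall_right, wall_up, eye):
    """Map a pixel position on the wall to a ray cast from the user's eye.

    cursor_px   : (x, y) pixel coordinates of the 2D cursor
    screen_res  : (width, height) of the wall display in pixels
    wall_origin : world position of the wall's bottom-left corner
    wall_right  : world-space vector spanning the wall's width
    wall_up     : world-space vector spanning the wall's height
    eye         : tracked (or assumed) position of the user's head
    """
    u = cursor_px[0] / screen_res[0]          # normalized horizontal position
    v = cursor_px[1] / screen_res[1]          # normalized vertical position
    point_on_wall = (np.asarray(wall_origin)
                     + u * np.asarray(wall_right)
                     + v * np.asarray(wall_up))
    direction = point_on_wall - np.asarray(eye)
    direction = direction / np.linalg.norm(direction)  # unit-length direction
    return np.asarray(eye, dtype=float), direction

# Example: a 4 m x 2 m wall, cursor at the exact center, user 3 m in front.
origin, direction = cursor_to_ray(
    cursor_px=(960, 540), screen_res=(1920, 1080),
    wall_origin=(-2.0, 0.0, 0.0), wall_right=(4.0, 0.0, 0.0),
    wall_up=(0.0, 2.0, 0.0), eye=(0.0, 1.0, 3.0),
)
# The resulting ray points straight into the scene, perpendicular to the wall.
```

The same ray can then be rendered in the immersive system, giving 3D users awareness of the 2D pointing gesture without requiring the wall user to own any 3D device.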

The goal of this internship is to propose collaborative interaction tools which can easily be adapted to different visualization systems. As a first step, we would like to design 2D and 3D interaction tools for direct manipulation between the WILD wall-sized display, the WILDER tactile wall-sized display and a 3D immersive virtual reality system. The co-manipulation technique proposed in [1] could be adapted to wall-sized displays such as WILD and WILDER (using an infrared tracking system or the touch capability). To go further, we would like to find a more generic way to adapt collaborative tools to different kinds of visualization systems, aiming for a tool implementation that fits multiple visualization systems and application contexts. One idea is to describe each tool by a set of manipulation rules and to propose a software architecture which matches the available interaction devices with these manipulation rules.
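The manipulation-rules idea above could be sketched as follows: each collaborative tool declares the abstract manipulations it needs, and each platform matches those rules against the capabilities of its local devices. All class names, capability labels and devices here are hypothetical, chosen only to illustrate the matching architecture.

```python
# Sketch of the "manipulation rules" architecture: a tool is described by
# abstract manipulation rules; each platform binds them to its own devices.
# Class names, capability labels and devices are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Device:
    name: str
    capabilities: set      # e.g. {"2d-pointing"} or {"3d-ray", "3d-tracking"}

@dataclass
class ManipulationRule:
    action: str            # abstract manipulation a tool needs
    accepted: set          # device capabilities that can fulfil it

@dataclass
class Platform:
    name: str
    devices: list = field(default_factory=list)

    def bind(self, rules):
        """Match each manipulation rule to the first capable local device."""
        bindings = {}
        for rule in rules:
            for dev in self.devices:
                if rule.accepted & dev.capabilities:
                    bindings[rule.action] = dev.name
                    break
        return bindings

# One shared tool description, two very different visualization systems.
comanipulation = [
    ManipulationRule("select", {"2d-pointing", "3d-ray"}),
    ManipulationRule("move",   {"touch-drag", "3d-tracking"}),
]
wall = Platform("wall display",
                [Device("touch surface", {"2d-pointing", "touch-drag"})])
cave = Platform("CAVE",
                [Device("tracked wand", {"3d-ray", "3d-tracking"})])
```

With this split, the tool description is written once, while each visualization system resolves it differently: the wall binds both actions to its touch surface, the CAVE binds them to its tracked wand.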

The internship involves:

  • exploring state-of-the-art techniques for collaborative interactions between remote users both in 2D systems and in 3D immersive environments,
  • exploring the concept of instrumental interaction,
  • adapting some collaborative 3D interaction tools to wall-sized displays,
  • running some experiments to evaluate the new versions of these tools,
  • finding a generic solution to adapt collaborative interaction tools to visualization systems,
  • implementing a software architecture which can be used to adapt collaborative interaction tools to different kinds of visualization systems.

Required skills

  • Basic background in Human-Computer Interaction.
  • Java and C/C++ programming.
  • Experience with 3D graphics APIs is a plus.
  • Experience with user tracking devices is a plus.


The internship will take place in the ExSitu research team, which is a joint project between Inria (Saclay-Ile de France) and the LRI lab at Université Paris-Sud and CNRS.


[1] C. Fleury, T. Duval, V. Gouranton, A. Steed. Evaluation of Remote Collaborative Manipulation for Scientific Data Analysis, in Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST), December 2012.

[2] T. Duval, C. Fleury. An asymmetric 2D Pointer/3D Ray for 3D Interaction within Collaborative Virtual Environments, in Proceedings of the ACM International Conference on 3D Web Technology (Web3D), June 2009.

[3] T. Duval, T.T.H. Nguyen, C. Fleury, A. Chauffaut, G. Dumont, V. Gouranton. Improving Awareness for 3D Virtual Collaboration by Embedding the Features of Users' Physical Environments and by Augmenting Interaction Tools with Cognitive Feedback Cues, in Journal on Multimodal User Interfaces, Springer, Vol. 8, Issue 2, pp 187-197, June 2014.