
WILD: Wall-sized Interaction with Large Datasets

Emmanuel Pietriga & Michel Beaudouin-Lafon


WILD is an experimental high-resolution interactive platform for conducting research on collaborative human-computer interaction and the visualization of large datasets. Our primary target users are scientists from other disciplines, including astrophysicists, biologists, and chemists, as well as computer scientists, who use the platform to visualize, explore, and analyze their data.

Installed in February 2009, the platform was officially opened on June 19, 2009 (see WILD Inauguration).

For more information, see the INRIA Web site and the Project page. The WILD room is now part of the Digiscope project.


Interaction and Collaboration

WILD focuses on interaction, providing users with a 3D real-time motion capture system, multi-touch tabletop displays, and other devices. Unlike other wall-sized displays, WILD lets users interact and collaborate directly, both to visualize and to manipulate heterogeneous datasets.

Multi-scale Interaction lets users navigate through large and complex datasets by visualizing them at different scales. The high-resolution wall affords multi-scale interaction through simple locomotion: approaching the wall reveals details. Motion tracking will enable us to design new visualization and navigation techniques that use full-body motion to control scale.
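One way to picture scale control through locomotion is a mapping from the user's tracked distance to the wall onto a zoom factor. The sketch below is purely illustrative: the function name, distance range, and zoom range are assumptions, not part of the actual WILD software.

```python
def scale_for_distance(distance_m: float,
                       min_d: float = 0.5,
                       max_d: float = 4.0) -> float:
    """Map a tracked head-to-wall distance (metres) to a zoom factor in [1, 8].

    Standing far from the wall gives an overview (1x); walking up to it
    progressively reveals detail (up to 8x). Ranges are illustrative.
    """
    # Clamp the tracked distance to the usable range in front of the wall.
    d = max(min_d, min(max_d, distance_m))
    # Linear interpolation: 1x at max_d, 8x at min_d.
    t = (max_d - d) / (max_d - min_d)
    return 1.0 + t * 7.0
```

A real system would likely smooth the tracked position over time and use a non-linear mapping, but the principle is the same: physical position drives the level of detail.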

Multi-surface Interaction manages data displayed on multiple surfaces such as the wall, tabletop display, and mobile devices (PDAs, iPod Touch, mobile phones, etc.). A key issue is to provide efficient techniques to help users transfer information seamlessly from one surface to another. The motion tracking system will offer a unique opportunity to investigate new multi-surface interaction techniques.
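Seamless transfer between surfaces can be sketched as a "pick-and-drop"-style broker: an object picked on one surface travels with the user (identified by the motion tracker) and reappears where they drop it. All names here are hypothetical; the actual WILD middleware is not shown.

```python
class TransferBroker:
    """Minimal sketch of per-user transfer between display surfaces."""

    def __init__(self) -> None:
        # Maps a user id to (source surface, held object).
        self._held = {}

    def pick(self, user: str, surface: str, obj: object) -> None:
        """The user picks an object on a surface; it now travels with them."""
        self._held[user] = (surface, obj)

    def drop(self, user: str, surface: str) -> object:
        """The user drops onto a (possibly different) surface; returns the object."""
        _source, obj = self._held.pop(user)
        return obj
```

For example, a dataset picked on the tabletop can be dropped onto the wall: `broker.pick("alice", "tabletop", data)` followed by `broker.drop("alice", "wall")`.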

Multi-user Interaction supports users collaborating to achieve a task, users interacting simultaneously on the same dataset, and the exchange of data among users. WILD will focus on collaborative interactions involving multiple display and input surfaces. Typical situations include two users working on the same dataset, one sitting at the table with a global view of the wall display, the other standing closer to the wall, getting detailed information about a region of the screen.

Participatory Design

Our research method is based on involving end users, such as astrophysicists and biologists, throughout the design process. Together, we will design the collaborative interaction and visualization techniques that will support their activities:

  • we will analyze their needs and create early prototypes;
  • we will observe their use of the prototypes and collect their ideas for improvement;
  • we will conduct controlled experiments and longitudinal studies;
  • we will refine the prototypes.

In the end, we will have designed and validated techniques that better suit the needs of scientists in various disciplines based on real usage scenarios.

Technical Details

  • Wall-sized display:
    • a wall 5.5 m wide × 1.8 m high,
    • displaying 20480 × 6400 = 131 million pixels,
    • using 32 display screens (30" each) laid out in an 8 × 4 matrix,
    • driven by a cluster of 18 computers: Apple Mac Pro, 2 × 3.2 GHz quad-core, 10 GB RAM (16 GB for the two front-ends), 2 TB disk, 2 × NVIDIA 8800 GT per machine,
    • communicating through a dedicated 10 Gb/s network.
  • 3D motion capture system with 10 cameras tracking the position of users and objects in real time throughout the room with submillimeter accuracy.
  • Tabletop display with multi-touch capabilities.
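The figures above can be cross-checked with a little arithmetic: an 8 × 4 matrix of 2560 × 1600 screens (the native resolution of a 30" display) yields the stated 20480 × 6400 pixel canvas. The split of the cluster into 16 render nodes driving two screens each is an inference from the parenthetical about the two front-ends, not an explicit statement.

```python
# Arithmetic check of the wall's stated resolution.
COLS, ROWS = 8, 4            # screen matrix layout
SCREEN_W, SCREEN_H = 2560, 1600  # native resolution of one 30" screen

wall_w = COLS * SCREEN_W     # 20480 pixels wide
wall_h = ROWS * SCREEN_H     # 6400 pixels high
total_pixels = wall_w * wall_h  # 131,072,000, i.e. ~131 million

# Assuming 16 of the 18 machines render (two are front-ends),
# each render node drives two screens.
screens = COLS * ROWS                           # 32
screens_per_render_node = screens // (18 - 2)   # 2
```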

Picture Gallery

Photos WILD-HD-1.jpg to WILD-HD-8.jpg: © CNRS Photothèque / Cyril FRESILLON / ESA / Planck HFI & LFI Consortia

Partners

  • in|situ|;
  • Aviz (Visual Analytics), INRIA;
  • AMI (Architectures and Models for Interaction), LIMSI (UPR CNRS associated with Université Paris-Sud).

Collaborating laboratories:

Institut d'Astrophysique Spatiale (IAS), Institut de Biochimie et Biophysique Moléculaire et Cellulaire (IBBMC), Institut de Chimie Moléculaire et des Matériaux d'Orsay (ICMMO), Institut de Génétique et Microbiologie (IGM), Laboratoire de l'Accélérateur Linéaire (LAL), Laboratoire d'Informatique pour la Mécanique et les Sciences de l'Ingénieur (LIMSI), Laboratoire de Neuroimagerie Assistée par Ordinateur (LNAO), Laboratoire de Mathématiques Appliquées aux Systèmes (MAS).

Funding

Région Ile-de-France, Digiteo, CNRS, INRIA, INRIA-MSR, Université Paris-Sud, ANR


Project proposal (in French):