WILD is an experimental high-resolution, interactive platform for conducting research on collaborative human-computer interaction and the visualization of large datasets. The platform will be accessible to scientists from other disciplines, including astrophysicists, biologists, chemists and computer scientists, so that they can visualize, explore and analyze their data.
Interaction and Collaboration
WILD focuses on interaction, providing users with a 3D real-time motion capture system, multi-touch tabletop displays and other devices. Unlike other wall-sized displays, WILD will let users interact and collaborate directly, both to visualize and to manipulate heterogeneous datasets.
Multi-scale Interaction lets users navigate through large and complex datasets by visualizing them at different scales. The high-resolution wall affords multi-scale interaction through simple locomotion: approaching the wall reveals details. Motion tracking will enable us to design new visualization and navigation techniques that use full-body motion to control scale.
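As a minimal sketch of the idea that physical distance can control scale, the function below maps a user's tracked distance from the wall to a zoom factor. All names and parameter values (`d_near`, `d_far`, the scale range) are illustrative assumptions, not part of the WILD design.

```python
def scale_for_distance(distance_m, d_near=0.5, d_far=4.0,
                       s_detail=4.0, s_overview=1.0):
    """Map a user's tracked distance from the wall (in meters) to a zoom scale.

    Closer than d_near -> full detail; farther than d_far -> overview.
    Linear interpolation in between. All parameters are illustrative.
    """
    # Clamp the distance to the working range of the technique.
    d = max(d_near, min(d_far, distance_m))
    # Interpolation factor: 0 at the wall, 1 at the back of the room.
    t = (d - d_near) / (d_far - d_near)
    return s_detail + t * (s_overview - s_detail)
```

In a real system the distance would come from the motion capture stream, and the mapping could be nonlinear or hysteretic to avoid jitter when the user stands still.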
Multi-surface Interaction manages data displayed on multiple surfaces such as the wall, tabletop display, and mobile devices (PDAs, iPod Touch, mobile phones, etc.). A key issue is to provide efficient techniques to help users transfer information seamlessly from one surface to another. The motion tracking system will offer a unique opportunity to investigate new multi-surface interaction techniques.
Multi-user Interaction supports users collaborating to achieve a task, interacting simultaneously with the same dataset, and exchanging data among themselves. WILD will focus on collaborative interactions involving multiple display and input surfaces. A typical situation involves two users working on the same dataset: one sitting at the table with a global view of the wall display, the other standing closer to the wall, getting detailed information about a region of the screen.
Participatory Design
Our research method is based on involving end users, such as astrophysicists and biologists, throughout the design process. Together, we will design the collaborative interaction and visualization techniques that support their activities: we will analyze their needs and create early prototypes; observe their use of the prototypes and collect their ideas for improvement; conduct controlled experiments and longitudinal studies; and refine the prototypes. In the end, we will have designed and validated techniques, grounded in real usage scenarios, that better suit the needs of scientists in various disciplines.
Hardware platform
WILD is composed of the following hardware components:
- Wall-sized display: a 5.5 m wide x 1.8 m high wall (18' x 6'), displaying 20480 x 6400 = 131 million pixels, using 32 display screens (30" each) laid out in an 8x4 matrix. Each screen can be adjusted individually, and the wall is made of 4 independent structures supporting 8 screens each.
- 18-computer cluster: 16 computers drive the wall and two front-ends are used to control them. Each machine is an Apple MacPro with 2 x 3.2 GHz quad-core processors, 10 GB RAM (16 GB for the two front-ends), a 2 TB disk and 2 x NVidia 8800GT graphics cards; the machines communicate through a dedicated 10 Gb/s network.
- Real-time motion capture: a VICON 3D motion capture system with 10 cameras tracks the position of passive markers in real time throughout the room with submillimeter accuracy. The markers can be used to track people as well as objects, supporting whole-body interaction and mid-air interaction with arbitrary devices.
- Interactive table: an IntuiLab multi-touch table with high-definition under-the-table projection, FTIR (frustrated total internal reflection) input tracking and an RFID reader provides a large interactive surface to complement the wall.
- Interaction devices: a collection of iPod Touch, iPad, gyroscopic mice, 3D mice and other custom mobile devices allows us to prototype and test various interaction techniques.
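To make the wall's tiled geometry concrete, the sketch below maps a global pixel coordinate to the screen it falls on, using only the figures stated above (an 8x4 grid of 2560 x 1600 panels, i.e. 20480 x 6400 pixels). The helper function and its name are illustrative assumptions, not part of the WILD software.

```python
# Tile geometry from the specs above: 8 x 4 grid of 2560 x 1600 screens,
# for 20480 x 6400 = 131 million pixels in total.
TILE_W, TILE_H = 2560, 1600
COLS, ROWS = 8, 4

def locate_pixel(x, y):
    """Map a global wall coordinate to (column, row, local_x, local_y).

    Illustrative helper: since 16 machines drive 32 screens, each render
    node handles two panels, and a mapping like this is the first step
    in routing drawing to the right machine.
    """
    if not (0 <= x < COLS * TILE_W and 0 <= y < ROWS * TILE_H):
        raise ValueError("pixel outside the 20480 x 6400 wall")
    col, local_x = divmod(x, TILE_W)
    row, local_y = divmod(y, TILE_H)
    return col, row, local_x, local_y
```

For example, the bottom-right pixel (20479, 6399) lands on the last screen of the grid at local coordinate (2559, 1599).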
Partners
The three scientific partners of the project are:
- The In Situ group joint between LRI/Université Paris-Sud and INRIA Saclay - Ile-de-France;
- The AVIZ group from INRIA Saclay - Ile-de-France;
- The AMI group from LIMSI-CNRS.
Eight laboratories from the Plateau de Saclay campus are associated with the project:
- IAS (astrophysics) - Institut d'Astrophysique Spatiale
- IBBMC (biochemistry) - Institut de Biochimie et Biophysique Moléculaire et Cellulaire
- ICMMO (chemistry) - Institut de Chimie Moléculaire et des Matériaux d'Orsay
- IGM (biology) - Institut de Génétique et de Microbiologie
- LAL (particle physics) - Laboratoire de l'Accélérateur Linéaire
- LIMSI (mechanical engineering) - Laboratoire d'Informatique et de Mécanique pour les Sciences de l'ingénieur
- LNAO (neuroscience) - Laboratoire de Neuroimagerie Assistée par Ordinateur
- MAS (applied mathematics) - Laboratoire de Mathématiques Appliquées aux Systèmes
Funding
Funding for the project comes mainly from Région Ile-de-France and Digiteo, the research park in information science and technology.
Additional funding comes from the National Research Agency (ANR), CNRS, INRIA, Université Paris-Sud, and the INRIA-Microsoft Research joint laboratory.