Ph.D. of

Group : Human-Centered Computing

Supporting Collaborative Practices Across Wall-Sized Displays with Video-Mediated Communication

Starts on 01/09/2014
Advisor : BEAUDOUIN-LAFON, Michel

Funding:
Affiliation: Université Paris-Sud
Laboratory: LRI - HCC

Defended on 12/12/2017. Committee:
Thesis advisor:
- M. Michel Beaudouin-Lafon, Professor at Université Paris-Sud

Thesis co-advisor:
- M. Cédric Fleury, Maître de Conférences at Université Paris-Sud

Reviewers:
- M. Carman Neustaedter, Associate Professor at Simon Fraser University
- Mme Myriam Lewkowicz, Professor at Université de Technologie de Troyes

Examiners:
- M. Jean-Claude Martin, Professor at Université Paris-Sud
- M. Albrecht Schmidt, Professor at LMU Munich


Abstract:
Collaboration can take many forms: we might sit side by side to work on an artifact, stand around a table to manipulate shared objects, or stand in front of a large display to visualize big datasets. Technology has long supported these practices through a variety of devices: desktop computers let two people work side by side on digital objects, tabletops let groups gather around shared data, and wall-sized displays support visualizing and manipulating large digital datasets. The traits of each device shape how people use it in co-located collaboration. But when collaborators are remote, to what extent does technology support these activities?
In this dissertation, I argue that the success of a telecommunications system does not depend on its capacity to imitate co-located conditions, but on its ability to support the collaborative practices that emerge from the specific characteristics of the technology. I explore this question using wall-sized displays as a collaborative technology. Wall-sized displays are large and can present massive datasets at high resolution, two traits that shape the behaviors that take place in these spaces. Notably, people physically navigate data, moving close to and far away from the display instead of zooming in and out, and walking towards objects instead of panning. These opportunities shape how people collaborate: they can simultaneously and independently navigate data, pointing out distant objects to each other through gestures, as well as perform tightly-coupled work, physically navigating data together while talking about it.
To explore technological support for remote collaboration across wall-sized displays, I built CamRay, a telecommunication tool for studying collaboration across large interactive spaces. CamRay uses an array of cameras to capture users’ faces as they physically navigate data on a wall-sized display, and presents this video on a remote display on top of existing content. This tool is designed to explore collaboration needs across wall-sized displays and how to support them. I use CamRay to observe how pairs perform two different collaborative tasks at a distance. Based on these observations, I propose two ways of displaying video: Follow-Local and Follow-Remote. In Follow-Local, the video feed of the remote collaborator follows the local user, and in Follow-Remote it follows the remote user. I perform two experiments to evaluate how these video behaviors support different aspects of collaboration. I find that Follow-Remote preserves the spatial relations between the remote speaker and the content, supporting pointing gestures, while Follow-Local enables virtual face-to-face conversations, supporting representational gestures. Finally, I summarize these findings to inform the design of future systems for remote collaboration across wall-sized displays, and reflect on the considerations that designers of telecommunication systems should take into account when adding support for remote communication across collaborative technologies.
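The two video behaviors described above reduce to a simple placement rule for the remote collaborator's video feed. The sketch below is illustrative only: the function name, the one-dimensional coordinate model, and the behavior labels are assumptions for exposition, not CamRay's actual implementation.

```python
def video_position(behavior, local_x, remote_x):
    """Return the horizontal wall position at which to display the
    remote collaborator's video feed (positions in meters along the
    wall; a hypothetical model, not CamRay's real coordinate system).

    behavior -- "follow-local" or "follow-remote"
    local_x  -- tracked position of the local user
    remote_x -- position of the remote user, mirrored on the local wall
    """
    if behavior == "follow-local":
        # The video stays in front of the local viewer, enabling
        # virtual face-to-face conversation wherever they walk.
        return local_x
    if behavior == "follow-remote":
        # The video stays where the remote speaker stands relative to
        # the shared content, preserving spatial relations and
        # supporting pointing gestures at on-wall objects.
        return remote_x
    raise ValueError(f"unknown behavior: {behavior}")
```

Under this toy model, `video_position("follow-local", 1.5, 3.0)` places the feed at the local user's position (1.5), while `"follow-remote"` places it at the mirrored remote position (3.0).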

Ph.D. dissertations & Faculty habilitations


The Center for Data Science of the Université Paris-Saclay deployed a platform compatible with Linked Data in 2016. Because researchers face many difficulties in using these technologies, an approach and then a platform, which we call LinkedWiki, were designed and tested on the university's cloud (IaaS) to enable the creation of modular virtual research environments (VREs) compatible with Linked Data. We are thus able to offer researchers a means to discover, produce, and reuse the research data available within the Linked Open Data (LOD), i.e., the global information system emerging at the scale of the Internet. This experience demonstrated that the operational use of Linked Data within a university is perfectly possible with this approach.

However, some problems persist, such as (i) compliance with protocols and (ii) the lack of suitable tools for querying the Linked Open Data with SPARQL. We propose solutions to both of these problems. To verify compliance with the SPARQL protocol within a university's Linked Data, we created the SPARQL Score indicator, which evaluates the compliance of SPARQL services before their deployment in a university's information system. In addition, to help researchers query the LOD, we implemented SPARQLets-Finder, a demonstrator showing that it is possible to facilitate the design of SPARQL queries using autocompletion tools, without prior knowledge of the RDF schemas within the LOD.
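The core idea behind schema-free autocompletion can be sketched in a few lines: suggest SPARQL terms by matching a typed prefix against a harvested vocabulary rather than against the RDF schemas themselves. This is a minimal illustration under assumed names; the function and the example vocabulary are hypothetical and do not reproduce SPARQLets-Finder's actual algorithm.

```python
def autocomplete(prefix, candidates):
    """Return the candidate SPARQL terms that match a typed prefix,
    sorted for stable display in a query editor's suggestion list."""
    return sorted(term for term in candidates if term.startswith(prefix))

# Hypothetical vocabulary, e.g. harvested from previously seen
# queries rather than from the RDF schemas of the endpoints.
terms = ["foaf:name", "foaf:knows", "dcterms:title", "dcterms:creator"]
```

For instance, typing `dc` in the editor would surface `dcterms:creator` and `dcterms:title` as completions, letting a researcher build a query without knowing the underlying schema in advance.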