Project: BioTint'IT, Physical, Modular and Articulated Interface for Interactive Molecular Manipulation
Bastien Vincke, Mohamed Anis Ghaoui, Nicolas Férey, Xavier Martinez. Physical, Modular and Articulated Interface for Interactive Molecular Manipulation. Sensors 2020, 20(18), 5415. DOI 10.3390/s20185415 - HAL hal-03030681
- PDF version of the article (Sensors)
- A poster presented with a demo at the national conference Interaction Humain-Machine in 2023 (HAL hal-03835651) and at the national conference of the Groupe Graphisme Modélisation Moléculaire in 2021 (HAL hal-04582042)
- A video presenting our molecular tangible interface and its virtual twin
Summary
This portfolio explains and illustrates the design of a modular, wireless and articulated tangible molecular interface based on an Internet of Things approach, which allows a user to manipulate a molecular virtual twin through its physical molecular model.
A user building and manipulating our tangible molecular interface and its synchronized virtual twin
Contributions
Like molecular LEGO, the device designed in the context of the BioTint'IT project allows a user to build an articulated and modular physical molecular model. This model embeds sensors that detect, at interactive rates, connections between components and the angles between connected molecular components during manipulation, using a wireless approach based on Internet of Things technology. The user can thus build, manipulate and steer a virtual twin of their custom physical model. This work was made possible by funding from three Farman projects for Master 2 internships. Beyond the tangible molecular interface itself, this work enables researchers to steer a molecular simulation in progress with the interface, and can support practical classroom work in molecular science.
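The article does not include firmware listings, but the data flow can be sketched. Below is a minimal Python sketch, assuming a hypothetical MQTT broker, topic layout and message schema (the paper does not specify its wireless protocol), of how one physical component could report its connection state and joint angle so that the virtual twin can mirror the manipulation at interactive rates.

```python
# Minimal sketch (hypothetical): one physical component publishing its
# connection state and joint angle over MQTT so a virtual twin can follow.
# Broker address, topic names and message schema are assumptions, not
# the protocol used in the paper.
import json
import time

import paho.mqtt.client as mqtt

BROKER = "192.168.1.10"      # hypothetical broker address on the local network
COMPONENT_ID = "residue-07"  # hypothetical ID for this molecular component

# paho-mqtt 1.x constructor; 2.x additionally takes a CallbackAPIVersion.
client = mqtt.Client()
client.connect(BROKER)
client.loop_start()

def read_joint_angle():
    """Stub for this component's angle sensor (degrees); real firmware
    would read a rotary encoder or potentiometer here."""
    return 42.0

def read_connected_neighbor():
    """Stub for connection detection; real firmware would probe the
    physical connector to identify the neighboring component."""
    return "residue-08"

while True:
    state = {
        "id": COMPONENT_ID,
        "neighbor": read_connected_neighbor(),
        "angle_deg": read_joint_angle(),
        "t": time.time(),
    }
    # The virtual twin would subscribe to biotint/components/# and update
    # the corresponding joint of the 3D model on each message.
    client.publish(f"biotint/components/{COMPONENT_ID}", json.dumps(state))
    time.sleep(0.05)  # ~20 Hz is enough for interactive-time updates
```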
Perspective
Our short-term objective is to display augmented reality visual feedback directly on the tangible interface during manipulation. This requires addressing challenging tracking problems in order to obtain a robust and precise position and orientation of the deformable physical model, which is partly occluded by the user's hands during manipulation.
Impact
The results of this project led to national funding of the interdisciplinary PIRATE project by the Agence Nationale de la Recherche in 2021, targeting rational drug design applications and supporting learning in chemistry classrooms.
Article: PerfectFit: Custom-Fit Garment Design in Augmented Reality
Akihiro Kiuchi, Anran Qi, Eve Mingxiao Li, Dávid Maruscsák, Christian Sandor, and Takeo Igarashi. PerfectFit: Custom-Fit Garment Design in Augmented Reality. ACM SIGGRAPH Asia XR 2023. Best Demo Award, Audience Vote. DOI 10.1145/3610549.3614592 - HAL hal-04358171
Two examples of custom-fit garment design using PerfectFit: the designer views the virtual garment on the client in different poses and from various viewpoints (a, d) to identify fitting issues, adjusts the garment size with a slider (b), extends the front panel for full coverage of the belly (c), and extends the sleeve to cover the wrist (e).
System overview. The designer extends the hem of the blue virtual dress for better coverage of the client's body. The extended part is highlighted in dark blue with yellow stripes (illustration only).
Summary
Our prototype addresses two users: the customer and the designer/tailor. The customer sees a virtual piece of clothing superimposed onto their body in an AR mirror, which consists of a large screen and an Azure Kinect depth sensor. The depth sensor provides 3D data of the user's body in real time, so the system knows the shape of the body and can render the virtual garment on top of the user with realistic interaction between body and garment. Customers see themselves on the large screen wearing the virtual garment in real time, and the garment follows their movements. The other user, the tailor, wears a Head-Mounted Display (HMD) through which they see the virtual garment superimposed directly onto the customer's body. The tailor can edit the virtual garment, for example making the sleeves shorter or longer to fit the length of the customer's arms.
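The demo paper does not publish this pipeline's code, but its per-frame structure can be sketched. The following minimal Python sketch assumes the pyk4a bindings for the Azure Kinect; fit_body_mesh, step_cloth_sim and render_overlay are hypothetical placeholders for the body reconstruction, cloth simulation and compositing stages, which are not detailed in the paper.

```python
# Minimal sketch of the AR mirror's per-frame loop, under the assumptions
# stated above. pyk4a is a real Python binding for the Azure Kinect; the
# three helper functions are hypothetical stand-ins.
from pyk4a import PyK4A

def fit_body_mesh(depth):
    """Hypothetical: reconstruct the user's body surface from the depth map."""
    ...

def step_cloth_sim(garment, body_mesh, dt):
    """Hypothetical: advance the cloth simulation, colliding with the body."""
    ...

def render_overlay(color, garment):
    """Hypothetical: composite the simulated garment over the camera image
    and show the result on the AR mirror's screen."""
    ...

k4a = PyK4A()
k4a.start()
garment = ...  # garment state, loaded and initialized elsewhere

while True:
    capture = k4a.get_capture()            # synchronized depth + color frame
    body_mesh = fit_body_mesh(capture.depth)
    step_cloth_sim(garment, body_mesh, dt=1 / 30)
    render_overlay(capture.color, garment)
```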
Designing this system involved many technical and design challenges. One particularly challenging task was displaying real-time interactive cloth simulation on a high-resolution device. We used a Canon MREAL X1 HMD: while offering 4K resolution (2K per eye), the hardware is light and comfortable to wear. We believe the widespread adoption of HMDs in real-world scenarios depends on their comfort and ergonomic design.
Additionally, the design of 3D interactions with cloth simulation is not a widely explored area, so we experimented with multiple interaction designs: 1) editing with the hands, and 2) editing with a piece of hardware.
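To illustrate what such an edit can amount to geometrically, the sketch below extends a garment panel by offsetting its boundary (hem) vertices. This is purely an assumption about the underlying geometry processing; PerfectFit's actual editing code is not published.

```python
# Hypothetical sketch of a length edit: extend a garment panel by moving
# its boundary (hem) vertices along a per-vertex direction. This is an
# assumption, not PerfectFit's published implementation.
import numpy as np

def extend_hem(vertices: np.ndarray,
               hem_ids: np.ndarray,
               directions: np.ndarray,
               amount: float) -> np.ndarray:
    """Move the hem vertices of a panel outward by `amount` (meters).

    vertices:   (N, 3) panel vertex positions
    hem_ids:    (K,)   indices of the boundary vertices to move
    directions: (K, 3) unit vectors pointing away from the panel
    """
    out = vertices.copy()
    out[hem_ids] += amount * directions
    return out

# Example: extend a sleeve by 3 cm, e.g. driven by the slider of Figure (b).
verts = np.zeros((100, 3))                             # placeholder panel
hem = np.arange(90, 100)                               # last row = hem
dirs = np.tile(np.array([0.0, -1.0, 0.0]), (10, 1))    # hem points downward
verts = extend_hem(verts, hem, dirs, amount=0.03)
```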
Impact
In December 2023, we presented our prototype at the SIGGRAPH Asia Conference and Exhibition on Computer Graphics and Interactive Techniques in Sydney. We demonstrated our work for three consecutive days to industry professionals and emerging talents, gathering verbal feedback from users and observing how people interacted with our system. Our demonstration won the Best Demo Award (Audience Vote), confirming that the system was well received and underlining the timeliness of the project.
Application: Magic Bubbles, XR app for reassurance and social interaction of autistic children
V. Bauer, T. Bouchara, O. Duris, C. Labossière, M.-N. Clément, P. Bourdot. Head-mounted augmented reality to support reassurance and social interaction for autistic children with severe learning disabilities. Frontiers in Virtual Reality, 2023, 4. DOI 10.3389/frvir.2023.1106061 - HAL hal-04199416
Online and Local Videos
- Video presenting the long-term test at André Boulloche hospital (Vimeo, 3'30")
- Video presenting the preliminary acceptability testing at André Boulloche hospital (Vimeo, 1'30")
- Video presenting a session extract view from the practitioners only (local video)
- Video detailing the possible interactions, at home during the Covid pandemic (Vimeo, 3')
Figure 1. Screen captures of the serious game Magic Bubbles
Context
Magic Bubbles is a multisensory Augmented Reality (AR) environment created by the team in partnership with the André Boulloche day hospital, in the context of the PhD thesis of Valentin Bauer (supervision: P. Bourdot, T. Bouchara) and the DIM RFSI project AudioXR4TSA. Its goal is to reassure children with severe autism and to strengthen their relationship with practitioners. The paper thus illustrates the team's achievements on an application topic new to it, namely XR and disability, as well as new progress in the understanding of human perception in and through XR.
Based on audio, visual, and tactile stimuli, Magic Bubbles allows children to interact with appealing objects (e.g., the bubble column in Figure 1A-2), which are either generic (e.g., the soap bubbles in Figure 1B-5) or individualized (e.g., the drawings in Figures 1A and 1C-6). Practitioners can support exploration (e.g., with verbal or physical guidance) while perceiving what the child sees and hears through a screen monitor (Figure 1D).
Contribution
The design of the application followed a participatory design process. First, proposals were made based on a literature review and interviews with 34 stakeholders (autistic persons, family members and practitioners), reported in a journal paper (hal-03817642). These proposals were discussed with two practitioners to match the hospital setting, then evaluated and adapted with eleven other practitioners (conference paper hal-03817662). Pre-tests with ten children with neurodevelopmental conditions and intellectual disability then confirmed the application's acceptability and usability (conference paper hal-03817648).
Finally, the paper presented in this portfolio (hal-04199416) reports the results of a long-term study (six weekly AR sessions over three months) with seven autistic children with severe learning disabilities and complex needs. It provides empirical insights that could inform future Extended Reality (XR) autism research. First, it is, to our knowledge, one of the first XR studies conducted over the long term with autistic children who are minimally verbal and have an intellectual disability (Figure 2). Second, it is, to our knowledge, one of the first AR autism studies to use mixed methods of inquiry (qualitative and quantitative) relying largely on grounded theory; this approach may be adopted in future XR autism research targeting children who are minimally verbal and have an intellectual disability. Third, a detailed categorization of children's experiences with respect to their underlying causes was derived. Even if this categorization still needs to be validated with more autistic children, it could inform the methodology of future AR/VR studies focusing on children with autism or a related neurodevelopmental condition, by providing a tool to understand and analyze the quality of their holistic experiences.
Figure 2. Children with severe autism spectrum disorders testing our Magic Bubbles augmented reality application in sessions at the André Boulloche day hospital
Impact
Beyond the journal publication, we have presented this work at several seminars and invited talks, including the IHM'24 conference and the Forum des Sciences Cognitives.
This paper, and the Magic Bubbles application in general, is part of Valentin Bauer's PhD thesis, which received this year's best interdisciplinary thesis award from AFIHM.
This project also allowed us to initiate a project on music therapy in VR with the same André Boulloche day hospital and new partners from Denmark, both academic researchers and therapists (see, for instance, paper hal-04415204).
Articles: ShapeGuide and ShapeCompare
Yujiro Okuya, Nicolas Ladevèze, Cédric Fleury, and Patrick Bourdot. 2018. ShapeGuide: Shape-Based 3D Interaction for Parameter Modification of Native CAD Data. Frontiers in Robotics and AI, volume 5. DOI 10.3389/frobt.2018.00118 - HAL hal-01958226.
- PDF of the ShapeGuide paper (local copy)
- PDF of the ShapeGuide paper (HAL)
- Video (4') presenting the ShapeGuide paper (YouTube)
- Short video (50") presenting ShapeGuide (local copy)
Yujiro Okuya, Olivier Gladin, Nicolas Ladevèze, Cédric Fleury, and Patrick Bourdot. 2020. Investigating Collaborative Exploration of Design Alternatives on a Wall-Sized Display. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). New York, NY, USA, 1–12. DOI:10.1145/3313831.3376736. - HAL hal-02947245.
- PDF of the ShapeCompare paper (local copy)
- PDF of the ShapeCompare paper (HAL)
- Video (14'35) presenting the ShapeCompare paper (local copy)
- Short video (30") presenting ShapeCompare (local copy)
- More videos attached as Supplementary material
Summary
This portfolio element describes an immersive approach to editing native CAD data and parameters through direct interaction techniques on the resulting mesh (ShapeGuide), and to comparing design alternatives on a wall-sized display (ShapeCompare).
ShapeGuide allows users to modify the shape of a rear-view mirror within a realistic virtual environment of a car's cockpit, from a first-person perspective in the VENISE CAVE with haptic feedback; the user's modifications are propagated to the native CAD data model at interactive rates
ShapeCompare allows users to explore design alternatives using a wall-sized display
Contribution
In the field of design and engineering, the Ph.D. of Y. Okuya led to an important body of work on collaborative interaction between heterogeneous interactive systems, especially when such systems do not have comparable immersive characteristics (e.g., with or without stereoscopic or haptic rendering). Along with a generic VR-CAD integration framework for heterogeneous and remote collaboration, we designed several original interaction paradigms, such as ShapeGuide (a haptic and pseudo-haptic shape-based modification technique for CAD models in multi-sensorimotor immersive environments), published in Frontiers in Robotics and AI in 2018, and ShapeCompare (a collaborative CAD shape decision technique on a wall-sized display), presented at ACM CHI 2020.
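The papers describe these techniques at the level of interaction design rather than code, but the core loop of a shape-based modification technique like ShapeGuide can be sketched as below. cad_set_parameter and cad_tessellate are hypothetical stand-ins for the native CAD kernel calls, which the real system reaches through the VR-CAD integration framework; the guide axis and gain are illustrative values.

```python
# Hypothetical sketch of ShapeGuide's core loop: the user's hand (or
# haptic stylus) displacement along a guide axis is mapped to a native
# CAD parameter, and the model is re-tessellated at interactive rates.
import numpy as np

def cad_set_parameter(name: str, value: float) -> None:
    """Hypothetical stand-in: update one parameter of the native CAD model."""
    ...

def cad_tessellate():
    """Hypothetical stand-in: re-tessellate the CAD model into a mesh."""
    ...

GUIDE_AXIS = np.array([0.0, 0.0, 1.0])  # direction the parameter follows
GAIN = 0.5                              # parameter change per unit of hand motion

def on_hand_moved(prev_pos: np.ndarray, new_pos: np.ndarray,
                  param_name: str, param_value: float) -> float:
    """Map a hand displacement to a parameter change and refresh the mesh."""
    displacement = float(np.dot(new_pos - prev_pos, GUIDE_AXIS))
    param_value += GAIN * displacement
    cad_set_parameter(param_name, param_value)
    mesh = cad_tessellate()  # rendered (and felt haptically) next frame
    return param_value
```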
Impact
The results of this work are important for the use of such direct-interaction editing systems, as well as for the exploration of design alternatives in the automotive and aeronautics industries. This work also led to internal collaborations within the IaH department, especially with the ExSitu team, taking advantage of the merger between LIMSI and LRI.