I am a creative technologist (see my portfolio)
I help other artists use new technologies to achieve their vision.
I have fun prototyping live stage robots and synaesthetic installations.
I am a media artist (see my portfolio)
Somewhere in between robotics and magic, I explore technological solutions for perceptual trickery, be it for art or science.
I invent systems where technology fades to give way to narratives and reveries.
I am a researcher (see my publications)
in haptics, virtual reality and human-machine interaction.
I received a PhD in haptics within the HYBRID team of Inria Rennes, in collaboration with the Immersive Lab of InterDigital (formerly Technicolor R&I).
My participation in the French "3-minute thesis" contest!
The challenge is to present one's PhD work to the general public in under 180 seconds, with a single slide.
If you speak French, you'll get an insight into the questions I faced during my PhD...
While virtual reality applications flourish, there is a growing need for technological solutions that induce compelling self-motion sensations, as an alternative to cumbersome motion platforms. Haptic devices target the sense of touch, yet a growing number of researchers have managed to address the sense of motion by means of specific, localized haptic stimulations. This innovative approach constitutes a specific paradigm that can be called “haptic motion”. This paper aims to introduce, formalize, survey and discuss this relatively new research field. First, we summarize some core concepts of self-motion perception, and propose a definition of the haptic motion approach based on three criteria. Then, we present a summary of the existing literature, from which we formulate and discuss three research problems that we consider key to the development of the field: the rationale for designing a proper haptic stimulus, the methods for evaluating and characterizing self-motion sensations, and the use of multimodal motion cues.
The sensation of self-motion is essential in many virtual reality applications, from entertainment to training, such as flight and driving simulators. While the common approach used in amusement parks is to actuate the seats with cumbersome systems, multisensory integration can also be leveraged to obtain rich effects from lightweight solutions. In this paper, we introduce a novel approach called the “Kinesthetic HMD”: actuating a head-mounted display with force feedback in order to provide sensations of self-motion. We discuss its design considerations and demonstrate an augmented flight simulator use case with a proof-of-concept prototype. We also conducted a user study assessing our approach’s ability to enhance self-motion sensations; taken together, our results show that the Kinesthetic HMD provides significantly stronger and more egocentric sensations than a visual-only self-motion experience. Thus, by providing congruent vestibular and proprioceptive cues related to balance and self-motion, the Kinesthetic HMD represents a promising approach for a variety of virtual reality applications in which motion sensations are prominent.
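To make the idea concrete, here is a minimal sketch of how such a control loop could look, assuming a hypothetical `read_vehicle_acceleration` source and `apply_hmd_force` actuation interface; the gain and force limit are illustrative tuning values, not the ones from the paper.

```python
import numpy as np

# Hypothetical sketch of a Kinesthetic-HMD-style control loop: the force
# applied to the headset is derived from the virtual vehicle's acceleration,
# then scaled and clamped for comfort. `read_vehicle_acceleration` and
# `apply_hmd_force` are placeholders, not an actual API.

GAIN = 0.5        # N per m/s^2, assumed tuning value
MAX_FORCE = 3.0   # N, assumed safety/comfort clamp

def motion_cue_force(vehicle_accel: np.ndarray) -> np.ndarray:
    """Map virtual self-motion acceleration to a headset force cue."""
    force = GAIN * vehicle_accel          # push the head along the motion axis
    norm = np.linalg.norm(force)
    if norm > MAX_FORCE:                  # clamp magnitude, keep direction
        force *= MAX_FORCE / norm
    return force

# Per-frame usage (pseudo-loop):
# accel = read_vehicle_acceleration()    # from the flight simulator
# apply_hmd_force(motion_cue_force(accel))
```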
The development of tactile screens opens new perspectives for co-located images and haptic rendering, leading to the concept of “haptic images”. These emerge from the combination of image data, rendering hardware, and haptic perception, and enable one to perceive haptic feedback while manually exploring an image. This nevertheless raises two scientific challenges, which serve as the thematic axes of this survey. First, the choice of appropriate haptic data raises a number of issues regarding human perception, measurement, modeling and distribution. Second, the choice of appropriate rendering technology implies a difficult trade-off between expressiveness and usability.
Touchscreens have spread widely over the last decade and have become one of the most common human-machine interfaces. However, despite their many assets, touchscreens still lack tactile sensations: they always feel flat, smooth, rigid and static under the finger, no matter the visual content. In this work, we investigate how to provide touchscreens with the means to touch us and express a variety of image-related haptic features.
We first propose a new format for haptic data which provides a generic haptic description of a virtual object without prior knowledge of the display hardware.
This format is meant to be seamlessly integrated into audiovisual content creation workflows, and to be easily manipulated by non-experts in multidisciplinary contexts.
Then, we address the challenge of providing a diversity of haptic sensations with lightweight actuation, through a novel approach called “KinesTouch”. In particular, we propose a novel friction effect based on large lateral motion
that increases or diminishes the sliding velocity between the finger and the screen.
Finally, we introduce “Touchy”, a method that applies pseudo-haptic principles to touchscreen interactions. We present a set of pseudo-haptic effects which evoke haptic properties such as roughness, stiffness or friction through the
vibrations, stretches, dilations and compressions of a ring-shaped cursor displayed under the user’s finger. We extend these effects to 3D scenes, and discuss the differences between 2D and 3D content enhancement.
"Touchy" is a novel approach to enhance images on touchscreens with haptic effects through purely visual cues. A symbolic cursor is introduced under the user's finger(s), which shape and motion are altered in order to express a variety of haptic properties : hardness/softness, roughness/smoothness, bumpiness/flatness, or stickiness/slipperiness. Because it is purely software, Touchy does not require additional hardware and is very easy to widespread. It is multitouch and several users can experiment independent pseudo-haptic effects simultaneoulsy. We propose a gallery of haptic images showcasing six different effects on a tactile tablet.
In this paper, we introduce KinesTouch, a novel approach for tactile screen enhancement providing four types of haptic feedback with a single force-feedback device: compliance, friction, fine roughness, and shape. We present the design and implementation of a corresponding set of haptic effects as well as a proof-of-concept setup. Regarding friction in particular, we propose a novel effect based on large lateral motion that increases or diminishes the sliding velocity between the finger and the screen. A user study was conducted on this effect to confirm its ability to produce distinct sliding sensations. Visual cues were confirmed to influence sliding judgments, but further studies would help clarify the role of tactile cues. Finally, we showcase several use cases illustrating the possibilities offered by KinesTouch to enhance 2D and 3D interactions on tactile screens in various contexts.
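The friction effect boils down to translating the screen with or against the finger so as to change their relative sliding velocity. Below is a hedged sketch of that mapping; the gain value, the `friction_level` parameterization and the `set_screen_velocity` placeholder are assumptions for illustration, not the paper's implementation.

```python
# Sketch of a KinesTouch-style friction effect: the force-feedback device
# translates the screen along the finger's motion. A positive gain moves the
# screen with the finger (less relative sliding, "sticky" feel); a negative
# gain moves it against the finger (more sliding, "slippery" feel).

def screen_velocity_command(finger_velocity: float, friction_level: float) -> float:
    """friction_level in [-1, 1]: -1 = slippery, 0 = neutral, +1 = sticky."""
    GAIN = 0.8  # fraction of finger velocity transferred to the screen (assumed)
    return GAIN * friction_level * finger_velocity

# Relative sliding velocity perceived by the finger:
#   v_slide = finger_velocity - screen_velocity
#           = finger_velocity * (1 - GAIN * friction_level)
#
# Per-frame usage (pseudo-loop):
# v_finger = track_finger_velocity()   # from touch input (placeholder)
# set_screen_velocity(screen_velocity_command(v_finger, friction_level))
```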
In this paper, we propose a new format for haptic texture mapping which does not depend on the haptic rendering hardware. Our "haptic material" format encodes ten elementary haptic features in dedicated maps, similarly to the "materials" used in computer graphics. These ten features enable the expression of compliance, surface geometry and friction attributes through vibratory, cutaneous and kinesthetic cues, as well as thermal rendering. The diversity of haptic data allows various hardware setups to share this single format, each selecting which features to render depending on its capabilities.
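As an illustration of how such a format might be handled in code, here is a sketch of a haptic material container with a capability-based filter; the feature names below are assumptions chosen for readability, not the ten features actually defined by the format.

```python
from dataclasses import dataclass
from typing import Optional
import numpy as np

# Illustrative "haptic material": per-texel feature maps bundled like the
# texture maps of a graphics material. A device renders only the maps it
# supports and ignores the rest. Feature names are assumed, not the spec's.

@dataclass
class HapticMaterial:
    stiffness: Optional[np.ndarray] = None    # compliance cues
    relief: Optional[np.ndarray] = None       # coarse surface geometry
    roughness: Optional[np.ndarray] = None    # fine geometry / vibrotactile
    friction: Optional[np.ndarray] = None     # sliding resistance
    temperature: Optional[np.ndarray] = None  # thermal rendering
    # ... remaining maps omitted in this sketch

    def renderable_features(self, device_caps: set) -> dict:
        """Keep only the maps this device can render, dropping the rest."""
        maps = {k: v for k, v in self.__dict__.items() if v is not None}
        return {k: v for k, v in maps.items() if k in device_caps}
```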
With the growth of virtual reality setups, digital sculpting tools are becoming more and more immersive. It is now possible to create a piece of art within a virtual environment, directly with the controllers. However, these devices do not let one touch the virtual material as a sculptor would. To tackle this issue, we investigate in this paper the use of a tangible surface suited to virtual reality setups. We designed a low-cost prototype composed of two layers of sensors in order to measure a wide range of pressures. We also propose two techniques to map our device onto a virtual 3D mesh to be sculpted.
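The following sketch shows one plausible way to merge two sensor layers into a single wide-range pressure estimate, assuming a sensitive top layer that saturates early and a stiffer layer underneath; the reading functions, threshold and combination rule are all assumptions, not the prototype's calibration.

```python
# Hypothetical two-layer pressure reading: the sensitive top layer covers
# light touches and saturates early, while the stiffer bottom layer takes
# over for strong presses. `read_top`/`read_bottom` are placeholders for
# the actual sensor drivers.

TOP_MAX = 5.0  # N, assumed saturation point of the sensitive layer

def combined_pressure(top_reading: float, bottom_reading: float) -> float:
    """Merge the two layers into a single wide-range pressure estimate."""
    if top_reading < TOP_MAX:
        return top_reading               # light touch: trust the fine layer
    return TOP_MAX + bottom_reading      # strong press: add the coarse layer

# Per-sample usage (pseudo-loop):
# pressure = combined_pressure(read_top(), read_bottom())
```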
A texture rendering system relying on pseudo-haptic and audio feedback is presented in this paper. While the user touches the texture displayed on a tactile screen, the associated image is deformed according to the contact area and the rubbing motion, to simulate pressure. Additionally, audio feedback is synthesized in real time to simulate friction. A novel example-based scheme takes advantage of audio samples of friction, recorded between actual textures and a finger at several speeds, to synthesize the final output sound. This system can be implemented on any existing tactile screen without any extra mechanical device.
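To illustrate the example-based idea, here is a sketch that crossfades between recordings made at the two nearest rubbing speeds; the data layout, gain rule and function names are assumptions for illustration, not the paper's synthesis scheme.

```python
import numpy as np

# Sketch of example-based friction sound: recordings of a finger rubbing the
# real texture at a few known speeds are blended according to the current
# rubbing speed. `recordings` maps a speed in m/s to a mono sample buffer.

def friction_frame(recordings: dict, speed: float, frame: int, n: int) -> np.ndarray:
    """Return n output samples for the current rubbing speed."""
    speeds = sorted(recordings)
    lo = max([s for s in speeds if s <= speed], default=speeds[0])
    hi = min([s for s in speeds if s >= speed], default=speeds[-1])
    a = recordings[lo][frame:frame + n]
    b = recordings[hi][frame:frame + n]
    w = 0.0 if hi == lo else (speed - lo) / (hi - lo)  # crossfade weight
    gain = min(1.0, speed)  # louder when rubbing faster (assumed rule)
    return gain * ((1.0 - w) * a + w * b)
```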