Rémi Brouet - Multi-touch gesture interactions and deformable geometry for 3D editing on touch screens

Thursday, 12 March 2015, 09:30
Speaker: Rémi Brouet
Defense venue:

Grand Amphi of INRIA Rhône-Alpes, Montbonnot.

Jury:

Reviewers:

Joaquim JORGE, Full Professor (Lisbon)
Laurent GRISONI, Professor (Lille 1)

Examiners:

Martin HACHET, Research Scientist (INRIA Bordeaux)
Georges-Pierre BONNEAU, Professor (LJK)

Thesis supervisors:

Renaud BLANCH, Associate Professor (LIG)
Marie-Paule CANI, Professor (LJK)


Despite advances in the capture of existing objects and in procedural generation, content creation for virtual worlds cannot be performed without human interaction. This thesis proposes to exploit new touch devices ("multi-touch" screens) to provide easy, intuitive 2D interaction for navigating a virtual environment and for manipulating, positioning, and deforming 3D objects.

First, we study the possibilities and limitations of hand and finger gestures while interacting on a touch screen, in order to discover which gestures are best suited to editing 3D scenes and environments. In particular, we evaluate the effective number of degrees of freedom of the human hand when constrained to a planar surface. Meanwhile, we develop a new gesture analysis method that uses phases to identify key motions of the hand and fingers in real time.

These results, combined with several dedicated user studies, lead to a gestural design pattern that handles not only navigation (camera positioning) but also object positioning, rotation, and global scaling. This pattern is then extended to complex deformations (such as adding and deleting material, or bending and twisting parts of objects, with local control).

Building on these results, we propose and evaluate a 3D world editing interface with natural touch interaction, in which mode selection (i.e. navigation, object positioning, or object deformation) and task selection are handled automatically by the system, based on the gesture and the interaction context (without any menu or button). Finally, we extend this interface to more complex deformations, adapting garment transfer from one character to another in order to deform the garment interactively while the wearing character is deformed.
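
To give an idea of the kind of menu-free mode selection described above, here is a minimal Python sketch, assuming a simplified model: it is not the thesis implementation, and all names and thresholds (Touch, classify_phase, select_mode, eps) are illustrative assumptions. It segments two consecutive multi-touch frames into a motion phase, then picks a mode from the finger count and whether the fingers lie over an object.

    # Illustrative sketch of phase-based gesture analysis and context-driven
    # mode selection on a multi-touch screen. Hypothetical names/thresholds.
    import math
    from dataclasses import dataclass

    @dataclass
    class Touch:
        id: int
        x: float
        y: float

    def centroid(touches):
        n = len(touches)
        return (sum(t.x for t in touches) / n, sum(t.y for t in touches) / n)

    def mean_spread(touches, c):
        # Average distance of the fingers from their centroid.
        return sum(math.hypot(t.x - c[0], t.y - c[1]) for t in touches) / len(touches)

    def classify_phase(prev, curr, eps=2.0):
        """Label the dominant motion between two frames of the same touches."""
        pc, cc = centroid(prev), centroid(curr)
        pan = math.hypot(cc[0] - pc[0], cc[1] - pc[1])       # centroid translation
        spread = abs(mean_spread(curr, cc) - mean_spread(prev, pc))  # pinch amount
        if pan < eps and spread < eps:
            return "hold"
        return "pan" if pan >= spread else "pinch"

    def select_mode(touches, phase, over_object):
        """Mode from gesture + context, with no menu or button."""
        if not over_object:
            return "navigate"        # fingers in empty space: camera control
        if len(touches) <= 2:
            return "position" if phase == "pan" else "scale"
        return "deform"              # 3+ fingers on an object: local deformation

    # Usage: two fingers translating together over an object -> positioning.
    frame0 = [Touch(0, 10, 10), Touch(1, 30, 10)]
    frame1 = [Touch(0, 18, 10), Touch(1, 38, 10)]
    phase = classify_phase(frame0, frame1)                    # -> "pan"
    print(phase, select_mode(frame1, phase, over_object=True))  # -> pan position

The point of the sketch is the dispatch structure: the same finger motion maps to different tasks depending on what it is performed over, which is how an interface can avoid explicit mode switches.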