Labex PERSYVAL contract, Sept. 2013 - Aug. 2015
TAngibilité Physiologique Instrumentée : Outil mixte redimensionnable pour la Conception d'Artefact (Instrumented Physiological Tangibility: a resizable mixed tool for artifact design)
Project web page (last update: June 2015):
This project studies the convergence of the physical and digital worlds, aiming to identify and engineer the gesture-recognition-based interactive systems of the future, built on Resizable Mixed Tools (RMTs). These tools are dedicated to the design of physical artifacts, such as industrial products, and of digital artifacts, such as 3D models.
Thanks to technological advances such as the Microsoft Kinect, gesture-based interaction is now regarded as a natural and intuitive way to interact with computers. It opens up a wide range of potential application areas, such as health care and education. Gestures also play an important role in thinking and collaboration, notably in design activities.
In particular, tangible gesture interaction can play a positive role in collaborative design. However, designers end up using many tangible devices of different sizes in order to balance their need for input real estate against their need for workspace. Being able to balance both needs continuously is important, especially for manipulating and visualizing 3D CAD models, but the problem is difficult because it challenges today's interaction paradigms and technologies.
At the heart of this project, RMTs are objects that combine physical and digital representations with resizing capabilities and serve as tools for design activities. By supporting natural interactions and activities anchored in the physical world, we envision a system that (1) adapts interaction to users' needs, (2) proactively assists users with their tasks through gesture and intention recognition, and (3) facilitates and increases the efficiency of artifact design activities.
Addressing this vision requires an interdisciplinary approach; four major, strongly related, and difficult scientific challenges are identified:
- Robustly and accurately recognizing forearm grasping gestures and identifying users' intentions, so that an interactive system can proactively assist users (e.g. designers) with their tasks.
- Identifying relevant tangible and resizable interaction techniques based on RMTs, and engineering interactive systems able to adapt interaction to users' needs and gestures.
- Engineering appropriate RMT-based design tools that adapt to designers' needs and provide information on the grasping properties of the product being designed.
- Providing 3D digital representations for design activities that allow rich deformation modes, coupled with relevant RMT-based interaction techniques for manipulating deformable 3D models.
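As a minimal illustration of the first challenge, surface-EMG gesture recognition is typically cast as extracting features over sliding windows of the signal and classifying each window. The sketch below uses synthetic data and a nearest-centroid rule purely for illustration; the window length, features, and classifier are assumptions, not the project's actual method.

```python
import numpy as np

def emg_features(signal, win=200, step=100):
    """Sliding-window features often used for surface EMG:
    root-mean-square (amplitude) and zero-crossing count (a frequency proxy)."""
    feats = []
    for start in range(0, len(signal) - win + 1, step):
        w = signal[start:start + win]
        rms = np.sqrt(np.mean(w ** 2))
        zc = int(np.sum(np.diff(np.signbit(w))))  # sign changes in the window
        feats.append((rms, zc))
    return np.array(feats, dtype=float)

# Hypothetical data: low-amplitude noise stands in for "rest" and a
# higher-amplitude burst for "grasp" (real sEMG would come from forearm
# electrodes; this only sketches the shape of the pipeline).
rng = np.random.default_rng(0)
rest = 0.05 * rng.standard_normal(2000)
grasp = 0.5 * rng.standard_normal(2000)

f_rest, f_grasp = emg_features(rest), emg_features(grasp)

# Standardize so RMS and zero-crossings share a common scale, then label
# new windows by the nearest class centroid.
stacked = np.vstack([f_rest, f_grasp])
mu, sd = stacked.mean(axis=0), stacked.std(axis=0)
centroids = {
    "rest": ((f_rest - mu) / sd).mean(axis=0),
    "grasp": ((f_grasp - mu) / sd).mean(axis=0),
}

def classify(feat):
    z = (feat - mu) / sd
    return min(centroids, key=lambda k: np.linalg.norm(z - centroids[k]))
```

In practice the project targets continuous recognition and intention inference, which would replace the toy centroid rule with a trained classifier over multi-channel electrode data.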
These challenges span several scientific domains: human-computer interaction, tangible gesture interaction, and resizable user interfaces; cooperative design and design practices with CAD tools; EMG signal processing and classification; and 3D modeling and deformable 3D models. The project fosters collaboration between researchers with interdisciplinary expertise from GIPSA-lab (EMG), G-SCOP (cooperative design), LIG (HCI), and LJK (3D models).
G-SCOP (Cooperative Design project),
GIPSA-lab (SAIGA team),
EVASION team (LJK)