
WP3 - Robust Human/Machine Interaction


Autonomous systems always involve interaction with people: the operators for whom, or with whom, they are deployed. This interaction may be limited to supervising the operation of the system and its tasks or missions, but human-system interactions can also be much more intense over time and involve several communication modalities (verbal, gestural, haptic, etc.), up to and including physical interaction. These interactions, which take place through appropriate interfaces, must be efficient, fluent and safe. For systems with a high degree of autonomy, they must give the operator a certain transparency into the functioning of the system, an understanding of its actions, and more generally a certain level of control over the system and its actions. What we call robustness of human-system interactions covers all of these dimensions.

This work package considers the safe physical interactions (cobotics) between robotic systems and operators that may be required for task learning (trajectories, exerted efforts, etc.), for task execution in synergy between human and system (co-manipulation, gesture assistance, etc.), for the control of system activities, and so on. Generally speaking, in these interactions the system must guarantee safe and secure operation. We therefore study how the location and activities of operators can be detected robustly and how the control of the system can enforce certain constraints (maximum interaction energy, non-collision, etc.).
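As a purely illustrative sketch of the kind of constraint mentioned above (not the actual controllers studied in the work package), the short Python fragment below scales a commanded end-effector velocity so that the kinetic energy available at a potential contact stays below an assumed admissible bound; the function name, the energy bound and the mass value are hypothetical assumptions.

import numpy as np

# Illustrative sketch only: cap the kinetic energy of a commanded motion,
# in the spirit of a "maximum interaction energy" constraint.
E_MAX = 0.5  # assumed admissible energy at contact [J]

def limit_velocity(v_cmd: np.ndarray, effective_mass: float) -> np.ndarray:
    """Return a velocity command whose kinetic energy does not exceed E_MAX.

    v_cmd          : desired Cartesian velocity of the end effector [m/s]
    effective_mass : robot mass perceived at the contact point [kg]
    """
    speed = np.linalg.norm(v_cmd)
    if speed == 0.0:
        return v_cmd
    # Kinetic energy of the commanded motion: E = 1/2 * m * |v|^2
    energy = 0.5 * effective_mass * speed**2
    if energy <= E_MAX:
        return v_cmd
    # Scale the speed so that 1/2 * m * v_max^2 == E_MAX
    v_max = np.sqrt(2.0 * E_MAX / effective_mass)
    return v_cmd * (v_max / speed)

# Example: a 10 kg effective mass moving at 0.5 m/s carries 1.25 J,
# so the command is scaled down to about 0.32 m/s.
print(limit_velocity(np.array([0.5, 0.0, 0.0]), effective_mass=10.0))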

We also investigate the remote operation (remote piloting) of autonomous systems (UGVs, UAVs, manipulator robots) so as to guarantee their control properties (controllability, stability) and their adaptation to the tasks and to operating difficulties. We address the robustness of performance to parameter variations (LPV models), as is the case for space telemanipulators subjected to strong temperature gradients, the stability of systems communicating over channels with variable time delay, and questions of intuitive teleoperation. The synthesis of synergistic behaviour models, through the learning of prediction models of the physical and visual interactions of the systems, is investigated in the specific framework of remote operation with visuo-haptic feedback.
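To make the LPV (linear parameter-varying) idea concrete, the following Python sketch, under assumed and purely illustrative values, models a single telemanipulator joint as a mass-spring-damper whose stiffness and damping are scheduled on temperature, so that the same input produces different responses across a thermal gradient; it is not one of the project's models.

import numpy as np

def joint_matrices(temperature_c: float):
    """Return (A, B) of dx/dt = A x + B u for a given temperature [deg C]."""
    m = 2.0                                    # apparent inertia [kg] (assumed)
    k = 500.0 * (1.0 - 0.002 * temperature_c)  # stiffness drifts with temperature
    d = 5.0 * (1.0 + 0.004 * temperature_c)    # damping drifts with temperature
    A = np.array([[0.0, 1.0],
                  [-k / m, -d / m]])
    B = np.array([[0.0], [1.0 / m]])
    return A, B

def simulate_step(temperature_c: float, t_end=2.0, dt=1e-3) -> float:
    """Euler simulation of the joint position response to a unit force step."""
    A, B = joint_matrices(temperature_c)
    x = np.zeros((2, 1))
    for _ in range(int(t_end / dt)):
        x = x + dt * (A @ x + B * 1.0)  # constant unit input
        # A gain-scheduled controller would recompute (A, B) here if the
        # scheduling parameter (temperature) evolved during the motion.
    return float(x[0, 0])

# The same fixed controller tuned at -50 deg C sees a different plant at +80 deg C:
for T in (-50.0, 20.0, 80.0):
    print(f"T = {T:+.0f} degC -> final position {simulate_step(T):.4f} m")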

The notion of agency in human-autonomous system interactions, i.e. the feeling of control over the actions of autonomous systems, the understanding of those actions and ultimately the operator's trust in the system, is assessed by various methods based on neuro-physiological correlates (ECG, actimetry, neural correlates of consciousness, etc.). In the context of autonomous vehicles, for example, from an ergonomics standpoint interaction, or at least information, is particularly necessary for the user to understand the intentions of the system well enough to place their trust in it (the required level of understanding being itself an open question).

More broadly, data collection is coupled with gesture analysis, and even with the monitoring of human/robot interactions at the reactive level. Indeed, the corrections applied through the control of a robot provide indications about the reactive decisions of humans and therefore about their behaviour.

Teams Involved

ONERA / AUCTUS / LABRI DART / LaBRI IS / IMS BSA / MNEMOSYNE