The importance of mutual gaze in human-robot interaction

2019 ◽  
Author(s):  
Kyveli Kompatsiari ◽  
Vadim Tikhanoff ◽  
Francesca Ciardo ◽  
Giorgio Metta ◽  
Agnieszka Wykowska

Mutual gaze is a key element of human development and constitutes an important factor in human interactions. In this study, we examined, through analysis of subjective reports, the influence of a humanoid robot's online eye contact on humans' reception of the robot. To this end, we manipulated the robot's gaze, i.e., mutual (social) gaze versus neutral (non-social) gaze, throughout an experiment involving letter identification. Our results suggest that people are sensitive to the mutual gaze of an artificial agent, that they feel more engaged with the robot when mutual gaze is established, and that eye contact supports attributing human-like characteristics to the robot. These findings are relevant both to human-robot interaction (HRI) research, for enhancing the social behavior of robots, and to cognitive neuroscience, for studying mechanisms of social cognition in relatively realistic social interactive scenarios.

2019 ◽  
Author(s):  
Kyveli Kompatsiari ◽  
Francesca Ciardo ◽  
Vadim Tikhanoff ◽  
Giorgio Metta ◽  
Agnieszka Wykowska

This paper reports a study in which we examined how users evaluated a humanoid robot depending on whether it established eye contact. In two experiments, we manipulated the robot's gaze, namely either looking at the subjects' eyes (mutual gaze) or at a socially neutral position (neutral gaze). Across the two experiments, we altered how predictive the robot's gaze direction was of a subsequent target stimulus (in Exp. 1 the gaze direction was non-predictive; in Exp. 2 it was counter-predictive). Results of subjective reports showed that participants were sensitive to eye contact. Moreover, participants were more engaged and ascribed higher intentionality to the robot in the mutual gaze condition relative to the neutral condition, independently of the predictiveness of the gaze cue. Our results suggest that embodied humanoid robots can establish eye contact, which in turn has a positive impact on the perceived socialness of the robot and on the quality of human-robot interaction (HRI). Therefore, establishing mutual gaze should be considered in the design of robot behaviors for social HRI.


Author(s):  
Giorgio Metta

This chapter outlines a number of research lines that, starting from the observation of nature, attempt to mimic human behavior in humanoid robots. Humanoid robotics is one of the most exciting proving grounds for the development of biologically inspired hardware and software—machines that try to recreate billions of years of evolution with some of the abilities and characteristics of living beings. Humanoids could be especially useful for their ability to “live” in human-populated environments, occupying the same physical space as people and using tools that have been designed for people. Natural human–robot interaction is also an important facet of humanoid research. Finally, learning and adapting from experience, the hallmark of human intelligence, may require some approximation to the human body in order to attain similar capacities to humans. This chapter focuses particularly on compliant actuation, soft robotics, biomimetic robot vision, robot touch, and brain-inspired motor control in the context of the iCub humanoid robot.


Author(s):  
Margot M. E. Neggers ◽  
Raymond H. Cuijpers ◽  
Peter A. M. Ruijten ◽  
Wijnand A. IJsselsteijn

Abstract Autonomous mobile robots that operate in environments with people are expected to be able to deal with human proxemics and social distances. Previous research has investigated how robots can approach persons or how to implement human-aware navigation algorithms. However, experimental research on how robots can avoid a person in a comfortable way is largely missing. The aim of the current work is to experimentally determine the shape and size of the personal space of a human passed by a robot. In two studies, both a humanoid and a non-humanoid robot were used to pass a person at different sides and distances, after which participants were asked to rate their perceived comfort. As expected, perceived comfort increases with distance. However, the shape was not circular: passing behind a person is more uncomfortable than passing in front, especially in the case of the humanoid robot. These results give us more insight into the shape and size of personal space in human–robot interaction. Furthermore, they can serve as necessary input to human-aware navigation algorithms for autonomous mobile robots, in which human comfort is traded off against efficiency goals.
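The asymmetric personal-space finding above is the kind of input a human-aware navigation cost function can encode, for instance as an anisotropic Gaussian that penalizes passing behind a person more than in front. The following is only an illustrative sketch, not the authors' model; the function name and all parameter values are assumptions:

```python
import math

def discomfort(dx, dy, sigma_front=0.8, sigma_back=1.2, sigma_side=0.9):
    """Illustrative anisotropic discomfort around a person at the origin,
    facing the +x direction. A wider sigma behind the person (dx < 0)
    penalizes rear passes more, matching the reported asymmetry."""
    sigma_x = sigma_front if dx >= 0 else sigma_back
    return math.exp(-0.5 * ((dx / sigma_x) ** 2 + (dy / sigma_side) ** 2))

# Discomfort decreases with distance and is higher behind the person:
assert discomfort(-1.0, 0.0) > discomfort(1.0, 0.0)
assert discomfort(1.0, 0.0) > discomfort(2.0, 0.0)
```

A planner could add this term to its path cost so that, all else being equal, routes passing in front of a person at a given distance are preferred over routes passing behind.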


2020 ◽  
Vol 12 (1) ◽  
pp. 58-73
Author(s):  
Sofia Thunberg ◽  
Tom Ziemke

Abstract Interaction between humans and robots will benefit if people have at least a rough mental model of what a robot knows about the world and what it plans to do. But how do we design human-robot interactions to facilitate this? Previous research has shown that one can change people's mental models of robots by manipulating the robots' physical appearance. However, this has mostly not been done in a user-centred way, i.e. without a focus on what users need and want. Starting from theories of how humans form and adapt mental models of others, we investigated how the participatory design method PICTIVE can be used to generate design ideas about how a humanoid robot could communicate. Five participants went through three phases based on eight scenarios from the state-of-the-art tasks in the RoboCup@Home social robotics competition. The results indicate that participatory design can be a suitable method to generate design concepts for robots' communication in human-robot interaction.


Author(s):  
Stefan Schiffer ◽  
Alexander Ferrein

In this work, we report on our effort to design and implement an early introduction to basic robotics principles for children of kindergarten age. The humanoid robot Pepper, a great platform for human-robot interaction experiments, presented the lecture by reading out the contents to the children using its speech synthesis capability. One of the main challenges of this effort was to explain complex robotics contents in such a way that pre-school children could follow the basic principles and ideas, using examples from their world of experience. A quiz in the style of a Runaround game show after the lecture activated the children to recap the contents they had acquired about how mobile robots work in principle. Besides the thrill of being exposed to a mobile robot that would also react to them, the children were very excited and at the same time very concentrated. What sets our effort apart from other work is that part of the lecturing is actually done by a robot itself and that the quiz at the end of the lesson is also done using robots. To the best of our knowledge, this is one of only a few attempts to use Pepper not as a tele-teaching tool but as the teacher itself, in order to engage pre-school children with complex robotics contents. We received very positive feedback from the children as well as from their educators.


2018 ◽  
Vol 9 (1) ◽  
pp. 221-234 ◽  
Author(s):  
João Avelino ◽  
Tiago Paulino ◽  
Carlos Cardoso ◽  
Ricardo Nunes ◽  
Plinio Moreno ◽  
...  

Abstract Handshaking is a fundamental part of human physical interaction that is transversal to various cultural backgrounds. It is also a very challenging task in the field of Physical Human-Robot Interaction (pHRI), requiring compliant force control in order to plan the arm's motion and to achieve a confident, but at the same time pleasant, grasp of the human user's hand. In this paper, we focus on the study of hand grip strength for comfortable handshakes and perform three sets of physical interaction experiments, with twenty human subjects in the first experiment, thirty-five in the second, and thirty-eight in the third. Tests are made with a social robot whose hands are instrumented with tactile sensors that provide skin-like sensation. From these experiments, we: (i) learn the preferred grip closure for each user group; (ii) analyze the tactile feedback provided by the sensors for each closure; (iii) develop and evaluate a hand grip controller based on the previous data. In addition to the robot-human interactions, we also study the robot's handshake interactions with inanimate objects, in order to detect whether it is shaking hands with a human or an inanimate object. This work adds physical human-robot interaction to the repertory of social skills of our robot, fulfilling a demand previously identified by many users of the robot.
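A grip controller of the kind described above can be pictured as a closed loop that drives measured tactile pressure toward a preferred target. The sketch below is purely illustrative; the function names, the proportional control law, the gain, and the toy linear sensor model are all assumptions, not the paper's actual controller:

```python
def grip_step(closure, pressure, target_pressure, gain=0.05,
              min_closure=0.0, max_closure=1.0):
    """One control step: tighten the hand when measured tactile pressure
    is below the preferred target, loosen it when above, and keep the
    commanded closure within the hand's mechanical limits."""
    closure += gain * (target_pressure - pressure)
    return max(min_closure, min(max_closure, closure))

# Simulated handshake: assume pressure grows linearly with closure
# (a toy stand-in for the skin-like tactile sensors).
closure = 0.0
for _ in range(200):
    pressure = 2.0 * closure
    closure = grip_step(closure, pressure, target_pressure=1.0)
assert abs(2.0 * closure - 1.0) < 0.01  # settles near the target pressure
```

In practice the target pressure would come from the learned per-group grip preferences, and the measured pressure from the tactile sensor array rather than a linear model.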


2009 ◽  
Vol 6 (3-4) ◽  
pp. 369-397 ◽  
Author(s):  
Kerstin Dautenhahn ◽  
Chrystopher L. Nehaniv ◽  
Michael L. Walters ◽  
Ben Robins ◽  
Hatice Kose-Bagci ◽  
...  

2007 ◽  
Vol 23 (5) ◽  
pp. 840-851 ◽  
Author(s):  
Rainer Stiefelhagen ◽  
Hazim Kemal Ekenel ◽  
Christian Fugen ◽  
Petra Gieselmann ◽  
Hartwig Holzapfel ◽  
...  

2013 ◽  
Vol 5 (4) ◽  
pp. 491-501 ◽  
Author(s):  
Elisabeth T. van Dijk ◽  
Elena Torta ◽  
Raymond H. Cuijpers
