Affective Robot Story-Telling Human-Robot Interaction: Exploratory Real-Time Emotion Estimation Analysis Using Facial Expressions and Physiological Signals

IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 134051-134066 ◽  
Author(s):  
Mikel Val-Calvo ◽  
José R. Álvarez-Sánchez ◽  
José M. Ferrández-Vicente ◽  
Eduardo Fernández

Author(s):  
Min Raj Nepali ◽  
Priyanka C Karthik ◽  
Jharna Majumdar

<p>Advanced Robot for Interactive Application (ARIA) is a humanoid robotic head capable of mimicking various human facial expressions. Much prior work has implemented humanoid robotic heads on high-end systems and personal computers (PCs). This paper presents the essential elements necessary for implementing ARIA on a UDOO board. The main aim of the project was to develop a control system and graphical user interface (GUI) enabling ARIA to deliver real-time human facial expressions on an embedded board. Implementing ARIA involved careful selection of the embedded board, actuators, control algorithms, motor drivers, operating system, communication protocols, and programming languages. The board contains a quad-core A9 processor and an embedded controller, which are interconnected. In this project, the controller is dedicated to the micro servo motors that drive the eye, eyebrow, and eyelid movements, whereas the processor handles the Dynamixel motors, the GUI, and the various communication modules.</p>
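The processor/controller split described above can be sketched as a simple command-routing step: expression presets are divided into targets for the microcontroller (micro servos for eyes, eyebrows, eyelids) and targets for the processor (Dynamixel motors). This is a minimal illustrative sketch only; the joint names, angle values, and the `split_targets` function are hypothetical and not taken from the paper.

```python
# Hypothetical sketch of an ARIA-style actuation split (illustrative values).
# The embedded controller drives micro servos (eyes, eyebrows, eyelids);
# the application processor drives the Dynamixel motors.

# Expression presets as joint targets in degrees (made-up numbers).
EXPRESSIONS = {
    "neutral":  {"eyebrow_l": 90,  "eyebrow_r": 90,  "eyelid": 90,  "jaw": 0},
    "surprise": {"eyebrow_l": 120, "eyebrow_r": 120, "eyelid": 110, "jaw": 30},
}

# Joints owned by the microcontroller; everything else goes to the processor.
CONTROLLER_JOINTS = {"eyebrow_l", "eyebrow_r", "eyelid"}

def split_targets(expression):
    """Return (controller_cmds, processor_cmds) for a named expression."""
    targets = EXPRESSIONS[expression]
    controller = {j: a for j, a in targets.items() if j in CONTROLLER_JOINTS}
    processor = {j: a for j, a in targets.items() if j not in CONTROLLER_JOINTS}
    return controller, processor

if __name__ == "__main__":
    ctrl, proc = split_targets("surprise")
    print(ctrl)  # micro-servo targets forwarded to the controller
    print(proc)  # Dynamixel targets handled by the processor
```

In a real system the controller half would be serialised over the on-board link between processor and microcontroller, while the processor half would go straight to the Dynamixel bus; the split shown here only illustrates the division of responsibility.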


Author(s):  
Vignesh Prasad ◽  
Ruth Stock-Homburg ◽  
Jan Peters

Abstract
For some years now, the use of social, anthropomorphic robots in various situations has been on the rise. These are robots developed to interact with humans and equipped with corresponding extremities. They already support human users in various industries, such as retail, gastronomy, hotels, education and healthcare. During such Human-Robot Interaction (HRI) scenarios, physical touch plays a central role in the various applications of social robots, as interactive non-verbal behaviour is a key factor in making the interaction more natural. Shaking hands is a simple, natural interaction commonly used in many social contexts and is seen as a symbol of greeting, farewell and congratulations. In this paper, we take a look at the existing state of Human-Robot Handshaking research, categorise the works based on their focus areas, and draw out the major findings of these areas while analysing their pitfalls. We mainly see that some form of synchronisation exists during the different phases of the interaction. In addition, we find that factors such as gaze, voice and facial expressions can affect the perception of a robotic handshake, and that internal factors like personality and mood can affect the way in which handshaking behaviours are executed by humans. Based on these findings and insights, we finally discuss possible ways forward for research on such physically interactive behaviours.

