Department for Informatics | Sitemap | LMU-Portal

Facial Expressions and Communication Patterns in Robots: A Face Design

Master's thesis

Status: in progress
Advisor: Jan Leusmann
Professor: Prof. Dr. Sven Mayer

Task

Facial expressions are a powerful form of communication that humans use to convey a wide range of non-verbal cues. Anticipating that robots will become an integral part of our daily interactions, we want to explore how different facial expressions of a humanoid robot affect human-robot interaction, as we currently lack a clear understanding of how humans perceive facial expressions in robots. Do users want robots to show human-like levels of expressiveness in collaborative tasks? Which states are essential for a robot to express through facial expressions? Can we perceive robots as curious, uncertain, or confident from their facial expressions alone?

You will:

  • Perform a literature review on the facial expressions of robots and other digital assistants
  • Identify what content is appropriate to convey through facial expressions in the context of robot learning and collaboration
  • Identify at least three core concepts for creating and designing robot facial expressions
  • Iteratively design different robot faces with various expressions
  • Conduct a user study to investigate how different robot facial expressions affect humans
  • Summarize your findings in a thesis and present them
  • (Optional) Co-write a research paper

You need:

  • Strong communication skills in English
  • Very good design skills
  • (Optional) Prior experience in animation design
  • Fundamental programming skills (Python)

Keywords

human-robot interaction, robots
Last modified on 2020-04-11 by Changkun Ou (rev 35667)