#FHWSPodcast: A new aid for the visually impaired
![Profile photo of the interviewer: Alicia Weigel](/fileadmin/ki/bilder/forschung/Alicia_Weigel-rund.png)
Interviewer:
Alicia Weigel
![Prof. Dr. Nicolas Müller](/fileadmin/ki/bilder/forschung/Mueller-Nicholas.png)
Interviewee:
Prof. Dr. Nicolas Müller
In our #FHWSPodcast, Prof. Nicholas Müller (socio-informatics) explains how persons with a visual impairment can nevertheless perceive facial expressions during a conversation.
For sighted people, it is perfectly normal to pay attention to the facial expressions of listeners while talking. If someone knits their brows, it is immediately clear that they did not understand what was said. But what about people with visual impairments? They lack this information, which can often lead to misunderstandings. This is precisely where Prof. Nicholas Müller, professor of socio-informatics, wants to help with the “vibrotactile emotion recognition” project, in which artificial intelligence plays a major role.
You can find out exactly what “vibrotactile emotion recognition” means and how it is implemented in practice in our #FHWSPodcast.
Download the transcript of this #FHWSPodcast in English (PDF)