2019, Article / Letter to editor.

Background: A large part of the communication cues exchanged between persons is nonverbal. Persons with a visual impairment are often unable to perceive these cues, such as gestures or facial expressions of emotions. In a previous study, we determined that visually impaired persons can improve their ability to recognize facial expressions of emotions from validated pictures and videos by using an emotion recognition system that signals vibrotactile cues associated with one of the six basic emotions.

Objective: The aim of this study was to determine whether the previously tested emotion recognition system works as well in realistic situations as under controlled laboratory conditions.

Methods: The emotion recognition system consists of a camera mounted on spectacles, a tablet running facial emotion recognition software, and a waist belt with vibrotactile stimulators that provide haptic feedback representing Ekman's six universal emotions. A total of 8 visually impaired persons (4 female, 4 male; mean age 46.75 years, range 28-66 years) participated in two training sessions followed by one experimental session. During the experiment, participants engaged in two 15-minute conversations, in one of which they wore the emotion recognition system. To conclude the study, exit interviews were conducted to assess the participants' experiences. Owing to technical issues with the recordings of the emotion recognition software, only 6 participants were included in the video analysis.

Results: Participants quickly learned to distinguish and remember the vibrotactile signals associated with the six emotions. A total of 4 participants felt able to use the vibrotactile signals during the conversation. Moreover, 5 of the 6 participants had no difficulty keeping the camera focused on the conversation partner. The emotion recognition software was highly accurate in detecting happiness but performed unsatisfactorily in recognizing the other five universal emotions.

Conclusions: The system requires essential improvements in performance and wearability before it is ready to support visually impaired persons in their daily interactions. Nevertheless, the participants saw potential in the system as an assistive technology, provided their user requirements can be met.
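The pipeline the abstract describes (recognized emotion → tactor on the waist belt) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the names `EMOTION_TO_TACTOR`, `signal_emotion`, and the confidence threshold are assumptions for the sake of the example.

```python
# Hypothetical sketch of the emotion-to-tactor mapping described above.
# The six Ekman emotions each map to one of six tactors on the waist belt;
# tactor indices and the confidence threshold are illustrative assumptions.

EMOTION_TO_TACTOR = {
    "happiness": 0,
    "sadness": 1,
    "anger": 2,
    "fear": 3,
    "disgust": 4,
    "surprise": 5,
}

CONFIDENCE_THRESHOLD = 0.6  # assumed: only signal confident classifications


def signal_emotion(emotion, confidence):
    """Return the tactor index to vibrate for a recognized emotion,
    or None if the classification is too uncertain to signal."""
    if confidence < CONFIDENCE_THRESHOLD:
        return None
    return EMOTION_TO_TACTOR.get(emotion)
```

In a real system the returned index would drive the corresponding vibrotactile stimulator; suppressing low-confidence detections is one way to avoid distracting the wearer with spurious cues.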
Schematic overview of the used system.
Emotion mapping. The mapping of Ekman's universal emotions onto the waist belt.
Crosstabs of agreement between coders and software. The table tallies the number of times the coders and FaceReader classified a fragment as a particular emotion; the diagonal shows the number of times the coders and FaceReader classified a fragment as the same emotion.
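A crosstab of this kind, and the percent agreement it implies, can be computed from paired labels. The sketch below is illustrative only; the function names and the toy data are assumptions, not the study's actual fragments or figures.

```python
# Illustrative sketch: tally coder vs. software labels into a crosstab
# and read agreement off the diagonal. Data below is made up.
from collections import Counter


def crosstab(coder_labels, software_labels):
    """Count (coder, software) label pairs; diagonal cells are agreements."""
    return Counter(zip(coder_labels, software_labels))


def percent_agreement(table):
    """Fraction of fragments where coder and software chose the same emotion."""
    total = sum(table.values())
    agree = sum(n for (coder, software), n in table.items() if coder == software)
    return agree / total if total else 0.0


coders = ["happiness", "anger", "happiness", "sadness"]
software = ["happiness", "happiness", "happiness", "fear"]
table = crosstab(coders, software)
print(percent_agreement(table))  # 0.5
```

Percent agreement is the simplest summary of such a table; chance-corrected measures such as Cohen's kappa are often reported alongside it.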
Figures uploaded by Hendrik Buimer.
2017, Article in monograph or in proceedings (Poster presented at the ACM conference ASSETS '17, October 29-November 1, 2017, Baltimore, MD, USA, pp. 331-332).

One of the big problems visually impaired persons experience in their daily lives is the inability to see nonverbal cues of conversation partners. In this study, a wearable assistive technology is presented and evaluated that supports visually impaired persons in recognizing facial expressions of emotions. The wearable assistive technology consists of a camera clipped onto spectacles, emotion recognition software, and a vibrotactile belt with six tactors. An earlier controlled experimental study showed that users of the system improved significantly in their ability to recognize emotions from validated stimuli. In this paper, the next iteration in testing the system is presented, in which a more realistic usage situation was simulated. Eight visually impaired persons were invited to participate in conversations with an actor, who was instructed not to exaggerate his facial expressions. Participants engaged in two 15-minute mock job interview conversations, during one of which they wore the system; in the other conversation, no assistive technologies were used. The preliminary results showed that the concept of such wearable assistive technologies remains feasible. Participants found it easy to learn and interpret the vibrotactile cues, as was also reflected in their training performance. Furthermore, most participants could use the vibrotactile cues while staying engaged in the conversation. Nevertheless, some improvements are needed before the system can be used as an assistive technology. The accuracy of the system was negatively affected by the lighting and movement conditions present in realistic conversations, compared with the controlled experimental condition. Furthermore, participants requested improvements to the wearability of the system.