2019, Article / Letter to editor (2019). Background: A large part of the communication cues exchanged between persons is nonverbal. Persons with a visual impairment are often unable to perceive these cues, such as gestures or facial expressions of emotions. In a previous study, we determined that visually impaired persons can increase their ability to recognize facial expressions of emotions from validated pictures and videos by using an emotion recognition system that signals vibrotactile cues associated with one of the six basic emotions. Objective: The aim of this study was to determine whether the previously tested emotion recognition system worked as well in realistic situations as under controlled laboratory conditions. Methods: The emotion recognition system consists of a camera mounted on spectacles, a tablet running facial emotion recognition software, and a waist belt with vibrotactile stimulators to provide haptic feedback representing Ekman’s six universal emotions. A total of 8 visually impaired persons (4 females and 4 males; mean age 46.75 years, age range 28-66 years) participated in two training sessions followed by one experimental session. During the experiment, participants engaged in two 15-minute conversations, in one of which they wore the emotion recognition system. To conclude the study, exit interviews were conducted to assess the experiences of the participants. Due to technical issues with the recording by the emotion recognition software, only 6 participants were included in the video analysis. Results: We found that participants were quickly able to learn, distinguish, and remember the vibrotactile signals associated with the six emotions. A total of 4 participants felt that they were able to use the vibrotactile signals in the conversation. Moreover, 5 out of the 6 participants had no difficulties in keeping the camera focused on the conversation partner. 
The emotion recognition software was highly accurate in detecting happiness but performed unsatisfactorily in recognizing the other five universal emotions. Conclusions: The system requires essential improvements in performance and wearability before it is ready to support visually impaired persons in their daily-life interactions. Nevertheless, the participants saw potential in the system as an assistive technology, provided their user requirements can be met.
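The processing loop described in the Methods (camera frame, then facial emotion recognition, then a vibrotactile cue on the belt) can be sketched as follows. This is a minimal, hardware-independent sketch: all names are hypothetical, and the actual FaceReader and belt APIs used in the study are not shown.

```python
# Minimal sketch of the loop described in the Methods: a camera frame is
# classified by emotion recognition software, and the resulting label is
# turned into a vibrotactile cue on the waist belt. The `classify` and
# `vibrate` callables are injected stand-ins for the real software and
# belt hardware, which are not modeled here.

EKMAN_EMOTIONS = ("happiness", "surprise", "anger", "sadness", "fear", "disgust")

def process_frame(frame, classify, vibrate):
    """Classify one frame and trigger the matching vibrotactile cue.

    `classify` maps a frame to one of the six emotions (or None when no
    face or no clear expression is found); `vibrate` drives the belt.
    Returns the detected emotion so callers can log it.
    """
    emotion = classify(frame)
    if emotion in EKMAN_EMOTIONS:
        vibrate(emotion)
    return emotion
```

Injecting `classify` and `vibrate` keeps the sketch testable without a camera or a belt; a real integration would wrap the recognition software and the belt driver behind these two callables.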
Schematic overview of the used system.
Emotion mapping. The mapping of Ekman's universal emotions on the waist band.
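A mapping like the one in the figure can be sketched as a simple lookup from emotion to tactor position on the belt. The study does not specify the actual layout, so the positions below (degrees around the waist) are purely illustrative assumptions.

```python
# Hypothetical mapping of Ekman's six universal emotions to tactor
# positions on a waist belt. The actual layout used in the study is not
# reproduced here; positions are illustrative only (degrees around the
# waist, 0 = front, increasing clockwise).
EMOTION_TO_TACTOR = {
    "happiness": 0,
    "surprise": 60,
    "anger": 120,
    "sadness": 180,
    "fear": 240,
    "disgust": 300,
}

def tactor_for(emotion: str) -> int:
    """Return the belt position (in degrees) for a detected emotion."""
    try:
        return EMOTION_TO_TACTOR[emotion.lower()]
    except KeyError:
        raise ValueError(f"unknown emotion: {emotion!r}")
```

Spacing the six tactors evenly, as assumed here, is one plausible design choice; it keeps adjacent cues maximally distinguishable around the waist.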
Crosstabs of agreement between coders and software. The table shows a tally of the number of times the coders and FaceReader classified a fragment as a particular emotion. The diagonal shows the number of times that the coders and FaceReader classified a fragment as the same emotion.
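From such a crosstab, the overall coder/software agreement is simply the diagonal total divided by the grand total. A minimal sketch, using illustrative counts rather than the study's actual data:

```python
# Sketch of how overall agreement can be computed from a crosstab like
# the one above: the diagonal holds fragments where the human coders and
# FaceReader assigned the same emotion. Counts below are illustrative,
# not taken from the paper.
def agreement(crosstab):
    """Proportion of fragments classified identically (diagonal / total)."""
    total = sum(sum(row) for row in crosstab)
    diagonal = sum(crosstab[i][i] for i in range(len(crosstab)))
    return diagonal / total if total else 0.0

# 3x3 illustrative crosstab (rows = coders, columns = FaceReader)
example = [
    [8, 1, 1],
    [2, 5, 3],
    [0, 2, 8],
]
# agreement(example) == (8 + 5 + 8) / 30 == 0.7
```

Raw agreement like this does not correct for chance; a chance-corrected statistic such as Cohen's kappa is the usual next step when comparing human coders against classification software.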
Figures uploaded by Hendrik Buimer.
2018, Article / Letter to editor (vol. 13, iss. 3, 2018). In face-to-face social interactions, blind and visually impaired persons (VIPs) lack access to nonverbal cues like facial expressions, body posture, and gestures, which may lead to impaired interpersonal communication. In this study, a wearable sensory substitution device (SSD) consisting of a head mounted camera and a haptic belt was evaluated to determine whether vibrotactile cues around the waist could be used to convey facial expressions to users and whether such a device is desired by VIPs for use in daily living situations. Ten VIPs (mean age: 38.8, SD: 14.4) and 10 sighted persons (SPs) (mean age: 44.5, SD: 19.6) participated in the study, in which validated sets of pictures, silent videos, and videos with audio of facial expressions were presented to the participant. A control measurement was first performed to determine how accurately participants could identify facial expressions while relying on their functional senses. After a short training, participants were asked to determine facial expressions while wearing the emotion feedback system. VIPs using the device showed significant improvements in their ability to determine which facial expressions were shown. A significant increase in accuracy of 44.4% was found across all types of stimuli when comparing the scores of the control (mean±SEM: 35.0±2.5%) and supported (mean±SEM: 79.4±2.1%) phases. The greatest improvements achieved with the support of the SSD were found for silent stimuli (68.3% for pictures and 50.8% for silent videos). SPs also showed consistent, though not statistically significant, improvements while supported. Overall, our study shows that vibrotactile cues are well suited to convey facial expressions to VIPs in real-time. Participants became skilled with the device after a short training session. Further testing and development of the SSD is required to improve its accuracy and aesthetics for potential daily use.
2016, Article in monograph or in proceedings (Project: Smart glasses for visually impaired persons, pp. 157-163). The rise of smart technologies has created new opportunities to support blind and visually impaired persons (VIPs). One of the biggest problems we identified in our previous research on the difficulties VIPs face during activities of daily life concerned the recognition of persons and their facial expressions. In this study we developed a system to detect faces, recognize their emotions, and provide vibrotactile feedback about the emotions expressed. The prototype system was tested to determine whether vibrotactile feedback through a haptic belt is capable of enhancing social interactions for VIPs. The system consisted of commercially available technologies: a Logitech C920 webcam mounted on a cap, a Microsoft Surface Pro 4 carried in a mesh backpack, an Elitac tactile belt worn around the waist, and the VicarVision FaceReader software application, which recognizes facial expressions. In preliminary tests of the system, both visually impaired and sighted persons were presented with sets of stimuli consisting of actors displaying six emotions (i.e., joy, surprise, anger, sadness, fear, and disgust) derived from the validated Amsterdam Dynamic Facial Expression Set and the Warsaw Set of Emotional Facial Expression Pictures, with matching audio in the form of nonlinguistic affect bursts. Subjects had to determine the emotions expressed in the videos without and, after a training period, with haptic feedback. An exit survey was conducted to gain insight into users' opinions on the perceived usefulness and benefits of the emotional feedback and their willingness to use the prototype as assistive technology in daily life. Haptic feedback about facial expressions may improve the ability of VIPs to determine emotions expressed by others and, as a result, increase the confidence of VIPs during social interactions. 
More studies are needed to determine whether this is a viable method to convey information and enhance social interactions in the daily life of VIPs.