FraunhoferIIS/YouTube

This Google Glass app uses facial expressions to tell how you're feeling

SHORE analyses facial expressions in real-time to determine the mood a person is in, as well as their gender and age.

WHEN TALKING ABOUT Google Glass, the main concern that is brought up is privacy.

The idea that someone wearing them is recording your every move is often cited, and while the reality isn’t quite like that, the concern is a real one.

Such fears may not be quelled by the unveiling of a new app that uses facial features to determine a person’s mood.

The SHORE (Sophisticated High-speed Object Recognition Engine) Human Emotion Detector, which was created by Fraunhofer IIS, uses facial expressions to determine the mood a person is in as well as their gender and age.

It analyses emotions like happiness, sadness, anger and surprise and displays this information on screen.

The software was ‘trained’ on a database of more than 10,000 annotated faces; combining that data with learning algorithms gives it high recognition rates.
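The approach described, learning from a large set of annotated face images, is standard supervised classification. SHORE itself is proprietary, so as a loose illustration only, here is a toy sketch in Python using scikit-learn: the feature vectors, emotion labels and cluster structure are all synthetic stand-ins, not Fraunhofer's actual data or method.

```python
# Toy sketch of training an expression classifier on annotated faces.
# All data here is synthetic; SHORE's real features and model are not public.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
EMOTIONS = ["happy", "sad", "angry", "surprised"]

# Pretend each annotated face has been reduced to a 16-dimensional feature
# vector (e.g. geometric measurements of the mouth, eyes and brows).
# Each emotion class is clustered around a different centre so the toy
# problem is learnable.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(50, 16))
               for i in range(len(EMOTIONS))])
y = np.repeat(np.arange(len(EMOTIONS)), 50)

clf = SVC().fit(X, y)

# Classify a new face whose features sit near the "happy" cluster centre.
new_face = np.full(16, 0.1)
print(EMOTIONS[clf.predict([new_face])[0]])
```

The more annotated examples such a model sees, the better it generalises, which is why a 10,000-face training database matters.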

However, the makers are keen to stress that it’s not able to determine a person’s identity. The software only analyses emotions and none of the images or information displayed leave the device.

The makers see a number of applications for the technology, such as helping the visually impaired or people with autism spectrum disorder (ASD).


