
This Google Glass app uses facial expressions to tell how you’re feeling

SHORE analyses facial expressions in real time to determine the mood a person is in, as well as their gender and age.

Image: Fraunhofer IIS/YouTube

WHEN TALKING ABOUT Google Glass, the concern that comes up most often is privacy.

The idea that someone wearing the device is recording your every move is often cited, and while the reality isn’t quite like that, the concern is a real one.

Such fears may not be quelled by the unveiling of a new app that uses facial features to determine a person’s mood.

The SHORE (Sophisticated High-speed Object Recognition Engine) Human Emotion Detector, which was created by Fraunhofer IIS, uses facial expressions to determine the mood a person is in as well as their gender and age.

It analyses emotions such as happiness, sadness, anger and surprise, and displays this information on screen.

The software was ‘trained’ on a database of more than 10,000 annotated faces; combined with learning algorithms, this gives it high recognition rates.
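SHORE’s internal pipeline isn’t public, but the general approach of training a classifier on annotated face images can be sketched. The Python example below is purely illustrative: it uses scikit-learn’s built-in Olivetti face dataset, whose identity labels stand in for the emotion, gender and age annotations a system like SHORE would actually use.

```python
# Illustrative only: SHORE's internal pipeline is not public. This sketch shows
# the general idea of training a classifier on annotated face images; the
# Olivetti identity labels stand in for emotion/gender/age annotations.
import numpy as np
from sklearn.datasets import fetch_olivetti_faces
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC
from skimage.feature import hog

faces = fetch_olivetti_faces()           # 400 greyscale 64x64 face images
X_img, y = faces.images, faces.target    # y: per-image annotation labels

# Turn each face image into a feature vector (histogram of oriented gradients).
X = np.array([hog(img, orientations=8, pixels_per_cell=(8, 8),
                  cells_per_block=(2, 2)) for img in X_img])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

clf = LinearSVC(max_iter=5000)           # simple linear classifier
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```

In a deployed system like SHORE, the same train-then-classify idea runs against live camera frames, with the predicted label overlaid on screen rather than printed.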

However, the makers are keen to stress that it’s not able to determine a person’s identity. The software only analyses emotions, and none of the images or information displayed leaves the device.

The makers see a number of applications for the technology, such as helping the visually impaired or those with autism spectrum disorder (ASD).


Read: Everything you ever wanted to know about Google Glass (but were afraid to ask) >

Read: Should you be worried about whether your cloud data is safe? >
