
Monday, 29 June 2015

Digital Empathy: How technology will read emotions.

As humans we have evolved to convey our emotions to each other, both as a means of improving our social connections and as an aid to our survival. Even today we consciously and unconsciously contort, skew, manipulate and alter the muscles under our skin to give others clues to our emotions. If we can read these visual cues, and tease from them insight into the well-being of others, then can software read faces accurately enough to comprehend how the user is feeling?

Consider for a moment how much face time technology has access to during the course of a day. How many facial expressions do you display in front of the television, your computer, games console or mobile phone? How many times might you smile at a text message, grow frustrated with a tricky level of Candy Crush Saga, or laugh while watching comedy? Now consider which technologies allow the device to 'see' the user's face: mobile phones, tablets and some consoles already have cameras attached to them. Potentially our gadgets already have the tools to do this; they just need the means to understand what they see and to recognise which emotions are being displayed.

In a TED talk, Connected, but alone?, Sherry Turkle explains how software has been created to enable technology to achieve accurate facial recognition of emotion. Turkle describes how facial expressions are broken down into 45 segments called action units: the corner of one side of the mouth, for example, is one action unit. When a device reads these action units it combines them into a snapshot of an emotion called a data point, which is compared, through a complex algorithm, to a databank of over 12 million data points obtained from people of different ages, genders and ethnicities. By comparing what it can see to what it knows, it can tell the difference between even the subtlest of emotions (e.g. a smile versus a smirk) with an impressive level of accuracy, and it adds each new encounter to its databank to improve future learning. The software mimics, in a rudimentary way, how humans learn.
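The compare-then-learn loop described above can be sketched as a toy nearest-neighbour classifier. This is purely illustrative and assumes everything it contains: the action-unit vectors, labels and tiny databank are invented for the example, whereas a real system would use far richer features and millions of labelled data points.

```python
# Illustrative sketch only: a toy nearest-neighbour classifier over
# "action unit" intensity vectors. All names and data here are hypothetical;
# real emotion-recognition software is vastly more sophisticated.

import math

# Hypothetical databank: (action-unit intensities, emotion label).
# A real databank would hold millions of data points, not four.
DATABANK = [
    ((0.9, 0.8, 0.1), "smile"),
    ((0.9, 0.1, 0.2), "smirk"),   # one mouth corner raised, not both
    ((0.1, 0.1, 0.9), "frown"),
    ((0.2, 0.2, 0.1), "neutral"),
]

def distance(a, b):
    """Euclidean distance between two action-unit vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(data_point):
    """Label a new snapshot by its closest known data point."""
    return min(DATABANK, key=lambda entry: distance(data_point, entry[0]))[1]

def learn(data_point, label):
    """Add the new encounter to the databank to enhance future learning."""
    DATABANK.append((data_point, label))

observed = (0.85, 0.15, 0.2)       # one mouth corner raised strongly
print(classify(observed))          # prints "smirk"
learn(observed, classify(observed))
```

Comparing "what it can see to what it knows" here is just a distance calculation; the subtlety of telling a smile from a smirk comes entirely from how good the databank is, which is why the learning step matters.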

The practicalities of being able to read emotions would be life-changing in the field of special educational needs and disability (SEND). Someone with autism who struggles to recognise facial expressions could perhaps wear a technology such as Google Glass, which would not only read the facial expression of whomever the user is looking at, but also tell the user, in the display, what that person was feeling. Likewise, it could help visually impaired or blind people to understand the emotions of others through audible feedback; combined with the tone and delivery of speech, this would give a more accurate understanding of the speaker's emotion.
Additionally, emotion recognition could be used in education. If a student reading or interacting with a website or software became confused, the technology could adapt itself to aid understanding by slowing the pace of the learning, repeating a section or explaining the content in a different way. Alternatively, it could be used to build a profile of how best to teach a child, logging which programs provoke happier responses, free from expressions of puzzlement.
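The adaptive lesson idea could look something like the loop below. Everything in it is an assumption for illustration: the emotion labels, the pace scale and the adaptation strategies are invented, not drawn from any real product.

```python
# Illustrative sketch only: a hypothetical lesson that adapts to the
# learner's detected emotion. Labels and strategies are assumptions.

def adapt_lesson(detected_emotion, pace, strategy):
    """Return an adjusted (pace, strategy) for the next lesson segment."""
    if detected_emotion == "confused":
        # Slow down (never below 1) and try a different explanation.
        return max(pace - 1, 1), "alternative_explanation"
    if detected_emotion == "bored":
        # Speed up but keep the current teaching approach.
        return pace + 1, strategy
    # Happy or neutral: carry on as before.
    return pace, strategy

pace, strategy = adapt_lesson("confused", pace=3, strategy="standard")
print(pace, strategy)  # prints "2 alternative_explanation"
```

Logged over time, the emotion labels fed into a loop like this are exactly the "profile of how best to teach a child" the paragraph above imagines.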

However, do we really want this invasion of our intimate feelings? All of us, to some degree, use technology to create a different version of ourselves when communicating or existing in virtual form. We sometimes send a text message because we do not want someone else to know our true emotion, even going further in some situations by adding emoji or emoticons to disguise or exaggerate our true feelings. We post our happiest, best photos to social media because this is how we wish to be thought of, even though we experience low points and struggles in our lives. We want control over what others perceive of us, beyond the control we can exercise in a one-to-one situation with another person.

Additionally, our reactions are being recorded, and can be recorded in conjunction with what we are viewing. The potential for using this data in marketing is massive. What if, instead of regular spam or junk mail (which you delete), you received focused emails and adverts optimised to provoke certain emotions in you? What if an advert for a product could be tailor-made to make you, as an individual, happy? Will our ability to make conscious choices erode (beyond the extent it already has) under the powerful new psychological tools available to advertisers? Many of us wilfully sign away our rights to privacy by agreeing to the terms and conditions of social media websites, apps or even loan applications, with the benefit being free use of the service. But if you have to trade something for it, then the service is not free. The question we will face in the future is: will the services available to us be worth the information we provide to companies, regardless of their altruistic or profiteering tendencies? Or, more simply put: how much are our emotions worth?


Image of digital eye from wallpapertube.com
