
In August, Amazon updated its image-analysis service Amazon Rekognition, which detects and classifies emotional expressions in faces, so that it now recognises additional emotions, including fear.

This technology may prove a breakthrough in artificial intelligence (AI), particularly for brands that want a deeper read on how customers feel about their branded customer experience (CX).

But it raises the question: will it improve the customer journey, or is emotional data a dangerous game?

As Amazon chief executive Jeff Bezos has said: "The most important thing is to focus obsessively on the customer." But has this sentiment been taken a little too literally?

A recent research study on emotion-detection software examined the six emotion categories most commonly targeted by these systems: anger, disgust, happiness, sadness, surprise and fear. With its latest update, Rekognition now assesses all six.

According to the study, “technology companies are investing tremendous resources to figure out how to objectively ‘read’ emotions in people by detecting their presumed facial expressions, such as scowling faces, frowning faces, and smiling faces, in an automated fashion”.

So with researchers already questioning the accuracy of the emotion detectors in question, and raising some serious ethical and data-security concerns, Amazon may have left the door open for competitors offering buyers a more ethical alternative.

Why is Amazon Rekognition relevant to CX? 

Qualitative research is still largely a human-led discipline. Rekognition will challenge this, and Amazon will be able to scale this like no other provider has before.

Customer experience is about understanding people, and Rekognition will enable companies to deliver personalised experiences based on what their customers want, at a scale that was previously not commercially viable.

Beyond the augmented reality (AR) Snapchat and Instagram-style filters that can help with advertising and promotion, there is a deeper opportunity for facial recognition to evoke empathetic experiences that are powered by machines.

If machines can read people's emotions, they can be programmed to respond more empathetically. Right now, a machine can't tell whether we're happy or unhappy unless we tell it explicitly.

With facial recognition, we can detect signs of emotion to inform machines, which could do everything from sending the customer a nice SMS to cheer them up, to helping a brand understand the mental state it leaves its customers in.
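As a rough sketch of the pipeline described above: Rekognition's face-detection output includes, for each detected face, a list of emotion labels with confidence scores. The snippet below uses a hand-written sample response (not real API output) and a purely hypothetical follow-up rule to show how a brand might pick the dominant emotion and decide whether a customer needs cheering up.

```python
# Illustrative sketch only: the response shape mirrors Rekognition's
# detect_faces output, but the values here are invented for the example.
SAMPLE_RESPONSE = {
    "FaceDetails": [
        {
            "Emotions": [
                {"Type": "SAD", "Confidence": 81.2},
                {"Type": "CALM", "Confidence": 12.5},
                {"Type": "FEAR", "Confidence": 4.1},
            ]
        }
    ]
}

# Hypothetical grouping: which labels a brand might treat as "unhappy".
NEGATIVE_EMOTIONS = {"SAD", "ANGRY", "DISGUSTED", "FEAR"}


def dominant_emotion(response):
    """Return (label, confidence) for the strongest emotion on the first face."""
    faces = response.get("FaceDetails", [])
    if not faces:
        return None
    top = max(faces[0]["Emotions"], key=lambda e: e["Confidence"])
    return top["Type"], top["Confidence"]


def needs_follow_up(response, threshold=75.0):
    """True if a negative emotion dominates with high confidence --
    the point at which a brand might, say, send that cheering-up SMS."""
    result = dominant_emotion(response)
    if result is None:
        return False
    emotion, confidence = result
    return emotion in NEGATIVE_EMOTIONS and confidence >= threshold


print(dominant_emotion(SAMPLE_RESPONSE))  # ('SAD', 81.2)
print(needs_follow_up(SAMPLE_RESPONSE))   # True
```

The threshold and the set of "negative" labels are assumptions for illustration; in practice a brand would tune both, and the accuracy caveats raised by researchers above apply to the underlying labels themselves.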

The future of CX will likely involve machines reading people's faces, listening to their tone of voice, and possibly even reading their biological signals in order to deliver better experiences. The biggest obstacle? Strictly speaking, it's not the technology but privacy.

Your expected reaction 

In practice, real-time facial expression feedback during a movie could provide great insights to producers.

Mass-media conglomerate Disney, via its research studio, has already applied this kind of technology to predict how viewers will react to movies, assessing emotional responses with deep learning, computer vision and data analysis.

In early 2019 it presented "Accurate Markerless Jaw Tracking for Facial Performance Capture", billed as the first method to track jaw movement for performance capture without physical markers; literally jaw-dropping movie moments.

But what does this look like for customers on the receiving end? Data-privacy questions will of course surface, but history shows we keep handing over more and more data despite ethical concerns, so that's unlikely to change.

If people receive more personalised, relevant and enjoyable experiences as a result of facial recognition, then the technology will likely be valued despite current concerns.

What now? 

Ultimately, facial recognition could become a new frontier for brands if the technology is applied where it is genuinely relevant. There is a multitude of emotion-detection programs available, but as with any technology, brands shouldn't adopt it just because it's new.

Security needs to be paramount: companies must treat data collection and data protection as equally important. Ignoring security is almost guaranteed to land a brand in hot water, with cyber attacks all but inevitable for brands of sufficient size.

But if a company can show it manages data responsibly, it will be well placed to leverage facial recognition to its full potential.

This article originally appeared on SmartCompany and is republished here with permission.
