Risks Of Bias In ‘Emotional AI’

Artificial intelligence (AI) has grown exponentially. From driverless vehicles to voice automation in the home, the practical applications of AI are rapidly expanding, and with them a growing interest in the potential biases of such algorithms. 

AI facial analysis is increasingly used to screen people. For example, some organisations ask candidates to answer predefined questions in a recorded video and use facial recognition to analyse the applicants’ faces. 
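To make the mechanics concrete, here is a minimal sketch of that kind of screening pipeline: sample frames from a recorded answer, detect faces, and tally the emotions a model predicts. The video filename and the classify_emotion() stub are illustrative assumptions, not any particular vendor’s product.

```python
import collections

import cv2  # pip install opencv-python

def classify_emotion(face_img) -> str:
    """Placeholder for a vendor emotion model: returns a label such as 'neutral'."""
    return "neutral"  # a real system would run a trained classifier on face_img

# OpenCV's bundled frontal-face detector; the video filename is hypothetical.
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
video = cv2.VideoCapture("interview_answer.mp4")

counts = collections.Counter()
frame_idx = 0
while True:
    ok, frame = video.read()
    if not ok:
        break
    if frame_idx % 30 == 0:  # sample roughly one frame per second at 30 fps
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
            counts[classify_emotion(gray[y:y + h, x:x + w])] += 1
    frame_idx += 1
video.release()

print(counts)  # the tally a screening tool would turn into a 'candidate score'
```

Everything downstream, such as a candidate score or ranking, rests on those tallied labels, which is why any systematic skew in the classifier propagates straight into hiring decisions.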

Furthermore, some companies are developing facial recognition software to scan faces in crowds and assess threats, specifically citing doubt and anger as emotions that indicate a threat.

A recent study shows evidence that facial recognition software interprets emotions differently based on the person’s race. 
Lisa Feldman Barrett, Professor of Psychology at Northeastern University, said that such technologies appear to disregard a growing body of evidence undermining the notion that basic facial expressions are universal across cultures. As a result, such technologies, some of which are already being deployed in real-world settings, risk being unreliable or discriminatory. 

Because of the subjective nature of emotions, emotional AI is especially prone to bias. For example, a study found that emotional analysis technology assigns more negative emotions to people of certain ethnicities than to others. 
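A disparity like this is usually surfaced by a simple disaggregated audit: score a labelled evaluation set with the model and compare how often each group is assigned a negative emotion. The sketch below assumes such a scored set already exists; the group names, labels and records are invented for illustration.

```python
from collections import defaultdict

NEGATIVE = {"anger", "contempt", "disgust", "fear", "sadness"}

# (group, predicted_emotion) pairs; in practice these come from a labelled
# evaluation set scored by the deployed model.
predictions = [
    ("group_a", "anger"), ("group_a", "neutral"), ("group_a", "contempt"),
    ("group_b", "neutral"), ("group_b", "happiness"), ("group_b", "neutral"),
]

totals, negatives = defaultdict(int), defaultdict(int)
for group, emotion in predictions:
    totals[group] += 1
    negatives[group] += emotion in NEGATIVE

for group in sorted(totals):
    rate = negatives[group] / totals[group]
    print(f"{group}: {rate:.0%} of predictions are negative emotions")
# A large gap between groups shown the same ground-truth expressions is the
# signature of the disparity reported in the study.
```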

Today, AI is not sophisticated enough to understand cultural differences in expressing and reading emotions, making it harder to draw accurate conclusions. For instance, a smile might mean one thing in Germany and another in Japan. Confusing these meanings can lead businesses to make wrong decisions. Consider the ramifications in the workplace, where an algorithm consistently identifying an individual as exhibiting negative emotions might affect employment prospects.

Conscious or unconscious emotional bias can perpetuate stereotypes and assumptions at an unprecedented scale.

The implications of algorithmic bias are a clear reminder that business and technology leaders must understand such biases and prevent them from seeping in. For example, employees often think they are in the right role, but on trying new projects they may find their skills are better suited elsewhere. Some organisations already allow employees to try different roles once a month to see which jobs they like most, and researchers argue that this is exactly where bias in AI could reinforce existing stereotypes. 

In the US, where nearly 90% of civil engineers and over 80% of first-line police and detectives are male, an algorithm that has been conditioned to analyse male features might struggle to read emotional responses and engagement levels among female recruits. This could lead to flawed role allocation and training decisions.
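One common way to counteract that kind of skew is inverse-frequency reweighting, so the under-represented group is not drowned out during training. The sketch below assumes, purely for illustration, a training set that mirrors the roughly 90/10 workforce split mentioned above; it shows the weighting arithmetic only, not any specific vendor’s pipeline.

```python
# Illustrative training-set composition mirroring the 90/10 split above.
counts = {"male": 900, "female": 100}
n_samples, n_groups = sum(counts.values()), len(counts)

# Inverse-frequency weights (the same idea behind scikit-learn's
# class_weight="balanced", applied here to demographic groups).
weights = {g: n_samples / (n_groups * c) for g, c in counts.items()}
print(weights)  # roughly {'male': 0.56, 'female': 5.0}

# Passed per-sample into the training loop, these weights stop errors on
# female faces being swamped by the majority group during optimisation.
```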

The future development of AI facial recognition tools must deal with some key challenges:

Creating products that adapt to consumer emotions:  With emotion tracking, product developers can learn which features elicit the most excitement and engagement in users. As these systems become more commonplace, insurance companies will want a piece of the data. That could mean higher premiums for older drivers, for instance, if the data suggests that, despite repeated prompts to rest, the driver pressed on. 

Improving tools to measure customer satisfaction:  Some companies are giving businesses tools to help their employees interact better with customers. These algorithms can not only identify “compassion fatigue” in customer service agents, but can also guide agents on how to respond to callers via an app. 

Transforming the learning experience:  Emotional insights could be used to augment the learning experience across all ages. It could, for example, allow teachers to design lessons that spur maximum engagement, putting key information at engagement peaks and switching content at troughs. It also offers insights into the students themselves, helping to identify who needs more attention. 
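As a rough illustration of that peak-and-trough idea, the sketch below smooths an invented per-minute engagement series for a lesson and reports where engagement peaks and dips; the scores themselves would come from whatever emotion-tracking system is in use.

```python
# One invented engagement score per minute of a lesson.
engagement = [0.55, 0.62, 0.71, 0.80, 0.74, 0.60, 0.48, 0.52, 0.66, 0.73]

def smooth(series, window=3):
    """Simple moving average to damp minute-to-minute noise."""
    half = window // 2
    return [
        sum(series[max(0, i - half): i + half + 1]) /
        len(series[max(0, i - half): i + half + 1])
        for i in range(len(series))
    ]

smoothed = smooth(engagement)
peak_minute = max(range(len(smoothed)), key=smoothed.__getitem__)
trough_minute = min(range(len(smoothed)), key=smoothed.__getitem__)
print(f"Place key information around minute {peak_minute}, "
      f"switch content around minute {trough_minute}")
```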

As more and more companies incorporate emotional AI into their operations and products, it is imperative that they are aware of the potential for bias to creep in and that they actively work to prevent it.

Emotional AI will be a very powerful tool, forcing businesses to reconsider their relationships with consumers and employees alike. It will not only offer new metrics to understand people, but will also redefine products as we know them. But as businesses expand into the world of emotional intelligence, the need to prevent biases from creeping in will be essential. Failure to act will leave certain groups systematically more misunderstood than ever, a far cry from the promises offered by emotional AI.
