The trouble with emotion-reading AI
Briefly

"The AI revolution makes it possible to measure emotions and mental states. So why not use it widely and fix what's broken? That's the idea behind emotion AI, which is also called "affective computing," "sentiment analysis," or "algorithmic affect management." The idea is to use sensors and AI to detect, interpret, classify, and act upon human emotions in the workplace."
"Thanks to improvements and breakthroughs in a wide range of technologies (including computer vision, natural language processing, speech and voice analysis, biometrics, machine learning and deep learning, and edge computing hardware) emotion AI is now possible. Many companies have come forward to provide ready-to-use solutions for emotional AI apps, including Cogito, Affectiva, Hume AI, Entropik, and HireVue."
"The idea is simple: Collect data from employees, process it through AI, and get a result that shows how an employee feels. Depending on the solution, the data comes from: Vocal features - pitch, tone, cadence, micro-pauses, vocal stress; Facial expression - video analysis of video calls and through desktop cameras; Text - mass sentiment analysis on emails, Slack/Teams messages, survey responses, and performance reviews."
"Physiological biosignals - heart rate variability, galvanic skin response (via wearables); Behavioral telemetry - keystroke cadence, mouse dynamics, app-switching patterns; Posture and gaze - computer vision analysis from cameras installed in workplaces. Despite the progress and variety of solutions, this whole area is problematic for businesses."
Emotion AI, also called affective computing, sentiment analysis, or algorithmic affect management, uses sensors and AI to detect, interpret, classify, and act on human emotions in the workplace. It relies on advances in computer vision, natural language processing, speech and voice analysis, biometrics, machine learning, deep learning, and edge computing hardware. Companies offer ready-to-use solutions that collect employee data and process it to produce results about how employees feel. Data sources include vocal features, facial expressions from video calls and desktop cameras, text sentiment from emails and workplace messaging, physiological biosignals from wearables, behavioral telemetry such as keystroke and mouse dynamics, and posture and gaze from workplace cameras. Despite feasibility, the approach is described as problematic for businesses.
Read at Computerworld