The next frontier after AI (Artificial Intelligence) as we know it is teaching machines to sense, feel, and respond to human emotions, or what is broadly described as Emotional Intelligence.
Few will dispute the value of AI in simplifying the healthcare experience. But does the humanization of healthcare become the sacrificial lamb of transactional AI? The recent deactivation of Pepper and decommissioning of Fabio are fitting examples of how difficult it is to program empathy into robots, however humanoid. Pepper was tested across a range of roles, from supporting autistic students to senior companion care, but in each case the cost-benefit analysis didn't pan out in Pepper's favor; the production pause ultimately came down to cost relative to value. The Fabios of the AI world, on the other hand, struggled to translate empathic gestures into the right context, and the "creep" factor was rampant. So much so that in the UK, where Fabio was first tested as a greeter at an upscale wine store, customers reportedly went out of their way to hide from the robot!
The exciting (some say frustrating) thing about emotional, or "affective," machine learning is that machines are being built with remarkable capabilities to analyze and continuously monitor our hidden emotional responses, not just our behavior and purchase patterns. Whether we like it or loathe it, we are now hooked on our devices. In the new virtual, always-on world, it is hard to completely disconnect from smartphones, computers, TVs, livestreams, and even in-clinic or in-store cameras as we embrace the virtual normal.
These ubiquitous devices are continuously recording our smiles and frowns and mapping our innermost emotions. In healthcare, this data, if captured and used appropriately, can enable us to better understand the consumer mindset and sharpen real-time understanding of consumer pain points. AI can help us decode body language, for instance, to gauge whether a consumer has fully understood their follow-up instructions in a doctor's office; it can also help identify which patients are most receptive to follow-up engagement. This avoids wasted outreach to consumers who are not ready to engage and enriches the customer connection with those who are.
New technologies, collectively referred to as "Emotional AI," form part of the broader field of AI. Emotional AI now refers to the many ways in which machines can interpret our thoughts and emotions and assign values to a smile, a frown, or a perplexed eyebrow. Machines can already analyze mountains of data within seconds. Over the last two decades, they have been taught to read emotions and images and to correlate them with positive experiences (like brand fulfillment) or negative ones (like fear, stress, disappointment, or anger).
In our business as healthcare professionals, we know that AI and EI are already in use. They are valuable tools for healthcare research because they can correlate subconscious reactions with actual purchase behavior, satisfaction, and, in the future, even net promoter scores, or at the very least likelihood to recommend. At call centers, emotional AI can give nurses and benefits navigators useful feedback on the customer's mindset. These mood indicators, so to speak, can help nurses and navigators adjust their tone and pitch to best suit the consumer calling in. As we marry emotional AI data with voice analytics software, we get a blueprint for adjusting healthcare product capabilities, modifying delivery, and enhancing customer satisfaction in real time.
Emotional AI has broad applications across mental health, remote monitoring (through voice and other biometrics, e.g. blood pressure and heart rate), and telehealth. In mental health, for example, emotional AI can help decode and predict varying degrees of patient depression. Smart cars will soon be alert to a driver's despondent mood or tiredness. Many people living with autism cannot easily tell you what they are feeling, but emotional AI can pick up facial expressions or elevated pulse rates and build an emotional profile of the person's state of health and mind. In workplaces that still require shift work and long hours on the floor, like manufacturing and retail, healthcare companies deploying emotional AI can help employers track employee frustration or dwindling motivation, offer proactive employee assistance programs (EAPs), and connect employees to counseling and guidance.
The way to think of this brave new world is probably not as intimidation but as beneficial intermediation: not machines controlling humans, but humans becoming better at being human, and healthcare professionals being equipped to drive focused initiatives specific to each industry and outcome.
This is yet another call to those who can harness EI, with a dash of what I call "Covidacity": embracing positivity and big, bold goals despite the pandemic.