When customers visit your business’s website to resolve issues, they want to be greeted by a digital colleague that can comprehend the tone of their messages. The same goes for employees interacting with digital colleagues to solve back-office issues. Whether a customer is panicking over possible identity theft or an employee is pleased to be logging paid time off, a digital colleague should greet users in a manner that demonstrates emotional intelligence.

If a digital colleague fails to comprehend the emotion behind a user’s statements, the user will most likely ask to be transferred to a human agent, which is a waste of time and resources. After all, what good is a technology that no one wants to use?

In this post, we’ll examine the many ways in which our digital colleague Amelia interprets emotion and responds in a way that matches a user’s tone. We’ll explore how we test and train Amelia to handle differences in language and attitude, and what we do to ensure that she always shows the proper empathy and respect.

Emotional Training

Amelia uses Natural Language Processing (NLP) to understand users and respond in coherent, human-like sentences. This allows her to parse a user’s sentence for the meaningful terms and phrases she’ll use to prepare her response. If you were to say, “I bought a car yesterday,” Amelia would identify “car” as the noun and “bought” as the verb.
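To make the idea concrete, here is a minimal sketch of that kind of part-of-speech analysis using the open-source spaCy library. It is purely illustrative and is not Amelia’s own NLP engine.

```python
# Illustrative only: part-of-speech tagging with spaCy, not Amelia's NLP engine.
import spacy

nlp = spacy.load("en_core_web_sm")   # small English pipeline
doc = nlp("I bought a car yesterday")

for token in doc:
    print(token.text, token.pos_)
# Output includes "bought VERB" and "car NOUN" -- the meaningful terms
# a dialogue system would pick out when preparing a response.
```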

Amelia also uses neural network algorithms to detect intent. If a customer says, “I lost my credit card yesterday,” Amelia will recall her training for credit card replacements. She’ll know that in the case of a lost credit card, the customer’s intent is typically to deactivate the missing card, have a new card issued and resolve any disputed charges. She extracts the basic facts (who, what and when) and determines that the customer lost their card yesterday.
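As a rough sketch of how intent detection can work in general (Amelia’s production models are more sophisticated), a small text classifier can map utterances to intents. The intent labels and training phrases below are hypothetical.

```python
# Hypothetical sketch of intent classification with a small neural network;
# intent labels and training utterances are invented for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline

training_data = [
    ("I lost my credit card yesterday", "replace_card"),
    ("My card is missing", "replace_card"),
    ("There are charges I don't recognize", "dispute_charges"),
    ("I'd like to log paid time off", "log_pto"),
]
texts, intents = zip(*training_data)

model = make_pipeline(TfidfVectorizer(), MLPClassifier(max_iter=1000))
model.fit(texts, intents)

print(model.predict(["I can't find my credit card"]))  # likely ['replace_card']
```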

While she’s processing intent, she’s also figuring out an appropriate emotional response. If you say, “I lost my credit card and I see fraudulent charges,” Amelia doesn’t respond by saying, “That’s great! Let’s get started.” She’ll respond by saying, “That’s terrible. Where did you last use it?” She can also use social talk to help make conversations seem more natural. Instead of saying, “That’s terrible,” she can be programmed to say, “That’s a bummer.”
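The sketch below shows one simple way sentiment can steer response selection, using NLTK’s off-the-shelf VADER analyzer rather than Amelia’s own emotion models; the canned responses and thresholds are assumptions for illustration.

```python
# Hypothetical sketch of sentiment-aware response selection, using NLTK's
# VADER analyzer instead of Amelia's own emotion models.
from nltk.sentiment import SentimentIntensityAnalyzer  # needs nltk.download("vader_lexicon")

EMPATHETIC = "That's terrible. Where did you last use it?"
UPBEAT = "That's great! Let's get started."
NEUTRAL = "Okay. Let's get started."

def choose_response(utterance: str) -> str:
    score = SentimentIntensityAnalyzer().polarity_scores(utterance)["compound"]
    if score <= -0.3:    # clearly negative: lead with empathy
        return EMPATHETIC
    if score >= 0.3:     # clearly positive: match the upbeat tone
        return UPBEAT
    return NEUTRAL

print(choose_response("I lost my credit card and I see fraudulent charges"))
```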

To ensure appropriate responses, your team works with our Cognitive Implementation Engineers to prepare Amelia for things your customers or employees might say or do. What triggers might anger customers? Is Amelia dealing with acceptances and rejections? Will she be handling fraud-related issues?

Once you have established a comprehensive list of triggers, you can work with IPsoft to define appropriate emotional responses. Should Amelia remain firm when telling a customer that he has been rejected for a policy? Should she take a more understanding and empathetic approach and offer alternative plans? Should she escalate angry customers to human agents? Our team works through every scenario to help determine how Amelia should use emotional intelligence to deliver the best and most human service experience.
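In practice, that preparation can be thought of as a mapping from triggers to response policies. The sketch below is a hypothetical example of such a mapping; the trigger names and policy fields are invented for illustration and are not taken from Amelia’s configuration.

```python
# Hypothetical trigger-to-policy mapping; names and fields are illustrative only.
RESPONSE_POLICIES = {
    "policy_rejected": {"tone": "firm_but_understanding", "offer_alternatives": True,  "escalate": False},
    "suspected_fraud": {"tone": "calm_professional",      "offer_alternatives": False, "escalate": False},
    "customer_angry":  {"tone": "empathetic",             "offer_alternatives": False, "escalate": True},
}

DEFAULT_POLICY = {"tone": "neutral_professional", "offer_alternatives": False, "escalate": False}

def policy_for(trigger: str) -> dict:
    # Unmapped triggers fall back to a neutral, professional policy.
    return RESPONSE_POLICIES.get(trigger, DEFAULT_POLICY)

print(policy_for("customer_angry"))  # {'tone': 'empathetic', ..., 'escalate': True}
```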

Why Emotional Intelligence Provides Business Value

When Amelia is working with customers or coworkers, she may sense frustration, anger, or even sadness. In these moments, she calibrates her tone and phrasing to be considerate of how her counterpart is feeling. With this emotional intelligence, she can build closer connections between a company and its customers and employees.

The language Amelia uses to interact with customers can vary from system to system and even from project to project. She can use slang, humor and even sarcasm, or she can speak in direct, professional terms. If you’re designing your system to serve as a hub where people interact with your products and services and spend significant amounts of time, you’ll want her to be more conversational and affable.

On the other hand, some business users want direct information delivered instantly, and without room for interpretation. You wouldn’t want Amelia to speak sarcastically and confuse a user who speaks English as a second language.

A similar stoicism is required when Amelia deals with security and fraud. Unlike chatbots, which respond only to keywords, Amelia is not rattled or distracted when customers take an aggressive tone. Imagine, for example, a caller with a screaming baby in the background who demands access to a banking application despite not having the proper credentials. The chaos of a crying child and an angry customer might fluster a human agent, but Amelia sticks to the process and always conducts herself professionally.

Think about how a chatbot would respond in a similar scenario. A customer interacting with a chatbot is angry because they aren’t receiving the information they need. The customer sarcastically screams, “Hello?! Are you even listening to me?” A chatbot will respond to that question literally, so the customer stays angry and the chatbot is no closer to fulfilling the customer’s needs.

Amelia never takes a day off, never takes breaks and can handle thousands of conversations simultaneously. We’ve trained her to offer best-in-class emotional intelligence.

Download Conversational AI for Customer Service now!