By Margaret Mayer, VP, Software Engineering
That’s why, when creating Eno, Capital One’s intelligent assistant, we decided to build our own natural language processing (NLP) technology. It was important to build Eno on a platform that deeply understands financial services terms, allowing us to deliver the best possible experience to our users. Capital One is committed to delivering a human-centered experience every time we interact with a customer, and building our own NLP system for Eno was the only way to ensure that. Thanks to advances in convolutional neural networks (CNN) and long short-term memory networks (LSTM), our team was able to take Eno’s NLP from concept to launch in under three months.
Eno uses a natural language processor to understand the different ways that customers text about their money. Want to check your balance? You can text “what’s my balance” or “what amount do I owe” or even just send a money bag emoji. Eno understands over 2,200 terms and emojis people use to ask about their account balances. Since Eno aims to treat customer interactions as conversations, not commands, you don’t need to respond “yes” or “no” to Eno’s questions, either. You can say “yep”, “yah”, “nah”, “ok”, send a thumbs up or thumbs down emoji, or type something like “um, wait, actually no”. Make a spelling mistake and type “yed” instead of “yes”? That’s OK, Eno knows your thumbs are fallible.
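To make the idea concrete, here is a toy sketch of mapping free-form confirmations to a yes/no decision. The lookup tables and function name below are hypothetical illustrations, not Eno’s internals — Eno’s real model covers far more phrasings than any hand-built list could:

```python
# Hypothetical lookup tables; a production system like Eno learns
# these surface forms from data rather than hard-coding them.
AFFIRM = {"yes", "yep", "yah", "ok", "sure", "👍"}
DENY = {"no", "nah", "nope", "👎"}

def parse_confirmation(text):
    """Map a free-form reply to True, False, or None (unclear)."""
    tokens = text.lower().replace(",", " ").split()
    # Scan from the end so a trailing correction wins:
    # "um, wait, actually no" resolves to False.
    for tok in reversed(tokens):
        if tok in DENY:
            return False
        if tok in AFFIRM:
            return True
    return None
```

Scanning the reply in reverse is one simple way to honor self-corrections mid-message, which the examples above show real customers actually make.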
Texting with Eno is simple, but, of course, the easiest-to-use technologies are the hardest to build. Creating the sophisticated NLP architecture to ensure Eno could understand nuances in human communication took an amazing amount of hard work, creativity, and collaboration. We built Eno using real customer conversations from chat logs. Our algorithm and platform leads worked closely with customer support teams to analyze hundreds of thousands of web chats between customer service representatives and real customers. Our machine learning engineers trained Eno on these conversations, building a rich and layered language model.
From a technical perspective, our engineers used a three-step training process to create Eno. First, they used an unsupervised machine learning technique to pre-train Eno to understand the meaning and similarities of words using existing chat log data. Second, they used a supervised machine learning model to train Eno to understand tens of thousands of utterances customers had made in the past — such as “activate”, “turn on card”, or “make my card work” when they wanted to activate a card — using only labeled data. Third, they applied additional supervised learning to new utterances so that Eno could understand terms customers had never used before.
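The three steps above might be sketched like this. This is a toy stand-in, not Eno’s architecture: hashed character trigrams play the role of the pre-trained embeddings, a nearest-centroid classifier plays the role of the supervised model, and all names and the tiny labeled set are invented for illustration:

```python
import math
from collections import defaultdict

DIM = 256  # hypothetical embedding dimensionality

def char_trigrams(text):
    """Character trigrams: a cheap stand-in for step 1, where
    unsupervised pre-training learns word meaning from chat logs."""
    t = f"  {text.lower()}  "
    return [t[i:i + 3] for i in range(len(t) - 2)]

def embed(text):
    """Hash trigrams into a fixed-size unit vector."""
    v = [0.0] * DIM
    for g in char_trigrams(text):
        v[hash(g) % DIM] += 1.0
    norm = math.sqrt(sum(x * x for x in v)) or 1.0
    return [x / norm for x in v]

def train_intents(labeled):
    """Step 2: supervised training - average the embeddings of
    labeled utterances into one centroid per intent."""
    sums = defaultdict(lambda: [0.0] * DIM)
    counts = defaultdict(int)
    for text, intent in labeled:
        for i, x in enumerate(embed(text)):
            sums[intent][i] += x
        counts[intent] += 1
    return {k: [x / counts[k] for x in v] for k, v in sums.items()}

def classify(text, centroids):
    """Step 3: generalize to utterances never seen in training by
    cosine similarity against each intent centroid."""
    e = embed(text)
    return max(centroids,
               key=lambda k: sum(x * y for x, y in zip(e, centroids[k])))

# Tiny invented training set echoing the examples in the text.
LABELED = [
    ("activate", "activate_card"),
    ("turn on card", "activate_card"),
    ("make my card work", "activate_card"),
    ("what's my balance", "check_balance"),
    ("what amount do I owe", "check_balance"),
    ("how much money do I have", "check_balance"),
]
centroids = train_intents(LABELED)
```

After training, `classify("activate my new card", centroids)` resolves to the activation intent even though that exact phrasing never appears in the labeled data — the same generalization property step three provides at far larger scale.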
Eno didn’t just learn to recognize dozens of synonyms, such as card, credit card, and account, or payment, bill, and balance, but also to recognize misspellings and abbreviations, such as pymt, acct #, blnc, etc. Of course, we had to ensure Eno understood hundreds of emojis because that’s simply the way people text today. Half of people who use Eno to pay their bills respond with a thumbs up emoji to confirm.
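One common way to handle misspellings and abbreviations like these — offered here as a general technique, not a description of Eno’s internals — is a normalization pass combining an abbreviation table with fuzzy string matching. The vocabulary and table below are tiny hypothetical examples:

```python
import difflib

# Hypothetical canonical vocabulary; Eno's real lexicon is far larger.
VOCAB = ["yes", "no", "balance", "payment", "account", "card", "activate"]

# Known shorthand maps straight to its canonical form.
ABBREVIATIONS = {"pymt": "payment", "acct": "account", "blnc": "balance"}

def normalize(token):
    """Resolve one token to a canonical vocabulary word if possible."""
    token = token.lower().strip("#?!.,")
    if token in ABBREVIATIONS:
        return ABBREVIATIONS[token]
    # Fuzzy-match misspellings like "yed" -> "yes".
    matches = difflib.get_close_matches(token, VOCAB, n=1, cutoff=0.66)
    return matches[0] if matches else token
```

The standard library’s `difflib.get_close_matches` is enough for a sketch; a production system would more likely learn these mappings from data, as the chat-log training described above suggests.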
Eno isn’t all work and no play. We’ve found many customers just like talking to Eno. They often send Eno personal questions or ask for its take on the meaning of life. A few have even sent Eno marriage proposals. If a customer asks Eno if it’s a boy or a girl, Eno responds: “I’m binary. I don’t mean I’m both, I’m actually just ones and zeros.” Eno makes jokes and responds to questions with real human-like messages. In fact, 14% of customer texts to Eno are completely unrelated to banking.
Of course, since Eno uses machine learning, it’s always improving and becoming more intelligent over time. Each customer interaction adds to Eno’s language database, and we’re continually learning new things about how customers interact with Eno. For example, we found some people didn’t respond right away when Eno asked them a question. Perhaps they got distracted by a crying baby or a phone call, so we built conversational context into Eno: interactions can be picked up where they left off within a several-hour window. Ghosting doesn’t offend Eno!
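A minimal sketch of that context window might look like the following. The class, the four-hour cutoff, and the message formats are all assumptions for illustration — the post only says the real window spans several hours:

```python
from datetime import datetime, timedelta

# Hypothetical resume window; the real value is "several hours".
CONTEXT_WINDOW = timedelta(hours=4)

class Conversation:
    """Remembers a pending question so a late reply still lands."""

    def __init__(self):
        self.pending_question = None
        self.asked_at = None

    def ask(self, question, now):
        self.pending_question = question
        self.asked_at = now

    def receive(self, reply, now):
        if (self.pending_question is not None
                and now - self.asked_at <= CONTEXT_WINDOW):
            # Resume the conversation where it left off.
            context, self.pending_question = self.pending_question, None
            return f"(answering '{context}') {reply}"
        # Outside the window: treat it as a fresh message.
        self.pending_question = None
        return reply
```

A reply two hours after the question is still matched to it; a reply six hours later starts fresh, so a distracted customer never resumes a stale thread by accident.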
People don’t talk like machines do. Human language is incredibly rich, complex, and varied. Computers will likely never replicate the nuanced way actual humans speak to one another, but our deep learning algorithms have helped us make great strides in bridging the human-machine divide. With Eno, we’ve worked hard to create a chatbot that goes above and beyond performing simple banking tasks. Texting with Eno is like texting with a friend. Customers can communicate naturally and conversationally — and don’t have to adapt to a machine’s way of talking.
Having led the team that helped to create the technology behind Eno, I’m well aware of the training that goes into creating an intelligent assistant. It’s an incredibly complex software system built by talented engineers using the latest machine learning methods. But Eno’s NLP system is so natural sounding that I sometimes find myself forgetting it’s just a bunch of binary code. Most recently, while on vacation in Argentina, I checked into a hotel by the Iguazu Falls, and Eno sent me an alert to make sure the charge was legitimate. I quickly texted Eno that the charge was good to go, the transaction went through, and I was able to get back to checking out the falls, knowing that Eno had my back. Ultimately, that’s what my team and I want to do: create technology solutions to make people’s lives easier.