To meet increasingly elevated customer expectations, organizations are implementing detailed strategies for distributing customer experience (CX) practices across the organization. This includes defining and standardizing the “customer journey” across different channels in order to strengthen their brand, increase customer loyalty, reduce costs, make better use of customer feedback, and so forth. Organizations are also investing in leading technologies designed to enhance CX, regardless of the channel through which customers choose to engage with them. While most agree that providing a great customer experience is essential for any organization seeking to remain competitive today, actually implementing CX practices and technologies can be difficult. Consequently, Cutter Consortium is conducting a survey to gain insight into how organizations are adopting CX management practices and technologies and what they see as the possible impacts on their businesses.
Smartbots and Intelligent Assistants
Approximately a quarter of surveyed organizations are looking into using smartbots and intelligent assistants for their CX initiatives. Intelligent agents and smartbots enable customers to conduct common interactions conversationally via natural language speech or text interfaces. These include standalone mobile apps as well as apps designed to function within popular social messaging platforms.
Consumer voice-driven assistants — such as Apple Siri, Google Assistant, and Microsoft Cortana, and their hardware-based cousins, Amazon Alexa, Google Home, and Apple HomePod — are also increasingly becoming channels for organizations to engage with consumers. Consequently, more companies are developing applications for them.
A key trend surfacing today is that enterprise software and platform providers, such as IBM, Oracle, Salesforce, and SAP, are incorporating bot-building capabilities into their respective offerings, allowing customers to deploy smartbots and intelligent agents for customer engagement and CX. We are also seeing advanced chat offerings in the form of services based on AI platforms from providers such as IBM (Watson) and Microsoft (Azure/Cognitive Services/Azure ML). Industry-specific CX management solutions also make use of smartbots and virtual assistants.
That said, there are issues that organizations should take into account when considering the use of AI-powered digital assistants, smartbots, and intelligent agents. The chief issue is that their application is still constrained by the current limitations of natural language processing (NLP) and natural language understanding (NLU) technology, as this AI developer who has been working on applying the technology in enterprise scenarios explains in depth:
Voice and speech interfaces and intelligent agent technology are certainly an evolving area. And currently, limited-vocabulary applications offer the lowest error rates. So you want to focus on applications where you can limit the size of the vocabulary required, because that's where you get applications that perform with the fewest errors.
This is even more important if you hook them up to a decision-making system, like a Q&A system (and usually a voice/speech UI and agents hook into complex enterprise systems). You always want to design the application to use the most limited vocabulary possible; we simply don't have a wide-ranging, general language understanding problem solver. The [end-user] satisfaction comes from having a narrow domain and a limited or constrained vocabulary — situations where you have patterns commonly used in questions in specific domains, like with banks: balances — did the check go through? Transfers and deposits — money x to money y, and so on.
Take Siri on iOS. It's getting more sophisticated. But it still operates with about 30 vocabulary words and their usage rules. There's no real script or depth to the conversation. And so the STT [speech-to-text] that you find in the document entry is not hooked up to any commands. It's just a straight STT translator.
It is uncertain how long it will be before speech recognition becomes accurate and capable enough to apply to situations requiring more complex conversations. A big project by DARPA [Defense Advanced Research Projects Agency] is focusing on common sense reasoning AI. And that's what we actually need to hook it [speech recognition/smartbots/agents] up to in order to get that last 20% error rate down. So that's kind of where we're at: we've got trouble with understanding enough meaning to discriminate among a wider vocabulary.
How fast will DARPA [and the organizations participating in the project] turn its research into tangible results? I don't know. I do know that more and more people are using the Semantic Web to help give meaning to a word. For example, you might have the word “tall” and a person might use it in a sentence: “Mary is tall.” But “tall” is relative and fuzzy because we don't really know what the set of “tall” things is. So you apply some common sense reasoning to place “tall” into some kind of context. The Semantic Web has to do with that deeper meaning; it's the same technology that underlies broader accuracy in NLU systems. And these capabilities are at the edge of where AI has been developed for commercial use so far.
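The constrained-vocabulary approach the developer describes can be sketched in a few lines. The following Python example is a minimal illustration, not any vendor's API: the intent names, patterns, and matching strategy are all hypothetical, chosen only to show why a narrow banking domain with a handful of common question patterns keeps the error rate manageable in a way an open-domain assistant cannot.

```python
import re

# Each intent is defined by a few keyword patterns drawn from the small
# vocabulary customers actually use in this narrow banking domain.
# These intents and patterns are illustrative assumptions.
INTENTS = {
    "check_balance": [r"\bbalance\b", r"\bhow much\b.*\baccount\b"],
    "check_cleared": [r"\bcheck\b.*\b(go through|clear(ed)?)\b"],
    "transfer":      [r"\btransfer\b", r"\bmove\b.*\bto\b"],
    "deposit":       [r"\bdeposit\b"],
}

def classify(utterance: str) -> str:
    """Return the first intent whose pattern matches, else 'fallback'.

    Because the vocabulary is constrained to a handful of banking
    phrases, simple pattern matching works; anything outside the
    domain falls through to a human agent or a re-prompt.
    """
    text = utterance.lower()
    for intent, patterns in INTENTS.items():
        if any(re.search(p, text) for p in patterns):
            return intent
    return "fallback"

print(classify("Did my check go through yet?"))   # check_cleared
print(classify("Transfer $200 to my savings"))    # transfer
print(classify("Tell me a joke"))                 # fallback
```

Production bot platforms replace the regular expressions with trained intent classifiers, but the design lesson is the same one the interviewee draws: the narrower the domain and vocabulary, the fewer the errors.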
[For more from the author on this topic, see “CX Management in the Enterprise, Part VII: More Leading CX Technologies.”]