There’s a reason that this blog post is not written in charts and graphs. Sure, a visual can be persuasive or informative on its own, but it is more so when supported by words, preferably spoken words. For all the illustrative power of your BI dashboard, your dashboard’s sales rep does not communicate with you in charts and graphs; he emails and calls you. Charts and graphs can’t express value propositions, establish thought leadership, or give a compliment. Our most important communications—our most human communications—take place in natural conversation.
Conversation has been the default mode of communication for millennia. But such is the modern necessity of technology that we have been willing to learn complex computing languages, adapt to various user interfaces, master the interpretation of specialized visuals, and tether ourselves to screens and keyboards for hours at a time—all in order to capture the knowledge contained in the machine, and to figure out how to tell the machine what to do for us.
What if instead of fingers on the keyboard and eyes on the screen we could simply have a natural conversation? Not a command-response “conversation” of the kind we are accustomed to having with voice platforms such as Alexa, but a true multi-turn, two-way conversation in which Alexa is not merely responsive to simple questions but is spontaneously analytical—a subject matter expert in your business and field. What if Alexa could bring to your attention critical insights discovered within the data underlying your BI dashboards, so that you could make better, faster business decisions?
To build a conversational AI system for Alexa or any voice platform, you need to be able to turn your data into words instantly. To do that, you need true Natural Language Generation technology, with built-in analytical and linguistic capabilities.
This is part two of a three-part series. Be sure to read parts one and three as well.