In the light of your experience what are the technological trends and challenges you’ve witnessed in the Virtual Assistance and chat-bots space?
After a year of huge overhype, bots are finally starting to find their place in the care ecosystem. Clients that moved fast, broke stuff, and learned lessons about what works and what doesn’t are the best positioned to drive value from the initiative. Fence sitters who didn’t invest last year are finally starting to show some interest.
On the platform side, bot providers are finally starting to put their functionality where their marketing is - meaning the promises they made to win business are starting to manifest in some more mature functionality. This is particularly true for bot building platforms which allow business users to configure and assemble bots without relying solely on developers.
New business models are starting to emerge including pay per performance models that shift the accountability from (mostly internal) IT groups to external chatbot partners. This creates the right incentive model and should drive better agility and focus on continuous improvement/optimization.
And finally, while there is some slight differentiation in the evolution of the underlying technology - AI/NLU/dialog management - there isn’t one clear winner, even if Watson continues to outpace the competition in mindshare.
What in your opinion is the right strategy for enterprises to leverage chat-bots? What should be the points of considerations, the dos and don’ts?
If brands haven’t already started talking about where a chat-bot might fit in their overall strategy, they ought to get moving. There are many things to work through - including who ultimately owns bot strategy. The answer today isn’t clear: at some brands marketing teams are driving, at others it’s customer care, and at still others IT teams are trying to satisfy all parties.
Assuming brands have a natural place for a bot to interact (many don’t!), like live chat, messaging, in-app, or increasingly through voice-enabled interaction, a great first step is to define a base set of use cases that lend themselves to automation. Bots do well today with highly transactional customer interactions. Pick a small set of those across the end-to-end (acquisition to retention) customer relationship and validate them against your existing internal architecture. From there, it’s develop fast, test, measure, and optimize for evermore. If your bot is performing well against one use case, look for the next best one and repeat the process.
Don’ts:
• Try to boil the ocean in the beginning
• Work in a bubble - a bot is a collective endeavor, much like your website or your social media program
• Oversell the capability or the promise of a human-less future to the organization until you’ve had a chance to meaningfully test in production and understand the economics
• Be naive to the risks/rewards of an automation strategy
Dos:
• Start small and move fast
• Benchmark the use cases you are hoping to improve before you launch your bot
• A/B test interactions with and without the bot in the mix
• Make sure your chat/messaging/service agents know what the bot is designed for and choreograph the handoffs from bot to people
• Find an external partner to help facilitate consensus. Make sure they are directly accountable for success
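The choreographed bot-to-agent handoff mentioned above can be sketched as a simple confidence-gated router. This is a minimal illustration, not any specific platform’s API; the function names and the 0.7 threshold are assumptions for the sketch:

```python
# Minimal sketch of a bot-to-human handoff, assuming the NLU engine
# returns an (intent, confidence) pair. Names and the 0.7 threshold
# are illustrative, not taken from any specific platform.

HANDOFF_THRESHOLD = 0.7  # below this confidence, route to a live agent

def route(utterance, nlu, bot_handlers):
    """Send the utterance to a bot handler or escalate to a human agent."""
    intent, confidence = nlu(utterance)
    if confidence < HANDOFF_THRESHOLD or intent not in bot_handlers:
        # Choreographed handoff: pass the utterance along so the
        # agent has context and the customer doesn't repeat themselves.
        return {"handler": "human_agent", "context": utterance}
    return {"handler": intent, "response": bot_handlers[intent](utterance)}
```

Benchmarking resolution rates with and without this router in the mix is one way to run the A/B test suggested above.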
Could you elaborate on some interesting project/initiatives that you’re currently overseeing?
We are engaged in many different chat-bot and bot-related initiatives across both support and marketing/acquisition use cases. One of our most interesting initiatives today is our work on our own IP called the Bot Trainer Platform.
Accuracy is the key driver of a good chatbot experience, and one of the biggest failings of chat-bots today is the degree to which NLU can accurately understand user input. Many solutions use leading open-source engines such as Stanford’s CoreNLP, which do an increasingly good job of understanding and parsing input across many languages. This works well for common phrases like “I need to pay my bill” or “my TV isn’t working!”
The challenge for brands is that those NLU data sets aren’t proprietary and don’t account for the years spent developing memorable product names, tones of voice, or unique brand attributes. So if a customer says “Can I get InstantInk for my OfficeJetPro?” an off-the-shelf NLU engine might not know what to do. Multiply that by the number of products and services a brand offers and it’s easy to see how an end customer might get frustrated.
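One common workaround is to layer a brand-specific gazetteer over the generic NLU engine, so proprietary terms resolve to entities the engine can work with. A minimal sketch, reusing the hypothetical product names from the example above (the gazetteer entries and labels are illustrative assumptions):

```python
# Sketch: normalize proprietary brand terms before handing the utterance
# to a generic NLU engine. Gazetteer entries and labels are illustrative.

BRAND_GAZETTEER = {
    "instantink": "ink_subscription_service",
    "officejetpro": "printer_product",
}

def tag_brand_entities(utterance):
    """Swap known brand terms for generic entity labels; return both
    the normalized utterance and the entities that were found."""
    entities = []
    normalized = []
    for token in utterance.lower().replace("?", "").split():
        if token in BRAND_GAZETTEER:
            entities.append((token, BRAND_GAZETTEER[token]))
            normalized.append(BRAND_GAZETTEER[token])
        else:
            normalized.append(token)
    return " ".join(normalized), entities
```

In practice this kind of lookup table is exactly the proprietary data set that, as discussed below, shouldn’t live solely in a developer’s hands.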
While brands are starting to address these challenges by developing better internal knowledge bases and metadata around their products, too often the catalog of products, services, and unique brand language lands in a developer’s hands for integration into the NLU data. It’s not a big stretch to say that developers aren’t necessarily the best curators or keepers of this data.
The Bot Trainer Platform was built for this dynamic; it turns the building of the proprietary NLU data set over to a brand’s front line agents. These are the same people who are taking calls about the very same issues and who typically go through regular and intensive training on a brand’s products and services. They are also trained in the language of the brand, so if a luxury retailer wants to refer to everyone as their “treasured friend” (I’m making that up), the agents are usually in the loop in real time.
Agents use the platform during their daily routine to review, qualify, or reclassify end-customer input and feed that new data back into the chat-bot. The result is an increasingly smart bot that learns to model the very people it was built to work alongside. That training pays off with higher accuracy, better resolution rates, and more satisfied end-customers.
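The review-and-reclassify loop described above can be sketched as a simple queue: low-confidence classifications are held for agent review, and the corrected labels become new training data. Everything here is an illustrative assumption; the Bot Trainer Platform’s actual internals aren’t public:

```python
# Sketch of a human-in-the-loop training loop, assuming the bot exposes
# a confidence score per classification. All names and the 0.8 threshold
# are illustrative, not the Bot Trainer Platform's real design.

REVIEW_THRESHOLD = 0.8

class TrainerQueue:
    def __init__(self):
        self.pending = []        # utterances awaiting agent review
        self.training_data = []  # (utterance, intent) pairs for retraining

    def observe(self, utterance, intent, confidence):
        """Called after each bot classification; uncertain ones go to agents."""
        if confidence < REVIEW_THRESHOLD:
            self.pending.append((utterance, intent))
        else:
            self.training_data.append((utterance, intent))

    def agent_review(self, corrected_intent):
        """An agent qualifies or reclassifies the oldest pending utterance,
        and the corrected example is fed back as training data."""
        utterance, _bot_guess = self.pending.pop(0)
        self.training_data.append((utterance, corrected_intent))
```

The key design point is that the people labeling the data are the same front-line agents already trained on the brand’s products and voice, not developers.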
How do you see the space evolving a few years from now with regard to disruptions and transformations?
We are still in the very early days of chat-bots, but there are already some interesting things to watch. Voice, as a share of user input, will grow very quickly, and brands that figure out how best to use that channel to serve customers and solve their queries will benefit. Data security will continue to grow in importance as public awareness increases through stories like Facebook’s current imbroglio. Platforms that achieve scale and adoption will start to squeeze out their peers, and eventually we’ll see consolidation, with bots mainlined into even larger platforms as core functionality.
Customers will be increasingly tolerant of bots in the mix and we’re already being trained through search queries and even Alexa requests to structure our language to be more consumable by bots.
No matter what the future holds for bots, we will always have humans in the mix. We need each other for high-emotion, high-touch, and high-value interactions. The choreography between the two - humans and bots - is the space to watch; when and where they make each other smarter, more contextually relevant, and more... human is ultimately where the true magic happens.