I have been spending a lot of time with some of our customers lately, learning how they are listening to and engaging their customers. One thing that really sticks out in every discussion I have is how important it is that the information they get is accurate. Why? Because in all of our implementations, our customers are using their analysis to take action: to improve products, to prevent customer churn, to drive a marketing program, to do something that impacts the customer and the business.
It’s amazing to me how many companies miss this point during the buying process, only to find out later that they purchased a system that creates nice charts and graphs but doesn’t have the technology behind it to get to actionable information easily and accurately. In early markets, with limited analyst coverage and no “Magic Quadrant” or “Wave”, it’s buyer beware. The core technology used to understand customer conversations is text analytics. But don’t feel too comfortable when your Customer Experience Management vendor tells you they have a sophisticated text analytics engine, because, truth be told, these engines come in all shapes and sizes: from statistical word counting and targeted keyword lists to sophisticated natural language processing (NLP). When evaluating text analytics solutions, it’s very important not only to evaluate the output (dashboards, reports and exploration capabilities) but also to inspect the quality of the data being extracted. Vendors in our space, big and small, offer very different technologies.
Choose the One with Natural Language Technology That Is Actually Used in the Solution
Vendors that invest in NLP technology to model the structure of words and understand a full sentence can yield accuracy similar to that of a human. Statistical approaches (keywords, classifiers, etc.) are considerably less accurate because they typically require the user or implementer to create rules that define what data the system pulls out. In many cases our customers don’t know in advance what they are looking for, and because these rules force them to know, they inherently miss critical information. A couple of years ago it might have been okay to capture general themes and trends, but today companies see the value not only in listening to get a general understanding of the customer, but in listening to determine what to do to improve customer satisfaction, competitive position and sales. Correctly understanding every issue from every customer must be the goal for Customer Experience Managers, because the impact in today’s socially networked world is too great. We see the effect of not hearing the individual daily, from an airline that can’t hear a guitar man http://www.youtube.com/watch?v=5YGc4zOqozo to an auto manufacturer that can’t figure out whether the problem is a floor mat or electrical interference. If you use NLP technology that does not require you to write rules (and therefore to know everything you are looking for), you have the best chance of finding new issues and uncovering opportunities you never saw before.
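To make the failure mode concrete, here is a minimal Python sketch, not any vendor’s engine: the keyword lists and customer comments are hypothetical, but they show why a rule-driven approach can only surface themes someone thought to define in advance.

```python
# Hypothetical keyword rules an implementer might have written up front.
KEYWORD_RULES = {
    "billing": ["invoice", "charge", "bill"],
    "shipping": ["delivery", "shipping", "late package"],
}

def classify(feedback):
    """Return the predefined themes whose keywords appear in the text."""
    text = feedback.lower()
    return [theme for theme, words in KEYWORD_RULES.items()
            if any(w in text for w in words)]

comments = [
    "I was double charged on my last invoice.",
    "The new app update drains my battery overnight.",  # emerging issue
]
for c in comments:
    print(c, "->", classify(c) or ["<unclassified>"])
```

The first comment matches a rule; the second, an emerging issue nobody anticipated, falls through entirely. A rule-free NLP engine is pitched as avoiding exactly this blind spot.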
When evaluating vendors, do a simple test: ask all of us how our applications stand up to the accuracy question and how they enable users to find issues they weren’t aware of before. Do they parse sentences to find entities (people, companies, brands, etc.)? Do they relate those entities to sentiments, events and other issues, to intent (to purchase, to leave…), and to conditions (actions a customer would take if some condition existed)? If so, they have sophisticated NLP. Another good test: ask whether they need to create rules or word lists before they run your data through their system. If they do, you need to be a mind reader or you’ll miss information, and worst of all, if you expose the text behind that pretty bar chart, you might find that 20 to 30% of your facts are wrong…
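The questions above boil down to: does the engine tie an entity to a sentiment and an intent? A real NLP engine does this with full linguistic parsing; the sketch below is only a naive co-occurrence pass over one sentence, with hypothetical brand names and word lists, to illustrate the shape of the output you should ask a vendor to show you.

```python
import re

# Hypothetical vocabularies; a real engine would not need these lists.
ENTITIES = ("Acme", "Globex")                      # brand names
SENTIMENT = {"love": "+", "hate": "-", "great": "+"}
INTENT = {"cancel": "churn", "switch to": "churn", "buy": "purchase"}

def extract(sentence):
    """Return entities, sentiment polarities and intents found in one sentence."""
    lower = sentence.lower()
    return {
        "entities": [e for e in ENTITIES if e in sentence],
        "sentiment": [s for w, s in SENTIMENT.items()
                      if re.search(rf"\b{w}\b", sentence, re.I)],
        "intent": [i for p, i in INTENT.items() if p in lower],
    }

print(extract("I hate the new Acme router and I plan to switch to Globex."))
```

A sophisticated engine should return something richer than this (which entity the sentiment attaches to, the event, the condition), but if a vendor cannot even show entity, sentiment and intent tied together for a sample sentence, that pretty bar chart is resting on thin data.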