Fooling the Customer



Time had an amusing article this week about The Telemarketing Robot Who Denies She’s a Robot (complete with recordings!). A company offering health insurance had, for a time, a phone number answered by a cheerful-sounding woman who would ask several questions about the caller’s insurance and respond to conversational questions.

Ask if she was a robot, and she would laugh and insist that she was a real person.

Except that she wasn’t, as would quickly become apparent by the awkward pauses before she would reply, her clearly limited set of responses to questions, and the way she would speak the exact same phrase (words, tone, timing) to different callers. Most likely, “Samantha” was a set of prerecorded messages triggered by someone listening to the caller and selecting the best response from a list. This would allow the use of really cheap overseas employees with poor English skills.

In the speech recognition industry about ten years ago there was a debate about how “human” an automated system should be. On the one hand, some designers believed that the best speech systems were the most natural and conversational–in the ideal world, you would call your bank and never know whether you were talking to a person or a machine.

The other view–which I hold along with people like Bruce Ballentine (author of It’s Better to be a Good Machine than a Bad Person)–is that you should always make it clear to the caller whether they’re interacting with a machine or a person. Leaving aside the fact that the technology is nowhere near advanced enough to allow for a true conversational experience, people just don’t like to be fooled.

People care very deeply whether they’re talking to a machine or not. It’s not that talking to a machine is bad (witness the willingness to use Apple’s Siri service). It’s that talking to a person carries social context and talking to a machine doesn’t, and you can interact with a machine in ways you wouldn’t interact with another human.

We observe this in people’s willingness to play with the machine and explore its capabilities. While sometimes people will observe social norms when talking to a computer (for example, saying “Please” and “Thank you”), they also feel free to break outside the box. The recordings of the Time reporters taunting “Samantha” are a great example (they can’t get her to repeat the exact phrase “I am not a robot”). Ballentine called this the problem of the “Monkey-butt user,” after someone in a usability test who randomly said “Monkey-butt” to the computer to see how it would react.

(As an aside: Bruce’s book is the only one I know which has “Monkey-Butt” in the index. Look it up yourself.)

It turns out that humans care so much about whether we’re talking to a machine or not that, if we suspect a machine is trying to fool us, we will spontaneously begin a series of Turing tests to find out whether it’s really a machine or a person. Current technology has a long way to go before it can get past someone who is determined to discover the truth.

So the lesson is simply this: Don’t try to fool your customers. It won’t work, and they won’t like it.

Republished with author's permission from original post.

Peter Leppik
Peter U. Leppik is president and CEO of Vocalabs. He founded Vocal Laboratories Inc. in 2001 to apply scientific principles of data collection and analysis to the problem of improving customer service. Leppik has led efforts to measure, compare and publish customer service quality through third party, independent research. At Vocalabs, Leppik has assembled a team of professionals with deep expertise in survey methodology, data communications and data visualization to provide clients with best-in-class tools for improving customer service through real-time customer feedback.
