A New Tool for Understanding People’s Emotions – Beyond Verbal



“Hey, Bill, how are you doing?”

“Things are going well!”

“It doesn’t look like it. Looks like you’re tired and worn out. Something bother you?”

“Nah, I’m hanging in there. Life is good!”

Variations on the above conversation happen all the time. People say one thing but are feeling another. For whatever reason, people sometimes don't feel comfortable sharing their emotions. That's okay – we respect that. But when you're trying to create a product or service that makes that person's life easier, it often helps to understand the emotional underpinnings.

In the past I've blogged about PrEmo, a way of measuring emotions by utilizing the natural human capacity to notice emotions in others. A new tool is being developed over at BeyondVerbal.com. They've been analyzing the intonations in people's voices to tease out the emotions behind them. These intonations are universal and, when categorized, provide a means of determining the emotional states of people around the world.

I did my own little demo at their website. So far, I’ve found it amazingly accurate. I’ve also found it hard to fool.

So give it a try – I’d love to hear your thoughts about the tool and its applications!

Republished with author's permission from original post.

Michael Plishka
Michael Plishka is the President and Founder of ZenStorming Solutions, LLC, an innovation design consultancy. He believes in co-design methodologies and sharing design thinking essentials – empowering people and companies to make a difference with their products and services.


  1. Hate to point this out, but inflections are NOT universal. They are heavily influenced by both culture and language. Isn’t it a little obvious that a language that relies heavily on mouth clicks and one like English are…well, different?

    Scratching head.

  2. Robert, thanks for stopping by. First, so that we’re dealing with the same definitions: while ‘inflection’ is sometimes used as a synonym for ‘intonation’, it’s more accurate to say that Beyond Verbal studies intonation.

    From Beyond Verbal’s site:

    “Our core research into this new dimension of human insight started in 1995 with the understanding that it's not “what we say” but "HOW we say it”. A team of leading scientists specializing in physics, operations research, decision making and neuropsychology conducted over 18 years of research that has included more than 60,000 test subjects in 26 different languages. This resulted in the discovery of the human "intonational code” extracted from vocal intonations. These discoveries form the basis of Beyond Verbal's technology, and are covered by four granted US patents. The technology has already received worldwide acclaim, winning both industry and academic recognition.”

    It would be interesting to see whether intonations in click-based languages (as opposed to verbal speech) carry the same ‘intonational code’. However, when listening to speakers of click-based languages, there is still a sense of flow, and it’s clear that emotion is being expressed in their speech (contrast this with Morse code, which just sounds like dot – dash – dot – etc.), so perhaps the software would still work. I’ve sent a copy of your comment to Beyond Verbal. We’ll see what they come back with.

    Having said that, I agree that one should be cautious using the term “universal”, that is, until every language in the universe can be tested 🙂

    Thanks again for stopping by and your provocative comment!

  3. Below is the response I received from Dan Emodi at Beyond Verbal:

    I’m writing this short note to you from Kandy in Sri Lanka where I now spend some hard earned vacation with family. Besides the beauty of this place and the charm of its people, one curious thing you notice (as someone dealing with vocal intonations, of course) is how easy it is to pick the mood, feeling and type of personality of the person in front of you without understanding a single word in Sinhalese.

    The scientific reason is that vocal intonations (as well as body language) originate in our Limbic System – a part of our brain that predates our Cortex and is in charge of our emotions. As a matter of fact, intonation and body language are much the same, intonation being created by our facial muscles and resonance box. Unlike the Cortex, which holds our cognitive thinking, words, culture and conduct, our Limbic System is culture-agnostic and has nothing to do with cognition. That’s why babies that have no understanding of language or culture understand their caregivers (and the same goes the other way), and that’s why the family dog “gets it”.

    For more information, Michael, you are more than welcome to check out my blog, “I love Ave Maria though I do not understand a word of it”.


    Best regards,
    Dan Emodi

  4. Thanks for the attempt to explain. Unfortunately, the explanation propagates a common myth about brain specialization when it talks about the limbic system. Yes, it’s described relatively properly, but what’s being assumed here is that the various parts of the brain operate independently. (http://psychmyths.com/)

    The explanation that the limbic system is primitive and therefore not subject to cultural or cognitive components only works if you believe, falsely, that the limbic system is immune to cognition. In fact, cognition plays a huge part at the perceptual level in gating, selecting and interpreting sense input, which in turn affects emotional reactions. Indeed, the human ability to control the “limbic response” is one of the major advantages we have over other species, and it’s a result of our superiority in cognition.

    Brain parts do not operate independently of each other. If, for example, people have lesions in the cognitive or affective “areas” of the brain, their behavior in the other arena will change. For example, people who have suffered damage to brain areas in “charge” of emotion, such that they do not experience affect, cannot actually make decisions; the lack of emotion impairs cognitive function.

    So, while the explanation about the limbic system being culturally agnostic may be correct as far as it goes, the cognitive components that alter the input, and what the limbic system outputs, ARE culturally defined.

    This is SO obvious even to lay people. People are “afraid” of different things, based on their experiences and cognitions. If it were as simple as it’s made out to be here, and the limbic system were “immune” to experience, culture, etc., we’d see universal fears (e.g. heights), and not the huge variation we actually do see.

    Also, it’s clear that people CAN learn to control these “limbic” behaviors and physiological responses, and again, that would not be the case if the limbic system were as all-pervasive as claimed.

    I can’t speak to the specific technology here, nor do I particularly care about it. It could “work” relatively well, though I have my doubts, even if it’s not universal and the “theory” is inaccurate. Or, in fact, it could mislead companies down yet one more road of believing that technology will actually make customer service “better”.

  5. Robert,

    There’s nothing I’ve read in the Beyond Verbal page, or in the response that shows that there’s an assumption that the limbic system is operating independently of other parts of the brain.

    In addition, just because humans have the ability to temper or enhance limbic reactions does not mean that those limbic responses, and the body’s responses, are somehow not discernible. Babies have been shown to recognize emotions conveyed via voice long before they know language, per se. We are able to speak on the phone with people, understand each other, and detect emotions.

    Evolutionarily speaking, it makes sense that something like vocal communication would carry with it the indicators of emotion on a broader human scale. Technology has enabled researchers to analyze tens of thousands of people from across cultures and categorize their responses in a meaningful way. (I have seen a similar phenomenon in the analysis of cell samples for cancer. When done on a small scale, patterns are virtually non-existent and accuracy is in the range of 60%. However, when thousands upon thousands of samples are analyzed using neural nets, accuracy climbs well into the 90-percent range.)

    Teasing out this information, even if still imperfect, is, at least in my opinion, an exciting insight.

    Could it mislead companies down a technologically paved road that promises much but delivers little? Perhaps, but I don’t think so. Beyond Verbal has created a tool. Tools are limited by context and capability; they’re not right for all times and all places. In fact, this tool might be improved when combined with other tools, or it may be more effective and/or more valuable in situations where other emotional expressions are not accessible (e.g. over the phone).

    But when all is said and done, this tool is about helping to get insights, i.e. improve understanding. That is only a first step in improving the customer experience.
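The scale effect mentioned in the comment above – individually weak signals becoming a reliable classifier once enough samples are pooled – can be sketched in a few lines. This is a toy illustration only, not Beyond Verbal's method or a real medical model: the data is synthetic, and the simple nearest-centroid classifier stands in for a neural net purely to show how accuracy rises with training-set size.

```python
import random

random.seed(42)
D = 50        # number of weak, noisy features per sample
SHIFT = 0.4   # tiny per-feature difference between the two classes

def sample(n):
    # Synthetic two-class data: each feature alone is barely informative.
    return [([random.gauss((i % 2) * SHIFT, 1.0) for _ in range(D)], i % 2)
            for i in range(n)]

def centroids(train):
    # Learn one mean vector ("centroid") per class from the training data.
    sums, counts = [[0.0] * D, [0.0] * D], [0, 0]
    for x, y in train:
        counts[y] += 1
        for i, v in enumerate(x):
            sums[y][i] += v
    return [[s / counts[c] for s in sums[c]] for c in (0, 1)]

def accuracy(cents, test_set):
    # Classify each point by its nearest class centroid; return hit rate.
    def dist2(x, c):
        return sum((a - b) ** 2 for a, b in zip(x, c))
    hits = sum(1 for x, y in test_set
               if min((0, 1), key=lambda c: dist2(x, cents[c])) == y)
    return hits / len(test_set)

test_set = sample(2000)
results = {n: accuracy(centroids(sample(n)), test_set) for n in (10, 10000)}
for n, acc in results.items():
    print(f"train size {n:>6}: accuracy {acc:.2f}")
```

With only 10 training samples, the estimated centroids are dominated by noise and accuracy hovers not far above chance; with 10,000 samples the same simple model separates the classes reliably, mirroring the small-scale versus large-scale pattern described above.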

