How to win the battle between privacy and personalization

We’ve been hearing for years that if you get something for free, you are the product. But for a lot of people, that adage never really sank in until the recent Cambridge Analytica scandal, in which the company was accused of misusing, and failing to secure, Facebook data from more than 71 million people.

Before the scandal erupted, people were perhaps complacent about privacy. But the Google Trends chart below shows a stark reversal beginning in March 2018. Now, though interest in Cambridge Analytica has quickly dropped off, searches related to privacy continue to rise.

[Google Trends chart: search interest in privacy, personalization, Cambridge Analytica, and the Facebook scandal (Sklar Wilton)]

Privacy and personalization create a double-edged sword.

For many people, personalization is what you get when emails and newsletters address you by your first name. Our names have been public information since the day we were named, so we don’t normally feel a huge loss of privacy when someone we don’t know uses that information. And for the 2 BILLION people who use Facebook, the personal data we share on that website, from friends and family to favourite musicians and politicians, is handed over under the assumption that it will remain safe and secure within the platform.

But for early adopters who have plunged headfirst into all that technology has to offer, the broader application of personalization is the magic that happens with a voice-activated home assistant such as Amazon’s Echo (with Alexa), Apple’s HomePod, or Google Home. When you literally tell a small electronic device to order more slow-cooked beef pot roast, personalization means that the device recognizes YOUR voice. It knows that you usually buy pot roast from M&M Food Market. It uses your saved credit card number and places the order to be delivered to your home after 6pm that day. That instant gratification is the ultimate goal of personalization. And the consequence is the ultimate loss of privacy.
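
To make that chain of conveniences concrete, here is a minimal sketch, in Python, of what such a reorder flow already has to know about you. Every class, field, and value below is hypothetical and invented for illustration; it does not reflect Amazon’s, Apple’s, or Google’s actual systems. The point is simply how many pieces of stored personal data one short spoken request draws on.

```python
# Hypothetical sketch only: these classes, fields, and values are invented for
# illustration and do not reflect any real voice assistant's API.
from dataclasses import dataclass


@dataclass
class HouseholdProfile:
    """Everything the assistant must already know to fulfil one short spoken request."""
    voice_print_id: str             # biometric voice identification
    favourite_retailer: str         # inferred from past purchases
    saved_payment_token: str        # stored credit card reference
    home_address: str               # stored delivery address
    preferred_delivery_window: str  # learned delivery preference


def handle_utterance(utterance: str, voice_print_id: str, profiles: dict) -> str:
    """Identify the speaker by voice, then fill in every other detail from stored data."""
    profile = profiles.get(voice_print_id)
    if profile is None:
        return "Sorry, I don't recognize your voice."

    if "pot roast" in utterance.lower():
        # The user supplies a few words; the stored profile supplies everything else.
        return (f"Ordering slow-cooked beef pot roast from {profile.favourite_retailer}, "
                f"charged to the {profile.saved_payment_token}, "
                f"delivered to {profile.home_address} {profile.preferred_delivery_window}.")
    return "Sorry, I didn't catch that."


if __name__ == "__main__":
    profiles = {
        "voice-print-001": HouseholdProfile(
            voice_print_id="voice-print-001",
            favourite_retailer="M&M Food Market",
            saved_payment_token="card ending in 4242",
            home_address="(stored home address)",
            preferred_delivery_window="after 6pm today",
        )
    }
    print(handle_utterance("Order more slow cooked beef pot roast",
                           "voice-print-001", profiles))
```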

Many of us willingly give up our most personal and risky details to companies and brands, because we love them and believe that the relationship improves our lives.

We give those companies our kids’ names and our credit card numbers because it makes things easier and lets us spend our time doing the things we want to do in the way we want to do them.

On the other hand, personalization can sometimes be a less than wonderful thing. Social media games that ask for personal information such as pets’ names, favourite activities, authors, and books probably do use those answers to tell you which celebrity you’re most similar to. But in some cases, those data are also used to profile your shopping personality and determine which products and services you could be persuaded to buy. That isn’t necessarily bad. In other cases, though, those data could be used to serve you deliberately slanted or misleading information, as we are discovering from the Cambridge Analytica fiasco.
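
To see why an innocent-looking quiz can double as market research, here is a deliberately crude sketch, again in Python and again entirely hypothetical: the quiz answers, segments, and mapping are invented, but they show how little data it takes to place someone into a persuadable audience.

```python
# Deliberately crude, hypothetical example: the quiz answers, segments, and
# mapping below are invented, not taken from any real platform.

QUIZ_ANSWERS = {
    "pet_name": "Biscuit",
    "favourite_activity": "camping",
    "favourite_author": "Agatha Christie",
    "favourite_book": "Murder on the Orient Express",
}

# Invented mapping from innocuous quiz answers to commercial (or political) segments.
ANSWER_TO_SEGMENT = {
    "camping": "outdoor-gear buyer",
    "Agatha Christie": "mystery box-set subscriber",
}


def profile_shopper(answers: dict) -> list:
    """Return the marketing segments a respondent's quiz answers place them into."""
    return [segment for answer, segment in ANSWER_TO_SEGMENT.items()
            if answer in answers.values()]


if __name__ == "__main__":
    # The same segments used to sell camping stoves can also be used to decide
    # which slanted or misleading messages a person is most receptive to.
    print(profile_shopper(QUIZ_ANSWERS))  # ['outdoor-gear buyer', 'mystery box-set subscriber']
```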

We need to find a happy medium.

We know that privacy standards, even when very strict and strongly enforced, are not always sufficient to safeguard consumer and personal data. We know that we share too much information with websites we don’t completely trust. We know that laptops get forgotten, lost, or stolen, allowing access to highly confidential files and software. We know that hackers around the world are actively trying to access private information, whether for fun, status, or malice. Privacy with technology is impossible.

The happy medium lies in giving consumers good options.

Companies that are willing to put in the work to earn consumer trust will enjoy long-lasting success. Consumers will reward companies that have a track record of good behaviour and quick, friendly customer service. Consumers will even reward companies that make the occasional privacy or security mistake, as long as the apologies are quick and genuine and the resolutions are purposeful.

It might cost more to create winning customer service experiences and to strike the right compromises between personalization and privacy, but the reward is loyal consumers. And nothing is more valuable than that.

Annie Pettit, Ph.D. FMRIA
Annie Pettit, PhD, FMRIA is a research methodologist who specializes in marketing research design and strategy. She is an invited speaker at conferences around the world and has published refereed and industry articles. She won a Ginny Valentine Award, the ESOMAR Excellence Award for Best Paper, the MRIA Award of Outstanding Merit, and the ESOMAR Best Methodological Paper award. Annie blogs at LoveStats, tweets at @LoveStats, and is the author of "People Aren't Robots" and "7 Strategies and 10 Tactics to Become a Thought Leader", both available on Amazon.

3 COMMENTS

  1. Thanks for your article, Annie. Although I do think that, for the majority of us, the reason for giving away data is much more mundane than “love”, you are hitting some truths. I guess the main reason is mere convenience and not really thinking about what we give. We can observe this day by day when looking at younger generations.

    Brands need to walk a fine line: collecting the right data (the data we are more likely to give) and using it in an appropriate manner (to help us, not them). This, over time, earns them more credibility and trust, leading to us willingly giving more data.

  2. Hi Annie: interesting article. I am not sanguine that privacy and personalization will ever exist in harmony. The objectives of developers (monetization of data) and users (convenience, ego-satisfaction, control) often are in direct conflict.

    For companies that harvest and store consumer data, monetization of data inevitably means selling it, and the more personal the data is, the more value it has. All of this is done under the guise of personalization. So how do you square privacy and personalization? I don’t claim to have a complete answer, but I think as you suggest, being a trustworthy data steward is an essential component.

    I like the idea of a customer data bill of rights; measured against one, most companies would receive an F. When Princess Cruises introduced its Ocean Medallion tracking service, the sales pitch was for convenience. As one travel website breathlessly reported, “Most importantly, you can order a drink from the bar, wherever you may be on the ship, and a server will show up soon after, beverage tray in hand, having found you through the app’s geo-location technology and an uploaded photo of your face.” (Read more: http://www.traveller.com.au/best-technological-innovations-on-cruise-ships-hightech-cruising-for-the-21st-century-h0w0s0#ixzz5EIpmmW9c)

    Was this ever a problem for cruise passengers in the first place? Companies pitch personalization as customer value and trumpet it as a beacon of their customer centricity. But this is a partial truth. What companies truly value is data and through personalization they get lots of it. In my view, that makes personalization the mortal enemy of consumer privacy.

    An article I wrote on the topic, A Future Without Secrets? Why We Need Ethical Data Governance, outlines my recommendations for companies that want to foster customer trust: http://customerthink.com/a-future-without-secrets-why-we-need-ethical-data-governance/

  3. Some concessions to privacy regulation have become either a legislative mandate or a recognized guiding principle. In the United States, three federal laws offer privacy protection to minors (the Children’s Online Privacy Protection Act, COPPA); regulate information sharing between insurance companies and banks (Gramm-Leach-Bliley, GLB); and deal with healthcare information (the Health Insurance Portability and Accountability Act, HIPAA).
