In early December, Cogito published new research designed to capture consumers’ understanding of artificial intelligence (AI), how they perceive and use it, and any apprehensions they have about data privacy and regulation.
While the study found that most consumers don’t see AI as a threat to jobs and believe it can make employees’ lives easier, they expressed a lingering mistrust of how brands use their data, handle their privacy and deploy AI overall.
In fact, of the consumers surveyed, 72% said that they had concerns about data privacy and what AI-enabled tools are tracking.
That number represents a significant trust gap.
But what should companies be doing in the face of that level of concern?
Well, additional findings from Cogito’s research offer up some clues.
Their research found that:
- One-third of the consumers surveyed believe that governmental regulation of AI-enabled tools and technologies would help them become more comfortable with the technology,
- 39% of consumers said that they would feel more comfortable using AI-enabled tools if brands had a clear customer code of practice, and
- 43% of consumers would think more positively about a brand and its use of AI if the brand were more explicit about how it uses AI, what data it collects and how that data is used.
On the first data point, I think there is little doubt that regulation of the use of AI will happen at some point.
But that will take time.
By way of comparison, the EU’s GDPR legislation came about after more than four years of discussions and negotiations.
But customers’ concerns about data, privacy and the use of artificial intelligence are real and present, and they should not be ignored.
Brands would do well to leverage the insight offered by the second and third data points: produce and live by a clear customer code of practice and be more explicit about how they use AI, what data they collect and how it is used.
The value of adopting a customer code is something I highlighted in Punk XL, where I told the story of HubSpot. Following a period of explosive growth that took them, in 15 years, from zero to nearly $700 million in annual sales, more than 3,000 employees and 114,000 customers in over 120 countries, they realized that their pace of growth had gotten in the way of delivering the experience their customers deserved.
To remedy that situation, in 2018 they created what they call their Customer Code. It is made up of 10 tenets that act as both a set of guidelines for their own conduct and a set of promises to their customers:
- Earn my attention, don’t steal it.
- Treat me like a person, not a persona.
- Solve for my success, not your systems.
- Use my data, but don’t abuse it.
- Ask for feedback, and act on it.
- Own your screw-ups.
- Help me help you, by helping myself.
- I don’t mind paying, but I do mind being played.
- Don’t block the exit.
- Do the right thing, even when it’s hard.
In 2018, they gave themselves an overall score of 7.1 out of 10 across the tenets, identifying numbers 2, 3 and 8 as the areas requiring the most improvement.
They also said they were committed to living by these tenets and promised to repeat the exercise and publish the results every year.
Unfortunately, they don’t seem to have followed through on that promise.
That’s a shame because, amid rising customer demand for personalization and empathy, concerns about data, privacy and security, and worries about ethics and bias in the use of AI, a customer code looks increasingly helpful and powerful.
Josh Feast, cofounder and CEO at Cogito, agrees and says “A customer code of practice effectively communicates an organization’s values, as well as the standards they hold themselves to and the expectations consumers can have as a result. This also poses a good opportunity for these organizations to better educate consumers around AI to minimize any fear or concerns they may have around its use. However, as concerns for data privacy and the use of AI continue to be top of mind for consumers, it’s pertinent that establishing a customer code of practice is a top priority for business leaders in the space.”
Some brands will say that they already have such a thing in place.
They’d be wrong. Examples of brands that are explicit, in a customer-friendly way, about how they approach customer data, privacy and the use of artificial intelligence are few and far between.
However, these concerns are not going away, and it is increasingly likely that transparency and accountability will become a competitive differentiator for many brands in the coming years.
This post was originally published on Forbes.com.