It all started with a half-case of Corona beer . . .
Amid the routine supermarket cacophony of bar code scanner beeps and crying babies, I almost missed a message that whirred onto the register’s colorful plasma display at my checkout station:
**Age verification bypassed by cashier**
The cashier barely looked up while scanning my items, but she had already made an age assessment. As the verification message scrolled up and off the screen, replaced by the odd character strings that comprise the retail shorthand for my purchases, I pondered how the cashier could have made such a hasty decision regarding my age, given the gravity of the consequences she faced had she been wrong.
What influenced her decision? I was dressed in a black T-shirt, jeans, and running shoes. Was it a little gray hair? Slight balding at the temples? Crow’s feet around the eyes? Maybe I really do look fifty. In the name of Positive Customer Experience, couldn’t she have feigned a shred of age confusion and asked me for an ID? Then again, what I took as flattery could be someone else’s annoyance at a privacy invasion, or a twenty-something’s frustration that he doesn’t appear sufficiently old. Whatever the answer, couldn’t age verification be enforced without the ruthlessly cold message the Point-of-Sale system displayed?
Welcome to the murky world of personal profiling and its collision with technology. Headline: a customer’s personal appearance is judged in the course of a transaction. While web analytics enable companies selling products over the Internet to profile people at arm’s length with clinical precision, face-to-face transactions require on-the-spot judgment and strong interpersonal skills. That’s a tall public-relations order for anyone who must decide whom to question when selling alcohol or tobacco, granting a senior-citizen discount, admitting patrons to a bar, or pricing a kid’s haircut (are you eight, or nine?). Jim Barnes described related pitfalls in his March 16th blog “The Tripping Point.”
It’s not just age-related profiling that creates customer relationship problems. In 1994, Denny’s Restaurants paid a $54.4 million class action settlement to thousands of black customers who sued the chain for discrimination, alleging they were refused service or had to wait longer for service than white customers. And during a recent cab ride to the airport, I endured the driver’s diatribe about the bad tipping habits of a certain nationality (did he assume I had a different heritage?).
A recent article, “Confessions of a Car Salesman” by Chandler Phillips, elucidates how institutionalized customer profiling infects the customer experience:
> “Since I was still a ‘green pea’ the other salesmen tried to push me to wait on undesirable ups — the undesirable customers who the salesmen thought wouldn’t or couldn’t qualify to buy a car. My manager had, at one point, described the different races and nationalities and what they were like as customers. It would be too inflammatory to repeat what he said here. But the gist of it was that the people of such-and-such nationality were ‘lie downs’ (people who buy without negotiating), while the people of another race were ‘roaches’ (they had bad credit), and people from that country were ‘mooches’ (they tried to buy the car for invoice price).”
Data mining has freed many e-commerce companies from the burden of coaching employees on such sensitive issues. Sequestered in spartan cubicles that could be anywhere in the world, analysts can examine a myriad of variables to target messages and tailor processes without offending customers. Amazon.com made business-intelligence-generated recommendations famous with “Customers who bought this item also bought . . .”
The so-called statistical objectivity of analytics makes it possible to sell billions of dollars of products without customers taking umbrage. No value judgments, no offensive antecedents such as “People like you want . . . ” If a poor recommendation is made online, it’s an anomaly in the algorithm, and no one person is to blame. Most of all, no one makes a hurtful request based on how a person looks or thinks. In the cyber-world, psychographic judgments are deeply hidden in lines of code. In the bricks-and-mortar world, they’re not.
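To see why these recommendations feel so impersonal, it helps to notice how little the algorithm needs to know about *you*. A minimal sketch of the “customers who bought this also bought . . .” idea is simple item-to-item co-occurrence counting: no demographics, no appearance, just purchase baskets. (This is an illustrative toy, not Amazon’s actual system; the order data below is entirely hypothetical.)

```python
from collections import Counter, defaultdict

def build_cooccurrence(orders):
    """Count how often each pair of items appears in the same order."""
    co = defaultdict(Counter)
    for basket in orders:
        items = set(basket)          # ignore duplicates within one order
        for a in items:
            for b in items:
                if a != b:
                    co[a][b] += 1
    return co

def also_bought(co, item, n=3):
    """Return up to n items most often purchased alongside `item`."""
    return [other for other, _ in co[item].most_common(n)]

# Hypothetical purchase history
orders = [
    ["grill", "charcoal", "lighter fluid"],
    ["grill", "charcoal", "tongs"],
    ["charcoal", "lighter fluid"],
    ["tongs", "apron"],
]

co = build_cooccurrence(orders)
print(also_bought(co, "charcoal"))   # grill and lighter fluid rank highest
```

The point of the sketch is the anonymity: a poor suggestion here traces back to sparse counts, not to anyone’s judgment of the shopper.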
Not that technology and analytics haven’t brought new ugliness of their own to the customer experience. When caller-ID technology became widespread, many feared that it would create a two-class customer service system by routing inbound calls differently, depending on the neighborhood a person happened to be calling from. And technology-enabled profiling made headlines again when civil libertarians debated the potential abuses resulting from embedding RFID chips into clothing.
But the fact remains that for all the heavy lifting carried by today’s e-commerce workflow engines, billions of face-to-face customer-relationship interactions are conducted every day, and managing them is a less-than-perfect science. Add to the equation that any judgment based on an individual’s appearance might be considered offensive, and you quickly realize this is not a job for amateurs.
As with many issues that reside at the intersection of social mores, technology, and the law, there are more questions than answers. Given the sensitivities over personal profiling:

- How should companies enforce regulations for sales of certain products to underage consumers?
- What constitutes appropriate management of such policies?
- How should an appropriate customer interaction be defined?
- How should employees be trained?
- What is the role of Customer Relationship Management (CRM) systems in supporting employees when profiling must be part of the transaction?
- What constitutes unethical physical profiling?
- What risks are associated with profiling?
Companies such as Disney have addressed some of these challenges by regarding customer-facing employees as actors and patrons as guests, and providing interpersonal skills coaching along the way. Other companies need to think further about how profiling impacts the customer’s experience. Unchecked, customer profiling mutates into dangerous “-isms”: racism, sexism, and ageism, to name a few. Has the car dealership referenced earlier already crossed a perilous threshold? It’s worth thinking about, since once that fuzzy line has been crossed, the discussion around the boardroom table turns to litigation and damage control.
But companies should not ignore the opportunities! When you’re buying beer at age fifty, life seems brighter when a cashier asks you for an ID. You can bet I would tell at least twenty people, after first telling my wife! Talk about evangelizing the customer experience!