Efficiency & the era of Pig Data
When new or mature software enters the market, it is typically used to increase the speed and efficiency of certain processes. So it should come as no surprise that organizations are looking into facial recognition to make themselves more effective and secure, and to improve the quality of their products. This is not the cool or ‘sexy’ part where the customer experience is enhanced to nearly magical levels; it’s the part where processes are sped up and made cheaper. In these times of continued price wars, that’s a real competitive advantage.
Target, Walmart, and Lowe’s: they have all run trials of facial recognition technology. One of the most common applications here is catching shoplifters. FaceFirst – one of the more popular technologies amongst retailers – can scan a shopper’s face from 50 to 100 feet away and has reduced theft in stores by as much as 30 percent. The TSA (Transportation Security Administration) – the agency of the U.S. Department of Homeland Security in charge of the security of travelers in the United States – began testing facial recognition technology for international travelers at Los Angeles International Airport (LAX) last year and is planning to expand its use of biometrics. Even entertainers are starting to adopt the software for security reasons. Taylor Swift, for instance, monitored a concert in California last year with a facial recognition system in order to record and recognize the faces of her many stalkers. And the most obvious example of them all, of course: China’s highly controversial social credit system, which monitors, rewards, or punishes the behavior of its population and ranks people based on their “social credit.”
One of the most advanced applications in this area comes from China, too, where the e-commerce platform JD.com is using pig facial recognition to revolutionize farming. This type of project can reduce the cost of rearing pigs by as much as 50%, but it could also greatly enhance the quality of the meat, by monitoring how much each pig moves and eats. That’s a really nice case of Pig Data, if you ask me.
The new convenience: “What’s a button?”
Where the former examples offered few direct advantages to the end user, the security applications described above could very well be exploited to enhance convenience for customers too. Germany-based Lufthansa Group, for instance, is deploying automated kiosks with facial recognition technology at Los Angeles International Airport to identify passengers and allow them to board the plane within a few seconds. Not only is the process efficient and secure, but speeding up boarding by 50% and removing the need to show an ID is also very convenient for the customer.
The same goes for mobile banking: several banking apps – like those of Belgium’s KBC Bank, Singapore’s OCBC Bank and Japan’s Seven Bank – use facial recognition as a fast and frictionless form of identification. It would not surprise me if the cashierless Amazon Go stores offered the possibility to pay with your face in the near future, instead of (or on top of) their current mobile app. In fact, 7-Eleven is opening a trial store in Tokyo that requires nothing but your face for payment.
I love how this means that the face is becoming the new interface. It is the evolution from the one-button interface to the zero-button interface. Imagine a world where voice and face interfaces help you get through the day. I suspect that one day my grandchildren could very well ask me, “What’s a button, granddad?”
Last but not least, facial recognition will be applied for context- and emotion-driven hyperpersonalization. Whereas customer information used to be pretty much limited to age, buying behavior and gender (which is pretty complicated, if we can believe Facebook’s 70-odd gender statuses), today pricing, communication and even products can be adapted to a person’s context and mood of the moment, especially when enriched with past behavior data.
This is the most exciting and tricky part, because convenience could very well turn into unwelcome manipulation if we’re not careful. Different studies have shown that our face holds a surprising amount of information about our health, mood, sexual preference and even social status. Homosexuality, wealth, our heart rate, certain diseases: they can all be read from our faces, and be cleverly (mis)used by companies. Retailers could offer discount scarves to people who seem to be struggling with a bad cold, or people with clinical depression could receive information about non-profit or healthcare organizations that could help them. Advertisements at bus stops could show women when a woman is waiting at the stop, because she would identify more with them. But it could also mean that companies could charge steeper fees to rich people (some of us might applaud that, but it hardly seems fair), or that insurance companies could charge more, or even refuse people, because their face suggests a higher chance of developing certain diseases.
This raises a lot of ethical questions. But we’re definitely seeing companies tackle these issues themselves. That might partially have to do with them trying to pre-empt governmental meddling, I think, as in the case of the not-so-successful GDPR story. I know, for instance, that Microsoft has a dedicated Principal Ethics Strategist within its AI Perceptions and Mixed Reality Group. At the same time, I think that governments will start to demand full algorithmic transparency from companies. Today, a financial auditor might swing by to check up on you; in a few years, I expect that an algorithmic auditor might do the same.
An eye for an I
The question remains: who will be the pioneers, the ones that launch the breakthrough applications? The first sectors that come to mind are, of course, those that still require our physical participation: retail, some entertainment sectors, hotels, the travel industry, farming (with animal faces), smart city projects and healthcare. Especially the latter holds some exciting promise: we’d move from facial recognition to facial investigation. People could be notified that they are too stressed, close to depression, have diabetes or carry certain genetic diseases, which would help society make the transition from sick care to actual health care.
But this focus on ‘physical’ sectors for facial recognition is actually a form of traditional thinking. Every computer, every iPad, every mobile phone, every smart TV, a lot of smart homes, even some cars and smartwatches have a camera “eye”. Just imagine (lawsuit disclaimer: I’m not saying that this is happening): if the camera on your smartphone or PC is not covered, Zalando or Amazon could very well be monitoring your heart rate and emotions and using that to adapt their prices and what you see. For now, we are only utilizing our own cameras for security and identification (unlocking phones, launching a banking app), but if there is convenience involved, I believe the use cases will grow far beyond that. And let’s not forget about all of our pictures and videos that are available on Facebook, Twitter, Instagram, Snapchat, Flickr, TikTok, YouTube and any other place where we’re compelled to share them. Just last week, the internet was buzzing with the Facebook 10-year challenge, which might be a cleverly easy way to find out how people age. People seem to have forgotten that Facebook has been exploiting that kind of application for quite some time now – since 2012, in fact, though it discontinued the feature for a while after concerns from the Irish Data Protection Commissioner.
These are pivotal times for facial recognition software. Let’s all make sure, together, that the scales tip toward efficiency and convenience rather than the darker side of manipulation. As you know, I’m an optimist, and I definitely choose to believe that we’ll be able to use it for the better. The possibilities to improve our lives are endless.