The ongoing quest to enhance customer experience has birthed a new buzzword: multiexperience (MX). This emerging trend has companies seeking to provide their customers with seamless, native experiences on any channel or device, via voice, video, or text. Gartner has placed MX among the top technology trends of 2020, predicting that by 2023, more than 25% of large enterprises’ mobile apps, progressive web apps, and conversational apps will be built or run through an MX development platform. The central role of Computer Vision in multiexperience is now starting to be understood.
There are four technological stages required for a company to create authentic MX applications.
- Sync me: storing a user’s information, which they can easily access and edit
- See me: understanding a user’s location, situation, and historical preferences, and offering them personalized information and a tailored interaction
- Know me: using predictive analytics to make timely and appropriate suggestions to the user
- Be me: acting on a user’s behalf – when given permission – to make the best decision for them
The “Sync me” stage has already been widely adopted. It is the next stage in Gartner’s model – “See me” – that companies should be focusing on in order to improve customer experience.
The Role of Visual Assistance in Driving MX
The “See me” stage can be achieved incrementally. The simplest way to understand a user’s situation is via technology known as remote Visual Assistance, which connects a customer and an agent or technician via a live video stream, enabling the expert to provide real-time visual guidance in the form of on-screen Augmented Reality instructions.
Manual “See me” examples along the customer journey
Retailers using remote Visual Assistance can help potential customers by viewing their physical environment so they can recommend the right product. Imagine shopping for window treatments by showing your living room to a remote sales representative who can then show you samples that suit your décor and light, or shopping for a new entertainment center by showing your den to a rep who can suggest the models that would best fit your space.
Many organizations struggle to reduce the number of no-fault-found returns that occur when customers encounter initial difficulties with new products. Remote Visual Assistance can save the sale before frustration sets in. For example, if a customer is confused while trying to configure a new remote control, or while trying to install a new smart thermostat, a remote expert can view the issue and provide visual resolution guidance.
Service-based industries can also benefit from remote Visual Assistance. For example, a customer contacts his insurance agent to make a claim for flood damage to his ground-floor apartment. An adjuster can inspect the damage remotely, and assess the cost without needing to travel.
Customer calls for help with bills or contracts can sometimes be complex to resolve, as the issue is not always clear to the agent. Remote Visual Assistance can eliminate that uncertainty. For example, if a customer wants to understand specific line items on his utility bill, he can simply point his smartphone at the document and indicate the problematic charges, saving the agent time in guiding the client through the paperwork.
The Role of Computer Vision in MX
The next stage of the “See Me” evolution is the implementation of Computer Vision AI technology to drive visual automation and ultimately self-service. Computer Vision technologies – such as object recognition, facial recognition, image to text, and image comparison – can enhance the customer experience by saving time and effort. These technologies can be implemented in assisted service mode – to route customer inquiries, and assist agents with visual decision support tools – or in self-service mode – where customers interact with bots that visually guide them to resolutions.
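To make the assisted-service routing idea concrete, here is a minimal sketch in Python; the object labels, confidence threshold, and queue names are all hypothetical, and a real system would feed in the predictions produced by an object-recognition model:

```python
# Hypothetical sketch: route a customer inquiry based on object-recognition
# output. Labels and queue names below are illustrative, not from any
# real product.

# Assumed mapping from recognized object categories to support queues.
ROUTING_TABLE = {
    "router": "connectivity_support",
    "thermostat": "smart_home_support",
    "utility_bill": "billing_support",
}

def route_inquiry(predictions, min_confidence=0.6):
    """Pick a support queue from (label, confidence) predictions.

    Falls back to a human agent when no recognized object is
    confident enough to route automatically.
    """
    for label, confidence in sorted(predictions, key=lambda p: -p[1]):
        if confidence >= min_confidence and label in ROUTING_TABLE:
            return ROUTING_TABLE[label]
    return "general_agent"
```

For instance, a photo recognized as a thermostat with high confidence would land in the smart-home queue, while an unrecognized or low-confidence image would still reach a human agent.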
Automated “See me” examples along the customer journey
Furniture retailers have introduced self-service apps such as Wayfair View in Room 3D and Ikea Place that allow customers to visualize furniture in their own homes ahead of purchase, just by holding up their smartphones. For example, a customer would like to buy a new sofa that fits her living room. The virtual visual assistant views the space through her smartphone camera and works out the perfect sofa size. It can also ask her about desired colors, styles and fabrics, and search for store branches that carry the desired model.
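The fit calculation behind such an app can be sketched as a simple geometric check, assuming the AR framework has already measured the available wall; the clearance value is an arbitrary illustration, not a figure from any real app:

```python
def max_sofa_length(wall_length_cm, clearance_cm=30):
    """Longest sofa that fits the wall with clearance on each side (assumed rule)."""
    return max(wall_length_cm - 2 * clearance_cm, 0)

def sofa_fits(wall_length_cm, sofa_length_cm, clearance_cm=30):
    """True if the sofa leaves the required clearance on both sides."""
    return sofa_length_cm <= max_sofa_length(wall_length_cm, clearance_cm)
```

A 220 cm sofa would pass the check against a 4 m wall but fail against a 2.5 m one, which is the kind of instant answer these visualization apps provide.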
Visual experiences are effective in helping customers navigate initial product setup, configuration, troubleshooting, and regular product maintenance. For example, many automakers provide apps that allow a car owner to point their smartphone at different parts of the vehicle, at which point AR overlays display relevant information, such as how to change the air filter, engine oil, or brake fluid.
Image recognition is a core technology for many insurance-based applications, enabling companies to automatically process images necessary for onboarding, underwriting, damage recognition and claims assessment. Insurance companies can also use facial recognition to automatically identify and authenticate customers, speeding up the validation process and reducing the risk of fraud.
Using Computer Vision AI to analyze a customer’s documents can be a significant time-saver. As well as Optical Character Recognition for simple identification, product registration, and warranty validation, advanced systems can now understand the meaning of complicated texts, enabling customers to easily update their personal details after an official name change, for example.
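A minimal sketch of the document-analysis step, assuming an OCR pass (for example with an engine such as Tesseract) has already produced plain text; the field names and patterns below are assumptions about a bill's layout, not any real provider's format:

```python
import re

def extract_bill_fields(ocr_text):
    """Pull illustrative fields out of OCR'd utility-bill text.

    The regex patterns are assumptions about a bill's layout,
    not a real provider's format.
    """
    patterns = {
        "account_number": r"Account\s*(?:No\.?|Number)[:\s]+(\w+)",
        "amount_due": r"Amount\s*Due[:\s]+\$?([\d,]+\.\d{2})",
        "due_date": r"Due\s*Date[:\s]+(\d{2}/\d{2}/\d{4})",
    }
    fields = {}
    for name, pattern in patterns.items():
        match = re.search(pattern, ocr_text, re.IGNORECASE)
        if match:
            fields[name] = match.group(1)
    return fields
```

In a real pipeline this structured output is what would feed identification, product registration, or warranty-validation logic downstream.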
The ultimate goal of true customer MX is reached through a step-by-step process. Many forward-thinking companies are currently focusing their efforts on the “See me” phase to better understand their customers’ immediate requirements in order to offer relevant information and enable satisfying interaction. Remote Visual Assistance is the first stage, with Computer Vision in multiexperience poised to take visual engagement to the next level with automation.
This post was originally published on the TechSee blog.