Apple recently launched its second-generation AirPods. AirPods, of course, are Apple’s wireless Bluetooth earbuds. They were first released on December 13, 2016, essentially to replace wired headphones after Apple removed the headphone jack from its iPhones. In addition to playing audio, AirPods feature a built-in microphone that filters out background noise, which allows users to take phone calls and talk to Apple’s digital assistant, Siri. Superficially there is little difference between the new AirPods and their first-generation cousins: the charging indicator light has moved to the outside of the case, but that’s about it. However, the tech on the inside has changed significantly.
So what’s different? Firstly, the new AirPods case can charge wirelessly, and a single charge will now last for a pretty impressive five hours (that’s almost a flight from London to New York), with just a quarter of an hour of additional charging giving another three hours of listening time. But the biggest vaunted difference is the voice-activated bit. As L’Oreal would say, “Here comes the science!” The new AirPods are always listening, just waiting for their owner to utter “Hey Siri!”. This means that, provided they are connected to an iPhone or Apple Watch, AirPods users have access to a virtual assistant literally without having to lift a finger, wherever they are and whatever they are doing. They might be in the gym, asking Siri to read out their emails and dictate replies. They might be in the car, asking Siri for directions to the nearest petrol station, or arriving at the train station and asking Siri to turn on the smart heating at home. Of course, all of this is possible using an iPhone, but that involves taking it out of your pocket and interrupting whatever you were listening to!
So what does all this mean? Well, first off, speaking to virtual assistants as if they were real people, at any time and in any place, will become the norm, and access to the Internet will increasingly come via voice.
This is significant because, in order to optimise the omnichannel customer experience, brands will need to know how their customers are interacting. For instance, have purchases or search requests been made via Alexa, Siri or Cortana, or has a customer made a purchase or search ‘traditionally’ on the website, in the app or via social channels? Data management systems need to be updated to capture this information, as it will be vital for future segmentation and promotional decisions, not to mention SEO. Moreover, if an organisation’s data does not capture the share of searches or purchases coming through voice, it will be impossible to optimise the front-end delivery systems that pull customers towards the brand.
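To make the idea concrete, here is a minimal, purely illustrative sketch of what tagging events with their originating channel might look like. All of the names here (`Channel`, `PurchaseEvent`, `voice_share`) are hypothetical; real data management systems will have their own schemas, but the principle is the same: record the channel on every event so the voice share can be measured at all.

```python
# Hypothetical sketch: tag each customer event with its channel so that
# voice-driven purchases/searches can be separated from web, app and social.
from dataclasses import dataclass
from enum import Enum


class Channel(Enum):
    WEB = "web"
    APP = "app"
    SOCIAL = "social"
    VOICE_SIRI = "voice_siri"
    VOICE_ALEXA = "voice_alexa"
    VOICE_CORTANA = "voice_cortana"


@dataclass
class PurchaseEvent:
    customer_id: str
    sku: str
    channel: Channel  # captured at the point of purchase/search


def voice_share(events):
    """Fraction of events that arrived via any voice assistant."""
    if not events:
        return 0.0
    voice = sum(1 for e in events if e.channel.value.startswith("voice_"))
    return voice / len(events)


events = [
    PurchaseEvent("c1", "sku-1", Channel.WEB),
    PurchaseEvent("c2", "sku-2", Channel.VOICE_SIRI),
    PurchaseEvent("c3", "sku-3", Channel.VOICE_ALEXA),
    PurchaseEvent("c4", "sku-4", Channel.APP),
]
print(voice_share(events))  # half of these example events came via voice
```

A metric like this voice share is exactly what would feed the segmentation and promotional decisions mentioned above; without the channel field on each event, it cannot be computed.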
These second-generation AirPods will inevitably move more customers towards voice, and as penetration grows, organisations need to be prepared for voice to become the main way consumers, particularly digital natives, interact with the Internet.