Voice and gaze-based technologies are giving marketers an eyeful and earful of fresh analytics.
Eyes and ears tell all
Marketers often rely on event attendees to provide feedback on an experience, whether through a survey or by measuring activities like app downloads. These controlled methods of collecting data, however, don’t always produce unbiased, honest results. Eye-gaze technology and voice analytics are poised to provide an unfiltered look at what attendees are focusing on and, perhaps, what they’re saying about your brand or product later on.
Smashbox Cosmetics and ModiFace in July released insights on the “impact of eye-tracking on mobile commerce.” The two collaborated on the iOS app MAKEUP, which features Smashbox products and ModiFace’s face-tracking and video makeup rendering technology “along with a unique ability to track the location on the screen that the user is looking at based on their video.” The app let the brand see what users were looking at, overlaying a heat map on areas of the screen that “receive more attention,” and measure how long a user reads a product’s specs on screen. Smashbox could then determine the most popular cosmetics category, the most popular shades and more.
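ModiFace hasn’t published how its heat maps are built, but the idea described above can be sketched simply: bin each gaze sample into a cell of a screen grid and weight it by dwell time, so the cells that “receive more attention” accumulate the most heat. The sample data, grid size and function name below are illustrative assumptions, not the app’s actual implementation.

```python
from collections import defaultdict

def gaze_heatmap(samples, screen_w, screen_h, grid=10):
    """Bin (x, y, dwell_seconds) gaze samples into a grid-cell heat map.

    `samples` is hypothetical data; real eye trackers emit similar
    timestamped gaze coordinates.
    """
    heat = defaultdict(float)
    for x, y, dwell in samples:
        # Map pixel coordinates to a grid cell, clamping to the last cell.
        col = min(int(x / screen_w * grid), grid - 1)
        row = min(int(y / screen_h * grid), grid - 1)
        heat[(row, col)] += dwell
    return dict(heat)

# Example: three gaze samples on a 1080x1920 portrait screen.
samples = [(540, 300, 1.2), (560, 320, 0.8), (100, 1800, 0.3)]
hot = gaze_heatmap(samples, 1080, 1920)
hottest = max(hot, key=hot.get)  # the cell attracting the most attention
```

From a grid like this, questions such as “how long did users dwell on the product specs?” reduce to summing the cells that overlap that screen region.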
On the voice recognition front, researchers at the Mitsubishi Electric Research Laboratory in Cambridge, MA, are training artificial intelligence to “pick out individual voices” from a crowd. The “trick” identifies features in a voice “that can be used to track a single person in conversation.” While the technology could improve speech recognition devices like Amazon Echo (so it can hear you better over a dinner party), it could also mean opportunities for marketers down the road: listening in on feedback and conversations, perhaps from individual influencers, or picking up on keywords that could then be aggregated and analyzed post-event.
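The “aggregated and analyzed post-event” step is straightforward once speech has been separated and transcribed. As a rough sketch, assuming the transcripts come from a speech-to-text service (the transcripts, keyword list and function name here are all hypothetical), keyword mentions can be tallied across conversations like this:

```python
import re
from collections import Counter

def keyword_counts(transcripts, keywords):
    """Count how often tracked keywords appear across event transcripts.

    Matching is case-insensitive and on whole words only, so "booth"
    does not match "boothby".
    """
    patterns = {kw: re.compile(r"\b" + re.escape(kw) + r"\b", re.IGNORECASE)
                for kw in keywords}
    counts = Counter()
    for text in transcripts:
        for kw, pat in patterns.items():
            counts[kw] += len(pat.findall(text))
    return counts

# Hypothetical transcripts of two attendee conversations.
transcripts = [
    "The demo booth was great, loved the demo.",
    "Did you try the new headset at the booth?",
]
counts = keyword_counts(transcripts, ["demo", "booth"])
# counts["demo"] == 2, counts["booth"] == 2
```

In practice the interesting part is upstream, in the voice-separation model itself; this tally is only the final aggregation marketers would see.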
Beyond analytics, eye-gaze and voice technologies are improving event accessibility and supporting increasingly global event audiences. Over the past few years, a host of eye-tracking startups have been acquired by Silicon Valley heavy hitters Google, Facebook and Apple, and just this past August, Microsoft joined them, adding a test program for Windows 10 with an eye-controlled interface that lets users navigate the operating system with their eyes. The test program works with the Tobii Eye Tracker 4C device and “is primarily designed to help those suffering from neuro-muscular diseases like ALS and other disabilities to control the various interface elements in Windows 10 without a traditional mouse and keyboard,” The Verge reports. Meanwhile, London’s National Theater is introducing AR-powered smart glasses for the hearing impaired that display closed captioning live, right before users’ eyes.
Voice technologies are likewise poised to transform event experiences for global audiences. Google Pixel Buds, for example, are earphones that can translate up to 40 languages in your ear as people are speaking.
Sounds like success.