Sky AdSmart proves worth with a little help from Affectiva Emotion AI

Fruits from the formidable combined addressable advertising initiative between rival UK operators Sky and Virgin Media – applying the Sky AdSmart technology together with Liberty Global’s ad tech system – have finally blossomed after five years of field trials. US-based readers, stick with us please, because Comcast’s newly launched On Addressability initiative is essentially Sky AdSmart in disguise.

Results across the board are irresistible – a 48% reduction in channel switching; 35% increase in ad engagement; 20% increase in purchase metrics for brands new to TV; 13% higher emotional response to TV ads; and 49% increase in ad recall, to cite just a few achievements.

But while we have come to regard AdSmart as one of the most advanced addressable advertising systems on the market (and apparently one of the most expensive, too), with a flexible approach allowing integration with third party APIs, Sky Media now mentions two “groundbreaking” new technologies – facial decoding and emotion analytics. Neither term has been tied to Sky AdSmart before and unfortunately neither area of technology is elaborated upon within the report. We have contacted Sky asking if someone from the AdSmart team would be willing to have their brain picked.

While Sky Media fails to explain how facial decoding works, it was more than happy to credit the technique with a 22% higher emotional response among AdSmart audiences viewing TV ads compared to linear audiences. It says creating emotion in turn creates memories, which is crucial for advertisers as this drives recognition and ultimately translates into positive return on investment.

Fortunately, Sky Media did name emotion measurement company Affectiva, working alongside research agency Differentology, as the partner used to measure the emotional reactions of AdSmart users. So, we headed over to the Affectiva website for a lesson in Emotion AI. The MIT spin-out company explains that its software can detect not only human emotions, but complex cognitive states, such as drowsiness and distraction – and, in the future, it will be able to understand human activities, interactions and objects people use.

To solve tasks such as face detection and tracking, speaker diarization, voice activity detection, and emotion classification from face and voice, Affectiva uses two specific deep learning architectures – Convolutional Neural Networks (CNNs) and Recurrent Neural Networks (RNNs) – across a combination of custom layers and off-the-shelf network architectures.
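Affectiva has not published its model code, but the general pattern it describes – a convolutional front-end extracting per-frame face features, feeding a recurrent layer that tracks how expressions evolve over time – can be sketched as follows. All layer sizes, input shapes and the emotion label set below are illustrative assumptions for the sketch, not Affectiva’s actual architecture:

```python
# Minimal CNN + RNN sketch for per-clip emotion classification (illustrative only).
import torch
import torch.nn as nn

EMOTIONS = ["joy", "surprise", "anger", "sadness", "neutral"]  # assumed label set


class CnnRnnEmotionClassifier(nn.Module):
    def __init__(self, num_classes: int = len(EMOTIONS)):
        super().__init__()
        # CNN: encodes each 64x64 grayscale face crop into a 128-d feature vector
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Flatten(),
            nn.Linear(32 * 16 * 16, 128), nn.ReLU(),
        )
        # RNN (LSTM): models the temporal evolution of expressions across frames
        self.rnn = nn.LSTM(input_size=128, hidden_size=64, batch_first=True)
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 1, 64, 64) sequence of cropped face images
        b, t, c, h, w = frames.shape
        features = self.cnn(frames.view(b * t, c, h, w)).view(b, t, -1)
        _, (hidden, _) = self.rnn(features)
        return self.classifier(hidden[-1])  # logits over emotion classes


if __name__ == "__main__":
    model = CnnRnnEmotionClassifier()
    clip = torch.randn(2, 30, 1, 64, 64)  # 2 clips of 30 face frames each
    print(model(clip).shape)  # torch.Size([2, 5])
```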

However, there is a certain mobile flavor to Affectiva’s explorations, while Sky AdSmart is associated purely with addressability at the set top box. Affectiva explains how its deep learning models provide accurate, real-time estimates of emotions on mobile devices, whereas most deep learning models can only run as cloud-based APIs on specialized hardware – computationally expensive scenarios in which large GPUs are required to compute results quickly. On-device performance therefore requires exploring trade-offs between model complexity and model accuracy.
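Affectiva does not disclose how it makes that trade-off, but one common way to explore it is to shrink or quantize a trained model and compare its size (and, on a held-out set, its accuracy) against the full-precision original. A minimal sketch using PyTorch’s dynamic quantization, with an arbitrary stand-in classifier head – purely illustrative of the generic technique, not of Affectiva’s pipeline:

```python
# Illustrative model-size vs accuracy trade-off: dynamic quantization converts
# weights to int8, shrinking the model and speeding up CPU inference, usually
# at a small cost in accuracy. Layer sizes here are arbitrary for the demo.
import io
import torch
import torch.nn as nn


def serialized_size_mb(model: nn.Module) -> float:
    buffer = io.BytesIO()
    torch.save(model.state_dict(), buffer)
    return buffer.getbuffer().nbytes / 1e6


# Stand-in classifier head over 128-d features (assumed sizes)
model_fp32 = nn.Sequential(
    nn.Linear(128, 256), nn.ReLU(),
    nn.Linear(256, 5),
)

model_int8 = torch.quantization.quantize_dynamic(
    model_fp32, {nn.Linear}, dtype=torch.qint8
)

print(f"fp32 model: {serialized_size_mb(model_fp32):.2f} MB")
print(f"int8 model: {serialized_size_mb(model_int8):.2f} MB")

# Both models produce logits for the same input; evaluating each on held-out
# data would quantify how much accuracy the smaller model gives up.
x = torch.randn(1, 128)
print(model_fp32(x).shape, model_int8(x).shape)
```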

“Building AI that understands all things human is no small feat. But with our proven approaches to deep learning, computer vision and speech science, and the massive amounts of real-world people data we continue to collect and annotate, we’re well on our way. What’s more, by taking a multi-modal approach to human perception – analyzing both facial and vocal expressions – we’re able to get a more comprehensive and accurate understanding of human states,” states Affectiva.

Its business model is built on how devices that are fundamentally designed to interface with humans are significantly hindered without a deep understanding of what’s happening with the users they interact with. “If we want these AI assistants to be truly useful, they need to be able to understand us on a deeper level,” it philosophizes.

As well as measuring emotional response, the emotion analytics methodology measured engagement – defined in the case of AdSmart by how attentive the viewer was to the advert while it was on screen. Results showed engagement levels up to 35% higher for AdSmart users, with engagement on average 21% higher than for linear audiences.

The linking of addressable ad systems between Sky AdSmart and Virgin Media essentially enabled programmatic deals to be offered across both TV platforms in a single pass – which Comcast has replicated through its On Addressability initiative. This involves Comcast, Cox and Charter attacking the addressable marketplace from the standpoint that addressability should begin with the content distributors, as only that way can the advertising ecosystem as a whole benefit.

Originally built around Cisco technologies, Sky AdSmart has also been tested by Comcast with NBCUniversal’s Audience Studio. But while operators like Comcast, Sky, Charter and Cox are embracing addressable advertising, some broadcasters and networks have been unable to follow suit because they have been denied real-time access to viewership information, as well as the ability to insert ads dynamically in individual homes.
