A decade of data suppliers

I sat on a panel several weeks ago discussing a subject that is very close to my heart. The session dealt with the way in which 3rd party data providers position themselves, productise their data assets, and build strategic relationships with agencies and brands, and how this has evolved over the past decade or so.

Having led programmatic trading teams and handled global buy-side strategy within a leading 3rd party data provider, and now overseeing data strategy and data partnerships at a major media group, I feel I am in a strong position to comment on the changing relationship between agencies and their data suppliers, and on the evolving ways in which the two interface.

Stage 1 – The Rise of Audience Data

It’s odd to think that the first data exchanges began to come to market over a decade ago, with the emergence of the then primitive programmatic media ecosystem. The promise of real-time inventory evaluation, coupled with a more sophisticated understanding of the individuals we were advertising to, led to an unprecedented drive towards addressability and the application of data. 3rd party data exchanges like Bluekai, Lotame, Eyeota, and VisualDNA became common names bandied around the programmatic trading floor, accessible in the major buying platforms to support targeting, personalisation and bid optimisation.

In the beginning, the relationship between agencies and data providers was low-key: standard taxonomies searchable within the DSPs, periodic reviews of the data supplier’s processes, segmentation methods and distribution points, and a healthy amount of networking and relationship building.

The goal of the agency – be able to deploy audience targeting, in programmatic, at large scale across core data categories

The goal of the data supplier – be scalable, and be available across as many categories and sub-categories as possible

Stage 2 – The Performance Gap

Once audience targeting was established as a regular feature of the programmatic media plan, questions began to arise over the performance and effectiveness of the data sets being used. Most of these questions focussed on relative performance against a given KPI, and on the relative efficiency of the audience line item given the incremental data costs. Audience segmentation became something that was “optimised out” or down-weighted as the campaign went on; this was a time when only the strongest segments survived and were used on a recurring basis.

Data providers needed to be able to prove performance, to demonstrate that they could exceed the current benchmarks against On Target Percentage (OTP). Although the relationship between agencies and data providers was still hands-off, we began to see the emergence of an aftersales/service model, with data providers incentivised to develop bespoke segmentation on behalf of a campaign to drive perceived value above and beyond “off the shelf” segmentation, and to stay in contact with traders throughout the campaign to hold their position on plan and optimise alongside it.

The goal of the agency – be able to prove the performance of audience segmentation to their clients, and to find ways of creating new opportunities beyond “off the shelf” audience segmentation

The goal of the data supplier – be cost effective, exceed the OTP benchmark and drive an increased perception of value through customisation and support/service

Stage 3 – The Age of Quality Control

If there were a clear and distinct V1.0 and V2.0 of the data exchange ecosystem, it could be strongly argued that May 25th 2018 marked the transition. The introduction of the General Data Protection Regulation (GDPR), alongside an ever-increasing scepticism in the market over quality control within the data marketplace, led to a critical step-change in the approach to data sourcing, privacy, and compliance taken by data providers, publishers, agencies, brands and all entities in the advertising and marketing supply chain and beyond.

Such significant and wide-sweeping legislation led to a new-found focus on the quality and compliance of data providers. Agency due diligence processes became more thorough, contracting processes became more selective and rigorous, and as a result, data providers were required to audit their supply, to establish proper checks and balances, and to rebase quality standards, looking not only at user consent, but also at human verification, validation of segmentation techniques, etc.

The goal of the agency – to prove that suppliers are sourcing data in accordance with the GDPR and other local privacy regulations, and to ensure data is human-verified, validated by independent parties, etc.

The goal of the data supplier – have the best approach to ensuring quality throughout the process of sourcing, developing and validating segmentation, and ensure privacy and compliance due diligence is pushed to the fore

Stage 4 – The Strategic Partner 

Increased scrutiny of data quality, and of a vendor’s approach to data collection, verification and compliance, coupled with the requirement to conduct enhanced due diligence and thorough contracting with any given data provider, has inevitably led agencies and brands to take a more strategic approach to data partnerships, focussing on fewer suppliers, albeit with a far greater depth of relationship across those partners. Agencies and brands have adopted procurement practices that tilt the scales towards upstream acquisition of data, as opposed to activation-based data acquisition within the bidding platform, and have leveraged strategic data partnerships to access more raw data sets, which have subsequently been used to design and develop in-house utilities, applications and solutions.

The goal of the agency – to work with data exchanges in more nuanced ways, focussing on raw data, enabling the development of in-house data-driven advertising capability.

The goal of the data supplier – develop strategic relationships with data/tech specialists further upstream within the agency/group. Support raw data licenses. Identify and drive direct discussions with brands that are investing heavily in data and tech infrastructure.
