Mozilla’s February announcement that Firefox 22 would block third-party cookies by default matters to pretty much everyone in the ad community: advertisers, publishers, ad networks and exchanges, DSPs and DMPs, legislators and consumers — basically the entire LUMA slide. Ad placements designed to behaviorally target or retarget a consumer would be blocked (unless the consumer had previously visited the cookie’s site of origin), really shaking up the way we do business.
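For readers outside ad ops: whether a cookie counts as “third-party” comes down to whose domain set it relative to the page you’re on. Here’s a deliberately naive Python sketch of that distinction — real browsers do stricter matching against the Public Suffix List, and the domains below are made up for illustration:

```python
from urllib.parse import urlparse

def is_third_party(cookie_domain: str, page_url: str) -> bool:
    """Simplified sketch: a cookie is 'third-party' when its domain
    differs from the domain of the page the user is visiting.
    (Real browsers use Public Suffix List / eTLD+1 matching;
    this naive suffix comparison is for illustration only.)"""
    page_host = urlparse(page_url).hostname or ""
    cookie_domain = cookie_domain.lstrip(".")
    return not (page_host == cookie_domain
                or page_host.endswith("." + cookie_domain))

# An ad network's cookie on a publisher's page is third-party:
print(is_third_party("adnetwork.example", "https://news.example/story"))  # True
# The publisher's own cookie is first-party:
print(is_third_party("news.example", "https://news.example/story"))       # False
```

It’s the first case — the ad network’s cookie riding along on someone else’s page — that Firefox 22 would refuse to store by default.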
We’re already getting questions from clients about what losing third-party cookies could mean for the industry, and my colleague Andrew Allender, Senior Campaign Analyst at Centro, responded perfectly. I couldn’t have said it better myself, so I won’t try. Here’s what Andrew has to say:
Is anyone blocking third-party cookies now?
The only major browser I’m aware of that currently blocks third-party cookies is Safari, which is not a major concern given its market share of less than 10%. Most of the news on this issue concerns Firefox. Last year, Microsoft announced that Internet Explorer 10 would have “Do Not Track” enabled by default, but Mozilla’s move is more serious: rather than relying on ad servers and networks to respect the DNT flag, Firefox would block third-party tracking cookies from being stored in the first place.
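To make that distinction concrete: DNT is just a request header that a polite server may choose to honor, while cookie blocking is enforced by the browser no matter what the server does. A hypothetical server-side check might look like this (the function name and logic are illustrative, not any real ad server’s code):

```python
def should_set_tracking_cookie(request_headers: dict) -> bool:
    """With DNT, the browser still sends and accepts cookies; it merely
    asks not to be tracked via the 'DNT: 1' request header, and honoring
    that is entirely voluntary on the server's side. Cookie *blocking*,
    by contrast, happens in the browser regardless of the server."""
    return request_headers.get("DNT") != "1"

print(should_set_tracking_cookie({"DNT": "1"}))  # False: server chooses to honor the flag
print(should_set_tracking_cookie({}))            # True: user expressed no preference
```

The weakness Andrew points out falls directly out of that design: nothing forces a server to call a check like this at all.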
Will Mozilla actually release Firefox with third-party cookies disabled by default?
Microsoft stated a similar intent when Internet Explorer 10 was in development, but the setting never actually made it into the final release of the software. It’s worth noting that in its criticism of Microsoft’s initial decision, Mozilla said it wanted “to give users a voice and let them tell sites that they don’t want to be tracked.” With DNT as the default setting, “It’s not a conversation. For [it] to be effective, it must actually represent the user’s voice.”
I’ve got to believe that, at the very least, an Open Web company like Mozilla would think twice about making a sweeping decision for the entirety of its user base. Additionally, they’ve already delayed releasing this “feature” a couple of times because they’re seeing high numbers of false positives and false negatives. Their revised approach seems to be some sort of centralized database of whitelisted and blacklisted domains. Time will tell where that leads.
If this is the default, will it matter?
To be quite honest, Firefox is slipping. As of the latest data I could find, it currently holds less than 20% of the US market and is trending downward. Google’s Chrome has been trending strongly upward, and with its integration into the Google product universe and the rise of Android as an operating system, I don’t see that slowing. The overwhelming majority of Google’s revenue comes from ads, so Google is not going to risk damage to that. Internet Explorer has been holding fairly steady thanks to its familiarity to many PC users and its mission-critical status in many business settings (because IE was for so long both non-compliant with standards and the dominant browser, many websites and web tools were coded specifically for it and don’t work in other browsers). This decision could also backfire: if the lack of relevant ads gives websites viewed through Firefox a spammier feel, users may move to browsers that offer a better experience.
If this does happen, what do we do?
In the short term, we may face some struggles. Inventory for demo-, behavior-, and re-targeting will probably shrink noticeably. Even though Firefox has less than 20% of users, dropping available targeted inventory by 1/5 would have an impact. We will have to work to make sure any cookie-based targeting on a campaign is less aggressive than it’s been in the past. Additionally, we’ll have to keep an even more watchful eye on campaigns that have already been planned and make sure we’re presenting optimization and/or reallocation options early and often.
In the long term, I not only think this presents yet another opportunity for us to educate advertisers and agencies on the value of branding via digital, but I think it could be good for the industry as a whole. Coming from the publisher side, one of our big concerns was the commoditization of ad inventory. It was very hard to justify the CPMs advertisers paid when buying from us directly, since they could appear on our site and still hit only their intended audience more cheaply via a network’s cookie-based targeting. This could stop, or even reverse, the trend of declining CPMs for publishers. As with TV and radio, a site’s entire audience will now be factored into its value, something we’re already starting to see in metrics like the GRP. The end result could be that 12 or 24 months from now it will take a higher budget to hit the same audience. It will be even more important to advertisers and agencies that they pay only for the best-performing publishers and that performance is monitored not only regularly, but competently.
The Ad Spandex Take: Wrap-up from Crawford
I like Andrew’s take because it’s positive, and that’s where we need to be. Not having third-party cookies to play with is certainly a challenge, but if we’re not up for finding creative solutions, we’re in the wrong business. Let’s be proactive in coming up with answers when we see hints like this of what might happen in the future. At the end of the day, the winners will be those of us who meet this head-on instead of running scared. Creative solutions will get us where we need to be.