Should Internet platforms disclose to advertisers when their ads sponsor illegal activity?
27 November 2018
Why do advertisers keep sponsoring illegal activity on big Internet platforms such as YouTube and Facebook? Platforms are hosting so many copyright-infringing copies, cryptocurrency scams, and state-sponsored campaign finance violations, and even functioning as the IT department for genocide (d00d wtf?), that it's hard to understand why so many good brands are still there.
A big part of the problem is that even though platforms do invest a lot of time and money in removing illegal activity, advertisers never find out about it. If you're a CMO deciding where to spend your ad budget, your experience of a highly customized social platform is completely different from what most of your brand's customers see. As a CMO, you see content from people in your social and professional circles, and ads from high-bidding advertisers who want to sell you high-margin items such as conferences and SaaS subscriptions. You don't see as much of the bad stuff. Advertisers keep paying the bills for illegal activity because they lack the information they would need to stop.
It's time for Internet platforms to stop hiding this information.
When a platform blocks or restricts distribution of content, require disclosure to the advertisers affected. (The link goes to my notes for an upcoming meeting about this. There's a GitHub link on the page, and suggestions are welcome.)