blog: Don Marti


figuring out the CCPA escalation path

19 September 2020

(Update 10 Sep 2021: add more material on GPC)

(Update 8 Oct 2020: add material on GPC, copy edit)

The later you catch a software bug, the more expensive it is to fix. Catching a syntax error while you're typing code costs practically nothing, fixing a broken test is more expensive, and deploying an update to users can cost even more.

The California Consumer Privacy Act (CCPA) provides for a similar escalation path. It's helpful to look at all the ways to handle CCPA obligations in order from lowest overhead to highest. Just as software developers are learning to find and fix threading bugs with build-time borrow checking before a difficult bug can make it to a customer, companies are learning how to handle CCPA rights at the easiest, fastest, cheapest level.

Here are the levels, cheapest to most expensive.

browser Do Not Sell. The CCPA regulations (PDF) say,

If a business collects personal information from consumers online, the business shall treat user-enabled global privacy controls, such as a browser plug-in or privacy setting, device setting, or other mechanism, that communicate or signal the consumer’s choice to opt-out of the sale of their personal information as a valid request submitted pursuant to Civil Code section 1798.120 for that browser or device, or, if known, for the consumer.

A standard signal to implement this, called Global Privacy Control, has just been announced, and is being tested across a variety of browsers, extensions, and sites. It is technically similar to the old Do Not Track, with the big difference that it is now intended to be legally required (updated for 2021).
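In practice, GPC is expressed as an HTTP request header, `Sec-GPC: 1` (there is also a `navigator.globalPrivacyControl` property exposed to scripts). Here's a minimal sketch of how a site might detect the signal server-side; the header name and value come from the GPC proposal, while the function name and the plain-dict header representation are just illustrative assumptions:

```python
def wants_do_not_sell(headers):
    """Return True if the request carries a GPC opt-out signal.

    `headers` is assumed to be a plain mapping of HTTP request header
    names to values; keys are normalized here to keep the sketch
    framework-independent.
    """
    normalized = {k.lower(): v.strip() for k, v in headers.items()}
    # Per the GPC proposal, the only meaningful field value is "1".
    return normalized.get("sec-gpc") == "1"

print(wants_do_not_sell({"Sec-GPC": "1"}))      # → True (treat as opt out)
print(wants_do_not_sell({"User-Agent": "x"}))   # → False (no signal)
```

A real deployment would wire this into whatever request object the web framework provides, and record the opt out for that browser or, if known, for the consumer, as the regulations require.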

authorized agent Do Not Sell. This is the first escalation for a mishandled browser Do Not Sell, and the lowest level for a company to which the consumer does not have a direct HTTP connection. Agent opt outs can be bundled and made easy to handle, and the agent has an incentive to cooperate with the company, to cut its own costs and increase user satisfaction. Bulk handling of agent opt outs is an easy win for DSAR vendors, lowering the average cost per CCPA transaction.

Do Not Sell My Personal Information emails and clicks. These are similar to a GDPR Article 21 objection, but they can't be handled with the same processes used for the GDPR. They're lightweight because no user identity verification is required (although the company can do an anti-fraud check), but still heavier than handling an agent opt out. If you get an opt out, it's cheaper to act on it than to push back and make the consumer escalate to a Right to Know or Right to Delete.

Right to Know and Right to Delete. If a Do Not Sell gets subjected to illegal verification steps or other dark patterns, then the consumer can escalate to a Right to Know, followed by a Right to Delete. According to vendors of Data Subject Access Request (DSAR) software, manual handling of a Right to Know can cost $1,400-$10,000. Software and processes are going to bring this down, but realistically nowhere near the cost of handling the browser signal or the agent opt out correctly in the first place.

Companies that try to apply the same user experience to a Right to Know as to a more common and less expensive opt out are likely to have to deal with a higher volume of Right to Know requests.

Somewhere along this escalation path, users can make automated or manual reports to the office of the Attorney General, to help them pick targets for enforcement. They certainly don't have the staff time to go after most CCPA violations, but reports from consumers and consumer organizations will help them pick some high-priority targets.

Some experts are recommending relying on dark patterns to limit the number of CCPA requests that companies have to deal with. (I still don't like the term "requests" here, since the company has to comply with them, and a legally binding communication from the customer to the company is never otherwise called a request. But "request" is in the regulations.) We're going to discover, though, that the dark patterns approach is flawed. Yes, a lot of consumers are going to give up and go away when they hit a dark pattern, such as an extra verification step not allowed by the law, but a fraction of the consumers are going to escalate. A company that chooses dark patterns instead of straightforward compliance is making a high-stakes bet on what fraction of consumers will escalate.

We don’t stop until hate for profit stops

Endorsement: Yes on Prop. 24. It’s not perfect, but it would improve online privacy

Killing the Ad Business. From the inside.

Opt-In Value Exchange Ads: Examples and Best Practices

How Prop 24 Regulates Big Tech and Data Brokers that Sell Your Personal Data