On Thursday, November 17, 2022, the Colorado Attorney General’s office held the final of three stakeholder meetings allowing the public to provide oral comment on the draft Colorado Privacy Act (CPA) rules released on October 10, 2022. Onemata also attended the first and second stakeholder meetings, and we encourage you to read those summaries as well.
Assistant Attorneys General Jill Szewczyk and Stevie DeGroff continued to uphold the Colorado Office of the Attorney General’s (OAG) commitment to robust stakeholder engagement, soliciting ideas and feedback from the public to inform the CPA regulations. The agenda focused on the following provisions of the CPA draft rules:
Profiling: Part 9 of the draft rules clarifies the requirements on controllers that process personal data for the purposes of profiling. The CPA includes several requirements for profiling activities; specifically, controllers must tell consumers how their personal data is used for profiling, conduct and document Data Protection Assessments (DPAs) prior to processing personal data for profiling, and provide consumers the right to opt out of the processing of personal data for the purpose of profiling in certain situations.
Consent/Dark Patterns: Part 7 of the draft rules clarifies the CPA’s requirements related to requesting and obtaining consent, including the prohibition against obtaining consumer agreement through dark patterns.
Definitions: Part 2 of the draft rules provides defined terms, including terms such as “Biometric Data” and “Bona Fide Loyalty Program”, which are not defined in the CPA statute.
Risks and Harms of Profiling
The CPA requires controllers to conduct a DPA prior to using personal data for profiling in certain situations. Commenters provided the OAG numerous suggestions for how the rules could be clarified to help controllers best comply with the CPA. First, the draft rules require controllers to conduct a DPA if the profiling presents certain “reasonably foreseeable” risks. It was suggested that the OAG replace the “reasonably foreseeable” language with “likely” in order to align with the European Union’s General Data Protection Regulation.
A DPA is also required when profiling would present a “reasonably foreseeable risk” of “substantial injury” to a consumer. The draft rules provide a vague example of “substantial injury” as “a small harm to a large number of consumers”. In the stakeholder meeting it was noted that the draft rules do not provide reasonable guidance for interpreting “harm”, and that the language used seems to reflect tort law, which likely wasn’t the OAG’s intent.
It was also noted that even though the CPA statute does not connect “profiling” to ad targeting, that didn’t stop the OAG from adding language to the draft rules requiring controllers to describe in the privacy notice how profiling is used to serve ads. The AdTech industry has been greatly affected by the waves of increased privacy regulation and scrutiny on the industry. So it should come as no surprise that the OAG would propose such a requirement, or that the AdTech industry would notice and push back.
Too Many Consents or Too Few Consents: Which is Better?
Consent is an integral part of the foundation of privacy regulation. As such, it’s critical that regulators successfully walk the fine line between ease of controller implementation, and thus consumer understanding, on one side and consumer consent fatigue on the other.
Colorado is one of the first states to explore the concept of granular consent. This is important because the more granular the consent request, the more informed the consumer, and thus the consent, will be. However, too much granularity will require a controller to present a consumer with numerous consent requests, which may overwhelm, confuse, or annoy the consumer. On the other side of that coin, the more consents the consumer gives, the more granular the opt-out rights become as well, potentially making opt-outs more burdensome for both the consumer and industry.
As also noted in the stakeholder meeting, industry today faces conflicting consents, and implementing granular consents will only exacerbate that issue unless mitigations are put in place. For example, a consumer gives consent to allow their personal data to be used for advertising and marketing purposes. Some time passes and that same consumer opts out, either through the controller directly or a universal opt-out mechanism. The controller can’t tell whether the consumer changed their mind, which they have the right to do, or made a mistake. As such, commenters floated the idea of using pop-ups or something similar to ask the consumer to confirm or verify their action, whether opting in or opting out, whenever that action conflicts with a prior choice made by that consumer. Generally speaking, people are wary of pop-ups because they detract from the user experience. So if consumers will be asked to confirm or verify their actions, it will need to be done in the most user-friendly way possible.
Intent Behind Dark Patterns
Another important discussion topic pertained to dark patterns and a controller’s intent. We can all agree that dark patterns are manipulative and not user friendly. However, the intent behind a dark pattern can be important for an Attorney General’s office to determine if and how to take an enforcement action. For example, under the current CPA and draft rules, intentionally making it difficult to deactivate an account by forcing a consumer through 10 levels of options and an accidentally broken link that prevents a consumer from exercising a data right are both dark patterns. Clearly the first is intentional and the second is not. Incorporating “intent” into the dark pattern rules, and enforcement thereof, could be critical to appropriate regulation and enforcement.
Definitions
There were three definitions that commenters consistently opined on: “publicly available information”, “biometric data” and “biometric identifiers”.
The definition of “publicly available information” has clear and obvious implications for the media and press. The draft rules currently exclude from that definition both inferences drawn from publicly available information and publicly available information combined with non-public information. Numerous representatives from the media and press stated that this narrow definition of “publicly available information” not only conflicts with other Colorado laws, but will have a significant negative effect on their ability to investigate, author and publish, and thus on their First Amendment rights.
Many commenters also stated that they felt it was unnecessary to have two definitions, one for “biometric data” and one for “biometric identifiers”. It was suggested to remove one and have the remaining definition align with the Virginia, Utah and Connecticut state privacy laws for optimum harmonization and interoperability.
All in all, these three stakeholder meetings with the Colorado OAG were valuable and productive, and we look forward to continuing our engagement with their office to facilitate robust, effective, yet industry-friendly privacy regulation.