Group Letter Opposing Weak Industry-Backed Privacy Bill In Virginia
U.S. PIRG joined several leading privacy and consumer protection groups in a series of letters, this one sent urgently by email, to the Virginia House of Delegates urging them not to enact a weak privacy proposal that appeared to have support from Big Tech.
TO: Members of the Virginia House of Delegates
Re: Pause SB 1392, Consumer Data Protection Act – Make it real privacy protection!
We urge you not to approve the Consumer Data Protection Act (CDPA). While the bill would provide consumers with some important rights, there are problems with the legislation that significantly limit the protections it would provide for Virginians from unwanted tracking and profiling, discriminatory practices, and the sale of their personal information. We believe there is a way to more fairly balance the needs of consumers and businesses. Since the bill would not go into effect until 2023, it seems premature to pass it with such severe issues for consumers unaddressed. We appreciate the inclusion of the Governor’s request for a work group that could recommend changes next Session, but we know it is always best to negotiate changes prior to initial passage. We were not aware of this bill or HB 2307 prior to Crossover and were not able to weigh in prior to its introduction. It seems that many business entities were consulted, but consumers were not.
Average people have little insight into the ways that companies collect and use their information, and few understand the mechanisms for controlling their personal data. We have worked with legislators across the country to encourage crafting legislation that gives consumers meaningful tools to better protect their privacy. These are just some of the many changes that are needed to this bill in order to provide real privacy protection for your constituents:
Strengthen data minimization. It should be easy for consumers to understand what information is collected about them, and who has that information. Companies should have to ask consumers before they collect information at all – an opt-in framework rather than an opt-out framework. Making “opt-out” the default disempowers consumers and poses equity concerns; consumers with less time and fewer resources to figure out how their data is being used and how to opt out will inevitably be subject to more privacy violations.
Even within a consent framework, however, privacy laws should limit the data that companies can collect and share. Consumers should be able to use an online service or app safely without having to take any action, such as opting in or opting out, by including a strong data minimization requirement that limits data collection and sharing to what is reasonably necessary to provide the service they requested. A default prohibition on data sharing is preferable to an opt-out based regime which relies on users to hunt down and navigate divergent opt-out processes for hundreds of different companies.
Strengthen consumers’ control of their personal information. The definition of personal data does not include information gleaned from sources such as social media if consumers have failed to adequately restrict who can see that information. This is not a reasonable exception in light of the fact that many consumers do not understand social media privacy settings or anticipate that their information could be harvested for commercial purposes. Furthermore, the definition of personal data does not include information that is linked or can be linked to a particular household or device – a major gap, considering today’s complex data ecosystem in which information from the use of smart devices is used to target users and households without knowing their exact identities.
Consumers have no right to stop their personal data from being sold to companies’ affiliates – businesses they do not know and whose products and services may be very different from those of their parent companies. In addition, some of the rights that consumers do have are unduly limited; for instance, they can avoid seeing targeted ads based on tracking of their activities, but they cannot stop the tracking itself. Furthermore, since targeted advertising does not include a business advertising to consumers based on their activities on its own website or app, the bill does not cover the business models of Facebook and Google, which profit from targeting ads to consumers on behalf of other companies based on that data.
Consumers can only opt out of being profiled if it would result in decisions that “produce legal or similarly significant effects” on them – a determination that would be made by the controller, not the consumer. In addition, consumers cannot designate someone else to exercise their rights; for instance, an elderly woman could not ask her grown child to act on her behalf to request that her data be deleted. There are also problems with references to “known child,” which would appear to limit parents’ and guardians’ ability to act on behalf of their children if the controller did not know that the consumer was a child.
Remove the verification requirement for opting out. Consumers should not have to pass a high bar to stop companies from using their information. The CDPA gives consumers the right to opt out of certain uses of their information but requires that they verify their identities to do so. Companies decide what verification is reasonable, so this could become a barrier. We ask that you remove this unnecessary hurdle, which could prevent people from exercising their privacy rights.
No pay for privacy. No one should be charged or penalized for asking companies to respect their privacy rights. And no one should be asked to pay more in order to protect their privacy. Yet this bill allows companies to charge consumers more or provide them with a lower quality of goods or services if they have exercised their rights – for instance, to avoid targeted advertising. This provision must be removed to avoid unfairly separating Virginians into two classes: privacy “haves” and “have nots.”
Strengthen enforcement. The “right to cure” provision in the administrative enforcement section of the CDPA must be removed. It offers companies a “get-out-of-jail-free” card, significantly undermining the attorney general’s ability to take enforcement action when it deems it necessary and incentivizing companies to be lax about providing necessary privacy protections to their customers.
Finally, the bill prevents consumers from taking legal action to enforce their rights. Private rights of action provide a valuable enforcement tool for everyday people and make clear that companies will face real consequences for privacy harms. People rightly can sue over product defects, car accidents, breach of contract, or injuries to reputation – they do not have to wait for the state attorney general to bring actions on their behalf in any of these instances. Privacy harms should be no exception.
We urge you to enact meaningful privacy legislation that puts the needs of people first, even if that means delaying final enactment. It is better to get this right than to enact a law that does not provide the privacy protection that your constituents want and deserve.
Irene E. Leech, Virginia Citizens Consumer Council
Susan Grant, Consumer Federation of America
Lee Tien, Electronic Frontier Foundation
Emory Roane, Privacy Rights Clearinghouse
Ed Mierzwinski, U.S. PIRG