Consumer Protection

New California Privacy Ballot Initiative Would Expand the CCPA

A proposed ballot initiative in California known as the California Privacy Rights Act, which is likely to pass if placed on the 2020 ballot, would both clarify and expand the existing California Consumer Privacy Act. Companies doing business in the state should closely monitor these developments and prepare for compliance, as we outline in this article.

A California ballot initiative known as the California Privacy Rights Act (CPRA) would clarify and expand the California Consumer Privacy Act (CCPA), granting significant new rights to consumers and imposing additional liability risks on companies doing business in the state. The CPRA is an update to the California Privacy Rights and Enforcement Act (CPREA) ballot initiative, proposed in late 2019 by Californians for Consumer Privacy, which sought to broadly amend the CCPA and to prevent future changes that would undermine its consumer protections.

The proposed ballot initiative, submitted by the architects of the CCPA, garnered 900,000 signatures, far more than the roughly 625,000 necessary for certification on the 2020 ballot. Early polling reportedly shows strong support for the measure, so assuming the signatures are approved and the CPRA is placed on the ballot, it is considered likely to pass and to take effect on January 1, 2023.

The CPRA proposes a myriad of changes, and this article will not address them all. What follows is a discussion of the most significant changes for businesses and consumers in California, followed by enforcement and implementation considerations.

New Clarifications, Rights and Responsibilities

In a number of areas, the CPRA would modify the current CCPA in ways that are likely to be welcomed by companies grappling with the often ambiguous obligations under the current law:

  • “Personal information” would no longer include information that is manifestly made public by the individual or the media.
  • Businesses that receive deletion requests would be expressly permitted to maintain records of these requests for compliance purposes.
  • Consumers could no longer require a business to generate a list of “the categories of personal information it has collected about that consumer” in response to access requests.
  • “Service providers” and “contractors” (a new term that appears to replace the “third party” contract provisions) would not need to respond directly to consumer requests to access or delete information.

However, these changes are largely overshadowed by the initiative’s imposition of significant new rights for consumers and responsibilities for businesses subject to the CCPA. These include the following requirements:

  • Businesses would need to contend with a new opt-out right to “Limit the Use of My Sensitive Personal Information,” which would require enhanced scrutiny of business practices involving certain “sensitive” categories of information. These sensitive categories of information are reminiscent of (but broader than) the categories of information typically regulated by US data breach notification statutes or are considered “special categories” under the EU General Data Protection Regulation. For purposes of the CPRA, “sensitive” categories will include certain government identifiers (Social Security number, driver’s license, state identification card or passport number); a [...]

    Continue Reading



Public Backlash Calls Use of Facial Recognition Systems into Question

In recent weeks and months, legal and technical issues related to use of facial recognition systems in the United States have received national attention, including concerns that the technology lacks accuracy in identifying non-white individuals and that its widespread use by police departments may play a role in racially discriminatory policing. Privacy considerations will play a key role in the ongoing debate over the future of facial recognition technology.

Facial recognition systems (FRS) are automated or semi-automated technologies that analyze an individual’s features by extracting facial patterns from video or still images. FRS use attributes or features of an individual’s face to create data that can be used for the unique personal identification of a specific individual. FRS use has grown exponentially in recent years. In addition to widespread adoption by law enforcement agencies, FRS are also frequently used in retail, banking and security sectors, such as airport screening. Particularly in recent weeks and months, legal and technical issues associated with FRS have come to the forefront, including concerns that the technology lacks accuracy in identifying non-white individuals and that its widespread use by police departments may play a role in racially discriminatory policing.

In response to the global Coronavirus (COVID-19) pandemic, public health agencies and private sector companies have considered ways that FRS might be used in conjunction with proximity and geolocation tracking data to control the disease’s spread. Some foreign governments have implemented extensive biometric and behavioral monitoring to track and contain the spread of the virus, and have used FRS to identify persons who have been in contact with COVID-19-positive individuals and to enforce quarantine or stay-at-home orders. By contrast, use of FRS in the United States already faced opposition because of pre-COVID-19 data privacy concerns, and has encountered increased backlash after the civil rights protests of the past month due to concerns over the technology’s accuracy and accompanying questions regarding its use by law enforcement agencies.

Accuracy Concerns

There are currently no industry standards for the development of FRS, and as a result, FRS algorithms differ significantly in accuracy. A December 2019 National Institute of Standards and Technology (NIST) study, the third in a series conducted through its Face Recognition Vendor Test program, evaluated the effects of factors such as race and sex on facial recognition software. The study analyzed 189 facial recognition algorithms from 99 developers, using collections of photographs with approximately 18 million images of eight million people pulled from databases provided by the US Department of State, the Department of Homeland Security and the Federal Bureau of Investigation. The study found disproportionately higher false positive rates for African American, Asian and Native American faces in one-to-one matching, and higher false positive rates for African American females in one-to-many matching, placing African American females at the greatest risk of misidentification. While law enforcement is encouraged to adopt a high threshold recognition percentage—often 99%—for the use of FRS, in reality police departments exercise [...]

Continue Reading




COVID-19 Causing a Surge in E-Commerce—Is Your Website Accessible?

Stay-at-home orders and business closures during the Coronavirus (COVID-19) pandemic have led to a sharp increase in online shopping. While e-commerce has helped businesses stay afloat during this challenging economic time, there has also been a spike in litigation alleging that certain websites are not accessible to individuals with disabilities. In an article for Bloomberg Law, Jeremy White, Matthew Cin and Brian Long review the legal landscape governing the accessibility of websites, including specific rules that apply to the healthcare industry, and explore best practices for companies to mitigate their risk of facing a website accessibility lawsuit.

Click here to read the full article.




Washington State Takes the Lead in CCPA Copycat Legislation Race, Trends Emerge

Since the California Consumer Privacy Act (CCPA) took effect on January 1, 2020, “copycat” legislation has been introduced at a dizzying pace by state legislatures across the country. Taking their cues from CCPA, at last count 16 states have borrowed language from California’s watershed law regarding consumer notices, data subject rights requests, and definitions of “personal information,” “sale” of data and other key items. The likely intent is to provide equal (or, in some cases, greater) protections to the residents of their states.

As a practical matter, however, none of the proposed laws is identical to CCPA (nor to each other); some look to the EU General Data Protection Regulation (GDPR), and each takes a complex approach that requires careful reading. The proposed Washington Privacy Act (SB 6281) has been touted as the most comprehensive data protection law in the United States and combines elements of CCPA and GDPR, adding specific protections for biometric information. Late last week, the Washington House added significant enforcement “teeth” by passing an amendment that would provide a private right of action under the Washington Consumer Protection Act for any violation of the Privacy Act.

Despite the lack of uniformity among the recently proposed bills across the country, three key trends are emerging:

Trend #1 – Increased Push for a Private Right of Action

In Washington, pending legislation would extend the private right of action beyond alleged harm arising from data breaches to any violation of the proposed Washington Privacy Act. While prior versions of the legislation vested exclusive enforcement authority in the Washington Attorney General—with penalties up to $7,500 per violation—late last week, the Innovation, Technology and Economic Development Committee in the Washington House approved an amendment to SB 6281 under which any violation of the Privacy Act would be deemed a per se violation of Washington’s Consumer Protection Act. While it is unclear exactly how damages will ultimately be calculated, a broad private right of action is a significant enforcement mechanism for Washington consumers. Supporters of the amendment argued that without a private right of action, companies would have little incentive to comply with the law because the Attorney General’s office lacks the resources to undertake many enforcement actions.

Recent bills propose legislation that closely tracks the CCPA’s private right of action for individuals who allege that they were harmed by data breaches caused by a business’ failure to implement “reasonable security” measures. Both the Illinois Data Transparency and Privacy Act (SB 2330) and New Hampshire’s proposed privacy law, HB 1680, provide consumers with a private right of action where personal information is (i) unencrypted and unredacted; and (ii) subject to exfiltration, theft or disclosure due to a failure to implement reasonable data security procedures. Consumers may seek the greater of statutory damages of $100 to $750 per consumer, per incident, or actual damages.

If Washington or other states enact data privacy laws with such provisions, the potential liability for organizations affected by data breaches or failing to comply with sweeping new privacy obligations could rapidly become [...]

Continue Reading




Privacy and Data Security: 2020 Considerations for the Insurance Industry

With the California Consumer Privacy Act of 2018 (CCPA) having taken effect on January 1, 2020, the privacy and data security landscape for insurance carriers, producers and insurtech (collectively, “insurers”) continues to grow more complex. A number of states have also recently passed laws regulating data security in the insurance industry, with the first transition period under a number of these laws set to end in 2020. Given the significant amount of sensitive personal information that insurers collect, process and retain, this trend of increased privacy and data security regulation within the insurance industry is likely to continue. To stay ahead of these new privacy and data security requirements, insurers need to take steps now to navigate the increasingly complex regulatory landscape.

How Does the CCPA Impact Insurers?

On January 1, 2020, California became the first state in the United States with comprehensive privacy legislation in effect governing the collection, use and sale of personal information of California residents (i.e., consumers) and households. Personal information is broadly defined as any information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular individual or household. The CCPA applies to “businesses,” meaning for-profit entities that do business in California, determine the purposes and means of processing consumers’ personal information and meet certain applicability thresholds.

Insurers operating in California that meet the CCPA applicability thresholds will be deemed “businesses” subject to a number of obligations under the CCPA, including disclosure obligations and requirements related to consumer privacy rights. While these obligations can be quite onerous, the vast majority of personal information that many personal line insurers collect, process and retain will likely fall under an exemption in the CCPA. The CCPA includes exemptions for:

(more…)




Available Now – 2019 Digital Health Year in Review

Throughout the past year, the healthcare and life science industries experienced a proliferation of digital health innovation that challenged traditional notions of healthcare delivery and payment, as well as product research, development and commercialization, for long-standing and new stakeholders alike. Lawmakers and regulators made meaningful progress towards modernizing the existing legal framework to both protect patients and consumers and encourage continued innovation, but these efforts still lag behind the pace of digital health innovation. As a result, some obstacles, misalignment and ambiguity remain, and 2020 will likely be another year of significant legal and regulatory change.

Click here to read our review of key developments that shaped digital health in 2019 and set the groundwork for trends in 2020.

 




Comprehensive Federal Privacy Law Still Pending

The California Consumer Privacy Act (CCPA) has forced companies across the United States (and even globally) to seriously consider how they handle the personal information they collect from consumers. By its terms, however, the CCPA only protects the privacy interests of California residents; other “copycat” privacy laws proposed or enacted in other states similarly would only protect the rights of residents of each state. Given the burden on businesses imposed by the rapid proliferation of privacy and data protection laws, including data breach notification obligations, requirements for data transfer mechanisms imposed by international data protection laws (such as the EU General Data Protection Regulation (GDPR)), and the imposition of a variety of data subject rights, a comprehensive US federal privacy bill appears increasingly overdue.

In the past year, US legislators have proposed a wide variety of data privacy laws—none of which seems to have gained significant traction. In November 2019, two new proposals were released in the Senate: the Consumer Online Privacy Rights Act (COPRA), sponsored by Senate Democrats, and the United States Consumer Data Privacy Act of 2019 (CDPA), proposed by Senate Republicans. Both proposals require covered entities to:

(more…)




CCPA Has Just Gone Into Effect, But Businesses May Need to Prepare for a New California Privacy Law

The California Consumer Privacy Act (CCPA) is not yet one month old, but movement has already started on a new California privacy law. In November 2019, the advocacy group Californians for Consumer Privacy, led by Alastair Mactaggart, the architect of CCPA, submitted a proposed California ballot initiative to the Office of the California Attorney General that would build upon the consumer privacy protections and requirements established by CCPA. In December 2019, as required under state law, California Attorney General Xavier Becerra released a title for and summary of the proposed ballot initiative, which will be known as the California Privacy Rights Act (CPRA).

Key Provisions of the CPRA

CPRA seeks to give California consumers additional control over and protection of their personal information in five core ways.

(more…)




CCPA and ‘Reasonable Security’: A Game Changer

On January 1, 2020, the California Consumer Privacy Act of 2018 (CCPA) went into effect. The CCPA applies to a wide range of companies and broadly governs the collection, use and sale of personal information of California residents (i.e., consumers and certain other individuals) and households.

The CCPA provides that consumers may seek statutory damages of between $100 and $750, or actual damages if greater, against a company in the event of a data breach of nonredacted and nonencrypted personal information that results from the company’s failure to implement reasonable security. The amount of the statutory damages depends on factors such as the nature and seriousness of the company’s misconduct, the number of violations, the persistence of the company’s misconduct, the length of time over which the misconduct occurred, and the company’s assets, liabilities and net worth. To defend against these consumer actions, a company must show that it has implemented and maintains reasonable security procedures and practices appropriate to the nature of the personal information it is processing.

This CCPA private right of action promises to shake up the data breach class action landscape in which such actions have generally been settled for small amounts or dismissed due to lack of injury. With the CCPA, companies now face potentially staggering damages in relation to a breach. To provide some context, a data breach affecting the personal information of 1,000 California consumers may result in statutory damages ranging from $100,000 to $750,000, and a data breach affecting the personal information of one million California consumers may result in statutory damages ranging from $100 million to $750 million. These potential statutory damages dwarf almost every previous large data breach settlement in the United States.
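The back-of-the-envelope exposure arithmetic above can be sketched in code. This is illustrative only, not legal advice: the function name and interface are our own, and actual awards would turn on the statutory factors (nature and persistence of misconduct, number of violations, the company’s net worth) and on whether actual damages exceed the statutory range.

```python
def ccpa_statutory_damages_range(consumers_affected: int,
                                 per_consumer_low: int = 100,
                                 per_consumer_high: int = 750) -> tuple[int, int]:
    """Illustrative low/high statutory damages for a single CCPA data breach
    incident: $100-$750 per consumer, per incident (or actual damages, if
    greater, which this sketch does not model)."""
    return (consumers_affected * per_consumer_low,
            consumers_affected * per_consumer_high)

# A breach affecting 1,000 California consumers:
low, high = ccpa_statutory_damages_range(1_000)        # $100,000 to $750,000

# A breach affecting one million California consumers:
low_m, high_m = ccpa_statutory_damages_range(1_000_000)  # $100M to $750M
```

Even the low end of the million-consumer scenario exceeds most pre-CCPA data breach settlements, which is the point made above.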

To mitigate the risk of this increased exposure, companies need to take key steps to ensure they have implemented reasonable security procedures and practices.

What Is Reasonable Security?

(more…)




Though CCPA is Now Live, Questions About Its Constitutionality Linger

As businesses have scrambled to obtain compliance with the California Consumer Privacy Act (CCPA) in recent months, questions surrounding its constitutionality have arisen. As a broad, sometimes unclear state law that imposes significant obligations on businesses around the country, CCPA may be ripe for legal challenge. The strongest bases for such challenges appear to be: (1) that CCPA violates the “Dormant Commerce Clause”; and (2) that CCPA is impermissibly vague.

Dormant Commerce Clause

The burden that CCPA imposes on out-of-state economic activity may place it in violation of the Dormant Commerce Clause, a legal doctrine created out of the Commerce Clause of the US Constitution. The Commerce Clause allows the US Congress to regulate interstate commerce; from this grant of power, courts have inferred a limitation on the authority of states to regulate interstate commerce, a doctrine coined the Dormant Commerce Clause. On this basis, courts will strike down state laws that explicitly discriminate against out-of-state actors or that regulate activity that occurs entirely outside of the state. In addition, the Dormant Commerce Clause prohibits laws that do not explicitly discriminate against out-of-state economic interests if the effect of a law is to unduly burden interstate commerce. If a state law does unduly burden out-of-state interests, a court will typically balance the burdens imposed on interstate commerce against the benefits the law creates for the state to determine whether or not the law should be upheld.

(more…)



