The California Consumer Privacy Act (CCPA) has forced companies across the United States (and even globally) to seriously consider how they handle the personal information they collect from consumers. By its terms, however, the CCPA only protects the privacy interests of California residents; other “copy-cat” privacy laws proposed or enacted in other states similarly would only protect the rights of residents of each state. Given the burden on businesses imposed by the rapid proliferation of privacy and data protection laws, including data breach notification obligations, requirements for data transfer mechanisms imposed by international data protection laws (such as the EU General Data Protection Regulation (GDPR)), and the imposition of a variety of data subject rights, a comprehensive US federal privacy bill appears increasingly overdue.

In the past year, US legislators have proposed a wide variety of data privacy laws—none of which seems to have gained significant traction. In November 2019, two new proposals were released in the Senate: the Consumer Online Privacy Rights Act (COPRA), sponsored by Senate Democrats, and the United States Consumer Data Privacy Act of 2019 (CDPA), proposed by Senate Republicans. Both proposals require covered entities to:


Continue Reading

The California Consumer Privacy Act (CCPA) is not yet one month old, but movement has already started on a new California privacy law. In November 2019, the advocacy group Californians for Consumer Privacy, led by Alastair Mactaggart, the architect of CCPA, submitted a proposed California ballot initiative to the Office of the California Attorney General that would build upon the consumer privacy protections and requirements established by CCPA. In December 2019, as required under state law, California Attorney General Xavier Becerra released a title for and summary of the proposed ballot initiative, which will be known as the California Privacy Rights Act (CPRA).

Key Provisions of the CPRA

CPRA seeks to give California consumers additional control over and protection of their personal information in five core ways.


Continue Reading

On January 1, 2020, the California Consumer Privacy Act of 2018 (CCPA) went into effect. The CCPA applies to a wide range of companies and broadly governs the collection, use and sale of personal information of California residents (i.e., consumers and certain other individuals) and households.

The CCPA provides that consumers may seek statutory damages of between $100 and $750, or actual damages if greater, against a company in the event of a data breach of nonredacted and nonencrypted personal information that results from the company’s failure to implement reasonable security. The amount of the statutory damages depends on factors such as the nature and seriousness of the company’s misconduct, the number of violations, the persistence of the company’s misconduct, the length of time over which the misconduct occurred, and the company’s assets, liabilities and net worth. To defend against these consumer actions, a company must show that it has implemented and maintains reasonable security procedures and practices appropriate to the nature of the personal information it is processing.

This CCPA private right of action promises to shake up the data breach class action landscape in which such actions have generally been settled for small amounts or dismissed due to lack of injury. With the CCPA, companies now face potentially staggering damages in relation to a breach. To provide some context, a data breach affecting the personal information of 1,000 California consumers may result in statutory damages ranging from $100,000 to $750,000, and a data breach affecting the personal information of one million California consumers may result in statutory damages ranging from $100 million to $750 million. These potential statutory damages dwarf almost every previous large data breach settlement in the United States.
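The arithmetic behind these exposure figures can be sketched in a few lines. The following is an illustrative calculation only (the function name and structure are our own, not anything prescribed by the statute), showing how the per-consumer statutory range scales with the number of affected California consumers:

```python
# Illustrative sketch of CCPA statutory damages exposure
# (Cal. Civ. Code § 1798.150: $100-$750 per consumer per incident,
# or actual damages if greater).

STATUTORY_MIN = 100  # dollars per consumer per incident
STATUTORY_MAX = 750  # dollars per consumer per incident

def statutory_damages_range(affected_consumers: int) -> tuple:
    """Return the (minimum, maximum) statutory damages exposure in dollars."""
    return (affected_consumers * STATUTORY_MIN,
            affected_consumers * STATUTORY_MAX)

print(statutory_damages_range(1_000))      # (100000, 750000)
print(statutory_damages_range(1_000_000))  # (100000000, 750000000)
```

The range for a given breach would of course depend on the court-weighed factors described above; the sketch shows only why the ceiling grows so quickly with breach size.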

To mitigate the risk of this increased exposure, companies need to take key steps to ensure they have implemented reasonable security procedures and practices.

What Is Reasonable Security?


Continue Reading

As businesses have scrambled to obtain compliance with the California Consumer Privacy Act (CCPA) in recent months, questions surrounding its constitutionality have arisen. As a broad, sometimes unclear state law that imposes significant obligations on businesses around the country, CCPA may be ripe for legal challenge. The strongest bases for such challenges appear to be: (1) that CCPA violates the “Dormant Commerce Clause”; and (2) that CCPA is impermissibly vague.

Dormant Commerce Clause

The burden that CCPA imposes on out-of-state economic activity may place it in violation of the Dormant Commerce Clause, a legal doctrine created out of the Commerce Clause of the US Constitution. The Commerce Clause allows the US Congress to regulate interstate commerce; from this grant of power, courts have inferred a limitation on the authority of states to regulate interstate commerce, a doctrine coined the Dormant Commerce Clause. On this basis, courts will strike down state laws that explicitly discriminate against out-of-state actors or that regulate activity that occurs entirely outside of the state. In addition, the Dormant Commerce Clause prohibits laws that do not explicitly discriminate against out-of-state economic interests if the effect of a law is to unduly burden interstate commerce. If a state law does unduly burden out-of-state interests, a court will typically balance the burdens imposed on interstate commerce against the benefits the law creates for the state to determine whether or not the law should be upheld.


Continue Reading

Minimal Changes Expected to the Final Regulations

On October 10, 2019, the Attorney General issued his Proposed Text of Regulations, along with a Notice of Proposed Rulemaking Action and an Initial Statement of Reasons. According to the Attorney General, the regulations will “benefit the welfare of California residents because they will facilitate the implementation of many components of the CCPA” and “provid[e] clear direction to businesses on how to inform consumers of their rights and how to handle their requests.” See Notice of Proposed Rulemaking, page 10.

The deadline to submit public comments on the proposed regulations was December 6, 2019. The Office of the Attorney General (OAG) reported receiving about 1,700 pages of written comments from almost 200 parties. Despite this, the Attorney General stated in a news briefing that he does not expect the final regulations to include significant changes.

The proposed regulations should give everyone a sense of how the Attorney General will interpret the CCPA. The Attorney General is required to issue final regulations and a final Statement of Reasons at some point before July 1, 2020, which is the first day that the Attorney General can enforce the law.

Investing in Enforcement

California has invested in enforcement resources. The Attorney General stated that the CCPA will cost the state about $4.7 million for FY 2019-2020 and $4.5 million for FY 2020-2021, which reflects the cost of hiring 23 additional full-time employees and expert consultants to enforce and defend the CCPA. See Notice of Proposed Rulemaking, page 10. Despite this additional funding, the OAG remains an agency with limited resources. Many expect that the OAG will only be able to pursue a limited number of CCPA enforcement actions, particularly if it takes on large and well-funded companies.


Continue Reading

On January 7, 2020, the Director of the US Office of Management and Budget (OMB) issued a Draft Memorandum (the Memorandum) to all federal “implementing agencies” regarding the development of regulatory and non-regulatory approaches to reducing barriers to the development and adoption of artificial intelligence (AI) technologies. Implementing agencies are agencies that conduct foundational research, develop and deploy AI technologies, provide educational grants, and regulate and provide guidance for applications of AI technologies, as determined by the co-chairs of the National Science and Technology Council (NSTC) Select Committee. To our knowledge, the NSTC has not yet determined which agencies are “implementing agencies” for purposes of the Memorandum.

Submission of Agency Plan to OMB

The “implementing agencies” have 180 days to submit to OMB their plans for addressing the Memorandum.

An agency’s plan must: (1) identify any statutory authorities specifically governing the agency’s regulation of AI applications as well as collections of AI-related information from regulated entities; and (2) report on the outcomes of stakeholder engagements that identify existing regulatory barriers to AI applications and high-priority AI applications that are within the agency’s regulatory authorities. OMB also requests but does not require agencies to list and describe any planned or considered regulatory actions on AI.

Principles for the Stewardship of AI Applications

The Memorandum outlines the following as principles and considerations that agencies should address in determining regulatory or non-regulatory approaches to AI:

  1. Public trust in AI. Regulatory and non-regulatory approaches should promote reliable, robust and trustworthy AI applications.
  2. Public participation. The public should have the opportunity to take part in the rule-making process.
  3. Scientific integrity and information quality. The government should use scientific and technical information and processes when developing a stance on AI.
  4. Risk assessment and management. A risk assessment should be conducted before determining regulatory and non-regulatory approaches.
  5. Benefits and costs. Agencies need to consider the societal costs and benefits related to developing and using AI applications.
  6. Flexibility. Agency approaches to AI should be flexible and performance-based.
  7. Fairness and nondiscrimination. Fairness and nondiscrimination in outcomes needs to be considered in both regulatory and non-regulatory approaches.
  8. Disclosure and transparency. Agencies should be transparent. Transparency can serve to improve public trust in AI.
  9. Safety and security. Agencies should guarantee the confidentiality, integrity and availability of data used by AI by ensuring that the proper controls are in place.
  10. Interagency coordination. Agencies need to work together to ensure consistency and predictability of AI-related policies.


Continue Reading

On January 6, 2020, the California State Senate’s Health Committee unanimously approved California AB 713, a bill that would amend the California Consumer Privacy Act (CCPA) to except from CCPA requirements additional categories of health information, including data de-identified in accordance with the Health Insurance Portability and Accountability Act of 1996 (HIPAA), medical research data, personal information used for public health and safety activities, and patient information that is maintained by HIPAA business associates in the same manner as HIPAA protected health information (PHI). If enacted, the bill would simplify CCPA compliance strategies for many HIPAA-regulated entities, life sciences companies, research institutions and health data aggregators.

Exemption for HIPAA Business Associates

Presently, the CCPA does not regulate PHI that is collected by either a HIPAA covered entity or business associate.

The CCPA also exempts covered entities to the extent that they maintain patient information in the same manner as PHI subject to HIPAA. The CCPA does not, however, currently include a similar entity-based exemption for business associates.

AB 713 would add an exemption for business associates to the extent that they maintain, use and disclose patient information consistent with HIPAA requirements applicable to PHI. For example, if a business associate maintains consumer-generated health information that is not PHI, but processes the information in accordance with HIPAA requirements for PHI, then the information would not be regulated by the CCPA. While the practical import of the new exemption may be limited because business associates may not want to apply HIPAA requirements to consumer-generated health information, AB 713 offers business associates another potential exception to CCPA requirements for patient information about California consumers.

Exception for De-Identified Health Information

AB 713 would except from CCPA requirements de-identified health information when each of the following three conditions is met:

  • The information is de-identified in accordance with a HIPAA de-identification method (i.e., the safe harbor or expert determination method) at 45 CFR § 164.514(b).
  • The information is derived from PHI or “individually identifiable health information” under HIPAA, “medical information” as defined by the California Confidentiality of Medical Information Act (CMIA), or “identifiable private information” subject to the Common Rule.
  • The business (or its business associate) does not re-identify, or attempt to re-identify, the information.


Continue Reading

The California Consumer Privacy Act (CCPA) requires businesses that engage in sales of personal information to offer consumers the right to opt out of such sales through a “Do Not Sell My Personal Information” link or button on their websites. These “Do Not Sell” obligations present a particularly thorny question for businesses that participate in a digital ad exchange or otherwise use advertising tracking technologies on their websites. Because data elements such as IP address, cookie ID, device identifier and browsing history are considered “personal information” for purposes of the CCPA, the question is: does sharing that information with third-party ad tech providers constitute a “sale” of data?

The answer, so far, is a resounding “maybe.” In what follows, we expand on the issue and survey different approaches to this hotly contested question.

Why the Debate?

The CCPA defines a “sale” as “selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.” The Network Advertising Initiative (NAI) broke this definition down into three main elements that, when satisfied, might make the case that digital advertising involves a “sale.”

    • The digital advertising must involve “personal information.” We know that it does because serving digital ads requires, at the very least, access to IP address and browsing history.
    • The digital advertising must involve the movement of personal information from a business to another business or third party. This is often true for digital advertising relationships, as ad tech intermediaries and other participants in the ad exchange often use the personal information they have received from businesses for their own purposes, thus taking many ad tech entities outside of CCPA’s “service provider” safe harbor.
    • The digital advertising must involve the exchange of monetary or other valuable consideration for the personal information. This is a fact-specific inquiry that will vary across contractual arrangements. For that reason, the NAI analysis states it would be difficult to broadly categorize all digital advertising activities as “sales.” However, the NAI cautions that if the recipients of personal information can retain the information “for profiling or segmenting purposes” (e.g., the ability to monetize the data independently), that could be evidence of a “sale” of data.


Continue Reading

A recent McDermott roundtable on European health private equity generated key insights into the future of medtech, digital health, and data analytics, and identified opportunities for companies and investors.

Digital health solutions are widely considered to be the next big growth market. Healthcare lags significantly behind other industries when it comes to digitization, but the potential opportunities are driving developers, healthcare providers, and investors to find solutions.

PATIENT CARE
A key point to bear in mind about healthcare technology is that success and adoption are often measured by the quality of the user experience, the resulting clinical outcomes, short- and long-term cost savings, and the resulting margin for both investors and the healthcare system at large. These multi-faceted goals are best illustrated by the demands for (i) greater efficiency and (ii) better patient outcomes.

Efficiency is typified by, for example, streamlined bookings and appointment reminders, algorithms that triage patients to ensure they are seen by the right person at the right time, and in-home patient monitoring after patients are discharged. Patient take-up is also an excellent gauge of efficiency, for example, a high tech product that measures and reports blood sugar is of no value if the interface is too complicated for an older population.

Better outcomes result from clinicians gathering and using data to determine the right treatment in the fastest possible time, and are demonstrated, for example, by permanent lifestyle changes, improvements in self-care or care outside hospital, accurate drug dosage and use of medicines, and, in direct contrast with other sectors, reduced, rather than increased, service usage.

PRIVACY AND REGULATORY HURDLES
One of the most obvious challenges inherent in digital health is data privacy and security. Stemming from that are issues relating to control of the data, the right to use it, and ownership of the analysis. The most successful companies are those that, from the very beginning, understand the regulatory landscape in which they are operating; are transparent in terms of where their data comes from; make clear the type of data at issue, be that identifiable, pseudonymized, anonymized, or something in between; and identify who will control what data in what form. The ability to marry up these factors is a key part of any new entrant’s value proposition.


Continue Reading

As discussed in the first post in this two-part series, new players from outside the traditional healthcare paradigm are joining forces with hospitals, health systems and other providers to drive unprecedented innovation. These unexpected partnerships are bringing new solutions to market and changing how business is done and care is delivered.

Many of these collaborations revolve around data and data sharing arrangements. Traditional health industry stakeholders such as hospitals and health systems (HHSs) are partnering with technology companies—both established and start-up—to develop and market digital health solutions that engage patients beyond the brick-and-mortar clinical setting. Digital health tools are making it easier for patients to receive care in a mobile setting and access their health data across various platforms and sources. These innovative partnerships thus hold out the possibility of delivering better, faster, more targeted care.

Addressing Community Concerns

At the same time, digital health collaborations can encounter challenges regarding data privacy and security, permissions and ownership. Historically, health data was housed in one place—within the health institution. But with the rise of digital health tools, health data has become ubiquitous, raising fears about how it may be used, aggregated and shared.


Continue Reading