While digital health innovation empowers us to better manage our health and live more productive lives, it also poses myriad regulatory, strategic and operational challenges. Edited and authored by McDermott’s team of distinguished digital health lawyers, The Law of Digital Health offers an overview of the highly dynamic and integrated components of the digital health ecosystem, with the goal of helping businesses thrive in this ever-evolving landscape. Over five chapters, we explore a broad spectrum of digital health innovation opportunities and the corresponding value proposition; review current and evolving legal and regulatory frameworks, theories, interpretations, and policy and enforcement initiatives in both the public and private sectors; and provide practical planning and implementation strategies for achieving the appropriate balance between the benefits of digital health innovation opportunities and the need to manage associated legal and regulatory risks.
Amy C. Pimentel focuses her practice on privacy and data security and general health law. Her clients operate in a variety of industries, including health care, consumer products, retail, food and beverage, technology, banking and other financial services.
The General Data Protection Regulation (GDPR) establishes protections for the privacy and security of personal data (Personal Data) about individuals in the European Union (EU) single market countries, and potentially affects the clinical and other scientific research activities of academic medical centers and other research organizations in the United States.
This On the Subject includes frequently asked questions that discuss the extent to which United States research organizations must comply with GDPR when conducting research. Future coverage will address the impact of GDPR on other aspects of the United States health care sector.
The validity of Model Clauses for EU personal data transfer to the United States is now in real doubt as a result of a new Irish High Court judgment stating that there are “well founded grounds” to find the Model Clauses invalid. The issue of Model Clauses as a legitimate data transfer mechanism will now be adjudicated by the European Court of Justice (ECJ), the same court that previously overturned the Safe Harbor arrangement. EU and US companies will need to consider various strategies in anticipation of this decision.
The US Department of Transportation’s National Highway Traffic Safety Administration recently released A Vision for Safety 2.0, an update to its prior guidance on automated driving systems. The new guidance adopts a voluntary, flexible approach to regulation of automated driving systems and clarifies that it alone, and not the states, is responsible for regulating the safety design and performance aspects of such systems.
New technologies and the expansion of the Internet of Things have allowed children of this generation to experience seamless interactive technologies through microphones, GPS devices, speech recognition, sensors, cameras and other technological capabilities. These advancements create new markets for entertainment and education alike and, in the process, collect endless amounts of data from children, from their names and locations to their likes/dislikes and innermost thoughts.
The collection of data through this Internet of Toys is on the tongues of regulators and law enforcement, who are warning parents to be wary when purchasing internet-connected toys and other devices for children. These warnings also extend to connected toy makers, urging companies to comply with children’s privacy rules and signaling that focused enforcement is forthcoming.
Federal Trade Commission Makes Clear That Connected Toy Makers Must Comply with COPPA
On June 21, 2017, the Federal Trade Commission (FTC) updated its guidance for companies required to comply with the Children’s Online Privacy Protection Act (COPPA) to ensure those companies implement key protections with respect to internet-connected toys and associated services. While the FTC’s Six Step Compliance Plan for COPPA compliance is not entirely new, there are a few key updates that reflect developments in the Internet of Toys marketplace.
In an age where providers are increasingly taking the management of their patients’ health online and out of the doctor’s office, the creation of scalable and nimble patient engagement tools can serve to improve patient experience, health care outcomes and health care costs. While the level of enthusiasm for these tools is at an all-time high, there is growing concern about an unexpected deterrent to their adoption from an unlikely source: the Telephone Consumer Protection Act of 1991 (TCPA).
Many professionals in the health industry have come to share two misconceptions about the TCPA: first, that the TCPA only applies to marketing phone calls or text message “spam,” and second, that the TCPA does not apply to communications from HIPAA covered entities to their patients/health plan members. These misconceptions can be costly mistakes for covered entities that have designed their patient engagement outreach programs without including a TCPA compliance strategy.
As discussed in a previous post, the TCPA was originally intended to curb abusive telemarketing calls. When applying the law to smarter and increasingly innovative technologies (especially those that we see in the patient engagement world), the TCPA poses significant compliance challenges for the users of these tools that arguably threaten to curb meaningful progress on important public health and policy goals.
Despite its initial scope of addressing robocalls, the TCPA also applies to many automated communications between health care providers and their patients, and between plans and their members. A diverse array of technical consent requirements applies depending on the type of phone call being made. For instance, most auto-dialed marketing calls to cell phones require prior express written consent, meaning that the caller must first obtain written consent before making the call. To make compliance more challenging, callers remain responsible for proving consent and the accuracy of the numbers dialed.
Indeed, the TCPA presents a serious challenge for patient engagement tools, especially when violations of the TCPA can yield statutory damages of up to $1,500 per call or text message. While Federal Communications Commission orders over the past several years have added some clarity and a “safe harbor” for HIPAA-covered entities to help entities achieve compliance, there is still no “free pass” from the TCPA’s requirements. Therefore, covered entities and the business associates who work for them should not assume that compliance with HIPAA offers any defense against a successful claim under the TCPA.
On 19 October 2016, the European Court of Justice (ECJ) held (Case C-582/14 – Breyer v Federal Republic of Germany) that dynamic IP addresses may constitute personal data. The ECJ also held that a website operator may collect and process IP addresses for the purpose of protecting itself against cyberattacks, because in the view of the Court, preventing cyberattacks may be a legitimate interest of a website operator in ensuring the continued operability of its website.
The ECJ’s ruling was based on two questions referred to it by the German Federal Court of Justice (BGH). In the underlying German proceedings, a member of the German Pirate Party challenged the German Federal Government’s logging and subsequent use of his dynamic Internet Protocol (IP) address when he visited its websites. While the government is a public authority, the case was argued on the basis of German provisions that address both public and private website operators, and is therefore directly relevant for commercial companies.
The European Commission recently determined that the Privacy Shield Framework is adequate to legitimize data transfers under EU law, providing a replacement for the Safe Harbor program. The Privacy Shield is designed to provide organizations on both sides of the Atlantic with a mechanism to comply with EU data protection requirements when transferring personal data from the European Union to the United States. Organizations that apply for Privacy Shield self-certification by September 30, 2016, will be granted a nine-month grace period to conform their contracts with third-party processors to the Privacy Shield’s new onward transfer requirements.
On January 6, the Federal Trade Commission (FTC) released a report that it hopes will educate organizations on the important laws and research that are relevant to big data analytics. The report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, looks specifically at how big data is used after it is collected and analyzed and provides suggestions aimed at maximizing the benefits and minimizing the risks of using big data.
Risks and Rewards
The report argues that big data analytics can provide numerous opportunities for improvements in society. In addition to more effectively matching products and services to consumers, big data can create opportunities for low income and underserved communities. The report highlights a number of innovative uses of big data that provide benefits to underserved populations, such as increased educational attainment, access to credit through nontraditional methods, specialized health care for underserved communities, and better access to employment.
At the same time, the report shows that potential inaccuracies and biases might lead to detrimental effects for low-income and underserved populations. For example, organizations could use big data to inadvertently exclude low-income and underserved communities from credit and employment opportunities, which may reinforce existing disparities or weaken the effectiveness of consumer choice.
Considerations for Using Big Data
The report outlines some of the consumer protection laws (in particular, the Fair Credit Reporting Act and FTC Act) and equal opportunity laws that apply to the use of big data, especially with regard to possible issues of discrimination or exclusion. It also recommends that an organization consider the following questions to help ensure that its use of big data analytics does not lead to unlawful exclusion or discrimination:
How representative is your data set?
If the data set is missing information from particular populations, take appropriate steps to address this problem.
Does your data model account for biases?
Review data sets and algorithms to ensure that hidden biases do not have an unintended impact on certain populations.
How accurate are your predictions based on big data?
Balance the risks of using correlative results, especially where the business’ policies could negatively affect certain populations.
Does your reliance on big data cause ethical or fairness concerns?
Consider whether fairness and ethical considerations advise against using big data in certain circumstances and whether the business can use big data in ways that advance opportunities for previously underrepresented populations.
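The first of the questions above, on how representative a data set is, can be illustrated with a short, purely hypothetical Python sketch that compares a data set’s group composition against reference population shares. The group labels, figures and function name below are our own illustration, not anything drawn from the FTC report.

```python
# Hypothetical representativeness check: compare each group's share of a
# data set against its share of a reference population. A large negative
# gap flags a group that the data set may under-represent.
from collections import Counter

def representation_gaps(records, group_key, reference_shares):
    """Return each group's share in the data minus its reference share."""
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    return {
        group: counts.get(group, 0) / total - share
        for group, share in reference_shares.items()
    }

# Hypothetical sample: group "B" makes up 40% of the reference population
# but only 20% of the data set.
records = [{"group": "A"}] * 80 + [{"group": "B"}] * 20
reference = {"A": 0.6, "B": 0.4}

gaps = representation_gaps(records, "group", reference)
for group, gap in sorted(gaps.items()):
    print(f"{group}: {gap:+.2f}")  # A: +0.20, B: -0.20
```

A check of this kind does not answer the FTC’s question by itself, but it gives an organization a concrete, auditable record that it looked for missing populations before relying on the data.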
Monitoring and Enforcement Ahead
The FTC stated that its collective challenge is to make sure that big data analytics continue to provide benefits and opportunities to consumers while adhering to core consumer protection values and principles. It has committed to continue monitoring areas where big data practices could violate existing laws and to bring enforcement actions where appropriate. With that in mind, organizations that already use big data and those that have been persuaded by the reported benefits of big data should heed the FTC’s advice. The FTC is highlighting its interest in the consumer protection and equal opportunity ramifications of big data use. This report serves as a warning, a statement of intent, that the FTC will be evaluating data practices in light of these concerns. It is clear that organizations must identify and mitigate the risks in using big data, not only those dealing with privacy and data protection but also those presenting consumer protection and equal opportunity issues. Thinking critically about and taking corrective action in line with the considerations listed above, and creating a record that such steps have been taken, may help organizations using big data to avoid FTC regulatory scrutiny.
As we reported on October 19th, the Article 29 Working Party on the Protection of Individuals with Regard to the Processing of Personal Data challenged the EU member states to “open discussions with the US” to find a viable alternative to the Safe Harbor program. Today, the European Commission (EC) issued a public statement confirming its commitment to working with the United States on a “renewed and sound framework for transatlantic transfers of personal data.” The apparent trigger for today’s announcement is “concerns” from businesses about “the possibilities for continued data transfers” while the Safe Harbor sequel is under negotiation.
In its statement, the EC confirms that during the pendency of the U.S.-EU negotiations, Standard Contractual Clauses and Binding Corporate Rules (BCRs) are viable bases for legitimizing data transfers that formerly were validated by the Safe Harbor Program.
The EC was careful to note that today’s guidance “does not lay down any binding rules” and “is without prejudice to the powers and duty of the DPAs (Data Protection Authorities) to examine the lawfulness of such transfers in full independence.” In other words, a DPA still may decide that Standard Contractual Clauses and BCRs are not viable under its country’s laws.