Amy C. Pimentel focuses her practice on privacy and data security and general health law. Her clients operate in a variety of industries, including health care, consumer products, retail, food and beverage, technology, banking and other financial services.

The validity of Model Clauses for EU personal data transfer to the United States is now in real doubt as a result of a new Irish High Court judgment stating that there are “well founded grounds” to find the Model Clauses invalid. The issue of Model Clauses as a legitimate data transfer mechanism will now be adjudicated by the European Court of Justice (ECJ), the same court that previously overturned the Safe Harbor arrangement. EU and US companies will need to consider various strategies in anticipation of this decision.


The US Department of Transportation’s National Highway Traffic Safety Administration recently released A Vision for Safety 2.0, an update to its prior guidance on automated driving systems. The new guidance adopts a voluntary, flexible approach to regulation of automated driving systems and clarifies that it alone, and not the states, is responsible for regulating the safety design and performance aspects of such systems.


New technologies and the expansion of the Internet of Things have allowed children of this generation to experience seamless interactive technologies through microphones, GPS devices, speech recognition, sensors, cameras and other technological capabilities. These advancements create new markets for entertainment and education alike and, in the process, collect endless amounts of data from children, from their names and locations to their likes, dislikes and innermost thoughts.

The collection of data through this Internet of Toys is on the tongues of regulators and law enforcement, who are warning parents to be wary when purchasing internet-connected toys and other devices for children. These warnings also extend to connected toy makers, urging companies to comply with children’s privacy rules and signaling that focused enforcement is forthcoming.

Federal Trade Commission Makes Clear That Connected Toy Makers Must Comply with COPPA

On June 21, 2017, the Federal Trade Commission (FTC) updated its guidance for companies required to comply with the Children’s Online Privacy Protection Act (COPPA) to ensure those companies implement key protections with respect to Internet-connected toys and associated services. While the FTC’s Six Step Compliance Plan for COPPA is not entirely new, there are a few key updates that reflect developments in the Internet of Toys marketplace.

In an age where providers are increasingly taking the management of their patients’ health online and out of the doctor’s office, the creation of scalable and nimble patient engagement tools can serve to improve patient experience, health care outcomes and health care costs. While the level of enthusiasm for these tools is at an all-time high, there is a growing concern about an unexpected deterrent to the adoption of these tools from an unlikely source: the Telephone Consumer Protection Act of 1991 (TCPA).

Many professionals in the health industry have come to share two misconceptions about the TCPA: first, that the TCPA only applies to marketing phone calls or text message “spam,” and second, that the TCPA does not apply to communications from HIPAA covered entities to their patients/health plan members. These misconceptions can be costly mistakes for covered entities that have designed their patient engagement outreach programs without including a TCPA compliance strategy.

Compliance Challenges

As discussed in a previous post, the TCPA was originally intended to curb abusive telemarketing calls. When applying the law to smarter and increasingly innovative technologies (especially those that we see in the patient engagement world), the TCPA poses significant compliance challenges for the users of these tools that arguably threaten to curb meaningful progress on important public health and policy goals.

Despite its initial focus on robocalls, the TCPA also applies to many automated communications between health care providers and their patients, and between plans and their members. A diverse array of technical consent requirements applies depending on the type of call being made. For instance, most auto-dialed marketing calls to cell phones require prior express written consent, meaning that the caller must first obtain written consent before making the call. To complicate compliance further, callers remain responsible for proving consent and the accuracy of the numbers dialed.

Indeed, the TCPA presents a serious challenge for patient engagement tools, especially because violations of the TCPA can yield statutory damages of up to $1,500 per call or text message. While Federal Communications Commission orders over the past several years have added some clarity and a “safe harbor” for HIPAA-covered entities to help them achieve compliance, there is still no “free pass” from the TCPA’s requirements. Therefore, covered entities and the business associates who work for them should not assume that compliance with HIPAA offers any assurance of a defense against a successful claim under the TCPA.


On 19 October 2016, the European Court of Justice (ECJ) held (Case C-582/14 – Breyer v Federal Republic of Germany) that dynamic IP addresses may constitute personal data. The ECJ also held that a website operator may collect and process IP addresses for the purpose of protecting itself against cyberattacks, because in the view of the Court, preventing cyberattacks may be a legitimate interest of a website operator in its effort to ensure the continued operation of its website.

The ECJ’s ruling was based on two questions referred to it by the German Federal Court of Justice (BGH). In the underlying German proceedings, a member of the German Pirate Party challenged the German Federal Government’s logging and subsequent use of his dynamic Internet Protocol (IP) address when visiting their websites. While the government is a public authority, the case was argued on the basis of German provisions that address both public and private website operators, and is therefore directly relevant for commercial companies.


On January 6, the Federal Trade Commission (FTC) released a report that it hopes will educate organizations on the important laws and research that are relevant to big data analytics. The report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, looks specifically at how big data is used after it is collected and analyzed and provides suggestions aimed at maximizing the benefits and minimizing the risks of using big data.

Risks and Rewards

The report argues that big data analytics can provide numerous opportunities for improvements in society. In addition to more effectively matching products and services to consumers, big data can create opportunities for low income and underserved communities. The report highlights a number of innovative uses of big data that provide benefits to underserved populations, such as increased educational attainment, access to credit through nontraditional methods, specialized health care for underserved communities, and better access to employment.

At the same time, the report shows that potential inaccuracies and biases might lead to detrimental effects for low-income and underserved populations. For example, organizations could use big data to inadvertently exclude low-income and underserved communities from credit and employment opportunities, which may reinforce existing disparities or weaken the effectiveness of consumer choice.

Considerations for Using Big Data

The report outlines some of the consumer protection laws (in particular, the Fair Credit Reporting Act and FTC Act) and equal opportunity laws that apply to the use of big data, especially with regard to possible issues of discrimination or exclusion. It also recommends that an organization consider the following questions to help ensure that its use of big data analytics does not lead to unlawful exclusion or discrimination:

How representative is your data set? 

If the data set is missing information from particular populations, take appropriate steps to address this problem.

Does your data model account for biases? 

Review data sets and algorithms to ensure that hidden biases do not have an unintended impact on certain populations.

How accurate are your predictions based on big data? 

Balance the risks of using correlative results, especially where the business’ policies could negatively affect certain populations.

Does your reliance on big data cause ethical or fairness concerns?

Consider whether fairness and ethical considerations advise against using big data in certain circumstances and whether the business can use big data in ways that advance opportunities for previously underrepresented populations.

Monitoring and Enforcement Ahead

The FTC stated that its collective challenge is to make sure that big data analytics continue to provide benefits and opportunities to consumers while adhering to core consumer protection values and principles. It has committed to continue monitoring areas where big data practices could violate existing laws and to bring enforcement actions where appropriate.

With that in mind, organizations that already use big data and those that have been persuaded by its reported benefits should heed the FTC’s advice. The FTC is highlighting its interest in the consumer protection and equal opportunity ramifications of big data use. This report serves as a warning—a statement of intent—that the FTC will be evaluating data practices in light of these concerns. It is clear that organizations must identify and mitigate the risks in using big data, not only those dealing with privacy and data protection but also those presenting consumer protection and equal opportunity issues. Thinking critically about and taking corrective action in line with the considerations listed above, and creating a record that such steps have been taken, may help organizations using big data to avoid FTC regulatory scrutiny.

As we reported on October 19th, the Article 29 Working Party on the Protection of Individuals with Regard to the Processing of Personal Data challenged the EU member states to “open discussions with the US” to find a viable alternative to the Safe Harbor program. Today, the European Commission (EC) issued a public statement confirming its commitment to working with the United States on a “renewed and sound framework for transatlantic transfers of personal data.” The apparent trigger for today’s announcement is “concerns” from businesses about “the possibilities for continued data transfers” while the Safe Harbor Sequel is under negotiation.

In its statement, the EC confirms that during the pendency of the U.S.-EU negotiations, Standard Contractual Clauses and Binding Corporate Rules (BCRs) are viable bases for legitimizing data transfers that formerly were validated by the Safe Harbor Program.

The EC was careful to note that today’s guidance “does not lay down any binding rules” and “is without prejudice to the powers and duty of the DPAs (Data Protection Authorities) to examine the lawfulness of such transfers in full independence.”  In other words, a DPA still may decide that Standard Contractual Clauses and BCRs are not viable under its country’s laws.

The Judicial Redress Act of 2015 (H.R. 1428) (Judicial Redress Act) is on its way to the U.S. Senate. On October 20th, the U.S. House of Representatives voted in favor of passage.

The Judicial Redress Act extends certain privacy rights under the Privacy Act of 1974 (Privacy Act) to citizens of the EU and other specified countries.

The preamble to the Judicial Redress Act states that:

“The Judicial Redress Act provides citizens of covered foreign countries with the ability to bring suit in Federal district court for certain Privacy Act violations by the Federal Government related to the sharing of law enforcement information between the United States and a covered foreign government. Any such lawsuit is subject to the same terms and conditions that apply to U.S. citizens and lawful permanent residents who seek redress against the Federal Government under the Privacy Act. Under current law, only U.S. citizens and lawful permanent residents may bring claims against the Federal Government pursuant to the Privacy Act despite the fact that many countries provide U.S. citizens with the ability to seek redress in their courts when their privacy rights are violated. Enactment of this legislation is necessary in order to promote and maintain law enforcement cooperation and information sharing between foreign governments and the United States and to complete negotiations of the Data Protection and Privacy Agreement with the European Union.”

The House’s passage of the Judicial Redress Act is expected to help mitigate one of the key criticisms of U.S. privacy protection from EU regulators. As discussed in our blog posts from earlier this month, in the Court of Justice of the European Union (CJEU) decision invalidating the U.S.-EU Safe Harbor Program, the CJEU noted that EU residents lack an “administrative or judicial means of redress enabling, in particular, the data relating to them to be accessed and, as the case may be, rectified or erased.”  Once passed by the Senate (as is generally expected), the Judicial Redress Act will provide that means of redress.

Check back for updates on the Senate’s consideration of the Judicial Redress Act and the ongoing EU-US negotiations about a Safe Harbor Sequel.

As we wrote on October 6, 2015, the Court of Justice of the European Union (CJEU) announced its invalidation of the U.S.-EU Safe Harbor program as a legally valid pathway for transferring personal data of European Union (EU) residents from the EU to the United States. An avalanche of reports, analyses and predictions followed the CJEU announcement because so many U.S. businesses operating in the EU relied on the validity of the Safe Harbor program.

As we expected, the CJEU decision was not the final chapter. On October 16, the Article 29 Working Party on the Protection of Individuals with Regard to the Processing of Personal Data (the Working Party, an independent advisory board to data protection authorities in EU members states) called on the EU member states to “open discussions with the US” to find a viable alternative to the Safe Harbor program.

Echoing the CJEU’s concern about “massive and indiscriminate surveillance” by the U.S. government, the Working Party challenged the United States and EU to produce, by 31 January 2016, a new data transfer framework with “stronger guarantees” of EU residents’ “fundamental rights” to data privacy, as well as “redress mechanisms” for violations.

In the meantime, the Working Party affirmed that data transfers formerly validated by the Safe Harbor program are no longer legal. It also noted its intent to evaluate the validity of the two other key EU-U.S. data transfer pathways: Binding Corporate Rules (BCRs) and Standard Contractual Clauses.

What This Means for U.S. Businesses

While waiting for news of Safe Harbor: The Sequel, our Privacy and Data Protection Group continues to advise businesses that relied on the Safe Harbor program to:

  1. Classify the data transferred from the EU to the United States (employee, consumer, business contacts, etc.).
  2. Determine which of the data transfers from the EU to the United States were formerly validated by Safe Harbor.
  3. Identify vendors that transfer EU personal data for the business and determine how those vendors validate their transfers (e.g., Did a vendor represent that it could make legitimate transfers via Safe Harbor, and, if so, what happens now?).
  4. Decide how best to address EU to U.S. personal data transfers under one of the other data transfer pathways based on data classification (e.g., Binding Corporate Rules for intra-company transfers; Standard Contractual Clauses for transfers to third parties that do not otherwise meet EU requirements; or consent of each EU data subject—an impractical option for high-volume transfers).

Stay tuned for more on Safe Harbor: The Sequel and guidance for businesses.