Lack of a sufficient risk analysis continues to be one of the most commonly alleged violations in Office for Civil Rights (OCR) HIPAA enforcement actions, appearing in half of all OCR settlements announced in the last 12 months and in almost all of the $1 million-plus settlements during that period. Significant confusion remains across the health care industry as to what actually constitutes a compliant risk analysis for purposes of the HIPAA Security Rule. On April 30, 2018, OCR issued guidance discussing the differences between a HIPAA Security Rule risk analysis and a HIPAA compliance “gap analysis.” Drawing from our experience reviewing clients’ historical risk analysis documents, helping clients navigate OCR investigations and negotiating several recent HIPAA settlements with OCR, we elaborate on what constitutes a compliant HIPAA Security Rule risk analysis, discuss common risk analysis misunderstandings and pitfalls, and encourage covered entities and business associates to consider whether to conduct these reviews under attorney-client privilege.
We are pleased to present The Law of Digital Health, a new book edited and authored by McDermott’s team of distinguished digital health lawyers and published by AHLA, designed to provide business leaders and their key advisors with the knowledge and insight they need to grow and sustain successful digital health initiatives.
Visit www.mwe.com/lawofdigitalhealth to order this comprehensive legal and regulatory analysis, coupled with practical planning and implementation strategies. You can also download the Executive Summary and hear more about how Digital Health is quickly and dynamically changing the health care landscape.
The General Data Protection Regulation (GDPR) establishes protections for the privacy and security of personal data (Personal Data) about individuals in the European Union (EU) single market countries, and potentially affects the clinical and other scientific research activities of academic medical centers and other research organizations in the United States.
This On the Subject includes frequently asked questions that discuss the extent to which United States research organizations must comply with GDPR when conducting research. Future coverage will address the impact of GDPR on other aspects of the United States health care sector.
The validity of Model Clauses for EU personal data transfer to the United States is now in real doubt as a result of a new Irish High Court judgment stating that there are “well founded grounds” to find the Model Clauses invalid. The issue of Model Clauses as a legitimate data transfer mechanism will now be adjudicated by the European Court of Justice (ECJ), the same court that previously overturned the Safe Harbor arrangement. EU and US companies will need to consider various strategies in anticipation of this decision.
The US Department of Transportation’s National Highway Traffic Safety Administration recently released A Vision for Safety 2.0, an update to its prior guidance on automated driving systems. The new guidance adopts a voluntary, flexible approach to regulation of automated driving systems and clarifies that it alone, and not the states, is responsible for regulating the safety design and performance aspects of such systems.
New technologies and the expansion of the Internet of Things have allowed children of this generation to experience seamless interactive technologies through microphones, GPS devices, speech recognition, sensors, cameras and other technological capabilities. These advancements create new markets for entertainment and education alike and, in the process, collect endless amounts of data from children, from their names and locations to their likes, dislikes and innermost thoughts.
The collection of data through this Internet of Toys is on the tongues of regulators and law enforcement, who are warning parents to be wary when purchasing internet-connected toys and other devices for children. These warnings also extend to connected toy makers, urging companies to comply with children’s privacy rules and signaling that focused enforcement is forthcoming.
Federal Trade Commission Makes Clear That Connected Toy Makers Must Comply with COPPA
On June 21, 2017, the Federal Trade Commission (FTC) updated its guidance for companies required to comply with the Children’s Online Privacy Protection Act (COPPA) to ensure those companies implement key protections with respect to Internet-connected toys and associated services. While the FTC’s Six Step Compliance Plan for COPPA compliance is not entirely new, there are a few key updates that reflect developments in the Internet of Toys marketplace.
In an age where providers are increasingly taking the management of their patients’ health online and out of the doctor’s office, the creation of scalable and nimble patient engagement tools can serve to improve patient experience, health care outcomes and health care costs. While the level of enthusiasm for these tools is at an all-time high, there is a growing concern about an unexpected deterrent to their adoption from an unlikely source: the Telephone Consumer Protection Act of 1991 (TCPA).
Many professionals in the health industry have come to share two misconceptions about the TCPA: first, that the TCPA only applies to marketing phone calls or text message “spam,” and second, that the TCPA does not apply to communications from HIPAA covered entities to their patients or health plan members. These misconceptions can be costly mistakes for covered entities that have designed their patient engagement outreach programs without including a TCPA compliance strategy.
As discussed in a previous post, the TCPA was originally intended to curb abusive telemarketing calls. When applying the law to smarter and increasingly innovative technologies (especially those that we see in the patient engagement world), the TCPA poses significant compliance challenges for the users of these tools that arguably threaten to curb meaningful progress on important public health and policy goals.
Despite its initial scope of addressing robocalls, the TCPA also applies to many automated communications between health care providers and their patients, and between plans and their members. A diverse array of technical consent requirements applies depending on the type of phone call being made. For instance, most auto-dialed marketing calls to cell phones require prior express written consent, meaning that the caller must first obtain written consent before making the call. To make compliance more challenging, callers remain responsible for proving consent and the accuracy of the numbers dialed.
Indeed, the TCPA presents a serious challenge for patient engagement tools, especially because violations of the TCPA can yield statutory damages of up to $1,500 per call or text message. While Federal Communications Commission orders over the past several years have added some clarity and a “safe harbor” for HIPAA-covered entities to help entities achieve compliance, there is still no “free pass” from the TCPA’s requirements. Therefore, covered entities and the business associates who work for them should not assume that compliance with HIPAA offers any assurance of a defense against a successful claim under the TCPA.
On 19 October 2016, the European Court of Justice (ECJ) held (Case C-582/14 – Breyer v Federal Republic of Germany) that dynamic IP addresses may constitute personal data. The ECJ also held that a website operator may collect and process IP addresses for the purpose of protecting itself against cyberattacks, because in the view of the Court, preventing cyberattacks may be a legitimate interest of a website operator in its effort to continue the operability of its website.
The ECJ’s ruling was based on two questions referred to it by the German Federal Court of Justice (BGH). In the underlying German proceedings, a member of the German Pirate Party challenged the German Federal Government’s logging and subsequent use of his dynamic Internet Protocol (IP) address when he visited its websites. While the government is a public authority, the case was argued on the basis of German provisions that address both public and private website operators, and is therefore directly relevant for commercial companies.
On January 6, the Federal Trade Commission (FTC) released a report that it hopes will educate organizations on the important laws and research that are relevant to big data analytics. The report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, looks specifically at how big data is used after it is collected and analyzed and provides suggestions aimed at maximizing the benefits and minimizing the risks of using big data.
Risks and Rewards
The report argues that big data analytics can provide numerous opportunities for improvements in society. In addition to more effectively matching products and services to consumers, big data can create opportunities for low income and underserved communities. The report highlights a number of innovative uses of big data that provide benefits to underserved populations, such as increased educational attainment, access to credit through nontraditional methods, specialized health care for underserved communities, and better access to employment.
At the same time, the report shows that potential inaccuracies and biases might lead to detrimental effects for low-income and underserved populations. For example, organizations could use big data to inadvertently exclude low-income and underserved communities from credit and employment opportunities, which may reinforce existing disparities or weaken the effectiveness of consumer choice.
Considerations for Using Big Data
The report outlines some of the consumer protection laws (in particular, the Fair Credit Reporting Act and FTC Act) and equal opportunity laws that apply to the use of big data, especially with regard to possible issues of discrimination or exclusion. It also recommends that an organization consider the following questions to help ensure that its use of big data analytics does not lead to unlawful exclusion or discrimination:
How representative is your data set?
If the data set is missing information from particular populations, take appropriate steps to address this problem.
Does your data model account for biases?
Review data sets and algorithms to ensure that hidden biases do not have an unintended impact on certain populations.
How accurate are your predictions based on big data?
Balance the risks of using correlative results, especially where the business’ policies could negatively affect certain populations.
Does your reliance on big data cause ethical or fairness concerns?
Consider whether fairness and ethical considerations advise against using big data in certain circumstances and whether the business can use big data in ways that advance opportunities for previously underrepresented populations.
Monitoring and Enforcement Ahead
The FTC stated that its collective challenge is to make sure that big data analytics continue to provide benefits and opportunities to consumers while adhering to core consumer protection values and principles. It has committed to continue monitoring areas where big data practices could violate existing laws and to bring enforcement actions where appropriate. With that in mind, organizations that already use big data, and those that have been persuaded by its reported benefits, should heed the FTC’s advice. The FTC is highlighting its interest in the consumer protection and equal opportunity ramifications of big data use. This report serves as a warning, and a statement of intent, that the FTC will be evaluating data practices in light of these concerns. It is clear that organizations must identify and mitigate the risks in using big data, not only those dealing with privacy and data protection but also those presenting consumer protection and equal opportunity issues. Thinking critically about and taking corrective action in line with the considerations listed above, and creating a record that such steps have been taken, may help organizations using big data to avoid FTC regulatory scrutiny.