
Scott A. Weinstein provides legal counsel on health care regulatory compliance, contracting and transactional due diligence, with a focus on health information privacy and security, Medicare and Medicaid health information technology and quality reporting requirements, and clinical research regulations. Scott also advises on federal and state privacy and data protection laws, with a focus on privacy audits and the development of internal and externally facing privacy policies for websites and mobile applications.

New cybersecurity regulations issued by the New York Department of Financial Services (NYDFS) define the nonpublic information they regulate in exceptionally broad terms. This expanded definition of Nonpublic Information will create major challenges for regulated companies and their third-party service providers, and its effects will likely ripple through other ancillary industries.


On December 7, 2016, the US Congress approved the 21st Century Cures Act (Cures legislation), which is intended to accelerate the “discovery, development and delivery” of medical therapies by encouraging public and private biomedical research investment, facilitating innovation in review and approval processes, and continuing to invest in and modernize the delivery of health care. The massive bill, however, also served as a vehicle for a variety of other health-related measures, including provisions relating to health information technology (HIT) and related digital health initiatives. President Barack Obama has expressed support for the Cures legislation and is expected to sign the bill this month.

The HIT provisions of the Cures legislation in general seek to:

  • Reduce administrative and regulatory burdens associated with providers’ use of electronic health records (EHRs)
  • Advance interoperability
  • Promote standards for HIT
  • Curb information blocking
  • Improve patient care and access to health information in EHRs

As public and private payers increasingly move from fee-for-service payments to value-based payment models, with a focus on maximizing health outcomes, population health improvement and patient engagement, HIT (including EHRs and digital health tools) will increasingly be relied upon to collect clinical data; measure quality and cost effectiveness; assure continuity of care between patients and providers in different locations; and develop evidence-based clinical care guidelines.

Read the full article.

As we reported in May 2014, the Federal Trade Commission (FTC) convened stakeholders to explore whether health-related information collected from and about consumers through the internet and increasingly popular lifestyle and fitness mobile apps, known as consumer-generated health information (CHI), is more sensitive than other consumer-generated data and therefore warrants more privacy-protective treatment.

One of the key questions raised during the FTC’s CHI seminar is: what is consumer health information? Information gathered during traditional medical encounters is clearly health-related. Information gathered from mobile apps designed as sophisticated diagnostic tools also is clearly health-related, and may even be “Protected Health Information,” as defined and regulated by the Health Insurance Portability and Accountability Act (HIPAA), depending on the interplay between the app and the health care provider or payor community. Other information, such as diet and exercise data, may be viewed by some as wellness or consumer preference data (for example, the types of foods purchased). Still other information (e.g., shopping habits) may not look like health information but, when aggregated with other information generated by and collected from consumers, may become health-related information. Information, therefore, may be “health information,” and may be more sensitive as such, depending on (i) the individual from whom it is collected; (ii) the context in which it is initially collected; (iii) the other information with which it is combined; (iv) the purpose for which the information was initially collected; and (v) the downstream uses of the information.

Notably, the FTC is not the only regulatory body struggling with how to define CHI.  On February 5, 2015, the European Union’s Article 29 Working Party (an EU representative body tasked with advising EU Member States on data protection) published a letter in response to a request from the European Commission to clarify the definitional scope of “data concerning health in relation to lifestyle and wellbeing apps.”

The EU’s efforts underscore the importance of understanding what qualifies as CHI. The EU and US data privacy and security regimes differ fundamentally: the EU regime broadly protects personally identifiable information, while the US does not currently provide universal protections for such information. Instead, the US approach varies by jurisdiction and type of information, and does not uniformly regulate the mobile app industry or the CHI captured by such apps. These different regulatory regimes make the EU’s struggle to define the precise scope of “lifestyle and wellbeing” data (CHI) and to develop best practices all the more striking because, even absent such a definition, the EU privacy regime would offer protections.

The Article 29 Working Party letter acknowledges the European Commission’s work to date, including the Commission’s “Green Paper on Mobile Health,” which emphasized the need for strong privacy and security protections, transparency (particularly with respect to how CHI interoperates with big data), and specific legislation on CHI-related apps or regulatory guidance that will promote “the safety and performance of lifestyle and wellbeing apps.” But, in the annex to its letter, the Working Party notes: “due to the wide range of personal data that may fall into the category of health data, this category represents one of the most complex areas of sensitive data and … display[s] a great deal of diversity and legal uncertainty.” Thus, even within the more protective EU data privacy regime, regulators acknowledge the likely need for specific privacy and security protections in light of the consumer-driven nature of CHI, the myriad mechanisms by which such data is collected and aggregated in the digital landscape, and the difficulty of tracing, tracking and predicting how such data will be aggregated, disaggregated and otherwise used.

As a starting point, the annex to the Article 29 Working Party letter presents a framework for determining when personal data are health data, namely when:

  1. “The data are inherently/clearly medical data.
  2. The data are raw sensor data that can be used in itself or in combination with other data to draw a conclusion about the actual health status or health risk of a person.
  3. Conclusions are drawn about a person’s health status or health risk (irrespective of whether these conclusions are accurate or inaccurate, legitimate or illegitimate, or otherwise adequate or inadequate).”

The annex also notes the importance of obtaining “the unambiguous consent of the data subject,” given that many CHI-related mobile apps collect and process location data and data collected through sensors, which, when combined with other data, could identify a person’s health status.

Back in the United States, the FTC continues to signal its interest in mobile applications that collect and analyze CHI. On February 23, 2015, the FTC released a pair of consent orders concerning two mobile applications, alleging that the apps did not perform as advertised. Although the consent orders do not expressly address the data privacy implications of the apps, they signal that the FTC is monitoring the representations that apps collecting and using CHI make to consumers.

As mobile apps become more sophisticated and assist patients and providers with the active detection and management of health conditions, we expect the need for clarity and consensus about reasonable data privacy and protection practices with respect to CHI to intensify. On that need, at least, both US and EU regulators can agree.

In the wake of recent breaches of personally identifiable information (PII) suffered by health insurance companies located in their states, the New Jersey Legislature passed, and the Connecticut General Assembly will consider, legislation requiring health insurance companies offering health benefits within these states to encrypt certain types of PII, including Social Security numbers, addresses and health information. New Jersey joins a growing number of states (including California (Cal. Civ. Code § 1798.81.5), Massachusetts (201 Mass. Code Regs. 17.03) and Nevada (Nev. Rev. Stat. § 603A.215)) that require organizations that store and transmit PII to implement data security safeguards. Massachusetts’ data security law, for example, requires any person or entity that owns or licenses certain PII about a resident of the Commonwealth to, if “technically feasible” (i.e., a reasonable technological means is available), encrypt information stored on laptops and other portable devices and encrypt transmitted records and files that will travel over public networks. Unlike Massachusetts’ law, New Jersey’s new encryption law applies only to health insurance carriers authorized to issue health benefits in New Jersey (N.J. Stat. Ann. § 56:8-196), but it requires those carriers to encrypt records containing the PII protected by the statute both when stored on any end-user systems and devices and when transmitted electronically over public networks (N.J. Stat. Ann. § 56:8-197).
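Statutes like New Jersey’s mandate the outcome (PII that is unreadable without a key, whether at rest or in transit) rather than any particular technology. For readers who want a concrete picture of what encrypting a stored record can involve, the short Python sketch below uses the open-source cryptography package; the record fields, file name and key handling are invented for illustration only and are not drawn from any of the statutes discussed above.

```python
# Illustrative only: encrypting a hypothetical insurance record "at rest"
# using the open-source "cryptography" package (pip install cryptography).
import json
from cryptography.fernet import Fernet

# In a real deployment the key would come from a key-management system,
# never stored alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

# Hypothetical record containing the kinds of PII the statutes cover.
record = {"name": "Jane Doe", "ssn": "000-00-0000", "address": "1 Main St"}

# Serialize and encrypt before writing to disk; without the key, the
# stored file is unreadable.
with open("member_record.enc", "wb") as fh:
    fh.write(cipher.encrypt(json.dumps(record).encode("utf-8")))

# An authorized system holding the key reverses the process.
with open("member_record.enc", "rb") as fh:
    assert json.loads(cipher.decrypt(fh.read())) == record
```

Encryption in transit, the statutes’ other requirement, is typically satisfied at the protocol layer (e.g., TLS) rather than in application code.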

At the federal level, the Health Insurance Portability and Accountability Act (HIPAA) already requires health plans, as well as other “covered entities” (e.g., health care providers) and their “business associates” (i.e., service providers who need access to a covered entity’s health information to perform their services), to encrypt stored health information or health information transmitted electronically if it is “reasonable and appropriate” for them to do so (45 C.F.R. §§ 164.306, 164.312). According to the US Department of Health and Human Services, health plans and other covered entities and their business associates should consider a variety of factors to determine whether a security safeguard is reasonable and appropriate, including: (1) the covered entity’s or business associate’s risk analysis; (2) the security measures the covered entity or business associate already has in place; and (3) the costs of implementation (68 Fed. Reg. 8336). If a covered entity or business associate determines that encryption of stored or transmitted health information is not reasonable and appropriate, however, it may instead document that determination and implement an equivalent alternative safeguard.

The New Jersey law and the Connecticut proposal appear to reflect a legislative determination that encryption of stored or transmitted health information is always reasonable and appropriate for health plans to implement, regardless of the other safeguards that the health plan may already have in place.  As hackers become more sophisticated and breaches more prevalent in the health care industry, other states may follow New Jersey and Connecticut by expressly requiring health plans and other holders of health care information to implement encryption and other security safeguards, such as multifactor authentication or minimum password complexity requirements.  In fact, Connecticut’s Senate Democrats have indicated that their proposal will address user authentication protocol requirements, in addition to encryption.  We will continue to monitor Connecticut’s legislative proposal and track additional developments in state data security laws during this year’s legislative sessions.

On May 1, 2014, the White House released two reports addressing the public policy implications of the proliferation of big data. Rather than trying to slow the accumulation of data or place barriers on its use in analytic endeavors, the reports assert that big data is the “new normal” and encourage the development of policy initiatives and legal frameworks that foster innovation, promote the exchange of information and support public policy goals, while at the same time limiting harm to individuals and society. This Special Report provides an overview of the two reports, puts their conclusions and recommendations into context, and extracts key takeaways for businesses grappling with what these reports, and this “new normal,” mean for them.

Read the full article.