personally identifiable information

Recent Decisions Narrow Scope of Liability under Video Privacy Protection Act

Two significant decisions under the Video Privacy Protection Act (VPPA) in recent weeks have provided new defenses to companies alleged to have run afoul of the statute.  Bringing the long-running litigation against Hulu to a close, at least pending appeal, the court in In re: Hulu Privacy Litigation granted summary judgment in favor of Hulu, holding that the plaintiffs could not prove that Hulu knowingly violated the VPPA.  A week later, in a more recently filed case, Austin-Spearman v. AMC Network Entertainment, LLC, the court dismissed the complaint on the basis that the plaintiff was not a “consumer” protected by the VPPA.  Both rulings provide comfort to online content providers, while also raising new questions as to the scope of liability under the VPPA.

In re: Hulu Privacy Litigation

In a decision with wide-ranging implications, the Hulu court granted summary judgment against the plaintiffs, holding that they had not shown that Hulu knowingly shared their video selections in violation of the VPPA.  The plaintiffs’ allegations were based on Hulu’s integration of a Facebook “Like” button into its website.  Specifically, the plaintiffs alleged that when the “Like” button loaded on a user’s browser, Hulu would automatically send Facebook a “c_user” cookie containing the user’s Facebook user ID.  At the same time, Hulu would also send Facebook a “watch page” that identified the video requested by the user.  The plaintiffs argued that Hulu’s transmission of the c_user cookie and the watch page allowed Facebook to identify both the Hulu user and that user’s video selection and therefore violated the VPPA.
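The disclosure mechanism at issue can be made concrete with a short sketch.  This is a hypothetical reconstruction, not Hulu’s or Facebook’s actual code: it simply shows how a third party that receives both an identity cookie and the address of a watch page could join the two pieces of information the plaintiffs described.

```python
from urllib.parse import urlparse

def link_user_to_video(cookies, referer):
    """Hypothetical sketch: join an identity cookie with a watch-page URL.

    A "Like" button request carries the browser's Facebook cookies
    (including "c_user", the user's Facebook ID) and a Referer header
    naming the page on which the button loaded.  If that page is a
    watch page, the two together reveal who requested which video.
    """
    user_id = cookies.get("c_user")
    path = urlparse(referer).path
    if user_id and path.startswith("/watch/"):
        # The combination of identity and video selection is what the
        # VPPA's "personally identifiable information" concept targets.
        return (user_id, path)
    return None
```

The `/watch/` path prefix and the function name are assumptions for illustration; the point is only that neither the cookie nor the page address alone identifies a viewing choice, while the pair does.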

The plaintiffs’ case foundered, however, on their inability to demonstrate that Hulu knew that Facebook’s linking of those two pieces of information was a possibility.  According to the court, “there is no evidence that Hulu knew that Facebook might combine a Facebook user’s identity (contained in the c_user cookie) with the watch page address to yield ‘personally identifiable information’ under the VPPA.”  Without showing that Hulu had knowingly disclosed a connection between the user’s identity and the user’s video selection, there could be no VPPA liability.

The court’s decision, if upheld on appeal, is likely to provide a significant defense to online content providers sued under the VPPA.  Under the decision, plaintiffs must now be able to show not only that the defendant company knew that the identity and video selections of the user were disclosed to a third party, but also that the company knew that that information was disclosed in a manner that would allow the third party to combine those pieces of information to determine which user had watched which content.  While Hulu prevailed only at the summary judgment stage after four years of litigation, other companies could likely make use of this same rationale at the pleadings stage, insisting that plaintiffs set out a plausible case in their complaint that the defendant had the requisite level of knowledge.

Austin-Spearman v. AMC Network Entertainment, LLC

The AMC decision turned on the VPPA’s definition of the term “consumer” and illustrated how that seemingly [...]

Continue Reading




Consumer Health Information Update from Both Sides of the Atlantic

As we reported in May 2014, the Federal Trade Commission (FTC) convened stakeholders to explore whether health-related information collected from and about consumers — known as consumer-generated health information (CHI) — through use of the internet and increasingly popular lifestyle and fitness mobile apps is more sensitive, and in need of more protective treatment, than other consumer-generated data.

One of the key questions raised during the FTC’s CHI seminar is: “What is consumer health information?”  Information gathered during traditional medical encounters is clearly health-related.  Information gathered from mobile apps designed as sophisticated diagnostic tools also is clearly health-related, and may even be “Protected Health Information,” as defined and regulated by the Health Insurance Portability and Accountability Act (HIPAA), depending on the interplay between the app and the health care provider or payor community.  But other information, such as diet and exercise data, may be viewed by some as wellness or consumer preference data (for example, the types of foods purchased).  Still other information (e.g., shopping habits) may not look like health information but, when aggregated with other information generated by and collected from consumers, may become health-related.  Information, therefore, may be “health information,” and may be more sensitive as such, depending on (i) the individual from whom it is collected; (ii) the context in which it is initially collected; (iii) the other information with which it is combined; (iv) the purpose for which the information was initially collected; and (v) the downstream uses of the information.

Notably, the FTC is not the only regulatory body struggling with how to define CHI.  On February 5, 2015, the European Union’s Article 29 Working Party (an EU representative body tasked with advising EU Member States on data protection) published a letter in response to a request from the European Commission to clarify the definitional scope of “data concerning health in relation to lifestyle and wellbeing apps.”

The EU’s efforts underscore the importance of understanding what counts as CHI.  The EU and U.S. data privacy and security regimes differ fundamentally: the EU regime broadly protects personally identifiable information, while the U.S. does not currently provide universal protections for such information.  Instead, the U.S. approach varies by jurisdiction and type of information and does not uniformly regulate the mobile app industry or the CHI captured by such apps.  That difference makes the EU’s effort to define the precise scope of “lifestyle and wellbeing” data (CHI) and to develop best practices all the more striking, because the EU privacy regime would offer protections even absent such a definition.

The Article 29 Working Party letter acknowledges the European Commission’s work to date, including the Commission’s “Green Paper on Mobile Health,” which emphasized the need for strong privacy and security protections, for transparency (particularly with respect to how CHI interoperates with big data), and for specific legislation on CHI-related apps or regulatory guidance that will promote “the safety and performance of lifestyle and wellbeing apps.”  But, [...]

Continue Reading




States Respond to Recent Breaches with Encryption Legislation

In the wake of recent breaches of personally identifiable information (PII) suffered by health insurance companies located in their states, the New Jersey Legislature passed, and the Connecticut General Assembly will consider, legislation that requires health insurance companies offering health benefits within these states to encrypt certain types of PII, including Social Security numbers, addresses and health information.  New Jersey joins a growing number of states (including California (e.g., § 1798.81.5), Massachusetts (e.g., § 17.03) and Nevada (e.g., § 603A.215)) that require organizations that store and transmit PII to implement data security safeguards.  Massachusetts’ data security law, for example, requires any person or entity that owns or licenses certain PII about a resident of the Commonwealth to, if “technically feasible” (i.e., a reasonable technological means is available), encrypt information stored on laptops and other portable devices and encrypt transmitted records and files that will travel over public networks.  Unlike Massachusetts’ law, New Jersey’s new encryption law applies only to health insurance carriers that are authorized to issue health benefits in New Jersey (N.J. Stat. Ann. § 56:8-196), but it requires those carriers to encrypt records containing the PII protected by the statute both when stored on any end-user systems and devices and when transmitted electronically over public networks (e.g., N.J. Stat. Ann. § 56:8-197).
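For concreteness, encryption of stored PII of the kind these statutes contemplate might look like the following sketch.  It uses the `cryptography` package’s Fernet recipe purely as an illustration — the statutes do not mandate any particular algorithm or library — and the record field names are hypothetical.

```python
from cryptography.fernet import Fernet

# Illustrative only: a real deployment would keep this key in a
# key-management service or hardware security module, not in code.
key = Fernet.generate_key()
fernet = Fernet(key)

def encrypt_pii(record: dict, sensitive_fields=("ssn", "address")) -> dict:
    """Return a copy of the record with the sensitive fields encrypted,
    leaving non-sensitive fields readable."""
    out = dict(record)
    for field in sensitive_fields:
        if field in out:
            out[field] = fernet.encrypt(out[field].encode())
    return out

def decrypt_field(token: bytes) -> str:
    """Recover the plaintext of a single encrypted field."""
    return fernet.decrypt(token).decode()
```

The same ciphertext-at-rest approach would satisfy the "stored on end-user systems and devices" prong; the "transmitted over public networks" prong is typically met separately, e.g., by TLS.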

At the federal level, the Health Insurance Portability and Accountability Act (HIPAA) already requires health plans, as well as other “covered entities” (e.g., health care providers) and their “business associates” (i.e., service providers who need access to a covered entity’s health information to perform their services), to encrypt stored health information or health information transmitted electronically if “reasonable and appropriate” for them to do so (45 C.F.R. §§ 164.306; 164.312).  According to the U.S. Department of Health and Human Services, health plans and other covered entities and their business associates should consider a variety of factors to determine whether a security safeguard is reasonable and appropriate, including: (1) the covered entity or business associate’s risk analysis; (2) the security measures the covered entity or business associate already has in place; and (3) the costs of implementation (68 Fed. Reg. 8336).  If the covered entity or business associate determines that encryption of stored health information or transmitted information is not reasonable and appropriate, however, the covered entity or business associate may instead elect to document its determination and implement an equivalent safeguard.

The New Jersey law and the Connecticut proposal appear to reflect a legislative determination that encryption of stored or transmitted health information is always reasonable and appropriate for health plans to implement, regardless of the other safeguards that the health plan may already have in place.  As hackers become more sophisticated and breaches more prevalent in the health care industry, other states may follow New Jersey and Connecticut by expressly requiring health plans and other holders of health care information to implement encryption and other security safeguards, such as multifactor authentication or minimum password complexity requirements.  In fact, Connecticut’s Senate [...]

Continue Reading




The California AG’s New Guide on CalOPPA – A Summary for Privacy Pros

Last week, the California Attorney General’s Office (AGO) released a series of recommendations entitled Making Your Privacy Practices Public (Guide) designed to help companies meet the requirements of California’s Online Privacy Protection Act (CalOPPA) and “provide privacy policy statements that are meaningful to consumers.”

As we have previously discussed, CalOPPA requires website operators to disclose (1) how they respond to Do Not Track (DNT) signals from browsers and other mechanisms that express the DNT preference, and (2) whether third parties use or may use the site to track (i.e., collect personally identifiable information about) individual California residents “over time and across third party websites.”  Since the disclosure requirements became law, however, there has been considerable confusion among companies about how exactly to comply, and some maintain that, despite W3C efforts, there continues to be no industry-wide accepted definition of what it means to “respond” to DNT signals.  As a result, the AGO engaged in an outreach process, bringing stakeholders together to provide comments on draft recommendations over a period of several months, finally culminating in the AGO publishing the final Guide earlier this week.
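Mechanically, a browser’s DNT preference arrives as an ordinary HTTP request header (`DNT: 1`), so the first step in any documented “response” is detecting it.  A minimal sketch, assuming the request headers are available as a plain dict:

```python
def dnt_enabled(headers: dict) -> bool:
    """Return True if the request expresses a Do Not Track preference.

    Browsers with DNT enabled send the header "DNT: 1".  A value of
    "0" means the user has consented to tracking, and an absent header
    means the user has expressed no preference -- which is part of why
    "responding" to the signal is harder to define than detecting it.
    """
    return headers.get("DNT", "").strip() == "1"
```

What a site then does with that boolean — and how its privacy policy describes it — is exactly the disclosure question CalOPPA leaves to the operator.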

The Guide is just that – a guide – rather than a set of binding requirements.  However, the recommendations in the Guide do seem to present a road map for how companies might steer clear of an AGO enforcement action in this area.  As a result, privacy professionals may want to consider matching up the following key recommendations from the Guide with existing privacy policies, to confirm that they align or to consider whether it is necessary and appropriate to make adjustments:

  • Scope of the Policy:  Explain the scope of the policy, such as whether it covers online or offline content, as well as other entities such as subsidiaries.
  • Availability:  Make the policy “conspicuous” which means:
    • for websites, put a link on every page that collects personally identifiable information (PII).
  • for mobile apps that collect PII, put a link at the point of download and within the app itself, for example via a link accessible from the “about,” “information,” or “settings” page.
  • Do Not Track:
    • Prominently label the section of your policy regarding online tracking, for example: “California Do Not Track Disclosures”.
    • Describe how you respond to a browser’s Do Not Track signal or similar mechanisms within your privacy policy instead of merely providing a link to another website; when evaluating how to “describe” your response, consider:
      • Do you treat users whose browsers express the DNT signal differently from those without one?
      • Do you collect PII about browsing activities over time and across third-party sites if you receive the DNT signal?  If so, describe your uses of the PII.
    • If you choose to link to an online program rather than describe your own response, provide the link with a general description of what the program does.
  • Third Party Tracking:
    • Disclose whether third parties are or may be collecting PII.
    • When drafting the disclosure [...]

      Continue Reading


