The General Data Protection Regulation (GDPR) was the biggest story of 2018 in the field of global privacy and data protection. The GDPR became enforceable in European Union Member States on May 25, 2018, significantly expanding the territorial reach of EU data protection law and introducing numerous changes that affected the way organizations globally process the personal data of their EU customers, employees and suppliers. These important changes required action by companies and institutions around the world. Almost six months after the GDPR’s effective date, organizations are still working on compliance—and will be for years to come.

Critical provisions

The GDPR applies to organizations inside and outside the EU. Organizations “established” inside the EU, essentially meaning a business or unit located in the EU, must comply with the GDPR if they process personal data in the context of that establishment. The GDPR also applies to organizations outside the EU that offer goods or services to, or monitor the behavior of, individuals located in the EU.

The GDPR also uses terms that may be unfamiliar to US businesses but that need to be understood. Both “data controllers” and “data processors” have obligations under the GDPR, and data subjects can bring actions directly against either or both of those parties. A data controller is an organization that determines how and why personal data is processed. A data controller is often, but not always, the organization that has the direct relationship with the data subject (the individual to whom the data pertains). A data processor is an organization that processes personal data on behalf of a data controller, typically a vendor or service provider. The GDPR defines “processing” to mean any operation or set of operations performed on personal data or on sets of personal data, whether or not by automated means (e.g., collection, recording, storage, alteration, use, disclosure and structuring).

The GDPR also broadly defines “personal data” as any information directly or indirectly relating to an identified or identifiable natural person, such as a name, identification number, location data, an online identifier, or one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person. Organizations in the US are used to a narrower definition of personal data, which typically includes information that, if breached, would put an individual at risk of identity theft or fraud and require notice (e.g., Social Security numbers, driver’s license numbers, and financial account, credit and debit card numbers).

The validity of Model Clauses for EU personal data transfer to the United States is now in real doubt as a result of a new Irish High Court judgment stating that there are “well founded grounds” to find the Model Clauses invalid. The issue of Model Clauses as a legitimate data transfer mechanism will now be adjudicated by the European Court of Justice (ECJ), the same court that previously overturned the Safe Harbor arrangement. EU and US companies will need to consider various strategies in anticipation of this decision.

With the United Kingdom having voted to leave the European Union (Brexit) on 23 June 2016, the free flow of personal data between the United Kingdom and EU and European Economic Area (EEA) countries is at risk. Even though Brexit will likely have the biggest impact on the financial sector, businesses in the United Kingdom that rely on the free flow of personal data to and from EU nations will also be affected. In particular, should the United Kingdom also leave the EEA and thus become a “third country” for the purposes of data protection laws, transfers to data processors in the United Kingdom would have to be based on an adequacy decision of the European Commission, standard contractual clauses (model contracts) or binding corporate rules.

On August 17, 2015, the Federal Trade Commission (FTC) announced settlements with 13 companies on charges that they misled consumers by claiming that they were certified members of the U.S.-EU or U.S.-Swiss Safe Harbor programs when in fact their certifications had lapsed or never existed in the first place. The FTC’s announcement comes on the heels of two previous settlements reached in late May 2015 with companies that had lapsed certifications despite representations to the contrary made to online consumers. This recent activity by the FTC serves as yet another reminder to businesses to monitor their Safe Harbor program certification renewal dates and to exercise care when making representations in privacy policies related to Safe Harbor program certification.

The Safe Harbor programs provide a method for U.S. companies to transfer personal data outside of the European Union (EU) or European Economic Area (EEA) consistent with the requirements of the European Union Directive on Data Protection or the Swiss Federal Act on Data Protection. To participate in a Safe Harbor program, a company must self-certify to the U.S. Department of Commerce that it complies with seven privacy principles and related requirements. Once certified, a company is required to renew its certification with the Department of Commerce each year to maintain its status as a current member of the Safe Harbor program.

The companies at the center of the recent enforcement actions represent a variety of industries, including app development, pharmaceutical and biotechnology research, medical waste processing and wholesale food manufacturing. This broad industry representation suggests to us that the FTC is committed to ongoing enforcement. Accordingly, we want to remind readers of these tips:

  • Check your company’s certification status to ensure that it is marked “current” on the Department of Commerce website: https://safeharbor.export.gov/list.aspx;
  • Review any privacy policies and online statements referencing the Safe Harbor programs to ensure that they properly reflect the certification status and the company’s actual privacy and data security practices;
  • Institute a systematic reminder six months before the recertification date that triggers a compliance review, with a due date for completing that review before the recertification deadline and a requirement that the online recertification itself be completed before the annual deadline;
  • Remove all references to the Safe Harbor programs from publicly available privacy policies and statements if the company’s certification status is unclear; and
  • Review substantive compliance with the Safe Harbor programs and institute corrective action and controls to ensure that compliance is maintained.

The mission of the French data protection authority—the Commission Nationale Informatique et Libertés (CNIL)—is “to protect personal data, support innovation, [and] preserve individual liberties.”

In addition to its general inspections, every year the CNIL establishes a different targeted-inspection program. This program identifies the specific areas on which the CNIL’s controls will concentrate for the following year. The 2014 inspection program focused on everyday tools and services, such as online payment, online tax payment and dating websites, among other things.

On May 25, 2015, the CNIL announced its 2015 inspection program, identifying six issues in particular: contactless payment, the national driving license file (Le Fichier National des Permis de Conduire), “well-being and health” connected devices, tools for monitoring attendance in public places, the treatment of personal data during the evaluation of psychosocial risks and Binding Corporate Rules.

The last two issues caught our attention:

  • Treatment of personal data during evaluation of psychosocial risks: Since 2008, many companies have been investigating psychosocial risks within the workplace in order to provide a less stressful environment. This practice, however, raises issues concerning employees’ right not to share private information with the employer. The CNIL will try to identify which prior investigations may have jeopardized (or may still be jeopardizing) employees’ right to privacy.
  • Binding Corporate Rules: Companies seeking to export data outside of the European Union (EU) may adopt a voluntary set of data-protection rules within their corporate group called Binding Corporate Rules (BCR). These BCRs are intended to provide a level of privacy and data protection within the entire corporate group equivalent to the one found under EU law. So far, 68 companies have adopted BCRs. Through its 2015 inspection program, the CNIL wants to give the BCRs a closer look, making sure that the means and devices used are in compliance with French law.

In addition to focusing its 2015 inspection program on BCR compliance, the CNIL also announced, earlier this year, the simplification of intra-group data transfers. Prior to simplification, companies whose BCRs had been approved by the CNIL were also required to obtain the CNIL’s approval for each new type of transfer. The CNIL has since declared that a new, personalized “single decision” will be given to companies with approved BCRs. In return, the companies must keep an internal record of all transfers detailing certain information (the general purpose of each transfer based on the BCR; the category of data subjects concerned by the transfer; the categories of personal data transferred; and information on each data recipient) in accordance with the terms of the single decision issued.
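
By way of illustration only, the sketch below shows what one entry in such an internal transfer record might look like, assuming a simple in-house log; the field names are hypothetical and are not taken from the CNIL’s decision.

```typescript
// Hypothetical sketch of an internal record of intra-group transfers kept
// under a CNIL "single decision" for approved BCRs. Field names are
// illustrative; the CNIL only prescribes the categories of information.
interface IntraGroupTransferRecord {
  purpose: string;                   // general purpose of the transfer under the BCR
  dataSubjectCategory: string;       // category of data subjects concerned
  personalDataCategories: string[];  // categories of personal data transferred
  recipients: Array<{                // information on each data recipient
    entityName: string;
    country: string;
  }>;
}

// Example entry with purely illustrative values.
const exampleTransfer: IntraGroupTransferRecord = {
  purpose: "Centralised HR administration",
  dataSubjectCategory: "Employees",
  personalDataCategories: ["identity data", "contact details", "payroll data"],
  recipients: [{ entityName: "Example Group HR Services Inc.", country: "United States" }],
};
```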

With respect to its targeted inspection program, the question remains: how many inspections will the CNIL conduct in 2015? In 2014, the CNIL performed a total of 421 inspections. The CNIL has declared that its objective for 2015 is 550 inspections. However, only 28 percent of the CNIL’s inspections typically result from the annual inspection program; 40 percent are initiated by the CNIL itself, and the remainder (roughly a third) are initiated on request. Using these percentages, if the CNIL reaches its objective of 550 inspections, we expect that somewhere around 154 of them will result from the 2015 inspection program.
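
As a quick back-of-the-envelope check of that estimate, the following sketch simply applies the 2014 proportions to the 2015 target; it assumes those proportions carry over unchanged, which is only a rough approximation.

```typescript
// Rough estimate of how the 550 planned 2015 inspections might break down,
// assuming the 2014 proportions (28% annual program, 40% CNIL's own
// initiative, the remainder on request) hold for 2015.
const target2015 = 550;

const shares: Record<string, number> = {
  annualInspectionProgram: 0.28, // targeted inspection program
  cnilOwnInitiative: 0.40,       // inspections the CNIL opens itself
  onRequest: 0.32,               // remainder, opened on request
};

for (const [origin, share] of Object.entries(shares)) {
  console.log(`${origin}: about ${Math.round(target2015 * share)} inspections`);
}
// annualInspectionProgram: about 154 inspections, the figure cited above.
```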

During 2014, the Office for Civil Rights (OCR) of the U.S. Department of Health & Human Services initiated six enforcement actions in response to security breaches reported by entities covered by the Health Insurance Portability and Accountability Act (HIPAA) (covered entities), five of which involved electronic protected health information (EPHI).  The resolution agreements and corrective action plans resolving the enforcement actions highlight key areas of concern for OCR and provide the following important reminders to covered entities and business associates regarding effective data protection programs.

  1. Security risk assessment is key.

OCR noted in the resolution agreements related to three of the five security incidents, involving QCA Health Plan, Inc., New York and Presbyterian Hospital (NYP) and Columbia University (Columbia), and Anchorage Community Mental Health Services (Anchorage), that each entity failed to conduct an accurate and thorough assessment of the risks and vulnerabilities to the entity’s EPHI and to implement security measures sufficient to reduce the risks and vulnerabilities to a reasonable and appropriate level.  In each case, the final corrective action plan required submission of a recent risk assessment and corresponding risk management plan to OCR within a relatively short period after the effective date of the resolution agreement.

  2. A risk assessment is not enough – entities must follow through with remediation of identified threats and vulnerabilities.

In the resolution agreement related to Concentra Health Services (CHS), OCR noted that although CHS had conducted multiple risk assessments that recognized a lack of encryption on its devices containing EPHI, CHS failed to thoroughly implement remediation of the issue for more than three and a half years.

  3. System changes and data relocation can lead to unintended consequences.

In two of the cases, the underlying cause of the security breach was a technological change that led to the public availability of EPHI.  A press release on the Skagit County incident notes that Skagit County inadvertently moved EPHI related to 1,581 individuals to a publicly accessible server and initially reported a security breach with respect to only seven individuals, evidently failing at first to identify the larger security breach.  According to a press release related to the NYP/Columbia security breach, the breach was caused when a Columbia physician attempted to deactivate a personally owned computer server on the network, which, due to a lack of technological safeguards, led to the public availability of certain of NYP’s EPHI on internet search engines.

  4. Patch management and software upgrades are basic, but essential, defenses against system intrusion.

OCR noted in its December 2014 bulletin on the Anchorage security breach (2014 Bulletin) that the breach was a direct result of Anchorage’s failure to identify and address basic security risks. For example, OCR noted that Anchorage did not regularly update IT resources with available patches and ran outdated, unsupported software.

  5. HIPAA policies and procedures that merely sit on the shelf are not sufficient.

OCR noted the failure of two covered entities to follow policies and procedures that each entity had adopted.  In the NYP resolution agreement, OCR noted that, with respect to a data sharing arrangement with Columbia, NYP had “failed to comply with its own policies on information access management.”  Similarly, OCR noted in the 2014 Bulletin that its investigation of Anchorage revealed that Anchorage “had adopted sample Security Rule policies and procedures in 2005, but [that] these were not followed.”

Impending sweep day to verify compliance with guidelines on cookies

During the week of September 15–19, 2014, France’s privacy regulator, the Commission Nationale de l’Informatique et des Libertés (CNIL), is organizing a “cookies sweep day” to examine compliance with its guidelines on cookies and other online trackers.

Starting in October 2014, the CNIL will also be conducting onsite and remote inspections to verify compliance with its guidelines on cookies.

Depending on the findings of the sweep and inspections, the CNIL may issue warnings or financial sanctions to non-compliant websites and applications.

Investigations gaining momentum

France is not the only country stepping up its data privacy efforts.  Parallel sweeps to the one conducted by the CNIL in September 2014 will be undertaken simultaneously by data protection authorities across the European Union.  The purpose of the coordinated action is to compare practices on the information given by websites to internet users and the methods to obtain their consent for cookies.

Nor is this the first time such a sweep has been organized in France.  In May 2013, the CNIL joined 19 counterparts worldwide in an audit of the 2,180 most visited websites and applications.  In that operation, known as “Internet Sweep Day”, the CNIL examined the compliance of 250 frequently visited websites and found that 99 percent of websites visited by French internet users collect personal information.  Of those that provided information on their data privacy policy, a considerable number did not render it easily accessible, clearly articulated or even written in French.

Compliance made simpler through CNIL guidelines

EU Directive 2002/58 on Privacy and Electronic Communications imposes an obligation to obtain prior consent before placing or accessing cookies and similar technologies on web users’ devices, an obligation incorporated into French law by Article 32-II of the French Data Protection Act.

Not all cookies require prior consent by internet users.  Exempt are cookies used “for the sole purpose of carrying out the transmission of a communication over an electronic communications network” and those that are “strictly necessary for the provision of an information service explicitly requested by the subscriber or user.”

For those cookies that require prior consent, the CNIL will verify how consent is obtained.  Under the CNIL guidelines, consent may be obtained either through an actual click or through the user’s continued navigation within the site while a banner informing him or her of the website’s use of cookies remains displayed.

Website owners can rely on tools made available by the CNIL to ensure their compliance with the cookie requirements.  In particular, a set of guidelines released by the CNIL in December 2013 explains how to obtain consent for the use of cookies and other online trackers in compliance with EU and French data protection requirements.

Under the CNIL guidelines, owners of websites may not force internet users to accept cookies.  Instead, the users must be able to block advertising cookies and still use the relevant service.  Internet users can withdraw their consent at any time, and cookies have a lifespan limited to 13 months after which consent must be sought again.
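
Purely as an illustration of these rules, here is a minimal sketch of how a site might record banner consent in a first-party cookie; the cookie name and helper functions are hypothetical, not taken from the CNIL guidelines.

```typescript
// Hypothetical sketch of recording cookie consent in the browser.
// Consent expires after roughly 13 months, after which the banner must be
// shown and consent obtained again, per the CNIL guidelines.
const CONSENT_COOKIE = "cookie_consent";              // hypothetical cookie name
const THIRTEEN_MONTHS_SECONDS = 13 * 30 * 24 * 3600;  // approximately 13 months

function hasValidConsent(): boolean {
  return document.cookie
    .split("; ")
    .some((entry) => entry === `${CONSENT_COOKIE}=accepted`);
}

function recordConsent(): void {
  // Called only after an affirmative act: a click on the banner, or continued
  // navigation while the information banner is displayed.
  document.cookie =
    `${CONSENT_COOKIE}=accepted; max-age=${THIRTEEN_MONTHS_SECONDS}; path=/; SameSite=Lax`;
}

function withdrawConsent(): void {
  // Users must be able to withdraw consent at any time and still use the site.
  document.cookie = `${CONSENT_COOKIE}=; max-age=0; path=/`;
}

// Advertising or analytics trackers would be loaded only once
// hasValidConsent() returns true; exempt "strictly necessary" cookies
// may be set without consent.
```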

This blog post was authored with assistance from May El Khoury, a third-year student at Boston College Law School.

On 12 June 2014, in a letter to the President of the European Parliament, the Article 29 Data Protection Working Party defended, and urged the EU institutions to discuss, Binding Corporate Rules for Processors (BCR-P) in respect of the forthcoming EU General Data Protection Regulation.

In its letter, the Working Party clarifies its views on BCR-P, outlines the safeguards that BCR-P offer and addresses concerns that have led some to call for the dropping of BCR-P. The letter suggests that these issues should be covered during future trialogues between the EU Council, the European Commission (both of which received copies of the letter) and the European Parliament.

Background

Binding Corporate Rules (BCR) represent one of the ways that a data controller can overcome the general prohibition contained in the EU Data Protection Directive (95/46/EC) on cross-border transfers of personal data to countries outside the EEA that do not offer adequate levels of data protection. Broadly, BCR are legally enforceable corporate rules applied by company groups which, on the approval of the relevant national data protection authority, are deemed to ensure sufficient protection for international transfers between group companies.

In December 2011, the European Commission announced that BCR would be updated in the new EU General Data Protection Regulation. Whilst BCR only apply to data controllers, the Working Party is a proponent of BCR-P (which apply similarly to data processors rather than data controllers) and, in June 2012, established a BCR-P framework. In brief, BCR-P permit data processors, on the instruction of data controllers, to forward personal data to their group companies, otherwise known as “sub-processing”. The Working Party has officially permitted companies to apply for BCR-P since January 2013. To date, three international organisations have had BCR-P approved by their national data protection authorities, with a further 10 currently under review.

In Defence of BCR-P

In its letter, the Working Party encloses an explanatory document setting out the main guarantees offered to data controllers, data subjects and data protection authorities generally, relating to:

  • Use of external sub-processors;
  • Conflicts between applicable legislation and BCR-P and/or service agreements, and access by law enforcement authorities;
  • Controllers’ rights;
  • Data subjects’ rights;
  • Processors’ obligations towards data protection authorities; and
  • Implementation of accountability measures.

The Working Party also stresses the high level of protection that BCR-P offer to international transfers of personal data, which, according to the Working Party, represent the “optimal solution” for promoting data protection principles abroad. By contrast, the Working Party suggests that alternatives such as model clauses or Safe Harbour do not offer a comparable level of protection.

In response to calls for the European Parliament to drop BCR-P from future legislation due to a lack of guarantees to frame sub-processing activities, the Working Party clarifies that BCR-P offer greater levels of protection than those currently provided by the European Parliament. Furthermore, the Working Party concludes that dropping BCR-P would create legal uncertainty and represent a general loss to those organisations with approved BCR-P or those currently applying for such approval.

On May 30, 2014, the European Union’s Article 29 Data Protection Working Party adopted its “Statement on the role of a risk-based approach in data protection legal frameworks” (WP218).  The Working Party, made up of EU member state national data protection authorities, confirmed its support for a risk-based approach in the EU data protection legal framework, particularly in relation to the proposed reform of the current data protection legislation.  However, in order to “set the record straight,” the Working Party also addresses its concerns as to the interpretation of such an approach and sets out its “key messages” on the issue.

Approaching Risk

In support of the risk-based approach, which broadly calls for increased obligations proportionate to the risks involved in data processing, the Working Party sets out examples of its application in the current Data Protection Directive (95/46/EC) and the proposed General Data Protection Regulation.  The Working Party confirms that the risk-based approach must result in the same level of protection for data subjects, no matter the size of the particular organisation or the amount of data processed.  However, the Working Party clarifies that the risk-based approach should not be interpreted as an alternative to established data protection rights, but instead as a “scalable and proportionate approach to compliance.”  Consequently, the Working Party accepts that low-risk data processing may involve less stringent obligations on data controllers than comparatively high-risk data processing.

Key Messages

To conclude its views on the risk-based approach, the Working Party establishes 13 key messages – in summary:

  1. Protection of personal data is a fundamental right and any processing should respect that right;
  2. Whatever the level of risk involved, data subjects’ legal rights should be respected;
  3. While the levels of accountability obligations can vary according to the risk of the processing, data controllers should always be able to demonstrate compliance with their data protection obligations;
  4. While fundamental data protection principles relating to data controllers should remain the same whatever the risks posed to data subjects, such principles are still inherently scalable;
  5. Accountability obligations should be varied according to the type and risk of processing involved;
  6. All data controllers should document their processing, although the form of documentation can vary according to the level of risk posed by the processing;
  7. Objective criteria should be used when determining risks which could potentially negatively impact a data subject’s rights, freedoms and interests;
  8. A data subject’s rights and freedoms primarily concern the right to privacy, but also encompass other fundamental rights, such as freedom of speech, thought and movement, prohibition on discrimination, and the right to liberty, conscience and religion;
  9. Where specific risks are identified, additional measures should be taken – data protection authorities should be consulted regarding highly risky processing;
  10. While pseudonymising techniques are important safeguards that can be taken into account when assessing compliance, such techniques alone do not justify a reduced regime on accountability obligations;
  11. The risk-based approach should be assessed on a very wide scale and take into account every potential/actual adverse effect;
  12. The legitimate interest pursued by data controllers or third parties is not relevant when assessing the risks for data subjects; and
  13. Under the proposed General Data Protection Regulation, data protection authorities will have an active role in respect of the risk-based approach, including inter alia developing guidelines on impact assessments and targeting enforcement activity on areas of greater risk.

German data protection authorities published new guidelines in December 2013 about the collection and processing of personal data for advertising purposes.  The 2013 advertising guidelines (available here in German) supplement another set of advertising guidelines published in October 2012 (available here in German). Together, the 2012 and 2013 guidelines help to clarify how German data protection law relates to advertising activities.

The 2013 guidelines cover the following three main topics:

  1. The “list-privilege” exception. The Bundesdatenschutzgesetz (Germany’s Data Protection Act) provides an exception to the rule that consent is required before personal data is processed for advertising. Certain personal data, such as name, address and year of birth, may be used for an organisation’s own marketing purposes without prior consent as long as the data is aggregated. The 2013 guidelines provide useful information on this exception by, for example, clearly stating that e-mail addresses and telephone numbers do not qualify for this exemption, and by providing commentary on how long the information qualifies for this exception after the last time it is used to contact the person to whom the data relates (generally two years, but the 2013 guidelines state that the time period can vary based on the facts).
  2. Consent. The 2013 guidelines confirm that leaving business cards at a trade show for the express purpose of receiving information from business contacts constitutes consent to contact the person named on the business card for advertising purposes. For the digital world, the 2013 guidelines advise a double opt-in for consent provided electronically (e.g., via e-mail or SMS).  Under the 2013 guidelines, double opt-in consent means that the person providing personal data about himself or herself must: (i) affirmatively consent (e.g., by clicking a button or checking an unchecked box on a website) at the time of data collection; and (ii) confirm his/her consent after receiving a written request for confirmation of consent (e.g., a detailed e-mail requiring confirmation by clicking a link) that includes enough information to enable the person to provide informed consent. (A sketch of this two-step flow appears after this list.)
  3. Right to object to use of personal data for advertising purposes. The 2013 guidelines state that when a person indicates (whether in writing or otherwise) that he/she no longer wishes to be contacted for advertising purposes, the organisation holding the data should act on that request without “undue delay.” (The 2013 guidelines do not specify what time period would amount to undue delay.)  The 2013 guidelines do, however, recognise that stopping all communications immediately may not be feasible, and advise organisations to inform individuals of any time lag to avoid complaints. The 2013 guidelines also remind organisations that a statement publicising an individual’s right to object should be included on every marketing communication, not just in (often lengthy) website terms and conditions.
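
To make the double opt-in flow described in item 2 concrete, here is a minimal, hypothetical sketch of the two steps; the function names, fields and confirmation URL are illustrative assumptions, not anything prescribed by the guidelines.

```typescript
// Hypothetical sketch of a double opt-in flow for electronically provided
// consent: step 1 is an affirmative act at the point of data collection,
// step 2 a confirmation of a request sent to the address the person supplied.
import { randomUUID } from "crypto";

interface PendingConsent {
  email: string;
  token: string;       // one-time token embedded in the confirmation link
  requestedAt: Date;
  confirmedAt?: Date;  // set only once the person confirms
}

const pendingConsents = new Map<string, PendingConsent>();

// Step 1: the person actively opts in (e.g., checks an unchecked box).
function requestConsent(email: string): PendingConsent {
  const record: PendingConsent = { email, token: randomUUID(), requestedAt: new Date() };
  pendingConsents.set(record.token, record);
  // A confirmation e-mail would be sent here, containing a link such as
  // https://example.com/confirm?token=<token> together with enough detail
  // about the intended advertising use for the consent to be informed.
  return record;
}

// Step 2: the person clicks the confirmation link; only after this does the
// address get used for advertising purposes.
function confirmConsent(token: string): boolean {
  const record = pendingConsents.get(token);
  if (!record) return false;
  record.confirmedAt = new Date();
  return true;
}
```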