Workplace Privacy

Data Protection During and After the Pandemic: Consolidate, Update and Innovate

Having adapted products, processes, services, facilities and IT systems in response to Coronavirus (COVID-19), businesses should now refocus on their legal and business fundamentals as they move towards returning to the office. Compliance policies should be updated, Brexit contingency plans reinvigorated, and upcoming legal and regulatory changes anticipated.

While taking these steps, businesses should bear in mind a number of key data protection and IT/cybersecurity fundamentals, and take the opportunities afforded by the return to work period to kick-start new initiatives.

Click here to read the full article, and many more in our latest International News: Focus on Global Privacy and Cybersecurity.

 




Brazil’s LGPD Takes Effect—With Early Enforcement

Brazil represents over half of all IT spend in Latin America, has the largest regional market for software outsourcing, employs a sizable IT workforce, manufactures goods ranging from consumer products to cars and commercial airplanes, and has an active consumer market of social media operated by global data aggregators. At a time when data privacy is becoming increasingly important to consumers, it seems only fitting that Brazil would adopt comprehensive privacy legislation to protect data privacy rights.

The General Data Protection Law (LGPD), the first law of its kind in Brazil, is now in effect, and we are already seeing enforcement. Streamlining the legal framework for data protection, the law sets forth requirements addressing legal bases for processing, individual rights, governance and accountability, and data transfers.

Access the article.




The Toughest Problem Set: Navigating Regulatory and Operational Challenges on University Campuses

When the academic year ended in the spring of 2020, many US university students assumed that a return to campus would be straightforward this fall. However, it is now clear—at least in the near term—that a return to the old “normal” will not be possible. Some universities have concluded that their best course of action is to offer only distance learning for the time being. Other universities, however, are welcoming students back onto campus, and into residence and dining halls, classrooms, labs and libraries. Each of those universities is developing its own approach to retain the benefits of on-campus student life while reducing risk to the greatest extent possible; nevertheless, some have had to adjust their plans and pivot to remote learning when faced with clusters of positive cases on campus. One thing is clear: The fall semester will be a real-time, national learning laboratory.

Because widespread, rapid testing remains unavailable in many locations, universities have had to find innovative ways to implement testing, tracing and isolation protocols to reduce the risk of transmission among students, faculty and staff. There is no one perfect protocol; all universities are in uncharted waters. But there are a few key components university administrators may want to consider and address:

  • Apps: Symptom checkers, contact tracing and other apps can be useful in identifying and focusing attention on the onset of symptoms, fostering accountability and identifying high-risk exposure. In considering whether to incorporate apps and related technologies into their back-to-campus plans, universities must anticipate and address considerations related to privacy, security and reporting of results, and will need to consider how such apps are hosted (for example, through Apple’s App Store) and whether any third parties will have access to the personal data collected.
  • Contact Tracing: In addition to the issues noted above, contact tracing efforts present other challenges, including managing reliability, over- and under-inclusiveness, and liability (for both false positives and false negatives). In addition, the effectiveness of contact tracing is closely tied to its speed and comprehensiveness; to implement a successful contact tracing program, universities will need to balance effectiveness with privacy and autonomy.
  • CLIA: The Clinical Laboratory Improvement Amendments (CLIA) will require that many of the tests be performed in CLIA-certified (and state-licensed, where required) space. Universities will need to consider how best to build out additional compliant space, create additional “point of care” testing or specimen collection sites where needed to test students, faculty and staff where they are, and validate the test(s) being offered. Tests that are not yet validated likely cannot be used to return patient-specific results that inform student and staff care, or to prompt “official” testing.
  • FDA/Emergency Use Authorizations (EUA): In general, the Food and Drug Administration (FDA) expects developers of molecular, antigen and (in the case of test kit manufacturers) antibody tests to obtain an EUA. However, under FDA enforcement policies during the pandemic, many of these same tests—if validated and offered with appropriate agency-mandated disclaimers—can be offered before [...]

    Continue Reading



Public Backlash Calls Use of Facial Recognition Systems into Question

In recent weeks and months, legal and technical issues related to use of facial recognition systems in the United States have received national attention, including concerns that the technology lacks accuracy in identifying non-white individuals and that its widespread use by police departments may play a role in racially discriminatory policing. Privacy considerations will play a key role in the ongoing debate over the future of facial recognition technology.

Facial recognition systems (FRS) are automated or semi-automated technologies that analyze an individual’s features by extracting facial patterns from video or still images. FRS use attributes or features of an individual’s face to create data that can be used for the unique personal identification of a specific individual. FRS use has grown exponentially in recent years. In addition to widespread adoption by law enforcement agencies, FRS are also frequently used in the retail, banking and security sectors, including airport screening.

In response to the global Coronavirus (COVID-19) pandemic, public health agencies and private sector companies have considered ways that FRS might be used in conjunction with proximity and geolocation tracking data to control the disease’s spread. Some foreign governments have implemented extensive biometric and behavioral monitoring to track and contain the spread of the virus, and have used FRS to identify persons who have been in contact with COVID-19-positive individuals and to enforce quarantine or stay-at-home orders. By contrast, use of FRS in the United States already faced opposition because of pre-COVID-19 data privacy concerns, and has encountered increased backlash after the civil rights protests of the past month due to concerns over the technology’s accuracy and accompanying questions regarding its use by law enforcement agencies.

Accuracy Concerns

There are currently no industry standards for the development of FRS, and as a result FRS algorithms differ significantly in accuracy. A December 2019 National Institute of Standards and Technology (NIST) study, the third in a series conducted through its Face Recognition Vendor Test program, evaluated the effects of factors such as race and sex on facial recognition software. The study analyzed 189 facial recognition algorithms from 99 developers, using collections of photographs with approximately 18 million images of eight million people drawn from databases provided by the US Department of State, the Department of Homeland Security and the Federal Bureau of Investigation. The study found disproportionately higher false positive rates for African American, Asian and Native American faces in one-to-one matching, and higher false positive rates for African American females in one-to-many matching, putting that group at the greatest risk of misidentification. While law enforcement is encouraged to adopt a high recognition-confidence threshold (often 99%) for the use of FRS, in reality police departments exercise [...]

Continue Reading




Does GDPR Regulate My Research Studies in the United States?

The General Data Protection Regulation (GDPR) establishes protections for the privacy and security of personal data (Personal Data) about individuals in the European Union (EU) single market countries, and potentially affects the clinical and other scientific research activities of academic medical centers and other research organizations in the United States.

This On the Subject includes frequently asked questions that discuss the extent to which United States research organizations must comply with GDPR when conducting research. Future coverage will address the impact of GDPR on other aspects of the United States health care sector.

Continue reading.




To Scan or Not to Scan: Surge in Lawsuits under Illinois Biometrics Law

Although the Illinois Biometric Information Privacy Act has been on the books for almost 10 years, a recent surge in lawsuits has likely been brought on by developments in biometric scanning technology and its increased use in the workplace. At least 32 class action lawsuits have been filed in recent months by Illinois residents in state court challenging the collection, use and storage of biometric data by companies in the state. The surge may prompt companies to reevaluate their strategies and develop new defenses as they deploy advancing biometric technology.

Read “To Scan or Not to Scan: Surge in Lawsuits under Illinois Biometrics Law.”




Where Are We Now? The NIST Cybersecurity Framework One Year Later

The National Institute of Standards and Technology (NIST) released its Cybersecurity Framework (Framework) almost 15 months ago and charged critical infrastructure companies within the United States to improve their cybersecurity posture. Without question, the Framework has sparked a national conversation about cybersecurity and the controls necessary to improve it. With regulators embracing the Framework, industry will want to take note that a “voluntary” standard may evolve into a de facto mandatory standard.

Read the full On the Subject on the NIST Cybersecurity Framework on the McDermott website.




Privacy and Data Protection: 2014 Year in Review

In 2014, regulators around the globe issued guidelines, legislation and penalties in an effort to enhance security and control within the ever-shifting field of privacy and data protection. The Federal Trade Commission confirmed its expanded reach in the United States, and Canada’s far-reaching anti-spam legislation takes full effect imminently. As European authorities grappled with the draft data protection regulation and the “right to be forgotten,” the African Union adopted the Convention on Cybersecurity and Personal Data, and China improved the security of individuals’ information in several key areas. Meanwhile, Latin America’s patchwork of data privacy laws continues to evolve as foreign business increases.

This report furnishes in-house counsel and others responsible for privacy and data protection with an overview of key action points based on these and other 2014 developments, along with advance notice of potential trends in 2015. McDermott will continue to report on future updates, so check back with us regularly.

Read the full report here.




Are You Monitoring Your French Employees? Make Sure You Have Registered That Activity with the CNIL!

French employers must declare employee monitoring to the French Data Protection Authority (CNIL) in advance if they want to use evidence obtained from that monitoring in court. Use of an employee’s company mailbox for personal purposes is tolerated under French law, when reasonable. Where it is considered abusive, however, it may constitute misconduct for which the employer can impose sanctions.

Employers generally use monitoring software to discourage abuse and to establish evidence of it. Such software may be lawful provided the employer follows the rules stipulated by the French Labor Code and the French Data Protection Act to ensure the protection of personal data. In particular, the employer must inform and consult the works council, inform the employees affected by the software, and make a formal declaration of the proposed monitoring activities to the CNIL, except where a Data Protection Correspondent (Correspondant Informatique et Libertés) has been appointed.

These requirements must be met before the monitoring software is implemented. If these steps are not fulfilled, the software and the monitoring activity remain illicit, and the employer cannot rely on evidence obtained through the software to establish the employee’s misconduct.

The requirement to comply with French data privacy law was reinforced by the French Social Supreme Court in a case where an employer’s software monitoring company mailbox traffic detected that an employee had sent or received 1,228 personal messages, but the employer’s declaration of the software to the CNIL had been filed only after the employee’s dismissal process had begun.

The Social Supreme Court ruled that the employer could not use the data collected and, more generally, that any data collected by an automated personal data processing tool prior to its CNIL filing constitutes an illicit means of evidence.

This decision marks the first time the French Social Supreme Court has officially ruled that prior declaration to the CNIL is a necessary condition for the validity of evidence in this context. The conclusion and rationale echo a 2013 decision in which the French Supreme Commercial Court held the sale of client files null and void for failure to comply with CNIL registration obligations, and they demonstrate once again that data protection is becoming a key issue across all legal areas, including employment law.



