
Public Backlash Calls Use of Facial Recognition Systems into Question

In recent weeks and months, legal and technical issues related to the use of facial recognition systems in the United States have received national attention, including concerns that the technology lacks accuracy in identifying non-white individuals and that its widespread use by police departments may play a role in racially discriminatory policing. Privacy considerations will play a key role in the ongoing debate over the future of facial recognition technology.

Facial recognition systems (FRS) are automated or semi-automated technologies that analyze an individual’s features by extracting facial patterns from video or still images. FRS convert the attributes or features of a face into data that can be used to uniquely identify a specific individual. FRS use has grown exponentially in recent years. In addition to widespread adoption by law enforcement agencies, FRS are frequently used in the retail, banking and security sectors, including in airport screening. Particularly in recent weeks and months, legal and technical issues associated with FRS have come to the forefront, including concerns that the technology lacks accuracy in identifying non-white individuals and that its widespread use by police departments may play a role in racially discriminatory policing.
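For readers unfamiliar with how a face becomes “data,” the short Python sketch below illustrates the basic idea: a model (not shown here) converts each face image into a numeric feature vector, or embedding, and two embeddings are compared to decide whether they depict the same person. The embedding values and the 0.8 threshold are hypothetical placeholders, not parameters of any actual system.

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between two face embeddings, ranging from -1 to 1."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Hypothetical embeddings; a real FRS would derive these from face images
# using a trained model. The numbers below are placeholders.
probe_embedding = np.array([0.12, 0.85, 0.33, 0.41])
enrolled_embedding = np.array([0.10, 0.88, 0.30, 0.45])

THRESHOLD = 0.8  # illustrative decision threshold, not an industry standard
score = cosine_similarity(probe_embedding, enrolled_embedding)
print("match" if score >= THRESHOLD else "no match", f"(score={score:.3f})")
```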

In response to the global Coronavirus (COVID-19) pandemic, public health agencies and private sector companies have considered ways that FRS might be used in conjunction with proximity and geolocation tracking data to control the disease’s spread. Some foreign governments have implemented extensive biometric and behavioral monitoring to track and contain the spread of the virus, and have used FRS to identify persons who have been in contact with COVID-19-positive individuals and to enforce quarantine or stay-at-home orders. By contrast, use of FRS in the United States already faced opposition because of pre-COVID-19 data privacy concerns, and has encountered increased backlash after the civil rights protests of the past month due to concerns over the technology’s accuracy and accompanying questions regarding its use by law enforcement agencies.

Accuracy Concerns

There are currently no industry standards for the development of FRS, and as a result, FRS algorithms differ significantly in accuracy. A December 2019 National Institute of Standards and Technology (NIST) study, the third in a series conducted through its Face Recognition Vendor Test program, evaluated the effects of factors such as race and sex on facial recognition software. The study analyzed 189 facial recognition algorithms from 99 developers, using collections of photographs containing approximately 18 million images of eight million people drawn from databases provided by the US Department of State, the Department of Homeland Security and the Federal Bureau of Investigation. The study found disproportionately higher false positive rates for African American, Asian and Native American faces in one-to-one matching, and higher false positive rates for African American women in one-to-many matching, placing that group at the greatest risk of misidentification. While law enforcement agencies are encouraged to adopt a high match-confidence threshold (often 99%) for the use of FRS, in reality police departments exercise [...]
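The threshold discussion above can be made concrete with a simple simulation. The sketch below is purely illustrative and does not reproduce the NIST methodology: it draws synthetic similarity scores for a probe face compared against a gallery of enrolled identities, none of whom is the probe subject, so every score that clears the decision threshold is a false positive. Lowering the threshold well below 99% sharply increases the number of innocent people flagged.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a one-to-many search: similarity scores produced when
# a probe face is compared against 10,000 enrolled identities, none of whom is
# the probe subject. Any score at or above the threshold is a false positive.
# The beta distribution is an arbitrary illustrative choice, not real FRS data.
impostor_scores = rng.beta(a=8, b=3, size=10_000)

for threshold in (0.99, 0.95, 0.80):
    false_positives = int((impostor_scores >= threshold).sum())
    print(f"threshold {threshold:.2f}: {false_positives} false positives "
          f"({false_positives / impostor_scores.size:.2%})")
```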


Regulating the Internet of Toys

New technologies and the expansion of the Internet of Things have allowed children of this generation to experience seamless interactive technologies through microphones, GPS devices, speech recognition, sensors, cameras and other technological capabilities. These advancements create new markets for entertainment and education alike and, in the process, collect vast amounts of data from children, from their names and locations to their likes, dislikes and innermost thoughts.

The collection of data through this Internet of Toys is on the tongues of regulators and law enforcement, who are warning parents to be wary when purchasing internet-connected toys and other devices for children. These warnings also extend to connected toy makers, urging companies to comply with children’s privacy rules and signaling that focused enforcement is forthcoming.

Federal Trade Commission Makes Clear That Connected Toy Makers Must Comply with COPPA

On June 21, 2017, the Federal Trade Commission (FTC) updated its guidance for companies required to comply with the Children’s Online Privacy Protection Act (COPPA) to ensure those companies implement key protections with respect to internet-connected toys and associated services. While the FTC’s Six-Step Compliance Plan for COPPA compliance is not entirely new, there are a few key updates that reflect developments in the Internet of Toys marketplace.



