Fair Credit Reporting Act

FTC Report Alerts Organizations about the Risks and Rewards of Big Data Analytics

On January 6, 2016, the Federal Trade Commission (FTC) released a report that it hopes will educate organizations on the important laws and research that are relevant to big data analytics. The report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, looks specifically at how big data is used after it is collected and analyzed and provides suggestions aimed at maximizing the benefits and minimizing the risks of using big data.

Risks and Rewards

The report argues that big data analytics can provide numerous opportunities for improvements in society. In addition to more effectively matching products and services to consumers, big data can create opportunities for low-income and underserved communities. The report highlights a number of innovative uses of big data that provide benefits to underserved populations, such as increased educational attainment, access to credit through nontraditional methods, specialized health care for underserved communities, and better access to employment.

At the same time, the report shows that potential inaccuracies and biases might lead to detrimental effects for low-income and underserved populations. For example, organizations could use big data to inadvertently exclude low-income and underserved communities from credit and employment opportunities, which may reinforce existing disparities or weaken the effectiveness of consumer choice.

Considerations for Using Big Data

The report outlines some of the consumer protection laws (in particular, the Fair Credit Reporting Act and FTC Act) and equal opportunity laws that apply to the use of big data, especially with regard to possible issues of discrimination or exclusion. It also recommends that an organization consider the following questions to help ensure that its use of big data analytics does not lead to unlawful exclusion or discrimination:

How representative is your data set? 

If the data set is missing information from particular populations, take appropriate steps to address this problem.

Does your data model account for biases? 

Review data sets and algorithms to ensure that hidden biases do not have an unintended impact on certain populations.

How accurate are your predictions based on big data? 

Balance the risks of using correlative results, especially where the business’ policies could negatively affect certain populations.

Does your reliance on big data cause ethical or fairness concerns?

Consider whether fairness and ethical considerations advise against using big data in certain circumstances and whether the business can use big data in ways that advance opportunities for previously underrepresented populations.

Monitoring and Enforcement Ahead

The FTC stated that its collective challenge is to make sure that big data analytics continue to provide benefits and opportunities to consumers while adhering to core consumer protection values and principles. It has committed to continue monitoring areas where big data practices could violate existing laws and to bring enforcement actions where appropriate. With that in mind, organizations that already use big data, and those that have been persuaded by the reported benefits of big data, should heed [...]





Data Broker’s Appeal to U.S. Supreme Court Could Reshape Future of Data Privacy Litigation

In a case that could shape the future of data privacy litigation, the Supreme Court recently agreed to review the decision of the U.S. Court of Appeals for the Ninth Circuit under the Fair Credit Reporting Act (FCRA) in Robins v. Spokeo, Inc. At issue is the extent to which Congress may create statutory rights that, when violated, are actionable in court, even if the plaintiff has not otherwise suffered a legally redressable injury.

Spokeo is a data broker that provides online “people search capabilities” and “business information search” (i.e., business contacts, emails, titles, etc.). Thomas Robins (Robins) sued Spokeo in federal district court for publishing data about Robins that incorrectly represented him as married and as having a graduate degree and more professional experience and money than he actually had. Robins alleged that Spokeo’s inaccurate data caused him actual harm by (among other alleged harms) damaging his employment prospects.

After some initial indecision, the district court dismissed the case in 2011 on the grounds that Robins had not sufficiently alleged any actual or imminent harm traceable to Spokeo’s data.  Without evidence of actual or imminent harm, Robins did not have standing to bring suit under Article III of the U.S. Constitution.  Robins appealed.

On February 4, 2014, the Court of Appeals for the Ninth Circuit announced its decision to reverse the district court, holding that the FCRA allowed Robins to sue for a statutory violation: “When, as here, the statutory cause of action does not require proof of actual damages, a plaintiff can suffer a violation of the statutory right without suffering actual damages.” The Court of Appeals acknowledged limits on Congress’ ability to create redressable statutory causes of action but held that Congress did not exceed those limits in this case.  The court held that “the interests protected” by the FCRA were “sufficiently concrete and particularized” such that Congress could create a statutory cause of action, even for individuals who could not show actual damages.

Why Spokeo Matters

If the Supreme Court reverses the Ninth Circuit, the ruling could dramatically redraw the landscape of data privacy litigation in favor of businesses by requiring plaintiffs to allege and eventually prove actual damages. Such a ruling could severely limit lawsuits brought under several privacy-related statutes under which plaintiffs typically seek statutory damages on behalf of a class without needing to show actual damages suffered by the class members. Litigation under the FCRA, the Telephone Consumer Protection Act and the Video Privacy Protection Act (among other statutes) could all be affected.




FTC Releases Extensive Report on the “Internet of Things”

On January 27, 2015, U.S. Federal Trade Commission (FTC) staff released an extensive report on the “Internet of Things” (IoT). The report, based in part on input the FTC received at its November 2013 workshop on the subject, discusses the benefits and risks of IoT products to consumers and offers best practices for IoT manufacturers to integrate the principles of security, data minimization, notice and choice into the development of IoT devices. While the FTC staff’s report does not call for IoT-specific legislation at this time, given the rapidly evolving nature of the technology, it reiterates the FTC’s earlier recommendation to Congress to enact strong federal data security and breach notification legislation.

The report also describes the tools the FTC will use to ensure that IoT manufacturers consider privacy and security issues as they develop new devices. These tools include:

  • Enforcement actions under such laws as the FTC Act, the Fair Credit Reporting Act (FCRA) and the Children’s Online Privacy Protection Act (COPPA), as applicable;
  • Development of consumer and business education materials in the IoT area;
  • Participation in multi-stakeholder groups considering guidelines related to IoT; and
  • Advocacy to other agencies, state legislatures and courts to promote protections in this area.

In furtherance of its initiative to provide educational materials on IoT for businesses, the FTC also announced the publication of “Careful Connections: Building Security in the Internet of Things.” The site offers advice and resources to help businesses implement “security by design” and consider security at every stage of the product development lifecycle for internet-connected devices.

This week’s report is one more sign pointing toward our prediction regarding the FTC’s increased activity in the IoT space in 2015. 




New Data Disposal Law in Delaware Requires Action by Impacted Businesses

While the federal government continues its inaction on data security bills pending in Congress, some U.S. states have been busy at work on this issue over the summer.  A new Delaware law H.B. 295, signed into law on July 1, 2014 and effective January 1, 2015, provides for a private right of action in which a court may order up to triple damages in the event a business improperly destroys personal identifying information at the end of its life cycle.  In addition to this private right of action, the Delaware Attorney General may file suit or bring an administrative enforcement proceeding against the offending business if it is in the public interest.

Under the law, personal identifying information is defined as:

A consumer’s first name or first initial and last name in combination with any one of the following data elements that relate to the consumer, when either the name or the data elements are not encrypted:

  • his or her signature,
  • full date of birth,
  • social security number,
  • passport number, driver’s license or state identification card number,
  • insurance policy number,
  • financial services account number, bank account number,
  • credit card number, debit card number,
  • any other financial information or
  • confidential health care information, including all information relating to a patient’s health care history, diagnosis, condition, treatment or evaluation obtained from a health care provider who has treated the patient, which explicitly or by implication identifies a particular patient.

Interestingly, this new law exempts from its coverage banks and financial institutions that are merely subject to the Gramm-Leach-Bliley Act, but it exempts health insurers and health care facilities only if they are subject to and in compliance with the Health Insurance Portability and Accountability Act (HIPAA), and credit reporting agencies only if they are subject to and in compliance with the Fair Credit Reporting Act (FCRA).

Given that the HIPAA and FCRA exemptions are conditioned on compliance, we expect plaintiffs’ attorneys to seek the private right of action and triple damages in every case where a HIPAA- or FCRA-covered entity fails to properly dispose of personal identifying information, on the theory that such a failure itself evidences noncompliance with HIPAA or the FCRA, thus canceling the exemption. Note, however, that some courts have refused to allow state law claims of improper data disposal to proceed where they were preempted by federal law. See, e.g., Willey v. JP Morgan Chase, Case No. 09-1397, 2009 U.S. Dist. LEXIS 57826 (S.D.N.Y. July 7, 2009) (dismissing individual and class claims alleging improper data disposal based on state law, finding they were preempted by the FCRA).

The takeaway?  Companies that collect, receive, store or transmit personal identifying information of residents of the state of Delaware (or any of the 30+ states in the U.S. that now have data disposal laws on the books) should examine their data disposal policies and practices to ensure compliance with these legal requirements.  In the event a business is alleged to have violated one of [...]




