
FTC Report Alerts Organizations about the Risks and Rewards of Big Data Analytics

On January 6, the Federal Trade Commission (FTC) released a report that it hopes will educate organizations on the important laws and research that are relevant to big data analytics. The report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, looks specifically at how big data is used after it is collected and analyzed and provides suggestions aimed at maximizing the benefits and minimizing the risks of using big data.

Risks and Rewards

The report argues that big data analytics can provide numerous opportunities for improvements in society. In addition to more effectively matching products and services to consumers, big data can create opportunities for low income and underserved communities. The report highlights a number of innovative uses of big data that provide benefits to underserved populations, such as increased educational attainment, access to credit through nontraditional methods, specialized health care for underserved communities, and better access to employment.

At the same time, the report shows that potential inaccuracies and biases might lead to detrimental effects for low-income and underserved populations. For example, organizations could use big data to inadvertently exclude low-income and underserved communities from credit and employment opportunities, which may reinforce existing disparities or weaken the effectiveness of consumer choice.

Considerations for Using Big Data

The report outlines some of the consumer protection laws (in particular, the Fair Credit Reporting Act and FTC Act) and equal opportunity laws that apply to the use of big data, especially with regard to possible issues of discrimination or exclusion. It also recommends that an organization consider the following questions to help ensure that its use of big data analytics does not lead to unlawful exclusion or discrimination:

  • How representative is your data set? If the data set is missing information from particular populations, take appropriate steps to address this problem.
  • Does your data model account for biases? Review data sets and algorithms to ensure that hidden biases do not have an unintended impact on certain populations.
  • How accurate are your predictions based on big data? Balance the risks of using correlative results, especially where the business’ policies could negatively affect certain populations.
  • Does your reliance on big data cause ethical or fairness concerns? Consider whether fairness and ethical considerations advise against using big data in certain circumstances and whether the business can use big data in ways that advance opportunities for previously underrepresented populations.

Monitoring and Enforcement Ahead

The FTC stated that its collective challenge is to make sure that big data analytics continue to provide benefits and opportunities to consumers while adhering to core consumer protection values and principles. It has committed to continue monitoring areas where big data practices could violate existing laws and to bring enforcement actions where appropriate.  With that in mind, organizations that already use big data and those that have been persuaded by the reported benefits of big data should heed [...]

Continue Reading





FTC Sees Disconnect on Proposed Connected Cars Legislation

The Energy & Commerce Committee of the U.S. House of Representatives held a hearing on October 21st titled “Examining Ways to Improve Vehicle and Roadway Safety” to consider (among other matters) Vehicle Data Privacy legislation for internet-connected cars.

The proposed legislation includes requirements that auto manufacturers:

  • “Develop and implement” a privacy policy incorporating key elements on the collection, use and sharing of data collected through technology in vehicles. By providing the policy to the National Highway Traffic Safety Administration, a manufacturer earns certain protection against enforcement action under Section 5 of the Federal Trade Commission Act.
  • Retain data no longer than is determined necessary for “legitimate business purposes.”
  • Implement “reasonable measures” to ensure that the data is protected against theft/unauthorized access or use (hacking).

Manufacturers that fail to comply face a maximum penalty of $1 million per manufacturer. The penalty for failure to protect against hacking is up to $100,000 per “unauthorized” access.

Maneesha Mithal, Associate Director, Division of Privacy and Identity Protection, of the Federal Trade Commission (FTC), testified that the proposed legislation “could substantially weaken the security and privacy protections that consumers have today.”

The FTC’s criticism focuses on the proposed safe harbor against FTC enforcement for manufacturers. The FTC testified that a manufacturer should not earn immunity under the FTC Act if the privacy policy offers little or no privacy protection, or is not followed or enforced. The FTC expressed disapproval of provisions allowing retroactive application of a privacy policy to data previously collected. The FTC also advised against applying the proposed safe harbor to data outside of the vehicle, such as data collected from a website or mobile app.

Although the FTC applauded the goal of deterring criminal hacking of auto systems, the FTC testified that the legislation, as drafted, may disincentivize manufacturers’ efforts to improve safety and privacy. The testimony echoed that of other industry critics who believe that what is considered “authorized” access is too vague, which may prevent manufacturers from allowing others to access vehicle data systems, such as for repair or research on an auto’s critical systems.

Finally, the FTC criticized the provisions creating a council to develop cybersecurity best practices.  Since the council could operate by a simple majority, it could act without any government or consumer advocacy input, diluting consumer protections.

The hearing agenda, as well as the text of the draft legislation, is available here.

The FTC’s prepared statement, as well as the text of the testimony, is available here.





The FTC Continues to Flex its Safe Harbor Enforcement Muscles

On August 17, 2015, the Federal Trade Commission (FTC) announced settlements with 13 companies on charges that they misled consumers by claiming that they were certified members of the U.S.-EU or U.S.-Swiss Safe Harbor programs when in fact their certifications had lapsed or never existed in the first place. The FTC’s announcement comes on the heels of two previous settlements reached in late May 2015 with companies that had lapsed certifications despite representations to the contrary made to online consumers. This recent activity by the FTC serves as yet another reminder to businesses to monitor their Safe Harbor program certification renewal dates and to exercise care when making representations in privacy policies related to Safe Harbor program certification.

The Safe Harbor programs provide a method for U.S. companies to transfer personal data outside of the European Union (EU) or European Economic Area (EEA) consistent with the requirements of the European Union Directive on Data Protection or the Swiss Federal Act on Data Protection. To participate in a Safe Harbor program, a company must self-certify to the U.S. Department of Commerce that it complies with seven privacy principles and related requirements. Once certified, a company is required to renew its certification with the Department of Commerce each year to maintain its status as a current member of the Safe Harbor program.

The companies at the center of the recent enforcement actions represent a variety of industries, including app development, pharmaceutical and biotechnology research, medical waste processing and wholesale food manufacturing. This broad industry representation suggests to us that the FTC is committed to ongoing enforcement. Accordingly, we want to remind readers of these tips:

  • Check your company’s certification status to ensure that it is marked “current” on the Department of Commerce website: https://safeharbor.export.gov/list.aspx;
  • Review any privacy policies and online statements referencing the Safe Harbor programs to ensure that they properly reflect the certification status and the company’s actual privacy and data security practices;
  • Institute a systematic reminder six months before the recertification date that triggers compliance review activities, with a due date for completion before the recertification deadline, and require that the actual online recertification be completed before the annual deadline;
  • Remove all references to the Safe Harbor programs from publicly available privacy policies and statements if the company’s certification status is unclear; and
  • Review substantive compliance with the Safe Harbor programs and institute corrective action and controls to ensure that compliance is maintained.

 





The Connected Car and Keeping YOU in the Driver’s Seat

Remember KITT? KITT (the Knight Industries Two Thousand) was the self-directed, self-driving, supercomputer hero of the popular 1980s television show Knight Rider. Knight Rider was a science fiction fantasy profiling the “car of the future.” The self-directed car is science fiction no more. The future is now and, in fact, we’ve seen a lot of press this year about self-driving or driverless cars.

Driverless cars, equipped with a wide variety of connected systems including cameras, radar, sonar and LiDAR (light detection and ranging), are expected on the road within the next few years. They can sense road conditions, identify hazards and negotiate traffic, all from a remote command center. Just as with most connected devices in the age of the Internet of Things (IoT), these ultra-connected devices claim to improve efficiency and performance, and enhance safety.

Though not quite driverless yet, connected vehicles are already on the market and on the road. Like many IoT “things”, ultra-connected vehicle systems may be vulnerable to hacker attacks.

Christopher Valasek and Charlie Miller, two computer security industry leaders, have presented on this topic at various events, including the 2014 Black Hat USA security conference. They analyzed the information security vulnerabilities of various car makes and models, rating the vehicles on three specific criteria: (1) the size of their wireless “attack surface” (i.e., how many data-incorporating features, such as Bluetooth, Wi-Fi, keyless entry systems and automated tire-monitoring systems, a vehicle has); (2) access to the vehicle’s network through those data points; and (3) the vehicle’s “cyberphysical” features (i.e., connected features such as parking assist, automated braking and other technological driving aids). This last category of features, combined with access through the data points outlined in items (1) and (2), presented a composite risk profile of each vehicle make’s hackability. Their conclusions were startling: radios, brakes and steering systems were all found to be accessible.

Miller and Valasek claim that their intent was to encourage car manufacturers to consider security in vehicle system connectivity and cyberphysical attributes. They approached vehicle manufacturers and shared their report with the Department of Transportation and the Society of Automotive Engineers. Some manufacturers promised to investigate their vehicle systems and correct the deficiencies. Some seemingly ignored the report altogether. The researchers did, however, catch the attention of Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT). On July 21, 2015, Senators Markey and Blumenthal introduced legislation that would direct the National Highway Traffic Safety Administration (NHTSA) and the Federal Trade Commission (FTC) to establish federal standards to secure vehicles and protect drivers’ privacy. The Security and Privacy in Your Car Act, aptly coined “the SPY Car Act”, would also require manufacturers to establish a ‘cyber dashboard’ that rates vehicle security, informing consumers as to the security performance of their vehicle.

As proposed, the SPY Car Act would require that all motor vehicles manufactured in the U.S. be “equipped with reasonable measures to protect against hacking attacks.” All “entry points” are to be protected through “reasonable” measures against hacking. Internal networks are to [...]

Continue Reading





FTC Approves Final Order with AmeriFreight: Websites Must Disclose Endorsements

Earlier this year, AmeriFreight, a Georgia-based auto shipment broker, settled with the Federal Trade Commission (FTC) over charges that the company posted customer reviews on its website while failing to disclose that it had given cash discounts to customers in exchange for the reviews.  According to the FTC complaint, AmeriFreight touted on its website homepage that it had “more highly ranked ratings and reviews than any other company in the automotive transportation business,” yet a majority of the online reviews on AmeriFreight’s website failed to disclose that the reviewers were compensated $50 for posting reviews and were also eligible to receive an additional $100 if selected for the “Best Monthly Review Award.”  The FTC charged that AmeriFreight, by failing to disclose the incentives it had given to reviewers, had misrepresented its customer reviews as those of unbiased consumers.  The FTC’s position can be summed up best by the following quotes from its Director of the Bureau of Consumer Protection: “Companies must make it clear when they have paid their customers to write online reviews” and if companies “fail to do that – as AmeriFreight did – then they’re deceiving consumers, plain and simple.”

The FTC’s Endorsement Guidelines

Guidelines issued in 2009 by the Federal Trade Commission (the “FTC Endorsement Guidelines”) make clear that an advertiser must fully disclose any connection between the advertiser and an endorser of the advertiser’s product or service that might materially affect the weight or credibility of the endorsement, such as the fact that the endorser received compensation or some other benefit or incentive from the advertiser in exchange for providing a favorable review.  An advertiser’s failure to disclose an endorser’s material connection with the advertiser constitutes an unfair and deceptive trade practice as well as false advertising, both in violation of Section 5(a) of the Federal Trade Commission Act.  The requirement of disclosure of material connections applies not only to celebrity, expert or professional endorsers, but also to ordinary consumer-endorsers.  Many companies today use consumer endorsements in promoting their products or services, including so-called “word-of-mouth advertising,” whereby satisfied customers tell other people how much they like a product or service.  A common example of this form of advertising is publishing consumer-submitted reviews on the internet.  Good word of mouth generated by favorable customer reviews can make a big difference in a company’s online ad campaign.  However, companies that are looking to incentivize customers to submit good reviews must take care not to run afoul of the FTC Endorsement Guidelines.  In particular, where a company offers money or other benefits to customers in exchange for good reviews, it must disclose that fact when publishing the reviews.

Key Takeaways for Businesses

The FTC’s complaint against AmeriFreight is the first time the agency has charged a company with misrepresenting online reviews by failing to disclose that it gave cash discounts to customers to post the reviews.  This has significant implications for businesses that use customer [...]

Continue Reading





Italian Data Privacy Authority’s Public Consultation on the Internet of Things

On April 28, 2015, the Italian Data Privacy Authority (the Authority) launched a public consultation on the Internet of Things aimed at collecting contributions from stakeholders and assessing its potential impact on consumers’ privacy. This public consultation in Italy follows the opinion of the EU Article 29 Working Party of September 2014 and a more recent report of the U.S. Federal Trade Commission of January 2015, which had identified a number of issues and challenges in relation to the Internet of Things. Interested parties can submit their comments to the Authority by e-mail within 180 days of the publication in the Official Journal of the decision to launch the consultation (expected in the next few days).

This is an outstanding opportunity for stakeholders to contribute on issues such as user profiling, data anonymization, the applicability of data protection by design principles and the use of certification and authentication tools, in order to identify a set of best practices to ensure that compliance with data privacy rules does not limit the development of Internet of Things technologies. The consultation may result in the adoption of specific guidance by the Authority on the application of data privacy rules to businesses active in the Internet of Things market, which currently face significant compliance issues.





National Roadmap for Health Data Sharing: FTC Advocates Preservation of Privacy and Competition

On April 1, 2015, the Office of the National Coordinator for Health Information Technology (ONC), which assists with the coordination of federal policy on data sharing objectives and standards, issued its Shared Nationwide Interoperability Roadmap and requested comments.  The Roadmap seeks to lay out a framework for developing and implementing interoperable health information systems that will allow for the freer flow of health-related data by and among providers and patients.  The use of technology to capture and understand health-related information, and the strategic sharing of that information among health industry stakeholders, is widely recognized as critical to support patient engagement, improve quality outcomes and lower health care costs.

On April 3, 2015, the Federal Trade Commission issued coordinated comments from its Office of Policy Planning, Bureau of Competition, Bureau of Consumer Protection and Bureau of Economics.  The FTC has a broad, dual mission to protect consumers and promote competition, in part, by preventing business practices that are anticompetitive or deceptive or unfair to consumers.  This includes business practices that relate to consumer privacy and data security.  Notably, the FTC’s comments on the Roadmap draw from both its pro-competitive experience and its privacy and security protection perspective, and therefore offer insights into the FTC’s assessment of interoperability from a variety of consumer protection vantage points.

The FTC agreed that ONC’s Roadmap has the potential to benefit both patients and providers by “facilitating innovation and fostering competition in health IT and health care services markets” – lowering health care costs, improving population health management and empowering consumers through easier access to their personal information.  The concepts advanced in the Roadmap, however, if not carefully implemented, can also have a negative effect on competition for health care technology services.  The FTC comments are intended to guide ONC’s implementation with respect to: (1) creating a business and regulatory environment that encourages interoperability, (2) shared governance mechanisms that enable interoperability, and (3) advancing technical standards.

Taking each of these aspects in turn, creating a business and regulatory environment that encourages interoperability is important because, if left unattended, the marketplace may be resistant to interoperability.  For example, health care providers may resist interoperability because it would make switching providers easier, and IT vendors may see interoperability as a threat to customer allegiance.  The FTC suggests that the federal government, as a major payer, work to align economic incentives to create greater demand among providers for interoperability.

With respect to shared governance mechanisms, the FTC notes that coordinated efforts among competitors may have the effect of suppressing competition.  The FTC identifies several examples of anticompetitive conduct in standard setting efforts for ONC’s consideration as it considers how to implement the Roadmap.

Finally, in advancing core technical standards, the FTC advised ONC to consider how standardization could affect competition by (1) limiting competition between technologies, (2) facilitating customer lock-in, (3) reducing competition between standards, and (4) impacting the method for selecting standards.

As part of its mission to protect consumers, the FTC focuses its privacy and security [...]

Continue Reading





FTC Merger Review Likely to Incorporate Analysis of Privacy Issues

The Federal Trade Commission (FTC or the Commission), along with the U.S. Department of Justice, can challenge mergers it believes will result in a substantial lessening of competition – for example through higher prices, lower quality or reduced rates of innovation.  Although the analysis of whether a transaction may be anticompetitive typically focuses on price, privacy is increasingly regarded as a kind of non-price competition, like quality or innovation.  During a recent symposium on the parameters and enforcement reach of Section 5 of the FTC Act, Deborah Feinstein, the director of the FTC’s Bureau of Competition, noted that privacy concerns are becoming more important in the agency’s merger reviews.  Specifically she stated, “Privacy could be a form of non-price competition important to customers that could be actionable if two kinds of companies competed on privacy commitments on technologies they came up with.”

At this same symposium, Jessica Rich, director of the FTC’s Bureau of Consumer Protection, remarked on the agency’s increasing expectations that companies protect the consumer data they collect and be more transparent about what they collect, how they store and protect it, and about third parties with whom they share the data.

The FTC’s Bureaus of Competition and Consumer Protection fulfill the agency’s dual mission to promote competition and protect consumers, in part, through the enforcement of Section 5 of the FTC Act.  With two areas of expertise and a supporting Bureau of Economics under one roof, the Commission is uniquely positioned to analyze whether a potential merger may substantially lessen privacy-related competition.

The concept that privacy is a form of non-price competition is not new to the FTC.  In its 2007 statement upon closing its investigation into the merger of Google, Inc. and DoubleClick Inc., the Commission recognized that mergers can “adversely affect non-price attributes of competition, such as consumer privacy.”  Commissioner Pamela Jones Harbour’s dissent in the Google/DoubleClick matter outlined a number of forward-looking competition and privacy-related considerations for analyzing mergers of data-rich companies.  The FTC ultimately concluded that the evidence in that case “did not support the theories of potential competitive harm” and thus declined to challenge the deal.  The matter laid the groundwork, however, for the agency’s future consideration of these issues.

While the FTC has yet to challenge a transaction on the basis that privacy competition would be substantially lessened, parties can expect staff from both the Bureau of Competition and the Bureau of Consumer Protection to be working closely together to analyze a proposed transaction’s impact on privacy.  The FTC’s review of mergers between entities with large databases of consumer information may focus on: (1) whether the transaction will result in decreased privacy protections, i.e., lower quality of privacy; and (2) whether the combined parties achieve market power as a result of combining their consumer data.

This concept is not unique to the United States.  The European Commission’s 2008 decision in TomTom/Tele Atlas examined whether there would be a decrease [...]

Continue Reading





Consumer Health Information Update from Both Sides of the Atlantic

As we reported in May 2014, the Federal Trade Commission (FTC) convened stakeholders to explore whether health-related information collected from and about consumers — known as consumer-generated health information (CHI) — through use of the internet and increasingly-popular lifestyle and fitness mobile apps is more sensitive and in need of more privacy-sensitive treatment than other consumer-generated data.

One of the key questions raised during the FTC’s CHI seminar is: “What is consumer health information?”  Information gathered during traditional medical encounters is clearly health-related.  Information gathered from mobile apps designed as sophisticated diagnostic tools also is clearly health-related — and may even be “Protected Health Information,” as defined and regulated by the Health Insurance Portability and Accountability Act (HIPAA), depending on the interplay of the app and the health care provider or payor community.  But other information, such as diet and exercise, may be viewed by some as wellness or consumer preference data (for example, the types of foods purchased).  Other information (e.g., shopping habits) may not look like health information but, when aggregated with other information generated by and collected from consumers, may become health-related information.  Information, therefore, may be “health information,” and may be more sensitive as such, depending on (i) the individual from whom it is collected; (ii) the context in which it is initially collected; (iii) the other information with which it is combined; (iv) the purpose for which the information was initially collected; and (v) the downstream uses of the information.

Notably, the FTC is not the only regulatory body struggling with how to define CHI.  On February 5, 2015, the European Union’s Article 29 Working Party (an EU representative body tasked with advising EU Member States on data protection) published a letter in response to a request from the European Commission to clarify the definitional scope of “data concerning health in relation to lifestyle and wellbeing apps.”

The EU’s efforts to define CHI underscore the importance of understanding CHI.  The EU and the U.S. data privacy and security regimes differ fundamentally in that the EU regime broadly protects personally identifiable information.  The U.S. does not currently provide universal protections for personally identifiable information.  The U.S. approach varies by jurisdiction and type of information and does not uniformly regulate the mobile app industry or the CHI captured by such apps.  These different regulatory regimes make the EU’s struggle to define the precise scope of “lifestyle and wellbeing” data (CHI) and to develop best practices all the more striking because, even absent such a definition, the EU privacy regime would offer protections.

The Article 29 Working Party letter acknowledges the European Commission’s work to date, including the European Commission’s “Green Paper on Mobile Health,” which emphasized the need for strong privacy and security protections, transparency – particularly with respect to how CHI interoperates with big data – and the need for specific legislation on CHI-related apps or regulatory guidance that will promote “the safety and performance of lifestyle and wellbeing apps.”  But, [...]

Continue Reading





The FTC Did Some Kid-ding Around in 2014

2014 was a busy year for the Federal Trade Commission (FTC) with the Children’s Online Privacy Protection Act (COPPA).  The FTC announced something new under COPPA nearly every month, including:

  • In January, the FTC issued an updated version of the free consumer guide, “Net Cetera:  Chatting with Kids About Being Online.”  Updates to the guide include advice on mobile apps, using public WiFi securely, and how to recognize text message spam, as well as details about recent changes to COPPA.
  • In February, the FTC approved the kidSAFE Safe Harbor Program.  The kidSAFE certification and seal of approval program helps child-friendly digital services comply with COPPA.  To qualify for a kidSAFE seal, digital operators must build safety protections and controls into any interactive community features; post rules and educational information about online safety; have procedures for handling safety issues and complaints; give parents basic safety controls over their child’s activities; and ensure all content, advertising and marketing is age-appropriate.
  • In March, the FTC filed an amicus brief in the 9th U.S. Circuit Court of Appeals, arguing that the ruling of the U.S. District Court for the Northern District of California in Batman v. Facebook, which held that COPPA preempts state law protections for the online activities of teenagers outside of COPPA’s coverage, is “patently wrong.”
  • In April, the FTC updated its “Complying with COPPA:  Frequently Asked Questions” (aka the COPPA FAQs) to address how COPPA applies in the school setting.  In FAQ M.2, the FTC discussed whether a school can provide the COPPA-required consent on behalf of parents, stating that “Where a school has contracted with an operator to collect personal information from students for the use and benefit of the school, and for no other commercial purpose, the operator is not required to obtain consent directly from parents, and can presume that the school’s authorization for the collection of students’ personal information is based upon the school having obtained the parents’ consent.”  But, the FTC also recommends as “best practice” that schools provide parents with information about the operators to which it has consented on behalf of the parents.  The FTC requires that the school investigate the collection, use, sharing, retention, security and disposal practices with respect to personal information collected from its students.
  • In July, COPPA FAQ H.5, FAQ H.10, and FAQ H.16 about parental consent verification also were updated.  In FAQ H.5, the FTC indicates that “collecting a 16-digit credit or debit card number alone” is not sufficient as a parental consent mechanism but that, in some circumstances, “collection of the card number – in conjunction with implementing other safeguards – would suffice.”  Revised FAQ H.10 indicates that a developer of a child-directed app may use a third party for parental verification “as long as [developers] ensure that COPPA requirements are being met,” including the requirement to “provide parents with a direct notice outlining [the developer’s] information collection practices before the parent provides his or her consent.” In revised FAQ H.16, the FTC [...]

Continue Reading



