
FTC Releases Extensive Report on the “Internet of Things”

On January 27, 2015, U.S. Federal Trade Commission (FTC) staff released an extensive report on the “Internet of Things” (IoT). The report, based in part on input the FTC received at its November 2013 workshop on the subject, discusses the benefits and risks of IoT products to consumers and offers best practices for IoT manufacturers to integrate the principles of security, data minimization, and notice and choice into the development of IoT devices. While the FTC staff’s report does not call for IoT-specific legislation at this time, given the rapidly evolving nature of the technology, it reiterates the FTC’s earlier recommendation that Congress enact strong federal data security and breach notification legislation.

The report also describes the tools the FTC will use to ensure that IoT manufacturers consider privacy and security issues as they develop new devices. These tools include:

  • Enforcement actions under such laws as the FTC Act, the Fair Credit Reporting Act (FCRA) and the Children’s Online Privacy Protection Act (COPPA), as applicable;
  • Developing consumer and business education materials in the IoT area;
  • Participation in multi-stakeholder groups considering guidelines related to IoT; and
  • Advocacy to other agencies, state legislatures and courts to promote protections in this area.

In furtherance of its initiative to provide educational materials on IoT for businesses, the FTC also announced the publication of “Careful Connections: Building Security in the Internet of Things.” This site provides a wealth of advice and resources for businesses on how to implement “security by design” and consider security at every stage of the product development lifecycle for internet-connected devices.

This week’s report is one more sign supporting our prediction of increased FTC activity in the IoT space in 2015.





In with the New: Expect FTC Activity on IoT in 2015

The “Internet of Things” (IoT) continues to grow.  (IoT refers to the ability of everyday objects to connect to the Internet and one another.)  It is estimated that there will be 4.9 billion connected appliances, devices and other “things” in use worldwide by the end of 2015, an increase of 30 percent from 2014.  The global market for IoT products is expected to reach $7.1 trillion by 2020.

Proponents of IoT believe that the data generated and shared by connected objects can provide tremendous benefits for individuals, businesses and society as a whole.  For example, IoT devices could be used to alert a person of an impending heart attack, improve a business’ manufacturing processes and reduce vehicle traffic congestion.  While IoT can provide many benefits, it also poses privacy and security challenges.  Internet-connected devices, especially when used in an individual’s home or on his or her body, can generate voluminous amounts of highly personal and sensitive data about that individual, including information about physical activity, existing health conditions, energy consumption and entertainment choices.  Many users of these devices are unclear about how this data is being used and shared with others. Moreover, the sheer amount and sensitivity of the data collected and transmitted by many IoT products make them an appealing target for hackers.

The Federal Trade Commission (FTC) did not file an enforcement action against a manufacturer of IoT products for inadequate data privacy and security practices in 2014, as it had in 2013. Nonetheless, the privacy and security challenges associated with the massive collection of consumer data by IoT products remain on the FTC’s radar.  Commissioner Julie Brill has written extensively about the need to weave privacy principles into IoT.  While IoT products ranging from automated door locks to internet-connected pet trackers dominated this year’s International Consumer Electronics Show (CES), Chairwoman Edith Ramirez’s keynote address at CES outlined several concerns about IoT, including ubiquitous data collection, the ability of IoT devices to capture sensitive personal information about consumers, unexpected uses of consumer data and data security concerns.

Since IoT is on the FTC’s radar, I predict that the FTC will carefully scrutinize manufacturers of IoT products during 2015 and perhaps bring another action against a maker of IoT products for inadequate data privacy or security practices.





In with the New: 2015 Privacy, Advertising and Digital Media Predictions – Part I

What privacy, advertising and digital media trends will make headlines in 2015?  Digital Health for one,  Big Data for another.

Digital Health

The 2015 International Consumer Electronics Show (CES) started yesterday.  Sessions like “Sensibles: The Smarter Side of Wearables” and “DIY Health: Consumer Accessible Innovation” suggest that the consumer health issues explored by the Federal Trade Commission (FTC) last spring (see our blog post here) are increasingly relevant.  Most notably, as more health-related information becomes digital, digital health businesses will need to revisit long-standing privacy, intellectual property protection, and notice and consent practices that may not be well suited to the more sensitive category of consumer-generated health information (CHI) (i.e., health-related information that consumers submit to or through mobile apps and devices).  In many cases, the law is underdeveloped and businesses must develop and implement their own best practices to demonstrate good faith as stewards of CHI.

We predict that CHI and the issues raised by its collection, use, disclosure and storage will stay on the FTC’s radar during 2015.  Perhaps the FTC will offer some insight about its position on CHI through guidance or regulatory activity related to a digital health business.

With mobile devices proliferating, the volume, versatility and variety of consumer-generated data, including CHI, are also proliferating.  CHI typically stands outside of HIPAA’s regulatory silo.  HIPAA regulates health plans, health care clearinghouses, health care providers who engage in standardized transactions with health plans, and the business associates that assist plans, clearinghouses and providers and need protected health information to provide that assistance.  Mobile medical services and environments, however, typically fall outside of this framework: most mobile apps, for example, are used directly by consumers, without the direction or control of plans and providers.  HIPAA may, however, have more reach into the growing business-to-business mobile app sector.

But, in the CHI arena, the sources of privacy and security regulation are murky.  Among likely hot topics in 2015 are:

  • When is consumer-generated information also consumer-generated health information?
  • Can data ever be “de-identified” or made anonymous in light of the so-called mosaic (or pointillist) effect?
  • What role can the “pay with data” model play in consumer protection?
  • Is all CHI deserving of the same level of protection?
  • What sources of oversight exist and are they sufficient?

The news is rife with references to data “privacy” and data “security,” but the sensitivity associated with health information requires thinking about data “stewardship” – a broader concept that encompasses not only privacy and security but also data asset management and data governance.  Data stewardship treats data not only as an asset, but also as an opportunity to earn public trust and confidence while preserving innovation.

We predict that how to be good data stewards will be a critical issue for digital health businesses in 2015 and that forward-looking and transparent efforts at self-policing will be key to not only avoiding regulatory scrutiny but also fostering consumer trust.

Big Data

Big Data was big news [...]






You Are Invited: Join FTC Chairwoman Ramirez on November 12 at Our Menlo Park Office for a Conversation on Privacy and Technology

Will you be in the Bay Area on November 12?  You are invited to join Federal Trade Commission (FTC) Chairwoman Edith Ramirez at McDermott’s office in Menlo Park, California, for a conversation on privacy and technology.  The FTC is celebrating its 100th anniversary, and this will be the first time Chairwoman Ramirez has visited the Bay Area since her appointment.  Come ask the tough questions, join the lively conversation and mark this important visit as Chairwoman Ramirez talks all things privacy and technology with some of the top tech teams in the country.  Please RSVP, as space is limited.  A complimentary networking reception with Chairwoman Ramirez will immediately follow the program.

To register, please click here.





Digital Marketing Minute: A Bad Review for Yelp

The Federal Trade Commission (FTC) announced last week that Yelp – the online service through which consumers can read and write reviews about local businesses – has agreed to pay $450,000 to settle the FTC’s charges that Yelp knowingly, and without verifiable parental consent (VPC), collected personal information from children under the age of 13 through its mobile app, in violation of the federal Children’s Online Privacy Protection Act (COPPA).

COPPA was enacted in 1998. The FTC, which is responsible for enforcing COPPA, implemented regulations in April 2000 that are known as the COPPA Rule. The FTC issued an amended COPPA Rule in December 2012, which became effective July 1, 2013. 

In general, COPPA and the COPPA Rule prohibit operators of websites, mobile applications or other digital services (collectively, “digital services”) from knowingly collecting personal information from children under age 13 unless and until the digital service operator has VPC. 

Under the amended COPPA Rule, COPPA has a broader scope than digital service operators may realize.  COPPA applies not only to digital services that are directed to children, but also to any general-audience digital service whose operator has “actual knowledge” that the service is collecting personal information from children under age 13 without VPC.

COPPA does not require operators of general-audience digital services to ask users for age or date of birth information.  Under the actual knowledge test, however, if a digital service collects information establishing that a user is under 13, the service must be COPPA compliant, which means (among other requirements) obtaining VPC before collecting personal information from that under-age-13 user.

The FTC concluded that Yelp had “actual knowledge” that it was collecting personal information from children under age 13 because the registration page on Yelp’s app asked users to enter their date of birth but did not block access to the app for users who were too young (i.e., under age 13).   

Key Takeaway: If your general-audience digital service asks a user for his or her birth date, make sure that a user who is under age 13 is blocked from using the digital service.  Also, to help prevent users who are too young from circumventing the block, consider one or all of the following techniques:

  1. Request the birth date in a neutral manner, i.e., without a prompt that reveals the age of eligibility, such as “You must be age 13 or older to register.”
  2. Present a neutral on-screen error message when a user is under age 13, such as “Sorry, you’re not eligible,” rather than “Sorry, you are under age 13.”
  3. Deploy a cookie or other functionality to prevent an under-age user whose access was blocked from using the back button (or similar technique) to re-enter an old-enough birth date.      
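The three techniques above can be sketched in a simple age-gate routine.  This is an illustrative sketch only, not language from the COPPA Rule; the function and flag names (age_on, check_registration, age_gate_failed) are hypothetical, and the session dict stands in for a real cookie or server-side session store:

```python
from datetime import date

MIN_AGE = 13
# Technique 2: a neutral message that does not reveal the age cutoff.
NEUTRAL_ERROR = "Sorry, you're not eligible."

def age_on(birth_date: date, today: date) -> int:
    """Whole years elapsed between birth_date and today."""
    years = today.year - birth_date.year
    # Subtract one year if this year's birthday has not yet occurred.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def check_registration(birth_date: date, today: date, session: dict) -> tuple[bool, str]:
    """Neutral age gate: blocks under-13 users and persists the block.

    The registration form itself (technique 1) would simply ask for a
    birth date with no mention of an age requirement; this function then
    applies techniques 2 and 3.
    """
    # Technique 3: a prior failure persists, so re-entering an older
    # birth date via the back button is still refused.
    if session.get("age_gate_failed"):
        return False, NEUTRAL_ERROR
    if age_on(birth_date, today) < MIN_AGE:
        session["age_gate_failed"] = True  # analogous to setting a blocking cookie
        return False, NEUTRAL_ERROR
    return True, "ok"
```

Because the failure flag persists for the session, a blocked user who circumvents the form and resubmits an old-enough birth date is still refused, and the error message never discloses the 13-year threshold.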




Don’t Forget: You Can Watch Monday’s FTC Big Data Workshop LIVE Online

On Monday, September 15, 2014, the Federal Trade Commission (FTC) will host a workshop in Washington, D.C. – free and open to the public – exploring the use of big data and its impact on American consumers.  Don’t have your plane ticket reserved?  Not a problem!  The workshop starts at 9:00 am EDT, and those who are unable to attend in person can watch via webcast.

The workshop, entitled “Big Data: A Tool for Inclusion or Exclusion?”, will include presentations and panel discussions featuring academics, business and industry representatives, and consumer advocates. It will address the following issues:

  • How are organizations using big data to categorize customers?
  • What benefits do consumers gain from these practices? Do these practices raise consumer protection concerns?
  • What benefits do organizations gain from these practices? What are the social and economic impacts, both positive and negative, from the use of big data to categorize consumers?
  • How do existing laws apply to such practices? Are there gaps in the legal framework?
  • Are companies appropriately assessing the impact of big data practices on low income and underserved populations? Should additional measures be considered?




Thinking Outside the HIPAA Box

On Wednesday, May 7, the Federal Trade Commission (FTC) held the third of its Spring Seminars on emerging consumer privacy issues.  This session focused on consumer-generated health information (CHI).  CHI is data generated by consumers’ use of the Internet and mobile apps that relates to an individual’s health.  The “H” in CHI defies easy definition but likely includes, at minimum, data generated from internet or mobile app activity related to seeking information about specific conditions; disease/medical-condition management tools; support and shared experiences through online communities; or tools for tracking diet, exercise or other lifestyle data.

In the United States, many consumers (mistakenly) believe that all of their health-related information is protected, at the federal level, by the Health Insurance Portability and Accountability Act (HIPAA).  HIPAA does offer broad privacy protections to health-related information, but only to identifiable health information received by or on behalf of a “covered entity” or a third party working for a covered entity (such information is referred to as “Protected Health Information” or “PHI”).  Covered entities are, essentially, health plans and health care providers who engage in reimbursement transactions with health plans.  When HIPAA was enacted in 1996, PHI was the primary type of health information, but CHI, which is generally not also PHI, has changed that.  As FTC Commissioner Julie Brill noted in her opening remarks, CHI is “health data stored outside the HIPAA silo.”

Without the limitations imposed by HIPAA, online service providers and mobile apps generally (except where state law requires otherwise) can treat CHI like the other digital non-health data that they collect from consumers.  As a result, the FTC expressed concern that CHI may be aggregated, shared and linked in ways that consumers did not foresee and may not understand.

The panelists at the FTC discussed the difficulty of defining CHI, and whether and how it differs from other kinds of data collected from consumers.  One panelist noted that whether a consumer considers his or her CHI sensitive is highly individualized.  For example, are the heart rate and exercise data collected by mobile fitness apps sensitive?  Would the answer to this question change if these data points were linked with other data points that began to suggest other health or wellness indicators, such as weight?  Would the answer change if that linked data was used to predict socioeconomic status – which is often linked to certain health, wellness and lifestyle indicators – or used to inform risk rating or direct-to-consumer targeted advertising?

Panelists also discussed the larger and more general question of how to define privacy in a digital economy and how to balance privacy with the recognized benefits of data aggregation and data sharing.  These questions are compounded by the difficulty of describing data as being anonymized or de-identified – foundational principles in most privacy frameworks – because the quality of being “identifiable” in the digital economy may depend on the proximity of a piece of data to other pieces of data.

Though the “how” and “what” of additional [...]






FTC Enforces Facebook Policies to Stop Jerk

The Federal Trade Commission (FTC) recently accused the operator of www.Jerk.com (Jerk) of misrepresenting to users the source of the personal content that Jerk used for its purported social networking website and the benefits derived from a user’s purchase of a Jerk membership.   According to the FTC, Jerk improperly accessed personal information about consumers from Facebook, used the information to create millions of unique profiles identifying subjects as either “Jerk” or “Not a Jerk” and falsely represented that a user could dispute the Jerk/Not a Jerk label and alter the information posted on the website by paying a $30 subscription fee.  The interesting issue in this case is not the name of the defendant or its unsavory business model; rather, what’s interesting is the FTC’s tacit enforcement of Facebook’s privacy policies governing the personal information of Facebook’s own users.

Misrepresenting the Source of Personal Information

Although Jerk represented that its profile information was created by its users and reflected those users’ views of the profiled individuals, Jerk in fact obtained the profile information from Facebook.  In its complaint, the FTC alleges that Jerk accessed Facebook’s data through Facebook’s application programming interfaces (API), which are tools developers can use to interact with Facebook, and downloaded the names and photographs of millions of Facebook users without consent. The FTC used Facebook’s various policies as support for its allegation that Jerk improperly obtained the personal information of Facebook’s users and, in turn, misrepresented the source of the information.  The FTC noted that developers accessing the Facebook platform must agree to Facebook’s policies, which include (1) obtaining users’ explicit consent to share certain Facebook data; (2) deleting information obtained through Facebook once Facebook disables the developers’ Facebook access; (3) providing an easily accessible mechanism for consumers to request the deletion of their Facebook data; and (4) deleting information obtained from Facebook upon a consumer’s request.  Jerk used the data it collected from Facebook not to interact with Facebook but to create unique Jerk profiles for its own commercial advantage.  Facebook – not Jerk’s users, as Jerk had represented – was the actual source of the data.

Misrepresenting the Benefit of the Bargain

According to the FTC, Jerk represented that purchase of a $30 subscription would enable users to obtain “premium features,” including the ability to dispute information posted on Jerk and to alter or delete their Jerk profile.  Users who paid the subscription often received none of the promised benefits.  The FTC noted that contacting Jerk with complaints was difficult for consumers:  Jerk charged $25 for users to email the customer service department.

A hearing is scheduled for January 2015. Notably, the FTC’s proposed Order, among other prohibitions, enjoins Jerk from using in any way the personal information that Jerk obtained prior to the FTC’s action – meaning the personal information that was obtained illegally from Facebook.




