Mobile Apps

The FTC Did Some Kid-ding Around in 2014

2014 was a busy year for the Federal Trade Commission (FTC) with the Children’s Online Privacy Protection Act (COPPA).  The FTC announced something new under COPPA nearly every month, including:

  • In January, the FTC issued an updated version of the free consumer guide, “Net Cetera:  Chatting with Kids About Being Online.”  Updates to the guide include advice on mobile apps, using public WiFi securely, and how to recognize text message spam, as well as details about recent changes to COPPA.
  • In February, the FTC approved the kidSAFE Safe Harbor Program.  The kidSAFE certification and seal of approval program helps children-friendly digital services comply with COPPA.  To qualify for a kidSAFE seal, digital operators must build safety protections and controls into any interactive community features; post rules and educational information about online safety; have procedures for handling safety issues and complaints; give parents basic safety controls over their child’s activities; and ensure all content, advertising and marketing is age-appropriate.
  • In March, the FTC filed an amicus brief in the U.S. Court of Appeals for the Ninth Circuit, arguing that the U.S. District Court for the Northern District of California's ruling in Batman v. Facebook, which held that COPPA preempts state law protections for the online activities of teenagers, who are outside of COPPA's coverage, is "patently wrong."
  • In April, the FTC updated its "Complying with COPPA:  Frequently Asked Questions" (aka the COPPA FAQs) to address how COPPA applies in the school setting.  In FAQ M.2, the FTC discussed whether a school can provide the COPPA-required consent on behalf of parents, stating that "Where a school has contracted with an operator to collect personal information from students for the use and benefit of the school, and for no other commercial purpose, the operator is not required to obtain consent directly from parents, and can presume that the school's authorization for the collection of students' personal information is based upon the school having obtained the parents' consent."  The FTC also recommends as a "best practice" that schools provide parents with information about the operators to which the school has consented on their behalf, and expects the school to investigate those operators' collection, use, sharing, retention, security and disposal practices with respect to personal information collected from its students.
  • In July, the FTC also updated COPPA FAQs H.5, H.10 and H.16, which address verification of parental consent.  In FAQ H.5, the FTC indicates that "collecting a 16-digit credit or debit card number alone" is not sufficient as a parental consent mechanism, but that, in some circumstances, "collection of the card number – in conjunction with implementing other safeguards – would suffice."  Revised FAQ H.10 indicates that a developer of a child-directed app may use a third party for parental verification "as long as [developers] ensure that COPPA requirements are being met," including the requirement to "provide parents with a direct notice outlining [the developer's] information collection practices before the parent provides his or her consent." In revised FAQ H.16, the FTC [...]

    Continue Reading




Any Progress? The Draft Data Protection Regulation Celebrates its Third Anniversary

On the third anniversary of the EU Commission’s proposed new data protection regime, the UK ICO has published its thoughts on where the new regime stands. The message is mixed: progress in some areas but nothing definitive, and no real clarity as to when the new regime may come into force.

The legislative process involves the agreement of the European Commission, the European Parliament and the Council of the European Union (representing the governments of the member states). So far, the European Parliament has agreed its amendments to the Commission's proposal, and we are still waiting for the Council to agree its amendments before all three come together and try to find a mutually agreeable position.

The Council is guided by the mantra "nothing is agreed until everything is agreed." Although the Council has reached "partial general agreement" on international transfers, on risk-based obligations for controllers and processors, and on the provisions relating to specific data processing situations such as research, and has agreed an approach to the one-stop-shop principle (allowing those operating in multiple member states to appoint and deal with a single authority), this progress means nothing until there is final agreement on everything. At this stage, all informal agreements remain open to renegotiation.

Latvia holds the presidency of the Council until June 2015. The Latvians have noted that data protection reform remains a key priority, but progress has been slow and time may be against them. Where Latvia fails, Luxembourg, which takes up the presidency in June, will hopefully succeed.

The ICO is urging all stakeholders to push on with the reform, although it sees the proposed timetable of completing the trilogue process by the end of 2015 as optimistic. A more realistic timetable may be final agreement by mid-2016, with the new regime up and running in 2018.





In with the New: 2015 Privacy, Advertising and Digital Media Predictions – Part I

What privacy, advertising and digital media trends will make headlines in 2015?  Digital Health, for one; Big Data, for another.

Digital Health

The 2015 International Consumer Electronics Show (CES) started yesterday.  Sessions like “Sensibles: The Smarter Side of Wearables” and “DIY Health: Consumer Accessible Innovation” suggest that the consumer health issues explored by the Federal Trade Commission (FTC) last Spring (see our blog post here) are increasingly relevant.  Most notably, as more health-related information becomes digital, digital health businesses will need to revisit long-standing privacy, intellectual property protection, notice and consent practices that may not be well-suited to the more sensitive category of consumer-generated health information (CHI) (i.e., health-related information that consumers submit to or through mobile apps and devices).  In many cases, the law is underdeveloped and businesses must develop and implement their own best practices to demonstrate good faith as stewards of CHI.

We predict that CHI and the issues raised by its collection, use, disclosure and storage will stay on the FTC’s radar during 2015.  Perhaps the FTC will offer some insight about its position on CHI through guidance or regulatory activity related to a digital health business.

With mobile devices proliferating, the volume, versatility and variety of consumer-generated data, including CHI, also are proliferating.  CHI typically stands outside of HIPAA's regulatory silo.  HIPAA regulates health plans, health care clearinghouses, health care providers that engage in standardized transactions with health plans, and the business associates that assist health plans, clearinghouses and providers and need protected health information to provide that assistance.   Mobile medical services and environments, however, typically fall outside of this framework: most mobile apps, for example, are used directly by consumers, not at the direction of or under the control of plans and providers.  HIPAA may, however, have more reach into the growing business-to-business mobile app sector.

But, in the CHI arena, the sources of privacy and security regulation are murky.  Among likely hot topics in 2015 are:

  • When is consumer-generated information also consumer-generated health information?
  • Can data ever be “de-identified” or made anonymous in light of the so-called mosaic (or pointillist) effect?
  • What role can the “pay with data” model play in consumer protection?
  • Is all CHI deserving of the same level of protection?
  • What sources of oversight exist and are they sufficient?
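The de-identification question above is harder than it looks because of the mosaic effect. As a rough illustration, with entirely fabricated data and a hypothetical `link` helper of our own invention, two datasets that each look anonymous can re-identify individuals once joined on shared quasi-identifiers such as ZIP code and birth year:

```python
# Illustrative sketch of the "mosaic" (pointillist) effect: a "de-identified"
# CHI export becomes identifiable when linked to a public record on
# quasi-identifiers. All records below are fabricated for the example.

fitness_app = [  # exported fitness data: no names, so seemingly anonymous
    {"zip": "02139", "birth_year": 1980, "resting_hr": 74},
    {"zip": "02139", "birth_year": 1955, "resting_hr": 88},
]

voter_roll = [  # hypothetical public record that does carry names
    {"name": "A. Smith", "zip": "02139", "birth_year": 1980},
    {"name": "B. Jones", "zip": "02139", "birth_year": 1955},
]

def link(records, public):
    """Join two datasets on shared quasi-identifiers (ZIP + birth year)."""
    matched = []
    for r in records:
        for p in public:
            if (r["zip"], r["birth_year"]) == (p["zip"], p["birth_year"]):
                matched.append({"name": p["name"], **r})
    return matched

# Each "anonymous" fitness record is now attached to a name.
reidentified = link(fitness_app, voter_roll)
```

With only two quasi-identifiers and a toy population the match is trivial; the point is that in real datasets a handful of such attributes is often enough to make "anonymized" CHI identifiable again.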

The news is ripe with references to data “privacy” and data “security,” but the sensitivity associated with health information requires thinking about data “stewardship” – a broader concept that encompasses not only privacy and security but also data asset management and data governance.  Data stewardship captures not only data as an asset, but also as an opportunity to earn public trust and confidence while preserving innovation. 

We predict that how to be good data stewards will be a critical issue for digital health businesses in 2015 and that forward-looking and transparent efforts at self-policing will be key to not only avoiding regulatory scrutiny but also fostering consumer trust.

Big Data

Big Data was big news [...]

Continue Reading





Privacy and Data Protection: 2014 Year in Review

In 2014, regulators around the globe issued guidelines, legislation and penalties in an effort to enhance security and control within the ever-shifting field of privacy and data protection. The Federal Trade Commission confirmed its expanded reach in the United States, and Canada’s far-reaching anti-spam legislation takes full effect imminently. As European authorities grappled with the draft data protection regulation and the “right to be forgotten,” the African Union adopted the Convention on Cybersecurity and Personal Data, and China improved the security of individuals’ information in several key areas. Meanwhile, Latin America’s patchwork of data privacy laws continues to evolve as foreign business increases.

This report furnishes in-house counsel and others responsible for privacy and data protection with an overview of key action points based on these and other 2014 developments, along with advance notice of potential trends in 2015. McDermott will continue to report on future updates, so check back with us regularly.

Read the full report here.





Join Us at BAA’s Marketing Law Conference for a Panel Discussion on Developments in Mobile Marketing

For those Of Digital Interest readers attending the Brand Activation Association’s (BAA) 36th Annual Marketing Law Conference, please join McDermott partner – and Of Digital Interest editor – Julia Jacobson as she moderates a panel titled “New and Unexpected: Developments in Mobile Marketing – Mobile Tracking, Apps and Mobile Payments.” She will be joined by Ira Schlussel of HelloWorld, Inc., Paul Twarog of Google Inc. and co-moderator Terese Arenth. The panel session starts at 3:20 pm on Thursday, November 6.  We hope to see you there.





GPEN Publishes Privacy Sweep Results

On 10 September 2014, the Global Privacy Enforcement Network (GPEN) published the results of its privacy enforcement survey or “sweep” carried out earlier in 2014 with respect to popular mobile apps.  The results of the sweep are likely to lead to future initiatives by data protection authorities to protect personal information submitted to mobile apps.

The purpose of the sweep was to determine the transparency of the privacy practices of some 1,211 mobile apps and involved the participation of 26 data protection authorities across the globe.  The results of the sweep suggest that a high proportion of the apps downloaded did not sufficiently explain how consumers’ personal information would be collected and used.

Background

GPEN was established in 2010 on the recommendation of the Organisation for Economic Co-operation and Development.  GPEN aims to create cooperation between data protection regulators and authorities throughout the world in order to strengthen personal privacy globally.  GPEN is currently made up of 51 data protection authorities across some 39 jurisdictions.

Over the course of a week in May 2014, GPEN's "sweepers" – made up of 26 data protection authorities across 19 jurisdictions, including the UK Information Commissioner's Office (ICO) – participated in the survey by downloading and briefly interacting with the most popular apps released by developers in their respective jurisdictions, in an attempt to recreate a typical consumer's experience.  In particular, GPEN intended the sweep to increase public and commercial awareness of data protection rights and responsibilities, as well as to identify specific high-level issues that may become the focus of future enforcement actions and initiatives.

Sweep Results

The key negative findings of the GPEN sweep include:

  • 85 percent of apps failed to clearly explain how personal information would be processed.
  • 59 percent of apps did not clearly indicate basic privacy information (with 11 percent failing to include any privacy information whatsoever).
  • 31 percent of apps were excessive in their permission requests to access personal information.
  • 43 percent of the apps had not sufficiently tailored their privacy communications for the mobile app platform – often instead relying on full version privacy policies found on websites.

However, the sweep results also highlighted a number of examples of best practices for app developers, including:

  • Many apps provided clear, easy-to-read and concise explanations about exactly what information would be collected, how and when it would be used and, in some instances, explained specifically and clearly what would not be done with the information collected.
  • Some apps provided links to the privacy policies of their advertising partners and opt-out elections in respect of analytic devices.
  • There were good examples of privacy policies specifically tailored to the app platform, successfully making use of just-in-time notifications (warning users when personal information was about to be collected or used), pop-ups and layered information, allowing for consumers to obtain more detailed information if required.

Many of the GPEN members are expected to take further action following the sweep results.  For its part, the UK ICO has commented that in light [...]

Continue Reading





Digital Marketing Minute: A Bad Review for Yelp

The Federal Trade Commission (FTC) announced last week that Yelp – the online service through which consumers can read and write reviews about local businesses – has agreed to pay $450,000 to settle the FTC's charges that Yelp knowingly, and without verifiable parental consent (VPC), collected personal information from children under the age of 13 through its mobile app, in violation of the federal Children's Online Privacy Protection Act (COPPA).

COPPA was enacted in 1998. The FTC, which is responsible for enforcing COPPA, implemented regulations in April 2000 that are known as the COPPA Rule. The FTC issued an amended COPPA Rule in December 2012, which became effective July 1, 2013. 

In general, COPPA and the COPPA Rule prohibit operators of websites, mobile applications or other digital services (collectively, “digital services”) from knowingly collecting personal information from children under age 13 unless and until the digital service operator has VPC. 

Under the amended COPPA Rule, COPPA has a broader scope than digital service operators may realize.  COPPA applies not only to digital services that are directed to children, but also to any general-audience digital service whose operator has "actual knowledge" that the digital service is collecting personal information from children under age 13 without VPC. 

COPPA does not require operators of general-audience digital services to ask users for age or date of birth information.  Under the actual knowledge test, however, if the digital service collects information establishing that a user is under 13, the digital service must be COPPA compliant, which means (among other requirements) obtaining VPC before collecting personal information from the under-age-13 user.

The FTC concluded that Yelp had “actual knowledge” that it was collecting personal information from children under age 13 because the registration page on Yelp’s app asked users to enter their date of birth but did not block access to the app for users who were too young (i.e., under age 13).   

Key Takeaway: If your general-audience digital service asks a user for his or her birth date, make sure that a user who is under age 13 is blocked from using the digital service.  Also, to help prevent users who are too young from circumventing the block, consider one or all of the following techniques:

  1. Request the birth date in a neutral manner, i.e., without a prompt that reveals the age of eligibility, such as "You must be age 13 or older to register."
  2. Present a neutral on-screen error message when a user is under age 13, such as “Sorry, you’re not eligible,” rather than “Sorry, you are under age 13.”
  3. Deploy a cookie or other functionality to prevent an under-age user whose access was blocked from using the back button (or similar technique) to re-enter an old-enough birth date.      
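The first two techniques can be reduced to a small age-gate routine. This is only an illustrative sketch, not FTC-endorsed code; the function names and messages are invented for the example:

```python
from datetime import date

MIN_AGE = 13  # COPPA's age threshold

def age_on(birth_date: date, today: date) -> int:
    """Age in whole years as of `today`, accounting for whether the
    birthday has occurred yet this year."""
    years = today.year - birth_date.year
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years

def screen_registration(birth_date: date, today: date) -> str:
    """Neutral age screen: the prompt never mentions the cutoff (technique 1)
    and the rejection message never reveals why (technique 2)."""
    if age_on(birth_date, today) < MIN_AGE:
        return "Sorry, you're not eligible."  # neutral, non-revealing error
    return "ok"
```

In a real service, technique 3 would sit alongside this: on rejection, set a persistent cookie (or equivalent) so that a blocked user cannot immediately re-submit an older birth date.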




The California AG’s New Guide on CalOPPA – A Summary for Privacy Pros

Last week, the California Attorney General’s Office (AGO) released a series of recommendations entitled Making Your Privacy Practices Public (Guide) designed to help companies meet the requirements of California’s Online Privacy Protection Act (CalOPPA) and “provide privacy policy statements that are meaningful to consumers.”

As we have previously discussed, CalOPPA requires website operators to disclose (1) how they respond to Do Not Track (DNT) signals from browsers and other mechanisms that express the DNT preference, and (2) whether third parties use or may use the site to track (i.e., collect personally identifiable information about) individual California residents "over time and across third party websites."   Since the disclosure requirements became law, however, there has been considerable confusion among companies about how exactly to comply, and some maintain that, despite W3C efforts, there is still no industry-wide accepted definition of what it means to "respond" to DNT signals.  As a result, the AGO engaged in an outreach process, bringing stakeholders together to comment on draft recommendations over a period of several months, culminating in the publication of the final Guide last week.

The Guide is just that – a guide – rather than a set of binding requirements.  However, the recommendations in the Guide do seem to present a road map for how companies might steer clear of an AGO enforcement action in this area.  As a result, privacy professionals may want to consider matching up the following key recommendations from the Guide with existing privacy policies, to confirm that they align or to consider whether it is necessary and appropriate to make adjustments:

  • Scope of the Policy:  Explain the scope of the policy, such as whether it covers online or offline content, as well as other entities such as subsidiaries.
  • Availability:  Make the policy "conspicuous," which means:
    • for websites, put a link on every page that collects personally identifiable information (PII).
    • for mobile apps that collect PII, put a link at the point of download and within the app itself – for example, a link accessible from the "about," "information" or "settings" page.
  • Do Not Track:
    • Prominently label the section of your policy regarding online tracking, for example: “California Do Not Track Disclosures”.
    • Describe how you respond to a browser’s Do Not Track signal or similar mechanisms within your privacy policy instead of merely providing a link to another website; when evaluating how to “describe” your response, consider:
      • Do you treat users whose browsers express the DNT signal differently from those without one?
      • Do you collect PII about browsing activities over time and across third party sites if you receive the DNT signal?  If so, describe uses of the PII.
    • If you choose to link to an online program rather than describe your own response, provide the link with a general description of what the program does.
  • Third Party Tracking:
    • Disclose whether third parties are or may be collecting PII.
    • When drafting the disclosure [...]

      Continue Reading
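The DNT recommendations above presuppose that a site can detect the signal in the first place. Browsers express the preference via the `DNT` request header ("1" meaning do not track, "0" meaning tracking is allowed, absent meaning no preference expressed). A minimal, framework-agnostic sketch, with a function name of our own invention:

```python
def dnt_preference(headers: dict) -> str:
    """Interpret the DNT request header from an incoming HTTP request.

    `headers` is a plain dict of request headers, as most web frameworks
    expose them. Returns one of three states a privacy policy's
    "California Do Not Track Disclosures" section would need to address.
    """
    value = headers.get("DNT")
    if value == "1":
        return "do-not-track"      # user has opted out of tracking
    if value == "0":
        return "tracking-allowed"  # user has affirmatively allowed tracking
    return "unset"                 # no preference expressed
```

Whatever a site then does with each state, CalOPPA's point is that the privacy policy must describe that response rather than stay silent.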




The New Normal: Big Data Comes of Age

On May 1, 2014, the White House released two reports addressing the public policy implications of the proliferation of big data. Rather than trying to slow the accumulation of data or place barriers on its use in analytic endeavors, the reports assert that big data is the "new normal" and encourage the development of policy initiatives and legal frameworks that foster innovation, promote the exchange of information and support public policy goals, while at the same time limiting harm to individuals and society. This Special Report provides an overview of the two reports, puts into context their conclusions and recommendations, and extracts key takeaways for businesses grappling with understanding what these reports—and this "new normal"—mean for them.

Read the full article.





Thinking Outside the HIPAA Box

On Wednesday, May 7, the Federal Trade Commission (FTC) held the third of its Spring Seminars on emerging consumer privacy issues.  This session focused on consumer-generated health information (CHI).  CHI is data generated by consumers' use of the Internet and mobile apps that relates to an individual's health.  The "H" in CHI defies easy definition but likely includes, at a minimum, data generated from internet or mobile app activity related to seeking information about specific conditions, disease/medical condition management tools, support and shared experiences through online communities, or tools for tracking diet, exercise or other lifestyle data.

In the United States, many consumers (mistakenly) believe that all of their health-related information is protected, at the federal level, by the Health Insurance Portability and Accountability Act (HIPAA).  HIPAA does offer broad privacy protections to health-related information, but only to identifiable health information (referred to as "Protected Health Information" or "PHI") received by or on behalf of a "covered entity" or a third party working for a covered entity.  Covered entities are, essentially, health plans and health care providers that engage in reimbursement transactions with health plans. When HIPAA was enacted in 1996, PHI was the primary type of health information, but CHI, which is generally not also PHI, has changed that.  As FTC Commissioner Julie Brill noted in her opening remarks, CHI is "health data stored outside the HIPAA silo."

Without the limitations imposed by HIPAA, online service providers and mobile apps generally can (except where state law requires otherwise) treat CHI like the other, non-health digital data that they collect from consumers.  As a result, the FTC expressed concern that CHI may be aggregated, shared and linked in ways that consumers did not foresee and may not understand.

The panelists at the FTC discussed the difficulty in defining CHI, and whether and how it is different from other kinds of data collected from consumers.  One panelist noted that whether a consumer considers his or her CHI sensitive is highly individualized.  For example, are the heart rate and exercise data collected by mobile fitness apps sensitive? Would the answer to this question change if these data points were linked with other data points that began to suggest other health or wellness indicators, such as weight?  Would the answer change if that linked data were used to predict socioeconomic status, which is often linked to certain health, wellness and lifestyle indicators, or used to inform risk rating or direct-to-consumer targeted advertising?

Panelists also discussed the larger and more general question of how to define privacy in a digital economy and how to balance privacy with the recognized benefits of data aggregation and data sharing.  These questions are compounded by the difficulty of describing data as being anonymized or de-identified – foundational principles in most privacy frameworks – because the quality of being “identifiable” in the digital economy may depend on the proximity of a piece of data to other pieces of data.

Though the “how” and “what” of additional [...]

Continue Reading




