Advertising & Marketing

Although the incorporation of technology into human endeavours—commercial, political and personal—is a normal component of technological innovation, the advent of artificial intelligence technology is producing significant challenges we have not felt or understood with earlier innovations. For many years, for example, there has been speculation, research and public debate about the impact of the internet, the functioning of search engines, and online advertising techniques on commercial and political decisions.

The alleged “hacking” of the 2016 US presidential election, and the concerns about such activities in the 2017 European elections, will only heighten the interweaving discussions on free speech, national sovereignty, cyber security and the nature of privacy.

The use of artificial intelligence and machine-learning technologies has only added to the list of issues and areas of concern. The consequences of automobile accidents involving “self-driving” technologies, the “flash crashes” on securities markets due to algorithmic trading, and bias in systems designed to determine benefit eligibility, are requiring us to consider what happens when we defer judgment to machines, and highlighting the importance of quality in data sets and sensors.


Read Full International News, Fall 2017

On March 23, 2017, the New York Attorney General’s office announced that it had settled with the developers of three mobile health (mHealth) applications (apps) for, among other things, alleged misleading commercial claims. This settlement highlights for mHealth app developers the importance of systematically gathering sufficient evidence to support their commercial claims.


In an age where providers are increasingly taking the management of their patients’ health online and out of the doctor’s office, the creation of scalable and nimble patient engagement tools can serve to improve patient experience, health care outcomes and health care costs. While the level of enthusiasm for these tools is at an all-time high, there is a growing concern about an unexpected deterrent to the adoption of these tools from an unlikely source: the Telephone Consumer Protection Act of 1991 (TCPA).

Many professionals in the health industry have come to share two misconceptions about the TCPA: first, that the TCPA only applies to marketing phone calls or text message “spam,” and second, that the TCPA does not apply to communications from HIPAA covered entities to their patients/health plan members. These misconceptions can be costly mistakes for covered entities that have designed their patient engagement outreach programs without including a TCPA compliance strategy.

Compliance Challenges

As discussed in a previous post, the TCPA was originally intended to curb abusive telemarketing calls. When applying the law to smarter and increasingly innovative technologies (especially those that we see in the patient engagement world), the TCPA poses significant compliance challenges for the users of these tools that arguably threaten to curb meaningful progress on important public health and policy goals.

Despite its initial scope of addressing robocalls, the TCPA also applies to many automated communications between health care providers and their patients, and between plans and their members. A diverse array of technical consent requirements applies depending on the type of phone call being made. For instance, most auto-dialed marketing calls to cell phones require prior express written consent, meaning that the caller must first obtain written consent before making the call. To complicate compliance further, callers remain responsible for proving consent and the accuracy of the numbers dialed.

Indeed, the TCPA presents a serious challenge for patient engagement tools, especially because violations of the TCPA can yield statutory damages of up to $1,500 per call or text message. While Federal Communications Commission orders over the past several years have added some clarity and a “safe harbor” for HIPAA-covered entities to help entities achieve compliance, there is still no “free pass” from the TCPA’s requirements. Therefore, covered entities and the business associates who work for them should not assume that compliance with HIPAA offers any assured defense against a claim under the TCPA.

Continue Reading The TCPA: An Unexpected Deterrent to Patient Engagement Tools

The German Federal Labor Court (Bundesarbeitsgericht (BAG)) has published the reasons for its two decisions about whether an employee can revoke consent given to his or her employer for public use of the employee’s image in photos, videos or other marketing materials (BAG 19 February 2015, 8 AZR 1011/13; BAG 11 December 2014 – 8 AZR 1010/13). The BAG held that (1) an employer can rely on an employee’s voluntary consent under German data privacy laws and (2) an employee must take into account the employer’s interests when justifying his or her revocation of a valid consent.  The BAG’s decisions are notable because they are contrary to the widely-held opinion that employee consent given in the context of the employment relationship is not completely voluntary.

German data privacy and copyright laws require an employer to obtain an employee’s consent to use the employee’s image in photos or videos developed for marketing or similar purposes.  The consent must be voluntarily given and not tied to the employee’s employment status.  Before the BAG’s decisions, some German data privacy law commentators argued that an employee’s consent is not always freely given because of the employee’s subordinate status in the employment relationship.

Now, under the BAG’s decisions, the existence of the employer-employee relationship does not cause an employee’s individual consent to be per se ineffective. The BAG determined that employees can freely choose whether to consent or not. If an employee believes that he or she is subject to discrimination for withholding consent, remedies are available under other German laws. The BAG emphasized that the consent must be in writing and include certain information to be valid and that whether the consent is subsequently revocable depends on the facts and circumstances.

Key Takeaway:

An employer should obtain individual written consent from an employee to use the employee’s image or likeness in marketing materials. To help prevent future revocation, the written consent must state (among other specific requirements) that the employer’s rights survive termination of the employment relationship.

Is a social media promotion part of your organization’s branding plans? Please join Julia Jacobson (McDermott partner and Of Digital Interest editor) and her co-panelists next Tuesday, July 28, 2015, at 2:00 pm for “Sweeps, Contests & Games in Social Media”. The webinar, the second in a three-part series hosted by the Brand Activation Association (a division of the Association of National Advertisers (ANA)) will explore endorsement, intellectual property and privacy legal issues, as well as the practical aspects of balancing brand wants with compliance needs and participation verification and fulfillment.


Last Friday, July 10, 2015, the Federal Communications Commission (FCC) released Declaratory Ruling and Order 15-72 (“Order 15-72”) to address more than 20 requests for clarity on FCC interpretations of the Telephone Consumer Protection Act (TCPA). The release of Order 15-72 follows a June 18th open meeting at which the FCC adopted the rulings now reflected in Order 15-72 that are intended to “close loopholes and strengthen consumer protections already on the books.”

Key rulings in Order 15-72 include:

  • Confirming that text messages are “calls” subject to the TCPA;
  • Clarifying that consumers may revoke their consent to receive robocalls (i.e., telemarketing calls or text messages from an automated system or with a prerecorded or artificial voice) “at any time and through any reasonable means”;
  • Making telemarketers liable for robocalls made to reassigned wireless telephone numbers without consent from the current account holder, subject to “a limited, one-call exception for cases in which the caller does not have actual or constructive knowledge of the reassignment”;
  • Requiring consent for internet-to-phone text messages;
  • Clarifying that “nothing … prohibits” implementation of technology that helps consumers block unwanted robocalls;
  • Allowing certain parties an 89-day window (beginning July 10, 2015) to update consumer consent to “prior express written consent” as the result of an ambiguous provision in the 2012 FCC Order that established the “prior express written consent” requirement; and
  • Exempting from the consent requirement certain free “pro-consumer financial- and healthcare-related messages”.

We are reviewing the more than 135 pages of Order 15-72, as well as the separate statements of FCC Commissioners Wheeler, Clyburn, Rosenworcel (dissenting in part), Pai (dissenting) and O’Rielly (dissenting in part). Please check back soon for more information and analysis.

Earlier this year, AmeriFreight, a Georgia-based auto shipment broker, settled with the Federal Trade Commission (FTC) over charges that the company posted customer reviews on its website while failing to disclose that it had given cash discounts to customers in exchange for the reviews.  According to the FTC complaint, AmeriFreight touted on its website homepage that it had “more highly ranked ratings and reviews than any other company in the automotive transportation business,” yet a majority of the online reviews on AmeriFreight’s website failed to disclose that the reviewers were compensated $50 for posting reviews and were also eligible to receive an additional $100 if selected for the “Best Monthly Review Award.”  The FTC charged that AmeriFreight, by failing to disclose the incentives it had given to reviewers, had misrepresented its customer reviews as those of unbiased consumers.  The FTC’s position can be summed up best by the following quotes from its Director of the Bureau of Consumer Protection: “Companies must make it clear when they have paid their customers to write online reviews” and if companies “fail to do that – as AmeriFreight did – then they’re deceiving consumers, plain and simple.”

The FTC’s Endorsement Guidelines

Guidelines issued in 2009 by the Federal Trade Commission (the “FTC Endorsement Guidelines”) make clear that an advertiser must fully disclose any connection between the advertiser and an endorser of the advertiser’s product or service that might materially affect the weight or credibility of the endorsement, such as the fact that the endorser received compensation or some other benefit or incentive from the advertiser in exchange for providing a favorable review.  An advertiser’s failure to disclose an endorser’s material connection with the advertiser constitutes an unfair and deceptive trade practice as well as false advertising, both in violation of Section 5(a) of the Federal Trade Commission Act.  The requirement of disclosure of material connections applies not only to celebrity, expert or professional endorsers, but also to ordinary consumer-endorsers.  Many companies today use consumer endorsements in promoting their products or services, including so-called “word-of-mouth advertising,” whereby satisfied customers tell other people how much they like a product or service.  A common example of this form of advertising is publishing consumer-submitted reviews on the internet.  Good word of mouth generated by favorable customer reviews can make a big difference in a company’s online ad campaign.  However, companies that are looking to incentivize customers to submit good reviews must be wary of running afoul of the FTC Endorsement Guidelines.  In particular, where a company offers money or other benefits to customers in exchange for good reviews, it must disclose that fact when publishing the reviews.

Key Takeaways for Businesses

The FTC’s complaint against AmeriFreight is the first time the agency has charged a company with misrepresenting online reviews by failing to disclose that it gave cash discounts to customers to post the reviews.  This has significant implications for businesses that use customer reviews as part of their advertising or marketing initiatives.  The AmeriFreight case makes clear that advertisements, regardless of form, must be transparent.  When a business touts its products or services, whether in endorsed advertisements or customer reviews, it must make clear that it has paid its customers and/or endorsers to review or endorse the product or service.  A business may not tout its “highly ranked ratings and reviews” or the like if it offered incentives to its reviewers without first disclosing the material connection between its endorsers and the business.  Hiding this fact may subject a business to enforcement under the FTC Act and associated fines and penalties.

Thursday, April 30, 2015, marks the last day a business can request a retroactive waiver for failing to comply with certain fax advertising requirements promulgated by the Federal Communications Commission (FCC). The scope of these requirements was clarified on October 30, 2014, when the FCC issued an Order (2014 Order) under the Junk Fax Prevention Act of 2005 (Junk Fax Act). The 2014 Order confirms that senders of all advertising faxes must include information that allows recipients to opt out of receiving future faxes from that sender.

The 2014 Order clarifies certain aspects of the FCC’s 2006 Order under the Junk Fax Act (the Junk Fax Order). Among other requirements, the Junk Fax Order established the requirement that the sender of an advertising fax provide notice and contact information that allows a recipient to “opt out” of any future fax advertising transmissions.

Following the FCC’s publication of the Junk Fax Order, some businesses interpreted the opt-out requirements as not applying to advertising faxes sent with the recipient’s prior express permission (based on footnote 154 in the Junk Fax Order). The 2014 Order provided a six-month period for senders to comply with the opt-out requirements of the Junk Fax Order for faxes sent with the recipient’s prior express permission and to request retroactive relief for failing to comply. The six-month period ends on April 30, 2015. Without a waiver, the FCC noted that “any past or future failure to comply could subject entities to enforcement sanctions, including potential fines and forfeitures, and to private litigation.”

For more information about the Junk Fax Act in general, or the waiver request process in particular, please contact Julia Jacobson or Matt Turnell.

On February 27, 2015, the Obama White House released an “Administration Discussion Draft” of its Consumer Privacy Bill of Rights Act of 2015 (Proposed Consumer Privacy Act).

The Proposed Consumer Privacy Act revises and builds on the “Consumer Privacy Bill of Rights” that the Obama White House released in its 2012 Consumer Data Privacy in a Networked World: A Framework for Protecting Privacy and Promoting Innovation in the Global Digital Economy report.

As described during President Obama’s January 12 visit to the Federal Trade Commission (FTC), the Proposed Consumer Privacy Act identifies seven “basic principles to both protect personal privacy and ensure that industry can keep innovating.”   These seven principles are:

  1. Transparency (§101): Transparency is a principle frequently cited in guidance from the FTC, as well as in self-regulatory frameworks, such as the Digital Advertising Alliance’s cross-industry code for interest-based advertising. The Proposed Consumer Privacy Act describes transparency as “concise and easily understandable language, accurate, clear, timely, and conspicuous notice about privacy and security practices.” The notice required from an entity subject to the Proposed Consumer Privacy Act (defined as a “covered entity” (CE)) must describe the entity’s collection, use, disclosure, retention, destruction and security practices.
  2. Individual Control (§102): The Individual Control principle means offering consumers a “reasonable means to control the processing (i.e., taking any action regarding) personal data about them in proportion to the privacy risk to the individual and consistent with context.” An individual must have a way either to withdraw consent related to his or her personal data that is “reasonably comparable” to the means by which the consent was initially granted, or to request that the CE “de-identify” (as defined in the Proposed Consumer Privacy Act) his or her personal data.
  3. Respect for Context (§103): Under the Respect for Context principle, a CE must process personal data reasonably “in light of context.” If the processing is not reasonable, the CE must undertake a “privacy risk analysis” to identify and take reasonable steps to mitigate privacy-related risk, including “heightened transparency and individual control,” such as just-in-time notices.  Reasonableness is presumed when a CE’s personal data processing “fulfills an individual’s request.”
  4. Focused Collection and Responsible Use (§104): The Focused Collection and Responsible Use principle requires that a CE limit its collection, retention and use of personal data to a “manner that is reasonable in light of context.” The CE also must “delete, destroy, or de-identify” personal data within a “reasonable time” after the original purpose for its collection, retention, or use has been fulfilled.
  5. Security (§105): Under the Security principle, a CE must: identify internal and external “risks to privacy and security” of personal data; implement and maintain safeguards “reasonably designed” to secure personal data; regularly assess the efficacy of the safeguards, and adjust the safeguards to reflect material changes to business practices or “any other circumstances that create a material impact on the privacy or security” of personal data under the CE’s control. The four factors presented for evaluating the reasonableness of security safeguards are: (i) degree of privacy risk; (ii) foreseeability of risks; (iii) “widely accepted practices”; and (iv) cost.
  6. Access and Accuracy (§106): The Access principle requires a CE to give an individual “reasonable access to, or accurate representation of” that individual’s personal data under the CE’s control, with limitations on access for (among others) legally privileged information, law enforcement, national security or frivolous requests.  The Accuracy principle also requires a CE to establish a procedure for an individual to ensure that his or her personal data held by the CE is accurate.   A CE does not need to correct personal data obtained “directly from the individual” or from certain governmental databases.  The CE may decline to correct the personal data but must destroy or delete it upon request.
  7. Accountability (§107): Accountability under the Proposed Consumer Privacy Act means that a CE must (among other measures) provide employee training, conduct evaluation of privacy protections, adopt a “privacy by design” approach to its systems and practices and “bind” downstream users of its personal data to the CE’s commitments to the individuals from which the personal data was collected, as well as the requirements of the Proposed Consumer Privacy Act.

The Proposed Consumer Privacy Act does not cover much new ground, but its significance rests in the knitting together of the existing guidelines from the FTC and other federal regulators (e.g., National Telecommunications and Information Administration) and industry self-regulatory codes, as well as its intention to begin to provide a more certain outline to practices that will be deemed unfair or deceptive. The Act also buttresses the FTC’s role as “enforcer in chief” for consumer privacy, which has significance given challenges to the scope of the FTC’s authority to regulate data security.

Some of the provisions that we found notable in the Proposed Consumer Privacy Act include:

  • That “context” (a defined term) of personal data collection is a prominent feature in determining whether consent has been obtained from individuals (see, e.g., Subsection 4k(3)) is notable for a business struggling with how to manage consumer expectations about how their personal data is used and disclosed.
  • A CE that complies with an FTC-approved code of conduct for processing personal data has safe harbor protection under the Proposed Consumer Privacy Act (§301). Federal regulators have consistently shown support for industry codes of conduct as a means to police data privacy and protection.  In its 2012 White House Report, the Obama White House noted the importance of self-regulatory guidance as a framework for regulating consumer privacy and security on the Internet.  Also in 2012, the U.S. Department of Commerce recommended legislative authority for the FTC to provide input about and directly enforce “industry codes of conduct.”  Further, the FTC has indicated that publicizing participation in a code of conduct but failing to adhere to the code can be a deceptive trade practice.  The Proposed Consumer Privacy Act expressly prohibits a state or local government from enforcing any personal data processing law if the CE is entitled to safe harbor protection through compliance with an approved industry code.
  • Enforcement is carried out through federal and state regulators and does not include a private right of action, consistent with the Federal Trade Commission Act, the Children’s Online Privacy Protection Act (COPPA) and HIPAA.
    • On the federal level, the FTC can treat violations as unfair and deceptive trade practices pursuant to Section 5 of the Federal Trade Commission Act. The FTC is not allowed to undertake enforcement against a CE during its first 18 months of processing personal data. Presumably, this grace period is intended to fulfill the stated goal of not stifling innovation.  Note, too, that the FTC is expressly prohibited from requiring a CE to deploy or use specific products or technologies.
    • On the state level, a state Attorney General can bring a civil action for a violation of the Proposed Consumer Privacy Act that “caused or is causing” harm to a “substantial number” of its state’s residents. The only remedy available is injunctive relief unless the FTC intervenes.
  • The Proposed Consumer Privacy Act would preempt “any provision of a statute, regulation, or rule of a State or local government” that “imposes requirements on covered entities with respect to personal data processing” but not state consumer protection or data breach notification laws, or state or local laws that address the “processing of health information or financial information.”

We will continue to assess and monitor the Proposed Consumer Privacy Act, as well as the White House’s other privacy legislation, such as the Student Digital Privacy Act, against the backdrop of the wide range of privacy, security, breach and data utility initiatives underway at the state and federal level.

2014 was a busy year for the Federal Trade Commission (FTC) with the Children’s Online Privacy Protection Act (COPPA).  The FTC announced something new under COPPA nearly every month, including:

  • In January, the FTC issued an updated version of the free consumer guide, “Net Cetera:  Chatting with Kids About Being Online.”  Updates to the guide include advice on mobile apps, using public WiFi securely, and how to recognize text message spam, as well as details about recent changes to COPPA.
  • In February, the FTC approved the kidSAFE Safe Harbor Program.  The kidSAFE certification and seal of approval program helps children-friendly digital services comply with COPPA.  To qualify for a kidSAFE seal, digital operators must build safety protections and controls into any interactive community features; post rules and educational information about online safety; have procedures for handling safety issues and complaints; give parents basic safety controls over their child’s activities; and ensure all content, advertising and marketing is age-appropriate.
  • In March, the FTC filed an amicus brief in the 9th U.S. Circuit Court of Appeals, arguing that the ruling of the U.S. District Court for the Northern District of California in Batman v. Facebook, which held that COPPA preempts state law protections for the online activities of teenagers and children outside of COPPA’s coverage, is “patently wrong.”
  • In April, the FTC updated its “Complying with COPPA:  Frequently Asked Questions” (aka the COPPA FAQs) to address how COPPA applies in the school setting.  In FAQ M.2, the FTC discussed whether a school can provide the COPPA-required consent on behalf of parents, stating that “Where a school has contracted with an operator to collect personal information from students for the use and benefit of the school, and for no other commercial purpose, the operator is not required to obtain consent directly from parents, and can presume that the school’s authorization for the collection of students’ personal information is based upon the school having obtained the parents’ consent.”  But, the FTC also recommends as “best practice” that schools provide parents with information about the operators to which it has consented on behalf of the parents.  The FTC requires that the school investigate the collection, use, sharing, retention, security and disposal practices with respect to personal information collected from its students.
  • In July, COPPA FAQ H.5, FAQ H.10, and FAQ H.16 about parental consent verification also were updated.  In FAQ H.5, the FTC indicates that “collecting a 16-digit credit or debit card number alone” is not sufficient as a parental consent mechanism, but that, in some circumstances, “collection of the card number – in conjunction with implementing other safeguards – would suffice.”  Revised FAQ H.10 indicates that a developer of a child-directed app may use a third party for parental verification “as long as [developers] ensure that COPPA requirements are being met,” including the requirement to “provide parents with a direct notice outlining [the developer’s] information collection practices before the parent provides his or her consent.” In revised FAQ H.16, the FTC addresses whether an app store operator that offers a verifiable parental consent mechanism is exposed to liability under COPPA.  Because an app store operator does not qualify as an “operator” under COPPA, the app store is not liable under COPPA “for failing to investigate the privacy practices of the operators for whom [they] obtain consent,” but could be liable under the FTC Act for false or deceptive practices.
  • In August, the FTC approved the Internet Keep Safe Coalition (iKeepSafe) program as a safe harbor oversight program. The FTC also called for public comments on AgeCheq, Inc.’s parental verification method, which sought to verify parental identity via a financial transaction or a hand-signed declaration.  The FTC subsequently rejected the proposed method in November because these methods had already been recognized as valid means of obtaining verifiable parental consent under COPPA, and it emphasized that companies are free to develop common consent mechanisms without Commission approval.
  • In September, Yelp was fined $450,000 for failing to comply with COPPA.  (See our blog post here).  Also in September, TinyCo (the developer of Tiny Pets, Tiny Zoo, Tiny Village, Tiny Monsters and Mermaid Resort) was fined $300,000 for collecting children’s email addresses, in exchange for in-game bonuses, without parental consent in violation of COPPA.
  • In November, AgeCheq, Inc. proposed a second parental consent verification method to ensure COPPA compliance.  The second proposed method consisted of a device-signed parental consent form with a multi-step method requiring entry of a code sent by text message to a mobile device. The Center for Digital Democracy urged the FTC to reject AgeCheq’s method in comments filed on December 29, 2014.  On January 29, 2015, the FTC announced its rejection of AgeCheq’s second proposed parental verification method.
  • In December, the FTC warned BabyBus, a China-based children’s app developer, that its apparent collection of user geolocation information may violate COPPA if (i) user geolocation information is indeed being collected and (ii) the company does not get parents’ consent before collecting the information from children under age 13.  The FTC noted that “COPPA and its related rules apply to foreign-based Web sites and online services that are involved in commerce in the United States.”

Given California’s new student privacy law, Student Online Personal Information Protection Act (effective January 1, 2016), and the recent increased focus on student privacy resulting from President Obama’s announcement about the Student Privacy Act, we expect that 2015 also will be an active year for children’s privacy.  Stay tuned!