Last week, the US Court of Appeals for the DC Circuit issued a long-awaited decision on an omnibus challenge to the FCC’s interpretation of the TCPA. While the decision provides some relief for businesses, it does not eliminate the prospect of TCPA liability and leaves important TCPA interpretive questions unresolved. Businesses should continue to be vigilant regarding consent and opt-out procedures when sending automated text messages and automated or pre-recorded calls to consumers.
On July 28, 2016, the US Department of Health and Human Services (HHS) issued guidance under the Health Insurance Portability and Accountability Act (HIPAA) on what covered entities and business associates can do to prevent and recover from ransomware attacks. Ransomware attacks can also trigger concerns under state data breach notification laws.
The HIPAA Security Rule requires covered entities and business associates to implement security measures. It also requires covered entities and business associates to conduct an accurate and thorough risk analysis of the potential risks and vulnerabilities to the confidentiality, integrity and availability of electronic protected health information (ePHI) the entities create, receive, maintain or transmit and to implement security measures sufficient to reduce those identified risks and vulnerabilities to a reasonable and appropriate level. The HIPAA Security Rule establishes a floor for the security of ePHI, although additional and/or more stringent security measures are certainly permissible and may be required under state law. Compliance with HIPAA’s existing requirements provides covered entities and business associates with guidance on how to prevent and address breaches that compromise protected health information. The new HIPAA guidance specific to ransomware reinforces how the existing requirements can help an entity protect sensitive information.
Read the full article here.
2014 was a busy year for the Federal Trade Commission (FTC) with the Children’s Online Privacy Protection Act (COPPA). The FTC announced something new under COPPA nearly every month, including:
- In January, the FTC issued an updated version of the free consumer guide, “Net Cetera: Chatting with Kids About Being Online.” Updates to the guide include advice on mobile apps, using public WiFi securely, and how to recognize text message spam, as well as details about recent changes to COPPA.
- In February, the FTC approved the kidSAFE Safe Harbor Program. The kidSAFE certification and seal of approval program helps children-friendly digital services comply with COPPA. To qualify for a kidSAFE seal, digital operators must build safety protections and controls into any interactive community features; post rules and educational information about online safety; have procedures for handling safety issues and complaints; give parents basic safety controls over their child’s activities; and ensure all content, advertising and marketing is age-appropriate.
- In March, the FTC filed an amicus brief in the 9th U.S. Circuit Court of Appeals, arguing that the ruling of the U.S. District Court for the Northern District of California in Batman v. Facebook that COPPA preempts state law protections for the online activities of teenagers (children outside of COPPA’s coverage) is “patently wrong.”
- In April, the FTC updated its “Complying with COPPA: Frequently Asked Questions” (aka the COPPA FAQs) to address how COPPA applies in the school setting. In FAQ M.2, the FTC discussed whether a school can provide the COPPA-required consent on behalf of parents, stating that “Where a school has contracted with an operator to collect personal information from students for the use and benefit of the school, and for no other commercial purpose, the operator is not required to obtain consent directly from parents, and can presume that the school’s authorization for the collection of students’ personal information is based upon the school having obtained the parents’ consent.” The FTC also recommends as a “best practice” that schools provide parents with information about the operators to which they have consented on behalf of the parents, and advises that schools investigate the collection, use, sharing, retention, security and disposal practices with respect to personal information collected from their students.
- In July, COPPA FAQ H.5, FAQ H.10 and FAQ H.16 about parental consent verification also were updated. In FAQ H.5, the FTC indicates that “collecting a 16-digit credit or debit card number alone” is not sufficient as a parental consent mechanism, but that, in some circumstances, “collection of the card number – in conjunction with implementing other safeguards – would suffice.” Revised FAQ H.10 indicates that a developer of a child-directed app may use a third party for parental verification “as long as [developers] ensure that COPPA requirements are being met,” including the requirement to “provide parents with a direct notice outlining [the developer’s] information collection practices before the parent provides his or her consent.” In revised FAQ H.16, the FTC addresses whether an app store operator that offers a verifiable parental consent mechanism is exposed to liability under COPPA. Because an app store operator does not qualify as an “operator” under COPPA, the app store is not liable under COPPA “for failing to investigate the privacy practices of the operators for whom [they] obtain consent,” but could be liable under the FTC Act for false or deceptive practices.
- In August, the FTC approved the Internet Keep Safe Coalition (iKeepSafe) program as a safe harbor oversight program. The FTC also called for public comments on AgeCheq, Inc.’s parental verification method, which sought to verify parental identity via a financial transaction or a hand-signed declaration. The FTC declined to approve the proposed method in November because those methods had already been recognized as valid means of obtaining verifiable parental consent under COPPA, and it emphasized that companies are free to develop common consent mechanisms without Commission approval.
- In September, Yelp was fined $450,000 for failing to comply with COPPA. (See our blog post here). Also in September, TinyCo (the developer of Tiny Pets, Tiny Zoo, Tiny Village, Tiny Monsters and Mermaid Resort) was fined $300,000 for collecting children’s email addresses, in exchange for in-game bonuses, without parental consent in violation of COPPA.
- In November, AgeCheq, Inc. proposed a second parental consent verification method to ensure COPPA compliance. The second proposed method consisted of a device-signed parental consent form with a multi-step method requiring entry of a code sent by text message to a mobile device. The Center for Digital Democracy urged the FTC to reject AgeCheq’s method in comments filed on December 29, 2014. On January 29, 2015, the FTC announced its rejection of AgeCheq’s second proposed parental verification method.
- In December, the FTC warned BabyBus, a China-based children’s app developer, that its apparent collection of user geolocation information may violate COPPA if (i) user geolocation information is indeed being collected and (ii) the company does not get parents’ consent before collecting the information from children under age 13. The FTC noted that “COPPA and its related rules apply to foreign-based Web sites and online services that are involved in commerce in the United States.”
Given California’s new student privacy law, the Student Online Personal Information Protection Act (effective January 1, 2016), and the recent increased focus on student privacy resulting from President Obama’s announcement of the proposed Student Digital Privacy Act, we expect that 2015 also will be an active year for children’s privacy. Stay tuned!
Since 2001, the French Court of Cassation has made a continuous effort to refine and, in some circumstances, narrow the scope of the right to privacy in the workplace with a view to reaching a fair and balanced approach. The January 6, 2015 declaration of the French Data Protection Authority (CNIL) further highlights this trend towards the standardization of information collection at work, and serves to clarify and expand the right of employers to listen in on employees’ phone calls at work.
In the landmark 2001 “Nikon Case,” the Court of Cassation ruled that “an employee has the right to the respect of his private life – including the right to the secrecy of correspondence – on the work premises and during working hours.” This announcement was qualified, however: the court further held that, unless marked by the employee as “private,” documents and files created by an employee on a company computer for work purposes are presumed to be professional, which means that the company can access them without the employee being present. This can lead to an employer using such emails against an employee in the case of employment termination. Nonetheless, employers have an obligation under privacy and labor laws to inform employees about the collection and use of their personal data.
Building on this decision, in October 2014, the social chamber of the French Court of Cassation held that evidence gathered against an employee from data that had not previously been declared to and registered with the CNIL was de facto illegal.
The French Labor Code and the French Data Protection Act both stipulate rules for the use of monitoring software by employers in the event that an employer wishes to establish such mechanisms. In particular, the employer must submit information to and engage in consultation with the works council, provide information to employees impacted by the software and make a formal declaration of the proposed monitoring activities to CNIL.
CNIL Declaration: Movement Toward a Simplified Norm
Continuing this trend, the declaration issued by the CNIL on January 6, 2015, further demonstrates not only how important the CNIL is, but also how the area of data protection is evolving and becoming more standardized in France.
This recent declaration established that employers wishing to record their employees’ telephone communications must first declare such processing by filling out a simplified declaration form in lieu of a normal declaration form. After effectuating this simplified declaration, an employer will have the ability to listen to and record employee conversations for the purposes of employee training, evaluation and betterment of the quality of service.
While this declaration serves to grant employers permission to monitor employees, it also imposes upon them a number of restrictions: (i) the employee must be notified and informed of his or her right to refuse such recordings and (ii) the employer may keep the recordings only for a period of six months. The information gathered from such recordings, however, may be kept for a reasonable period of time.
The issuance of this simplified norm for the monitoring of employee phone calls at work may not greatly affect the practical daily lives of employers or employees, but it does speak volumes about the power that the CNIL retains to impact labor law. The CNIL’s declaration creates a sort of exception to normal procedure. Most often, the adoption of a simplified norm corresponds with the daily needs of businesses and administrations as the CNIL deems fit. In deciding to isolate the procedure for employee monitoring and by reclassifying the declaration procedure, the CNIL is highlighting and reacting to just how deeply the issue of data protection has permeated the world of labor law.
In 2014, regulators around the globe issued guidelines, legislation and penalties in an effort to enhance security and control within the ever-shifting field of privacy and data protection. The Federal Trade Commission confirmed its expanded reach in the United States, and Canada’s far-reaching anti-spam legislation takes full effect imminently. As European authorities grappled with the draft data protection regulation and the “right to be forgotten,” the African Union adopted the Convention on Cybersecurity and Personal Data, and China improved the security of individuals’ information in several key areas. Meanwhile, Latin America’s patchwork of data privacy laws continues to evolve as foreign business increases.
This report furnishes in-house counsel and others responsible for privacy and data protection with an overview of key action points based on these and other 2014 developments, along with advance notice of potential trends in 2015. McDermott will continue to report on future updates, so check back with us regularly.
More than a decade ago, “dual use” devices (i.e., one device used for both work and personal reasons) began creeping into workplaces around the globe. Some employees insisted on bringing fancy new smart phones from home to replace the company-issued clunker and, while many employers resisted at first, dual use devices quickly became so popular that allowing them became inevitable or necessary for employee recruitment and retention, not to mention the cost savings that could be achieved by having employees buy their own devices. Because of early resistance, however, many HR and IT professionals found themselves scrambling in a reactive fashion to address the issues that these devices can raise in the workplace after they were already prevalent. Today, most companies have robust policies and procedures to address the risks presented by dual use devices, setting clear rules for addressing privacy, security, protection of trade secrets, records retention and legal holds, as well as for preventing harassment, complying with the National Labor Relations Act (NLRA), protecting the company’s relationships and reputation, and more.
In 2014, there is a new trend developing in the workplace: wearable technologies. The lesson to be learned from the dual use device experience of the past decade: Companies should consider taking proactive steps now to identify the risks presented by allowing wearables at work, and develop a strategy to integrate them into the workplace in a way that maximizes employee engagement, but minimizes corporate risk.
An effective integration strategy will depend on the particular industry, business needs, geographic location and corporate culture, of course. The basic rule of thumb from a legal standpoint, however, is that although wearables present a new technology frontier, the old rules still apply. This means that companies will need to consider issues of privacy, security, protection of trade secrets, records retention, legal holds and workplace laws like the NLRA, the Fair Labor Standards Act, laws prohibiting harassment and discrimination, and more.
Employers evaluating use of these technologies should consider two angles. First, some companies may want to introduce wearables into the workplace for their own legitimate business purposes, such as monitoring fatigue of workers in safety-sensitive positions, facilitating productivity or creating efficiencies that make business operations run more smoothly. Second, some companies may want to consider allowing “dual use” or even just “personal use” wearables in the workplace.
In either case, companies should consider the following as part of an integration plan:
- Identify a specific business-use case;
- Consider the potential for any related privacy and security risks;
- Identify how to mitigate those risks;
- Consider incidental impacts and compliance issues – for instance, how the technologies impact the existing policies on records retention, anti-harassment, labor relations and more;
- Build policies that clearly define the rules of the road;
- Train employees on the policies;
- Deploy the technology; and
- Review the program after six or 12 months to confirm the original purpose is being served and whether any issues have emerged that should be addressed.
In other words, employers will need to run through a similar evaluation as when they adopted dual use devices, while accounting for the unique features of these technologies. For example, will employees be permitted to record conversations with their colleagues? Will employees be able to use wearables in every room in the building, or should there be limitations, such as avoiding locker rooms, or classified areas? What about use while operating a company-owned vehicle? There are many issues to consider and convening a work group of tech-savvy employees with interested managers and other HR professionals can help to issue spot and address potential concerns in a proactive way prior to these devices becoming more prevalent. The company should then create appropriate policies on use (or simply amend existing dual use or personal use device policies to include these technologies), train employees on the new rules of the road, and then follow through with discipline for those employees who fail to follow the rules, potentially suspending the right to use wearables at work in such cases.
There is no question that wearables are here to stay. And if they have not yet shown up in your workplace, they soon will. By following these tips, employers can take steps now to develop an effective implementation strategy to maximize employee engagement and business benefits while minimizing company risk.
Changes Impacting Businesses that Process Personal Data in Russia
On July 21, 2014, a new law, Federal Law № 242-FZ (Database Law), was adopted in Russia, introducing amendments to the existing Federal Law “On personal data” and to the existing Federal Law “On information, information technologies and protection of information.” The new Database Law requires companies to store and process personal data of Russian nationals in databases located in Russia. At a minimum, the practical effect of the new Database Law is that companies operating in Russia that collect, receive, store or transmit (“process”) personal data of natural persons in Russia will be required to place servers in Russia if they plan to continue doing business in that market. This would include, for example, retailers, restaurants, cloud service providers, social networks and companies operating in the transportation, banking and health care spheres. Importantly, while the Database Law is not scheduled to come into force until September 1, 2016, a new bill was introduced on September 1, 2014, to move that date up to January 1, 2015. The transition period is designed to give companies time to adjust to the new Database Law and decide whether to build up local infrastructure in Russia, find a partner with such infrastructure in Russia, or cease processing information of Russian nationals. If the bill filed on September 1 becomes law, however, that transition period will be substantially shortened and businesses operating in Russia will need to act fast to comply by January 1.
Some mass media in Russia have interpreted provisions of the Database Law as banning the processing of Russian nationals’ personal data abroad. However, this is not written explicitly into the law and until such opinion is confirmed by the competent Russian authorities, this will continue to be an open question. There is hope that the lawmakers’ intent was to give a much needed boost to the Russian IT and telecom industry, rather than to prohibit the processing of personal data abroad. If this hope is confirmed, then so long as companies operating in Russia ensure that they process personal data of Russian nationals in databases physically located in Russia, they also should be able to process this information abroad, subject to compliance with cross-border transfer requirements.
The other novelty of the new Database Law is that it grants the Russian data protection authority (DPA) the power to block access to information resources that process information in breach of Russian laws. Importantly, the Database Law provides that this blocking authority applies irrespective of the offending company’s location or whether it is registered in Russia. However, the DPA can initiate the blocking procedure only on the basis of a court judgment. Based on the court judgment, the DPA will then be able to require a hosting provider to take steps to eliminate the infringements. For example, the hosting provider must inform the owner of the information resource that it must eliminate the infringement, or the hosting provider must restrict the owner’s access to the information that is processed in breach of the law. If the owner refuses or fails to act, the hosting provider is obliged to restrict access to the respective information resource altogether. If the hosting provider does not perform the foregoing steps in due course, the DPA may request that the communication service provider restrict access to the respective information resource altogether, in particular by web address, domain name and references to web pages on the internet.
Changes Impacting Businesses that Process Internet Communications
In addition to the new Database Law, a new Federal Law № 97-FZ dated May 5, 2014 (Moderator Law) amends the existing Federal Law “On information, information technologies and information protection” to create new obligations for organizers of information distribution on the internet (moderators). The term “moderator” is defined to cover those maintaining information systems or software designed or used to receive, transfer, deliver or process electronic messages on the internet. The relevant Russian regulator has clarified unofficially that the Moderator Law is addressed only to instant messaging, blogging, social media and e-mail (see clarification at http://rkn.gov.ru/press/publications/news26545.htm). However, the broad and ambiguous definition makes it possible to apply the Moderator Law to every website that has a chat or comment feature, or that is capable of sending or receiving messages from users. As written, the definition might also apply to e-commerce, cloud storage services and more.
The amendments impose several new obligations on moderators, some of which give sweeping new rights of access to the Russian Government:
- All moderators must file a notification with the state authorities upon commencing moderating activity (meaning, upon maintaining information systems or software designed or used to receive, transfer, deliver or process electronic messages on the internet). An entity files the notification upon request of the competent state authority or at its own initiative, and qualifies as a moderator upon its inclusion in a special Register of moderators. The notification procedure is specified in Governmental Regulation № 746 dated July 31, 2014, which became effective August 12, 2014.
- All moderators are obliged to store, in the territory of Russia and for not less than six months, information on the facts of reception, transfer, delivery and processing of users’ electronic messages, as well as data about those users. The types of information to be stored are determined in the recently published Governmental Regulation № 759 dated July 31, 2014, which became effective August 14, 2014. The Regulation also specifies the categories of users whose electronic messages and data must be stored. Moderators also are obliged to transfer such information to the competent state authorities upon request. The requested information must be provided by the moderator within a specified term, as a general rule 30 days, although urgent requests may require the information to be provided within three days.
- Moderators are obliged to comply with requirements for technical equipment, software and hardware tools established by the state authorities responsible for security (for example, the Federal Security Service) and those conducting criminal investigations, so that those authorities can perform their functions. For example, if a state authority cannot decrypt requested information on the moderator’s information systems, the moderator must assist the authority by taking the steps required to grant access to the information it needs. The detailed procedure for moderators’ cooperation with state authorities on technical requirements is specified in Governmental Regulation № 743 dated July 31, 2014, which became effective August 12, 2014.
Note, however, that the outlined obligations do not apply to operators of state (municipal) information systems, communications operators (i.e., legal entities rendering communications services under the respective license) or individuals acting as moderators for private (personal) purposes.
If a moderator fails to comply with the Moderator Law or its implementing Regulations, the competent state authority is entitled to restrict access to the moderator’s information resources by following the statutory procedure set forth in Governmental Regulation № 745 dated July 31, 2014, which became effective August 12, 2014.
The violation of the Moderator Law and its implementing Regulations exposes the company and its officers to the following potential fines:
- Failure to file required notifications can result in a company fine ranging from 100,000 to 300,000 RUR (approximately US$2,696 to US$8,088) and a fine for company officers ranging from 10,000 to 30,000 RUR (approximately US$270 to US$809);
- Failure to store information or ensure access by authorities can result in a company fine ranging from 300,000 to 500,000 RUR (approximately US$8,088 to US$13,479) and a fine for company officers ranging from 30,000 to 50,000 RUR (approximately US$809 to US$1,348); and
- Failure to comply with technical requirements can result in a company fine ranging from 300,000 to 500,000 RUR (approximately US$8,088 to US$13,479) and a fine for company officers ranging from 30,000 to 50,000 RUR (approximately US$809 to US$1,348).
Guest author Maria Ostashenko is Of Counsel at ALRUD Law Firm, based in Moscow, Russia. Ms. Ostashenko and the ALRUD Firm are part of McDermott’s worldwide network of local privacy counsel who enable us to deliver seamless advice to multinational clients with the speed, efficiency and quality that our clients have come to expect from our team.
Boston-based litigation partner Matt Turnell shares his predictions about class action litigation under the Telephone Consumer Protection Act (TCPA) and Electronic Communications Privacy Act (ECPA) in 2014 and Boston-based white-collar criminal defense and government investigations partner David Gacioch shares his predictions about government responses to data breaches.
Class Action Litigation Predictions
2014 is already shaping up to be an explosive year for privacy- and data-security-related class actions. Last December’s data breach at Target has already led to more than 70 putative class actions being filed against the retailer. With recently disclosed data breaches at Neiman Marcus and Michaels Stores—and possibly more to come at other major retailers—court dockets will be flooded with these suits this year. And consumers are not the only ones filing class actions; banks that have incurred extra costs as a result of the data breaches are headed to court as well, with at least two putative class actions on behalf of banks filed so far against Target.
That volume of litigation related to the Target data breaches likely will be matched by a steady stream of class actions filed under the TCPA. 2013 was a busy year for the TCPA docket and I expect that the Federal Communications Commission’s (FCC) stricter rules requiring express prior written consent from the called party, which took effect in October 2013, means that 2014 will be just as busy since the majority of TCPA class actions seek statutory damages for companies’ failure to obtain consent before making autodialed or prerecorded voice calls or sending unsolicited text messages or faxes.
In 2014, I expect to see key decisions under the ECPA related to social media platforms and email providers capturing and using content from customers’ emails and other messages for targeted advertising or other purposes. One district court has already denied a motion to dismiss an ECPA claim challenging this conduct and I predict that other decisions are forthcoming this year. Needless to say, decisions in favor of class-action plaintiffs in this area could have major implications for how social media sites and email providers do business.
– Matt Turnell, Partner
Government Responses to Data Breaches
As significant data breaches continue to dominate the news, public awareness of data privacy and security issues will increase, as will their political appeal. I expect to see in 2014:
- Record numbers of breach reports to state and federal regulators, as awareness of reporting obligations spreads further and further across data owner, licensee, broker and transmitter groups;
- More states committing more enforcement resources to data privacy and security, including budget dollars and dedicated attorney general’s office units;
- More state/federal and multi-state coordination of investigations, leading to increased settlement leverage by enforcement authorities vis-à-vis firms under investigation; and
- Greater numbers and dollar values of settlements by the Federal Trade Commission (FTC) and state attorneys general than ever before.
Similarly, the HIPAA Omnibus Final Rule going into effect on September 23, 2013, coupled with the late-2013 Department of Health and Human Services (HHS) Office of Inspector General report decrying the HHS Office for Civil Rights’ (OCR) recent pace of HIPAA-related auditing and enforcement, will lead to a jump in HIPAA breach reporting and harder lines taken by OCR with respect to investigation dispositions. Therefore, expect increased settlement counts and dollar values in OCR enforcement during 2014, too.
Substantively, expect enforcement agencies to continue focusing their greatest attention on companies that they perceive as foot-dragging or stone-walling on notification obligations in the aftermath of breaches.
– David Gacioch, Partner
Privacy and data protection continue to be an exploding area of focus for regulators in the United States and beyond. This report gives in-house counsel and others responsible for privacy and data protection an overview of some of the major developments in this area in 2013 around the globe, as well as a prediction of what is to come in 2014.
New technologies enable marketers to collect and analyze more — and more specific— data than ever before. Marketers can track consumers across the internet and mobile applications, and can deliver advertising based on consumers’ interests inferred from the collected data. In theory, consumer tracking enables marketers to present advertising to consumers who are predisposed to a specific product or service, producing a higher purchase rate and transaction price, and a greater return on investment in marketing activities.
While these new technologies make advertising and marketing more targeted and efficient, they also present new challenges for marketers. Although a majority of consumers understand the “pay with data” model through which websites, mobile applications and other digital services are made available at no cost, they do not want advertisers to track them or to aggregate the tracking data into so-called “big data” databases. Consequently, consumer digital privacy has been the subject of many recent news articles – from lawsuits filed by consumers against email service providers and social media platforms for undisclosed data mining to senatorial requests to data brokers for transparency.
In this four-part series, we will highlight some recent developments in consumer data privacy law and suggest steps marketers can take to address them.
Children’s Online Privacy Protection Act Amendments
The Children’s Online Privacy Protection Act (COPPA) is a federal statute enacted in 1998 that requires operators of commercial digital services to provide parental notification and obtain verifiable parental consent prior to collecting personal information from children under 13. To implement COPPA, the Federal Trade Commission (FTC) issued a set of regulations known as the Children’s Online Privacy Protection Rule (COPPA Rule). On December 19, 2012, the FTC released amendments to the COPPA Rule which became effective July 1, 2013.
The amended COPPA Rule enhances online privacy protection for children and makes digital services’ operators more accountable for data collection activities involving children under age 13. Notable for marketers is a new liability standard for third-party service providers. Specifically, effective July 1, 2013:
- The operator of “children-directed” (i.e., intended for children under age 13) online or mobile websites and services is strictly liable for the actions of independent third parties – including social media plug-ins – on or through its websites and mobile services if the third party is acting as its agent or service provider, or if the operator benefits from allowing the third party’s information collection; and
- A software plug-in, ad network or similar party that collects information on or through a third-party’s online or mobile website or service now is liable under COPPA if that party has actual knowledge it is collecting personal information on a children-directed platform.
The amended COPPA Rule makes several other key changes to the original COPPA Rule, including:
- An expanded definition of personal information to include geolocation information, a child’s photo or audio or video file, screen or user names, and persistent identifiers (such as information held in a cookie, an IP address or a mobile device ID number) that can be used to identify an individual consumer over time and across different websites or online services;
- Further clarification about the test for determining whether an online service is children-directed (which remains a highly fact-specific inquiry that depends on the totality of the circumstances);
- The addition of an age-screening safe harbor for online services that fit the directed-to-children criteria but do not target children as their primary audience;
- Streamlined disclosure requirements for parental notification and privacy statement regarding information practices with respect to children; and
- Expanded acceptable methods for obtaining verified parental consent.
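For operators whose services appeal to, but do not primarily target, children, the age-screening safe harbor above is typically implemented in software as a “neutral” age gate. The sketch below is illustrative only; the function names, return values and flow are hypothetical, and nothing in it should be read as a statement of what the Rule itself requires:

```python
from datetime import date

# Hypothetical sketch of a neutral age screen for COPPA purposes.
COPPA_AGE_THRESHOLD = 13  # COPPA applies to children under age 13

def is_under_coppa_age(birth_date, today=None):
    """Return True if the user is under 13 as of `today`."""
    today = today or date.today()
    # Subtract one year if the birthday has not yet occurred this year.
    age = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )
    return age < COPPA_AGE_THRESHOLD

def handle_signup(birth_date):
    """Route a new user based on a neutral age screen.

    A neutral age gate asks for a full birth date without hinting at
    the "passing" answer. If the user is under 13, no personal
    information should be collected until verifiable parental
    consent is obtained.
    """
    if is_under_coppa_age(birth_date):
        return "require_parental_consent"
    return "proceed_with_signup"
```

In practice, operators taking this approach also deter re-entry (for example, with a session cookie) so that a child who is screened out cannot simply go back and enter a different birth date.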
Action Step for Marketers
To ensure compliance with the amended COPPA Rule, all marketers need to evaluate their data collection activities with respect to children, both on their own digital services and on third-party digital services, to ensure that disclosures about data collection from children are accurate and up-to-date. Because operators of digital services directed to children are strictly liable for their third-party service providers’ data collection, and for third-party collection of which they have actual knowledge, marketers should also consider checking their services agreements to make sure that service providers’ compliance with the amended COPPA Rule (and data privacy and security laws in general) is covered by existing provisions.
For more information, please see our article in the Boston Bar Journal, “Protecting Children Online: New Compliance Obligations for Digital Marketing to Children.”