Part III of our 2015 predictions series comes from Of Digital Interest editor and McDermott partner, Heather Sussman, who predicts that states will be active with privacy and data security legislation during 2015.

States Active with Privacy and Data Security Legislation

With comparatively little movement from the federal government in 2014, state legislatures around the country have been working to take an active role in addressing the ever-increasing public concern over the collection, use, disclosure and disposal of personal information.  Of the 23 states that introduced or considered security breach notification legislation in 2014, at least 11 enacted their bills into law. Several bills remain pending in 2015 in state legislatures across the United States that may amend or impact the breach notification landscape.

For 2015, we predict action in the following states:

  • Both Massachusetts and New Jersey have pending bills that aim to further protect financial information, focusing on the breach of “access devices” associated with electronic transactions. Massachusetts SB 132 and New Jersey AB 1239 propose to add restrictions on data retention of certain financial information collected from access devices, as well as dictate how financial institutions will recover costs after a breach.
  • In Pennsylvania, the legislature is considering AB1329, which increases penalties for failure to report a breach to $5,000 for a first offense, $10,000 for a second offense, and $15,000 for a third or subsequent offense; AB2480, which requires certain notifications and free credit reports for six months following a breach; and AB3146/SB2188, which requires notification of a breach of online account credentials.
  • Two Rhode Island bills impact existing breach laws: HB 5769, which enumerates additional patients’ rights, including the right to be notified in the event of a breach of confidential health care information, and HB 7519, which mandates specific content in breach notifications to consumers.  Notifications would be required to include contact information for consumer reporting agencies and the Federal Trade Commission (FTC), a statement that an individual can obtain information regarding fraud alerts and security freezes, and a statement that warns against possible imposters who attempt to fraudulently notify individuals of security breaches.  This latter bill would also require providing one year of credit monitoring at no cost to individuals whose data are impacted in the breach.
  • Delaware also has two bills pending: SB101, which would clarify that a person who is a victim of a “Digital Data Breach” shall have seven years from the date the personal information is posted in which to bring a civil action for damages, and SB102, which would add name, birth date and address to the definition of personal information.  The latter bill also provides either of the following specific damages for breach victims, whichever is greater: consequential damages, profits derived from the unauthorized use, or both; or $1,000 per breach per person if no actual damages can be proven.  Punitive damages may be awarded against a person found to have willfully violated this Chapter.

In addition to legislation seeking to amend existing breach notification laws, we predict continued debate in 2015 on the following bills, with New York State’s Online Privacy Act having the greatest potential to change the privacy landscape for online businesses in 2015.


SB34: Automated License Plate Recognition Systems: Use of Data

  • Would impose specified requirements on an automated license plate recognition (ALPR) operator including, among others, ensuring that the information the ALPR operator collects is protected with certain safeguards, and implementing and maintaining specified security procedures and a usage and privacy policy with respect to that information.
  • Would require an ALPR operator that accesses or provides access to ALPR information to maintain a specified record of that access.
  • Would require an ALPR end-user to implement and maintain a specified usage and privacy policy.
  • Would authorize an individual who has been harmed by a violation of these provisions to bring a civil action against a person who knowingly caused that violation.
  • Would include information or data collected through the use or operation of an automated license plate recognition system, when that information is not encrypted and is used in combination with an individual’s name, in the definition of “personal information” discussed above.

SB26: Statewide Health Care Cost and Quality Database

  • Seeks to create a statewide health care cost and quality database comprised of health care performance information.
  • With a goal of making this database publicly available, timely and comprehensive, the bill would require all data disclosures pursuant to the bill’s provisions to comply with all applicable state and federal laws protecting privacy and security of data.
  • Also prohibits public disclosure of any unaggregated, individually identifiable health information.


HB298: An Act Relative to Identity Theft Protection

  • Creates a new chapter entitled “Consumer Breach Notification”
    • Defines “personal information,” “security breach,” “data collector,” et al.
    • Creates notice requirements (time, content, form).
    • Amends existing penalties available for such breaches.


SB5932: Student Information

  • Prohibits the release of personally identifiable student information where parental consent is not provided.

SB7358: New York State Online Privacy Act

  • Establishes the New York State Online Privacy Act; includes definitions, requirements to post a privacy policy, specifications for minors and responsibilities of operators, liability and enforcement.
  • Creates the Office of Privacy Protection to be headed by a Commission of Privacy and Protection and assisted by an advisory committee, with the powers and duties to provide guidance, make recommendations, develop programming, and receive complaints and undertake investigations.


HB5769/SB649: Confidentiality of Health Care Communications and Information

  • Creates a section on Patient’s Rights, which permits patients to obtain a copy of their confidential health care information and communications within 10 days of a request, to obtain a copy of a disclosure report, and to be notified of a breach.

We will continue to follow the status of these bills and report here on Of Digital Interest if any are enacted into law.

On 26 November 2014, the Article 29 Working Party adopted a working document on establishing a cooperation procedure for issuing common opinions on whether contractual clauses are compliant with the European Commission’s Model Clauses (Model Clauses).

The working document establishes the procedure in which companies wishing to use identical contractual clauses in different Member States for transfers of personal data outside the European Economic Area (EEA) are able to obtain a coordinated position from the relevant Data Protection Authorities (DPA) on the proposed contracts, without the need to approach each relevant DPA individually for approval.


Model Clauses represent one of the ways that a data controller can overcome the general prohibition contained in the EU Data Protection Directive (95/46/EC) on cross-border transfers of personal data to countries outside the EEA that do not offer adequate levels of data protection.  The Model Clauses are intended to be used without amendment – although some divergence, e.g., through the use of additional clauses having no impact on the overall compliance of the Model Clauses adopted, may be acceptable.

Company groups in Europe often use identical contractual clauses in different jurisdictions for the purposes of transfers out of the EEA.  However, differing implementation of the Data Protection Directive between Member States has resulted in the situation whereby some jurisdictions require DPA approval of the Model Clauses used (such as Austria, Denmark, France and Spain), whether used with or without amendment, whereas other jurisdictions do not require such DPA approval where the Model Clauses are used without amendment.  As a result, identical contracts using the Model Clauses with only minor amendment may be considered compliant by a DPA in one jurisdiction but not in others.

According to the Working Party, the purpose of this working document is to create a procedure allowing companies to obtain a coordinated position from the relevant DPAs when using identical contractual clauses based on the Model Clauses with minor amendment, in particular as to whether the contractual clauses are compliant with the Model Clauses.

The Process

Should a company wish to know whether its contract is compliant with the Model Clauses, under the proposed cooperation procedure, it will first need to ask the DPA it believes is entitled to act as the lead DPA to launch the EU cooperation procedure.

The company will then need to provide the lead DPA a copy of the contract, indicating the references to the Model Clauses together with any divergences and additional clauses, as well as a list of EEA countries from which the company will be carrying out the transfers.

The Lead DPA

The Working Party has suggested that the company should choose the lead DPA from a Member State in which the transfers will take place and it will be for the company to justify why the DPA should be considered the lead.  According to the Working Party, the following criteria should be considered by the company:

  1. The location from which the contractual clauses are decided and elaborated;
  2. The place where most decisions, in terms of the purposes and the means of the processing, are taken;
  3. The best location, in terms of management functions, administrative burden, etc., for the handling of the application and the enforcement of the contractual clauses;
  4. The Member States within the European Union from which most transfers outside the EEA will take place; and
  5. The location of the group’s European headquarters or the location of the company within the group with delegated data protection responsibilities.

The lead DPA will then have two weeks to respond to the company and accept the role of lead DPA.  If it accepts the role, it will forward all information to the other DPAs that will be involved in the process and identify the proposed reviewer DPAs.

The reviewer DPAs will need to respond to the lead DPA indicating whether they accept their roles and the lead DPA will be in charge of considering if the proposed contract is in conformity with the Model Clauses.

The Cooperation Procedure

Once the lead DPA has established whether the proposed contract is compliant with the Model Clauses, it will prepare a draft letter that it will share with the reviewers.  The reviewers will have one month to consider the letter and provide any comments.  The letter will then be sent to the other relevant DPAs, who will have one month to respond with any comments.  Should there be no response from these DPAs, they will be deemed to have agreed to the draft letter.

Once the process is complete, the lead DPA will sign the letter on behalf of the other DPAs and send the letter to the company, specifying whether the proposed contract conforms to the Model Clauses.  From this point, the EU cooperation procedure is closed and the letter will be taken into consideration as a basis for providing the necessary national authorisations or permits from the relevant national bodies.


This process should be welcomed at least as an attempt to ease the administrative burden on data controllers when they wish to transfer personal data overseas.  Under the process, data controllers will be able to obtain a common position from the relevant DPAs in the jurisdictions affected through a single point of contact, which should provide some comfort.  However, it should be noted that a positive letter from the lead DPA will not dispense with the need to obtain DPA authorisations in those jurisdictions that require them by law, nor necessarily guarantee that such authorisations will be granted.  Furthermore, it remains to be seen whether this will actually streamline the process or add further to delays.

On 5 November 2014, Peter Hustinx, the European Data Protection Supervisor (EDPS), together with Germany’s Federal Data Protection Commissioner, Andrea Voßhoff, held a panel discussion in respect of the state of play and perspectives on EU data protection reform.

Although participants identified a number of key outstanding issues to be resolved prior to the conclusion of the reform process, there was some optimism that such issues could be overcome, and the process completed, before the end of 2015.


The EDPS is an independent supervisory authority whose members are elected by the European Parliament and the Council in order to protect personal information and privacy, in addition to promoting and supervising data protection in the European Union’s institutions and bodies.  The role of the EDPS includes inter alia advising the European Commission, the European Parliament and the Council on privacy legislation and policies, and working with other data protection authorities (DPA) to promote consistent data protection throughout Europe.

The proposed data protection regulation is intended to replace the 1995 Data Protection Directive (95/46/EC) (the Directive) and aims not only to give individuals more control over their personal data, but also to make it easier for companies to work across borders by harmonising laws between all EU Member States.  The European Parliament and the Civil Liberties, Justice and Home Affairs (LIBE) Committee have driven the progress on new data protection laws, but there has been frustration aimed at the Council of Ministers for their slow progress.  Following the vote by the European Parliament in March 2014 in favour of the new data protection laws, the next steps include the full Ordinary Legislative Procedure (co-decision procedure), which requires the European Parliament and the Council to reach agreement together.

The panel discussion attendees were made up of institutional representatives and key figures involved in the EU Data Protection Reform Package, including: Stefano Mura (Head of the Department for International Affairs at Italy’s Ministry of Justice); Jan Albrecht MEP (Vice-Chair and Rapporteur of the European Parliament LIBE Committee); and Isabelle Falque-Pierrotin (President of CNIL and Chair of the Article 29 Working Party).  The purpose of the panel discussion was to consider the outstanding issues and next steps to finalise proposals on EU data protection reform, particularly in the context of the recent CJEU rulings on data retention and the right to be forgotten.

Key Messages

The key points raised during the panel discussion included:

  • There is optimism that the reform process will be completed in the next year subject to resolving outstanding issues, such as:
    • Whether public authority processing should be included in the proposed data protection regulation – Andrea Voßhoff commented that this issue was being considered by the Council of Ministers Committee in relation to the introduction of a clause preventing the lowering of standards by national laws.  Stefano Mura added that while there is a desire for both a uniform approach between the EU Member States and a right for Member States to regulate their own public sectors, a balance should be achievable; and
    • How to deal with the recent and expected future case law in respect of the right to be forgotten.  Andrea Voßhoff commented that guidelines were needed for such cases in order to balance data protection principles against freedom of expression.  Furthermore, it was noted that the proposed regulation should deal not only with access to such information but also with its creation.
  • The discussions on rules applicable to the public sector and dispute resolution may take some time due to difficulties of coordination at the national level.  However, the proposed Data Protection Board “one-stop shop” was suggested as being a good example of balancing the need for a uniform approach for data controllers while providing remedies for data subjects.
  • Adoption of the proposed data protection regulation needs to be completed in 2015 because inter alia:
    • The proposed regulation would provide unity and increased credibility to DPAs;
    • New compliance tools and stronger sanctions are required;
    • In the wake of the Snowden revelations, there is a public expectation of a uniform approach to European data protection; and
    • Technology and the digital economy have advanced significantly since EDPS’ 2007 Opinion that change to the Directive was unavoidable.
  • The use of pseudonymisation techniques on personal data should not result in a lower standard of protection applicable to such data.
  • Despite the urgency to finalise the proposed regulation, key issues should not be ignored, and a consensus should be achieved as to its scope and approach – otherwise the regulation risks being ineffective.


The Council is still reviewing the draft data protection regulation at a technical level, and negotiations on the proposed text between the Council and the European Parliament will only begin once the Council is ready.  The earliest that there could be agreement on the draft regulation is likely to be the first half of 2015 – the expectation would then be that the revised data protection framework will come into force in 2017.

Health care and life sciences companies increasingly operate in a digital environment. McDermott Will & Emery is hosting a complimentary four-part webinar series to explore the practical business considerations and to simplify the regulatory complexity of digital health, including health information technology (IT), big data, mobile health and telehealth.


Big Data Part I: Data-Driven Life Sciences Innovation, Personalized Medicine and Research

Date:  Tuesday, December 9
Time: 12:00 – 1:30 pm EST

Data-driven solutions are a powerful tool for the life sciences sector and research community. This program will explore how life sciences companies, research institutions, providers and informatics businesses can successfully navigate the regulatory landscape for big data to develop, introduce and differentiate products. The following key questions, among others, will be addressed:

  • Overview of Regulatory Framework — How can big data initiatives be structured to comply with key federal laws?
  • Data Registry Creation and Management — What are the data stewardship and data governance considerations?
  • Emerging Partnerships — What is the role of the government and industry for data initiatives in the life sciences?
  • Personalized Medicine — How is data used to deliver personalized medicine at the bedside?
  • Biospecimens — How does the introduction of annotated biospecimens affect the big data landscape?
  • Genomic Age – What does it mean to engage in big data initiatives in an age of genomic medicine?


Amy Hooper Kearbey, Partner, McDermott Will & Emery LLP

Matthew Hawryluk, Ph.D., Senior Director, Corporate & Business Development, Foundation Medicine, Inc.

Jennifer C. King, Ph.D., Director of Data Governance for CancerLinQ, American Society of Clinical Oncologists

Sari Heller Ratican, Chief Privacy Officer, Amgen


Jennifer S. Geetter, Partner, McDermott Will & Emery LLP


  • WEBINAR THREE: Big Data Part 2: Data-Driven Changes to Payment Models, January 13, 2015
  • WEBINAR FOUR: Mobile Health & Telehealth: Mobile and Telehealth Technology Create New Business Opportunities, February 10, 2015

Click here to view WEBINAR ONE: Health IT: Collection, analysis and sharing of health information

For more information, please contact McDermott Events.

The National Institute of Standards and Technology (NIST) has published draft recommendations aimed at securing the confidentiality of sensitive federal information located within non-federal entities’ information technology systems.  Draft Special Publication 800-171 includes draft recommendations intended to secure all “controlled unclassified information (CUI)” for non-federal entities doing business with, or for, the federal government.  CUI includes personally identifiable data, financial information, medical records and other sensitive data.

Many of the recommendations are currently in use on a voluntary and limited basis.  Requiring the additional security measures could directly affect thousands of contractors, related businesses, universities and nonprofits conducting business with, or research for, the federal government.

The deadline for submitting public comments on Draft Special Publication 800-171 is January 16, 2015. Find Draft Special Publication 800-171 here.

Will you be in the Bay Area on November 12?  You are invited to join Federal Trade Commission (FTC) Chairwoman Edith Ramirez at McDermott’s office in Menlo Park, California for a conversation on privacy and technology.  The FTC is celebrating its 100th anniversary and this will be the first time Chairwoman Ramirez is visiting the Bay Area since her appointment.  Come and ask the tough questions, join the lively conversation and mark this important visit by Chairwoman Ramirez as she talks about all things privacy and technology with some of the top tech teams in the country.  Please RSVP as space is limited.  A complimentary networking reception with Chairwoman Ramirez will immediately follow the program.

To register, please click here.

A recent ruling by the Ninth Circuit took an expansive view of vicarious liability under the Telephone Consumer Protection Act (TCPA).  Reversing the district court’s grant of summary judgment, the court in Gomez v. Campbell held that a marketing consultant could be held liable for text messages sent in violation of the TCPA, even though the marketing consultant itself had not sent the texts and even though the texts were sent on behalf of the marketing consultant’s client, not the consultant itself.

Among other things, the TCPA prohibits (with certain exceptions) the use of automatic telephone dialing systems in making calls to cellphones.  Both the Federal Communications Commission (FCC) and the courts have interpreted this provision to bar the use of automated systems to send unsolicited texts to cellphones.  In Gomez, the Campbell-Ewald Company had been hired by the Navy to conduct a multimedia recruiting campaign.  Campbell-Ewald had then outsourced the text-messaging component of the campaign to a third party, Mindmatics.  Mindmatics then allegedly sent text messages to the plaintiff and others who had not given consent.

On appeal, Campbell-Ewald raised two variations of the argument that it should not be held liable for texts that it had not itself sent.  First, Campbell-Ewald argued that it did not “make” or “initiate” any calls under the TCPA because Mindmatics had sent the texts.  As the statute only provides for liability for those that “make” or “initiate” prohibited calls, Campbell-Ewald argued that it could not be held liable.  Second, addressing another potential avenue of liability, Campbell-Ewald noted that the FCC had interpreted the TCPA to allow for liability against those “on whose behalf” unsolicited calls are made.  But, Campbell-Ewald argued, it could not be held liable on this ground either because the texts had been sent on behalf of its client, the Navy, not Campbell-Ewald.

In the end, the Ninth Circuit sidestepped both these arguments and found Campbell-Ewald potentially liable on a third basis, “ordinary tort-related vicarious liability rules.”  The court noted that where a statute is silent on vicarious liability—as the court judged the TCPA to be—traditional common law standards of vicarious liability apply.  Thus, the court held, Campbell-Ewald could be liable under the TCPA based on the agency relationship between Campbell-Ewald and Mindmatics.  The court further noted that the FCC had stated that the TCPA imposes liability “under federal common law principles of agency,” and held that the FCC’s interpretation was entitled to deference.

Finally, the court noted that it made little sense to subject both the actual sender and the ultimate client to liability, while absolving the middleman marketing consultant, noting, “a merchant presumably hires a consultant in part due to its experience in marketing norms.”

The decision reinforces the importance for companies to closely monitor anyone sending texts or placing calls on their behalf or at their direction.  Following Gomez, it is clear that any company that had a role in sending unsolicited calls or texts can potentially be held liable under the TCPA; and the company with the deepest pockets usually becomes the target, no matter how minimal its role in the alleged violation.

On 18 September 2014, the European Union’s Article 29 Data Protection Working Party published a press release outlining its recent plenary session discussions on the so-called “right to be forgotten” or “de-listed.”

The Working Party identifies that search engines, as data controllers, are under an obligation to acknowledge requests to be de-listed and establishes amongst European data protection authorities a “tool box” for ensuring a common approach to complaints handling in the case of refusals to de-list.


The Working Party, made up of EU member state national data protection authorities, is an independent advisory body on data protection and privacy, set up under Article 29 of the Data Protection Directive (95/46/EC) (DPD) in order to contribute to the DPD’s uniform application.

The purpose of its latest plenary session held on 16 and 17 September 2014 was to discuss the aftermath of the European Court of Justice’s (ECJ) May 2014 ruling which recognised an EU citizen’s right to have the results of searches conducted against their name and containing their personal information removed where such information was inaccurate, inadequate, irrelevant or excessive for the purposes of data processing.

Key Messages

The Working Party has acknowledged that there is high public demand for the right to be forgotten, based on the number of complaints received by European data protection authorities relating to refusals by search engines to de-list since the ECJ ruling.

The Working Party has agreed that there is a need for a uniform approach to the handling of de-listing complaints.  As such the Working Party has proposed that:

  • It is necessary to put in place a network of dedicated contact persons within European data protection authorities to develop common case-handling criteria; and
  • Such a network will provide data protection authorities with a record of decisions taken on complaints and a dashboard to assist in reviewing similar, new or more difficult cases.

Going forwards the Working Party also confirmed that it will continue to review how search engines comply with the ECJ’s ruling, having already held a consultation process with search engines and media companies over the summer.

On 10 September 2014, the Global Privacy Enforcement Network (GPEN) published the results of its privacy enforcement survey or “sweep” carried out earlier in 2014 with respect to popular mobile apps.  The results of the sweep are likely to lead to future initiatives by data protection authorities to protect personal information submitted to mobile apps.

The purpose of the sweep was to determine the transparency of the privacy practices of some 1,211 mobile apps and involved the participation of 26 data protection authorities across the globe.  The results of the sweep suggest that a high proportion of the apps downloaded did not sufficiently explain how consumers’ personal information would be collected and used.


GPEN was established in 2010 on the recommendation of the Organisation for Economic Co-operation and Development.  GPEN aims to create cooperation between data protection regulators and authorities throughout the world in order to strengthen personal privacy globally.  GPEN is currently made up of 51 data protection authorities across some 39 jurisdictions.

Over the course of a week in May 2014, GPEN’s “sweepers” – made up of 26 data protection authorities across 19 jurisdictions, including the UK Information Commissioner’s Office (ICO) – participated in the survey by downloading and briefly interacting with the most popular apps released by developers in their respective jurisdictions, in an attempt to recreate a typical consumer’s experience.  In particular GPEN intended the sweep to increase public and commercial awareness of data protection rights and responsibilities as well as identify specific high-level issues which may become the focus of future enforcement actions and initiatives.

Sweep Results

The key negative findings of the GPEN sweep include:

  • 85 percent of apps failed to clearly explain how personal information would be processed.
  • 59 percent of apps did not clearly indicate basic privacy information (with 11 percent failing to include any privacy information whatsoever).
  • 31 percent of apps were excessive in their permission requests to access personal information.
  • 43 percent of the apps had not sufficiently tailored their privacy communications for the mobile app platform – often instead relying on full version privacy policies found on websites.

However, the sweep results also highlighted a number of examples of best practices for app developers, including:

  • Many apps provided clear, easy-to-read and concise explanations about exactly what information would be collected, how and when it would be used and, in some instances, explained specifically and clearly what would not be done with the information collected.
  • Some apps provided links to the privacy policies of their advertising partners and opt-out elections in respect of analytic devices.
  • There were good examples of privacy policies specifically tailored to the app platform, successfully making use of just-in-time notifications (warning users when personal information was about to be collected or used), pop-ups and layered information, allowing for consumers to obtain more detailed information if required.

Many of the GPEN members are expected to take further action following the sweep results.  For its part, the UK ICO has commented that in light of the above results, it and other GPEN members intend to write to developers identified as deficient.  The Belgian Privacy Commission has, in addition, confirmed that cases of gross violations of data protection law identified in the sweep would be forwarded to and dealt with by the relevant authorities.

The Federal Trade Commission (FTC) announced last week that Yelp – the online service through which consumers can read and write reviews about local businesses – has agreed to pay $450,000 to settle the FTC’s charges that Yelp knowingly and without verifiable parental consent (VPC), collected personal information from children under the age of 13 through its mobile app in violation of the federal law, the Children’s Online Privacy Protection Act (COPPA).

COPPA was enacted in 1998. The FTC, which is responsible for enforcing COPPA, implemented regulations in April 2000 that are known as the COPPA Rule. The FTC issued an amended COPPA Rule in December 2012, which became effective July 1, 2013. 

In general, COPPA and the COPPA Rule prohibit operators of websites, mobile applications or other digital services (collectively, “digital services”) from knowingly collecting personal information from children under age 13 unless and until the digital service operator has VPC. 

Under the amended COPPA Rule, COPPA has a broader scope than digital service operators may realize.  COPPA applies not only to digital services that are directed to children, but also to any general-audience digital service when the operator of the digital service has “actual knowledge” that the digital service is collecting personal information from children under age 13 without VPC.

COPPA does not require operators of general-audience digital services to ask users for age or date of birth information but, under the actual knowledge test, if the digital service collects information that establishes that a user is under 13, the digital service must be COPPA compliant, which means (among other requirements) obtaining VPC before collecting personal information from the under-age-13 user.

The FTC concluded that Yelp had “actual knowledge” that it was collecting personal information from children under age 13 because the registration page on Yelp’s app asked users to enter their date of birth but did not block access to the app for users who were too young (i.e., under age 13).   

Key Takeaway: If your general-audience digital service asks a user for his or her birth date, make sure that a user who is under age 13 is blocked from using the digital service.  Also, to help prevent users who are too young from circumventing the block, consider one or all of the following techniques:

  1. Request birth date in a neutral manner, i.e., without a prompt that reveals the age of eligibility, such as “You must be age 13 or older to register.”
  2. Present a neutral on-screen error message when a user is under age 13, such as “Sorry, you’re not eligible,” rather than “Sorry, you are under age 13.”
  3. Deploy a cookie or other functionality to prevent an under-age user whose access was blocked from using the back button (or similar technique) to re-enter an old-enough birth date.
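The three techniques above can be sketched in code.  The following is a minimal, hypothetical illustration (not taken from Yelp or any FTC guidance): the registration handler computes the user’s age from a neutral birth-date prompt, returns a neutral ineligibility message, and honors a persistent “blocked” flag (which in practice would be stored in a cookie or similar mechanism) so that a blocked user cannot simply go back and enter an older birth date.

```python
from datetime import date

MIN_AGE = 13  # COPPA threshold for verifiable parental consent


def age_on(birth_date: date, today: date) -> int:
    """Full years elapsed between birth_date and today."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)


def gate_registration(birth_date: date, today: date,
                      already_blocked: bool) -> dict:
    """Hypothetical age gate applying the three techniques:
    the form collects birth date neutrally (technique 1, not shown),
    under-age users receive a neutral error message (technique 2),
    and a persistent block flag keeps a previously rejected user
    blocked even if they re-enter an older birth date (technique 3)."""
    if already_blocked or age_on(birth_date, today) < MIN_AGE:
        return {
            "allowed": False,
            "set_block_flag": True,  # caller persists this, e.g., in a cookie
            "message": "Sorry, you're not eligible.",  # neutral wording
        }
    return {"allowed": True, "set_block_flag": False, "message": ""}
```

Note that the error message deliberately does not mention age 13, and the block flag is checked before the birth date, so a second attempt with a fabricated date still fails.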