If you haven’t heard about the newest gaming craze yet, it’s based on what is called “augmented reality” (AR), and it could affect both your home life and your workplace, as such games allow users to “photograph” imaginary items overlaid on objects in the real world. An augmented reality game differs from “virtual reality” in that it mixes elements of the real world with avatars, made-up creatures, fanciful landscapes and the like, rather than presenting a completely fictional scenario. Whether you play such games yourself or are merely nearby while others play, here are a few things to think about as an active participant, along with some tips regarding the intellectual property and confidentiality issues that arise when others play the game around you.

Augmented reality games are typically played through a smartphone app, and some allow the user to capture images of the player’s experience and post them on social media, text them to friends or save them to the phone’s camera roll. However, special glasses or other vehicles could deliver the augmented reality experience in contexts beyond gaming. For example, technology in this area is rapidly advancing toward allowing users to link up and “experience” things together well beyond what exists in the real world, that is, a “mixed world” experience, if you will. These joint holographic experiences are just one facet of the direction augmented reality is taking.

As always, with new technological advancements, there are some caveats to using AR that you should be aware of.

Trademarks

If a company’s trademark is visible in a photo of your AR experience, be mindful that you do not run afoul of trademark law. For the same reasons that some trademarks are blurred out on TV shows, you should not publish such photos on social media in any fashion that might draw negative attention from the trademark owner. Even if you are not selling competing goods, you could potentially be liable for trademark infringement. There is another, perhaps more important, reason not to post such photos, discussed below, that can give rise to a second cause of action arising from the same photo: the right of publicity, a personal right that is treated in vastly different ways from state to state.

Right of Publicity

The right of publicity (ROP) protects everyone from misappropriation of his or her name, likeness, voice, image or other recognizable element of personal identity. It is protected by state law, and its treatment varies greatly from state to state. For example, some states protect a person’s ROP post-mortem, whereas others have no protection whatsoever. Given the ease with which still or moving images can be reproduced and posted on the Internet, it is critical to consider a posting from a ROP standpoint before you upload an image to a social media account. For instance, if your photo features your best friend in a shared AR experience, she may not object to your posting it to one of your social media accounts. However, if a brand-name clothing manufacturer reposts it and uses the momentum of the AR craze to show how game players, or the avatars and creatures within the game, are attracted to its brand of clothing, the result could be not just an issue with the game developer, but also with your best friend, who may now be the unwitting spokesmodel for that brand. Essentially, the manufacturer would receive a free endorsement without ever having to negotiate with your best friend. In many states, she would have a ROP cause of action against the clothing manufacturer for commercial use of her image without her permission. This is especially dangerous if the best friend is a minor and her parents have not consented to the use of her image. As you can see, the landscape is fraught with potential pitfalls unless you are a news reporting agency or the like and your actions clearly fall under the First Amendment/free speech exception.

Confidential Information

One very important aspect of an AR game is the player’s ability to capture a photograph of the scene being explored or of the user’s personal experience in a real-world setting (e.g., it could show your desk at work, but in an outer space setting, or your car dashboard with the view from the driver’s perspective out the windshield showing a fairyland with mythical creatures in the distance). In taking these mixed virtual/real-world photos, however, it is essential to be mindful of your surroundings. Doctors, lawyers, mental health professionals, bankers and others with a heightened fiduciary duty to their clients must ensure that, if they are taking such photos, no confidential information that would breach such duties is captured in them. Whether taken in the app itself or in screenshot form, these photos could prove problematic if they are automatically uploaded to the cloud or retained in the app. For example, a judge recently tweeted that defense counsel had been playing an AR game in the courtroom while court was in session. Setting aside the appropriateness of such behavior, query whether such actions violate confidentiality rules.

For all such professionals, there are governing rules on the treatment of certain types of confidential information (the Gramm-Leach-Bliley (GLB) Act, the Health Insurance Portability and Accountability Act (HIPAA), etc.). If the game is set to capture images of the AR characters or scenes in the real world, then anything within the player’s view or in the surrounding area is captured in the photograph along with the character. To the extent that confidential personal information or trade secret information is captured, this is a problem. The quick fix is to set the game to use a fully virtual background rather than an AR one, a feature that some AR games already offer. Although this is arguably less fun, it mitigates the danger of capturing sensitive data on your camera roll or in the cloud, or of accidentally posting it, any of which could have very serious consequences.

In summary, the new AR games are wildly popular and likely are here to stay. Given that, it’s best to be mindful of your surroundings and make sure that you, and those around you, are playing responsibly.

Recent comments linking digital health tools to so-called “snake oil” have the channels of social media atwitter.  (Add this post to the noise!)  While some may decry the comparison, there is a lot we can learn from that perspective.

One of the challenges of broad digital health adoption is the simple fact that digital health encompasses such a broad array of technologies, usages and purposes.  There is no one tonic that will cure a list of ailments; rather, we are presented with shelves of solutions to even more shelves of challenges waiting to be addressed.  Digital health includes, by my definition, the application of social media tools to preventative health and chronic disease management measures, as well as highly sophisticated data analytics applied to massive amounts of population health data to identify important health trends.  It also includes home monitoring devices that keep health care providers informed of their patients’ at-home health conditions, as well as telestroke programs that allow physicians to access needed expertise.  The list is potentially endless, as new technologies are created to address health issues and existing technologies are put to use in the health care context.

Earlier today, the Court of Justice of the European Union (CJEU) announced its determination that the U.S.-EU Safe Harbor program is no longer a “safe” (i.e., legally valid) means for transferring personal data of EU residents from the European Union to the United States.

The CJEU determined that the European Commission’s 2000 decision (Safe Harbor Decision) validating the Safe Harbor program did not and “cannot eliminate or even reduce the powers” available to the data protection authority (DPA) of each EU member country. Specifically, the CJEU opinion states that a DPA can determine for itself whether the Safe Harbor program provides an “adequate” level of personal data protection (i.e., “a level of protection of fundamental rights and freedoms that is essentially equivalent to that guaranteed within the European Union” as required by the EU Data Protection Directive (95/46/EC)).

The CJEU based its decision invalidating the Safe Harbor Decision in part on its determination that the U.S. government conducts “indiscriminate surveillance and interception carried out … on a large scale”.

The plaintiff in the case that gave rise to the CJEU opinion, Maximilian Schrems (see background below), issued his first public statement praising the CJEU for a decision that “clarifies that mass surveillance violates our fundamental rights.”

Schrems also made reference to the need for “reasonable legal redress,” referring to the U.S. Congress’ Judicial Redress Act of 2015. The Judicial Redress Act, which has bipartisan support, would allow EU residents to bring civil actions in U.S. courts to address “unlawful disclosures of records maintained by an [U.S. government] agency.”

Edward Snowden also hit the Twittersphere with “Congratulations, @MaxSchrems. You’ve changed the world for the better.”

Background

Today’s CJEU opinion invalidating the Safe Harbor program follows on the September 23, 2015, opinion from the advocate general (AG) to the CJEU in connection with Maximilian Schrems vs. Data Protection Commissioner.

In June 2013, Maximilian Schrems, an Austrian student, filed a complaint with the Irish DPA. Schrems’ complaint related to the transfer of his personal data collected through his use of Facebook. Schrems’ Facebook data was transferred by Facebook Ireland to Facebook USA under the Safe Harbor program. The core claim in Schrems’ complaint is that the Safe Harbor program did not adequately protect his personal data, because Facebook USA is subject to U.S. government surveillance under the PRISM program.

The Irish DPA rejected Schrems’ complaint because Facebook was certified under the Safe Harbor Program. Schrems appealed to the High Court of Ireland, arguing that the Irish (or any other country’s) DPA has a duty to protect EU citizens against privacy violations, like access to their personal data as part of U.S. government surveillance. Since Schrems’ appeal relates to EU law (not solely Irish law), the Irish High Court referred Schrems’ appeal to the CJEU.

What This Means for U.S. Business

The invalidation of the Safe Harbor program, which is effective immediately, means that a business that currently relies on the Safe Harbor program will need to consider another legally valid means of transferring personal data from the EU to the United States, such as EU-approved model contractual clauses or binding corporate rules.

We believe, however, that this is not the final chapter in the Safe Harbor saga. Please check back soon for more details and analysis.

Is a social media promotion part of your organization’s branding plans? Please join Julia Jacobson (McDermott partner and Of Digital Interest editor) and her co-panelists next Tuesday, July 28, 2015, at 2:00 pm for “Sweeps, Contests & Games in Social Media”. The webinar, the second in a three-part series hosted by the Brand Activation Association (a division of the Association of National Advertisers (ANA)), will explore endorsement, intellectual property and privacy legal issues, as well as the practical aspects of balancing brand objectives with compliance needs, participation verification and fulfillment.

For more information, please click here.

On 11 May 2015, the UK Information Commissioner’s Office (ICO), the French data protection authority (CNIL) and the Office of the Privacy Commissioner of Canada (OPCC) announced their participation in a new Global Privacy Enforcement Network (GPEN) privacy sweep to examine the data privacy practices of websites and apps aimed at or popular among children. This closely follows the results of GPEN’s latest sweep of mobile applications (apps), which suggested that a high proportion of apps collected significant amounts of personal information but did not sufficiently explain how consumers’ personal information would be collected and used. We originally reported on the mobile apps sweep back in September 2014.

According to the CNIL and the ICO, the purpose of this sweep is to develop a global picture of the privacy practices of websites and apps aimed at or frequently used by children. The sweep seeks to instigate recommendations or formal sanctions where non-compliance is identified and, more broadly, to provide valuable privacy education to the public and parents, as well as to promote best privacy practice in the online space.

Background

GPEN was established in 2010 on the recommendation of the Organisation for Economic Co-operation and Development. GPEN aims to create cooperation between data protection regulators and authorities throughout the world in order to globally strengthen personal privacy. GPEN is currently made up of 51 data protection authorities across some 39 jurisdictions.

According to the ICO, GPEN has identified a growing global trend for websites and apps targeted at (or used by) children. This represents an area that requires special attention and protection. From 12 to 15 May 2015, GPEN’s “sweepers”—comprised of 28 volunteering data protection authorities across the globe, including the ICO, CNIL and the OPCC—will each review 50 websites and apps popular among children (such as online gaming sites, social networks, and sites offering educational services or tutoring). In particular, the sweepers will seek to determine, inter alia:

  • The types of information being collected from children;
  • The ways in which privacy information is explained, including whether it is adapted to a younger audience (e.g., through the use of easy to understand language, large print, audio and animations, etc.);
  • Whether protective controls are implemented to limit the collection of children’s personal information, such as requiring parental permission prior to use of the relevant services or collection of personal information; and
  • The ease with which one can request that personal information submitted by children be deleted.

Comment

We will have to wait some time for in-depth analysis of the sweep, as the results are not expected to be published until Q3 of this year. As with previous sweeps, following publication of the results we can expect data protection authorities to issue new guidance, write to those organisations identified as needing to improve, and take more formal action where appropriate.

On the third anniversary of the EU Commission’s proposed new data protection regime, the UK ICO has published its thoughts on where the new regime stands. The message is mixed: progress in some areas but nothing definitive, and no real clarity as to when the new regime may come into force.

The legislative process involves the agreement of the European Commission, the European Parliament and the Council of the European Union (representing the governments of the member states). So far, the European Parliament has agreed its amendments to the Commission’s proposal, and we are still waiting for the Council to agree its amendments before all three come together and try to find a mutually agreeable position.

The Council is guided by the mantra “nothing is agreed until everything is agreed.” Even though the Council has reached “partial general agreement” on international transfers, on risk-based obligations on controllers and processors, and on the provisions relating to specific data processing situations such as research, and has agreed an approach on the one-stop-shop principle (allowing those operating in multiple member states to appoint and deal with a single authority), this progress means nothing until there is final agreement on everything. At this stage, all informal agreements remain open to renegotiation.

It is worth noting that Latvia holds the presidency of the Council until June 2015. The Latvians have already noted that data protection reform remains a key priority, but progress has been slow and time may be against them. Where Latvia fails, Luxembourg will hopefully succeed as it takes up the presidency from June.

The ICO is urging all stakeholders to push on with the reform, although it sees the proposed timetable of completing the trilogue process by the end of 2015 as optimistic. A more reasonable timetable may be final agreement by mid-2016, with the new regime up and running in 2018.

In 2014, regulators around the globe issued guidelines, legislation and penalties in an effort to enhance security and control within the ever-shifting field of privacy and data protection. The Federal Trade Commission confirmed its expanded reach in the United States, and Canada’s far-reaching anti-spam legislation takes full effect imminently. As European authorities grappled with the draft data protection regulation and the “right to be forgotten,” the African Union adopted the Convention on Cybersecurity and Personal Data, and China improved the security of individuals’ information in several key areas. Meanwhile, Latin America’s patchwork of data privacy laws continues to evolve as foreign business increases.

This report furnishes in-house counsel and others responsible for privacy and data protection with an overview of key action points based on these and other 2014 developments, along with advance notice of potential trends in 2015. McDermott will continue to report on future updates, so check back with us regularly.

Read the full report here.

For those Of Digital Interest readers attending the Brand Activation Association’s (BAA) 36th Annual Marketing Law Conference, please join McDermott partner – and Of Digital Interest editor – Julia Jacobson as she moderates a panel titled “New and Unexpected: Developments in Mobile Marketing – Mobile Tracking, Apps and Mobile Payments.” She will be joined by Ira Schlussel of HelloWorld, Inc., Paul Twarog of Google Inc. and co-moderator Terese Arenth. The panel session starts at 3:20 pm on Thursday, November 6.  We hope to see you there.

At the end of September, California Governor Edmund G. Brown, Jr. approved six bills designed to enhance and expand California’s privacy laws. These new laws are scheduled to take effect in 2015 and 2016.  It will be important to be mindful of these new laws and their respective requirements when dealing with personal information and when responding to data breaches.

Expansion of Protection for California Residents’ Personal Information – AB 1710

Under current law, any business that owns or licenses certain personal information about a California resident must implement reasonable security measures to protect the information and, in the event of a data or system breach, must notify affected persons.  See Cal. Civil Code §§ 1798.81.5-1798.83.  Current law also prohibits individuals and entities from posting, displaying, or printing an individual’s social security number, or requiring individuals to use or transmit their social security number, unless certain requirements are met.  See Cal. Civil Code § 1798.85.

The bill makes three notable changes to these laws.  First, in addition to businesses that own or license personal information, businesses that merely maintain personal information must now comply with the law’s security and notification requirements.  Second, in the event of a security breach, businesses must not only notify affected persons but also provide “appropriate identity theft prevention and mitigation services” to them at no cost for at least 12 months, if the breach exposed or may have exposed specified personal information.  Third, in addition to the current restrictions on the use of social security numbers, individuals and entities may no longer sell, advertise to sell, or offer to sell any individual’s social security number.

Expansion of Constructive Invasion of Privacy Liability – AB 2306

Under current law, a person can be liable for constructive invasion of privacy if the person uses a visual or auditory enhancing device to attempt to capture any type of visual image, sound recording or other physical impression of another person engaged in a personal or familial activity under circumstances in which that person had a reasonable expectation of privacy.  See Cal. Civil Code § 1708.8.

The bill expands the reach of the current law by removing the limitation requiring the use of a “visual or auditory enhancing device” and imposing liability if the person uses any device to capture a visual image, sound recording, or other physical impression of a person in a personal or familial activity under circumstances in which the person had a reasonable expectation of privacy.

The law will also continue to impose liability on those who acquire the image, sound recording, or physical impression of the other person, knowing that it was unlawfully obtained.  Those found liable under the law may be subject to treble damages, punitive damages, disgorgement of profits and civil fines.

Protection of Personal Images and Videos (“Revenge Porn” Liability) – AB 2643

Assembly Bill 2643 creates a private right of action against a person who intentionally distributes by any means, without consent, material that exposes a person’s intimate body parts or the person engaging in certain sexual acts, with knowledge that the victim had a reasonable expectation that the material would remain private.

Protection of Student’s Online Personal Information – The Student Online Personal Information Protection Act – SB 1177

The Student Online Personal Information Protection Act (SOPIPA) prohibits an operator of an Internet website, online service, online application or mobile application that is used, designed and marketed primarily for K-12 school purposes from (1) knowingly engaging in targeted advertising to students or their parents or guardians on the site, service, or application, (2) engaging in targeted advertising on a different site, service, or application using any information that was acquired from the operator’s site, service or application, (3) using information created or gathered by the operator’s site, service, or application to generate a profile about a student, (4) selling a student’s information, and (5) disclosing certain information pertaining to a student.   The law also requires the operator to maintain reasonable security measures to protect the student’s information from unauthorized access, destruction, use, modification or disclosure.

Protection of Students’ Social Media Information – AB 1442

Assembly Bill 1442 regulates the use of students’ social media information.  If a school intends to implement a program to gather students’ social media information, the school must notify students and parents or guardians about the proposed program and provide an opportunity for public comment.  If the program is adopted, the school must only gather or maintain information that pertains directly to school or student safety, provide the student with access to his or her information and an opportunity to correct or delete such information, destroy information after the student turns 18 or is no longer enrolled at the school, and notify each parent or guardian that the student’s social media information is being collected.

It is important to note that the law also imposes requirements on third parties that are retained by schools to gather the social media information of students.  Under the law, a third party may not use the information for any purpose other than to satisfy the contract, may not sell or share the information and must destroy the information immediately upon conclusion of the contract.

Protection of Students’ Records in Digital Storage Services – AB 1584

Assembly Bill 1584 permits a school to use a third party for the digital storage, management and retrieval of student records, to provide digital educational software, or both.   In order to protect those records, any such contract with a third party must contain certain provisions, including a statement that all of the records remain the property of and under the control of the school, a description of the procedures that will be used to notify affected students, parents or guardians in the event of any unauthorized disclosure, a prohibition against using any students’ information for any purposes other than those required by the contract, and a certification that students’ information will not be retained by or available to the third party upon completion of the contract.

Key Takeaways

California continues to be a leader when it comes to protecting data privacy.  Given these recent expansions to California’s privacy laws, it is and will continue to be important when dealing with any individualized personal information to be aware of the type of information involved, the source of the information, security measures in place to protect the information and the appropriate steps that will need to be taken if any security measures are compromised.