New technologies and the expansion of the Internet of Things have allowed children of this generation to experience seamless interactive technologies through microphones, GPS devices, speech recognition, sensors, cameras and other technological capabilities. These advancements create new markets for entertainment and education alike and, in the process, collect endless amounts of data from children, from their names and locations to their likes/dislikes and innermost thoughts.

The collection of data through this Internet of Toys is on the tongues of regulators and law enforcement, who are warning parents to be wary when purchasing internet-connected toys and other devices for children. These warnings also extend to connected toy makers, urging companies to comply with children’s privacy rules and signaling that focused enforcement is forthcoming.

Federal Trade Commission Makes Clear That Connected Toy Makers Must Comply with COPPA

On June 21, 2017, the Federal Trade Commission (FTC) updated its guidance for companies required to comply with the Children’s Online Privacy Protection Act (COPPA) to ensure those companies implement key protections with respect to Internet-connected toys and associated services. While the FTC’s Six-Step Compliance Plan for COPPA compliance is not entirely new, there are a few key updates that reflect developments in the Internet of Toys marketplace.

On May 3, 2017, the Creating Opportunities Now for Necessary and Effective Care Technologies for Health Act of 2017 (S. 1016) (CONNECT Act of 2017) was reintroduced by the same six senators who had initially introduced the legislation in early 2016, and was referred to the Senate Committee on Finance. As we previously reported on February 29, 2016, this iteration of the proposed bill also focuses on promoting cost savings and quality care under the Medicare program through the use of telehealth and remote patient monitoring (RPM) services, and on incentivizing such digital health technologies by expanding coverage for them under the Medicare program, albeit using different terminology. Chiefly, the CONNECT Act of 2017 would expand telehealth and RPM for Medicare beneficiaries, make it easier for patients to connect with their health care providers, and help reduce costs for patients and providers. As with the previous iteration, the CONNECT Act of 2017 has received statements of support from over 50 organizations, including the American Medical Association, American Telemedicine Association, Healthcare Information and Management Systems Society, Connected Health Initiative, Federation of State Medical Boards, National Coalition on Health Care and an array of vendors and health systems.

Because Texas was one of the last states to retain highly restrictive (and arguably anti-competitive) telemedicine practice standards, health care providers, regulatory boards, technology companies, payors and other stakeholders have been actively monitoring the state’s approach to telemedicine regulation and the related Teladoc case. Texas has now eliminated its most restrictive requirement for delivering care via telemedicine, increasing opportunities for providers to reach patients using technology. Senate Bill 1107 was passed on May 11, 2017, with an amendment added by the House, and the Senate approved the amended bill on May 18. The bill was signed into law by Governor Abbott last weekend.


Texas telehealth requirements will significantly change in the near future if Texas Senate Bill 1107 is passed into law, as it removes the controversial “face-to-face” or in-person consultation requirement to establish a physician-patient relationship and lawfully provide telehealth and telemedicine services within the state. This bill comes after a six-year-long battle between telemedicine stakeholders and the Texas Medical Board, and will better align Texas’ regulations with those found in other states.


On August 3, 2016, the Federal Trade Commission (FTC) staff submitted public comments regarding the Delaware Board of Occupational Therapy Practice’s proposed regulation for the provision of occupational therapy services via telehealth in Delaware (the Proposed Regulation).  The FTC’s comments to the Proposed Regulation follow its comments to Alaska’s telehealth legislation earlier this year and evidence its continued focus on telehealth’s ability to foster flexibility in health care delivery by increasing practitioner supply; encouraging competition; and improving access to affordable, quality health care.

By way of background, in 2015, Delaware amended its Insurance and Professions and Occupations Code (the Code) to include the regulation of telehealth and telemedicine services, including the delivery of occupational care remotely under existing, in-person standards of care.  Consistent with the Code, the Delaware Board of Occupational Therapy Practice (the Board) revised its rules and regulations to address telehealth services.  The Proposed Regulation defines telehealth as “the use of electronic communications to provide and deliver a host of health-related information and health care services, including occupational therapy related information and services, over electronic devices. Telehealth encompasses a variety of occupational therapy promotion activities, including consultation, education, reminders, interventions, and monitoring of interventions.”

The Proposed Regulation gives Occupational Therapist and Occupational Therapist Assistant licensees (Licensees) discretion in assessing and determining the appropriate level and type of care for an individual patient, provided that certain requirements are satisfied. Specifically, under the Proposed Regulation, Licensees that provide treatment through telehealth must have an active Delaware license in good standing to practice telehealth in the state of Delaware. In addition to obtaining informed consent and complying with confidentiality requirements, the licensee must also: (1) be responsible for determining and documenting that telehealth is an appropriate level of care for the patient; (2) comply with the Board’s rules and regulations and all current standards of care requirements applicable to onsite care; (3) limit the practice of telehealth to the area of competence in which proficiency has been gained through education, training and experience; (4) determine the need for the physical presence of an occupational therapy practitioner during any interactions with patients, if he/she is the Occupational Therapist who screens, evaluates, writes or implements the plan of care; (5) determine the amount and level of supervision needed during the telehealth encounter; and (6) document in the file or record which services were provided remotely. (24 Del. Admin. Code § 2000-4.2.)

Staff of the FTC’s Office of Policy Planning and its Bureaus of Competition and Economics, responding to the Board’s request for public comments, stated that by not imposing rigid and unwarranted in-person care and supervision requirements, the Proposed Regulation could have various positive impacts, including: (1) improving access to cost-effective, quality care, especially for patients with limited mobility; (2) reducing Medicaid’s transportation expenditures as well as individuals’ pecuniary and time costs; (3) addressing anticipated workforce shortages in the health care sector by increasing practitioner supply and facilitating care of an aging population; and (4) enhancing competition, consumer choice and access to care.

The FTC did, however, recommend that the Proposed Regulation be clarified with respect to the scope of practice of Occupational Therapist Assistants. The determination of the appropriateness of remote care and decisions about the amount and level of supervision during a telehealth encounter are expressly restricted to Occupational Therapists, while all other requirements also apply to Occupational Therapist Assistants. The FTC noted that the ambiguities regarding the role of Occupational Therapist Assistants in telehealth evaluations, and in the determination of whether to use telehealth, could discourage their participation in telehealth care.

In March 2016, the US Federal Trade Commission (“FTC”) staff submitted public comments regarding the telehealth provisions of a proposed state bill in Alaska, demonstrating the FTC’s continued focus on health care competition and its general discouragement of anticompetitive conduct in health care markets, as well as a renewed interest in and focus on telehealth.


On January 6, the Federal Trade Commission (FTC) released a report that it hopes will educate organizations on the important laws and research that are relevant to big data analytics. The report, Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues, looks specifically at how big data is used after it is collected and analyzed and provides suggestions aimed at maximizing the benefits and minimizing the risks of using big data.

Risks and Rewards

The report argues that big data analytics can provide numerous opportunities for improvements in society. In addition to more effectively matching products and services to consumers, big data can create opportunities for low income and underserved communities. The report highlights a number of innovative uses of big data that provide benefits to underserved populations, such as increased educational attainment, access to credit through nontraditional methods, specialized health care for underserved communities, and better access to employment.

At the same time, the report shows that potential inaccuracies and biases might lead to detrimental effects for low-income and underserved populations. For example, organizations  could use big data to inadvertently exclude low-income and underserved communities from credit and employment opportunities, which may reinforce existing disparities or weaken the effectiveness of consumer choice.

Considerations for Using Big Data

The report outlines some of the consumer protection laws (in particular, the Fair Credit Reporting Act and FTC Act)  and equal opportunity laws that apply to the use of big data, especially with regard to possible issues of discrimination or exclusion. It also recommends that an organization consider the following questions to help ensure that its use of big data analytics does not lead to unlawful exclusion or discrimination:

How representative is your data set? 

If the data set is missing information from particular populations, take appropriate steps to address this problem.

Does your data model account for biases? 

Review data sets and algorithms to ensure that hidden biases do not have an unintended impact on certain populations.

How accurate are your predictions based on big data? 

Balance the risks of using correlative results, especially where the business’ policies could negatively affect certain populations.

Does your reliance on big data cause ethical or fairness concerns?

Consider whether fairness and ethical considerations advise against using big data in certain circumstances and whether the business can use big data in ways that advance opportunities for previously underrepresented populations.
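
For organizations looking to operationalize the first two questions above, a practical starting point is to compare the composition of a data set against a reference population and to check whether outcomes differ materially across groups. The following is a minimal, hypothetical sketch in Python; the group labels, reference shares and the 80 percent “rule of thumb” threshold are illustrative assumptions, not FTC requirements.

```python
from collections import Counter

# Hypothetical reference shares for each group in the relevant population.
REFERENCE_SHARES = {"group_a": 0.55, "group_b": 0.30, "group_c": 0.15}


def representation_gaps(records, group_key="group"):
    """Return each group's share of the data set minus its reference share.

    Large negative gaps suggest the data set is missing information
    from particular populations (the FTC's first question).
    """
    counts = Counter(r[group_key] for r in records)
    total = sum(counts.values())
    return {g: counts.get(g, 0) / total - share
            for g, share in REFERENCE_SHARES.items()}


def adverse_impact_ratios(records, group_key="group", outcome_key="approved"):
    """Return each group's favorable-outcome rate divided by the best group's rate.

    Ratios well below 1.0 (e.g., under the common 0.8 rule of thumb)
    flag outcomes that deserve a closer look for hidden bias.
    """
    rates = {}
    for g in REFERENCE_SHARES:
        group_records = [r for r in records if r[group_key] == g]
        if group_records:
            rates[g] = sum(r[outcome_key] for r in group_records) / len(group_records)
    best = max(rates.values()) or 1.0  # avoid division by zero if no favorable outcomes
    return {g: rate / best for g, rate in rates.items()}


if __name__ == "__main__":
    sample = [
        {"group": "group_a", "approved": 1},
        {"group": "group_a", "approved": 1},
        {"group": "group_b", "approved": 0},
        {"group": "group_c", "approved": 1},
    ]
    print(representation_gaps(sample))
    print(adverse_impact_ratios(sample))
```

Checks like these do not substitute for legal review under the Fair Credit Reporting Act, the FTC Act or equal opportunity laws, but documenting them can help create the record of diligence discussed below.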

Monitoring and Enforcement Ahead

The FTC stated that the collective challenge is to make sure that big data analytics continue to provide benefits and opportunities to consumers while adhering to core consumer protection values and principles. It has committed to continue monitoring areas where big data practices could violate existing laws and to bring enforcement actions where appropriate. With that in mind, organizations that already use big data, and those that have been persuaded by its reported benefits, should heed the FTC’s advice. The FTC is highlighting its interest in the consumer protection and equal opportunity ramifications of big data use, and this report serves as a warning, and a statement of intent, that the FTC will be evaluating data practices in light of these concerns. It is clear that organizations must identify and mitigate the risks of using big data, not only those dealing with privacy and data protection but also those presenting consumer protection and equal opportunity issues. Thinking critically about the considerations listed above, taking corrective action in line with them, and creating a record that such steps have been taken may help organizations using big data avoid FTC regulatory scrutiny.

The Energy & Commerce Committee of the U.S. House of Representatives held a hearing on October 21st titled “Examining Ways to Improve Vehicle and Roadway Safety” to consider (among other matters) Vehicle Data Privacy legislation for internet-connected cars.

The proposed legislation includes requirements that auto manufacturers:

  • “Develop and implement” a privacy policy incorporating key elements on the collection, use and sharing of data collected through technology in vehicles. By providing the policy to the National Highway Traffic Safety Administration, a manufacturer earns certain protection against enforcement action under Section 5 of the Federal Trade Commission Act.
  • Retain data no longer than is determined necessary for “legitimate business purposes.”
  • Implement “reasonable measures” to ensure that the data is protected against theft/unauthorized access or use (hacking).

Manufacturers that fail to comply face a penalty of up to $1 million per manufacturer. The penalty for failure to protect against hacking is up to $100,000 per “unauthorized” access.

Maneesha Mithal, Associate Director, Division of Privacy and Identity Protection, of the Federal Trade Commission (FTC), testified that the proposed legislation “could substantially weaken the security and privacy protections that consumers have today.”

The FTC’s criticism focuses on the proposed safe harbor against FTC enforcement for manufacturers. The FTC testified that a manufacturer should not earn immunity under the FTC Act if the privacy policy offers little or no privacy protection, or is not followed or enforced. The FTC expressed disapproval of provisions allowing retroactive application of a privacy policy to data previously collected. The FTC also advised against applying the proposed safe harbor to data outside of the vehicle, such as data collected from a website or mobile app.

Although the FTC applauded the goal of deterring criminal hacking of the auto systems, the FTC testified that the legislation, as drafted, may disincentivize manufacturers’ efforts in safety and privacy improvements. The testimony echoed that of other industry critics who believe that what is considered “authorized” access is too vague, which may prevent manufacturers from allowing others to access vehicle data systems, such as for repair or research on an auto’s critical systems.

Finally, the FTC criticized the provisions creating a council to develop cybersecurity best practices.  Since the council could operate by a simple majority, it could act without any government or consumer advocacy input, diluting consumer protections.

The hearing agenda, as well as the text of the draft legislation, is available here.

The FTC’s prepared statement, as well as the text of the testimony, is available here.

On August 17, 2015, the Federal Trade Commission (FTC) announced settlements with 13 companies on charges that they misled consumers by claiming that they were certified members of the U.S.-EU or U.S.-Swiss Safe Harbor programs when in fact their certifications had lapsed or never existed in the first place. The FTC’s announcement comes on the heels of two previous settlements reached in late May 2015 with companies that had lapsed certifications despite representations to the contrary made to online consumers. This recent activity by the FTC serves as yet another reminder to businesses to monitor their Safe Harbor program certification renewal dates and to exercise care when making representations in privacy policies related to Safe Harbor program certification.

The Safe Harbor programs provide a method for U.S. companies to transfer personal data outside of the European Union (EU) or European Economic Area (EEA) consistent with the requirements of the European Union Directive on Data Protection or the Swiss Federal Act on Data Protection. To participate in a Safe Harbor program, a company must self-certify to the U.S. Department of Commerce that it complies with seven privacy principles and related requirements. Once certified, a company is required to renew its certification with the Department of Commerce each year to maintain its status as a current member of the Safe Harbor program.

The companies at the center of the recent enforcement actions represent a variety of industries, including app development, pharmaceutical and biotechnology research, medical waste processing and wholesale food manufacturing. This broad industry representation suggests to us that the FTC is committed to ongoing enforcement. Accordingly, we want to remind readers of these tips:

  • Check your company’s certification status to ensure that it is marked “current” on the Department of Commerce website: https://safeharbor.export.gov/list.aspx;
  • Review any privacy policies and online statements referencing the Safe Harbor programs to ensure that they properly reflect the certification status and the company’s actual privacy and data security practices;
  • Institute a systematic reminder six months prior to the recertification date that triggers compliance review activity, with a due date for completion prior to the recertification deadline, together with a requirement that the actual online recertification be completed prior to the annual deadline;
  • Remove all references to the Safe Harbor programs from publicly available privacy policies and statements if the company’s certification status is unclear; and
  • Review substantive compliance with the Safe Harbor programs and institute corrective action and controls to ensure that compliance is maintained.


Remember KITT? KITT (the Knight Industries Two Thousand) was the self-directed, self-driving, supercomputer hero of the popular 1980s television show Knight Rider. Knight Rider was a science fiction fantasy profiling the “car of the future.” The self-directed car is science fiction no more. The future is now and, in fact, we’ve seen a lot of press this year about self-driving or driverless cars.

Driverless cars, equipped with a wide variety of connected systems including cameras, radar, sonar and LiDAR (light detection and ranging), are expected on the road within the next few years. They can sense road conditions, identify hazards and negotiate traffic, all from a remote command center. Just as with most connected devices in the age of the Internet of Things (IoT), these ultra-connected devices claim to improve efficiency and performance, and enhance safety.

Though not quite driverless yet, connected vehicles are already on the market and on the road. Like many IoT “things,” ultra-connected vehicle systems may be vulnerable to hacker attacks.

Christopher Valasek and Charlie Miller, two computer security industry leaders, have presented on this topic at various events, including the 2014 Black Hat USA security conference. They analyzed the information security vulnerabilities of various car makes and models, rating the vehicles on three specific criteria: (1) the area of their wireless “attack surface” (i.e., how many data-incorporating features, such as Bluetooth, Wi-Fi, keyless entry systems and automated tire monitoring systems, the vehicle includes); (2) access to the vehicle’s network through those data points; and (3) the vehicle’s “cyberphysical” features (i.e., connected features such as parking assist, automated braking and other technological driving aids). This last category of features, combined with access through the data points outlined in items (1) and (2), presented a composite risk profile of each vehicle make’s hackability. Their conclusions were startling: radios, brakes and steering systems were all found to be accessible.
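To make the idea of a composite risk profile more concrete, the short sketch below combines scores for the three criteria into a single rating. The 1–5 scoring scale, the weighting and the example values are assumptions made purely for illustration; they are not drawn from Miller and Valasek’s actual methodology or data.

```python
from dataclasses import dataclass


@dataclass
class VehicleProfile:
    """Hypothetical 1-5 scores for the three criteria described above."""
    attack_surface: int   # breadth of wireless entry points (Bluetooth, Wi-Fi, keyless entry, etc.)
    network_access: int   # how directly those entry points reach the internal vehicle network
    cyberphysical: int    # connected driving features (parking assist, automated braking, etc.)


def composite_risk(profile: VehicleProfile) -> float:
    """Combine the three scores into a single 0-1 risk indicator.

    Cyberphysical features only matter if an attacker can reach them,
    so they are scaled by the reachability of the network rather than simply added.
    """
    reachability = (profile.attack_surface * profile.network_access) / 25  # normalized to 0-1
    impact = profile.cyberphysical / 5                                     # normalized to 0-1
    return reachability * impact


if __name__ == "__main__":
    example = VehicleProfile(attack_surface=4, network_access=3, cyberphysical=5)
    print(f"Composite risk: {composite_risk(example):.2f}")
```

However it is weighted, the underlying point is the same one the researchers made: connectivity and cyberphysical control features compound one another, so the most “connected” vehicles tend to carry the highest composite risk.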

Miller and Valasek claim that their intent was to encourage car manufacturers to consider security in vehicle system connectivity and cyberphysical attributes. They approached vehicle manufacturers and shared their report with the Department of Transportation and the Society of Automotive Engineers. Some manufacturers promised to investigate their vehicle systems and correct the deficiencies. Some seemingly ignored the report altogether. They did, however, catch the attention of Senators Ed Markey (D-MA) and Richard Blumenthal (D-CT). On July 21, 2015, Senators Markey and Blumenthal introduced legislation that would direct the National Highway Traffic Safety Administration (NHTSA) and the Federal Trade Commission (FTC) to establish federal standards to secure vehicles and protect drivers’ privacy. The Security and Privacy in Your Car Act, aptly coined “the SPY Car Act”, would also require manufacturers to establish a “cyber dashboard” that rates vehicle security, informing consumers as to the security performance of their vehicle.

As proposed, the SPY Car Act would require that all motor vehicles manufactured in the U.S. be “equipped with reasonable measures to protect against hacking attacks.” All “entry points” are to be protected through “reasonable” measures against hacking. Internal networks are to be isolated to prevent hacking of the software managing critical vehicle controls, such as braking or steering. Vehicles would undergo a vulnerability analysis including penetration testing based on industry “best security practices.” Furthermore, “Any motor vehicle that presents an entry point shall be equipped with capabilities to immediately detect, report and stop attempts to intercept driving data or control the vehicle.”

The legislation, as well as the continued research efforts of experts such as Valasek and Miller, supports the notion that today’s automobiles are not only transportation devices but also sophisticated computer systems. And like any other computer system, the data processed through them is vulnerable to attack. The more “connected” the system, the more entry points are potentially exposed and arguably vulnerable. The “car of the future” is here, and experts and legislators seem to be pushing to keep consumer safety in the driver’s seat.