FTC Issues Privacy Tool, Guidance for Health-Related Mobile Apps
June 21, 2016
Only days after the Federal Communications Commission (FCC) adopted a notice of proposed rulemaking to establish privacy guidelines applicable to Internet service providers (ISPs),1 the Federal Trade Commission (FTC) took two actions on the privacy front that will affect a smaller, but fast-growing, industry: developers of mobile health applications. Given the pervasive use of mobile health applications, the security practices and protocols proposed by the FTC will have a broad impact on how well users' privacy is protected when they use health care applications, whether to schedule appointments, receive test results, or manage medical payments and insurance coverage.
In particular, the FTC released a new Web-based tool for developers of health-related mobile apps that, it said, was designed to help the developers understand what federal laws and regulations might apply to their apps.2 In addition, the FTC simultaneously released its own business guidance aimed at helping mobile health app developers to comply with the FTC Act by building privacy and security into their apps.3
The Web-based tool and the FTC’s guidance can be helpful to app developers, but they both require legal conclusions and judgments that are best made by attorneys and, standing alone, they can provide app developers with a false sense of security. Clients, therefore, should be advised not to rely on these to the exclusion of legal advice but to work with them in conjunction with appropriate counsel. Indeed, the Web-based tool itself states that it is “not meant to be legal advice about all of [a developer’s] compliance obligations,” and that it provides a “snapshot of a few important laws and regulations.”
The Web-Based Tool
The online tool created by the FTC, in conjunction with the Department of Health and Human Services' Office of the National Coordinator for Health Information Technology (ONC) and Office for Civil Rights (OCR) and the Food and Drug Administration (FDA), asks developers 10 high-level questions about the nature of their app, including about its function, the data it collects, and the services it provides to users.
The relatively straightforward questions include: “Are you a health care provider or health plan?,” “Do consumers need a prescription to access your app?,” and “Are you a nonprofit organization?”
There also are more complicated questions, however, including: “Do you create, receive, maintain, or transmit identifiable health information?,” “Are you developing this app on behalf of a HIPAA covered entity (such as a hospital, doctor’s office, health insurer, or health plan’s wellness program)?,” and “Does your app pose ‘minimal risk’ to a user?”
The site has links to a glossary that attempts to explain “identifiable health information,” “HIPAA covered entities,” and “minimal risk,” but the explanations themselves lead to further questions. For instance, the glossary states that “HIPAA covered entities” are “[h]ealth care providers who conduct certain electronic transactions,” “[h]ealth plans,” and “[h]ealth care clearinghouses,” but it then provides links to the Centers for Medicare & Medicaid Services’ Web page, “Are You a Covered Entity?,”4 which allows a visitor to download a file entitled “Covered Entity Charts” that has 10 pages of information and charts to decipher.5 This is not an easy task for a typical app developer.
In any event, based on the app developer's yes-or-no answers to the 10 questions in the online tool, the developer is pointed toward information about certain federal laws that might apply to the app, including the FTC Act, the FTC's Health Breach Notification Rule, the Health Insurance Portability and Accountability Act (HIPAA), and the Federal Food, Drug, and Cosmetic Act (FD&C Act). The tool links directly to each agency's information about these laws.
The FTC’s Guidance
The FTC’s new guidance offers eight “best practices” to help health app developers build privacy and security into their apps to protect consumers’ data.
First, the guidance suggests that developers minimize the amount of data they use. Toward that end, it asks them to consider whether they need to collect and retain people’s information and whether the data can be kept in a “de-identified form.”6 As one example, the guidance notes that if an app collects geolocation information as part of an effort to map asthma outbreaks in a metropolitan area, the developer might consider whether the same functionality could be provided while maintaining and using that information in de-identified form.
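The guidance's asthma-mapping example can be made concrete with a short sketch. The field names and precision choices below are illustrative assumptions, not drawn from the FTC's guidance: the idea is simply that direct identifiers are dropped before storage and geolocation is coarsened enough to support mapping without pinpointing a user.

```python
# Hypothetical sketch: de-identifying a symptom report before retention.
# Field names ("user_id", "lat", "symptom") and the rounding precision
# are illustrative assumptions, not part of the FTC guidance.

def de_identify(record):
    """Return a copy of the record with direct identifiers dropped and
    geolocation coarsened to roughly neighborhood-level precision."""
    return {
        # "user_id" and "name" are discarded; only what mapping needs is kept.
        "lat": round(record["lat"], 2),   # two decimal places ~ 1 km
        "lon": round(record["lon"], 2),
        "symptom": record["symptom"],
    }

report = {"user_id": "u-831", "name": "Jane Doe",
          "lat": 40.712776, "lon": -74.005974, "symptom": "asthma"}
print(de_identify(report))  # {'lat': 40.71, 'lon': -74.01, 'symptom': 'asthma'}
```

Note that, as footnote 6 explains, HIPAA's de-identification standard is stricter than this sketch; coarsening alone may not satisfy it.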
Second, the guidance suggests that developers limit access and permissions. Toward that end, it points out that developers should consider what permissions their apps really need, whether the operating system they are using provides tools to tailor access to user information, and whether defaults are "privacy-protective." As an example of a "privacy-protective" default, the guidance notes that a fitness app offering users the option of sharing the results of their workouts with others could have the default choice set to "private" rather than "public."
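The fitness-app example reduces to a one-line design decision: the sharing setting ships in its most restrictive state, and the user must opt in to publish. The class and attribute names below are hypothetical illustrations.

```python
from dataclasses import dataclass

# Illustrative sketch of a "privacy-protective" default: a new user's
# workout results are private until the user affirmatively changes this.
# "WorkoutSettings" and "share_results" are hypothetical names.
@dataclass
class WorkoutSettings:
    share_results: str = "private"   # opt-in, not opt-out, sharing

settings = WorkoutSettings()
print(settings.share_results)  # private
```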
The third best practice mentioned in the guidance is that mobile health app developers should keep authentication in mind. The guidance explains that developers should invest resources in the design, implementation, and testing of authentication. In addition, they should require complex passwords; avoid default passwords unless consumers are required to change them during set-up; store passwords securely; and, when granting access to their data or functionality (for example, through the operating system's application programming interface (API)), limit that access to trusted clients or parties with a legitimate need to use the data.
The next best practice recommended by the FTC is that developers “consider the mobile ecosystem.” By that, it notes, developers should ask themselves if they are relying on a mobile platform to protect sensitive data, keeping in mind that there are differences between mobile platforms and that these platforms may use different APIs, have different security-related features, and handle permissions their own way.
In addition, the FTC said, developers should understand how third-party service providers are protecting data and understand that if they use another party’s code to build or enhance their app, they should make certain that the code does not have any known security vulnerabilities and that it has been tested in real-world settings.
The fifth best practice described in the new guidance is that developers should implement security “by design.” In connection with this recommendation, the FTC suggests that developers should develop a culture of security at their company, including by designating a staff member to be responsible for data security, hiring engineers experienced in secure coding practices or who are trained in secure coding, and considering the implementation of “bug bounty” programs that offer rewards—such as free products or cash—to people who identify significant security vulnerabilities in their products.
In connection with this best practice, according to the FTC, developers also should incorporate data security at every stage of the app’s life cycle: design, development, launch, and post-market. The FTC says that it is important for developers to take steps to protect their apps from well-known threats including injection attacks, hard-coded credentials or cryptographic keys that hackers can exploit, insecure APIs that leak or allow unauthorized access to data, and broken or disabled cryptography.
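Of the threats the FTC lists, injection attacks have perhaps the simplest well-known defense: passing user input as a bound parameter rather than splicing it into the query text. The sketch below uses an in-memory SQLite database for illustration; the table and values are hypothetical.

```python
import sqlite3

# Sketch: a parameterized query treats attacker-controlled input as data,
# never as SQL. The table and the attack string are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")
conn.execute("INSERT INTO users VALUES ('alice')")

attacker_input = "alice' OR '1'='1"
# Unsafe alternative: f"... WHERE name = '{attacker_input}'" would match
# every row. The bound "?" placeholder below does not.
rows = conn.execute("SELECT * FROM users WHERE name = ?",
                    (attacker_input,)).fetchall()
print(rows)  # [] -- the malicious string matches no user
```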
The FTC also is advising that developers keep current on new vulnerabilities and have a plan for how to provide updates for products and how to communicate with consumers even after an app is released.
As a sixth best practice, the FTC is advising that developers not reinvent the wheel but, instead, that they take advantage of what experts already have learned about security. It notes that there are free and low-cost tools such as software development kits (SDKs), software libraries, and cross-platform toolkits that developers can use to safeguard consumers’ personal information and help to protect their privacy. It also points out that there are free tools that can help provide encryption, conduct pre- and post-launch testing, test interfaces, scan networks for open ports, reverse-engineer programming code, check password strength, and even scan for known vulnerabilities.
As a seventh best practice, the FTC recommends that developers consider how they will communicate with users about an app's security options and privacy features. In the FTC's view, developers should strive to be "simple, clear, and direct" and avoid complicated jargon or hard-to-find hyperlinks.
Finally, the FTC's guidance quite properly reminds developers that, in addition to the FTC Act, the FTC's Health Breach Notification Rule, HIPAA, and the Federal Food, Drug, and Cosmetic Act, other privacy laws might be applicable, including the FTC's Children's Online Privacy Protection Rule,7 various state laws, and basic truth-in-advertising and privacy principles.
The FTC’s well-organized Web-based tool and new guidance for mobile health app developers should serve as a reminder to them of the importance the agency places on protecting consumers’ privacy. If regulators are interested in privacy, then developers should be interested in privacy.
Many of the questions posed are directly transferable to industries beyond health care. Any business with a mobile or online presence can benefit from assessing its security and privacy practices against the guidance provided by the FTC. The best practices identified by the FTC, such as clarity in privacy policies, de-identification of data where possible, and setting privacy settings to their most restrictive levels by default, offer valuable advice to every business that offers a mobile application to its customers.
1. See Shari Claire Lewis, “FCC Proposes Rules That Impact Everyone’s Online Privacy,” NYLJ (April 19, 2016).
2. See “Mobile Health Apps Interactive Tool,” available at https://www.ftc.gov/tips-advice/business-center/guidance/mobile-health-apps-interactive-tool.
3. See “Mobile Health App Developers: FTC Best Practices,” available at https://www.ftc.gov/tips-advice/business-center/guidance/mobile-health-app-developers-ftc-best-practices.
4. See https://www.cms.gov/Regulations-and-Guidance/HIPAA-Administrative-Simplification/HIPAAGenInfo/AreYouaCoveredEntity.html.
5. See https://www.cms.gov/Regulations-and-Guidance/HIPAA-Administrative-Simplification/HIPAAGenInfo/Downloads/CoveredEntitycharts.pdf.
6. U.S. Department of Health and Human Services regulations require entities covered by HIPAA either to remove specific identifiers, including date of birth and five-digit zip code, from protected health information or to have a privacy and data security expert determine that the risk of re-identification is “very small.” See http://www.hhs.gov/hipaa/for-professionals/privacy/special-topics/de-identification/index.html.
7. See https://www.ftc.gov/tips-advice/business-center/guidance/childrens-online-privacy-protection-rule-six-step-compliance.
Reprinted with permission from the June 21, 2016 issue of the New York Law Journal. All rights reserved.
- Shari Claire Lewis