Biometric Regulation Leaps Forward with FTC Policy Statement

June 19, 2023 | Deborah M. Isaacson | Amanda Griner
During the first few months of this year, lawyers and corporate executives could be forgiven for believing that 2023 was going to be the “Year of Biometric Privacy.” Consider that, in February, the Illinois Supreme Court issued two significant decisions interpreting the state’s Biometric Information Privacy Act (BIPA), continuing its trend of broadly interpreting the law’s scope and application in a way favorable to plaintiffs. The same court issued another BIPA ruling in March – a rare defense victory in a BIPA dispute – with yet another BIPA opinion still on the horizon.
Moreover, a number of states and localities across the country have passed laws specifically focused on regulating the commercial use of facial recognition and other biometric information technologies. See, e.g., Washington Biometric Privacy Protection Act, Wash. Rev. Code § 19.375; Prohibit the Use of Face Recognition Technologies by Private Entities in Places of Public Accommodation in the City of Portland, Portland, Oregon, City Code Chapter 34.10; Colorado Privacy Act, 2021 Colo. Legis. Serv. Ch. 483 (S.B. 21-190) (effective July 1, 2023). Here in New York, bills were introduced in January and February relating to biometric privacy.
As significant as these developments are, they all pale in comparison to the practical implications of a policy statement issued on May 18 by the Federal Trade Commission (FTC) on “Biometric Information and Section 5 of the Federal Trade Commission Act” (the Policy Statement). In the Policy Statement (https://www.ftc.gov/system/files/ftc_gov/pdf/p225402biometricpolicystatement.pdf), the FTC warned that the increasing use of consumers’ biometric information and related technologies, including those powered by machine learning, “raises significant consumer privacy and data security concerns and the potential for bias and discrimination.”
After briefly reviewing the developments noted above, this column will delve into the Policy Statement and the implications for businesses in New York and elsewhere.
The Illinois Decisions

BIPA, which governs companies’ collection and possession of individuals’ biometric data, is perhaps the most frequently referenced and prominent biometric law in the country. It has been a source of extensive litigation – and resulting headlines – since it was enacted in 2008.
On February 2, the Illinois Supreme Court decided Tims v. Black Horse Carriers, Inc., No. 127801. The case arose when the plaintiff, Jorome Tims, filed a class action lawsuit against Black Horse Carriers, Inc., his former employer, alleging that Black Horse violated Section 15(a) of BIPA, which provides for the retention and deletion of biometric information, and BIPA Sections 15(b) and 15(d), which provide for the consensual collection and disclosure of biometric identifiers and biometric information.
Black Horse moved to dismiss the complaint as untimely. The trial court denied the motion, finding that a five-year statute of limitations governed the plaintiff’s claims. On appeal, the appellate court reversed in part, holding that although actions under Sections 15(a), 15(b), and 15(e) are governed by a five-year statute of limitations, actions under BIPA Sections 15(c) and 15(d) are governed only by a one-year limitations period.
The Illinois Supreme Court affirmed in part and reversed in part, holding that a five-year statute of limitations governs all BIPA claims. The decision clearly expands both the ability of potentially aggrieved parties to seek relief and the potential for liability of defendants under the statute.
On February 17, about two weeks after its decision in Tims, the Illinois Supreme Court decided Cothron v. White Castle System, Inc., No. 128004. The case reached the court after the U.S. Court of Appeals for the Seventh Circuit certified the following question of law: “Do section 15(b) and 15(d) claims accrue each time a private entity scans a person’s biometric identifier and each time a private entity transmits such a scan to a third party, respectively, or only upon the first scan and first transmission?” In a 4-3 decision, the court held that a separate claim accrues under BIPA “each time a private entity scans or transmits an individual’s biometric identifier or information in violation of section 15(b) or 15(d).” The dissent noted, among other things, that the majority’s holding rendered compliance with BIPA “especially burdensome for employers” and “could easily lead to annihilative liability for businesses.”
In March, however, the court in Walton v. Roosevelt University, No. 128338 (March 23, 2023), gave BIPA defendants a rare victory when it affirmed an appellate court’s decision and held that federal labor law preempted BIPA claims asserted by bargaining unit employees covered by a collective bargaining agreement. It remains to be seen how the court will rule in Mosby v. Ingalls Memorial Hospital, No. 129081, where the issue is whether the biometric information of health care workers is excluded under BIPA.
The BIPA exclusion at issue in Mosby is for “information collected, used, or stored for health care treatment, payment, or operations under the federal Health Insurance Portability and Accountability Act of 1996.” The appellate court determined that the biometric information of health care workers is not excluded under BIPA.
The New York Bills
As noted above, a number of biometric-related bills have been introduced recently in the New York State Legislature. For example, S.2390, https://nyassembly.gov/leg/?default_fld=&leg_video=&bn=S02390&term=2023&Summary=Y&Text=Y, would prohibit private entities from using biometric data for any advertising, detailing, marketing, promotion, or any other activity that is intended to be used to influence business volume, sales or market share, or to evaluate the effectiveness of marketing practices or marketing personnel.
A bill in the Assembly, A.1362, https://www.nysenate.gov/legislation/bills/2023/A1362, and the Senate version, S.4457, https://www.nysenate.gov/legislation/bills/2023/s4457, would create the Biometric Privacy Act. These bills would require private entities in possession of biometric identifiers or biometric information to develop a written policy establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information when the initial purpose for collecting or obtaining such identifiers or information has been satisfied, or within three years of the individual’s last interaction with the private entity, whichever occurs first.
The bills did not move out of committee. It is worth noting that other versions of A.1362 and S.4457 were introduced in several prior legislative sessions (2017-2018, 2019-2020, and 2021-2022) to no avail.
That may not matter now, given the FTC’s Policy Statement.
The FTC Policy Statement

As noted earlier, the FTC explained that it issued the Policy Statement because the “increasing use of consumers’ biometric information and related marketing of technologies that use or that purport to use biometric information” raises “significant concerns with respect to consumer privacy, data security, and the potential for bias and discrimination.”
Importantly, the Policy Statement defines “biometric information” quite broadly, indicating that it refers to data that “depict or describe physical, biological, or behavioral traits, characteristics, or measurements of or relating to an identified or identifiable person’s body.” In particular, the Policy Statement says that biometric information includes, but is not limited to, “depictions, images, descriptions, or recordings of an individual’s facial features, iris or retina, finger or handprints, voice, genetics, or characteristic movements or gestures (e.g., gait or typing pattern).” The Policy Statement adds that biometric information also includes “data derived from such depictions, images, descriptions, or recordings, to the extent that it would be reasonably possible to identify the person from whose information the data had been derived.”
The heart of the Policy Statement is the non-exhaustive list of examples of practices the FTC says it will scrutinize in determining whether companies collecting and using biometric information or marketing or using biometric information technologies are complying with Section 5 of the FTC Act, which prohibits “unfair” or “deceptive” acts or practices in or affecting commerce.
The Policy Statement describes two categories of deceptive acts. First, it provides that “false or unsubstantiated marketing claims relating to the validity, reliability, accuracy, performance, fairness, or efficacy of technologies using biometric information constitute deceptive practices in violation of Section 5 of the FTC Act.” The Policy Statement adds that the FTC “intends to carefully scrutinize” these kinds of claims when they are made about biometric technologies.
Second, the Policy Statement provides that false or misleading statements about the collection and use of biometric information “constitute deceptive acts in violation of Section 5 of the FTC Act,” as does failing to disclose any material information needed to make a representation non-misleading.
The “unfairness” portion of the Policy Statement is quite extensive. It begins by simply declaring that the use of biometric information or biometric information technology “may be an unfair practice within the meaning of the FTC Act.” (Under the FTC Act, a practice is unfair if it causes or is likely to cause substantial injury to consumers that is not reasonably avoidable by consumers themselves and that is not outweighed by countervailing benefits to consumers or competition.)
The Policy Statement next observes that determining whether a business’s use of biometric information or biometric information technology violates Section 5 requires a “holistic assessment” of the business’s relevant practices. In making such assessments, the Policy Statement provides that the FTC will consider factors including:
- Failing to assess foreseeable harms to consumers before collecting biometric information;
- Failing to promptly address known or foreseeable risks;
- Engaging in surreptitious and unexpected collection or use of biometric information;
- Failing to evaluate the practices and capabilities of third parties;
- Failing to provide appropriate training for employees and contractors whose job duties involve interacting with biometric information or technologies that use such information; and
- Failing to conduct ongoing monitoring of technologies that the business develops, offers for sale, or uses in connection with biometric information to ensure that the technologies are functioning as anticipated, that users of the technology are operating it as intended, and that use of the technology is not likely to harm consumers.
The Policy Statement concludes by emphasizing that businesses “should continually assess whether their use of biometric information or biometric information technologies causes or is likely to cause consumer injury in a manner that violates Section 5 of the FTC Act.” If so, according to the Policy Statement, businesses must cease such practices, whether or not the practices are specifically addressed in the Policy Statement.
New York companies subject to BIPA or to other biometric rules, or that have been tracking the biometric bills introduced in New York, may already be familiar with the steps they should take to ensure that they do not violate applicable biometric laws. The FTC’s Policy Statement adds a new layer to the compliance obligations of businesses in New York and elsewhere – and a roadmap of actions to adopt. Given the complexities, companies may want to consult with counsel about their data practices to avoid triggering an FTC action or a civil lawsuit with the potential for massive damage awards.
Reprinted with permission from the June 19, 2023, issue of the New York Law Journal©, ALM Media Properties, LLC. Further duplication without permission is prohibited. All rights reserved.