Litigation

BIPA Litigation in 2021: Where We’ve Been & Where We’re Headed

Published: Aug. 18, 2021

Litigation under Illinois’ Biometric Information Privacy Act (BIPA) has churned steadily since the beginning of this year. Below is an overview of emerging trends and significant decisions that may signal the direction of BIPA litigation through the remainder of 2021 and beyond.

Plaintiffs Strategically Plead Standing

In response to the trend of defendants removing BIPA cases to federal court, plaintiffs have begun strategically drafting their complaints to disavow Article III standing and keep their cases in state court – avoiding the preemption and class certification complications that can come with federal jurisdiction. For example, in Thornley v. Clearview AI, Inc., the plaintiffs alleged that Clearview violated BIPA § 15(c) (which prohibits entities from selling or otherwise profiting from biometric data) but stated that they had not “suffered any injury as a result of the violations” beyond the “statutory aggrievement.” The strategic decision not to allege a concrete, particularized injury paid off: the Seventh Circuit held that the plaintiffs had alleged only a bare procedural violation that did not support Article III standing. The court acknowledged as much: “It is no secret to anyone that [plaintiffs] took care in their allegations … to steer clear of federal court. But in general, plaintiffs may do this … they may take advantage of the fact that Illinois permits BIPA cases that allege bare statutory violations, without any further need to allege or show injury.” The case was remanded to state court.

Thornley came on the heels of Fox v. Dakkota Integrated Systems, LLC, a significant standing decision that clarified when alleged violations of BIPA § 15(a) (which requires entities to develop, publicly disclose, and comply with data retention schedules and destruction guidelines) inflict an “injury in fact” sufficient to confer Article III standing. The plaintiff in Dakkota alleged that the defendant failed to post the required policies (an injury to the public at large) and unlawfully retained the plaintiff’s handprint scan after her employment with the defendant ended. The Seventh Circuit held that the unlawful retention constituted “a concrete and particularized invasion of [the plaintiff’s] privacy interest in her biometric data” that met the bar for Article III standing. The court thus concluded that removal was proper, and the case remained in federal court until the plaintiff voluntarily dismissed it.

Given the roadmap provided by Thornley and Dakkota, and the risks that federal jurisdiction poses for plaintiffs, we expect continued efforts to draft complaints strategically in ways that limit defendants’ ability to remove cases to federal court.

Faceprint Cases Continue, with Sizable Settlements

While fingerprints remain the most frequently litigated biometric, faceprints have become the second most popular target. A “scan of facial geometry,” or “faceprint,” is a “biometric identifier” under BIPA. Although BIPA does not define these terms, a faceprint is commonly understood as a set of measurements of the distances between facial features, used to generate a unique numerical representation of an individual’s face. Recently, plaintiffs have targeted algorithms that recognize faces – for example, Apple was sued over its Face ID feature, which allows iPhone owners to unlock their phones with their faces, and a university was sued for using exam monitoring software that required photo identification. In other cases, plaintiffs do not allege that the systems actually recognize them. For instance, Mary Kay Cosmetics and Ulta Beauty were sued over makeup try-on apps that superimpose makeup on a user’s face; the plaintiffs do not allege that the apps recognize users – only that the apps scan users’ facial geometry in order to display the image of the user’s face.

While the scope of BIPA liability for deploying facial recognition technology and/or using face data is still being defined, companies should tread carefully given the risk of sizable monetary penalties, which can drive large settlements. In February 2021, Facebook settled multi-year litigation over its photo tagging feature for $650 million, the largest BIPA settlement to date. Plaintiffs are also seeking approval of a $92 million settlement with TikTok in a case over its detection of faces in videos.

Voiceprints Are a Growing Area of Litigation

Five complaints based on voiceprints have been filed so far this year, more than in any previous year. What constitutes a “voiceprint” is currently being litigated, as BIPA does not provide a definition and Illinois courts have offered little guidance. One Illinois judge has reasoned that a voiceprint is more than a simple voice recording (see Rivera v. Google, Inc. at 1097), but beyond this dictum, it is unclear what technical processing or level of identifiability is required to transform a voice into a voiceprint.

This year, plaintiffs have targeted voice recognition technology used by Amazon’s Alexa to recognize consumer commands and by McDonald’s to recognize repeat customers at its drive-through windows. The current lack of clarity about what constitutes a “voiceprint” under BIPA presents a risky gray area for companies using voice recognition technology. Accordingly, companies collecting and/or analyzing voice data, including through third-party vendors, should be wary of the potential for a BIPA lawsuit, even if they do not consider themselves to be creating “voiceprints.” As these cases move forward, courts may provide more clarity on which practices trigger BIPA, which should help companies mitigate risk.

More Plaintiffs Allege “Profiting” Violations

Plaintiffs continue to test arguments that defendants profit from biometric data in violation of BIPA § 15(c), which makes it unlawful to “sell, lease, trade, or otherwise profit from” biometric data. A common argument is that a defendant profits from biometric data by using it to create a product or service that the defendant then sells. Another tactic is to argue that by marketing a product or service with an emphasis on features that allegedly rely on biometric data, a defendant gains a market advantage that results in increased profits.

Courts have recently provided some guidance on what can be an actionable “profit.” In Vance v. Microsoft Corp., the Western District of Washington reasoned that “otherwise profit” should be interpreted in light of “sell,” “lease,” and “trade,” all of which contemplate a commercial transaction. The court held that “§ 15(c) regulates transactions with two components: (1) access to biometric data is shared or given to another; and (2) in return for that access, the entity receives something of value.” Notably, the first prong can be satisfied either by a direct sale of biometric data or because “the biometric data may be so integrated into a product that consumers necessarily gain access to biometric data by using the product or service.” The court dismissed the § 15(c) claim against Microsoft because the plaintiffs alleged only that Microsoft used biometric data internally to “improve its facial recognition products and technologies,” making them more effective and commercially valuable – allegations that did not show Microsoft shared access to the data in exchange for something of value. In contrast, the court upheld § 15(c) claims against Amazon in the companion case Vance v. Amazon.com Inc. because allegations that Rekognition “allows users to match new images of faces with existing, known facial images ‘based on their visual geometry’” supported the “reasonable inference that selling Amazon’s products necessarily shares access to the underlying biometric data in exchange for some benefit to Amazon.” However, the court left open the possibility that further factual development could demonstrate that the biometric data is not sufficiently integrated into the product to support a viable § 15(c) claim.

Companies that incorporate biometrics into their products or services – by, for example, using face data to train algorithms – should be aware of the risk of § 15(c) claims and be prepared to defend their use of biometric data in product development.