Welcome to WileyConnect, the Internet of Things blog by
Wiley Rein LLP.

Ninth Circuit Opens the Floodgates to Privacy Litigation

August 9, 2019

This article is co-authored by Boyd Garriott, Megan Brown, and Wesley Weeks.

On Thursday, the Ninth Circuit—in Patel v. Facebook—issued an opinion in a much-watched privacy class action suit, inviting more litigation over claimed privacy violations.  The court allowed a class action to go forward where the plaintiffs alleged that Facebook violated Illinois’ privacy law by scanning individuals’ faces in photographs uploaded to the social network on an opt-out basis.  In a sweeping decision, the court held that the statutory violation was, itself, a concrete injury sufficient to satisfy the traditional constitutional requirement of Article III standing.  This case will grease the wheels for a deluge of no-harm privacy class action lawsuits.

Patel involved Facebook’s “Tag Suggestions” feature.  On Facebook, users can “tag” one another in photographs.  Doing so links the tagged user’s profile to the photograph.  To facilitate the feature, Facebook launched Tag Suggestions in 2010.  Tag Suggestions uses facial recognition technology to suggest individuals for a user to tag.  To do so, Facebook (1) analyzes photos for “various geometric data points that make a face unique,” creating a “face signature;” and then (2) compares a photo’s “face signatures” to a database of “user face templates,” which are a collection of “signatures that have already been matched” to a user.  Facebook does not create face templates for users who opt out of Tag Suggestions.

The plaintiffs in Patel are Facebook users who argue that Tag Suggestions violates an Illinois law called the Biometric Information Privacy Act (BIPA or the Act).  BIPA’s provisions impose what are essentially procedural requirements, including (1) obtaining written consent before collecting any biometric identifier; and (2) establishing and publishing a retention schedule for destroying such information.  BIPA provides a private right of action for anyone who is “aggrieved” by a violation of the Act, providing for liquidated damages of up to $1,000 for negligent violations of the statute and up to $5,000 for intentional violations.  In January, the Illinois Supreme Court held that plaintiffs are “aggrieved” by procedural violations with no need to show “actual damages.”

Facebook opposed the suit on standing grounds, arguing that, regardless of the claimed procedural violations, the plaintiffs failed to show actual harm.  In particular, Facebook relied in large part on Spokeo v. Robins, in which the Supreme Court held that “a bare procedural violation, divorced from any concrete harm” could not satisfy Article III’s injury-in-fact requirement.  Facebook argued the plaintiffs suffered no concrete harm, citing deposition testimony in which the lead plaintiff said that Tag Suggestions was a “nice feature” and that he had not opted out of it, even though he knew that he could.  Facebook further argued that recognizing a privacy harm from data collection—without any kind of harmful disclosure or real risk thereof—would render Article III’s injury-in-fact requirement meaningless.

The Ninth Circuit rejected these arguments holding that BIPA was established to protect the plaintiffs’ concrete interests.  Citing Fourth Amendment jurisprudence, a 150-year-old law review article, and the Restatement (Second) of Torts, the court held that the right to privacy is a concrete interest.  It further concluded that “the development of a face template using facial-recognition technology without consent (as alleged here) invades an individual’s private affairs and concrete interests.”[1]  The court then examined BIPA and interpretations thereof by the Illinois Supreme Court and concluded that the statute was established to protect that concrete privacy interest by imposing procedural safeguards.

The Ninth Circuit also held that procedural violations of BIPA “actually harm, or present a material risk of harm” to plaintiffs’ privacy interest.  The court found that “[b]ecause the privacy right protected by BIPA is the right not to be subject to the collection and use of such biometric data, Facebook’s alleged violation of these statutory requirements would necessarily violate the plaintiffs’ substantive privacy interests.”

This case will have profound implications.  There are hundreds of pending BIPA suits.  Facebook is looking at thousands of dollars in liquidated damages per violation, and with just over 2 billion users, that can add up fast.  And it’s not just BIPA—states are currently passing a slew of new privacy laws.  What’s worse, the Supreme Court recently declined to take up several privacy-related standing issues, muddying the waters and emboldening plaintiffs’ lawyers seeking to collect easy statutory damages.

Ultimately, courts and legislatures need to step in.  Courts need to enforce Article III’s injury-in-fact requirement to ensure that the judiciary is being used to resolve real problems, as opposed to letting consumers cash in over features to which they do not hold any real objections.  Legislatures should ensure that they pass laws that require actual harms, not procedural violations with disproportionate liquidated damages.  Failing to do so will deter innovation and bankrupt businesses acting in good faith that have caused no actual harm.

[1] The court did not explain how the data collection was nonconsensual, given the fact that users are afforded the right to opt out, and the named plaintiff readily conceded that he was aware of—and chose not to utilize—that right.
