On April 1, 2026, the U.S. Court of Appeals for the Seventh Circuit, consolidating three interlocutory appeals, issued a significant ruling in Clay v. Union Pacific Railroad Co. that resolves whether Illinois’s 2024 amendment to the Biometric Information Privacy Act (“BIPA”) applies retroactively to cases pending when it was enacted.[1] The court held that it does. This decision is a victory for businesses facing astronomical exposure in pending BIPA litigation.

Background: The Cothron Decision and the Legislative Response

BIPA regulates the way in which private entities collect, retain, and disclose biometric identifiers and biometric information. Section 15 of BIPA describes the substantive requirements, prohibiting private entities from collecting or disseminating an individual’s biometric identifiers without informed consent.[2] Section 20 creates a private right of action and establishes liquidated damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation.[3]

In 2023, the Illinois Supreme Court held in Cothron v. White Castle System, Inc., that a new claim accrues under Section 15 “with every scan or transmission” of biometric information.[4] This decision opened the door for astronomical damages. The court acknowledged policy concerns with such an outcome and “respectfully suggest[ed] that the legislature review these policy concerns and make clear its intent regarding the assessment of damages under the Act.”[5]

Less than a year and a half later, the Illinois General Assembly amended Section 20 of BIPA by adding two new subsections providing that a private entity that collects the same biometric identifier from the same person using the same method of collection “has committed a single violation” for which the aggrieved person is entitled to “at most, one recovery.”[6] The amendment did not include an express retroactivity clause.[7]

The Seventh Circuit’s Analysis

Applying Illinois’s retroactivity framework, the Seventh Circuit held that the amendment is retroactive. Under Illinois law, when an amendment is silent on its temporal reach, pending proceedings “shall conform, so far as practicable, to the laws in force at the time of such proceeding.”[8] The key question then becomes whether the amendment is substantive (generally not retroactive) or procedural (retroactive).[9] The Illinois Supreme Court treats remedial changes as procedural.[10]

The court concluded that the “amendment to BIPA Section 20 is a remedial change,” which makes it “‘procedural’ under Illinois law, so courts should apply the amendment to cases pending at the time the statute was enacted.”[11]

Practical Implications for Businesses

This decision has immediate and significant practical implications. The ruling effectively eliminates the threat of per-scan, multimillion-dollar damages awards in pending BIPA cases and drastically reduces exposure for businesses.[12] Settlement considerations, whether in individual cases or on a class basis, should be revisited in light of this ruling.

The ruling also has broader implications for BIPA compliance going forward. With the per-scan damages threat eliminated, the focus in these cases can return to the merits — namely, whether the company followed the law’s informed consent requirements. Businesses should nevertheless continue to maintain robust BIPA compliance programs, as the statute still provides a private right of action and meaningful per-person damages for violations.

McGuireWoods has a team of attorneys who represent companies in BIPA litigation and advise on biometric privacy compliance matters. For any questions about this alert, please contact our listed attorneys below.


[1] Clay v. Union Pac. R.R. Co., Nos. 25-2185, 25-2761, 25-2762 (7th Cir. Apr. 1, 2026).

[2] Id. at 3 (citing 740 ILCS 14/15(b), (d)).

[3] Id. at 4 (citing 740 ILCS 14/20).

[4] Cothron v. White Castle Sys., Inc., 216 N.E.3d 918, 926, ¶ 30 (Ill. 2023).

[5] Id. ¶ 43.

[6] Clay, Nos. 25-2185, 25-2761, 25-2762, at 5 (citing Pub. Act 103-0769, 2024 Ill. Laws 6788–89).

[7] Id.

[8] Id. at 7.

[9] Id. at 8.

[10] Id.

[11] Id. at 9.

[12] Id. at 3.


On Friday, April 3, 2026, the U.S. District Court for the District of Massachusetts preliminarily enjoined the Trump administration from requiring public colleges and universities in 17 states to submit seven years’ worth of Integrated Postsecondary Education Data System (IPEDS) Admission and Consumer Transparency Supplement (ACTS) survey data. The reporting deadline for the members of two intervenor organizations remains April 14, pending the outcome of a hearing scheduled for April 13.

Colleges and universities should assess their cybersecurity compliance posture and incident response readiness and harden their networks as soon as possible in light of elevated threats.

Since June 2025, the Cybersecurity and Infrastructure Security Agency has cautioned that Iranian government-affiliated actors routinely target U.S. networks and internet-connected devices. The war in Iran and recent Iranian state-sponsored malicious cyber operations suggest U.S. educational institutions may be more vulnerable than usual. Those institutions already face a complex web of overlapping federal and state data breach notification requirements, cybersecurity-related risks to Title IV funding eligibility, and lasting reputational harm from cyberattacks.

Read on to learn more about current cyber threats and what institutions should do to prepare.

On March 20, 2026, the White House unveiled its National Policy Framework for Artificial Intelligence, providing a blueprint on legislative recommendations and urging Congress to act. It recommends that Congress create a unified federal standard to reduce the regulatory friction of competing state AI regimes, promote AI innovation, and develop an AI-ready workforce, while ensuring the protection of children, consumers, and intellectual property rights. 

Continue Reading White House Releases AI Legislative Recommendations—Congress Has the Blueprint, but Questions Remain

The California Privacy Protection Agency (CalPrivacy) is entering an aggressive new phase of privacy regulation and enforcement, of which companies doing business in California should be aware. CalPrivacy has already brought enforcement actions against many companies, maintains over 100 active investigations and has signaled an increased pace of enforcement.

Continue Reading CalPrivacy Ramps Up Privacy Enforcement

Overview

As we enter the 2026 tax filing season, organizations face a heightened risk of cyberattacks targeting employee information. Tax season is a busy time for cybercriminals, who ramp up efforts to trick businesses and individuals into sharing personal information. Bad actors can use stolen personally identifying information (“PII”) in a variety of harmful ways, including to file fraudulent tax returns and claim refunds. Below we provide an overview of the current threat landscape, key warning signs to watch for, practical prevention strategies, and guidance on legal obligations if your organization is targeted.

Continue Reading Protecting Employee Information From Tax Season Phishing Schemes

On Feb. 10, 2026, U.S. District Judge Jed Rakoff of the Southern District of New York issued a bench ruling holding that a defendant’s use of generative AI to analyze legal exposure is not protected under attorney-client privilege or the work product doctrine. See When AI Isn’t Privileged: SDNY Rules Generative AI Documents Not Protected. On Feb. 17, 2026, Judge Rakoff issued a written opinion confirming the bench ruling and adding important analysis. Read on to learn what the opinion adds on confidentiality, work product and waiver, and for details of the practical implications and open questions left by Judge Rakoff’s opinion.

On Feb. 10, 2026, the U.S. District Court for the Southern District of New York held that a defendant’s use of generative AI to analyze legal exposure is not protected under attorney-client privilege or the work product doctrine. The decision has important implications as clients and nonlawyers increasingly use generative AI tools to assess legal risk, despite disclaimers by AI companies that their tools do not provide legal advice. Use of public AI tools creates significant privilege risks because these tools often lack confidentiality protections and are typically not used under counsel’s direction.

Data Privacy Day offers a natural checkpoint to take stock of a fast‑moving legal landscape. As of January 1, 2026, several significant U.S. state privacy laws and regulatory updates are now live, with additional U.S. and global milestones queued up throughout 2026. Below we summarize important changes already in effect and highlight issues to monitor as the year unfolds.

Continue Reading Data Privacy Day 2026: What Changed on Jan. 1 — And What to Watch Next

On Jan. 14, 2026, the U.S. Food and Drug Administration (FDA) and the European Medicines Agency (EMA) jointly released the “Guiding Principles of Good AI Practice in Drug Development,” a set of 10 high-level principles intended to steer the safe and responsible use of AI across the product lifecycle. While not formal industry guidance, the document provides important insights into FDA and EMA thinking on the deployment of AI during drug and biologic product development and signals future regulatory guidance from both regulators. Read on for further details and takeaways for regulated industry.