Threats to cybersecurity and data privacy are constantly increasing in both volume and complexity, a trend expected to continue in 2022. In a bid to protect cybersecurity and ensure data is properly safeguarded, countries around the world are introducing new laws focused on cybersecurity and data protection. Armed with new legal frameworks, regulators and law enforcement are placing onerous obligations on organisations that fall victim to cybersecurity breaches. There are shorter deadlines in which to notify the authorities of data breaches, and ever-increasing fines and penalties for businesses that fail to respond swiftly and appropriately to a cyberattack.

In this ever-changing area, what is on the horizon for 2022?

Continue Reading Cybersecurity and Data Privacy – What to expect in 2022

On Nov. 4, the Department of Defense announced significant changes to the Cybersecurity Maturity Model Certification program, intended to simplify the certification standard and prioritize protection of certain types of controlled defense information.

Read on for an overview of the changes, a timeline for their implementation and implications for defense contractors.

On Oct. 6, the Department of Justice announced a new Civil Fraud Cyber Initiative to “combine the department’s expertise in civil fraud enforcement, government procurement and cybersecurity to combat new and emerging cyber threats to the security of sensitive information and critical systems.”

Read on for details and analysis of this new enforcement initiative and what it means for federal contractors.

On Sept. 15, the Federal Trade Commission issued a policy statement emphasizing that developers of health apps and other connected devices and their service providers must meet breach notification requirements under the Health Breach Notification Rule, including a rapid 10-day notice period to the FTC and a 60-day notice period to individuals and the media. The FTC statement also warned that it would bring enforcement action — and violations could result in civil penalties of $43,792 per violation, per day.

Read on for details about the notification rule and critical next steps for impacted entities.

One might think that any company reasonably anticipates litigation after suffering a data breach, so the work product doctrine would almost inevitably protect its data breach investigation. But only a handful of companies have succeeded in claiming such protection.

In In re Rutter’s Data Security Breach Litigation, Civ. A. No. 1:20-CV-382, 2021 U.S. Dist. LEXIS 136220 (M.D. Pa. July 22, 2021), data breach victim Rutter’s learned of a possible data breach on May 29, 2019. Later that same day, it hired BakerHostetler “to advise [it] on any potential notification obligations.” Id. at *3 (internal citation omitted). The next day BakerHostetler hired consultant Kroll “to conduct forensic analyses on Rutter’s card environment and determine the character and scope of the incident.” Id. (internal citation omitted). But Rutter’s still lost its work product claim. The court pointed to Kroll’s scope of work — which was “to determine whether unauthorized activity . . . resulted in the compromise of sensitive data, and to determine the scope of such a compromise if it occurred.” Id. at *6 (emphases added) (internal citation omitted). The court noted Kroll’s corporate designee’s testimony that “he was unaware of anyone else at Rutter’s contemplating such lawsuits.” Id. at *7. Finally, the court emphasized that “Kroll provided its report to Defendant when it was completed and there was no evidence that it was provided first to BakerHostetler.” Id. at *8. The court similarly rejected Rutter’s attorney-client privilege claim, noting that Kroll’s scope of work made “no mention of attorney involvement” in the investigation, which resulted in a report that “did not include legal input.” Id. at *12-13.

Perhaps there is nothing a company can do to assure work product or privilege protection for such data breach investigations. But this most recent losing effort should at least help companies avoid these fatal facts.

Amazon’s financial records have revealed that the Luxembourg data protection supervisory authority, the Commission Nationale pour la Protection des Données (“CNPD”), is fining the retailer’s European arm (Amazon Europe Core S.à.r.l.) an eye-watering 746 million euros (£636m or $838m) for breaches of the EU’s General Data Protection Regulation (“GDPR”).

When the GDPR was introduced in May 2018, the potential for huge financial sanctions grabbed many headlines: it gives European supervisory authorities the power to impose fines of up to 20 million euros or 4% of annual global turnover (whichever is greater) for breaches of the GDPR. There have been some undeniably sizeable fines issued under the GDPR in the last three years, but the level of this particular fine is extraordinary: it is the largest GDPR fine issued to date by a considerable margin. The second-largest fine ever imposed under the GDPR was a comparatively paltry 50 million euros, levied against Google by CNIL (the French supervisory authority) in early 2019.

Continue Reading CNPD v. Amazon, the largest GDPR fine on record – what do we know so far?

New York City’s recently enacted biometric privacy law took effect July 9, 2021. While the law is vague as to exactly who must abide by certain subsections, it is undoubtedly consumer-focused. However, even if employers escape New York City’s biometric ordinance, a looming New York state law may soon impose more expansive biometric requirements on all private entities operating in the state, including employers.

New York City’s Biometric Privacy Law

On July 9, 2021, New York City’s biometric privacy law became effective. The law has two main requirements, set out in Sections 22-1202 a. and b.

Section 22-1202 a. provides that any “commercial establishment” that collects, retains, converts, stores or shares biometric identifier information of customers must disclose its practice to customers by placing a clear and conspicuous sign near all customer entrances. This requirement is clearly limited to “commercial establishments” — which the law defines as “a place of entertainment, a retail store, or a food and drink establishment” — and such establishments’ customers.

The applicability of Section 22-1202 b. is less clear. This subsection simply provides:

It shall be unlawful to sell, lease, trade, share in exchange for anything of value or otherwise profit from the transaction of biometric identifier information.

Without the clear “commercial establishment” qualifier, subsection b. could be interpreted to apply to employers. This is notable because New York City’s law provides aggrieved individuals with a private right of action to recoup $500 for each violation and $5,000 for each intentional or reckless violation, as well as attorneys’ fees and costs.

However, legislative history indicates that all aspects of the law are intended to be limited to commercial establishments and their customers, not employees. For example, New York City Council meeting minutes reflect that the law’s purpose is to make it “unlawful ‘to sell, lease, trade, share in exchange for anything of value or otherwise profit’ from the exchange of customer’s biometric identifier information that these establishments have used to identify individuals.”

The law’s definition of “biometric identifier information” also reflects the intended narrow application. “Biometric identifier information” is defined within the law as:

a physiological or biological characteristic that is used by or on behalf of a commercial establishment, singly or in combination, to identify, or assist in identifying, an individual, including, but not limited to: (i) a retina or iris scan, (ii) a fingerprint or voiceprint, (iii) a scan of hand or face geometry, or any other identifying characteristic.

The inclusion of “commercial establishment” thus limits the types of entities to which any portion of the law, including subsection b., could apply. Therefore, at most, subsection b. could apply to “commercial establishment” employers, if it applies to employers at all.

New York City’s Chief Privacy Officer will post guidance for those entities likely to be impacted by the new law. It is anticipated that such guidance will clarify whether the law applies to employers.

New York State’s Proposed Biometric Privacy Law

Even if New York City’s law does not apply to employers, a proposed statewide biometric privacy bill, if enacted, would impose more expansive requirements on all private entities operating in New York state, including employers.

New York state’s Biometric Privacy Act, AB 27 (BPA), was introduced earlier this year and has bipartisan support. The BPA is very similar to Illinois’ Biometric Information Privacy Act (BIPA), which has recently become notorious for generating numerous large class action settlements and a variety of appellate issues.

Like Illinois’ BIPA, New York’s proposed BPA generally requires all private entities in possession of biometric identifiers or biometric information to do the following.

  • Develop a publicly available written policy establishing a retention schedule and guidelines for permanently destroying said data when the initial purpose is satisfied, or within three years of the biometric subject’s last interaction with the private entity, whichever occurs first.
  • Satisfy several other preconditions prior to collecting biometric identifiers or information, including:
    • informing the subject in writing that the biometric data is being collected or stored;
    • informing the subject in writing of the purpose and length of time for which the data is being collected; and
    • obtaining a written release signed by the subject of the data being collected.

Additionally, the BPA sets requirements for storing and destroying biometrics, including that biometric data must be stored using the reasonable standard of care within the private entity’s industry and in a manner at least as protective as that used for other confidential information.

Also like Illinois’ BIPA, New York’s BPA, if enacted, will undoubtedly spur a flurry of single-plaintiff and class action cases against employers. As currently drafted, the BPA provides for a private right of action allowing plaintiffs to recover the greater of actual damages or $1,000 for each negligent violation, and the greater of actual damages or $5,000 for each intentional or reckless violation. In addition, plaintiffs are entitled to reasonable attorneys’ fees, costs and other relief, including an injunction.

What Should Employers Do Now?

While it remains to be seen whether the proposed BPA will become law, if enacted in its current form, New York employers are sure to face increased litigation regarding any biometric timekeeping or security practices they utilize. Accordingly, while these biometric laws loom, New York employers should consider proactively adopting biometrics policies and obtaining written releases from employees whose biometric data is being collected. Employers should also adhere to general data collection principles, including confidentiality and security measures.

For assistance monitoring and complying with New York’s biometrics privacy and other employment laws, please contact the authors, other members of the McGuireWoods labor and employment team, or your McGuireWoods contact.

Two U.S. Circuit Courts of Appeals recently weighed in on what it takes to establish standing to pursue a Telephone Consumer Protection Act (TCPA) claim. The 5th Circuit held that receipt of one unwanted text message is enough to satisfy Article III, which deviates from a prior 11th Circuit decision holding that one text message does not confer standing. On the other hand, the 3rd Circuit held that a TCPA plaintiff did not have standing where the plaintiff alleged only a TCPA violation, but no harm. Finally, the 11th Circuit issued an opinion interpreting the Fair Debt Collection Practices Act (FDCPA) that has important implications for debt collectors using vendors to place calls or text messages.

Please read our alert.

On June 14, 2021, the Board of the newly formed California Privacy Protection Agency (“CPPA”) held its first public meeting. The Board had an extensive agenda, covering topics such as the laws affecting the Board and CPPA, the CPPA’s initial hiring strategy, policies and practices on delegations of authority and conflicts of interest, establishment of subcommittees of the Board, notice to the Attorney General regarding the assumption of rulemaking under the California Privacy Rights Act (the “CPRA”), and setting future agenda items and a meeting schedule for the Board. (As a refresher, when the CPRA passed as a ballot measure last fall, it established the CPPA as a first-of-its-kind agency solely devoted to the regulation and enforcement of consumer privacy. The CPPA is tasked with enforcing the CPRA and developing a set of regulations providing guidance for businesses on how to comply with that new law.)

While the CPPA Board’s June 14 full-day meeting covered a lot of ground, it is clear there is much work to be done for the CPPA to emerge as an independent, fully functional agency, let alone promulgate regulations in time to meet the CPRA’s July 1, 2022 deadline for final regulations. Overall, the Board members appeared committed to working through these challenges, but acknowledged that they are under significant time pressure.

Continue Reading Starting at the Beginning: California Privacy Protection Agency Board Meets for the First Time