New York City’s recently enacted biometric privacy law took effect July 9, 2021. While the law is vague as to exactly who must abide by certain subsections, it is undoubtedly consumer-focused. However, even if employers escape New York City’s biometric ordinance, a looming New York state law may soon impose more expansive biometric requirements on employers.
On April 14, 2021, the United States Department of Labor (the “DOL”) issued, for the first time, guidance on cybersecurity issues to retirement plan sponsors, fiduciaries, record keepers, service providers and plan participants. The DOL’s press release includes three pieces of guidance: (1) Tips for Hiring Service Providers; (2) Cybersecurity Program Best Practices; and (3) Online Security Tips.
The Employee Benefits Security Administration (the “EBSA”), a sub-agency of the DOL, has long said that addressing cybersecurity is on the agency’s “to do” list, and it even published a report in 2016 reflecting the need for such guidance, which we previously covered here.
The Employee Retirement Income Security Act of 1974, as amended (“ERISA”), includes fiduciary standards that require a retirement plan to be administered with the standard of care of a prudent person familiar with such matters. Common sense dictates that ERISA fiduciaries administer their plans in accordance with industry standards for cybersecurity, safeguard plan assets, and ensure that appropriate controls are in place to avoid financial losses to plans that may result from a cybersecurity breach. However, the legal questions concerning who is responsible (plan participant, plan sponsor or record keeper) remain open in many jurisdictions.
The technology sector runs the gamut from artificial intelligence (AI) and the Internet of Things (IoT) to SaaS and cybersecurity companies, and from the biggest household names to the smallest companies operated out of garages. The rise of AI and its traps for the unwary were previously covered here. Risks of investing in SaaS solutions can be found here and here. Technology is everywhere in 2021, even in the smallest brick-and-mortar shops. Technology investing offers lucrative opportunities for investors large and small, but there are many traps for the unwary, such as “zero-day exploits.”
“Information security is critical to the operation of the financial markets and the confidence of its participants. . . The Division is acutely focused on working with firms to identify and address information security risks, including cyber-attack related risk . . .” SEC Division of Examinations, 2021 Examination Priorities, at 24.
On March 3, 2021, the Securities and Exchange Commission’s newly renamed Division of Examinations (EXAMS) (formerly the Office of Compliance Inspections and Examinations (OCIE)) announced its 2021 examination priorities. Information security and operational resiliency ranked number two out of the top five priorities, sending a clear message that the SEC is focused on emergent security threats, particularly cyber-attacks, resulting from the sudden and unprecedented increase in remote operations.
As we discussed in Part I, the United States does not have a single, comprehensive federal law governing biometric data. However, we have recently seen an increasing number of states focusing on this issue. Part I summarized legislative activity on this issue in 2020. In this Part II, we discuss noteworthy legislation to monitor in 2021.
What to Expect in 2021
At least two states—New York and Maryland—have already introduced biometrics legislation in this first month of 2021.
New York – AB 27
On January 6, 2021, the New York Assembly introduced the Biometric Privacy Act (BPA), a New York state biometric law aimed at regulating businesses that handle biometric data. The BPA would prohibit businesses from collecting biometric identifiers or information without first receiving informed consent from the individual, would prohibit profiting from the data, and would require a publicly available written retention and destruction policy. As proposed, the statute contains a private right of action; if passed, it would permit consumers to sue businesses for improperly collecting and using their biometric data. The statute follows Illinois’s BIPA, allowing recovery of $1,000 per negligent violation and $5,000 per intentional violation, or actual damages, whichever is greater, along with attorney’s fees and costs, and injunctive relief.
Data privacy laws have made significant breakthroughs in recent years, making data privacy a top priority for businesses. From the adoption of the European Union’s General Data Protection Regulation (GDPR) in 2016 to the enactment of the California Consumer Privacy Act (CCPA) in 2018 and the latest ballot approval of the California Privacy Rights Act (CPRA) in 2020, we continue to see data privacy laws develop and garner interest from consumers, businesses, and legislators alike.
Biometric privacy laws in particular, however, are often overshadowed by more general data privacy laws. As we discussed in our prior article, biometrics are physical and behavioral human characteristics (e.g., face, eye, fingerprint, and voice features) that can be used to digitally identify a person. As the collection and use of biometric data become more common in daily life and their applications in different industries continue to expand, new privacy considerations will emerge in this field. Biometrics laws require separate recognition in their own right because of the nuanced application of these specific laws.
The United States does not have a single, comprehensive federal law governing biometric data. Recently, however, an increasing number of states have focused on this issue, introducing legislation specifically aimed at protecting the collection, retention, and use of biometric data. In Part I, we summarize some of the legislative activity on biometric laws from 2020. We will describe other noteworthy legislation to monitor for 2021 in Part II.
Data privacy is a top concern for many in-house legal professionals, and for good reason: data privacy and cybersecurity legal requirements are complex and continually evolving. Data Privacy Day is a great day to start addressing your organization’s data privacy and cybersecurity needs.
On Data Privacy Day 2021, here is what is top of mind for some of our Data Privacy & Security Team members:
- Andrew Konia – A Federal Privacy Law: “Calls (pleas?) for federal privacy legislation are nothing new, and last year we came close, with both parties presenting draft bills for consideration (surprise, neither passed!). But now, with the White House and both chambers of Congress under Democratic control, there appears to be renewed (and more serious) interest in a federal privacy law. We have seen (admittedly narrow) hints of the federal government taking a stronger stance on cybersecurity standards with the IoT Cybersecurity Improvement Act of 2020, which applies to federal agency purchases. But you take the recent and intense backlash on “Big Tech’s” use/sharing of data and perceived lack of data transparency, and mix in the Biden Administration’s prioritization of consumer protection generally, and you have the recipe – and a strong political appetite – for a comprehensive federal privacy law.”
- Bethany Lukitsch – California: “CPRA will be here before we know it, and most companies are going to have a lot to do to get ready. Updating privacy policies and adding ‘do-not-share’ links are one thing, but as with CCPA, it’s the behind-the-scenes work that is really going to take some time. It’s certainly not too early to get started.”
Healthcare providers and other covered entities are not required by HIPAA regulations to have “bulletproof” protections for safeguarding patient information stored in electronic form, according to a January 14, 2021 decision of the 5th U.S. Circuit Court of Appeals. In University of Texas M.D. Anderson v. U.S. Department of Health and Human Services, the 5th Circuit vacated a $4.3 million civil monetary penalty imposed by the U.S. Department of Health and Human Services (HHS) against the University of Texas’ M.D. Anderson Cancer Center.
The case arises from three separate incidents where M.D. Anderson employees lost laptops and USB thumb drives that contained unencrypted protected health information (PHI) for more than 34,000 patients. M.D. Anderson reported the breach incidents to HHS’ Office for Civil Rights (OCR), the office tasked with enforcing HIPAA. As a result of the reported breaches, OCR ordered M.D. Anderson to pay $4.3 million in civil monetary penalties (CMPs). M.D. Anderson appealed the decision to an HHS administrative law judge and to the HHS Departmental Appeals Board (DAB), both of which upheld OCR’s penalties. M.D. Anderson argued that the HIPAA regulations do not require encryption, that it complied with the regulations and employed other effective measures to safeguard electronic protected health information (ePHI), that the three incidents were the fault of staff who violated M.D. Anderson’s policies, and that the proposed CMPs were excessive.
The end of the Brexit transition period on 31 December 2020 means the UK now has full autonomy over its data protection policies. As of 1 January 2021, the UK is recognised as a ‘third country’ under EU General Data Protection Regulation (GDPR) rules. The EU-UK Trade and Cooperation Agreement, an agreement in principle between the EU and UK, does not yet include a provision for the vast flow of personal data transferred between the two jurisdictions. The transfer of personal data will instead be subject to a separate adequacy decision from the EU, due in early 2021, which will determine whether the EU will allow the ongoing free flow of data from EU/EEA countries to the UK. If an adequacy decision is not granted, then organizations that transfer personal data from the EU/EEA to the UK will have to take additional steps to ensure the data being transferred receives protections equivalent to those it enjoys in the EEA. The UK has already determined that it considers all EU/EEA states to be adequate, which means that personal data flows from the UK to the EU/EEA will remain unaffected.
In Part II of this series, California-based Ali Baiardo and London-based Alice O’Donovan continue their comparison of the GDPR and California privacy law. To view Part I in the series, click here.
NEW DATA PROTECTION PRINCIPLES AND OBLIGATIONS ON BUSINESSES
a. Key data protection principles
The GDPR revolves around seven key data protection principles:
- Lawfulness, fairness and transparency;
- Purpose limitation;
- Data minimisation;
- Accuracy;
- Storage limitation;
- Integrity and confidentiality (security); and