
Password Protected

Data Privacy & Security News and Trends

D.C. Circuit Issues Long-Awaited Decision on FCC’s 2015 TCPA Order

Posted in Litigation, Privacy

Nearly two and a half years after the Federal Communications Commission's (FCC) July 2015 Order was appealed, the U.S. Court of Appeals for the District of Columbia Circuit issued its ruling on March 16, 2018. On appeal, over a dozen entities sought review of the 2015 Order, in which the FCC interpreted various aspects of the Telephone Consumer Protection Act (TCPA). The appeal addressed four issues: (1) which devices constitute an automatic telephone dialing system (ATDS or “autodialer”); (2) whether a call to a reassigned phone number violates the TCPA; (3) whether the FCC’s approach to revocation of consent was too broad; and (4) whether the FCC’s exemption for certain healthcare-related calls was proper.

In short, the court set aside the FCC’s definition of an ATDS and vacated the FCC’s approach to calls placed to reassigned numbers.  The court upheld, however, the FCC’s broad approach to a party’s revocation of consent and sustained the scope of the FCC’s exemption for time-sensitive healthcare calls.

  1. ATDS

The FCC’s 2015 Order held that the analysis of whether equipment constitutes an ATDS is not limited to its present capacities, but also includes its “potential functionalities”—an approach with the apparent effect of encompassing ordinary smartphones. On appeal, the D.C. Circuit concluded that the FCC’s approach could not be sustained in light of the “unchallenged assumption that a call made with a device having the capacity to function as an autodialer can violate the statute even if autodialer features are not used to make the call.”  The court reasoned that if a device’s capacity includes functions that could be added through app downloads and software additions, and if smartphone apps can introduce ATDS functionality into the device, then all smartphones would meet the statutory definition of an autodialer—and the TCPA’s restrictions on autodialer calls would therefore “assume an eye-popping sweep.”  Accordingly, the court found the FCC’s interpretation, under which all smartphones qualify as autodialers, to be unreasonably and impermissibly expansive.

Regarding functionality, the FCC identified a basic function of an ATDS as the ability to “dial numbers without human intervention,” but declined to clarify this point, apparently suggesting that a device might still qualify as an autodialer even if it cannot dial numbers without human intervention.  The FCC further said that another basic function of an ATDS is to dial thousands of numbers in a short period of time, but the ruling provides no additional guidance on whether that is a necessary, sufficient, or relevant condition, leaving affected parties “in a significant fog of uncertainty.”  In addressing these questions, the court found the FCC’s guidance gave no clear answer and in many ways provided contradictory interpretations. The court seemed particularly concerned with the practical implications that the FCC ruling seemingly imposed liability even if a system was not used to randomly or sequentially generate a call list, as “[a]nytime phone numbers are dialed from a set list, the database of numbers must be called in some order—either in a random or some other sequence.”  The court set aside the FCC’s ruling on what type of functionality a device must employ to qualify as an autodialer, finding that the FCC could not promote competing interpretations in the same order.

  2. Reassigned numbers and consent

If a call is made to a consenting party’s number, but that number has been reassigned to a nonconsenting party, the FCC’s 2015 Order treated the call as a violation of the TCPA, subject only to a one-call safe harbor that allows a caller to avoid liability for the first call to a wireless number following reassignment.  The court found the FCC’s limitation of the safe harbor to only the first call arbitrary, questioning why a caller’s “reasonable reliance” on the previous subscriber’s consent necessarily stops being reasonable after a single call, since the first call may give the caller no indication of a possible reassignment.  The court set aside the FCC’s treatment of reassigned numbers in its entirety, finding that it could not excise only the one-call safe harbor while leaving in place the FCC’s interpretation that the “called party” refers to the current subscriber, and not the intended recipient.  Doing so, the court found, would leave callers strictly liable for all calls made to a reassigned number, even without knowledge of the reassignment.

  3. Revocation of consent

The FCC, in declining to unilaterally prescribe the exclusive means for consumers to revoke their consent, instead concluded that a called party may revoke consent at any time and through any reasonable means that clearly expresses a desire not to receive further messages.  In upholding the FCC’s approach to revocation, the court found that the FCC’s ruling absolves callers of any responsibility to adopt systems that would entail undue burdens, like training every retail employee on the “finer points of revocation.”  Under this approach, callers have every incentive to avoid TCPA liability by making clearly defined and easy-to-use opt-out methods available, which in turn renders a call recipient’s unconventional and idiosyncratic revocation requests unreasonable.  Finally, the court concluded that nothing in the 2015 Order “should be understood to speak to the parties’ ability to agree upon revocation procedures”—thereby leaving open the possibility of contractually specified revocation methods.

  4. Healthcare-related exemption

The final challenge concerned the scope of the FCC’s exemption of certain healthcare-related calls from the TCPA’s prior-consent requirement for calls to wireless numbers.  The exemption is limited to calls that have a healthcare treatment purpose, and excludes calls related to telemarketing, solicitation, or advertising.  The court rejected the argument that any partial exemption of healthcare-related communications is unlawful because HIPAA supersedes any TCPA prohibition, finding that the two statutes provide separate protections and that there is therefore no obstacle to complying with both.  Moreover, the court found that the FCC did not act arbitrarily in affording a narrower exemption for healthcare-related calls made to wireless numbers, since the TCPA itself contemplates different treatment for residential and wireless numbers.  Finally, the court rejected the argument that the FCC erred in failing to recognize that all healthcare-related calls satisfy the TCPA’s “emergency purposes” exception to the consent requirement, reasoning that it is implausible to conclude that calls related to telemarketing, solicitation, or advertising are made for emergency purposes.  The court therefore upheld the narrowly fashioned exemption for healthcare-related calls.

Without question, the long-awaited ruling will significantly impact TCPA compliance and litigation.  Stay tuned for additional analysis on the impact of the D.C. Circuit’s ruling.

U.S. Companies: Are You Ready for GDPR?

Posted in Data portability, EU Data Protection, Privacy, Regulation

On May 25, 2018, the General Data Protection Regulation (GDPR) goes into effect. Are you ready?

Who’s affected?  

Organizations, anywhere in the world, that process the personal data of European Union (EU) residents should pay attention to GDPR and its territorial scope.

If you collect personal data or behavioral information from someone in the EU (also referred to as a “data subject” in the GDPR), your company will be subject to the requirements of GDPR. The extended scope of GDPR will apply to your company even if:

  1. the processing of personal data takes place outside of the EU;
  2. no financial transaction takes place; or
  3. your company has no physical operations or employees in the EU.

The definition of “personal data” under GDPR is broader than the definition of “personally identifiable information” commonly used in U.S. information security and privacy laws.

Why should you care?

Failing to comply with GDPR may result in a maximum fine of €20 million or 4% of global annual turnover, whichever is higher.
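
To get a sense of scale, the “whichever is higher” rule means the cap grows with company size. Below is a minimal sketch of that arithmetic using hypothetical turnover figures; it is an illustration only, not legal advice.

```python
# Illustration only: the GDPR cap is the greater of EUR 20 million
# or 4% of worldwide annual turnover (figures below are hypothetical).
def gdpr_max_fine(annual_turnover_eur: float) -> float:
    return max(20_000_000, 0.04 * annual_turnover_eur)

print(gdpr_max_fine(400_000_000))    # 20000000   -> the EUR 20M floor applies
print(gdpr_max_fine(2_000_000_000))  # 80000000.0 -> 4% of turnover applies
```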

There are questions over how EU regulators will enforce these fines against companies outside of the EU. However, it would be ill-advised to underestimate the EU’s desire to create uniform data privacy laws for its market and the lengths to which regulators may go to accomplish this goal. Extraterritorial enforcement of GDPR through cooperation with authorities in non-EU countries is a real possibility.

The potential reputational damage that may result from noncompliance is also something organizations should consider. Non-EU companies, especially those with a strong online presence, should consider whether action is required now to avoid the possibility of unfavorable headlines down the line.

How can you mitigate risk?

  1. Conduct a data privacy audit. A data privacy audit should show you where data is located within your company and map the flows of that data. It should also map your current data processing activities against the rights that GDPR gives data subjects—for example, the right of data subjects to access their personal data and the right to be forgotten. The UK Information Commissioner’s Office has provided helpful guidance on data privacy audits, which can be accessed here.
  2. Put in place processes for deleting data. Among the GDPR’s core principles are data minimization and storage limitation: organizations must not keep personal data for longer than necessary, and data subjects have the right to request the deletion of the personal data that you hold about them (known as the “right to be forgotten”). If not already in place, you should establish processes for deleting personal data (i) on request and (ii) when its retention is no longer necessary (see the sketch following this list).
  3. Re-examine consent mechanisms. Many organizations rely on the consent of the data subject to comply with existing EU data protection laws on processing and storing personal data. If this is true of your organization, note that the requirements under GDPR for obtaining consent are more stringent. For example, GDPR makes clear that pre-checked opt-in boxes are not an indication of valid consent. If your current consent mechanisms, or the consents you already hold, do not meet GDPR’s standards, you should consider updating those mechanisms and seeking new consents that satisfy GDPR’s requirements.
  4. Appoint a data protection officer (DPO). You may be required to appoint a DPO if your core activities call for either (i) regular and systematic monitoring of data subjects on a large scale or (ii) large-scale processing of special categories of data.
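
As a rough illustration of items 2 and 3 above, the sketch below models a hypothetical consent record that is treated as valid only when the data subject took an affirmative action (no pre-checked boxes) and flags records whose retention period has lapsed as candidates for deletion. The field names and the retention rule are assumptions made for illustration, not structures mandated by GDPR.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

# Hypothetical consent record; field names are illustrative, not mandated by GDPR.
@dataclass
class ConsentRecord:
    subject_id: str
    purpose: str
    affirmative_action: bool   # True only if the subject actively opted in
    pre_checked_box: bool      # consent gathered via a pre-ticked box
    obtained_at: datetime

def consent_is_valid(record: ConsentRecord) -> bool:
    """Pre-checked opt-in boxes do not indicate valid consent under GDPR."""
    return record.affirmative_action and not record.pre_checked_box

def due_for_deletion(record: ConsentRecord, retention: timedelta,
                     now: Optional[datetime] = None) -> bool:
    """Flag data whose assumed retention period has lapsed (storage limitation)."""
    now = now or datetime.utcnow()
    return now - record.obtained_at > retention

# Example: a record gathered via a pre-ticked box is not valid consent,
# and data older than the assumed one-year retention period should be deleted.
record = ConsentRecord("subject-1", "newsletter", affirmative_action=False,
                       pre_checked_box=True, obtained_at=datetime(2017, 1, 1))
print(consent_is_valid(record))                       # False
print(due_for_deletion(record, timedelta(days=365)))  # True
```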

If you have any questions or concerns regarding GDPR compliance please email EUDataProtection@mcguirewoods.com.

Mixed Signals on the Future of SEC Cyber Enforcement

Posted in Cybersecurity, Regulation, Securities and Exchange Commission

As previously reported, the U.S. Securities and Exchange Commission (SEC) unanimously voted to approve additional guidance for reporting cybersecurity risks last month. However, it is unclear what, if any, impact the new guidance will have on the rate of SEC enforcement actions in the coming months.

According to a recent study by the NYU Pollack Center for Law & Business and Cornerstone Research, SEC enforcement actions declined significantly last year compared with fiscal year 2016. In fiscal year 2016, the SEC brought 92 enforcement actions against public companies and their subsidiaries. In fiscal year 2017, SEC enforcement declined by 33 percent, with the SEC filing 62 enforcement actions against public companies and their subsidiaries. Of those 62 enforcement actions, only 17 were filed in the second half of fiscal year 2017—the largest semiannual decrease within a fiscal year since the Securities Enforcement Empirical Database (SEED) began collecting data in 2010. Similarly, total monetary settlements declined from $1 billion in the first half of fiscal year 2017 to $196 million in the second half of the year.

The timing of the decline suggests that the Trump Administration may be reining in regulatory enforcement. However, despite the empirical slowdown, Stephanie Avakian and Steven Peikin, the co-directors of the SEC’s Division of Enforcement, deny that there has been any directive from the Trump Administration to slow the enforcement arm of the SEC. In fact, during the American Bar Association’s annual white collar conference, the co-directors cautioned that more enforcement actions—especially those related to cybersecurity—may be on the horizon. Indeed, the SEC’s new cybersecurity guidance, coupled with the creation of the SEC Cyber Unit at the end of fiscal year 2017, will give the SEC new tools to combat cyber-related misconduct in 2018.

Spokeo Strikes Down Another Data Privacy Class Action

Posted in Litigation

The Supreme Court’s decision in Spokeo, Inc. v. Robins continues to have an impact on class actions involving data privacy statutes. Most recently, a federal district court dismissed yet another class action involving claims under the Fair and Accurate Credit Transactions Act (FACTA) in Kirchein v. Pet Supermarket, Inc. for lack of subject matter jurisdiction under Spokeo, on the grounds that Kirchein did not establish the injury-in-fact necessary to maintain the case in federal court.

In January 2016, Kirchein filed a putative class action in the U.S. District Court for the Southern District of Florida, alleging violations of FACTA, which prohibits printing more than the last five digits of the card number, or the expiration date, on the receipt provided to the customer. FACTA provides a private right of action with statutory damages of $100 to $1,000 for willful violations. In August 2016, the court preliminarily approved a $580,000 class action settlement. In October 2017, however, the defendant moved to vacate the preliminary approval order and settlement and reopen the class on the grounds that the class was much larger than the parties anticipated. The Court denied the motion on those grounds, but gave the parties an opportunity to brief the issue of subject matter jurisdiction under Spokeo.
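
For illustration, FACTA’s truncation requirement can be thought of as a simple masking rule applied before a receipt is printed: show no more than the last five digits of the card number and omit the expiration date. The function below is a hypothetical sketch of that rule, not the statute’s text or any party’s actual system.

```python
def mask_for_receipt(card_number: str, visible_digits: int = 5) -> str:
    """Mask a card number so a printed receipt shows at most the last five digits.

    FACTA also prohibits printing the expiration date, so in this sketch the
    expiration date is simply never passed to the receipt template at all.
    """
    digits = [c for c in card_number if c.isdigit()]
    visible = min(visible_digits, 5)  # never reveal more than five digits
    return "*" * (len(digits) - visible) + "".join(digits[-visible:])

print(mask_for_receipt("4111 1111 1111 1111"))  # ***********11111
```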

After considering the parties’ briefing, the Court dismissed the case on February 8, 2018 for lack of subject matter jurisdiction, finding that the mere “disclosure of the first six digits of a credit card account number” did not result in an imminent, real risk of harm under Spokeo. In doing so, the Court relied heavily on its own September 2017 decision in a case alleging similar violations of FACTA. In that case, the Court held that merely printing the digits of the credit card number on a receipt was insufficient to establish standing when the plaintiff did not allege that any disclosure of his private information actually occurred. Similarly here, Kirchein failed to allege that anyone besides Kirchein himself actually saw the receipt. To the extent that Kirchein relied on store employees seeing the receipt, the Court was unconvinced, finding that to be the same type of disclosure that happens any time a consumer uses a credit card to pay for a transaction.

The Court also rejected Kirchein’s argument that the settlement was still enforceable, despite any lack of standing resulting from Spokeo. The Court noted that Spokeo was not a change in the law, but merely clarified well-established principles of standing, and emphasized that it must have subject matter jurisdiction at all stages of a case, including to approve a class action settlement agreement under Rule 23.

The decision joins those of the Second and Seventh Circuits, as well as several other district courts, which have dismissed FACTA claims for lack of standing under Spokeo. These cases continue to suggest that purely technical violations of data privacy statutes will not satisfy Article III’s injury-in-fact requirement after Spokeo. Instead, plaintiffs will need to show that a violation of the statute caused harm, likely by showing actual disclosure to a third party.

New York Cybersecurity Regulations: Additional Testing and Reporting Requirements Take Effect

Posted in Cybersecurity, Financial Services Information Management, Information Management, Regulation

The one-year transitional period under the New York Department of Financial Services (NYDFS) Cybersecurity Requirements for Financial Services Companies expired on March 1, 2018. Financial services companies that are regulated by NYDFS now face additional requirements for assessing, monitoring, testing and reporting on the integrity and security of their information systems and the overall effectiveness of their cybersecurity programs.

Overview of New York Cybersecurity Regulations

The NYDFS cybersecurity regulations became effective on March 1, 2017, and the initial 180-day transitional period expired on August 28, 2017. The regulations that took effect last year require all covered entities to implement a cybersecurity program that identifies and protects against cybersecurity risks and adopt comprehensive policies and procedures for the protection of the company’s information systems and nonpublic information. The cybersecurity regulations apply to any organization operating under or required to operate under a NYDFS license, registration, charter, certificate, permit, accreditation or similar authorization under the New York Banking Law, Insurance Law or Financial Services Law. Click here for more information about the requirements of the regulations that took effect last year.

Additional Actions Required to Achieve Compliance

On March 1, 2018, additional requirements under the cybersecurity regulations took effect. In addition to the requirements that took effect last year, covered entities that are subject to the cybersecurity regulations must now implement additional cybersecurity measures, including periodic risk assessments, annual penetration testing and bi-annual vulnerability assessments, multi-factor authentication, regular cybersecurity awareness training, and an annual report to the board of directors from the Chief Information Security Officer.

Recap of the 2018 FTC PrivacyCon

Posted in Consumer Privacy/FTC

On February 28, 2018, the Federal Trade Commission (FTC) hosted its third PrivacyCon conference in Washington, D.C., an event that highlights research and facilitates discussion of the latest research and trends related to consumer privacy and data security. The FTC welcomes privacy and data security researchers to inform it of their latest findings, and encourages the dialogue between researchers and policymakers to continue well after the conference. The 2018 conference was well attended by professionals in the data privacy field, who shared the results of their studies and research in data privacy.

The FTC’s Acting Chairman, Maureen K. Ohlhausen, delivered the opening remarks at PrivacyCon. Chairman Ohlhausen stated that the FTC has been and will continue to be active in the data privacy field and will continue to bring important cases. She emphasized that this year the FTC will focus on an “economic approach” to data privacy. Chairman Ohlhausen explained that this approach does not necessarily require crunching numbers; rather, it involves applying the tools of economic analysis to assess how much of the agency’s resources should be devoted to particular matters. Chairman Ohlhausen said that the FTC will try to better understand the types of injuries consumers suffer from a data breach and devote attention to data privacy cases involving greater injuries, some of which may be personal rather than economic.

Following Chairman Ohlhausen’s opening remarks, professors with technical backgrounds provided in-depth analysis of data privacy concerns pertaining to, among other things, email tracking, browser extensions, smart devices, web session recordings, social media advertising, interactive use, smart toys, and crowdsourcing. In short, the key takeaways from these studies are: (1) companies need greater transparency regarding voluntary and involuntary leaks of personal information to third parties, so that consumers can take greater measures to safeguard their personally identifiable information (PII); and (2) companies must balance the need to inform consumers about PII leaks against consumers’ desire not to be inundated with requests for permission before PII is disclosed.

With respect to the first point, the panelists identified different circumstances in which a consumer’s PII is shared with third parties, often without the consumer even being aware. For example, most consumers are not aware of how intrusive web browser extensions can be, that web sessions on certain sites are recorded and sold to third parties, or that children’s smart toys may be recording conversations and posting them on social media. The panelists emphasized that it is critical for companies to disclose to consumers that their PII is shared with the public or with third parties through these mechanisms so that consumers can make informed decisions about how to safeguard their privacy.

For the second point, the panelists described studies of consumers’ privacy expectations designed to determine the circumstances in which consumers want to give express permission before PII is disclosed and those in which they are comfortable providing implicit consent through predictive behavior and usage. The panelists found that when information is collected for a beneficial purpose (such as safety) or in a public setting, consumers are comfortable disclosing their PII without providing express consent. However, when the information is obtained in a private setting or is not collected for a beneficial purpose, consumers said that they did not want their PII disclosed unless they gave express consent. In short, the results of these studies indicate that consumers’ privacy expectations are content- and context-dependent.

In sum, the 2018 PrivacyCon opened up a great dialogue regarding consumer expectations for data privacy and highlighted the FTC’s focus this year on studying the types of injuries consumers can suffer from a data privacy breach.

New SEC Cybersecurity Guidance Outlines Disclosure Obligations

Posted in Data Security, Regulation, Securities and Exchange Commission

Last week, as previously reported, the U.S. Securities and Exchange Commission (SEC) unanimously voted to approve additional guidance for reporting cybersecurity risks. The release of this guidance underscores the SEC’s intent to prioritize cybersecurity compliance in 2018. The SEC may bring enforcement actions over boilerplate cybersecurity disclosures that are not specifically tailored to address unique industry challenges. Companies should review and amend current policies and procedures to ensure compliance with the updated guidance and mitigate the risk of regulatory enforcement action. This includes companies that are subject to material cybersecurity risks but have not yet suffered a cyber-attack.

Prior SEC Cybersecurity Initiatives

Historically, the SEC has focused its cybersecurity efforts on protecting consumer information by conducting thorough risk assessments and evaluating vulnerabilities. For example, since 2014, the Office of Compliance Inspections and Examinations (OCIE) has made cybersecurity a top priority by reviewing the effectiveness of various cybersecurity programs. In 2015, the SEC announced enforcement actions against companies for lax cybersecurity policies that failed to safeguard consumer information. And in 2017, during the WannaCry ransomware attack, the SEC issued an alert to broker-dealers, investment advisers, and investment companies reminding them to address cybersecurity risks. Similarly, the Financial Industry Regulatory Authority (FINRA) continues to treat cybersecurity as a top priority and recently, through its examination findings report, detailed effective cybersecurity program practices.

Cybersecurity Policies and Procedures

The release of updated guidance makes it clear that going forward the SEC will more closely examine cybersecurity risk disclosure policies and procedures and bring action against those companies that fail to comply with the guidance. In addition to expanding upon topics from the 2011 guidance, such as associated costs and the likelihood of litigation, the 2018 guidance addresses two new areas: (1) cybersecurity policies and procedures and (2) cybersecurity insider trading prohibitions. The guidance emphasizes the importance of establishing policies and procedures that manage the disclosure of “material cybersecurity risks and incidents in a timely fashion.”

The guidance states that when determining disclosure obligations, companies should avoid “generic cybersecurity-related disclosures” and consider:

  1. the potential materiality of any identified risk;
  2. the importance of any compromised information; and
  3. the impact of the incident on the company’s operations.

In order to determine the “materiality” of a cybersecurity risk, companies should analyze:

  1. the nature, extent, and potential magnitude of the risk; and
  2. the potential harm that could occur, including reputational harm, financial challenges, damage to customer and vendor relationships, and possible litigation or regulatory actions.

Insider Trading

Although the SEC did not mention any specific data incidents, recent breaches likely played a part in the decision to issue new guidance. The SEC used the new guidance as a reminder that companies should adopt policies and procedures to prevent corporate insiders from trading on material nonpublic information regarding a cyber incident before public disclosure of the incident. This is not the first time the SEC has scrutinized this type of insider trading. In 2015, the SEC announced a $30 million settlement with Ukrainian-based Jaspen Capital Partners Limited and CEO Andriy Supranonok over allegations that they profited by trading on nonpublic corporate news releases that were hacked from newswire services. The 2018 guidance continues this focus, stating that when there is “selective disclosure of material nonpublic information related to cybersecurity,” companies must ensure the material information is disclosed to all investors at the same time, consistent with Regulation FD. The guidance goes on to state that companies should also avoid even the appearance of improper trading that may occur “during the period following an incident and prior to the dissemination of disclosure.”

SEC Cybersecurity Certification

In addition to insider trading, the 2018 guidance states that disclosure controls and procedures should ensure that relevant cybersecurity risk and incident information is reported to management so that management can make the required certifications and disclosure decisions. The inclusion of this concept is unsurprising given the 2014 speech by SEC Commissioner Luis A. Aguilar, in which he said that “ . . . ensuring the adequacy of a company’s cybersecurity measures needs to be a critical part of a board of director’s risk oversight responsibilities.” The 2018 guidance expands on that point and specifically references the disclosure certifications that executive management should consider when assessing the adequacy of procedures for identifying cybersecurity risks. For example, certifications made pursuant to Exchange Act Rules 13a-14 and 15d-14, as well as Item 307 of Regulation S-K and Item 15(a) of Exchange Act Form 20-F, are made on a quarterly and annual basis by upper management and require certification regarding the design and effectiveness of disclosure controls and procedures. When certifying cybersecurity effectiveness pursuant to these provisions, the guidance states that certifications and disclosures should consider:

  1. if there are sufficient controls and procedures for identifying cybersecurity risks and incidents;
  2. if there are sufficient controls and procedures for assessing and analyzing the impact of the incidents; and
  3. if cybersecurity risks or incidents threaten “a company’s ability to record, process, summarize, and report” required information, then management should determine if “there are deficiencies in disclosure controls and procedures that would render them ineffective.”

As the number of cyber-attacks has increased, so has the SEC’s interest in comprehensively regulating cyber risks. Even if your company has suffered only a small attack that does not meet the criteria for materiality, the incident may still need to be reported to the SEC because the company may be a target for high-profile hackers or state agents. Further, if your company suffers a cyber-attack of any size, the guidance states that you may need to “refresh” previous disclosures during the process of investigating a cybersecurity incident or past events. It goes on to provide that “past incidents involving suppliers, customers, competitors, and others may be relevant when crafting risk factor disclosure.” But even if your company has not suffered a cyber-attack, the SEC expects that your company has adopted and implemented written cybersecurity policies and procedures that protect consumer information, limit insider trading and properly manage cybersecurity risk disclosure.

As noted in our previous post, in contrast to the Democratic commissioners, Chairman Jay Clayton stated that he believes the guidance will “promote clearer and more robust disclosure” and that he “urge[s] public companies to examine their controls and procedures.” For example, when disclosing significant risk factors pursuant to Regulation S-K and Form 20-F, the guidance suggests that companies should consider the following:

  1. the occurrence of prior cybersecurity incidents, including severity and frequency;
  2. the adequacy of preventative actions taken to reduce cybersecurity risks and the associated costs;
  3. the costs associated with maintaining cybersecurity protections; and
  4. existing or pending laws and regulations that may affect the requirements.

While the guidance does not specifically propose new cybersecurity regulations, it does provide a new focus for the agency as well as additional detail regarding previously articulated issues. Company counsel and executive management should closely examine their disclosures, as well as their overall cybersecurity risk disclosure policies and procedures, to determine if they are compliant with this new SEC guidance.

Employers Beware: Uptick in Privacy Litigation for Collection of Biometric Data

Posted in Data retention, Data Security

The increasingly popular use of biometric authentication technology by employers as a means of tracking employee data, including for timekeeping purposes, can create liability.  Biometric data generally consists of measurements of an individual’s physical characteristics, along with the associated technology used to collect and aggregate this data. Biometric data can include fingerprints, DNA, voiceprints or data derived from facial recognition technology. This futuristic means of tracking individuals has its benefits for employee time management (e.g., in lieu of traditional punch cards), for providing access to a secure facility, and for other authentication purposes. But it also has its pitfalls.

Several states have proposed or enacted legislation protecting individuals’ privacy rights in the collection of their biometric data.  Illinois led the pack by enacting the Illinois Biometric Information Privacy Act (“BIPA”) in 2008, which requires businesses that collect biometric data to: (1) provide written notice to the individual of the collection; (2) inform the individual of the length of time for which the biometric identifiers will be collected, stored, and used; and (3) obtain express, written consent from the individual prior to collection.  Employers and other private entities must also exercise a reasonable standard of care in handling biometric data.

BIPA creates a private right of action for individuals aggrieved by a statutory violation, and violations can create substantial exposure to an employer, including liquidated damages, attorneys’ fees, costs, and/or injunctive relief.  Since enactment of BIPA, similar legislation has been either enacted or proposed in other states including Texas, Alaska, Connecticut, Montana, New Hampshire, and Washington.

The privacy litigation landscape – particularly in the employment context – has already evolved as a result of these laws designed to protect biometric information, with an uptick in litigation between 2015 and 2017.  In one example, in October 2017, an Illinois rehabilitation center, Paramount of Oak Park Rehabilitation & Nursing Center LLC, was hit with a lawsuit alleging BIPA violations for requiring employees to scan their fingerprints twice daily as a means of clocking in and out.  The complaint, filed in Cook County, Illinois, calls this practice “invasive” and states:

Unlike a Social Security number, which can be changed, no amount of time or money can compensate [workers] if their fingerprints are compromised by the lax procedures through which defendants capture, collect, store and use their workers’ biometrics.

Notably, this and other lawsuits addressing the issue do not necessarily challenge the use of the data.  Instead, employers are facing liability at the outset for the mere collection of this data when it is not done in compliance with statutory requirements.  Over 30 similar class action lawsuits have been filed in federal and state courts.

Case law under BIPA and similar statutes is still developing, and employers should keep a watchful eye on trends in courts’ treatment of biometric data protections, restrictions, and requirements in order to ensure compliance. In the interim, and because the cost of noncompliance is substantial, employers should be cautious in their approach to collecting, using and storing their employees’ biometric data.  Specifically, employers should:

  • Draft a written policy regarding the collection and use of biometric data, including the company’s processes for safeguarding the information and for destroying the data, consistent with state law. Employers should consider including a discrimination disclaimer in the policy, which should be disseminated widely, and review of the policy should be made an onboarding and training requirement.
  • Obtain express written consent and a release from each employee before collecting or using their biometric data (see the sketch following this list).
  • Implement a data breach response protocol that includes biometric data and provide notice to employees that a protocol exists.
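
As a rough sketch of the consent recommendation above, a timekeeping system could refuse to enroll a fingerprint template unless notice has been given, a signed written release is on file, and a destruction date has been scheduled. The record structure and function names here are hypothetical illustrations, not BIPA’s statutory language or any vendor’s actual system.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

# Hypothetical employee consent record; fields are illustrative only.
@dataclass
class BiometricConsent:
    employee_id: str
    notice_provided: bool             # written notice of collection given
    written_release_signed: bool      # express written consent and release obtained
    destruction_date: Optional[date]  # scheduled destruction per retention policy

def may_enroll_fingerprint(consent: Optional[BiometricConsent]) -> bool:
    """Enroll biometric data only after notice, written consent and a retention plan exist."""
    return (consent is not None
            and consent.notice_provided
            and consent.written_release_signed
            and consent.destruction_date is not None)

print(may_enroll_fingerprint(None))  # False: nothing on file, so do not collect
```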

Cybersecurity: FINRA Guidance through 2018 Priorities and Recent Exam Findings

Posted in Cybersecurity, Financial Services Information Management, Information Management, Notification, Privacy

The Financial Industry Regulatory Authority (FINRA) is ramping up its commitment to assist the industry in its cybersecurity compliance efforts. Recent guidance to the industry from FINRA includes:

  1. an Examination Findings Report, detailing observations from recent broker-dealer examinations with the goal of assisting broker-dealers in enhancing their compliance programs and better anticipating potential areas of concern (FINRA included compliance areas to highlight based on the frequency of deficiencies and the potential impact on investors and markets); and
  2. the 2018 Regulatory and Examination Priorities, in which, notably, FINRA instructed firms to review the priorities in conjunction with the Examination Findings Report.

In its Examination Findings Report, FINRA called out cybersecurity as one of the “principal operational risks facing broker-dealers.” While acknowledging today’s increased threats, FINRA noted that firms have generally increased their focus on cybersecurity issues and that some firms examined are at the forefront of developing “cutting-edge cybersecurity programs.”

FINRA detailed the areas in which, during its examinations, it observed firms’ cybersecurity programs to be either effective or deficient. Reviewing the positives and negatives provides valuable information for firms looking to shore up their cybersecurity programs.

Examples of Effective Practices Include:

  • Escalation Protocols: Have an escalation process that keeps the appropriate level of firm management apprised of issues so that they receive attention and are resolved.
  • Plans to Resolve Issues: Implement detailed resolution steps and time frames for completion.
  • Routine Risk Assessments: Conduct regular risk assessments, including vulnerability and penetration tests.
  • Routine Training: Conduct training for firm employees, including training tailored to different functions, in addition to generic cross-firm training.
  • Branch Office Reviews: Include cybersecurity focused branch exams to assess risks and identify issues.
  • Additional Practices: Implement security information and event management practices, use system usage analytics, and adopt data loss prevention tools.

Examples of Deficient Practices Include: 

  • Failure to Follow Access Management Steps:
    • Not immediately terminating access of departing employees.
    • Failing to have processes to monitor or supervise “privileged users” to identify unusual activity (e.g., assigning extra access rights, unauthorized work outside business hours, or logging in from different geographical locations at or about the same time; see the sketch following this list).
  • Infrequent or No Risk Assessments:
    • No formal risk assessment practices.
    • Unable to identify critical assets or potential risks.
  • Informal Processes for or Lack of Vendor Management:
    • Failed to have formal processes to assess a vendor’s cybersecurity preparedness.
    • Failed to include required notification of breaches involving customer information in vendor contracts.
  • Noncompliant Branch Offices:
    • Failed to manage passwords.
    • Failed to implement security patches and software updates.
    • Failed to update anti-virus software.
    • Lacked control of employee use of removable storage devices.
    • Used unencrypted data and devices.
    • Failed to report incidents.
  • Segregation of Duties:
    • Failed to segregate duties for requesting, implementing, and approving cyber-security rules and systems changes.
  • Data Loss Prevention:
    • Lack of rules to ensure all customer sensitive information is covered.
    • Permitted or failed to block large file transfers to outside or untrusted recipients.
    • Failed to implement formal change-management processes for data loss prevention systems changes.
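
As one illustration of the privileged-user monitoring FINRA found lacking, a firm might flag logins by the same account from widely separated locations within a short window. The log format, location strings and threshold below are assumptions made for illustration, not FINRA requirements or any firm’s actual tooling.

```python
from datetime import datetime, timedelta

# Hypothetical login events: (username, login time, coarse geolocation).
logins = [
    ("admin_jane", datetime(2018, 3, 1, 9, 0),  "New York, US"),
    ("admin_jane", datetime(2018, 3, 1, 9, 20), "Kyiv, UA"),
]

def flag_concurrent_geographies(events, window=timedelta(hours=1)):
    """Flag the same privileged account logging in from different locations
    at or about the same time -- one of the red flags FINRA describes."""
    alerts = []
    for i, (user_a, time_a, loc_a) in enumerate(events):
        for user_b, time_b, loc_b in events[i + 1:]:
            if user_a == user_b and loc_a != loc_b and abs(time_a - time_b) <= window:
                alerts.append((user_a, loc_a, loc_b, time_a, time_b))
    return alerts

# Flags admin_jane: New York and Kyiv logins only 20 minutes apart.
print(flag_concurrent_geographies(logins))
```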

FINRA’s 2018 Regulatory and Examination Priorities letter also includes cybersecurity as a priority area. In addition to the areas noted above, which FINRA also calls out in the priorities letter, FINRA noted two additional themes.  First, FINRA will evaluate the effectiveness of firms’ cybersecurity programs in protecting sensitive information. Second, FINRA reminds firms that they need policies and procedures to determine when a Suspicious Activity Report should be filed regarding a cybersecurity event. (See FinCEN’s Advisory to Financial Institutions on Cyber-Events and Cyber-Enabled Crime, Oct. 25, 2016.)


FINRA reminds firms that, while exam deficiencies must be addressed, firms often benefit from “proactively” remediating issues before the exam is completed. Acting proactively strengthens firms’ programs and enhances regulatory protections. Our observation, as outside counsel, is that when firms take proactive steps to get ahead of issues, it demonstrates to the regulators that the firm has a commitment to a strong compliance program and, in the right circumstances, may have a material impact on how FINRA decides to resolve an issue.

The information FINRA provides in the Examination Findings Report and the priorities letter offers a roadmap for enhancing overall compliance, supervisory, and risk management programs. With regard to cybersecurity in particular, firms can use these resources to prepare effectively for examinations, close program gaps and potentially avoid cybersecurity incidents.

France: Pragmatism and Flexibility for the GDPR Implementation

Posted in Data Protection and Competition, EU Data Protection, Legislation

The GDPR (General Data Protection Regulation) will apply as of May 25, 2018. The high level of penalties under the GDPR will be one of the core issues for companies, particularly because the GDPR is grounded in the European fundamental rights to privacy and data protection and can apply outside the European Union.

In order to reassure companies, and as a first step, the French Data Protection Authority (DPA), the CNIL, announced that the application of the GDPR in France will be flexible. This announcement was made on the CNIL’s website on Monday, February 19, 2018.  The CNIL also assured companies that it will provide assistance during the first months after the GDPR enters into application. To that end, the CNIL will publish an accompanying information guide (co-edited with the French public investment bank) to help companies.

Finally, the CNIL assured companies that it will not automatically sanction every company that is not yet compliant with the GDPR. Its approach will be pragmatic, distinguishing between fundamental principles that already exist under current law and new requirements that call for adjustments within companies.

The existing principles for which there will be no flexibility or tolerance include, for example, the obligation to process data in a lawful, fair and transparent manner, the obligation to collect data for an explicit and legitimate purpose, the principles of accuracy and data retention, and the principle of ensuring appropriate security when processing data. For these principles, the CNIL will audit companies and apply GDPR sanctions as of May 25, 2018. The CNIL announced that it will rigorously verify company compliance with these principles.

For new principles, however, such as the right to data portability, the requirement to appoint a Data Protection Officer (DPO) and the requirement to maintain a record of processing activities, the goal of the first inspections will be to assist companies and help them understand and implement these new principles. The French DPA does not intend to impose sanctions immediately for every infringement. Indeed, if a company acts in good faith and cooperates with the CNIL, these inspections will not lead to sanction proceedings.

At this time, this tolerance applies only to 2018.

The CNIL emphasized that the GDPR will eliminate the duty to notify the national DPA of processing activities. These notifications will be replaced by the record of processing activities and, where processing is likely to result in a high risk, by a Data Protection Impact Assessment (DPIA).

Similarly, and as a first step, there will be some tolerance regarding DPIAs for processing that is already under way. This tolerance will be time-limited: because the GDPR requires risks to be reassessed on an ongoing basis, such DPIAs should be carried out within a reasonable period of three years.

A few days before this statement, the French National Assembly adopted the draft law on personal data protection, which is intended to apply as of May 25, 2018.