
Password Protected

Data Privacy & Security News and Trends

Abandoned Mines and Data Retention Policies: Time to Clear the Explosives Out of the Shed

Posted in Data retention

Visit the remains of a nineteenth-century gold mine and you may find, far away from the main camp, a lonely little shack. It may be surrounded by trees, or tucked into the crook of a hillside, but it has a distinctly unwelcoming demeanor, as if the building itself wishes to be left alone.  This is the explosives shed, and when the mine was operational, it would have been marked by signs with large letters, secured with a lock, and off limits to all but a few personnel.  The need to secure this essential component of the business was obvious; the shed’s contents could shut down the mining operation with a single misstep.  Equally obvious is that the shed now stands empty, because removing old explosives is the only 100% effective way to prevent a future disaster.

There is a modern equivalent lurking in the data storage facilities of nearly every business. And, while a misstep in the storage of confidential data may not cause immediate physical harm, it still has the ability to shut down the business.  As a result, modern companies spend substantial sums ensuring that confidential records are isolated, secured, and generally off limits to all but a few personnel.

In the past two decades, as electronic means of communication supplanted physical ones, lawyers and courts became more attuned to the need, first, to retain and not destroy records and, second, to secure retained company records from unauthorized access and disclosure.  With the 2018 application (in the European Union) of the General Data Protection Regulation (GDPR) on the near horizon, companies should consider whether in the United States there is not merely a duty to take reasonable precautions to secure data, but an affirmative duty to destroy data as well – to clear the explosives out of the shed, so to speak.

The GDPR contains express obligations to destroy certain forms of data (see, e.g., Preamble 39, 65-66; Articles 5(1)(e), 17) (sometimes called “the right to be forgotten”).  As Article 17 provides, a company “shall have the obligation to erase personal data without undue delay where…the personal data are no longer necessary in relation to the purposes for which they were collected.” As of today, there is no U.S. statutory analog to the GDPR.  But a U.S. company may nevertheless have an obligation to destroy domestic records.  Such a duty may arise from contract, from discrete existing state or federal laws, or, importantly, from the need to ensure an adequate defense to unfair business practice and consumer protection claims.

In Part Two of this post, we will examine the affirmative legal obligations to destroy data under U.S. law.

Business as Usual: UK’s New Data Protection Bill and the GDPR

Posted in EU Data Protection, Legislation, Privacy

The UK Government will introduce a new Data Protection Bill (the “Bill”) this year. As highlighted in the Queen’s speech back in June, the Government has committed to introduce the new law and, on Monday, published a Statement of Intent.

The Bill will not change the position that the EU’s new data protection legislation – the General Data Protection Regulation (GDPR) – will bring when it comes into force on 25 May 2018. The UK will still be in the EU at that time, so the GDPR will apply automatically in the UK and replace the UK’s current Data Protection Act. However, when the UK leaves the EU and is no longer subject to the GDPR, the Bill will then implement the GDPR into English law. The importance of this is two-fold: it will support the UK’s position with regard to preserving personal data flows between the UK, the EU and other countries around the world, and it will give UK businesses clarity about their data protection obligations following Brexit.

The Bill will also introduce the national member state derogations that are permitted under the GDPR. The Government asked for feedback (a Call for Views) on how the UK should deal with these exemptions earlier this year. The Statement of Intent provides some detail on the Government’s proposed approach, which includes:

  • Enabling children aged 13 years or older to consent to their personal data being processed (under the GDPR the age for valid consent is 16 unless member states reduce this through national law);
  • Maintaining the UK’s position on processing personal data relating to criminal convictions and other sensitive personal data (enabling employers to carry out criminal background checks in certain circumstances);
  • Enabling organisations to carry out automated decision making for certain legitimate functions (e.g. credit reference checks);
  • Maintaining the UK’s current position with regard to the processing of personal data in relation to freedom of expression in the media, research and archiving.

Two new criminal offences will also be created: intentionally or recklessly re-identifying individuals from anonymised or pseudonymised data, and altering records with intent to prevent disclosure following a subject access request. Both offences will be subject to an unlimited fine.

The Bill will also implement the EU’s new Data Protection Law Enforcement Directive (DPLED) in English law. The DPLED sits alongside the GDPR and deals with processing of personal data by the police, prosecutors and other agencies involved in law enforcement. However, unlike the GDPR, the DPLED is an EU Directive (not a Regulation) and so must be implemented into member state law through national legislation by 6 May 2018.

The draft text of the Bill is due to be published and put before Parliament in early September. The Bill will be largely identical in effect to the GDPR. In light of the increased fines imposed by the GDPR (up to €20,000,000 (£17,000,000) or 4 per cent of an organisation’s global annual turnover, whichever is higher), companies should continue their GDPR compliance efforts to ensure adherence to the new law by 25 May 2018.
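The GDPR’s two-tier cap is, in effect, a simple formula. A minimal sketch in Python, using only the figures quoted above (the function name and example turnover are illustrative, not drawn from the legislation):

```python
def max_gdpr_fine_eur(global_annual_turnover_eur: float) -> float:
    """Upper tier of GDPR administrative fines: the greater of
    EUR 20 million or 4% of global annual turnover."""
    return max(20_000_000, 0.04 * global_annual_turnover_eur)

# For a company with EUR 1 billion in turnover, the 4% prong governs:
# the cap is EUR 40 million, not the EUR 20 million floor.
print(max_gdpr_fine_eur(1_000_000_000))  # 40000000.0
```

The point of the “whichever is higher” structure is that the €20 million floor cannot be escaped by a small turnover, while large organisations face a cap that scales with their size.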

Is It Too Late for a Uniform State Approach to Data Breach Notification?

Posted in Data breach, Legislation, Notification

Privacy professionals have long lamented the myriad approaches states take when it comes to data breach notification requirements. According to the National Conference of State Legislatures, 48 states, the District of Columbia, Guam, Puerto Rico and the Virgin Islands have enacted legislation requiring covered companies to make certain notifications to affected consumers and specified regulators when a security breach of personally identifiable information occurs.

When so many states have “left the barn,” can they be corralled into one consistent regulatory scheme? That is the question the Uniform Law Commission plans to delve into over the coming year.  At its recent annual meeting in San Diego, California, the executive committee of the Commission approved a study committee to assess whether it is desirable for the Commission to draft a uniform act governing data breach notification requirements.

What is the ULC?

Created in 1892, the Uniform Law Commission develops and drafts uniform legislation for consideration by state legislatures. Past work of the Commission has produced legislation such as the Uniform Commercial Code and the Uniform Electronic Transactions Act.  The Commission is comprised of several hundred judges, law professors, legislative staff, legislators and attorneys in private practice (“Commissioners”).  Each state, the District of Columbia, Puerto Rico and the U.S. Virgin Islands appoint Commissioners.  I serve as a Commissioner for Virginia and attended the annual meeting in San Diego.

Charge of the Study Committee

The study committee will evaluate “the need for and feasibility of state legislation on data breach notification including consideration of what sorts of personal information should be protected; to whom, when and how notice should be provided and the contents of the notice.” At this time the committee is not authorized “to consider remedies for injury caused by a data breach.”

Why Now?

The Commission’s Scope and Program Committee acknowledged that 48 states have enacted some type of breach notification statute. Often the Commission will not propose a uniform law if a significant number of states have already enacted laws on the subject matter.  Given the States’ varied approaches, the lack of uniformity and the emerging importance of privacy issues, the committee determined that there was value in assessing whether a uniform approach might be desirable and attainable, notwithstanding that virtually every state has acted in this area.

What Happens Next?

Within the next two months, the President of the Commission will appoint members of the study committee. The committee will begin its work and report its findings to the Scope and Program Committee of the Commission.  If the study committee decides that uniformity in state law is desirable in this area, it may recommend that the Commission’s Executive Committee authorize the study committee to begin the process of drafting a proposed uniform act that would eventually be provided to the States for consideration and adoption.

It is anticipated that the study committee will seek the input of the National Association of Attorneys General (“NAAG”) on this project. One of the rationales put forth for the Commission to undertake this project was the potential to work jointly with NAAG.  Support of state attorneys general will be important to the overall success of this project.  The study committee will solicit input from a broad array of stakeholders over the next year.

Implications for the Privacy Professional?

The time period for study and then possible drafting of a uniform law may take anywhere from one to three years due to the process followed by the Commission. The study period usually takes a year prior to determining whether to proceed to a drafting committee.  A proposed uniform act is usually debated, revised and further considered for a minimum of two years before it is finalized and sent to the States for consideration.  At that point, the Commissioners of each state are asked to seek introduction of the legislation in their state and to advocate for its passage.

While it is too early to gauge the success of a uniform law on this subject at state legislatures, it is fair to say that the prospects for success will be significantly impacted by whether state attorneys general are onboard with any proposed changes. Consumer protection is a primary focus of every state attorney general and any change to their authority without their support will impact the likelihood of widespread adoption of a uniform approach to this topic.

Short term, this development does not impact a company’s response to a data breach. Long term, if the effort is successful, it could lower compliance costs when a breach occurs and notification is required.  If there is any possibility of divining a path to a uniform approach, the Commission appears to be the body with the track record to lead the States to that end.


Computer Viruses Have Evolved: Have Your Antivirus Contract Warranties Kept Up?

Posted in Cybersecurity, Data breach, Data Protection and Competition

By many accounts, 2017 is the 35th anniversary of widely propagating computer viruses. The recent “WannaCry” and “NotPetya” ransomware outbreaks demonstrate that computer viruses (or more broadly, “malware”) are still evolving, developing, and posing new threats. But IT contracts don’t move at the same pace. Contract provisions that address computer virus risk have become commonplace in form contracts for software and cloud computing, and in the long list of representations and warranties applicable to M&A transactions, but those provisions have evolved little since their introduction.

Standard contract language may not meet current needs. It is time to review, update and revise contract considerations for computer malware.

Although the concepts behind computer viruses can be traced back decades earlier, it was a 1982 event that is generally credited as starting today’s line of malware. That year, a 15-year-old from Pittsburgh, Pennsylvania launched the “Elk Cloner” virus, which propagated through the sharing of floppy disks on Apple II computers. Since then, the means of propagating viruses has expanded to exploit modern network computer systems and the effects have evolved from simple prank messages to become a threat to daily business operations, information security and even business continuity.

“WannaCry” and “NotPetya” are not unique, but they are powerful examples of several of the ways in which malware has evolved. Both of these viruses utilized a flaw in an operating system sub-component that provided a path past antivirus defenses. Importantly, that vulnerability had been detected and a corrected version made available, so up-to-date, patched computers were well defended; but so many computers had not been patched that the outbreak was still quite damaging.

There are many reasons why a particular computer system might not have been patched. While it could be that there was an oversight, there are also many reasons for a failure to patch that are independent of any negligence or malfeasance of the owner of the computer:

  1. Unsupported software.  It may be that the vulnerable software was contained in an older version of a third-party system (for example, a computer operating system) that the vendor stopped supporting.  While the vendor might provide a fix for the latest version, older versions would remain vulnerable. In the WannaCry case, a key software provider did break its own protocol and provide an update for the old software that remained in use, but that was an exceedingly rare occurrence.
  2. Compatibility with unsupported software. Closely related to the issue above, it often happens that a company builds additional software that is dependent on a particular version of an operating system.  The company would then be unable to update the operating system without having to re-engineer their custom software.  This leaves the combined system exposed to the vulnerability in the older underlying operating system.
  3. Embedded software unable to update.  Software may be built into industrial or medical equipment (or “internet of things” devices) that is simply not designed to receive updates promptly. In the WannaCry case, expensive hospital equipment, such as MRI scanners, was afflicted, likely for this reason.
  4. Unmanaged equipment. In certain equipment, even if there is a technical mechanism by which software updates can be made, the equipment may be managed by non-IT staff so that knowledge of the requirement and the skills to carry out the update may not be brought to bear.
  5. Lack of resources. Even where none of the above concerns apply, it can often be the case that the necessary resources are simply not available to an organization, or that an organization does not devote the resources needed to carry out security updates in a timely manner.
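A practical first step toward surfacing the patching gaps listed above is an inventory audit that compares each system against vendor end-of-support dates and the date it was last updated. A minimal sketch, in Python; the inventory records, system names, and dates are hypothetical, for illustration only:

```python
from datetime import date

# Hypothetical inventory: (system, software, vendor end-of-support date,
# date the last security update was applied)
inventory = [
    ("billing-server", "OS v7",  date(2020, 1, 14),  date(2017, 1, 1)),
    ("mri-console",    "OS v5",  date(2014, 7, 8),   date(2013, 3, 2)),
    ("hr-laptop-12",   "OS v10", date(2025, 10, 14), date(2017, 7, 1)),
]

def audit(inventory, today, max_patch_age_days=90):
    """Flag systems running unsupported software or overdue for updates --
    the two exposures that standard antivirus warranties rarely cover."""
    findings = []
    for system, software, end_of_support, last_patched in inventory:
        if end_of_support < today:
            findings.append((system, "unsupported software"))
        elif (today - last_patched).days > max_patch_age_days:
            findings.append((system, "security updates overdue"))
    return findings

for system, issue in audit(inventory, today=date(2017, 8, 1)):
    print(f"{system}: {issue}")
```

An audit of this kind also maps naturally onto contract language: each finding category corresponds to a representation a counterparty could be asked to make.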

The failure to patch and the occurrence of these other factors are not always addressed by the common sort of antivirus warranty seen in many IT and transactional contracts. Common contract language focuses on the status of IT systems (that is, whether or not a virus is present) and on whether steps are taken to avoid the introduction of viruses. It generally does not address the resilience of a system to withstand the introduction of a virus or other malware.  It is also unusual for the scope of the language to encompass all the smart devices in an enterprise.

To meet these broader concerns, it is time to update typical anti-malware language to address the risks of unsupported software and software known to be vulnerable, both within and outside of the IT department. Here is a checklist of topics to cover in modernized antivirus warranties:

  1. Absence of any systems that are dependent on software that no longer has appropriate security updates available.
  2. Absence of any systems that are engineered to depend on software such that future security updates are unable to be applied.
  3. Processes in place and carried out to apply all necessary software updates.
  4. Scope of warranty expanded to include all systems that may be vulnerable, whether or not they are an IT component.

DOJ Takes Down AlphaBay, the World’s Largest Dark Web Marketplace

Posted in Cybersecurity, Identity Theft

The U.S. Department of Justice has announced the seizure of AlphaBay, the largest criminal marketplace on the Internet, which was used to sell stolen financial information, identification documents and other personal data, computer hacking tools, drugs, firearms, and a vast number of other illegal goods and services throughout the world.

AlphaBay was the largest dark web market, with estimated annual sales of hundreds of millions of dollars, which made it nearly ten times the size of the infamous Silk Road dark web marketplace that was shut down by the government in 2013. AlphaBay operated as a hidden service on The Onion Router (Tor) network, which hid the locations of its underlying servers and the identities of its administrators, moderators, and users.  Its user interface was configured like a conventional e-commerce website, where vendors could sell illegal goods or services in exchange for paying a percentage of the transaction as a commission to AlphaBay.

AlphaBay had a dedicated section of the website where users could purchase stolen credit cards and financial information, as well as stolen personal identifying information (PII) – even offering specific search controls to allow potential buyers to search the listings by location (city, state and country), Social Security number, birth year, credit limit, PIN, seller, seller rating, price, and more.

The international operation to seize AlphaBay’s infrastructure was led by the United States and involved cooperation with law enforcement authorities in Thailand, the Netherlands, Lithuania, Canada, the United Kingdom, and France, as well as the European law enforcement agency Europol. On July 5, Alexandre Cazes, a Canadian citizen residing in Thailand, was arrested by Thai authorities on behalf of the United States for his alleged role as the creator and administrator of AlphaBay.  On July 12, Cazes apparently took his own life while in custody in Thailand.

The Federal Bureau of Investigation (FBI) and the Drug Enforcement Administration (DEA) have seized millions of dollars’ worth of cryptocurrencies that represent the proceeds of AlphaBay’s illegal activities, including at least 1,943 Bitcoin, 8,669 Ethereum, 3,691 Zcash, and 11,993 Monero. Cazes and his wife had also amassed numerous other high value assets, including luxury vehicles, residences and a hotel in Thailand.

Prior to its takedown, there were over 250,000 listings for illegal drugs and toxic chemicals on AlphaBay, and over 100,000 listings for stolen and fraudulent identification documents and access devices, counterfeit goods, malware and other computer hacking tools, firearms and fraudulent services. Comparatively, the Silk Road dark web marketplace reportedly had approximately 14,000 listings for illicit goods and services at the time of seizure in 2013 and was the largest dark web marketplace at the time. These numbers indicate that the use of dark web marketplaces for illegal commerce will only continue to grow, despite the closure of AlphaBay.

In his public remarks regarding the seizure of AlphaBay, Attorney General Jeff Sessions stated, “This is likely one of the most important criminal cases of the year. Make no mistake, the forces of law and justice face a new challenge from the criminals and transnational criminal organizations who think they can commit their crimes with impunity by ‘going dark.’ This case, pursued by dedicated agents and prosecutors, says you are not safe.  You cannot hide. We will find you, dismantle your organization and network.  And we will prosecute you.”

New Guidance Issued by EU Data Protection Regulators – Does Your Organization Use Social Media During Recruitment?

Posted in EU Data Protection, Privacy, Regulation

The Article 29 Data Protection Working Party (comprising representatives from the data protection regulators in each EU Member State, the European Data Protection Supervisor and the European Commission) has issued an opinion on data processing at work (2/2017) (the Opinion).  The Opinion is not legally binding but it does provide an indication as to how EU data protection regulators will consider and interpret EU data protection law.  The new EU data protection law (the General Data Protection Regulation – or the GDPR) comes into force on 25 May 2018 and will impose significant fines on non-compliant organizations (up to 4% of annual worldwide turnover or €20 million, whichever is higher) in addition to giving individuals more rights with regard to their personal data.  The GDPR does not only apply to EU companies, but can also apply to non-EU based organizations processing EU citizens’ personal data.

The Opinion notes that in light of the increasing amount of personal data that is being processed in the context of an employment relationship, the balance between the legitimate interests of the employer and the privacy rights of the employee becomes ever more important. It provides guidance on a number of specific scenarios including the use of social media during recruitment. Nowadays, employers may be tempted to view job applicants’ social media profiles as part of the recruitment process. However, according to the Opinion, employers may only use social media to find out information about a job applicant where: (a) they have a “legal ground” for doing so; (b) doing so is necessary and relevant for the performance of the position being applied for; (c) the applicant has been informed that their social media profiles will be reviewed; and (d) the employer complies with all of the data protection principles set out in the law.

What steps should your organization take if it wishes to review social media profiles as part of the recruitment process while also complying with the Opinion and EU data protection law?

Huge Relief From eClinicalWorks Decision Not to Hold Customers Liable For Its Vendor’s Actions, But Providers Should Not Drop Their Guard

Posted in Health Information, Regulation

There are inherent risks in any vendor relationship. In the healthcare industry, with myriad regulatory pitfalls, the stakes can be even higher. Several customers of the cloud-based electronic health record (EHR) software vendor eClinicalWorks were relieved by a recent decision in which regulators declined to take action against them as a result of the alleged wrongdoing of eClinicalWorks. While this decision offers a huge sigh of relief, it should not be seen as an open invitation to adopt a lax approach to vendor engagements.

eClinicalWorks recently agreed to pay $155 million and enter into a five-year Corporate Integrity Agreement to settle allegations that it violated the federal False Claims Act by concealing information indicating that its EHR software failed to meet certain certification requirements from its certifying entity. Such requirements are necessary for eClinicalWorks to meet the “Meaningful Use” standard for EHR under the federal HITECH Act.

Under the HITECH Act, providers can receive incentives for using certified EHR. Providers participating in the Meaningful Use program must attest to the certification of their EHR software in order to qualify for the grants. The United States Department of Justice claimed that eClinicalWorks caused its customers to submit false claims for federal incentive payments tied to the Meaningful Use of EHR when they relied on the improper certification of eClinicalWorks.

In response to the eClinicalWorks settlement, the Centers for Medicare and Medicaid Services (CMS) stated that it would not take action against eClinicalWorks customers who had otherwise acted in good faith with respect to eClinicalWorks’ technology. The settlement and, more specifically, CMS’ reaction to it, highlight CMS’ position that providers may reasonably rely on the representations of their software vendors for the accuracy of reporting. CMS further indicated that it does not plan to audit eClinicalWorks customers based on the settlement.

Although CMS’ statement certainly relieves some pressure from healthcare providers who contract with third parties, it is important to note that this settlement is a single situation, and the regulators may take a different approach in the future based on different facts. Furthermore, the Office for Civil Rights (OCR), which is responsible for HIPAA compliance, has not issued an opinion on this topic, and CMS has not published formal guidance to support this position more broadly.

Despite the fact that HIPAA does not (currently) require auditing or any form of specific monitoring of business associates, some form of oversight and/or vendor vetting is often appropriate and may significantly help to reduce the risk of liability if there is a breach or some other issue with the business associate vendor.

Finally, providers cannot ignore issues if they learn of them—regardless of how issues are discovered. Indeed, healthcare providers remain responsible for taking corrective action (including making any necessary disclosures) when they become aware of any HIPAA and HITECH violations by their business associates.

Former Employee Need Not Allege Emails Were Unopened to Assert Claim of Unauthorized Access Under Stored Communications Act

Posted in Data retention, Privacy

Earlier this month, a federal court denied an employer’s motion to dismiss a claim that it violated the Stored Communications Act (SCA) by accessing a former employee’s personal emails, concluding that the plaintiff need not allege the emails were unopened at the time of the alleged unauthorized access. Levin v. ImpactOffice LLC, No. TDC-16-2790 (D. Md. July 10, 2017).

Defendant ImpactOffice LLC (Impact), which supplies office products and services, collected the plaintiff’s company-issued cell phone after she resigned. Id. at *1.  She had previously deleted all emails stored on the phone, including personal emails from her Gmail account. Id. The plaintiff later filed suit in the District of Maryland, seeking a declaratory judgment that the restrictive covenants in her employment agreement are unenforceable and asserting a claim for unauthorized access of her personal emails under the SCA. Id. at *1-2.

According to the complaint, Impact accessed—and forwarded to its own attorney—a number of these personal emails, which were still stored on Google servers, including emails sent and received after the plaintiff resigned and emails between the plaintiff and her attorney. Id. at *1.

The SCA is violated when a person “intentionally accesses without authorization a facility through which an electronic communication service is provided . . . and thereby obtains, alters, or prevents authorized access to a wire or electronic communication while it is in electronic storage in such system.” 18 U.S.C. § 2701(a).  The SCA defines “electronic storage” as “(A) any temporary, intermediate storage of a wire or electronic communication incidental to the electronic transmission thereof; and (B) any storage of such communication by an electronic communication service for purposes of backup protection of such communication.” Id. § 2711(1) (incorporating definitions in 18 U.S.C. § 2510).

In its motion to dismiss, Impact asserted that because the plaintiff did not allege that the emails were unopened at the time of its alleged access, she had not sufficiently alleged that the emails were in “electronic storage” under the SCA. Levin, No. TDC-16-2790 (D. Md. July 10, 2017), at *2.

The court first agreed with Impact’s interpretation of “temporary, intermediate storage” under Part (A) of the definition, citing First, Third, Fourth, and Ninth Circuit precedent, observing that Part (A) is “generally understood to cover email messages that are stored on a server before they have been delivered to, or retrieved by, the recipient.” Id. at *3.

However, the court ultimately concluded that, at this stage, the plaintiff need not “specifically allege that the emails at issue were unopened at the time” of Impact’s alleged unauthorized access due in part to the “fact-intensive” nature of the question. Id. at *4.

Law Firms’ Data Duty: Protecting Client Information From Cybercriminals

Posted in Cyber Insurance, Cybersecurity, Data breach, Data Security, Health Information, Information Management, Litigation, Other, Privacy

The impact from the recent Petya/NotPetya ransomware attack — or what was reported as a ransomware attack but now appears to be something even more damaging — continues to spread around the globe, with several new companies coming forward as victims, including a prominent law firm.

This attack acts as an unfortunate reminder that the Internet of Things, along with our dependence on technology, has created a host of new legal and ethical challenges for attorneys. Chief among them is the duty owed to clients to keep their information secure.

Put simply, cyberattacks against law firms are a rapidly growing problem that we must collectively work to manage. And we need to do a better job of it. The 2016 ABA TECHREPORT indicated that, overall:

  • 21 percent of law firms reported having no data security policy;
  • Under 20 percent reported having an incident response plan;
  • 37 percent of firms reported downtime or loss of billable hours after a breach;
  • Only 17 percent of attorneys reported they have cyber coverage; and
  • Only 18 percent of law firms reported they have had a full security assessment.

The Threat

Cyberattacks against law firms have only just begun. The cybercriminals executing these attacks understand that law firms are the white whale of cyber victims. Client information is highly confidential and highly lucrative to cybercriminals. The financial and personally identifiable information that an individual company keeps for business operations is nothing compared to the treasure trove of sensitive data law firms maintain on behalf of their hundreds, or even thousands, of clients. Further, law firms possess data that, if stolen, would provide cybercriminals the information necessary to engage in a variety of nefarious activities, such as insider trading, intellectual property theft and corporate espionage.

Law firms are vulnerable to attack in several ways — via mobile devices, home networks, spear phishing, business email compromise and failure to install security patches, to name a few. The vigilant execution of advanced defenses against vulnerabilities must remain a priority.

In addition to securing the network, a host of legal and regulatory challenges continue to evolve and demand constant analysis. Aside from the more well-known regulations – the Health Insurance Portability and Accountability Act, the Gramm-Leach-Bliley Act, the EU’s General Data Protection Regulation, and the Telephone Consumer Protection Act – federal and state agencies regularly promulgate and enforce new standards that must be met. This legal regime is further complicated by emerging American Bar Association and state ethical obligations.

Despite continued best efforts to safeguard client information, law firms remain at risk of attack by hackers and those who find opportunity in law firms’ cybersecurity failings. The industry recently found itself targeted by plaintiffs’ attorneys who exploit data breaches by claiming law firms failed to take reasonable steps to maintain data security. Thus, in addition to the cyberthreat itself, the looming threat of class action lawsuits must be considered as law firms develop and implement data security practices.

Our Response

As with every incident, the McGuireWoods data privacy and security team is monitoring the Petya/NotPetya attack as it develops, and we stand ready to assist anyone affected. We provide solutions across industries — including solutions for law firms and colleagues in the legal profession.

In our experience, few businesses maintain an incident response plan that adequately addresses the decision points and considerations presented by distributed ransomware or other advanced threats, or have policies and procedures in place to ensure legal, regulatory and ethical compliance. We can help.

We have publicly offered some preventative measures that firms can take immediately. But we can also provide insight into our internal data privacy and security practices and how we use those practices to protect our clients’ most sensitive information (e.g., enforcing encryption for data at rest and in transit, performing regular security awareness training, using data loss prevention functionality, conducting security audits, and aligning our information security plan with the firm’s strategic plan).

Our clients trust us with their most valuable information. They deserve the highest level of data security protection. No law firm is immune to the sophisticated threats today’s cybercriminals develop and propagate, but implementing cybersecurity programs and incident response plans now can significantly reduce the risk of breach, improve response protocols and mitigate financial and reputational loss.

The Toys Have Eyes (and Ears): FTC Updates COPPA Guidance for Internet of Things

Posted in Consumer Privacy/FTC, Cybersecurity, Data breach

The FTC has updated its Children’s Online Privacy Protection Rule (COPPA) Six-Step Compliance Plan for Your Business “to reflect developments in the marketplace” – including the introduction of internet-connected toys and the Internet of Things.

COPPA applies to operators of commercial websites and online services directed to children under 13 that collect, use, or disclose personal information from children, and operators of general audience websites or online services with actual knowledge that they are collecting, using, or disclosing personal information from children under 13. The primary goal of COPPA is to place parents in control over what information operators of websites collect from their young children on the Internet.

In its updated COPPA Compliance Plan, the FTC cautions that COPPA applies not only to websites and mobile apps, but also “to the growing list of connected devices that make up the Internet of Things.” These devices include connected toys and other products intended for children that collect personal information, such as voice recordings or geolocation data. The updated COPPA Compliance Plan also discusses two recently approved methods for obtaining verifiable parental consent: asking knowledge-based authentication questions and using facial recognition to match a parent’s photo against verified photo identification.

The FTC issued its updated guidance on COPPA less than a month after receiving a letter from U.S. Sen. Mark R. Warner (D-VA) concerning the agency’s efforts to protect children’s privacy following several high-profile instances of children’s data allegedly being hacked through internet-connected “smart toys.”

According to multiple media reports, CloudPets, a product line marketed as “a message you can hug,” stored customers’ personal data in an insecure, public-facing online database. CloudPets reportedly exposed over 800,000 customer credentials and more than two million voice recordings sent between parents and children. Subsequent reports raised questions about security at the device level, with individuals able to hack CloudPets’ toys and remotely control the devices, including the microphone, when within Bluetooth range.

Sen. Warner also inquired about FTC action in relation to the children’s doll “My Friend Cayla.” In December 2016, privacy advocates filed a complaint with the FTC regarding the doll and raised concerns that it can be used for unauthorized surveillance. In February 2017, Germany’s equivalent of the FTC pulled “My Friend Cayla” off the market due to concerns over the doll’s surveillance capabilities.

Companies should consider how new ways of collecting data, such as voice-activated devices that collect personal information from children, may subject them to obligations under COPPA. The FTC’s guidance also serves as a general reminder to all businesses to consider how new ways of collecting data from consumers – children and adults alike – may impact their compliance obligations under applicable privacy regulations.