“[P]rivacy legislation should have some kind of safe harbor provision in it so that companies understand that if they take certain steps, what they are doing is consistent with the law.”  Karen Zacharia, Chief Privacy Officer at Verizon

The California Consumer Privacy Act (CCPA) provides unparalleled data privacy rights for California residents.  The CCPA contains an expansive definition of “personal information” and establishes new data privacy rights for California consumers, including rights to access and delete personal information and to opt out of its sale.  In addition, the CCPA provides new statutory damages and a consumer private right of action in the event of a data breach.



On February 7, 2020, the California Attorney General (AG) published a set of Modified Regulations under the California Consumer Privacy Act (CCPA).  The Modified Regulations take into account some of the comments received from the public late last year and make key changes to multiple definitions and provisions, in at least some cases providing more clarity and specificity than the original version.  The regulatory process is not yet done—the AG is accepting written public comments on the Modified Regulations until February 24, 2020—but it is unlikely there will be many more substantial revisions from this point forward.  It also now seems possible that we will see final Regulations in advance of the July 1, 2020 enforcement deadline.  The last step in the process is the AG’s submission of the final rulemaking record for approval by the California Office of Administrative Law (OAL), which has 30 working days to approve the record before the final Regulations are filed with the Secretary of State.


Last week a committee of the Virginia House of Delegates voted to send several privacy-related bills to a legislative commission for study after the current legislative session. Among those bills is the Virginia Privacy Act, proposed as a less onerous version of the California Consumer Privacy Act. Other bills referred for study address topics such as requirements for the destruction of records, online advertising and digital services directed to minors, and the safekeeping of biometric data.

The Communications, Technology and Innovation Committee voted to “continue” these privacy-related bills and directed the committee’s chairman to request that the Joint Commission on Technology and Science (JCOTS) study the legislation in advance of the 2021 legislative session. JCOTS consists of 13 legislators, and its purpose is to evaluate emerging technology and science with the goal of promoting the development of sound public policies on those topics.



On January 8, 2020, the Virginia General Assembly will begin its 60-calendar-day legislative session. Legislation relating to privacy will be on the agenda, including HB 473, titled the “Virginia Privacy Act,” which proposes to strengthen the data privacy rights of Virginians.

Scope of the Proposed Legislation

The provisions of the legislation apply to “any legal entity (i) that conducts business in the Commonwealth or produces products or services that are intentionally targeted to residents of the Commonwealth and (ii) that (1) controls or processes personal data of not fewer than 100,000 consumers; or (2) derives over 50 percent of gross revenue from the sale of personal data and processes or controls personal data of not fewer than 25,000 customers.” The bill has exceptions to its scope applicable to, among others, local and state governments, credit reporting agencies and financial institutions governed by other privacy laws, and also exempts certain health care related information governed by federal law and employment records.
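
For readers who want to test an entity against these thresholds programmatically, the bill’s scope provision reduces to a two-prong check: a nexus prong and a volume-or-revenue prong. Below is a minimal Python sketch of that logic; the `Entity` fields and the `covered_by_va_privacy_act` function are hypothetical constructs of ours, and the sketch deliberately ignores the exemptions described above. It is an illustration, not legal advice.

```python
# Hypothetical sketch of HB 473's applicability test, based solely on the
# thresholds quoted above. Ignores the bill's exemptions. Not legal advice.

from dataclasses import dataclass

@dataclass
class Entity:
    does_business_in_va: bool              # conducts business in the Commonwealth
    targets_va_residents: bool             # intentionally targets VA residents
    consumers_processed: int               # consumers whose personal data it controls/processes
    revenue_share_from_data_sales: float   # fraction of gross revenue, 0.0 to 1.0

def covered_by_va_privacy_act(e: Entity) -> bool:
    # Prong (i): nexus with the Commonwealth
    nexus = e.does_business_in_va or e.targets_va_residents
    # Prong (ii)(1): volume threshold
    volume = e.consumers_processed >= 100_000
    # Prong (ii)(2): revenue-plus-volume threshold
    sale_based = (e.revenue_share_from_data_sales > 0.5
                  and e.consumers_processed >= 25_000)
    return nexus and (volume or sale_based)

# Example: a retailer targeting Virginians, holding 30,000 consumers' data,
# with 60% of gross revenue from data sales, would fall within scope.
print(covered_by_va_privacy_act(Entity(False, True, 30_000, 0.6)))  # True
```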

The legislation focuses on the responsibilities of data controllers, who are primarily responsible for complying with the provisions of the legislation, and data processors, who must adhere to the instructions of the controller and assist a controller in meeting the requirements of the proposed act.



For years, we have waited with bated breath for the arrival of the “Internet of Things” (IoT) to transform garages into smart factories, cars into autonomous vehicles and ordinary homes into smart homes completely controllable by cellphones. Two technologies underpinning this world of the future (inexpensive sensors and 5G networking) will catalyze this vision in 2020. Gartner predicts that the number of connected devices will rise from 8.4 billion in 2017 to 20.4 billion in 2020. While the hurdles for this vision are many (increased regulation, privacy concerns, and the trade war, which may bifurcate the IoT due to geopolitical disputes regarding 5G), the McKinsey Global Institute estimates that IoT technologies will create between $3.9 trillion and $11.1 trillion in economic value globally by 2025. Those interested in capitalizing on this world of the future should be mindful of the legal framework of the future (and near present).


Across the country, school districts use technology to facilitate learning and assist in classroom management. From tracking grades and communicating with parents to monitoring bathroom breaks, technology is everywhere in our schools. But as technology becomes more prevalent in the classroom, what does that mean for student data privacy?

Federal Laws Governing Student Data Privacy

There are several federal laws that govern student data privacy. The Family Educational Rights and Privacy Act (FERPA) protects student education records and requires the consent of parents, or of students age 18 or older, for the release of those records. The Protection of Pupil Rights Amendment (PPRA) requires parental consent for any federally funded student survey or evaluation that asks the student to provide sensitive information. Lastly, the Children’s Online Privacy Protection Act (COPPA) regulates companies collecting data about children under the age of thirteen. Under the law, operators of educational products need not obtain parental consent directly; instead, schools can consent on behalf of parents. Importantly, the Federal Trade Commission (FTC) is considering updating COPPA’s regulations. The FTC requested comments on the rule in July and held a workshop in October.



In less than one month, the California Consumer Privacy Act of 2018 (CCPA) will go into effect and usher in a new era of data breach litigation. While the California Attorney General is charged with general enforcement of the state’s landmark privacy law, consumers’ ability to rely on a CCPA violation as a basis for claims under other state statutes will be a concern.

For background, Section 1798.150(a)(1) of the CCPA gives consumers a limited private right of action. The provision allows consumers to sue businesses that fail to maintain reasonable security procedures and practices to protect “nonencrypted or nonredacted personal information” of a consumer and further fail to cure the breach within 30 days. A violation of this data security provision allows recovery of statutory damages of $100 to $750 per consumer per incident or actual damages, whichever is greater, as well as injunctive relief. To determine the appropriate amount of statutory damages, courts must analyze the circumstances of the case, including the number of violations; the nature, seriousness, willfulness, pattern, and length of the misconduct; and the defendant’s assets, liabilities, and net worth.
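
To make the stakes concrete, here is a minimal Python sketch of the statutory-damages arithmetic under Section 1798.150(a)(1), taking the greater of statutory and actual damages as the text directs. The `damages_exposure` function is our own hypothetical construct; it does not model the 30-day cure period, injunctive relief, or the case-specific factors courts must weigh in setting the per-consumer figure.

```python
# Hypothetical sketch of CCPA Section 1798.150(a)(1) statutory-damages math.
# Illustration only; ignores the cure period and judicial discretion within
# the $100-$750 range. Not legal advice.

STATUTORY_MIN = 100   # dollars per consumer per incident
STATUTORY_MAX = 750

def damages_exposure(consumers_affected: int,
                     incidents: int = 1,
                     actual_damages: float = 0.0) -> tuple[float, float]:
    """Return the (low, high) recovery range: the greater of statutory
    damages or actual damages, per the statute."""
    low = consumers_affected * incidents * STATUTORY_MIN
    high = consumers_affected * incidents * STATUTORY_MAX
    return (max(low, actual_damages), max(high, actual_damages))

# Example: a single breach affecting 10,000 consumers yields a statutory
# exposure range of $1,000,000 to $7,500,000.
print(damages_exposure(10_000))  # (1000000, 7500000)
```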



This week, the California Attorney General held public hearings on the draft California Consumer Privacy Act (CCPA) regulations it issued in October.  We attended the hearings in both Los Angeles and San Francisco.  One clear message resounded: if left as drafted, the proposed regulations will have unintended consequences.

Both hearings were well-attended, with dozens of comments from businesspeople, attorneys, and a handful of concerned citizens.  In addition to these two hearings, the Attorney General also held public hearings in Sacramento and Fresno, and is accepting written comments through Friday, December 6, 2019.  If the Los Angeles and San Francisco hearings are any indication, there are many areas in which the Attorney General could provide further clarity should it choose to revise the current draft regulations.



On October 31, a bipartisan group of senators introduced the Filter Bubble Transparency Act (FBTA), a bill that would require large online platforms to be more transparent about their use of algorithms driven by user-specific data.

“This legislation is about transparency and consumer control,” said Senator John Thune (R-S.D.). “For free markets to work as effectively and as efficiently as possible, consumers need as much information as possible, including a better understanding of how internet platforms use artificial intelligence and opaque algorithms to make inferences.”

The bill is named after Eli Pariser’s book The Filter Bubble, which argues that the personalized search results generated by user-specific data can trap users in an ideological bubble by filtering out content contrary to their ideological viewpoints.



A recent letter from researchers at the Mayo Clinic to the editor of The New England Journal of Medicine outlined a new challenge in de-identifying, or preserving the de-identified nature of, research and medical records.[1]  The Mayo Clinic researchers described their successful use of commercially available facial recognition software to match the digitally reconstructed images of research subjects’ faces from cranial magnetic resonance imaging (“MRI”) scans with photographs of the subjects.[2]  MRI scans, often considered non-identifiable once metadata (e.g., names and other scan identifiers) are removed, are frequently made publicly available in published studies and databases.  For example, administrators of a national study called the Alzheimer’s Disease Neuroimaging Initiative estimate other researchers have downloaded millions of MRI scans collected in connection with their study.[3]  The Mayo Clinic researchers assert that the digitally reconstructed facial images, paired with individuals’ photographs, could allow the linkage of other private information associated with the scans (e.g., cognitive scores, genetic data, biomarkers, other imaging results and participation in certain studies or trials) to these now-identifiable individuals.[4]
