On January 8, 2020, the Virginia General Assembly will begin its 60-calendar-day legislative session. Legislation relating to privacy will be on the agenda, including HB 473, titled the “Virginia Privacy Act,” which proposes to strengthen the data privacy rights of Virginians.

Scope of the Proposed Legislation

The provisions of the legislation apply to “any legal entity (i) that conducts business in the Commonwealth or produces products or services that are intentionally targeted to residents of the Commonwealth and (ii) that (1) controls or processes personal data of not fewer than 100,000 consumers; or (2) derives over 50 percent of gross revenue from the sale of personal data and processes or controls personal data of not fewer than 25,000 customers.” The bill carves out exceptions for, among others, local and state governments, credit reporting agencies, and financial institutions governed by other privacy laws, and it also exempts employment records and certain health-care-related information governed by federal law.

The legislation focuses on the responsibilities of data controllers, who bear primary responsibility for compliance, and data processors, who must adhere to the controller’s instructions and assist the controller in meeting the requirements of the proposed act.



For years, we have waited with bated breath for the arrival of the “Internet of Things” (IoT) to transform garages into smart factories, cars into autonomous vehicles, and ordinary homes into smart homes completely controllable by cellphone. Two technologies underpinning this world of the future (inexpensive sensors and 5G networking) will catalyze this vision in 2020. Gartner predicts that the number of connected devices will rise from 8.4 billion in 2017 to 20.4 billion in 2020. While the hurdles for this vision are many (increased regulation, privacy concerns, and the trade war, whose geopolitical disputes over 5G may bifurcate the IoT), the McKinsey Global Institute estimates that IoT technologies will create between $3.9 trillion and $11.1 trillion in economic value globally by 2025. Those interested in capitalizing on this world of the future should be mindful of the legal framework of the future (and near present).


Across the country, school districts use technology to facilitate learning and assist in classroom management. From tracking grades and communicating with parents to monitoring bathroom breaks, technology is everywhere in our schools. But as technology becomes more prevalent in the classroom, what does that mean for student data privacy?

Federal Laws Governing Student Data Privacy

There are several federal laws that govern student data privacy. The Family Educational Rights and Privacy Act (FERPA) protects student education records and requires the consent of parents, or of students age 18 or older, before education records may be released. The Protection of Pupil Rights Amendment (PPRA) requires parental consent for any federally funded student survey or evaluation that asks the student to provide sensitive information. Lastly, the Children’s Online Privacy Protection Act (COPPA) regulates companies collecting data about children under the age of 13. Under the law, operators of educational products need not obtain parental consent directly; instead, schools can consent on behalf of parents. Importantly, the Federal Trade Commission (FTC) is considering updating COPPA’s regulations. The FTC requested comments on the rule in July and held a workshop in October.



In less than one month, the California Consumer Privacy Act of 2018 (CCPA) will go into effect and usher in a new era of data breach litigation. While the California Attorney General is charged with general enforcement of the state’s landmark privacy law, consumers’ ability to rely on a violation of the CCPA as a basis for claims under other state statutes will be a concern.

For background, Section 1798.150(a)(1) of the CCPA gives consumers a limited private right of action. The provision allows consumers to sue businesses that fail to maintain reasonable security procedures and practices to protect a consumer’s “nonencrypted or nonredacted personal information” and further fail to cure the breach within 30 days. A violation of this data security provision allows recovery of statutory damages of $100 to $750 per consumer per incident or actual damages, whichever is greater, as well as injunctive relief. Because statutory damages accrue per consumer, exposure scales quickly: a breach affecting 100,000 consumers could, in principle, yield statutory damages of $10 million to $75 million. To determine the appropriate amount of statutory damages, courts must analyze the circumstances of the case, including the number of violations; the nature, seriousness, willfulness, pattern, and length of the misconduct; and the defendant’s assets, liabilities, and net worth.



This week, the California Attorney General held public hearings on the draft California Consumer Privacy Act (CCPA) regulations it issued in October. We attended the hearings in both Los Angeles and San Francisco. One clear message resounded: the proposed regulations, if left as drafted, will have unintended consequences.

Both hearings were well-attended, with dozens of comments from businesspeople, attorneys, and a handful of concerned citizens.  In addition to these two hearings, the Attorney General also held public hearings in Sacramento and Fresno, and is accepting written comments through Friday, December 6, 2019.  If the Los Angeles and San Francisco hearings are any indication, there are many areas in which the Attorney General could provide further clarity should it choose to revise the current draft regulations.



On October 31, a bipartisan group of senators introduced the Filter Bubble Transparency Act (FBTA), a bill that would require large online platforms to be more transparent about their use of algorithms driven by user-specific data.

“This legislation is about transparency and consumer control,” said Senator John Thune (R-S.D.). “For free markets to work as effectively and as efficiently as possible, consumers need as much information as possible, including a better understanding of how internet platforms use artificial intelligence and opaque algorithms to make inferences.”

The bill is named after Eli Pariser’s book The Filter Bubble, which argues that the personalized search results generated by user-specific data can trap users in an ideological bubble by filtering out content contrary to their ideological viewpoints.



A recent letter from researchers at the Mayo Clinic to the editor of The New England Journal of Medicine outlined a new challenge in de-identifying, or preserving the de-identified nature of, research and medical records.[1]  The Mayo Clinic researchers described their successful use of commercially available facial recognition software to match the digitally reconstructed images of research subjects’ faces from cranial magnetic resonance imaging (“MRI”) scans with photographs of the subjects.[2]  MRI scans, often considered non-identifiable once metadata (e.g., names and other scan identifiers) are removed, are frequently made publicly available in published studies and databases.  For example, administrators of a national study called the Alzheimer’s Disease Neuroimaging Initiative estimate other researchers have downloaded millions of MRI scans collected in connection with their study.[3]  The Mayo Clinic researchers assert that the digitally reconstructed facial images, paired with individuals’ photographs, could allow the linkage of other private information associated with the scans (e.g., cognitive scores, genetic data, biomarkers, other imaging results and participation in certain studies or trials) to these now-identifiable individuals.[4]


In one of this year’s largest HIPAA settlements, the U.S. Department of Health and Human Services Office for Civil Rights (OCR) is set to collect $3 million from the University of Rochester Medical Center (URMC). This settlement over potential violations of the Privacy and Security Rules under HIPAA also requires URMC to follow a corrective action plan that includes two years of HIPAA compliance monitoring by OCR.

Late last week brought two significant and highly anticipated updates to the California Consumer Privacy Act (CCPA).

On October 10, 2019, the Office of the California Attorney General issued a long-anticipated Notice of Proposed Rulemaking Action regarding the CCPA.  The full text of the proposed regulations can be found here.  The next day, Governor Gavin Newsom signed all seven amendments to the CCPA that came out of the California State Assembly.

This post will address the statutory amendments first since they modify the CCPA itself, then turn to the draft regulations (officially, the “California Consumer Privacy Act Regulations”).