Across the country, school districts use technology to facilitate learning and assist in classroom management. From tracking grades and communicating with parents to monitoring bathroom breaks, technology is everywhere in our schools. But as technology becomes more prevalent in the classroom, what does that mean for student data privacy?

Federal Laws Governing Student Data Privacy

Several federal laws govern student data privacy. The Family Educational Rights and Privacy Act (FERPA) protects student education records and requires the consent of parents, or of students age 18 or older, before those records may be released. The Protection of Pupil Rights Amendment (PPRA) requires parental consent for any federally funded student survey or evaluation that asks the student to provide sensitive information. Lastly, the Children’s Online Privacy Protection Act (COPPA) regulates companies that collect data about children under the age of 13. Under the law, operators of educational products need not obtain parental consent directly; instead, schools can consent on behalf of parents. Importantly, the Federal Trade Commission (FTC) is considering updating COPPA’s regulations. The FTC requested comments on the rule in July and held a workshop in October.


Continue Reading

In less than one month, the California Consumer Privacy Act of 2018 (CCPA) will go into effect and usher in a new era of data breach litigation. While the California Attorney General is charged with general enforcement of the state’s landmark privacy law, consumers’ ability to rely on a CCPA violation as the basis for claims under other state statutes will be a key concern.

For background, Section 1798.150(a)(1) of the CCPA gives consumers a limited private right of action. The provision allows consumers to sue businesses that fail to maintain reasonable security procedures and practices to protect a consumer’s “nonencrypted or nonredacted personal information” and that further fail to cure the breach within 30 days of written notice. A violation of this data security provision allows recovery of statutory damages of $100 to $750 per consumer per incident or actual damages, whichever is greater, as well as injunctive relief. To determine the appropriate amount of statutory damages, courts must analyze the circumstances of the case, including the number of violations; the nature, seriousness, willfulness, pattern, and length of the misconduct; and the defendant’s assets, liabilities, and net worth.
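To put the private right of action in concrete terms, the sketch below computes statutory damages exposure for a single incident. This is a minimal illustration only: the breach size of 10,000 consumers is invented, and any actual award would turn on the court’s analysis of the factors above.

```python
# Minimal sketch of CCPA statutory damages exposure under Section
# 1798.150(a)(1). All figures are hypothetical; actual recovery is the
# greater of statutory or actual damages and depends on the court's
# weighing of the statutory factors (number of violations, nature and
# willfulness of the misconduct, defendant's net worth, etc.).

STATUTORY_MIN = 100  # dollars per consumer per incident
STATUTORY_MAX = 750  # dollars per consumer per incident

def exposure_range(consumers: int, incidents: int = 1) -> tuple[int, int]:
    """Return the (minimum, maximum) statutory damages exposure."""
    return (consumers * incidents * STATUTORY_MIN,
            consumers * incidents * STATUTORY_MAX)

def recovery(statutory_award: int, actual_damages: int) -> int:
    """Consumers recover statutory or actual damages, whichever is greater."""
    return max(statutory_award, actual_damages)

# Hypothetical: a single incident exposing 10,000 consumers' records.
low, high = exposure_range(consumers=10_000)
print(f"Statutory exposure: ${low:,} to ${high:,}")  # $1,000,000 to $7,500,000
```

Even at the statutory floor, exposure scales linearly with the number of affected consumers, which helps explain why this provision is expected to drive the new era of data breach litigation noted above.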


Continue Reading

This week, the California Attorney General held public hearings on the draft California Consumer Privacy Act (CCPA) regulations it issued in October.  We attended the hearings in both Los Angeles and San Francisco.  One clear message resounded: if left as drafted, the proposed regulations will have unintended consequences.

Both hearings were well-attended, with dozens of comments from businesspeople, attorneys, and a handful of concerned citizens.  In addition to these two hearings, the Attorney General also held public hearings in Sacramento and Fresno, and is accepting written comments through Friday, December 6, 2019.  If the Los Angeles and San Francisco hearings are any indication, there are many areas in which the Attorney General could provide further clarity should it choose to revise the current draft regulations.


Continue Reading

On October 31, a bipartisan group of senators introduced the Filter Bubble Transparency Act (FBTA), a bill that would require large online platforms to be more transparent about their use of algorithms driven by user-specific data.

“This legislation is about transparency and consumer control,” said Senator John Thune (R-S.D.).

“For free markets to work as effectively and as efficiently as possible, consumers need as much information as possible, including a better understanding of how internet platforms use artificial intelligence and opaque algorithms to make inferences.”

The bill is named after Eli Pariser’s book The Filter Bubble, which argues that personalized search results generated from user-specific data can trap users in an ideological bubble by filtering out content that runs contrary to their existing viewpoints.


Continue Reading

A recent letter from researchers at the Mayo Clinic to the editor of The New England Journal of Medicine outlined a new challenge in de-identifying, or preserving the de-identified nature of, research and medical records.[1]  The Mayo Clinic researchers described their successful use of commercially available facial recognition software to match the digitally reconstructed images of research subjects’ faces from cranial magnetic resonance imaging (“MRI”) scans with photographs of the subjects.[2]  MRI scans, often considered non-identifiable once metadata (e.g., names and other scan identifiers) are removed, are frequently made publicly available in published studies and databases.  For example, administrators of a national study called the Alzheimer’s Disease Neuroimaging Initiative estimate other researchers have downloaded millions of MRI scans collected in connection with their study.[3]  The Mayo Clinic researchers assert that the digitally reconstructed facial images, paired with individuals’ photographs, could allow the linkage of other private information associated with the scans (e.g., cognitive scores, genetic data, biomarkers, other imaging results and participation in certain studies or trials) to these now-identifiable individuals.[4]

Continue Reading

In one of this year’s largest HIPAA settlements, the U.S. Department of Health and Human Services Office for Civil Rights (OCR) is set to collect $3 million from the University of Rochester Medical Center (URMC). This settlement over potential violations of the Privacy and Security Rules under HIPAA also requires URMC to follow a corrective action plan that includes two years of HIPAA compliance monitoring by OCR.
Continue Reading

The EU-US Privacy Shield (Privacy Shield) has passed its third annual review by the European Commission. A framework constructed by the US Department of Commerce and the European Commission, the Privacy Shield enables companies to comply with EU data protection requirements when transferring personal data from the EU to the US for commercial purposes.

The Privacy Shield was approved by the European Commission on 12 July 2016 and is subject to annual reviews intended to avoid the failures that brought down the Safe Harbor Principles, which it replaced. The reviews evaluate all aspects of the functioning of the Privacy Shield framework.
Continue Reading

The U.S. Department of Health and Human Services Office for Civil Rights (OCR) has collected over $2.15 million in civil penalties from Miami-based Jackson Health System (JHS) for multiple violations of the Security and Breach Notification Rules under HIPAA. JHS is a nonprofit academic medical system that serves approximately 650,000 patients a year across six major hospitals and a network of affiliated healthcare facilities. This is the first publicized imposition of civil monetary penalties under HIPAA in recent years, in contrast to the many publicized settlements of alleged violations, an indication of the severity of JHS’ violations.
Continue Reading

Recent headlines have detailed foreign-state actors targeting utilities and independent power producers in the United States to gain access to critical infrastructure at the nation’s utilities and military installations.[1]  Cybersecurity practices within the independent power industry vary widely depending on the asset type and the operator’s sophistication.  Despite this risk, purchase agreements and credit agreements for renewable energy facilities do not typically address compliance with cybersecurity standards.  Generic representations and covenants relating to compliance with law, or to maintenance of project assets in accordance with prudent industry practices, inadequately protect acquirers and lenders from cybersecurity risks.  The overwhelming majority of renewable power projects are considered low impact under NERC’s Critical Infrastructure Protection standards and thus are not subject to significant regulation.[2]

Continue Reading