This week, the California Attorney General held public hearings on the draft California Consumer Privacy Act (CCPA) regulations issued in October.  We attended the hearings in both Los Angeles and San Francisco.  One message resounded clearly: concern over the unintended consequences the proposed regulations would have if left as drafted.

Both hearings were well-attended, with dozens of comments from businesspeople, attorneys, and a handful of concerned citizens.  In addition to these two hearings, the Attorney General also held public hearings in Sacramento and Fresno, and is accepting written comments through Friday, December 6, 2019.  If the Los Angeles and San Francisco hearings are any indication, there are many areas in which the Attorney General could provide further clarity should it choose to revise the current draft regulations.


On October 31, a bipartisan group of senators introduced the Filter Bubble Transparency Act (FBTA), which would require large online platforms to be more transparent about their use of algorithms driven by user-specific data.

“This legislation is about transparency and consumer control,” said Senator John Thune (R-S.D.).

“For free markets to work as effectively and as efficiently as possible, consumers need as much information as possible, including a better understanding of how internet platforms use artificial intelligence and opaque algorithms to make inferences.”

The bill is named after Eli Pariser’s book The Filter Bubble, which argues that the personalized search results generated by user-specific data can trap users in an ideological bubble by filtering out content contrary to their ideological viewpoints.


A recent letter from researchers at the Mayo Clinic to the editor of The New England Journal of Medicine outlined a new challenge in de-identifying, or preserving the de-identified nature of, research and medical records.[1]  The Mayo Clinic researchers described their successful use of commercially available facial recognition software to match the digitally reconstructed images of research subjects’ faces from cranial magnetic resonance imaging (“MRI”) scans with photographs of the subjects.[2]  MRI scans, often considered non-identifiable once metadata (e.g., names and other scan identifiers) are removed, are frequently made publicly available in published studies and databases.  For example, administrators of a national study called the Alzheimer’s Disease Neuroimaging Initiative estimate other researchers have downloaded millions of MRI scans collected in connection with their study.[3]  The Mayo Clinic researchers assert that the digitally reconstructed facial images, paired with individuals’ photographs, could allow the linkage of other private information associated with the scans (e.g., cognitive scores, genetic data, biomarkers, other imaging results and participation in certain studies or trials) to these now-identifiable individuals.[4]


In one of this year’s largest HIPAA settlements, the U.S. Department of Health and Human Services Office for Civil Rights (OCR) is set to collect $3 million from the University of Rochester Medical Center (URMC). This settlement over potential violations of the Privacy and Security Rules under HIPAA also requires URMC to follow a corrective action plan that includes two years of HIPAA compliance monitoring by OCR.

Late last week brought two significant and highly anticipated updates to the California Consumer Privacy Act (CCPA).

On October 10, 2019, the Office of the California Attorney General issued a long-anticipated Notice of Proposed Rulemaking Action regarding the CCPA.  The full text of the proposed regulations can be found here.  The next day, Governor Gavin Newsom signed all seven amendments to the CCPA that came out of the California State Assembly.

This post will address the statutory amendments first since they modify the CCPA itself, then turn to the draft regulations (officially, the “California Consumer Privacy Act Regulations”).

Welcome to a three-part series that provides an overview of the California Invasion of Privacy Act (CIPA), examines recent CIPA litigation involving smart speakers, and proposes defenses in response to an alleged violation.

CIPA in the Age of Smart Devices

The California Invasion of Privacy Act (CIPA)[1], traditionally invoked by law enforcement and the plaintiffs’ bar to address illegal recording of or eavesdropping on phone calls, has seen renewed interest in the age of smart speakers.  Smart speakers are voice-enabled devices, such as the Amazon Echo, Google Home, and Apple HomePod, whose users utter a “wake word” to activate a “virtual assistant” (Alexa, Google Assistant, or Siri, respectively).  A number of putative class actions have recently been filed over whether these “virtual assistants” illegally record individuals without their consent.  This recent spate of lawsuits highlights the CIPA-compliance risks associated with these new technologies.  This article provides an overview of CIPA’s history and features, addresses recently filed smart-device CIPA cases, and recommends defenses for responding to a smart-device CIPA action.

Social media posts have become so common and reflexive that people often fire off posts without appropriately considering the consequences.  This can be costly on multiple fronts.  In the health care context, beyond the risk of losing patients (and the revenue they bring), inappropriate posts can result in Health Insurance Portability and Accountability Act (HIPAA) violations.  Indeed, as the Director of the U.S. Department of Health and Human Services Office for Civil Rights (OCR) has stated, “Social media is not the place for providers to discuss a patient’s care… [doctors] and dentists must think carefully about patient privacy before responding to online reviews.”  Of course, this warning is not limited to dentists; all health care providers should take heed.

As discussed here, the California Consumer Privacy Act of 2018 (CCPA), in its current state, likely applies to businesses that collect the personal information of their employees.  AB 25, which passed in the California Assembly on May 29, 2019, sought to address this issue by removing employees and job applicants from the CCPA’s definition