On November 9, 2020, the FTC entered into a consent agreement with Zoom Video Communications, Inc. to address concerns over the videoconferencing platform’s security practices. With the onset of the COVID-19 pandemic, the need for a reliable online videoconferencing and meeting platform skyrocketed, and Zoom met that need. It advertised its platform as a secure space with various safety measures to protect user data, including “end-to-end” 256-bit encryption. Individuals, businesses, and organizations quickly flocked to the user-friendly communications platform, and by the end of April 2020 Zoom’s user base was booming.

Then came a backlash of sorts. The FTC began investigating Zoom’s security practices, and private plaintiffs brought class-action lawsuits alleging violations of the California Consumer Privacy Act and failure to adhere to Zoom’s terms of service. The FTC’s complaint alleged several concerns with Zoom’s advertising and security promises, concluding that Zoom made misleading claims about the strength of its encryption and the security of its platform that gave customers a false sense of security. The five-count complaint alleged that Zoom:

Monetary penalties are the attention-grabbing headline when the FTC or any regulator brings an enforcement action against a company. They are the looming threat that incentivizes and influences compliance. Over the summer, FTC Chairman Joseph J. Simons (“Chairman Simons”) issued a statement in connection with a settlement, explaining that he believes “the goal of a civil penalty should be to make compliance more attractive than violation. Said another way, violation should not be more profitable than compliance.”

On October 31, a bipartisan group of senators introduced the Filter Bubble Transparency Act (FBTA), which would require large online platforms to be more transparent in their use of algorithms driven by user-specific data.

“This legislation is about transparency and consumer control,” said Senator John Thune (R-S.D.).

“For free markets to work as effectively and as efficiently as possible, consumers need as much information as possible, including a better understanding of how internet platforms use artificial intelligence and opaque algorithms to make inferences.”

The bill is named after Eli Pariser’s book The Filter Bubble, which argues that the personalized search results generated by user-specific data can trap users in an ideological bubble by filtering out content contrary to their ideological viewpoints.