The convergence of the General Data Protection Regulation and the investigation into Russian interference in the 2016 election has created a perfect privacy storm. Social media platforms’ complacency on this front, and the resulting public backlash, have further amplified the pressure on legislatures to react.  Although state legislatures have been quick to do so (most notably California, which passed a sweeping new privacy law in June), Congress has not.

Recently, Senator Mark Warner (D-VA) issued a draft white paper proposing 20 policy approaches to combat these issues.  The proposals seek to enhance user privacy, increase transparency, and dam the deluge of misinformation that, to date, has run through social media platforms largely unchecked.

A few of Warner’s proposals are relatively uncontroversial:  for example, he recommends endowing the FTC with rulemaking authority so that it could respond to technological changes.  He also suggests:

  • Imposing a fiduciary duty on platforms to safeguard user information;
  • Amending the Communications Decency Act to limit provider immunity under state tort laws;
  • Requiring platforms to determine the origin of posts and/or accounts and identify inauthentic accounts;
  • Improving disclosure requirements associated with online political advertisements (which are currently much less regulated than those on TV or radio);
  • Implementing at least some features of GDPR, such as the right to be forgotten, and mandatory 72-hour breach reporting;
  • Requiring platforms to share anonymized user activity data with public researchers, who could provide analysis and feedback to legislators;
  • Requiring explicit and informed user consent for collection or processing of user data (also an element of GDPR);
  • Mandating that platforms provide more granular information to consumers about the uses of their data, including a disclosure of the dollar amount the platform assigns to each user’s data; and
  • Determining by statute that “dark patterns” are unfair and deceptive trade practices. Dark patterns are website design features that nudge users toward certain provider-favorable privacy settings — for example, through suggestive word choice, the physical placement of options on the screen, or forcing users to click through several confusingly worded pages to select a particular setting.

As is immediately evident from this list, the goals of limiting disinformation and protecting privacy often clash.  For example, requiring platforms to determine the origin of posts and accounts would necessarily entail location tracking and would substantially curtail users’ ability to post anonymously.  That, in turn, raises privacy concerns as well as free speech implications.  Similarly, sharing user data with public researchers presents inherent data security risks, even if the data is anonymized.

Perhaps even more controversial is Warner’s assertion that, due to the increasing marginal return of user data (i.e., more data is worth more money), large social media platforms are inherently anti-competitive.  To combat this, Warner suggests imposing data portability and interoperability requirements on “dominant platforms” to facilitate new providers’ entry into the market.  The promised upside here seems to be competition for competition’s sake; the paper does not discuss how creation of new social platforms would improve the other problems that the proposals seek to solve.  In fact, Warner admits that these notions are also in tension with the goal of enhancing user privacy:  the more data changes hands, the more opportunities for interference arise.

Ultimately, this paper offers a number of useful discussion points, but also highlights just how much ground the government must cover before it can address these issues on a comprehensive basis.