On October 31, a bipartisan group of senators introduced the Filter Bubble Transparency Act (FBTA), which would require large online platforms to be more transparent about their use of algorithms driven by user-specific data.

“This legislation is about transparency and consumer control,” said Senator John Thune (R-S.D.).

“For free markets to work as effectively and as efficiently as possible, consumers need as much information as possible, including a better understanding of how internet platforms use artificial intelligence and opaque algorithms to make inferences.”

The bill is named after Eli Pariser’s book The Filter Bubble, which argues that the personalized search results generated by user-specific data can trap users in an ideological bubble by filtering out content contrary to their ideological viewpoints.

The Filter Bubble Transparency Act

The proposed legislation prohibits the use of opaque algorithms on covered internet platforms without first providing users with notice. The notice provided must be clear and conspicuous and must be shown when a user first interacts with the opaque algorithm. Additionally, in conjunction with the notice the platform must provide users the option of choosing an unfiltered version of the platform that runs on input-transparent algorithms. These prohibitions would go into effect one year after the enactment of the FBTA.

Opaque v. Input-Transparent Algorithms

Algorithmic ranking systems (such as the ranking of search results, content recommendations, or the display of social media posts) can be divided into “opaque algorithms” and “input-transparent algorithms.” An “opaque algorithm” is defined as an algorithmic ranking system that determines the order or manner in which information is furnished to a user on a covered internet platform based on user-specific data that was not expressly provided by the user to the platform for such purpose. Under the FBTA, expressly provided user data includes:

  • User-supplied search terms, filters, speech patterns, saved preferences, and the user’s current geographical location, and
  • Data supplied by the user that expresses the user’s desire that information be furnished to them, such as social media content a user follows or subscribes to.

Expressly provided user data does not include:

  • History of the user’s connected device, including the user’s history of web searches and browsing, geographical locations, physical activity, device interaction and financial transactions, and
  • Inferences about the user or the user’s connected device, without regard to whether such inferences are based on data expressly provided by the user.

An “input-transparent algorithm,” on the other hand, is an “algorithmic ranking system that does not use user-specific data of a user to determine the order or manner that information is furnished to such user on a covered platform,” unless such data is expressly provided to the platform by the user for that purpose. A current real-world example of an input-transparent algorithm is Twitter’s “sparkle button,” which allows users to see the latest tweets first rather than seeing Twitter’s algorithmically chosen top tweets.
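The distinction can be made concrete with a brief, hypothetical sketch (the data model, scores, and function names below are illustrative assumptions, not anything drawn from the bill’s text): an opaque ranking folds in inferred, user-specific signals the user never expressly supplied, while an input-transparent ranking, like Twitter’s reverse-chronological view, does not.

```python
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    timestamp: int           # seconds since epoch
    engagement_score: float  # platform-wide relevance signal

def opaque_ranking(posts, inferred_affinity):
    """'Opaque' in the bill's sense: ordering depends on user-specific
    data the user did not expressly provide for this purpose -- here, a
    hypothetical affinity score inferred from browsing history."""
    return sorted(
        posts,
        key=lambda p: p.engagement_score * inferred_affinity.get(p.text, 1.0),
        reverse=True,
    )

def input_transparent_ranking(posts):
    """'Input-transparent': no user-specific data is used to order
    content -- simple reverse-chronological order, as with Twitter's
    'latest tweets' view."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)
```

Under the FBTA, a platform offering the first ranking would have to notify users and let them switch to something like the second.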

Covered Internet Platform

The FBTA applies to “covered internet platforms,” defined to include “any public-facing website, internet application, or mobile application.” The bill does not, however, apply to platforms operated for the sole purpose of conducting research that is not made for direct or indirect profit, or to a platform that is wholly owned, controlled, and operated by a person that:

  • Did not employ more than 500 employees in the past six months,
  • Averaged less than $50 million in annual gross receipts, and
  • Annually collects or processes the personal data of fewer than one million individuals.


The FBTA would be enforced with civil penalties by the Federal Trade Commission under the Federal Trade Commission Act. The bill does not contain any private right of action or grant state attorneys general the right to bring civil suits for violations of the FBTA.


The FBTA is the latest in a recent trend of increased scrutiny of popular internet platforms, especially popular social media platforms, and how such platforms leverage user data to provide services to their customers. It is especially noteworthy that the FBTA targets larger companies in the same manner as the Algorithmic Accountability Act, by applying only to those companies that make $50 million or more annually and handle the data of at least one million individuals. While it is not certain whether the FBTA will become law, it is strong evidence of growing support in Congress for regulating the conduct of popular internet platforms.