The Federal Trade Commission (FTC) has been busy this April bringing, litigating, and settling privacy- and cybersecurity-related actions.  Below is a summary of the notable privacy and cybersecurity actions thus far this month.

April 7: Jerk Charged with Deceiving Customers

The FTC charged the operators of the website Jerk with “operat[ing] a purported social networking website estimated to contain between 73.4 and 81.6 million unique consumer profiles.”  These profiles included a “Post a Jerk” feature that allowed visitors to vote on consumers as “jerk” or “not a jerk.”  The FTC claims (and Jerk denies) that the site contained profiles for consumers of all ages, including children under sixteen years old.

The complaint alleges that profiles on Jerk:

contained a comment field for users to write comments about the profiled subject. Some profiles included comments such as “Omg I hate this kid he’s such a loser . . . Nobody in their right mind would love you . . . not even your parents love [you].” 

According to Jessica Rich, Director of the FTC’s Bureau of Consumer Protection, this complaint arises, in part, out of a concern for consumers’ online reputations:

“In today’s interconnected world, people are especially concerned about their reputation online, and this deceptive scheme was a brazen attempt to exploit those concerns.”

Jerk also allegedly created barriers to having one’s image and information removed from the site. When consumers contacted Jerk to have their images taken down, Jerk allegedly required them to pay a $25 fee.

The FTC also claims that Jerk wrongfully “created or caused to be created the vast majority of Jerk profiles using information from Facebook,” including photos marked as “private” by Facebook users.  This allegedly unconsented-to use of Facebook photos belies Jerk’s written representation that its profiles were user-created.

The FTC claims that Jerk’s representations about where it obtained its profile information were deceptive. It also alleges that Jerk violated the Children’s Online Privacy Protection Act (COPPA) by posting photos of, and gathering data from, children under sixteen without parental consent.

To view a copy of the In re complaint, click here.

April 9: If it Looks Like a Credit Reporting Agency and Talks Like a Credit Reporting Agency, it Must Follow the Rules for Credit Reporting Agencies…

Two online data brokers, Instant Checkmate Inc. and InfoTrack, agreed to settle FTC charges that they violated the Fair Credit Reporting Act (FCRA) by providing consumer reports to users such as prospective employers and landlords without taking reasonable steps to ensure the accuracy of the reports or to ensure that the recipients had a permissible purpose for using them.  The FTC also complained that the reports wrongfully identified innocent job applicants as potential sex offenders.

“Consumers shouldn’t have to worry that they’ll be turned down for a job or an apartment because of false information in a consumer report,” said Jessica Rich, Director of the FTC’s Bureau of Consumer Protection.

Click here to read the FTC’s entire press release about the settlements.

 April 11: Is Big Data Unfair to the Underserved?

The question of whether “big data” analytics fail to account for the poor, minorities, and other underserved communities is an important one.  The FTC is now taking this question to businesses and consumers during a public workshop scheduled for September 15, 2014.  According to Chairwoman Ramirez, the agency hopes the workshop will address the following issues:

“A growing number of companies are using big data analytics techniques to categorize consumers and make predictions about their behavior. . . . [O]ur workshop will examine the potentially positive and negative effects of big data on low income and underserved populations.”

The FTC says the following big data practices are appropriate:

  • Customer rewards programs
  • Offering different prices or discounts to consumers based on neutral data (e.g., a financial institution can base a credit decision on balances in savings, checking, or credit card accounts)
  • Tailoring advertising for financial products (e.g., to wealthier customers)
  • Assessing credit risks of particular populations (e.g., aggregate scoring models that assess credit risks based on the credit characteristics of groups of consumers who shop at certain stores)

 According to the FTC, these uses of big data “create efficiencies, lower costs, and improve the ability of certain populations” to access services.  The questions asked at the workshop will include:

  •  How are companies using big data?
  •  What are the consumer benefits?
  •  What are the organizational benefits?
  •  How do existing laws apply to such practices?
  •  Are companies assessing the impact of big data on low income and underserved populations?

To see the full FTC press release, click here.